Our lives depend on computers. They control our money, our transport, even our exam results. Yet their programs are now so complex that no one can get rid of all the mistakes.
[1] Life without computers has become unimaginable. They are designed to look after so many boring but essential tasks - from microwave cooking to flying across the Atlantic - that we have become dependent on them.
[2] But as the demands placed on computers grow, so has the number of incidents involving computer errors. Now computer experts are warning that the traditional ways of building computer systems are just not good enough for complex tasks like flying planes or maintaining nuclear power stations. It is only a matter of time before a computer-made catastrophe occurs.
[3] As early as 1889, a word entered the language that was to become all too familiar to computer scientists: a 'bug', meaning a mistake. For decades bugs and 'de-bugging' were taken to be part of every computer engineer's job. Everyone accepted that there would always be some mistakes in any new system. But 'safety critical' systems that fly planes, drive trains or control nuclear power stations can have bugs that could kill. This is obviously unacceptable.
[4] One way to stop bugs in computer systems is to get different teams of programmers to work in isolation from each other. That way, runs the theory, they won't all make the same type of mistake when designing and writing computer code. In fact, research shows that programmers think alike, have the same type of training - and make similar mistakes. So even if they work separately, mistakes can still occur. Another technique is to produce back-up systems that start to operate when the first system fails. This has been used on everything from the space shuttle to the Airbus A320, but unfortunately problems that cause one computer to fail can make all the others fail, too.
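The two ideas above - independently written versions plus a mechanism that picks a result when one of them goes wrong - are often combined by majority voting. The sketch below is a minimal, hypothetical illustration (the function name `vote` and the sample readings are invented for this example); it shows why voting masks an independent fault but cannot help when all versions share the same mistake.

```python
from collections import Counter

def vote(readings):
    """Majority vote over the outputs of independently written program versions.

    Returns the value most versions agree on. A tie means there is no safe
    answer, so we fail loudly rather than guess - the system should then
    fall back to a safe state.
    """
    counts = Counter(readings)
    value, n = counts.most_common(1)[0]
    if n > len(readings) / 2:
        return value
    raise RuntimeError("no majority - fall back to a safe state")

# Three versions compute the same quantity; one has a bug, and the
# voter masks it. But if all three shared the same mistake (the
# common-mode failure the article describes), voting could not help:
# the wrong answer would win the vote.
print(vote([42, 42, 41]))  # 42
```

Note the limitation the article points out: voting only works when the versions fail independently, which research on programmer behaviour suggests is often not the case.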
[5] A growing number of computer safety experts believe the time has come to stop trying to 'patch up' computer systems. They say programmers have to learn to think clearly and to be able to demonstrate through mathematical symbols that the program cannot go seriously wrong. Until programmers learn to do this, we will probably just have to live with the results of computer bugs.
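What "demonstrating through mathematical symbols" means in practice is machine-checked proof: instead of testing a program on some inputs, the programmer proves a property for every possible input. A toy sketch in Lean (the function `clampSpeed` and the theorem are invented for this illustration):

```lean
-- A toy 'safety-critical' routine: clamp a speed reading so it is never negative.
def clampSpeed (v : Int) : Int := max v 0

-- A machine-checked guarantee, not a test: for *every* input, the output is ≥ 0.
theorem clampSpeed_nonneg (v : Int) : 0 ≤ clampSpeed v := by
  simp only [clampSpeed]
  omega
```

Unlike de-bugging, which can only show the presence of mistakes on the inputs actually tried, a proof like this rules out the unsafe behaviour entirely - which is why safety experts see it as the alternative to 'patching up'.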
[6] Of course, more often than not the errors are just annoying, but sometimes they can come close to causing tragedies. On the Piccadilly line in London's Underground, a driver who was going south along a track got confused while moving his empty train through a cross-over point. He started to head north straight at a south-bound train full of people. The computerised signalling system failed to warn him of the impending disaster, and it was only his quick human reactions that prevented a crash.