Time runs out


The millennium computer bug is totally predictable in its timing, but completely unpredictable in its effects. Its greatest danger, writes Frances Cairncross, lies in that uncertainty

IT HAS never happened before, and it will probably never happen again. No technical hitch has ever been so pervasive, so expensive or so potentially damaging as the failure of many of the world’s computer systems to understand the difference between dates in this century and the next. Ludicrous in its banality but frightening in the unpredictability of its consequences, the “Year 2000 problem” is already the main preoccupation of information-technology departments around the world. As the millennium approaches, it will increasingly preoccupy policymakers and the public too.

The cause of the problem is ridiculously simple. In the days when computer memory was scarce, programmers got into the frugal habit of using only two digits to write a date, so that 1998 was represented as 98. But, as 99 is a higher number than 00, millions of computers simply cannot place the year 2000. Instead, they may read the first year of the new millennium as the first year of this century, or as a date that does not exist. Even if they pass that test, they may fail to notice that 2000, unlike most centennial years, is a leap year. (Christopher Clavius, a 16th-century Jesuit mathematician, devised several rules to align human time with astronomical time; one was to make every fourth century year, those divisible by 400, a leap year. As luck would have it, 2000 will be the first year to which this rule applies.) All sorts of functions that depend on dates will therefore go wrong, but in ways that are hard to forecast.
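Both flaws can be illustrated in a few lines of modern code. This is a hypothetical sketch of the kind of arithmetic legacy programs performed, not a reproduction of any particular system:

```python
def two_digit_interval(start_yy: int, end_yy: int) -> int:
    # Many legacy programs computed ages or elapsed years from
    # two-digit dates like this, implicitly assuming both years
    # fall in the same century.
    return end_yy - start_yy

# In 1998 ("98"), a record dated 1960 ("60") is correctly 38 years old...
assert two_digit_interval(60, 98) == 38
# ...but in 2000 ("00") the same arithmetic gives a nonsensical -60.
assert two_digit_interval(60, 0) == -60

def is_leap(year: int) -> bool:
    # The Gregorian rule introduced by Clavius: years divisible by 4
    # are leap years, except century years, which are leap years only
    # when divisible by 400. So 1900 was not a leap year, but 2000 is.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

assert is_leap(2000)
assert not is_leap(1900)
```

A program that truncates 2000 to "00", or that applies only the every-fourth-year rule and its century exception while forgetting the 400-year exception to the exception, will miscount dates from February 29th 2000 onwards.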

Because the millennium-bug problem is so trivial, senior managers have found it hard to take seriously, and politicians have found it even harder. Only two heads of government have given speeches on the subject: Britain’s Tony Blair (with a sure instinct for a gap in the world market for leadership), and, more recently, Bill Clinton. The Group of Eight top industrial countries and the European heads of government both stitched a few lines on the millennium bug into communiqués earlier this year. But for most politicians, the issue is barely on the radar.

That will change. Officials in the foreign ministries of the world’s richest countries have begun to worry seriously about the way the millennium bug might affect weaponry and nuclear-power stations in the former Soviet countries and the developing world, where key countries seem to have done almost nothing to prepare. Even in the rich world, governments are torn between the need to create a sense of urgency and the fear of whipping up public hysteria.

Central bankers were quicker off the mark. “The Year 2000 is potentially the biggest challenge ever faced by the financial industry,” pronounced the Bank for International Settlements, the central bankers’ bank, earlier this year. William McDonough, president of the New York Federal Reserve Bank, thinks the problem is “potentially a survival issue for firms or even markets”.

A few private commentators have gone further. Edward Yardeni, chief economist at Deutsche Bank Securities in New York, who acquired a reputation by accurately forecasting the long bull market on Wall Street, is strongly bearish about the Year 2000 problem. He thinks it may well lead to a global recession as severe as that of 1973-74, lasting for at least a year. In the past year he has raised the odds on such an outcome from 30% to 70%. (“I’m not a doomster, just an alarmist,” he reassures audiences.) Dennis Grabow, who runs a company called Millennium Investment, is another pessimist: he predicts a recession lasting two to three years.

Others expect something infinitely worse. The Internet, that brilliant child of the computer age, has paradoxically become the forum for a host of exotic discussions on how to survive the ultimate computer crash, which (say discussants) will surely lead to the end of civilisation as we know it. Stock up on baked beans, head for the hills and make sure you bring your water purifier and your Winchester.

Such dire predictions thrive on the impossibility of saying with certainty how much is likely to go wrong. One of the many extraordinary aspects of the Year 2000 problem is the range of unknowns it reveals. Nobody really knows how widespread it is; how much of it will be fixed; and what it will cost to fix. The curious truth is that the carriers of the bug—computer hardware, software and microchips, which account for so much of the productive power of modern economies—are measured and monitored far less carefully than the economy’s stock of machines, vehicles and buildings.

Utterly predictable

Against such unknowns, though, set one great certainty: the date itself. Unlike most of the disasters with which the Year 2000 problem has been compared—storms, strikes, wars—it is completely foreseeable. In contrast to an earthquake, which arrives unexpectedly but does a fairly predictable amount of damage, it will arrive bang on schedule but do an unpredictable amount of harm. Indeed, because the problem can be anticipated, a vast industry has sprung up to do just that. The hope is that, unless prevention has been unduly delayed or faces special difficulties, disaster can be averted.

That is not to dismiss the dangers. As this survey will show, in some important parts of the world economy, work has indeed started perilously late. Sometimes procrastination makes good sense: why should a small business struggle to fix its solitary personal computer when the price of PCs is falling and a pencil and paper offer a safe backstop? And occasionally it may be quicker to deal with a crisis once it occurs than to try laboriously to forestall it. Often, though, delay seems to be the result of disorganisation or ignorance. Public services, medium-sized companies and middle-income countries all seem to be running special risks.

In the rich world, computer systems have become integral to almost every aspect of life: as Mr Clinton pointed out in July, the typical American home now has more computer power than the entire Massachusetts Institute of Technology did 20 years ago. And computers are increasingly interconnected, so that if one misbehaves, the effect may multiply. Just as in the global economy, more specialisation, complexity and interaction—the drivers of economic growth—come at the cost of greater vulnerability to disruption. “There isn’t quite as much forgiveness in the economy,” says Paul Romer, an economist at Stanford University in California. “But the trade-off is well worth making.”

The greatest worry about the Year 2000 problem, this survey will argue, may be neither its direct effect on economic growth nor, probably, its potential impact on human welfare. Rather, it may be the extra uncertainty it will create just at the moment when the world economy is already becoming increasingly fragile, with confidence shaken by events in Asia and Russia, and weakened by the faltering of America’s long boom.


Since the start of modern times, the end of a century has been a time of economic unease. The British and Dutch stockmarkets in 1699 and 1799 and the Dow in 1899 all saw sharp falls in prices, according to ING Barings, a Dutch bank; between December 2nd and 18th 1899, the Dow fell by 23% (see chart 1). A millennium, even more than a centennial, would be spooky enough without the fear of computer failure. Perceptions, rather than reality, may turn out to be the most dangerous aspect of that pesky millennium bug.


Further reading

A huge number of websites now cover the Year 2000 problem. Among the best on its international dimensions are those of the OECD, the European Commission, and the World Bank. In the United States, good official sites include a general government site, that of the Federal Reserve Board and of the General Accounting Office.

Other useful sites include those of the Institution of Electrical Engineers, the Information Technology Association of America (ITAA), and Peter de Jager. Links to more sites are given in the text of this survey.

This survey has benefited especially from help from Vladimir Lopez-Bassols and his colleagues at the OECD, which is publishing a paper on the Year 2000 problem in October and co-sponsoring a conference on it with the ITAA.

For information on Frances Cairncross, the author of this survey, and her book, “The Death of Distance”, see her site. The book may also be bought on-line from The Economist Shop.


To comment on this survey or to put questions to its author, Frances Cairncross, please send email; selected messages and her replies to them will be published here on October 1st. Your own email address will not be published unless you specifically request this.