There has been an orchestrated chorus amongst government officials over the past several months. "Don't worry, don't panic, it'll be a three-day storm, a bump-in-the-road." This is in stark contrast to what many said only a few months ago. So why this sudden optimism? Are all the computers now repaired and tested? What's the basis for this comforting reassurance?
Picture yourself as a senator or bureaucrat: you are respected and knowledgeable about Y2K. Reporters and members of the public come to you for advice and want straightforward answers. You then face a dilemma. The evidence suggests that Y2K will bring an extended disaster, but alas, "no one knows exactly what will happen." You do know one thing: whatever you say will potentially shape public opinion and the public's reaction. You are terrified of being responsible for creating a panic if you were to spill the beans.
As a "responsible" public official, knowing that the supply lines would jam up if the entire population were to do the guideline of "storing a month or two" of food and ...you guessed it...cash from the bank.
Not wanting to panic anyone, you take the safe route: tell everyone to prepare as they would for a storm that lasts three days. Little do you realize, however, that you are actually guaranteeing a panic in the long run. By saying it will be nothing more than a 2-3 day disruption, you downplay the problem and encourage do-nothing complacency, since most people already have a few days' worth of food in their fridges and cupboards.
So is there any logical basis for this positive scenario? Ed Yourdon, a 30-year software veteran, unequivocally debunks the whole concept in the following essay:
Y2K And The Year of Living Dangerously
Refuting the Y2K "3-day snowstorm" metaphor
Whether Y2K is something more than a bump in the road may depend on whether it causes a mere 0.1% reduction in economic growth (as many economists argue), or something far worse: bankruptcies, increased unemployment, and a sharp decline in industrial output lasting a month or more. And "bump in the road" will only be a useful metaphor if we manage to avoid serious industrial accidents in the 60 Russian nuclear utility plants, or the 278,000 U.S. chemical sites believed to be vulnerable to the Y2K problem. Indeed, there are many other scenarios that make many of us question the relevance of "bump in the road," but I'll focus on only one for the remainder of this essay: the relationship between the severity of the problem, and the duration of the problem.
To illustrate, think for a moment how the residents of the East Coast would have described the Blizzard of 1888. It certainly would not have been described as a bump in the road; it was remembered as a killer storm. One account summarized it as follows:
The "Great White Hurricane," as it was called, paralyzed the East Coast from the Chesapeake Bay to Maine. Telegraph and telephone wires snapped, isolating New York, Boston, Philadelphia, and Washington for days. Two hundred ships were grounded, and at least one hundred seamen died. Fire stations were immobilized, and property loss from fire alone was estimated at $25 million. Overall, more than 400 deaths were reported.
So, how long can we expect the Y2K winter storm to last? The 3-day time-frame has become such a consistent theme in official government statements that it seems orchestrated and coordinated. I'll comment on this later, but for now I'll simply assume that consistency implies consensus -- i.e., that everyone responsible for making such statements arrived independently at the same conclusion.
I won't dwell on the obvious fact that Y2K will arrive over a 24-hour period around the world, even though that alone could be a source of problems. During that 24-hour period, for example, it will be possible for people whose clocks still say 1999 to place telephone calls to people whose clocks have rolled over to 2000; and it will be possible for people living in the 21st century to make phone calls to people whose time zone is still in the 20th century. One hopes that the telephone companies have tested the behavior of their software for such quirks, but we won't really know until the Big Day.
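To make the quirk concrete, here is a minimal sketch -- my own illustration, not anything drawn from actual telephone-company code -- of how a hypothetical billing routine that stores only two-digit years could misbehave on a call that straddles the rollover:

```python
# Hypothetical sketch: a pre-Y2K billing routine that stores only two-digit
# years, applied to a phone call that begins in 1999 and ends in 2000.
# (Illustrative only; not based on any real switch or billing system.)

def minutes_since_1900(yy, day_of_year, hour, minute):
    """Naive timestamp: treats the two-digit year as an offset from 1900."""
    return ((yy * 365 + day_of_year) * 24 + hour) * 60 + minute

# A call placed at 11:55 PM on Dec 31, '99, and ended at 12:05 AM on Jan 1, '00.
start = minutes_since_1900(99, 365, 23, 55)
end = minutes_since_1900(0, 1, 0, 5)

print(end - start)  # negative: the 10-minute call appears to end ~99 years before it began
```

Whether a real system computes such a span correctly, clamps it to zero, or does something stranger is exactly the kind of detail we won't know until the Big Day.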
Aside from details like this, what about the basic concept? Does it really make sense to imagine that all of the Y2K bugs will occur simultaneously at the stroke of midnight on Jan 1, 2000? Of course not! This should be obvious with a moment's thought, but the range of possibilities is surprising. Indeed, it requires some experience with computer software projects to appreciate this, which we can't realistically expect from the journalists, politicians, bureaucrats, and senior business executives who are preaching the winter snowstorm metaphor -- after all, most of them wouldn't recognize a computer if they fell over one. In any case, consider the following possibilities:
There's a reason for discussing this issue about the simultaneity of Y2K problems: the overall impact of Y2K on society will depend, to some extent, on whether the problems are "clustered" in one short time period, or (as I believe to be the case) spread over a period of weeks, months, or even longer. Think of it in terms of another familiar metaphor: a family of adults and children who are exposed to the flu. In some cases, the entire family comes down with the flu at approximately the same time; it's extremely difficult for the parents to take care of their sick children when they're bedridden themselves. (I have some personal experience with such a problem: my father, one of my sisters, and I all came down with polio within a 48 hour period in the early 1950s, leaving my mother to cope with another infant sister, while spending most of each day traveling to three different hospitals to visit the rest of the family.) The other extreme involves one person at a time contracting the disease -- first one of the children, who manages to infect his or her siblings, and then the parents. The result is often a period of weeks, if not longer, when the entire family is disrupted; in the worst case, the disease recirculates, reinfecting the first family member just when he or she was getting well. And so it will be, I believe, with Y2K.
Even if one assumes that the Federal government and the major private-sector companies achieve a 100%-perfect record of finishing their mission-critical systems in time, it's hard to imagine that all of the non-mission-critical systems will be ready. And even if the failure of a non-mission-critical system does not cause the organization to suffer bankruptcy, that doesn't mean that customers and consumers won't be affected; after all, what's mission-critical to you is not necessarily mission-critical to me, and vice versa. A bank, for example, might argue that it weathered the Y2K storm with little or no disruption in its mission-critical systems; but if it manages to lose the data for my company's business account, then my company may be out of business. As far as the bank (and the government) are concerned, Y2K doesn't even compare to a snow flurry; but for my company, it's the end of the world.
But let's be more optimistic: even if the Federal government and the major private-sector companies finish 100% of their mission-critical systems and 100% of their non-mission-critical systems in time, we still have a problem -- three problems, actually: small companies, small towns and municipalities, and small countries. Numerous surveys have confirmed that well over half of small businesses have done nothing to prepare for Y2K, and a recent survey indicated that approximately 40% of U.S. small businesses do not plan to take any action to prepare for Y2K until they see what problems occur on January 1, 2000. I assume that even the most fervent Y2K optimist will agree that if you don't start your Y2K project until January 1, 2000 then you cannot possibly hope to be finished before January 1, 2000. The situation is approximately the same for small towns and counties: several surveys have confirmed that approximately 50% have not yet done anything, and a few brave cities (including Denver) have already announced a "fix-on-failure" strategy because it has become evident to them that they cannot possibly finish in time. And the situation is approximately the same for a significant percentage of the countries outside North America and Western Europe.
It's hard to imagine how the Federal agencies can believe that they're fully Y2K-compliant if the state and local government agencies with whom they interact are not ready. It's hard to imagine how a Fortune 500 company with 10,000 suppliers and vendors can believe that it's fully Y2K-compliant if its key vendors (some of whom are small businesses) are not compliant. And it's hard to imagine how the United States can truly believe that it has reduced the Y2K problem to the scope of a minor three-day storm if the rest of the world is not compliant. But apparently the optimists can achieve a leap of faith that enables them to say, "Everything that matters will be finished in time. Everything that's not finished doesn't matter." Well, everyone is entitled to believe whatever they want -- and if this is part of your belief structure, then feel free to skip forward to the next section of this essay.
But if you accept the fundamental notion that some mission-critical systems, and many non-mission-critical systems, won't be finished before the ultimate deadline of January 1, 2000, then we need to ask: how much more time will they need to reach completion? Three days? Is that what the government officials and optimistic commentators are trying to tell us? It's almost like a Greek tragedy: "We almost made it, we came ever so close to finishing on time, but despite our best efforts, we just couldn't quite pull it off. Why, all we needed was just another three measly days! So, you'll have to forgive us at this point, because we're going to have three days of winter-storm disruptions while we finish off the last of the work."
Well, if we're already anticipating that to be the case -- i.e., if our government leaders are warning us to be prepared for this situation -- then why aren't we asking our programmers to put in three extra days between now and December 31st? (For any programmers reading this essay, please recognize that this is a rhetorical question -- I know you're already working pretty hard!). Why not pay them quadruple overtime to work on, say, Memorial Day, Fourth of July, and Labor Day to make up those critical three days?
I won't belabor the point any further, because when you look at it this way, it's patently absurd. On a larger scale, it's simply implausible to imagine that an organization that has spent the past three or four years trying to repair its systems would fail to finish in time, but would need only another three days to wrap things up. And if we really did believe such an absurd notion, the solution would be trivial: declare a three-day national holiday for everyone but the beleaguered programmers, and tell everyone to stay home on January 1st, 2nd, and 3rd. Voila! On January 4th, life would return to normal...
The situation is even worse internationally. An example: Senators Bennett and Dodd, who have occasionally acknowledged the advisability of modest stockpiling for the proverbial three-day snowstorm, are also the co-chairmen of a Senate committee that recently issued a 160-page report identifying Mexico, Venezuela, and Saudi Arabia as three major oil exporters whose Y2K projects are 9-24 months behind schedule. If they're that far behind schedule in the spring of 1999, it requires an almost superhuman act of faith to believe that they'll somehow catch up and finish on time ... or end up merely three days behind schedule. Of course, there are some optimists who do believe in miracles of this kind; that's probably why we see reports, from time to time, of "amazing progress" on the part of such government agencies as the FAA. But things don't usually work this way on a software project: you can speed things up, to some extent, by allocating more resources (people) at the beginning of the project, but if you try to make up for lost time toward the end of the project by throwing money (or people) at the problem, it usually doesn't work. This frustrating reality is often referred to as "Brooks' Law," named after Dr. Fred Brooks, the project manager for IBM's OS/360 project in the 1960s: "Adding more people to a late software project just makes it later." (For more about this, see Brooks' classic textbook, The Mythical Man-Month (20th anniversary edition, 1995).)
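One intuition behind Brooks' Law, drawn from The Mythical Man-Month, is that pairwise communication channels grow roughly as n(n-1)/2 with team size, so the coordination and training burden of late additions can swamp the extra labor they supply. A rough back-of-the-envelope sketch (my own illustration, not the essay's):

```python
# Rough illustration of why adding people late rarely helps: the number of
# pairwise communication channels grows as n*(n-1)/2, so each newcomer must
# be briefed by, and coordinated with, everyone already on the project.

def communication_paths(n):
    """Pairwise channels among n people (Brooks' intercommunication formula)."""
    return n * (n - 1) // 2

for team_size in (5, 10, 20, 40):
    print(f"{team_size:>2} people -> {communication_paths(team_size):>3} channels")
```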
This notion of superhuman effort in the last few weeks or months of a late software project is a popular one, and it probably comes from watching too many Hollywood movies in which the beleaguered heroes overcome incredible odds and prevail at the very end. Remember actor Jeff Goldblum's feat in Independence Day of using his Macintosh PowerBook to inject a software virus into the alien spaceship? Very entertaining, great drama, but utterly implausible. Consider the following analogy, which shows how you could make a foolish prediction if you didn't know anything about running: suppose you're observing the progress of a marathon runner whom you desperately want to finish the marathon in a record 2 hours. Nobody has ever done such a thing before (the world record is approximately 2 hours and 8 minutes), but perhaps you believe that the power of positive thinking will make a miracle possible this time. Alas, you discover that your runner has only reached the 20-mile mark at the end of one hour and 45 minutes. "No problem!" you exclaim. "We'll just shout at our runner to make a superhuman effort to run those last 6 miles in the remaining 15 minutes!" Unfortunately, the 20-mile mark is known as the "wall," where many marathon runners seriously wonder whether they'll be able to finish at all; they're certainly not in a mood to be told to increase their pace from a 5.25-minute mile to a 2.5-minute mile for the remainder of the race. Do the arithmetic to verify the numbers, if you wish; and keep in mind that the world record for a single mile is approximately 3 minutes and 49 seconds.
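For anyone who does want to check the arithmetic, a quick calculation (using the essay's rounded figure of 6 remaining miles):

```python
# Verifying the marathon arithmetic above (remaining distance rounded to 6 miles).

elapsed_minutes = 105      # 1 hour 45 minutes at the 20-mile mark
miles_run = 20
remaining_miles = 6
time_budget = 120          # the hoped-for 2-hour finish

pace_so_far = elapsed_minutes / miles_run                          # 5.25 min/mile
pace_required = (time_budget - elapsed_minutes) / remaining_miles  # 2.5 min/mile

print(pace_so_far, pace_required)  # 5.25 2.5 -- far faster than the ~3:49 world-record mile
```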
The reality, I believe, is that if an organization doesn't finish its Y2K project by Jan 1, 2000, it will still be working on unfinished systems for weeks, months, or even years. Or it will be forced to admit defeat in some areas, and simply shut down parts of its normal operations. The best-prepared organizations may have contingency plans that provide for a manual "work-around" to substitute for non-compliant systems, but even that is likely to take more than a few days to put into operation; in any case, the effect of the manual work-around (in terms of lost productivity, slower response time, reduced functionality, etc.) will be felt for a long time.
Bottom line: to whatever extent we discover that X% of the computer systems are not yet remediated by January 1, 2000, the effects are likely to be felt for weeks, months, or even longer.
Where does such an assumption come from? There may be a good explanation somewhere, but I haven't been able to find one. The only thing that makes sense to me is that most of the familiar, non-Y2K failures that we've experienced with electric utilities -- i.e., the very failures one would expect in a real winter snowstorm -- are repaired within a matter of two or three days. Of course, there are exceptions, as the residents of Virginia learned during a storm at the beginning of 1999; and there are occasionally major exceptions, as the residents of Montreal learned during the ice storm of early 1998, and as the residents of Auckland, New Zealand discovered in February 1998. Others have suggested that the 2-3 day time-frame represents the period of time usually required to organize a relief effort (e.g., the Red Cross, National Guard, FEMA, etc.) and transport the appropriate people and equipment to the disaster area.
But none of this has anything to do with Y2K computer failures. In the worst case, all it will take is one catastrophic failure at one of the 432 world-wide nuclear utility facilities, or one catastrophic failure at one of the hundreds of thousands of world-wide chemical processing plants, to find ourselves facing a crisis that will render any comparison with three-day snowstorms ludicrous and embarrassing. We all hope that such a catastrophe won't happen, and we all assume that the responsible officials are taking appropriate actions to ensure that they won't happen. Since discussions of such catastrophes are likely to evoke charges of "scaremongering," and since I like to be an optimist as much as anyone else, I'll put the catastrophe scenario aside for the remainder of this essay, and focus instead on "normal" bugs and glitches and problems that we might expect to encounter with Y2K.
Interestingly, we don't even have to wait until Jan 1, 2000 to convince ourselves that this discussion is not "hypothetical" -- because we're already seeing some early indications of Y2K problems popping up because of "look-ahead" calculations within many computer systems. Everyone was watching very closely on January 1, 1999 to see what might go wrong; and when it appeared that the few reported problems were solved in a matter of hours, or a few days at worst, everyone lost interest. Meanwhile, though, we've begun to see a few cases that were not solved so quickly; for example, residents throughout the state of New Mexico received a letter dated March 8, 1999 from the dominant utility company (PNM Electric and Gas Services) that read as follows:
(The remainder of this essay, including the text of the PNM letter, continues on a second page.)