

How did we get into this mess?

"Denial, short-sightedness and procrastination got us here, and is also what will ensure our demise."
"A global financial crash, nuclear meltdowns, hospital life-support system shut downs, a collapse of the air-traffic system are possible without proper attention now." ____-NY Times, feature article 4/2/97

The Year 2000 Millennium Bug

The millennium bug is deceptively simple, even trivial. It is also an overwhelming problem.

To get an idea of how serious this problem is, see Scary Quotes from well known Y2K figures.

When computer systems were built in the 1950s, 60s and 70s, computer hardware was very costly. To reduce system development costs, programmers looked for ways to minimize data storage requirements. Only data deemed essential was stored in the computer databases. The century portion of date fields was frequently omitted, as few people imagined that those early systems would still be operating in the new century. It was common for dates to be stored as 6-digit fields (YYMMDD) rather than 8 digits (CCYYMMDD). The century (19) was simply 'hard coded' into computer programs and displayed as appropriate.
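To make the trade-off concrete, here is a minimal sketch (in Python, a language these legacy systems did not use; the function name is hypothetical) of what hard-coding the century amounts to:

    # Hypothetical illustration of the hard-coded century described above.
    def expand_legacy_date(yymmdd: str) -> str:
        # Legacy assumption: every stored date belongs to the 1900s.
        return "19" + yymmdd

    print(expand_legacy_date("991231"))  # 19991231 -- correct for Dec 31, 1999
    print(expand_legacy_date("000101"))  # 19000101 -- Jan 1, 2000 silently becomes Jan 1, 1900

The six-digit field saves two characters per date, but the saving is bought by baking the century into the program logic itself.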

The computer systems built during this period were not very "user friendly" by today's standards, but nonetheless were often the most critical to the operation of the business. When personal computers became popular in the 1980s, new graphical "front-ends" were added to the existing systems. However, the underlying databases and computer programs were often retained. They often grew in size, becoming more complicated and difficult to maintain with each passing year.

Computers only do what they are programmed to do. The legacy systems of the 60s and 70s were not designed to run across multiple centuries; they treat "2000" as "1900" (uncorrected PC-architecture DOS and Windows desktop computers will revert to 1980 or 1984). The date can be corrected, but once you turn the machine off the correction dies; it will reboot to 1980 or 1984 as a default. PC programs must be redesigned. The y2k bug is also built into millions of embedded micro-chips (embedded systems).

Programmers who recognized the implications of this problem did not care. They assumed that their software would be replaced or updated before the year 2000. That assumption now threatens every piece of custom software on every mainframe computer around the world, unless the code has been rewritten. In some cases, this involves coordinating half a million lines of code. That means that AT&T's whole system could be shut down because of a one-digit error, the same way America Online was shut down.

The "Millennium Bug," or "Bomb" is unsolvable--There just is not enough time left, and most started compliance repairs too late. Nevertheless, a brave attempt to minimize the damage is now getting under way both in the private sector and on the federal government level. Despite these efforts, most will not make it before 2000. No large company or government has yet claimed "We are year 2000 compliant", This is troubling indeed because this time bomb cannot be fully disarmed - at any price -. Don?t count on a " silver bullet " either --remediating corporate systems is time and labor intensive. What makes matters worse is the fact that compliant companies cannot exchange data with that aren't. So what happens when the clock strikes 01-01-00 (or 01-01-2000), most computers will produce incorrect, useless data, or simply crash due to the inability to access the right century in processing date-critical calculations. Embedded systems may sense that a power plant will have to be shut down, or an elevator to malfunction.

An excellent introductory article that appeared in Vanity Fair (it's worth the read):

Vanity Fair January 1999

Will the millennium arrive in darkness and chaos as billions of lines of computer code containing two-digit year dates shut down hospitals, banks, police and fire departments, airlines, utilities, the Pentagon, and the White House? These nightmare scenarios are only too possible, Robert Sam Anson discovers as he traces the birth of the Y2K "bug," the folly, greed and denial that have muffled two decades of warnings from technology experts, and the ominous results of Y2K tests that lay bare the dimensions of a ticking global time bomb.

The Y2K Nightmare

By Robert Sam Anson

The nightmare scenario goes like this:

It is an instant past midnight, January 1, 2000, and suddenly nothing works. Not A.T.M.'s, which have stopped dispensing cash; not credit cards, which are being rejected; not VCRs, which now really are impossible to program. The power in some cities isn't working, either; and that means no heat, lights, or coffee in the morning, not to mention no televisions, stereos, or phones, which, even in places with power, aren't working either. Bank vaults and prison gates have swung open; so have valves on sewer lines. The 911 service isn't functioning, but fire trucks are on the prowl (though the blaze had better be no higher than the second floor, since their ladders won't lift). People in elevators are trapped, and those with electronic hotel or office keys can't get anywhere, either. Hospitals have shut down because their ventilators and X-ray machines won't work and, in any case, it's now impossible to bill the H.M.O.

Traffic is a mess, since no streetlights are working. Trains are running, but their control switches aren't, which is bad news for supermarkets, utilities, car dealers, and international trade, which can't move by ship either. Only the brave or foolhardy are getting on airplanes, but with so many countries degenerating into riots and revolution, it's wiser to stay at home anyway. There are no newspapers to read or movies to go to or welfare checks to cash. Meantime, retirees are opening letters saying that their pensions have been canceled because they are minus-23 years old. Many banks and small businesses have gone bust, and it will be weeks, if ever, before the mess that is the broker's statement is sorted out.

On the brighter side, no one can punch a time clock; on the darker, most of the big manufacturing plants have shut down because their lathes and robots aren't working. Pharmacies aren't filling prescriptions; the D.M.V. is not processing license renewals, and everyone's dashboard keeps flashing SERVICE ENGINE NOW. Mortgage payments sent on time have been marked late, and everyone's phone bill is messed up because of all those calls that began in 1999 and ended in 1900. On the Internet, where thousands of Web sites are suggesting how to find God and when to move to the wilderness, the acronym for what's occurring is TEOTWAWKI: The End Of The World As We Know It.

Will it happen? "Yes," "No," and "Maybe," say the experts. And that's the most unnerving thing about the phenomenon variously known as "Y2K," "The Year 2000 Problem," or "The Millennium Bug": No one will know the extent of its consequences until after they occur. The one sure thing is that the wondrous machines that govern and ease our lives won't know what to do. Some will freeze, electronically paralyzed; others will become imbecilic, giving idiot answers and issuing lunatic commands; still others, overwhelmed, will simply die, as will the blind faith the world has placed in them. "The reason we are in this screwup," says Paul Strassmann, who oversaw the Pentagon's vast computer operations during the Bush administration, "is that ... Americans fell in love with computers and put up with ... failures that they would not have put up with in a crummy toaster or microwave."

The product of this folly is a looming disaster with an immovable deadline that will touch the entire world. Its scale is unique, too: 1.2 trillion lines of potentially lethal software code located in virtually every country, plus 30 billion microprocessors. Many are linked, computer to computer, network to network, place to place, everywhere from birthing rooms to crematoriums. And if potential catastrophe is to be avoided, every line and chip must be checked, a task variously likened to building the pyramids, changing all the light bulbs in Las Vegas in an afternoon, or individually polishing enough marbles to fill the Grand Canyon.

The global cost of putting everything right is estimated to be as much as $3.6 trillion, according to Capers Jones, author of the 1997 book The Year 2000 Software Problem. This includes lawsuits, which some expect will total $1 trillion in the United States alone. But squandered treasuries are just the beginning of the misery. Because of Y2K there are predictions of a recession matching the oil shock of 1973-74, hundreds of thousands of bankruptcies, and disruptions of government services from police protection to food inspection. The chairman of the New York Stock Exchange, Richard Grasso, has warned of "potentially disastrous consequences"; a senior executive at Barclays, the British bank, reportedly advised clients to sell their homes, stockpile cash, and buy gold. And even Utah senator Robert Bennett, the most knowledgeable Y2K expert in Congress, was moved to say, "I'm not yet ready to dig a shelter in the backyard. But it might not be a bad idea to have a little extra food and water."

For some, Y2K offers rewards. Sales of survival gear are at record levels, Southern Baptists foresee "historic evangelism opportunities," and the career of singer Pat Boone-who has already recorded public-service announcements "to bring Y2K to the family dinner table for dialogue"-has been revived. The prospect of impending doom has provided fodder for Pat Robertson's Christian Broadcasting Network, which asks the faithful, "How does God want us to redeem this situation for Him?"

Some experts are more optimistic. But even the cheeriest assurances are not completely comforting. Y2K, said Jeanne Terrile, an analyst at Merrill Lynch-which has spent $375 million fixing its problems-is like the space shuttle: "It always goes off smoothly ... except once it didn't, and the country came to a standstill."

Preparing for the worst, Canada has made plans to call out the troops; Wisconsin and Iowa, the National Guard. Meanwhile, airlines are considering imposing "no-fly zones" on areas of the world not Y2K-ready. Computer experts are also getting ready. A recent survey of technology executives found that 10 percent of them planned to stockpile canned goods, 11 percent were preparing to buy generators and woodstoves, and 13 percent were going to purchase "alarm systems, fencing, and firearms." "The problem," Intel chairman Andy Grove has warned, "is going to be pretty bad."

Belatedly, that's become apparent to the U.S. government. "If we don't fix the century-date problems, we will have a situation scarier than the average disaster movie you might see on Sunday night," I.R.S. commissioner Charles Rossotti said last year. "The whole financial system of the United States will come to a halt. It not only could happen; it will happen, if we don't fix it right." "This is going to have implications in the world and in American society we can't even begin to comprehend," added Deputy Secretary of Defense John Hamre. "I would be the last person to suggest we're not going to have some nasty surprises, because I definitely think we will." Among those surprises could be weapons-such as Tomahawk missiles or ICBMs-that won't launch when they're supposed to or will fire when they're not supposed to.

So grave are the concerns for the nation's power grid, in which a failure in one region can cascade to others, that Connecticut senator Christopher Dodd said, "We're no longer at the point of asking whether or not there will be any power disruptions but now we are forced to ask how severe the disruptions are going to be." There are similar trepidations about telecommunications, the healthcare industry, and nuclear power plants, at least one of which has already failed a Y2K test. White House Y2K czar John Koskinen has said, "We could have, if not the equivalent, something that is very much like a hurricane on the East Coast, an earthquake in San Francisco, and flooding on the Mississippi River happening all at once."

Things will be infinitely worse overseas, where Y2K's impact on the delivery of food, seed, and fertilizer could result in between 10 million and 300 million deaths. When one Middle Eastern contact was told of the millennium bug, according to Sherry Burns, who heads the C.I.A.'s Y2K office, he replied, "When we see it, we'll spray for it."

The Middle East, where half the oil companies expect at least one critical breakdown, is in particularly bad shape, as are Japan, which has the world's largest banks, and China, where much of the software was pirated, not to mention Indonesia. "Asia," said Deutsche Bank Securities chief economist Edward Yardeni, "is toast. In the year 2000, Asia will be burnt toast." But the biggest jitters are over Russia, which possesses not only 11 Chernobyl-type power plants, 22,500 nuclear warheads, and the funds to fix none of them, but also an attack-warning system so vulnerable to Y2K that the Pentagon has proposed that American and Russian officers be stationed in each other's command centers on New Year's Eve 1999. "[The missiles] are probably just going to sit there and tell their operators, 'I'm confused,'" John Pike of the Federation of American Scientists said in a recent interview. "But there is a small, finite risk that this could lead to accidental nuclear war, simply because people fail to fix their computers."

How all this happened is an unlikely story, and it begins nearly a half-century ago with a most unlikely woman. Her name was Grace Murray Hopper, and in the field of computer science there has seldom been her like. Feisty, quick-tongued, and irreverent ("Life was simple before World War II," she liked to say, "after that, we had systems"), "Amazing Grace," as colleagues called her, racked up many distinctions during her 85-year life. They included coining the phrase "computer bug" (a moth flew into one she was working on); becoming the first female admiral in the navy; and inventing the "compiler," the software element that translates text into the 1s and 0s a computer can understand. But the accomplishment for which Grace Hopper is best remembered was helping to create a computer code actually useful in everyday life. Its name was common business-oriented language, COBOL for short.

Able to perform myriad business tasks, COBOL was the Windows of its day, the program that ordered the functioning of virtually every business computer. The earliest of these gargantuan mainframes had a flaw, however. In order to operate, they relied on a dollar-bill-size piece of cardboard called a Hollerith card. Named after the inventor of the first electric tabulating machine, and used in one form or another since as early as 1890, Holleriths passed information by means of small rectangular holes punched into 80 narrow columns. "You can put a hole in this card representing one dollar," IBM chairman Tom Watson would tell customers. "A dollar of sales, perhaps, or a dollar you owe someone. From that point on, you have a permanent record. It can never be erased, and you never have to enter it again. It can be added, subtracted, and multiplied. It can be filed, accumulated, and printed, all automatically."

All of which was true-as was the fact that "IBM cards," as Watson preferred to call them, barely had room for a name, birth date, and address. Initially, this was handled by simply using more cards-three, for instance, to record a single magazine subscription. But as time went on and businesses got bigger, companies found that they needed entire buildings just to store punch cards. This, they made clear to IBM, would not do. So, in order to squeeze in as much information as possible, programmers shortened COBOL instructions whenever they could. That included dates, which were reduced from eight digits to six by lopping off the "19" from the year. A computer would thus read "123199" and know the digits stood for December 31, 1999. What a computer could not do was realize that one second after midnight on that date it would be January 1, 2000. So, in the manner of an odometer passing 99999 miles, the numbers would roll back to "00," which a computer would interpret as 1900-provided that the sudden loss of a hundred years didn't prevent it from functioning, period.
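The arithmetic consequence is easy to reproduce. The hypothetical Python fragment below subtracts two-digit years the way those legacy routines effectively did, and yields the "minus-23 years old" pensioner described earlier in the scenario:

    # Hypothetical sketch of two-digit-year subtraction failing at the rollover.
    def years_elapsed(start_yy: int, end_yy: int) -> int:
        # Legacy assumption: both years fall in the same (19xx) century.
        return end_yy - start_yy

    print(years_elapsed(23, 99))  # 76  -- someone born in 1923, computed in 1999
    print(years_elapsed(23, 0))   # -23 -- the same person, one instant after midnight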

Programmers were aware of the problem, but believed improving technology would fix it decades before 2000. And by the end of the 1950s, technology did get better, with Holleriths giving way to more compact magnetic tape. The guts of computers transformed, too, shrinking from vacuum tubes to transistors to integrated circuits. In 1964 there was another revolution when, after an expenditure of $5 billion (more than it had cost to create the first atomic bomb), IBM introduced its System/360, the first "family" of computers, which could use the same disk drives, printers, and peripherals, regardless of a model's size or power. The intention, Watson's son and successor, Tom Junior, recorded in his memoirs, "was to make all other computers obsolete.... Once customers shifted to System/360, they'd be able to expand their installations simply by mixing and matching components from our sales catalog. That was good for them, and the benefit for IBM was equally compelling: once a customer entered the circle of 360 users, we knew we could keep them there for a very long time."

In achieving that goal, IBM succeeded spectacularly well, demolishing the competition and becoming the mainframe standard setter. Less gloriously, it also retained the two-digit year.

Holding on to the old format was rationalized as a cost-cutter, particularly in computer memory; at the time it cost $761 to store the equivalent of the information contained in a Danielle Steel novel. Considering that the same amount of memory now costs less than a thousandth of that, it was pound-foolish writ gigantic. But at the time, few realized it. "I'm one of the culprits who created this problem," the former proprietor of an economic-consulting firm told Congress last year. "I used to write those programs back in the Sixties and Seventies, and was so proud of the fact that I was able to squeeze a few elements of space by not having to put 19 before the year.... It never entered our minds that those programs would have lasted more than a few years." Ordinarily expert at spotting future difficulties, the speaker was Federal Reserve Chairman Alan Greenspan.

A handful were more foresighted. Among them was Robert Bemer, an IBM wizard who had invented the "Escape" key, and was one of the creators of "ASCII," the language that enabled different computer systems to "talk" to one another. During the 50s, Bemer also developed a feature that permitted COBOL programmers to use either two- or four-digit year dates. A passionate proponent of the latter, in 1960 Bemer joined with 47 other industry and government specialists to come up with universally accepted computer standards. The wrangling, however, stretched out for years, too many years for the White House, which, in 1967, ordered the National Bureau of Standards to settle the matter. In so doing, the bureau was to gather input from various federal agencies, some of which were using two-digit years, others four. As a practical matter, the only opinion that counted was that of the Department of Defense, the largest computer operator on earth. For bigger-bang-for-the-buck reasons, it was unshakable on the subject of year dates: no 19s. "They wouldn't listen to anything else," says Harry White, a D.O.D. computer-code specialist and Bemer ally. "They were more occupied with ... Vietnam."

After years of losing fights, White transferred to the Standards Bureau. Hardly had he arrived when the bureau succumbed to Pentagon pressure and announced that two-digit years would become the preferred option for federal agencies, starting January 1, 1970. Hoping for presidential intervention, White and Bemer rounded up 86 technical societies and asked Richard Nixon to declare 1970 "The National Computer Year." When D.O.D. lobbying kept that appeal from reaching the Oval Office, Bemer recruited the presidential science advisor, Edward E. David, to plead the case in person. Nixon listened, then asked for help fixing his TV set. Frantic, Bemer and White beseeched private organizations to call for a voluntary four-digit-year option. But once more the Pentagon's position prevailed. Mindful of government contracts, big business went along. Bemer was reduced to issuing caveats in obscure technical journals. "There are many horror stories about programs, working for years, that died on some significant change in the date," he wrote in the February 1979 issue of Interface Age. "Don't drop the first two digits [of the century]. The program may well fail from ambiguity in the year 2000." The reaction in some quarters, Bemer recalls, was laughter.

Quietly, though, the Y2K word was spreading. Two years after Bemer's article was published, technology commentators Joe Celko and Jackie McDonald jokingly wrote in a trade paper, "I have a plan to make a fortune in the year 2000. I will start a company called 'Father Time Software' that does nothing but correct programs and data files that used only the last two digits of the year ... for keeping their records.... We will charge fantastic fees, and clients will have no choice but to pay." In 1983 a Detroit programmer, William Schoen, tried to make the idea a reality. After stumbling on the date problem while working in the basement of a Big Three automaker, he devised a $995 solution on his home PC, then set up a company to peddle it. The Charmar Correction, the answer for "the problem you may not know you have," as the American weekly Computerworld called it, had exactly two sales.

Nevertheless, warnings about Y2K persisted, including in a book written by an Illinois couple, Jerome and Marilyn Murray. Published in mid-1984, Computers in Crisis: How to Avert the Coming Worldwide Computer Systems Collapse had its genesis on a day when Mrs. Murray, an assistant vice president at an insurance company, was figuring annuities. All went well until she keyed in an annuity due after 2000, at which point the computer spat back "1900," then reams of gibberish. She reported the incident to her husband, a former IBM consultant, who went to his old colleagues for explanation. The one they provided boiled down to two sentences: This is a user problem-a crisis for which computer users alone are responsible. There is no IBM solution to the problem in this room. Predicting "domestic and international chaos" if someone didn't soon come up with one, the Murrays wrote in their foreword: "We have placed our confidence, physical and economic wellbeing, and future hope in the development of a technology now seen to be fatally flawed through collective human oversight. What have we done? What will we do?"

Two years later, in South Africa, a programmer named Chris Anderson started asking himself the same questions. The answers he came up with were sufficiently alarming that he took out a magazine ad-"The Timebomb in Your IBM Mainframe System." Big Blue responded: "IBM and other vendors have known about this for many years. This problem is fully understood by IBM's software developers, who anticipate no difficulty in programming around it."

But with every new computer-62 million of which were in use in the U.S. by 1991-the scale of the problem increased. So, too, did the complexity of fixing it. For as was disturbingly becoming apparent, "COBOL Cowboys," as rough-and-ready programmers called themselves, had worked according to whim, sometimes deliberately hiding dates (as this guaranteed they'd later need to be hired back), other times disguising them under the names of girlfriends, cars, even Star Trek characters (because this was thought idiosyncratic and amusing). Thus, "2000 - 1983 = 17" might read as "Gloria + Chevy = Spock." Not that the code needed to be so complicated. In 1997 the Washington State Department of Social and Health Services discovered that many of its computer functions were being governed by one word: "Bob."

Had the mischief been contained, the impact would have been negligible. But in the name of "downsizing" and "productivity," computers were increasingly running everything. And how they ran never stopped changing, as business kept demanding better, faster, cheaper thises and thats. In the rush, no one bothered to get rid of Grace Hopper's COBOL core. Instead, revisions were piled on top of it, layer upon layer, until a system containing hundreds of millions of records could have thousands of levels, constructed by hundreds of different programmers, each of whom had his own way of doing things. "It's as if you were building a bridge," says Dr. Leon A. Kappelman, co-chairman of the Society for Information Management's Year 2000 working group, "and let every riveter pick their own kind of rivet and drill different holes and use different rivet guns."

The result was "Spaghetti Code," an unending tangle of 0s and 1s. To decipher their meaning-and what it might hold for the millennium-one would have to possess the "source code," a frequently misplaced, decades-old document most of whose writers were retired, deceased, or, as in the case of Alan Greenspan, had gone on to other chores. "I don't remember having any significant documentation," the Fed chairman told Congress. Earlier he had said, "If I were to go back and look at some of the programs I wrote 30 years ago--I mean, I would have one terribly difficult time."

Things were made even harder by the nature of the software beast. "It cannot make a single mistake," said Greenspan, seven months after explaining that "you can be extremely scrupulous in going through every single line of code in all your computer operations, make all the adjustments that are required ... say, 'We have solved the 2000 problem,' and then find that when the date arrives, all of the interconnects that are now built in start to break down."

(End of part 1.) CLICK HERE FOR PART TWO of the article.





See also: y2k can't and won't be fixed.


To see big picture of y2k's global economic effects, go to: Macro-economic Thesis for y2k.


Since May 6, 1998. E-mail feedback to: AUTHOR