The key to understanding the magnitude and potential impact y2k will have on the economy and infrastructure is realizing how failing systems interact with other systems, failing or functional. As Gary North frequently mentions, "It's systemic; it can't be fixed." What does he mean by this? In essence, the year 2000 bug is built into "the system," or the body. For the whole of the system to remain intact, virtually all of it must be fixed.
A computer network is a system. What happens if 20% of the computers in a network of 5,000 interlinked terminals are not compliant and start spreading bad data? Would it take only 5% or less to disrupt the entire network, rendering it unusable? What about 3%? Alan Greenspan was a mainframe programmer in days gone by; he is now the head of the Federal Reserve banking system. The following is testimony given before the House Banking Committee:
"...Experts also emphasize that the problem must be fixed properly and on time if Year 2000 related problems are to be avoided. I was intrigued by a statement Federal Reserve Chairman Alan Greenspan made a couple of weeks ago. He pointed out that 99 percent readiness for the Year 2000 will not be enough. It must be 100 percent. Thus, the message seems clear: all financial institutions must be ready; federal and state regulatory agencies must be ready; data processing service providers and other bank vendors must be ready; bank customers and borrowers must be ready; and international counterparties must be ready."
This isn't myth or theory... it's fact. See Imported Data.
We live in an extremely complex, automated society with countless interacting systems. There is a certain fault tolerance that can be reached before the function of the larger system begins breaking down. It may be 2%...or 5%. No one really knows, as the system has never been put to the test the way y2k will test it only months from now. It can be pictured as a house of cards; if you carefully remove a few cards, the structure remains intact--though weakened. Continue to remove cards, one-by-one, and the structure eventually collapses--100%. Yet it may take the removal of only 2 or 3% of the cards to do this.
This can be demonstrated by writing mistakes. If you are a regular reader of this site you may notice that I occasionally make errors in spelling or grammar. While I'm not a professional writer, I usually know how to spell these aberrant words correctly, but make the mistakes without noticing--sometimes out of carelessness or simple absent-mindedness. If the error rate is perhaps one in a hundred words, the effectiveness of the material is not compromised to any significant degree. If it reaches a 10% error rate, the credibility of the subject being written about is reduced to the point where the reader tends to disregard the message. When it reaches 20% or more, it becomes gibberish; useless. To make the point, the following paragraph has a hundred words. 17% are misspelled.
I try my best too be a good speller as I now it is the key to good writing style and reeder acceptance. I recieved good grades in scholl and know how to make a good presentation but cant seem to get it rite. Woe is me for shoddy matarial not worth reeding! See how the mistakes aer seriously detracting from the potential mesage--even if important? Maybee I shud be more careful or use a spel-cheker. This is a must for anyone to take yor're material seriously. Or perhaps I should re-enroll in gramar scool! Elementary deja voo.

Would it not be optimistic to assume that 80% of the Fortune 500 companies got 90% of their mission-critical systems repaired before 2000? What happens if 20% fail miserably and suffer severe business disruption or outright bankruptcy? How many outright failures of GM's 100,000 suppliers would it take to put GM out of business? 40,000 of those are critical to its operations. The failure of only 1%, or 400, of those would surely sink this multi-billion dollar company. (I like picking on GM:-)
See: Domino Effect and Macro-economic Thesis On Y2K.
The systemic nature of y2k is by far the most significant issue y2k involves, yet the least covered.
The following is from Y2KNEWSWIRE.COM, under the title "Y2K Is Systemic":
Y2K is Systemic
A local journalist working for the Orlando Sentinel scooped every mainstream reporter in the nation when he published a stunning article titled, "User beware -- Y2K compliance claims could be bogus." This story revealed that when claims of Y2K compliance are put to the test, approximately half of them turn out to be false. Click here to read the story.
(This, the word from companies like Lockheed-Martin, Bellsouth and IBM...)
Upon publication, however, the story wasn't even a front-page topic. It was published on the same day the NBA draft picks were announced, and you can guess how important that event is to most Americans. "Sports!"
As a result, few people read it. Even then, it wasn't a national story. The point, though, is of global importance: fifty percent of the Y2K compliance claims were bogus.
If this holds true across the board, we are in worse shape than previously thought. Even the Y2K "doomsday" people -- who predict a total collapse of the government, economy and banking system -- have submitted that half the computers might work. In other words, they're saying the "end of modern civilization" is possible with just half the systems going down.
Alan Greenspan himself said, last year, that 99% of the banking computers working wasn't good enough. We need 100%... or something very, very close to it.
THE TWO QUESTIONS:
Two questions arise:
1. What is the rate of Y2K compliance claim falsification, really?
2. What is the end result if x% of the computers don't work?
Y2KNEWSWIRE submits that, if 50% of the world's computers fail on January 1, the doomsday crowd is undoubtedly correct. The list of things that would collapse is longer than the list of those that would stay up. However, we believe significantly more than 50% of the systems will work. Nevertheless, there is a lot of room between 50% and 100%, and this is the key question.
If 50% of those claiming Y2K compliance are lying, it stands to reason that at least 50% of the systems in question are not Y2K-compliant. Making conservative, optimistic assumptions, let's submit that 50% of the individual systems (both software and hardware systems) work right now -- today.
Out of the 50% that are non-compliant right now, suppose that another half of those achieve compliance in the next six months. That leaves 25% that are non-compliant.
Out of the 25% that are non-compliant, suppose that 10% of those fail outright, while the other 90% continue to operate, but churn out bad data. So we have 2.5% of all the computers failing (in this scenario), with 22.5% churning out bad data in one form or another. (2.5% is 10% of the 25%.)
Given this -- 2.5% failure rate, 22.5% bad operations rate -- what is the real-world outcome?
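The compounding arithmetic in this scenario can be checked with a short script. Every percentage here is the article's illustrative assumption, not measured data:

```python
# All rates below are the article's illustrative assumptions.
true_claims = 0.50            # assume half of compliance claims are true
fixed_in_time = 0.50          # assume half the rest get fixed in six months
noncompliant = (1 - true_claims) * (1 - fixed_in_time)

outright_share = 0.10         # of non-compliant systems, assume 10% fail outright
fail_outright = noncompliant * outright_share
bad_data = noncompliant * (1 - outright_share)

print(f"Still non-compliant: {noncompliant:.1%}")   # 25.0%
print(f"Failing outright:    {fail_outright:.1%}")  # 2.5%
print(f"Producing bad data:  {bad_data:.1%}")       # 22.5%
```

Each step multiplies the previous remainder, which is why two optimistic-sounding halvings still leave a quarter of all systems non-compliant.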
A BIT OF COMMON GROUND
Interestingly, many people from both camps -- the Y2K Deniers and the Y2K Alarmers -- probably agree on this 2.5% figure. Not all, of course, but a surprisingly-large number of people put this 2.5% figure well within the realm of "probable," not just "possible." What they disagree on, primarily, is the impact of this 2.5% failure rate.
That is, a large part of the debate surrounding Y2K isn't as much about how many computers will fail as it is about the real-world impact of those failures. Given a reasonable number -- like 2.5% -- the two camps can aggressively disagree on what that failure rate means in the real world.
The Y2K Deniers believe that society is highly-redundant, easily repairable and non-systemic. The Y2K Alarmers (those convinced the problem is real) believe society is insufficiently redundant, not so easy to repair and systemic.
WHO'S RIGHT?
This argument is perhaps best demonstrated by the debate surrounding "back to manual." Y2K Deniers generally believe most computer-controlled operations have a manual backup that is nearly as good as the computer. They believe that if computers were to fail, human beings could take over with minimal disruption. Y2K Alarmers, on the other hand, believe that "going back to manual" is, in many cases, a futile exercise, pointing to the highly-automated systems now in place and the simple fact that if computers weren't more efficient than human beings, there would have been no reason to install them in the first place.
Y2K Deniers have great faith in the ability of today's workers to handle math, accounting and record keeping with nothing but pencil and paper. Y2K Alarmers point out that a large percentage of today's workforce can't even do math with a calculator, much less by hand.
[This author, by the way, was once at a major office supplies retailer where the store manager could not calculate a 10% discount using a calculator. When the author offered to run the numbers for the manager, producing the correct price, the manager looked at the numbers and said, "Yeah, you probably gave yourself a 50% discount."]
INTERDEPENDENCE IS NOT OBVIOUS
American companies go to great lengths to make systemic interdependencies transparent to the end-user (the consumer). In a grocery store, for example, the customer is unaware of all the happenings taking place in shipping, inventory, restocking, financials, security and so on. These things are intentionally hidden from the customer. The stock room even says, "Employees only," meaning "Customer, you're not supposed to see all this." It's not a conspiracy, it's simply good customer service. You don't distract the customer with details they don't need to know.
The same holds true with any shipping company -- like Fedex. Fedex does an amazingly good job of removing highly-automated, complex requirements from the customer. The customer doesn't see the web of computers keeping the operation going: they don't see the jets, the trucks, the loaders and unloaders. They don't see the mechanics, the safety personnel, administrators and managers. They simply see a package. Here it is! Next-day! (Yet the chain of events required to make that letter appear in your hands is enormously complex and highly-automated.)
In this way, most people are unaware of the complex, systemic nature of our society. This makes it extraordinarily easy for people to suppose society really isn't all that complex, thus putting them in the Y2K Deniers camp.
It is seductively easy to be a Y2K Denier -- all that's necessary is a naïve acceptance of the way things appear rather than the way things are. It only takes an acceptance of the surface, the pretty package, the illusion that companies want to present to customers. Again, this isn't a conspiracy, it's simply good business. If you're selling diamonds, you present the image of luxury, not the image of poverty-wage workers slaving away underground in South Africa, digging through tons of dirt and rock to find a single shiny stone. One of these is the illusion, the other is the reality (or, at least, the rest of the story).
It is easy to observe the surface image and assume it "real." This passive act requires no mental effort whatsoever. More challenging, however, is to examine the hidden inner-workings of our modern civilization: the interconnections between financial institutions, the computer processing that feeds data to the radar scopes in the twenty en route air traffic control centers, the record-keeping in law-enforcement, the high-speed communications that keep products on order for Walmart. It is not easy to identify these things, but it is very educational to do so.
The public's transparent interactions with computers are the fulcrum of the Y2K debate. How many computers do we depend on, anyway?
The obvious (but incorrect) answer is that we don't depend on them all that much. This is part of the Y2K Deniers' argument. Most people don't think they depend on computers unless they are actually sitting in front of one and using it. The rest of the time, they believe, computers don't matter that much.
But this is demonstrably incorrect. Take just one hour out of an average day for a white-collar worker. We will list his actions followed by a few of the computer systems upon which those actions depend.
Tom wakes up to the alarm clock: date circuits in the alarm clock, power generating control systems, power distribution systems, telecommunications systems between power plants, safety sensors and systems
Tom takes a shower: embedded systems in the water treatment facility, rate of flow sensors, fluoride saturation equipment, safety sensors, distribution-control systems, natural gas flow control systems, safety systems and distribution systems (for the hot water in his shower)
Tom eats breakfast: farm planning satellite systems, farm equipment, fuel for farm equipment (depends on embedded systems in oil rigs, transportation systems, navigation systems, refineries), food processing equipment, food inventory facilities, food delivery, telecommunications for handling food orders, inventory computers at food processing plants, electricity for everything here
Tom catches fifteen minutes of news: satellite communications systems, broadcast station computers, news office computer systems, cell phone, fax machines, phone communications systems
Tom walks out of his house and to his car, feeling safe because he's in a safe neighborhood: 911 systems, telecommunications systems, police radios, law enforcement vehicles, criminal records computer systems, jail records systems, court records systems
Tom drives to work: traffic light control systems, fuel for the car (and all the dependent systems mentioned above), vehicle systems (brakes, engine & maintenance control, etc.), local and federal roadways infrastructure, road workers, construction companies: scheduling systems, accounting systems, delivery of raw materials, steel refineries (for elevated highways and bridges)
Tom walks into his office: building security systems, electricity for lighting, telecommunications, fire response systems
Tom fires up his computer and begins to work (he's a stock broker): local computer systems, network hardware (routers, switches, etc.), telecommunications and data lines, stock exchange computer systems, bank computers and funds-transfer systems
Of course, this just scratches the surface. To someone looking at the simple explanation of what happened here, computers weren't really involved: Tom simply woke up, took a shower, ate breakfast and drove to work. Big deal, right?
But, in fact, Tom depended on literally thousands of automated systems.
Now, taking the original question of a 2.5% failure rate, what happens to Tom if 2.5% of these systems fail?
WHAT IS A "SYSTEM"?
Central to this question is the definition of the term "system." A city's 911 emergency response system, for example, normally consists of dozens of computers interacting with the phone lines to route and handle a near-constant stream of incoming calls. If you call "911" a "system" all by itself, then a 2.5% failure rate doesn't seem like a worst-case deal. After all, if 2.5% of the 911 systems in the country fail all at once, it's not the end of the world. (Bad, yes, but not doomsday.)
However, this is once again oversimplified. Any 911 installation actually consists of multiple computer systems. One system routes incoming calls, another handles tracking and logging, a third handles outbound communications with law enforcement personnel, and so on. Suppose a 911 installation has ten interdependent computer systems (or embedded systems), each with a 2.5% chance of failing. What is the overall chance of this 911 installation being unable to operate?
The chance is (1-(1-2.5%)^10), or about a 22% chance of the installation failing. Suddenly the 2.5% "system" failure rate balloons to an overall 22% chance of the installation going down, thanks to the interdependencies of the internal systems.
This point, too, is largely missed by Y2K Deniers. In fact, in our experience, not one in ten Y2K Deniers understands the math example given above. (Notably, if computers fail and people are forced to go back to manual, they will be required to perform calculations far more complex than the one given above.)
Now, instead of 2.5% of the 911 systems failing across the country, we have 22% failing -- nearly a quarter of the systems in question. This is a much bigger problem. Y2K problems tend to "gang up" on you: they're cumulative. One city in a riot can be handled. Thirty cities in a riot is simply beyond control. You would simply have to wait it out.
Some systems are far more complex than the 10-system 911 example given above. Fuel refining and distribution, for example, depends on hundreds of systems. How much larger are the odds of failure in a system where a hundred computers must all operate correctly? The answer is (1-(1-2.5%)^100), or a 92% chance of failure.
Again, if given the odds of any single system failing at 2.5%, plus a "chain" of 100 systems that must all work correctly, the odds of the entire chain failing are very high: 92%.
At the same time, if the chain is short, the odds of failure are very low. The natural gas distribution systems, for example, contain fewer systems than oil refinery and distribution. If you assume a natural gas system only has five computer systems between the gas well and your house, your odds of success are considerably higher: 88%, with only a 12% chance of the system failing (once again assuming the 2.5% individual system failure rate).
The effect of the 2.5% "failure" rate, then, can only really be determined once we know the number of internal systems taking part in any particular "chain." The longer the chain, the higher the chance of failure. The shorter the chain, the higher the chance of success.
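The chain formula above can be sketched in a few lines of code. This assumes each system fails independently with the same 2.5% probability -- an assumption, since real-world failures are surely correlated:

```python
def chain_failure(p: float, n: int) -> float:
    """Chance that at least one of n independent systems fails,
    given each fails with probability p."""
    return 1 - (1 - p) ** n

p = 0.025  # the article's assumed per-system failure rate
print(f"911 installation, 10 systems: {chain_failure(p, 10):.0%}")   # ~22%
print(f"fuel chain, 100 systems:      {chain_failure(p, 100):.0%}")  # ~92%
print(f"gas chain, 5 systems:         {chain_failure(p, 5):.0%}")    # ~12%
```

The exponent is what does the damage: a small per-system rate compounds rapidly as the chain lengthens.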
Notably, most chains have "power" as one of their links, meaning that if power is lost in any particular area, it breaks nearly every chain worth considering. That's the Domino Effect in action. If you lose power for more than just a few days, you lose nearly everything in our economy and infrastructure.
BUT WHAT ABOUT TOM?
Back to Tom's day: let's backtrack for a minute. We know Tom depends on thousands of computer systems just to wake up and get to work. We're assuming that 2.5% of those systems will fail, and we know this means some "major" systems have a much larger chance of failure due to the OR probabilities described above.
What will Tom's day be like?
For this demonstration, we'll give the Y2K Deniers a key assumption: we'll assume there are no power problems whatsoever.
Tom wakes up to the alarm clock: the alarm clock and the power all work. (Whew!)
Tom takes a shower: the shower works because we'll also assume natural gas is working. The water treatment plant works, too, but unknown both to Tom and the water treatment facility, a safety valve of a manufacturing plant failed upstream, polluting the city's water with an unsafe level of PCBs. Neither Tom nor the rest of the population are aware they're being poisoned, and they will probably never know.
Tom eats breakfast: Tom finds himself out of food because the food distribution system has failed. He also didn't bother to pick up any extra, figuring the whole Y2K thing was a hoax. Tom goes hungry today, but he still wants to go to work because he knows he'll need that paycheck to buy food in the over-priced underground food market.
Tom catches fifteen minutes of news: We'll assume the news system works. On the news, he sees stories about the breakdown of food deliveries.
Tom walks out of his house and to his car: Unfortunately, the 911 system has failed. A pickup truck screeches to a halt directly in front of his house, six thugs hop out and quickly surround him. "Your keys and your wallet," one of them says. He hands it over and they leave, now with the pickup truck and his car. He goes back inside to call 911 and gets a busy signal. Notably, his phone service is working just fine.
Tom still needs to get to work, so he rides with a neighbor, figuring he'll stop by the police station to report his car stolen: Fortunately, the traffic lights work. He gets to the station and finds a mob of angry citizens. A lone police officer is holding them back, blocking the front door with his body and urging them to calm down. Tom asks somebody what the mob is all about. "I don't know," the man answers. "I just know my apartment was looted, and I need to file a report..."
Tom finally gets to his office and finds most things in working order: his computer even boots up correctly, and the stock ticker prices are scrolling to the left like always. But when he tries to place a trade, he runs into another glitch. "The exchange is temporarily shut down," he's told. But it has nothing to do with Y2K, they assure him. The phone begins to ring: his clients are telling him to "sell everything" once the exchange opens. The food shortage apparently convinced them the Y2K problem was real. Tom sits back, wide-eyed, knowing that when everybody wants to sell, prices plummet. He'll be lucky to get 1% of his clients out near the top.
This is but one hour out of Tom's day, given a 2.5% system failure rate that ricochets through the system, taking down some systems while leaving others intact. Now imagine Tom's next thirty days. Imagine the incorrect phone bills, the attempts to get through to his mutual fund company, the payroll system problems, and a thousand other minor (but cumulative) glitches. Imagine Tom trying to get reimbursed for his stolen car: the insurance company is struggling under a mountain of claims, trying to deny them all by claiming "Y2K" as the root cause. Imagine Tom watching his neighbors lose their jobs because their employers went bankrupt under the burden of non-stop, pile-it-on glitches. Imagine what happens when Tom can't get to work because he has no transportation. Will he be able to hold on to his job?
All this, by the way, assumes the lights stay on. Given the well-documented efforts by the NRC and NERC to mislead the American public about the Y2K status of the power industry, this is a significant assumption. If electric utilities are compliant, why will none actually say so? With NERC confirming that "compliant" is a higher standard than "ready," and with every single electric utility in the country now using the word, "ready," we are only left with the somewhat-vague idea that electric utilities are something-less-than-compliant. To assume that every city will have non-interrupted power requires a great leap of faith, no doubt.
2.5% IS A 6 OR 7
On the scale of 1 - 10, where a '1' is the no-big-deal scenario and a '10' is the worst-case, doomsday scenario, Y2KNEWSWIRE submits that an overall failure rate of 2.5% of all hardware or software systems is a '6' or '7.' This describes a scenario where problems are widespread and serious, where the economic disruption is severe and long-term, and where millions of Americans end up losing their jobs, their houses and their savings.
A '7' does not indicate the collapse of modern civilization, the government, the economy, or law and order. A '7' is easily survivable (in the physical sense) by making prudent preparations. It is, however, rather difficult to prepare for the long-term economic impact of a '7.'
BUT WE CAN POSSIBLY BEAT THE '7'
Should a compliance rate of, say, 99% be reached, the unfolding scenario could perhaps be limited to a 4 or 5. If 99.8% were reached, it might be a 1 or 2. With 100% compliance, of course, the problem disappears. Here's the Y2KNEWSWIRE "Chart of Severity" that attempts to equate a system failure rate with a potential outcome:
0 - No effect
1 - Minor glitches
2 - Small disruptions, 72-hour scenario
3 - Medium disruptions, exceeds the three-day scenario
4 - Medium, widespread disruptions
5 - Severe, local disruptions, interruptions in major infrastructure
6 - Severe, regional disruptions, medium-term interruptions in major infrastructure
7 - Severe, national disruptions, medium-term interruptions in major infrastructure, serious economic impact
8 - Severe, national disruptions, long-term interruptions in major infrastructure, severe economic impact
9 - Collapse of some infrastructure, severe global disruptions, severe long-term economic impact
10 - Collapse of modern society, collapse of governments, economies, international trade, long-term loss of most infrastructure
This chart shows that things begin to get out of hand very quickly when the failure rate goes above 2.5%. Although the chart is, of course, only an educated guess, we believe it accurately models the cumulative effects of multiple failures.
"Y2K IS SYSTEMIC"
No person has repeated this phrase more often than Dr. Gary North, but the core meaning of it seems to be lost on Y2K Deniers. "Y2K is systemic" sums up the point of this article, and in technical terms, it really means this: that due to the computer-based interconnections in modern society, a failure rate of 20% does not equal a 20% reduction in the combined ability of the economy and infrastructure to deliver goods and services. Rather, a 20% failure rate equals a collapse of this ability.
At the same time, a failure rate of 1% does not equal a 1% reduction in this economic / infrastructure ability. Rather, a 1% failure rate leads to "medium, widespread disruptions."
"Y2K is systemic" is perhaps the point that is most-often missed by those who deny the reality of the Y2K problem. To them, "Y2K is systemic" is simply three words strung together rather than a description of the state of affairs in our modern civilization.
The point can be argued, of course, but if you've been following Y2K for very long, you already know this: the argument against Y2K problems is rarely framed in terms of whether modern civilization is systemic. The common arguments go like this, "Y2K can't be bad because companies wouldn't let it be bad." (Notice the lack of addressing the systemic nature of our economy.)
Another argument: "Y2K can't be bad because human beings can always overcome." (Again, the Denier is failing to address the systemic nature of our economy and infrastructure.) A more intelligent argument by a Y2K Denier would be this one: "Y2K won't be bad because no threat to our modern civilization is systemic." This is thoughtful, of course, but wrong. It is analogous to stating, "Nothing is connected, therefore Y2K can't be bad." This argument, although wrong, at least addresses the point of the interconnections.
Any argument that fails to address the systemic nature of Y2K is missing the boat. Without a discussion of the interdependencies, all such "debate" is rather meaningless. Those three innocent-looking words, "Y2K is systemic," capture the nature of the problem better than any string of arguments, logic-based or otherwise.
Y2K is systemic.
To any friend, colleague or coworker who doesn't believe Y2K might be a problem, pose this challenge: prove that Y2K is not systemic. Because faced with a 2.5% system failure rate, the only way we're all going to get out of this without suffering some damage is if the problem remains isolated in those systems.
BOTTOM LINE
If 100% of the computers are claimed to be Y2K-compliant by December 31, 1999, given that up to 50% of those claims are bogus today, what are the odds that at least 2.5% of the systems in question will fail on January 1, 2000? (Bonus question: if at least 2.5% of the systems fail on January 1, what happens to the economy and the stock market?)