
Computers: A Trend From The Past, Continued in the Present, and the Gateway to the Future

This is an end-of-the-year final essay I wrote for English 2 (sophomore English). The names of the people interviewed have been reduced to first initials for the sake of privacy. (Written 3/15/00)


Years ago, it invaded our lives and came into our homes. Since its creation, it has become such an integral part of our everyday life that many of us could not imagine an existence without it. This contraption has given us a world of opportunity. It has been one of the greatest achievements leading us into the information age. While keeping us glued in front of a glowing screen, it has allowed us to travel to places and talk to people clear across the world. It has brought us out of a slow-paced past, into a fast-paced present, and will carry us into the technological marvels of the future. What is this wondrous machine? What is this contraption that is more a part of our lives now than many people in the past ever thought it could be? The computer, which has been, is, and will continue to be an overall positive influence on our lives.

It all started in 2000 B.C., in a place far away, with a little computing device called the abacus. “The abacus was basically the first machine to be used for computations” (Swanson). But let’s not go that far back. It’s been a long time since 2000 B.C., and frankly, the abacus seems a bit trivial compared to the accomplishments of the ENIAC and UNIVAC. The ENIAC was the first general-purpose electronic computer, “able to perform calculations 1,000 times faster than its electromechanical predecessors” (Elmer-DeWitt, 63). It was developed to perform trajectory calculations, and it could produce results in a matter of mere seconds, whereas before it may have taken days to get an answer. One drawback to the ENIAC was its size and the effort involved in maintaining it. “It weighed 30 tons and occupied a space as large as a boxcar. Its 40 modular memory and processing units, each housed in a 9-foot-high black metal cabinet, and bristling with dials, wires and indicator lights, filled a room the size of a small gymnasium. Its 18,000 vacuum tubes radiated so much heat that industrial cooling fans were needed to keep its circuitry from melting down” (Elmer-DeWitt, 63). Just consider for a moment how many hours of upkeep that really is. Thousands of hours were likely spent maintaining the ENIAC over its lifetime. But without this vital step forward in our technological history, there would be no way for us to have the type of computers we have now. There may not even be computers today if it weren’t for the ENIAC. Although we may think it old and outdated, with a vacuum tube burning out and needing replacement every two hours, it was the “technological wonder of its day” (Elmer-DeWitt, 63). We owe quite a bit to this marvelous and complex piece of machinery. Without it, there would never have been the UNIVAC, the next step in computing and the first commercial computer; there would never have been BASIC, one of the first widely used programming languages, or DOS, an early operating system for personal computers; we would never have experienced the joy of Windows, Apple computers, computer mice, or portable, affordable, personal computers. The ENIAC started it all.

After the ENIAC was created, scientists and engineers immediately began trying to improve the computer and make it available to the everyday user. The first step was to develop software that ordinary people could learn to use. The BASIC programming language began this trend, followed by the DOS operating system. “Many people learned the BASIC language because it was included with the PC” (Miller, 112). The same goes for the later DOS. These may have been wonderful in their time, but many were thrilled when Windows came along and they no longer had to remember a horrendously large number of commands. Next came the computer mouse, the most marvelous thing many had ever seen when it debuted at a computer conference in 1968. “The original mouse was a polished wooden box fitted over two desktop-skating flywheels” (Kanaley, 1). Douglas C. Engelbart is the man credited with inventing the mouse. He originally named it the “X-Y Position Indicator for a Display System.” Thankfully, one of Engelbart’s co-workers had the decency to come up with the nickname “mouse.” Just imagine saying, “Click the X-Y Position Indicator’s left button,” instead of “Click the left mouse button.” That unnamed fellow deserves some credit, too, although the “XYPI” does have a certain ring to it. “Today it sounds like an obvious thing to do. But at the time, it was a major leap forward,” said Guerrino De Luca, president and chief executive of Logitech (Kanaley, 1). The computer mouse opened up a whole new window in computing, literally. It allowed for a pointer on the screen; with this pointer one could navigate easily around the newly developed Windows, with its pull-down menus and many other fascinating innovations.

Some of these innovations included personal computers affordable enough for ordinary people to buy once they reached the market. The invention of the Macintosh (often known as just plain “Mac”) and the printer were major steps forward as well. And without the invention of the point-contact transistor to replace the vacuum tubes originally used in computers, we might still be changing parts every two hours. “In the early 1980’s, a ‘portable’ PC meant a mere 28 pounds” (Miller, 112). No need to go to the gym, just haul your PC around with you all day! The invention of the microprocessor at Intel, the company Robert Noyce co-founded, compacted the computer considerably. Computers went from the big and bulky ENIAC to more compact, home-friendly machines. A sign hung at the Moore School near old computer parts pretty much sums it all up: “In less than 40 years, advances in microelectronics technology has enabled the digital computer with performance far superior to the ENIAC to be placed on a 1-quarter-inch piece of silicon” (Elmer-DeWitt, 63). This reduction in size allowed for a “reduction in cost that has been the main factor in making possible the production of simple computers for use in homes and schools” (Information Processing and Information Systems, 565). Some feel it was the “Macintosh that paved the way for the digital revolution” (Fenton, 116). Nicholas Negroponte, director of MIT’s Media Laboratory, certainly seems to think Macs are great. He says, “The Mac changed the course of computing with its interface. It was so easy to use and so dreadfully obvious that the first thing you did with the manual was throw it away” (116). Some critics didn’t appreciate the Macintosh as much, calling it “the world’s most expensive Etch-A-Sketch” (116). If only they could have foreseen what something as simple as Apple’s Macintosh could lead to.

Windows could be considered the next big step in computing. It gave us the kind of easy interaction between computer and user that we still rely on today. It is far simpler than DOS ever was, not to mention faster, with more options to choose from. “Windows 95 allowed for full 32-bit applications, had preemptive multitasking, was Plug and Play compatible, supported new e-mail and communication standards, and featured a new user interface” (Miller, 112). It was truly a major leap forward from the ENIAC and various other predecessors. Now we can have menus and options at the click of a mouse, instead of needing two days to program one thing on the ENIAC. People were certainly thrilled with the new operating system. They took to Windows like ducks to water, with consumers buying one million copies in just the first four days after its release.

Computers are continually being improved upon and delivered to the public at a faster and faster pace. This period, from when a computer is designed and manufactured to when it reaches consumers, is known as “time to market.” Time to market is growing shorter and shorter due to domestic competition and pressure from foreign markets. And not only is time to market shrinking; the computers themselves are getting faster. “Computers are running about 10 times faster than they were 5 years ago” (K--). That is quite an improvement over the slug-paced ancestors of today’s modern computers. Engineers at Intel are working in the labs right now with a CPU (central processing unit) known as Willamette that runs at 1,500 megahertz (1.5 gigahertz). And if Moore’s Law holds true, the speed of computers will continue to increase dramatically in the next few years. (Moore’s Law is popularly taken to mean that CPU speed doubles every 1.5 years; strictly speaking, it observes that the number of transistors on a chip doubles roughly every eighteen months to two years.)
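As a rough back-of-the-envelope check on those figures, here is a minimal sketch of the doubling arithmetic in Python. The 1.5-year doubling period and the 1,500-megahertz Willamette figure are the essay's own numbers, and projected_speed is just an illustrative helper name, not anything drawn from the sources cited here.

    # A minimal sketch of the doubling arithmetic described above.
    # The 1.5-year doubling period is the essay's popularized figure,
    # not a precise statement of Moore's Law.
    def projected_speed(base_mhz, years, doubling_period=1.5):
        """Project clock speed, assuming it doubles every doubling_period years."""
        return base_mhz * 2 ** (years / doubling_period)

    # Check against the interview quote: over 5 years,
    # 2 ** (5 / 1.5) is about 10.1 -- close to "about 10 times faster."
    print(round(2 ** (5 / 1.5), 1))  # prints 10.1

    # Projecting forward from the 1,500 MHz (1.5 GHz) Willamette prototype:
    for years in (1.5, 3.0, 4.5):
        print(years, "years:", round(projected_speed(1500, years)), "MHz")
    # prints 3000 MHz, 6000 MHz, and 12000 MHz respectively

On the essay's stated figures, then, the quoted tenfold speedup over five years is almost exactly what a 1.5-year doubling period predicts.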

Computers in the workplace have advanced over the years as well. For example, at Ford they use computers for “communication, data processing, data transfer, running the equipment, report writing, programming, time keeping, booking rooms for meetings, and planning job schedules” (B--). Computer Aided Engineering (CAE) is also a big deal in a lot of places. More than likely, places other than Ford use CAE every day as well: businesses in computer development, machine engineering, electronics, and so on. Computers have become such an integral part of business life that many companies must close if their machines go offline. But computers in the workplace are not a new thing. N-- B--, a Ford engineer, recalls, “I’ve been working on the d*mn things since I went to uni in 1982, way back when the WWW didn’t exist and the internet was for computer geeks at universities and research labs only” (Interview). Even though computers have been operating in places of work and design for a long time, it wasn’t until the 1990s that this trend really started to pick up. Some people have extreme feelings one way or the other about how much good computers are really doing us. Not everyone enjoyed the decade of computers and technological advancements. Although these people are entitled to their opinions, they will be left behind in the future. John Dvorak has stated, “The 1990’s won’t be remembered for much that is positive. Good riddance to that decade, which I look back on as a decade of fear and cowardice. That fear and cowardice were brought on in part by computer technology” (Dvorak, 103). Many people, themselves adoring the computer revolution of the nineties, would consider this opinion crazy. But do the cons really outweigh the pros? No way! More than anything else, computers have given us hope for opportunity and a more high-tech future.

Maybe one of the biggest crazes of the 1990s was the internet. It has been noted that “In the ‘90’s, surfing the Net was the nation’s favorite legal vice” (The Best and Worst of the Decade). And it seems this is a trend that has caught on around the world. The internet has become a multilingual, multicultural, and generally opinionated pastime. The internet itself has been around since the late 1960s, but it wasn’t until the invention of HTML (Hypertext Markup Language) and the Mosaic browser that it really became accessible. “In 1993, Mosaic was released to the public and made the Internet, particularly the Web, available to virtually anyone with a personal computer” (Miller, 112). And it wasn’t until the mid-1990s that people really started flocking to this worldwide source of entertainment and education. But once they did, demand for internet-capable computers boomed around the globe.

The internet was not the only thing driving computer sales; so were new applications that promised to make people’s lives easier and more fun. Speech recognition is becoming quite a big thing now; almost every computer store that sells software carries voice recognition applications. “Real good speech recognition should be available within the next 2 years. This means... voice control” (K--). Imagine being able to work around the house and surf the internet at the same time. That would be a major time-saver! Or being able to shuffle through documents while the computer writes down what you need; now that would be nice! Hands-free computing is just around the corner, and people are already rushing to be the first to have it. 3D graphics are another area advancing at an astounding rate. Soon, we will not be able to tell applications using 3D graphics apart from actual movies. It has also been noted that “TV and PC’s are rapidly converging into a single device” (Howard, 158). Just think of what this could mean: voice-controlled computer games, played on your television, that look so real they could pass for film. Computer stores will be mobbed for applications and devices that allow this type of computing.

A lot of people have high hopes for computers becoming integrated into home life in the future. Bill Howard believes, “Homes in the future will also have several more systems, including a desktop or notebook machine with easy expandability and automatic configuration, a ‘kitchen PC’ acting as a phone and answering machine, and a telephone/PC/organizer in other rooms. Even the family car will be PC-equipped...” (158). These are great expectations that will probably become reality in the next few decades. Cars, for example, are already becoming “PC-equipped.” And think of all the advantages of having a home run by computers: no more worrying about missed messages, no more being disorganized, and no more not understanding how to program something. Obviously, if computers are going to be such an important part of household life, they will have to be easy to program. “...You’ll actually have a VCR that you’ll know how to program -- no more blinking 12:00...12:00...12:00...” (158). Won’t that be nice? We might not have to cover it up with duct tape anymore. Howard also thinks that “in 2012 there might possibly be a bullet proof, high-reliability PC that will be the brains of the house, tucked away in the basement or where the phone lines come in” (158). Now, wouldn’t it be convenient to simply tell the computer about a problem and have it take care of things, rather than having to hunt down the source and fix it yourself? A lot of people think so. If you lacked the knowledge to handle electrical problems, for example, and lacked the money to hire someone, you would have the computer to fix them for you. Problem solved. In the future, multiple computers will become commonplace in the home. “In 15 years, home PC’s will be just like peanuts: You won’t be able to stop with just one” (158). And they might actually be affordable enough for this to come true. But there are always those few people who refuse to buy something new, such as a computer, without first knowing it will be useful to them. “Hardware is cool to techno-weenies, but most people want to know what useful tasks they can actually do with it” (158). Well, for those of you who must know, just imagine having your house cleaned for you, food cooked perfectly, groceries delivered right to your door... all done by computers.

Computers have advanced us from the Stone Age to today’s awe-inspiring technology. They have found their way into our day-to-day lives, and they are not likely to leave in the near future. People have reacted to the presence of computers in many ways, though most have been amazed and satisfied with the machine. While a few still resent the computer, they are becoming fewer and fewer every day. More people are realizing the many benefits we gain from using these contraptions, and how much simpler our lives have become. People see the computer as our way out of a slow-paced past, our link to the present, and our gateway into the future. Many have high hopes for what this technology will be able to do for us in the years to come. “((Geek on)) I’m hoping that computer technology will allow me to man the quad laser cannon of a Corellian YT-1300 Light Freighter. ((Geek off))” (K--). And then there are those among us who are just plain weird.


Works Cited:

B--, N--. E-mail interview. 16 Feb. 2000.

Dvorak, John C. “Decade of Fear and Cowardice.” PC Magazine. Feb. 2000: 103.

Elmer-DeWitt, Philip. “A Birthday Party for ENIAC; Remembering the Granddaddy of Modern Computers.” Time. Feb. 1986: 63.

Fenton, Matthew McCann. “Big Mac Attack.” Entertainment Weekly. Jan. 1999: 116.

Howard, Bill. “Future Home Computers.” PC Magazine. Mar. 1997: 158.

“Information Processing and Information Systems.” The New Encyclopedia Britannica. 15th ed.

Kanaley, Reid. “A Mouse’s Tale: 30 Years Ago, An Idea Clicked.” Knight-Ridder/Tribune News Service. Dec. 1998.

K--, C--. E-mail interview. 26 Feb. 2000.

Miller, Michael J. “1982-83: The Early Years.” PC Magazine. Mar. 1997: 112.

Swanson, Carl Ballard. “Milestones in Computer Development.” Website. 15 Feb. 2000.

