The History of Computers

The History of Computers documentary from the TV show Modern Marvels was an interesting and informative perspective on the past, present, and potential future of computer technology as of its airing in 2001. I always find it a bit ironic when ‘modern’ technology (be it real or animated future tech) is presented in movies or videos, as this tech is always outdated in hindsight. The technology presented in this educational, historical video shows the personal computer as a cube-screen monitor with a chunky keyboard, alongside styluses and early phones with internet access. It is thought-provoking that the documentary mentions a possible future “world wide communication grid,” since at that time social media was not yet universal: networks like Myspace, Facebook, Twitter, Instagram, and Snapchat had not even been invented. The documentary stated that “soon half the jobs of America will use computers,” which is a funny statement now, as most jobs probably use a computer. Its view of future tech was extremely optimistic, mentioning that computers would help guide space travel and take us to distant galaxies; in reality, most NASA travel has slowed down due to expenses, though the first crewed Mars One flight has been proposed for 2031 (!).

Acquiring knowledge about the different parts of a computer was fascinating, as I had never taken a computer science class or had prior knowledge of the subject. Hardware is the physical equipment, such as the monitor and hard drive, while software is the set of instructions the computer follows; beforehand, these terms seemed interchangeable to me, or like really complicated concepts. The keyboard provides input, the input is processed by the CPU (the computer’s brain), and the instructions the CPU follows are provided by software. CPU speed is measured in MIPS (millions of instructions per second), and data is stored in RAM (random access memory) or on the hard drive; the CPU’s results then appear on the monitor almost instantly.
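
To make this flow concrete for myself, here is a minimal sketch (my own illustration in Python, not anything from the documentary, and all the names are made up) of the keyboard → CPU → RAM → monitor loop described above:

```python
# A toy model of the input/processing/output flow; all names are hypothetical.

ram = {}  # RAM: fast, temporary storage for whatever the "CPU" is working on

def cpu_process(keystrokes):
    """Stand-in for the CPU: follow a software instruction (uppercase the text)."""
    return keystrokes.upper()

user_input = input("Type something: ")   # the keyboard provides input
ram["result"] = cpu_process(user_input)  # the CPU processes it per the software
print(ram["result"])                     # the result appears on the monitor
```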

When thinking about computers, I always assumed that the term “computer” originated from computing and doing math problems, which is somewhat true, but it mainly comes from the tables critical for mathematical computation during the Industrial Revolution. “Computers” actually referred to the people who did this mathematical computation as a job, and their work was of course full of human errors. Learning about the roots of the computer age, I had never known of Charles Babbage and his government-supported mechanical computer, the Difference Engine. It was fascinating to learn of Ada Lovelace, Lord Byron’s daughter and the “world’s first programmer,” who worked with Babbage and described what a computer could do; Lovelace, however, did not program a computer herself. The drive to make a functional computer was based on finding an easier and faster way to record the US census. It was crazy to think that a woman was still recording the 1880 census in 1887, due to the lack of technology. I had never known that Hollerith sold his company, which under Thomas Watson became International Business Machines (IBM), a seller of business equipment. I had always heard about the Enigma (especially after The Imitation Game), but I had not known the details of Alan Turing and the codebreaking machines, including the Colossus, that broke the German codes. It was impressive that after breaking the code, the codebreakers had to guard their knowledge effectively, never letting on that they could read the German messages while still reacting to them in their plans.

I had always thought that large programmable computers originated during World War II, so I was relatively shocked to learn that Mauchly’s ENIAC was not finished until 1946, after the war. It’s funny to think that the software terms “bug” and “debug” came from the Mark II computer failing because a moth was found in the system and had to be removed. A leader in the modern computer was John von Neumann, an advisor on the UNIVAC with a photographic memory, who wrote about the structure desired for the modern computer. These aspects included the processing unit, the control unit, memory, input, and output. Von Neumann’s progressive idea was to hold the computer’s program internally, which gave the machine power and versatility, as it allowed it to modify what it does based on data or the results of computations (a simple sketch of this idea appears below). This paper by von Neumann spurred computer production.

With the combination of companies, the UNIVAC project (backed by the Census Bureau) and Remington Rand (a typewriter company) built an entire computer system designed for business, with automatically printed results, which became the first commercial computer produced in the United States. Perceptions and sales of computers changed dramatically following the 1952 election, when UNIVAC predicted Eisenhower as the winner with an accuracy within less than one percent of the final result. The data-processing systems of large companies depended heavily on the office machines of IBM, the monolith of the industry. However, it is noteworthy that IBM did not sell computers itself at first, due to the staggering cost of building them; IBM only started to build computers once Watson’s son took over the company, and by the early 1960s its computers dominated the business.

In 1947, Bell Labs invented the transistor, which replaced vacuum tubes since transistors produced less heat, cost less, and took up less space. The integrated circuit (or chip), a set of electronic circuits on semiconductor material (usually silicon), then replaced discrete transistors and helped lead American space travel, as ten thousand in total were used to leave Earth. Ted Hoff created the microprocessor, the circuit that makes today’s personal computers possible. Steve Wozniak and Steve Jobs were joined by Mike Markkula (formerly of Intel) in 1976 to sell the Apple II, a computer that sold well but did not have a graphical interface. At Xerox PARC, Robert Taylor created a personal computer, the Alto, which had a mouse, a graphical interface, built-in networking, and printing on a laser printer. Jobs was inspired by the Alto and created the first popular graphical personal computer, the Macintosh. The Macintosh was easy to use thanks to its operating system and applications, together known as software. Bill Gates, president of Microsoft, created a software empire, which made him the richest man in the world.
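
Returning to von Neumann’s stored-program idea, the sketch below (my own illustration in Python, not anything from the documentary) shows a toy machine whose program lives in the same memory as its data and is run by a simple fetch-and-execute loop; because the instructions sit in ordinary memory, such a machine could in principle even rewrite them, which is exactly the versatility described above:

```python
# A toy stored-program machine: instructions and data share one memory,
# and a fetch-decode-execute loop runs them one at a time.
# This is an illustrative sketch, not a design shown in the documentary.

memory = [
    ("LOAD", 7),      # put the number 7 into the accumulator
    ("ADD", 5),       # add 5 to it
    ("PRINT", None),  # show the result on the "monitor"
    ("HALT", None),   # stop the machine
]

accumulator = 0  # the processing unit's single working register
pc = 0           # program counter: which memory cell to fetch next

while True:
    opcode, operand = memory[pc]  # fetch and decode the next instruction
    pc += 1
    if opcode == "LOAD":
        accumulator = operand
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "PRINT":
        print(accumulator)  # prints 12
    elif opcode == "HALT":
        break
```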