Who Invented the Computer?
The computer was born not in a single moment of genius; rather, it evolved from years of engineering and discovery. Some innovations spread across continents, while others died in isolation.
One of the early ancestors of the modern computer was Charles Babbage's Analytical Engine. In 1834, Babbage, a British mathematician and inventor, envisioned a machine that could be programmed using punched cards, had temporary storage for numbers as it worked (memory), and had a separate unit to do calculations (central processing unit).
Although the manufacturing capabilities of his day were too crude to build the Analytical Engine, Babbage kept refining its designs. The machine was also capable of “if-then” logic and could loop through repeated calculations. However, it was Ada Lovelace, not Babbage, who recognized the Analytical Engine's full potential. In 1842, Babbage asked Lovelace to translate a French paper about his machine into English. Lovelace added notes, longer than the original paper, detailing how the Analytical Engine could be programmed to carry out algorithms. Unfortunately, Babbage and Lovelace's advances were largely lost to the next generation of inventors.
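To give a sense of what those two ingredients – repetition and “if-then” decisions – make possible, here is a minimal modern sketch in Python. It is purely an illustration; it is not Babbage's notation or Lovelace's actual algorithm.

```python
# A modern illustration of programmed "if-then" logic plus looping --
# not Babbage's notation, and not Lovelace's actual algorithm.
def sum_of_even_squares(limit):
    """Add up the squares of the even numbers from 1 to `limit`."""
    total = 0
    for n in range(1, limit + 1):   # the loop: repeat the same steps
        if n % 2 == 0:              # the "if-then" test at each step
            total += n * n
    return total

print(sum_of_even_squares(10))  # 4 + 16 + 36 + 64 + 100 = 220
```

The point is not the particular sum but the structure: one set of instructions that repeats itself and changes course based on intermediate results, which is what Lovelace argued the Analytical Engine could do with instructions punched on cards.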
War and computers
Europe's world wars fueled an engineering frenzy that produced many crucial advances in the development of computers. In 1936, a German engineer named Konrad Zuse began building machines to simplify his job. His Z3 model is considered the earliest fully functioning, program-controlled computer, according to the Computer History Museum in Mountain View, Calif. But Zuse's first patent applications for the Z3 were rejected as too vague, and his early machines were destroyed during World War II.
While Zuse worked in isolation, a young British mathematician named Alan Turing wrote a paper that caught attention overseas. Turing's aim was to show that there is no systematic method for deciding which mathematical statements are provable. To make the point, Turing described a hypothetical calculating machine that also laid down fundamental concepts of modern computer science.
Across the Atlantic, the U.S. Army funded an engineering team at the University of Pennsylvania to build a machine that could speed up ballistics calculations. The result was the world's first general-purpose electronic digital computer. The Electronic Numerical Integrator and Computer, or ENIAC, was fast, precise and powerful.
“In about two weeks, during which we had maybe a dozen hours of actual calculation, we did the work it would have taken a man with a desk calculator 100 years,” co-inventor of the ENIAC J. Presper Eckert said in an interview kept by the Smithsonian Institution.
Eckert and John Mauchly, the man leading the ENIAC team, filed for patents on the computer. The pair created a company and sold it to Remington Rand Corporation in 1950. But by then universities around the world were already building their own computers.
In 1945, mathematician John von Neumann wrote a paper outlining the design of the ENIAC's successor, the EDVAC, which would store its program in memory. Von Neumann later headed a project at the Institute for Advanced Study (IAS) in Princeton, N.J., to build a stored-program computer. By 1951, the computer was operational, and von Neumann made the designs public.
Mauchly and Eckert eventually lost their patent in a court case, Honeywell vs. Sperry Rand Corporation, in part because in 1941 Mauchly had visited an Iowa State University professor, John Vincent Atanasoff, to see his work. In the late 1930s, Atanasoff had started work on a computing machine that would be electronic, digital, use rewritable memory and operate on the binary number system (the 0s and 1s that underlie digital computing today). Atanasoff built the computer with his student Clifford Berry but never won a patent. Mauchly's visit convinced the court that the ENIAC was partly derived from Atanasoff's work, and the patent was invalidated in 1973. By then, technical advances had created machines far more powerful than the ENIAC.
An innovation boom
The scientists who built the ENIAC used vacuum tubes – the same sort of technology found in Edison-era light bulbs – as electric switches to represent the 0s and 1s of digital logic. But vacuum tubes burned out, used a lot of power and took up space. The ENIAC needed a room measuring roughly 30 by 50 feet to house it.
Then, in 1947, William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invented the transistor. They discovered how to make an electric switch out of solid materials, with no need for a vacuum. The key was to pair a conductive material, such as gold, with a semiconductor, such as germanium or silicon. A semiconductor can block an electric current or let it flow, depending on the electrical signal applied to it. Transistors didn't burn out like vacuum tubes, and early commercial transistors were about the size of a thumbnail. Today, thousands of transistors can fit across the width of a human hair.
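To illustrate what it means for a switch – vacuum tube or transistor – to stand in for a 0 or a 1, here is a toy Python sketch. It is purely an illustration, not how real hardware is designed, but it shows how two-state switches combine into logic.

```python
# Toy model: each "switch" is either off (0) or on (1).
# Real chips build gates like these out of transistors.
def and_gate(a, b):
    """Output 1 only if both input switches are on."""
    return 1 if (a == 1 and b == 1) else 0

def or_gate(a, b):
    """Output 1 if at least one input switch is on."""
    return 1 if (a == 1 or b == 1) else 0

# Try every combination of two switch states.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b} -> AND={and_gate(a, b)} OR={or_gate(a, b)}")
```

Whether the two states come from a glowing tube or a sliver of silicon, everything a computer does is ultimately built up from combinations of switches like these.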
Transistors could make computers smaller and faster. But they still had to be wired into circuits by hand, and one faulty connection could cause huge problems. Between 1958 and 1959, Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently developed the integrated circuit. Instead of fitting transistors to circuits by hand, an integrated circuit etches the whole circuit – all the transistors, resistors and capacitors – onto a single chip of silicon. Kilby won the Nobel Prize in physics in 2000 for his work. Noyce went on to co-found the Intel Corporation with fellow Fairchild alumnus Gordon Moore.
In 1971, inventor John V. Blankenbaker marketed the Kenbak-1, widely considered the first personal computer. That same year, Intel engineer Ted Hoff had the idea of putting a computer's central processing unit onto a single integrated circuit. Voilà: the microprocessor. While the integrated circuit made computers small, fast and reliable, the microprocessor made them versatile and easier to program.
In 1977, Steve Wozniak and Steve Jobs released the Apple II, which came with a keyboard, a manual and game paddles, and could produce color graphics. IBM introduced its personal computer in 1981. Although the Internet's foundations were laid long before, 1994 marked the year the user-friendly Netscape browser became available.
When don't you use a computer?
There were more than 1 billion personal computers in use worldwide in 2008, according to industry analyst Gartner. The U.S. Census Bureau estimates that more than 77 percent of U.S. households have a personal computer. That's not counting the computers people use at work.
But personal computer sales may no longer reflect our total computer use. Computers are in our cars, our phones, at toll booths and at cash registers – nearly all aspects of life have moved onto computers. Now computer industry giants like Oracle and IBM are competing in the business of big data: using computing power to analyze the digital trails we leave behind. Presumably, studying how we email, post and buy online will help companies sell more products or run their own operations more efficiently.
However, many people today still live without computers. Who is using computers, and where, can be roughly estimated from Internet access figures. According to the International Telecommunication Union, the highest percentage of people with Internet access in 2011 was in Europe, at 71 percent. The lowest was in Africa, where only 4 percent of people had access to the Internet.