A Study on the History of Computers, Including Some Speculation on the Future of Computing
Created by sjanca on Nov 17, 2009
Last updated: 02/21/12 at 06:42 PM
Fifth-generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are in use today. Parallel processing and superconductors are helping to make artificial intelligence a reality. Quantum computation, molecular computing, and nanotechnology may radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural-language input and are capable of learning and self-organization. Some consider cloud computing to be the next "generation."
Worldwide internet usage has passed the 2 billion mark.
The World Wide Web, a graphical interface for the Internet, was first introduced to the public by Tim Berners-Lee.
The PC platform finally moved to a graphical user interface to compete with the Apple Macintosh.
Apple introduced its Macintosh computer in early 1984. It had a graphical user interface (GUI).
The IBM PC model 5150 was announced at a press conference in New York on August 12, 1981 and became available for purchase in early Fall 1981.
Microsoft, led by Bill Gates, supplied the MS-DOS operating system for the IBM PC.
The first Apple computer was not pretty, but it was the first computer built on a single circuit board, making it compact enough to fit on a desk.
The Intel 4004 microprocessor, introduced in 1971, packed all the processing power of the ENIAC onto a single chip.
In the early days, computer programs were written specifically for each individual computer. It was not until many years later that the idea of sharing, and later marketing, software developed. The software industry was born once people realized that the same program could run on several computers, though sometimes only after adapting it to a specific customer's needs.
After transistors were invented, scientists found ways to make them smaller and smaller.
Miniaturized transistors were placed on a circuit board to create an integrated circuit, or chip.
Because of the problems with vacuum tubes, computer scientists worked to invent something to correct those problems. The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube. One transistor replaced the equivalent of 40 vacuum tubes.
The vacuum tube looks and behaves much like a light bulb: it generates a lot of heat and has a tendency to burn out. It is also slow, big, and bulky.
The Electronic Numerical Integrator And Computer (ENIAC) was built by John Mauchly and J. Presper Eckert and unveiled in 1946.
It weighed more than 27,000 kg (60,000 lb), and contained more than 18,000 vacuum tubes. Roughly 2000 of the computer's vacuum tubes were replaced each month by a team of six technicians.
The Atanasoff-Berry Computer (ABC), built by John Atanasoff and Clifford Berry at Iowa State University, is officially recognized as the first electronic digital computer.
An introduction to the evolution of technology