The first adding machine, a precursor of the digital computer, was devised in 1642 by the French philosopher and mathematician Blaise Pascal. This device employed a series of ten-toothed wheels, each tooth representing a digit from 0 to 9. The wheels were connected so that numbers could be added to each other by advancing the wheels by the correct number of teeth. In the 1670s the German philosopher and mathematician Gottfried Wilhelm Leibniz improved on this machine by devising one that could also multiply.
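The carrying principle behind the wheels is easy to see in a worked form. The following is a minimal sketch in Python, not a reconstruction of Pascal's actual mechanism; the function name add_on_wheels is hypothetical. Each wheel is modeled as a decimal digit, and whenever a wheel advances past 9 it passes a tooth (a carry) to the next wheel.

    # Minimal sketch, assuming wheels are listed least-significant digit first.
    def add_on_wheels(wheels, number):
        digits = [int(d) for d in str(number)[::-1]]   # digits of the addend
        carry = 0
        for i in range(max(len(wheels), len(digits))):
            if i >= len(wheels):
                wheels.append(0)                       # bring another wheel into play
            total = wheels[i] + (digits[i] if i < len(digits) else 0) + carry
            wheels[i] = total % 10                     # position of the wheel after advancing
            carry = total // 10                        # tooth carried to the next wheel
        if carry:
            wheels.append(carry)
        return wheels

    # Example: 275 + 148 = 423
    print(add_on_wheels([5, 7, 2], 148))   # [3, 2, 4], read back as 423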
The French inventor Joseph Marie Jacquard, in designing an automatic loom, used thin, perforated wooden boards to control the weaving of complicated designs. During the 1880s the American statistician Herman Hollerith conceived the idea of using perforated cards, similar to Jacquard's boards, for processing data. Employing a system that passed punched cards over electrical contacts, he was able to compile statistical information for the 1890 U.S. census.
The Analytical Engine
Also in the 19th century, the British mathematician and inventor Charles Babbage worked out the principles of the modern digital computer. He conceived a number of machines, such as the Difference Engine, that were designed to handle complicated mathematical problems. Many historians consider Babbage and his associate, the British mathematician Augusta Ada Byron (Lady Lovelace, 1815-52), the daughter of the English poet Lord Byron, the true inventors of the modern digital computer. The technology of their time was not capable of translating their sound concepts into practice; but one of their inventions, the Analytical Engine, had many features of a modern computer. It had an input stream in the form of a deck of punched cards, a “store” for saving data, a “mill” for arithmetic operations, and a printer that made a permanent record.
Analog computers began to be built at the start of the 20th century. Early models calculated by means of rotating shafts and gears. Numerical approximations of equations too difficult to solve in any other way were evaluated with such machines. During both world wars, mechanical and, later, electrical analog computing systems were used as torpedo course predictors in submarines and as bombsight controllers in aircraft. Another system was designed to predict spring floods in the Mississippi River.
In the 1940s, Howard Aiken, a Harvard University mathematician, created what is usually considered the first digital computer. This machine was constructed from mechanical adding machine parts. The instruction sequence to be used to solve a problem was fed into the machine on a roll of punched paper tape, rather than being stored in the computer. In 1945, however, a computer with program storage was built, based on the concepts of the Hungarian-American mathematician John von Neumann. The instructions were stored within a so-called memory, freeing the computer from the speed limitations of the paper tape reader during execution and permitting problems to be solved without rewiring the computer.
The rapidly advancing field of electronics led to construction of the first general-purpose all-electronic computer in 1946 at the University of Pennsylvania by the American engineer John Presper Eckert, Jr., and the American physicist John William Mauchly. (Another American physicist, John Vincent Atanasoff, later successfully claimed that certain basic techniques he had developed were used in this computer.) Called ENIAC, for Electronic Numerical Integrator And Computer, the device contained 18,000 vacuum tubes and had a speed of several hundred multiplications per minute. Its program was wired into the processor and had to be manually altered.
The use of the transistor in computers in the late 1950s marked the advent of smaller, faster, and more versatile logical elements than were possible with vacuum-tube machines. Because transistors use much less power and have a much longer life, this development alone was responsible for the improved machines called second-generation computers. Components became smaller, as did intercomponent spacings, and the system became much less expensive to build.
Late in the 1960s the integrated circuit, or IC, was introduced, making it possible for many transistors to be fabricated on one silicon substrate, with interconnecting wires plated in place. The IC resulted in a further reduction in price, size, and failure rate. The microprocessor became a reality in the mid-1970s with the introduction of the large-scale integrated (LSI) circuit and, later, the very-large-scale integrated (VLSI) circuit, with many thousands of interconnected transistors etched into a single silicon substrate.
To return, then, to the “switch-checking” capabilities of a modern computer: computers in the 1970s generally were able to check eight switches at a time. That is, they could check eight binary digits, or bits, of data at every cycle. A group of eight bits is called a byte; each byte can take any of 256 possible patterns of ONs and OFFs (or 1s and 0s). Each pattern is the equivalent of an instruction, a part of an instruction, or a particular type of datum, such as a number, a character, or a graphics symbol. The pattern 11010010, for example, might be binary data, in this case the decimal number 210 (see Number Systems), or it might tell the computer to compare data stored in its switches to data stored in a certain memory-chip location.
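The arithmetic here can be checked directly; the few lines below are a brief illustrative sketch in Python of the two facts just stated: an eight-bit byte has 2**8 = 256 possible patterns, and the pattern 11010010, read as a binary number, is the decimal value 210.

    # An eight-bit byte can take 2**8 = 256 distinct patterns of 1s and 0s.
    print(2 ** 8)                  # 256

    # The pattern 11010010, read as a binary number, is decimal 210.
    print(int("11010010", 2))      # 210
    print(format(210, "08b"))      # '11010010' -- the same pattern written back out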
The development of processors that can handle 16, 32, and 64 bits of data at a time has increased the speed of computers. The complete collection of recognizable patterns, that is, the total list of operations of which a computer is capable, is called its instruction set. Both factors, the number of bits handled at a time and the size of the instruction set, continue to increase with the ongoing development of modern digital computers.
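To give a sense of scale for the first factor, the short sketch below (in Python, purely illustrative) shows how the number of distinct patterns a word can hold grows as the word widens from 8 to 64 bits.

    # Each additional bit doubles the number of distinct patterns a word can hold.
    for bits in (8, 16, 32, 64):
        print(f"{bits}-bit word: {2 ** bits:,} distinct patterns")

    # A 64-bit word allows 18,446,744,073,709,551,616 distinct patterns.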