1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the Cold War the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancement. Therefore, in 1963, the Advanced Research Projects Agency (ARPA, later renamed DARPA) granted the Massachusetts Institute of Technology (MIT) US$2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the project was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, MIT's research team showed that, when confined to a small subject matter, computer programs could solve spatial and logic problems. Other advances in the field of AI at the time were the proposal of new theories about machine vision by David Marr, Marvin Minsky's frame theory, the Prolog language (1972), and the development of expert systems.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
1940s - Early 1950s: First Generation Computers

Probably the most important contributor to the theoretical basis for the digital computers that were developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he created the Turing machine, which was originally conceived as a mathematical tool that could infallibly recognize undecidable propositions. Although he instead proved that no such universal method of determination can exist, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
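To make the model concrete, here is a minimal, illustrative sketch of a one-tape Turing machine in Python. The machine, transition table, and `flip_bits` example are inventions for this sketch (they do not appear in Turing's paper); they show only the essential ingredients the text mentions: a tape (memory), a read/write head (input/output), and a state-based control.

```python
# Minimal Turing machine sketch: a tape, a head position, a current
# state, and a transition table mapping (state, read symbol) to
# (next state, symbol to write, head move).

def run_turing_machine(tape, transitions, state="start", blank="_"):
    """Simulate a one-tape Turing machine until it reaches 'halt'."""
    tape = list(tape)
    head = 0
    while state != "halt":
        if head == len(tape):          # extend the tape with blanks on demand
            tape.append(blank)
        symbol = tape[head]
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Illustrative machine: flip every bit of a binary input, halt on blank.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("1011", flip_bits))  # -> 0100
```

Despite its simplicity, this scheme is computationally universal: any algorithm can, in principle, be encoded as such a transition table, which is what makes the model a reduction of "any computing device to its essentials."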

The onset of the Second World War led to increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.

By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced the Mark I, an electromechanical calculator whose purpose was to create ballistic charts for the U.S. Navy.

Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania, begun in 1943. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: 1,000 square feet) that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of a large city.

Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. In 1945 he described the Electronic Discrete Variable Automatic Computer (EDVAC), a design with a memory to hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of this design was the UNIVAC I (1951). Both the U.S. Census Bureau and General Electric owned UNIVACs (Universal Automatic Computer).
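The stored-program idea can be sketched in a few lines. The instruction names below (`LOAD`, `ADD`, `STORE`, `HALT`) are inventions for this illustration, not the EDVAC's actual instruction set; the point is that program and data share one memory, and a single CPU loop repeatedly fetches, decodes, and executes instructions.

```python
# Illustrative von Neumann machine: one memory holds both instructions
# and data; the CPU runs a fetch-decode-execute loop around a single
# accumulator register and program counter.

def run(memory):
    acc = 0      # accumulator register
    pc = 0       # program counter
    while True:
        op, arg = memory[pc]          # fetch the next instruction
        pc += 1
        if op == "LOAD":              # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold instructions, cells 4-6 hold data: compute 2 + 3.
program = {
    0: ("LOAD", 4),
    1: ("ADD", 5),
    2: ("STORE", 6),
    3: ("HALT", 0),
    4: 2, 5: 3, 6: 0,
}
print(run(program)[6])  # -> 5
```

Because instructions live in the same memory as data, a program can be loaded, replaced, or even modified like any other data, which is precisely what distinguished the stored-program design from earlier machines wired for a single task.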

Characteristic of first-generation computers was that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. As a result, these computers were difficult to program and limited in versatility and speed. Another feature of early computers was that they relied on vacuum tubes for circuitry and magnetic drums for storage.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
 
Central processing unit

A CPU is the principal part of any digital computer system, generally composed of the main memory, control unit, and arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral devices, including input/output devices and auxiliary storage units...

INDEXCARD, 1/1