World-Information City




  Report: Slave and Expert Systems

  1940s - Early 1950s: First Generation Computers


Probably the most important contributor to the theoretical basis of the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he conceived the Turing machine, originally intended as a mathematical tool for settling the question of whether every mathematical proposition can be decided by a fixed mechanical procedure. Although he proved instead that no such universal method of decision can exist, the Turing machine provided an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
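Turing's idealized model — a read/write head moving over an unbounded tape under the control of a finite transition table — is simple enough to simulate in a few lines. The sketch below is an illustrative modern reconstruction, not Turing's original notation; the example machine (which flips the bits of a binary string) is invented for demonstration.

```python
# Minimal Turing machine simulator: a finite transition table drives a
# read/write head over an unbounded tape until the machine halts.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """rules maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left), +1 (right), or 0 (stay)."""
    tape = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Example transition table: flip every bit, halt at the first blank cell.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine(flip_rules, "1011"))  # -> 0100
```

The point of the model is that the table of rules, not the hardware, determines what the machine computes — the same simulator runs any machine you can write as a transition table.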

The onset of the Second World War led to increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.

By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to aid in the design of airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced an electromechanical calculator, the Mark I, whose purpose was to create ballistic charts for the U.S. Navy.

Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania (begun in 1943). Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: roughly 1,800 square feet) that it consumed 160 kilowatts of electrical power — enough, reputedly, to dim the lights in an entire section of a large city.

Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. In 1945 he described the design of the Electronic Discrete Variable Automatic Computer (EDVAC), whose memory held both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of the development of the CPU was the UNIVAC I (Universal Automatic Computer, 1951). Both the U.S. Census Bureau and General Electric owned UNIVACs.
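The heart of the stored-program idea is that instructions and data live in the same memory, and a single CPU loop fetches, decodes, and executes one instruction at a time. The sketch below illustrates that loop with a tiny invented instruction set — it is a conceptual model, not the EDVAC's actual instruction format.

```python
# Sketch of the von Neumann stored-program cycle: one memory holds both
# instructions and data; the CPU repeatedly fetches, decodes, and executes.

def run(memory):
    """memory: list of cells; instruction cells are (opcode, operand)
    tuples, data cells are plain integers. Returns the accumulator."""
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, arg = memory[pc]   # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":       # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":      # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":    # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return acc

program = [
    ("LOAD", 4),   # cell 4 holds the data value 2
    ("ADD", 5),    # cell 5 holds the data value 40
    ("STORE", 5),  # code can overwrite memory: program and data coexist
    ("HALT", 0),
    2,
    40,
]
print(run(program))  # -> 42
```

Because the program itself sits in rewritable memory, a stored-program machine can be repointed at a new task simply by loading new instructions — the decisive advance over machines whose instructions were wired in for a single purpose.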

Characteristic of first-generation computers was that operating instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. Computers were therefore difficult to program and limited in versatility and speed. Another feature of early computers was their use of vacuum tubes and magnetic drums for storage.




Expert system
Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by a layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between those facts, achieving a crude form of reasoning analogous to artificial intelligence. The three main elements of an expert system are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains, such as troubleshooting malfunctioning equipment or medical image interpretation, but have the advantage of being far cheaper than paying an expert or a team of specialists.
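The three elements named above can be sketched in miniature: a knowledge base of facts and if-then rules, and an inference engine that chains rules forward until no new facts can be derived. The troubleshooting facts and rules below are invented examples, and the "interface" is reduced to a single print statement.

```python
# Minimal forward-chaining inference engine for a rule-based expert system.

def forward_chain(facts, rules):
    """Repeatedly fire rules (premises -> conclusion) whose premises are
    all known, until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return facts

# Toy knowledge base for troubleshooting malfunctioning equipment.
rules = [
    ({"no power light", "plugged in"}, "power supply faulty"),
    ({"power supply faulty"}, "recommend: replace power supply"),
]
observed = {"no power light", "plugged in"}

for fact in sorted(forward_chain(observed, rules) - observed):
    print(fact)  # derived conclusions, including the recommendation
```

The engine's weakness is visible in the code: remove one premise from the observed facts and no rule fires at all — exactly the brittleness toward unanticipated events described above.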