World-Information City




  Report: Slave and Expert Systems

  The 19th Century: First Programmable Computing Devices


Until the 19th century, "early computers", probably better described as calculating machines, were basically mechanical devices operated by hand. Early calculators like the abacus worked with a system of sliding beads arranged on a rack, and the centerpiece of Leibniz's multiplier was a stepped-drum gear design.

Charles Babbage's proposal of the Difference Engine (1822) was therefore a major breakthrough. The machine, which was never completed, was to calculate and print mathematical tables automatically, suggesting for the first time the automation of computation. The Difference Engine was designed to tabulate polynomial functions by the method of finite differences, an idea inspired by Babbage's ambition to apply the abilities of machines to the needs of mathematics. Machines, he noted, were best at performing tasks repeatedly without mistakes, while mathematics often required the simple repetition of steps.
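The engine's underlying technique, tabulating a polynomial by repeatedly adding its finite differences so that no multiplication is ever needed, can be sketched in modern code. This is a hypothetical illustration of the mathematical idea, not a model of Babbage's mechanism, and the function names are invented for this sketch:

```python
def difference_table(values, order):
    # Build the initial column of differences from the first
    # order+1 values of the polynomial: one seed per difference level.
    levels = [list(values)]
    for _ in range(order):
        prev = levels[-1]
        levels.append([b - a for a, b in zip(prev, prev[1:])])
    return [level[0] for level in levels]

def tabulate(initial_diffs, count):
    # Extend the table by repeated addition only: each step emits the
    # current value, then adds each difference to the level above it.
    diffs = list(initial_diffs)
    results = []
    for _ in range(count):
        results.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return results

# Example: f(x) = x^2 + x + 1, a second-order polynomial.
seed = [x * x + x + 1 for x in range(3)]   # [1, 3, 7]
start = difference_table(seed, 2)          # [1, 2, 2]
print(tabulate(start, 6))                  # [1, 3, 7, 13, 21, 31]
```

Because every output value is produced by additions alone, the scheme maps naturally onto a machine built from adding wheels, which is what made it attractive for mechanization.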

After working on the Difference Engine for ten years, Babbage was inspired to build another machine, which he called the Analytical Engine. Its invention was a major step towards the design of modern computers, as it is considered the first conception of a general-purpose computer. Instrumental to the machine's design was his assistant, Augusta Ada King, Countess of Lovelace, often regarded as the first computer programmer.

The second major breakthrough in the design of computing machines in the 19th century may be attributed to the American inventor Herman Hollerith. He was concerned with finding a faster way to tabulate the U.S. census, which in 1880 had taken nearly seven years to complete. Hollerith therefore devised a method that used punched cards to store data, which he fed into a machine that compiled the results automatically. The punched cards not only served as a storage medium and helped reduce computational errors, but also significantly increased processing speed.

Of extraordinary importance for the evolution of digital computers and artificial intelligence have furthermore been the contributions of the English mathematician and logician George Boole. In An Investigation of the Laws of Thought (1854) he began to theorize about the true/false nature of binary logic. His principles make up what today is known as Boolean algebra, the system of logic built on the AND, OR and NOT operations, on which computer switching theory and procedures are grounded. Boole also assumed that the human mind works according to these laws: that it performs logical operations which could be formally reasoned about. In the late 1930s, Boole's principles were applied to switching circuits, the blueprint for electronic computers, by Claude Shannon.
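A minimal sketch of the algebra Boole described, and that Shannon later mapped onto relay circuits, can be given with the three basic operations and one of the identities (De Morgan's laws) that let circuit designers rewrite one gate network into an equivalent one. The names below are chosen for this illustration:

```python
from itertools import product

# The three basic Boolean operations over {False, True}.
AND = lambda a, b: a and b
OR  = lambda a, b: a or b
NOT = lambda a: not a

# De Morgan's laws hold for every combination of inputs:
# NOT(a AND b) == NOT(a) OR NOT(b), and dually for OR.
for a, b in product([False, True], repeat=2):
    assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
    assert NOT(OR(a, b)) == AND(NOT(a), NOT(b))

# Truth table for AND, which Shannon identified with two
# switches wired in series (current flows only if both close).
for a, b in product([False, True], repeat=2):
    print(int(a), int(b), int(AND(a, b)))
```

The correspondence is direct: OR behaves like switches in parallel, AND like switches in series, and NOT like a relay that inverts its input, which is why Boolean algebra became the design language of digital hardware.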




Artificial Intelligence
Artificial Intelligence is concerned with the simulation of human thinking and emotions in information technology. AI develops "intelligent systems" capable, for example, of learning and logical deduction. AI systems are used for creatively handling large amounts of data (as in data mining), as well as in natural language processing and image recognition. AI is also used to support decision-making in highly complex environments.
Yahoo AI sites: http://dir.yahoo.com/Science/Computer_Science/Artificial_Intelligence/
MIT AI lab: http://www.ai.mit.edu/

