The 17th Century: The Invention of the First "Computers"

The devices often considered the first "computers" were in fact calculators rather than the sophisticated combination of hardware and software we call computers today.

In 1642 Blaise Pascal, the son of a French tax collector, developed a device to perform additions. His numerical wheel calculator, a rectangular brass box, used eight movable dials to add sums up to eight figures long. Although designed to help his father with his duties, the Pascaline had one big disadvantage: it was limited to addition.
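The dial-and-carry mechanism can be sketched in a few lines of modern code. The following Python snippet is only an illustrative model of how eight decimal dials might add with carrying, not a description of Pascal's actual gearing; the function name pascaline_add and the digit-list representation are assumptions made for this example.

    # Illustrative model only: eight decimal dials, each holding one digit.
    # Turning a dial past 9 carries one unit into the next dial to the left.

    def pascaline_add(dials, amount):
        """Add a non-negative integer to an eight-dial register, digit by digit."""
        digits = list(dials)                       # dials[0] is the units dial
        add_digits = [(amount // 10**i) % 10 for i in range(len(digits))]
        carry = 0
        for i in range(len(digits)):
            total = digits[i] + add_digits[i] + carry
            digits[i] = total % 10                 # a dial can only show 0-9
            carry = total // 10                    # the excess turns the next dial
        return digits                              # overflow past the last dial is lost

    register = [0] * 8                             # all eight dials set to zero
    register = pascaline_add(register, 1642)
    register = pascaline_add(register, 358)
    print(register)                                # [0, 0, 0, 2, 0, 0, 0, 0], i.e. 00002000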

In 1694 Gottfried Wilhelm von Leibniz, a German mathematician and philosopher, improved on the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials. Leibniz also formulated a model that may be considered the theoretical ancestor of some modern computers: in De Arte Combinatoria (1666) he argued that all reasoning and all discovery, verbal or not, is reducible to an ordered combination of elements such as numbers, words, colors, or sounds.

Further improvements in the field of early computing devices were made by Charles Xavier Thomas de Colmar, a Frenchman. His arithmometer could perform all four basic arithmetic functions, not just addition and multiplication, and was widely used up until the First World War.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659397
 
1980s: Artificial Intelligence (AI) - From Lab to Life

Following the commercial success of expert systems, which had begun in the 1970s, other AI technologies also began to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to U.S.$ 425 million. Expert systems in particular remained in demand because of their efficiency, but other fields of AI also proved successful in the corporate world.

Machine vision systems, for example, were used as the cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled U.S.$ 80 million. Although a downturn in the market for AI systems in 1986-1987 led to cutbacks in funding, the industry slowly recovered.

New technologies were being invented in Japan. Fuzzy logic, pioneered in the U.S., and neural networks were being reconsidered as ways of achieving artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications such as voice and character recognition systems, or camcorders steadied using fuzzy logic, were made available not only to business and industry but also to the average customer.
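The camcorder example hints at how fuzzy logic works in practice: instead of a hard threshold between "shake" and "no shake", overlapping membership functions weight competing rules against each other. The Python sketch below is a hypothetical illustration of that idea, not the algorithm used in any actual camcorder; the membership functions, thresholds, and rule choices are assumptions made for the example.

    # Hypothetical fuzzy-logic steadying: small motions are treated as hand
    # jitter and compensated, large motions as intentional panning and left alone.

    def membership_small(shake):
        # fully "small" below 1 unit of motion, fading to 0 at 5 units
        return max(0.0, min(1.0, (5.0 - shake) / 4.0))

    def membership_large(shake):
        # fully "large" above 5 units, fading to 0 at 1 unit
        return max(0.0, min(1.0, (shake - 1.0) / 4.0))

    def compensation(shake):
        small = membership_small(shake)
        large = membership_large(shake)
        # defuzzify: rule 1 says "compensate fully" (1.0), rule 2 says "do nothing" (0.0)
        return (small * 1.0 + large * 0.0) / (small + large)

    for shake in (0.5, 2.0, 4.0, 6.0):
        print(shake, round(compensation(shake), 2))    # 1.0, 0.75, 0.25, 0.0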

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
 
George Boole

b. Nov. 2, 1815, Lincoln, Lincolnshire, England
d. Dec. 8, 1864, Ballintemple, County Cork, Ireland

English mathematician who helped establish modern symbolic logic and whose algebra of logic, now called Boolean algebra, is basic to the design of digital computer circuits. One of the first Englishmen to write on logic, Boole pointed out the analogy between algebraic symbols and those that can represent logical forms and syllogisms, showing how the symbols of quantity can be separated from those of operation. The algebra of logic began with Boole's publications of 1847 and 1854. It is basically two-valued, in that it divides objects into separate classes, each defined by a given property; different classes can then be treated according to the presence or absence of that property.
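The two-valued character of Boolean algebra, and its separation of objects into classes that either have or lack a property, is easy to demonstrate in modern code. The short Python sketch below illustrates those ideas using set operations and truth values; the class names are invented for the example.

    # Classes of objects, each defined by the presence of a property
    objects = {"a", "b", "c", "d"}
    round_things = {"a", "b"}            # objects having the property "round"
    red_things = {"b", "c"}              # objects having the property "red"

    # Boolean operations on classes: union ~ OR, intersection ~ AND, complement ~ NOT
    print(round_things | red_things)     # {'a', 'b', 'c'}
    print(round_things & red_things)     # {'b'}
    print(objects - round_things)        # {'c', 'd'}

    # The same two-valued structure underlies digital circuits, with 0 and 1
    x, y = True, False
    print(x and y, x or y, not x)        # False True False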


INDEXCARD, 1/1