1980s: Artificial Intelligence (AI) - From Lab to Life
Following the commercial success of expert systems, which had begun in the 1970s, other AI technologies also began to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to U.S.$ 425 million. Expert systems in particular remained in demand because of their efficiency, but other fields of AI also proved successful in the corporate world.
Machine vision systems, for example, combined cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled U.S.$ 80 million. Although the market for AI systems broke down in 1986 - 1987, leading to cutbacks in funding, the industry slowly recovered.
New technologies were also being developed in Japan: fuzzy logic, pioneered in the U.S., and neural networks were being reconsidered as paths to artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications such as voice and character recognition systems, or camcorders steadied using fuzzy logic, became available not only to business and industry, but also to the average customer.
TEXTBLOCK 1/1 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
Central processing unit
A CPU is the principal part of any digital computer system, generally composed of the main memory, control unit, and arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral devices, including input/output devices and auxiliary storage units...
INDEXCARD, 1/2
Alan Turing
b. June 23, 1912, London, England; d. June 7, 1954, Wilmslow, Cheshire
English mathematician and logician who pioneered the field of computer theory and contributed important logical analyses of computer processes. Many mathematicians in the first decades of the 20th century had attempted to eliminate all possible error from mathematics by establishing a formal, or purely algorithmic, procedure for establishing truth. The mathematician Kurt Gödel threw up an obstacle to this effort with his incompleteness theorem. Motivated by Gödel's work, Turing sought an algorithmic method of determining whether any given proposition was undecidable, with the ultimate goal of eliminating such propositions from mathematics. Instead, he proved in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem [Decision Problem]" (1936) that no such universal method of determination can exist and, hence, that mathematics will always contain undecidable propositions. During World War II he served with the Government Code and Cypher School at Bletchley, Buckinghamshire, where he played a significant role in breaking the codes of the German "Enigma Machine". He also championed the theory that computers could eventually be constructed that would be capable of human thought, and he proposed the Turing test to assess this capability. Turing's papers on the subject are widely acknowledged as the foundation of research in artificial intelligence. In 1954 Turing committed suicide, probably because of the depressing medical treatment that he had been forced to undergo (in lieu of prison) to "cure" him of homosexuality.
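The abstract machine Turing introduced in "On Computable Numbers" can be sketched in a few lines of modern code. The simulator below and its binary-increment transition table are our own illustrative choices, not taken from Turing's paper; they merely show the shape of the model: a tape, a read/write head, and a finite table of state transitions.

```python
def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Run a one-tape Turing machine until it reaches the 'halt' state.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right).
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Toy program: increment a binary number by one. The head scans right
# to the end of the input, then adds 1 with the carry propagating left.
rules = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "add"),
    ("add", "0"):   ("1", -1, "done"),
    ("add", "1"):   ("0", -1, "add"),
    ("add", "_"):   ("1", -1, "done"),
    ("done", "0"):  ("0", -1, "done"),
    ("done", "1"):  ("1", -1, "done"),
    ("done", "_"):  ("_", +1, "halt"),
}

print(run_turing_machine("1011", rules))  # 1011 (11) + 1 -> 1100 (12)
```

Turing's undecidability result concerns exactly such machines: there is no general algorithm that, given an arbitrary rule table and tape, decides whether the machine will ever halt.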
INDEXCARD, 2/2