1980s: Artificial Intelligence (AI) - From Lab to Life

Following the commercial success of expert systems, which had begun in the 1970s, other AI technologies also began to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to US$425 million. Expert systems in particular remained in demand because of their efficiency, but other fields of AI also proved successful in the corporate world.

Machine vision systems, for example, combined cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies in the U.S. offered machine vision systems, and sales totaled US$80 million. Although the market for AI systems collapsed in 1986-1987, leading to funding cutbacks, the industry slowly recovered.

New technologies were also being developed in Japan. Fuzzy logic, pioneered in the U.S., and neural networks were reconsidered as ways of achieving artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications such as voice and character recognition systems, or camcorders steadied using fuzzy logic, became available not only to business and industry but also to the average customer.
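The idea behind fuzzy-logic control, as used in camcorder image stabilization, can be illustrated with a minimal sketch: inputs belong to overlapping categories ("small shake", "large shake") to a *degree* between 0 and 1, and rule outputs are blended according to those degrees. The function names, membership shapes, and damping values below are illustrative assumptions, not any real camcorder's firmware.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def damping(shake):
    """Blend two toy rules: small shake -> damp 0.1, large shake -> damp 0.9.

    'shake' is a hypothetical measured jitter magnitude; the membership
    ranges and output levels are made-up values for illustration only.
    """
    small = tri(shake, -1.0, 0.0, 1.0)   # degree to which shake is 'small'
    large = tri(shake, 0.5, 2.0, 3.5)    # degree to which shake is 'large'
    total = small + large
    # Weighted average of the rule outputs (a simple defuzzification step)
    return (small * 0.1 + large * 0.9) / total if total else 0.5
```

Because the categories overlap, the damping factor varies smoothly with the input rather than switching abruptly, which is what made fuzzy controllers attractive for consumer devices.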

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and thereby led to a dramatic reduction in the size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable, and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (they also used assembly language instead of the difficult machine language). Both, developed for atomic energy laboratories, could handle enormous amounts of data, but they were costly and too powerful for the business sector's needs. As a result, only two LARCs were ever installed.

Throughout the early 1960s a number of commercially successful computers (for example the IBM 1401) were used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages such as FORTRAN (Formula Translator, 1956) and COBOL (Common Business-Oriented Language, 1960), which gave them the flexibility to be cost-effective and productive. Second-generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new careers.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Calculator

Calculators are machines for automatically performing arithmetical operations and certain mathematical functions. Modern calculators are descendants of a digital arithmetic machine devised by Blaise Pascal in 1642. Later in the 17th century, Gottfried Wilhelm von Leibniz created a more advanced machine, and, especially in the late 19th century, inventors produced calculating machines that were progressively smaller and less laborious to use.

INDEXCARD, 1/2
 
Machine vision

A branch of artificial intelligence and image processing concerned with the identification of graphic patterns or images that involves both cognition and abstraction. In such a system, a device linked to a computer scans, senses, and transforms images into digital patterns, which in turn are compared with patterns stored in the computer's memory. The computer processes the incoming patterns in rapid succession, isolating relevant features, filtering out unwanted signals, and adding to its memory new patterns that deviate beyond a specified threshold from the old and are thus perceived as new entities.
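The matching loop described above can be sketched in a few lines: each incoming digital pattern is compared with those in memory, and a pattern that deviates beyond a specified threshold from every stored one is added as a new entity. The class name, the use of Euclidean distance as the deviation measure, and the threshold value are assumptions for illustration, not a description of any particular vision system.

```python
import math

class PatternMemory:
    """Toy sketch of threshold-based pattern matching (illustrative only)."""

    def __init__(self, threshold):
        self.threshold = threshold  # deviation beyond which a pattern is 'new'
        self.patterns = []          # stored reference patterns

    def classify(self, pattern):
        """Return (index of matched/stored pattern, was_new)."""
        best_index, best_dist = None, float("inf")
        for i, stored in enumerate(self.patterns):
            # Euclidean distance as a simple stand-in for 'deviation'
            dist = math.dist(pattern, stored)
            if dist < best_dist:
                best_index, best_dist = i, dist
        if best_index is None or best_dist > self.threshold:
            # Deviates beyond the threshold: memorize as a new entity
            self.patterns.append(list(pattern))
            return len(self.patterns) - 1, True
        return best_index, False
```

A real system would of course first isolate relevant features and filter out noise before this comparison step; the sketch only shows the memory-and-threshold logic.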

INDEXCARD, 2/2