Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and thereby led to a dramatic reduction in the size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were still costly and too powerful for the needs of the business sector. As a result, only two LARCs were ever installed.

Throughout the early 1960s a number of commercially successful computers (for example the IBM 1401) were used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, which gave them the flexibility to be cost-effective and productive. The advent of second-generation computers also marked the beginning of an entire branch of industry, the software industry, and the birth of a wide range of new careers.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
1980s: Artificial Intelligence (AI) - From Lab to Life

Following the commercial success of expert systems, which had begun in the 1970s, other AI technologies also started to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to U.S.$ 425 million. Expert systems in particular remained in demand because of their efficiency, but other fields of AI also proved successful in the corporate world.

Machine vision systems, for example, were used with the cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled U.S.$ 80 million. Although the market for AI systems suffered a downturn in 1986 - 1987, which led to a cutback in funding, the industry slowly recovered.

New technologies were being developed in Japan. Fuzzy logic, pioneered in the U.S., and neural networks were being reconsidered as ways of achieving artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications such as voice and character recognition systems, or camcorders steadied by fuzzy logic, became available not only to business and industry but also to the average consumer.
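To give a sense of the kind of fuzzy-logic inference behind such consumer applications, the following is a minimal, hypothetical sketch in Python. The membership functions, rule weights and variable names are invented for illustration only and do not reproduce any actual camcorder stabilization design.

# Illustrative fuzzy-logic sketch: map a measured "shake" magnitude to a
# correction gain. All thresholds and rules below are made-up examples.

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def stabilization_gain(shake):
    """Turn a shake magnitude (0..10) into a correction gain (0..1)."""
    # Fuzzify the input: degrees to which the shake is small, moderate, large.
    small = triangular(shake, -1, 0, 4)
    moderate = triangular(shake, 2, 5, 8)
    large = triangular(shake, 6, 10, 11)

    # Each rule pairs a fuzzy truth value with a crisp output gain.
    rules = [(small, 0.1), (moderate, 0.5), (large, 0.9)]

    # Defuzzify with a weighted average of the rule outputs.
    total_weight = sum(w for w, _ in rules)
    if total_weight == 0:
        return 0.0
    return sum(w * g for w, g in rules) / total_weight

if __name__ == "__main__":
    for s in (1.0, 5.0, 9.0):
        print(f"shake={s:4.1f} -> gain={stabilization_gain(s):.2f}")

Running the sketch prints a gain that varies smoothly with the shake magnitude, which is the property that made fuzzy controllers attractive for consumer devices.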

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
 
Robot

Robot refers to any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner. The term is derived from the Czech word robota, meaning "forced labor." Modern use of the term stems from the play R.U.R., written in 1920 by the Czech author Karel Capek, which depicts a society that has become dependent on mechanical workers called robots, capable of doing any kind of mental or physical work. Modern robot devices descend from two distinct lines of development: the early automata, essentially mechanical toys, and the successive innovations and refinements introduced in the development of industrial machinery.

INDEXCARD, 1/2
 
Artificial Intelligence

Artificial Intelligence is concerned with the simulation of human thinking and emotions in information technology. AI develops "intelligent systems" capable, for example, of learning and logical deduction. AI systems are used for creatively handling large amounts of data (as in data mining), as well as in natural language processing and image recognition. AI is also used to support decision-making in highly complex environments.
Yahoo AI sites: http://dir.yahoo.com/Science/Computer_Science/Artificial_Intelligence/
MIT AI lab: http://www.ai.mit.edu/


INDEXCARD, 2/2