Report: Slave and Expert Systems

  Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers came in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and thus allowed electronic machinery to shrink dramatically in size. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, transistors yielded computers that were smaller, faster, more reliable, and more energy-efficient than their predecessors.

IBM's Stretch and Sperry-Rand's LARC (1959) were the first large-scale machines to take advantage of transistor technology (they also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they remained costly and too powerful for the business sector's needs. As a result, only two LARCs were ever installed.

Throughout the early 1960s a number of commercially successful computers (for example, the IBM 1401) were used in business, universities, and government, and by 1965 most large firms routinely processed financial information by computer. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages such as FORTRAN (Formula Translator, 1956) and COBOL (Common Business-Oriented Language, 1960), which gave them the flexibility to be cost-effective and productive. The advent of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new careers.

Automation is concerned with the application of machines to tasks once performed by humans or, increasingly, to tasks that would otherwise be impossible. Although the term mechanization is often used to refer to the simple replacement of human labor by machines, automation generally implies the integration of machines into a self-governing system. Automation has revolutionized those areas in which it has been introduced, and there is scarcely an aspect of modern life that has been unaffected by it.

Nearly all industrial installations of automation, and in particular robotics, involve a replacement of human labor by an automated system. Therefore, one of the direct effects of automation in factory operations is the dislocation of human labor from the workplace. The long-term effects of automation on employment and unemployment rates are debatable; most studies in this area have been controversial and inconclusive. As of the early 1990s, there were fewer than 100,000 robots installed in American factories, compared with a total work force of more than 100 million persons, about 20 million of whom worked in factories.