The 18th Century: Powered Machines and the Industrial Revolution

The invention of the steam engine by James Watt in 1776 represented a major advance in the development of powered machines. It was first applied to an industrial operation - the spinning of cotton - in 1785. A new kind of "work-slave", it not only marked the beginning of the Industrial Revolution but also heralded the coming age of mass production.

In 18th-century England five important inventions in the textile industry advanced the automation of work processes: 1) John Kay's flying shuttle (1733), which permitted the weaving of larger widths of cloth and significantly increased weaving speed; 2) Edmund Cartwright's power loom (1785), which increased weaving speed still further; 3) James Hargreaves' spinning jenny (1764); 4) Richard Arkwright's water frame; and 5) Samuel Crompton's spinning mule (1779) - the last three of which improved the speed and quality of thread-spinning operations. These developments, combined with the invention of the steam engine, within a short time led to the creation of new machine-slaves and the mechanization of the production of most major goods, such as iron, paper, leather, glass and bricks.

Large-scale machine production was soon applied in many manufacturing sectors and reduced production costs. Yet the widespread use of the novel work-slaves also placed new demands on the work force's qualifications. The utilization of machines enabled a more differentiated division of labor and resulted in a (further) specialization of skills. Whereas before many goods had been produced by skilled craftsmen, the use of modern machinery increased the demand for semiskilled and unskilled workers. The nature of the work process also changed, from one mainly dependent on physical power to one primarily dominated by technology, with a growing proportion of the labor force employed to operate machines.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659368
 
1940s - Early 1950s: First Generation Computers

Probably the most important contributor to the theoretical basis for the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he conceived the Turing machine, originally as a mathematical tool intended to recognize infallibly whether propositions are decidable. Although he instead proved that no such universal method of determination can exist, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
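Turing's model can be made concrete in a few lines of code. The following is a minimal sketch, not any historical implementation: a one-tape machine driven by a transition table, with an invented example program that flips every bit of its input and halts on the first blank cell.

```python
def run_turing_machine(tape, transitions, state="start", blank="_"):
    """Simulate a one-tape Turing machine until it reaches the 'halt' state."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        # Each table entry maps (state, symbol) -> (new state, write, move)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Illustrative transition table: flip every bit, halt at the blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine("1011", flip))  # -> 0100
```

Despite its simplicity, a table of this form can in principle express any computation - which is exactly the point of Turing's reduction.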

The onset of the Second World War led to an increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.

By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced a large-scale electromechanical calculator, the Mark I, whose purpose was to create ballistic charts for the U.S. Navy.

Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania, begun in 1943. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: 1,000 square feet) that it consumed 160 kilowatts of electrical power - enough to dim the lights in an entire section of a large town.

Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. By 1945 he had designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of this development was the UNIVAC I (1951). Both the U.S. Census Bureau and General Electric owned UNIVACs (Universal Automatic Computers).
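The essence of the von Neumann design - one memory holding both program and data, with a single CPU loop fetching, decoding, and executing instructions - can be sketched as a toy machine. The opcodes and the example program below are invented for illustration, not taken from EDVAC or UNIVAC.

```python
def run(memory):
    """Fetch-decode-execute loop over a shared memory; returns the accumulator."""
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]        # fetch the instruction at the program counter
        pc += 1
        if op == "LOAD":            # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return acc

# Addresses 0-3 hold code, 4-5 hold data: the same memory serves both,
# which is the defining trait of a stored-program machine.
program = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
    4: 2, 5: 3,
}
print(run(program))  # -> 5
```

Because instructions live in ordinary memory, a program can in principle be loaded, modified, or even generated like any other data - the property that made stored-program computers so much more flexible than their hard-wired predecessors.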

Characteristic of first generation computers was that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. Computers were therefore difficult to program and limited in versatility and speed. Another feature of early computers was their use of vacuum tubes and magnetic drums for storage.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between the facts, achieving a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but they still have the advantage of costing far less than paying an expert or a team of specialists.
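The knowledge base and inference engine described above can be sketched in miniature. This is an illustrative toy, assuming invented troubleshooting rules rather than any real system: the inference engine performs simple forward chaining, firing rules until no new fact can be derived.

```python
# Knowledge base: observed facts plus rules of the form (conditions -> conclusion).
# Both the facts and the rules are made-up examples for this sketch.
facts = {"no_power_light", "plugged_in"}
rules = [
    ({"no_power_light", "plugged_in"}, "suspect_power_supply"),
    ({"suspect_power_supply"}, "recommend_replace_psu"),
]

def infer(facts, rules):
    """Forward-chaining inference engine: fire rules until a fixed point."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # A rule fires when all its conditions are already known.
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

# The "interface": report only the newly derived conclusions.
print(sorted(infer(facts, rules) - facts))
# -> ['recommend_replace_psu', 'suspect_power_supply']
```

The brittleness noted above is visible even here: a symptom with no matching rule simply derives nothing, which is why real expert systems need exhaustively enumerated rule sets for their narrow domain.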

INDEXCARD, 1/5
 
Vacuum tube

The first half of the 20th century was the era of the vacuum tube in electronics. This variety of electron tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer, completed in 1946).

INDEXCARD, 2/5
 
UNIVAC

Built by Remington Rand in 1951, the UNIVAC I (Universal Automatic Computer) was one of the first commercially available computers to take advantage of the development of the central processing unit (CPU). Both the U.S. Census Bureau and General Electric owned UNIVACs. Speed: 1,905 operations per second; input/output: magnetic tape, unityper, printer; memory size: 1,000 12-digit words in delay lines; technology: serial vacuum tubes, delay lines, magnetic tape; floor space: 943 cubic feet; cost: F.O.B. factory US$ 750,000 plus US$ 185,000 for a high-speed printer.

INDEXCARD, 3/5
 
Internet Exchanges

Internet exchanges are intersecting points between major networks.

List of the World's Public Internet exchanges (http://www.ep.net)

INDEXCARD, 4/5
 
Central processing unit

A CPU is the principal part of any digital computer system, generally composed of the main memory, control unit, and arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral equipment, including input/output devices and auxiliary storage units...

INDEXCARD, 5/5