The 18th Century: Powered Machines and the Industrial Revolution
James Watt's improved steam engine of 1776 represented a major advance in the development of powered machines. It was first applied to an industrial operation - the spinning of cotton - in 1785. As a new kind of work-slave, it not only marked the beginning of the Industrial Revolution, but also heralded the coming age of mass production.
In 18th-century England five important inventions in the textile industry advanced the automation of work processes:
1) John Kay's flying shuttle in 1733, which permitted the weaving of larger widths of cloth and significantly increased weaving speed,
2) Edmund Cartwright's power loom in 1785, which increased weaving speed still further,
3) James Hargreaves' spinning jenny in 1764,
4) Richard Arkwright's water frame in 1769, and
5) Samuel Crompton's spinning mule in 1779.
The last three inventions improved the speed and quality of thread-spinning operations. Combined with the invention of the steam engine, these developments within a short time led to the creation of new machine-slaves and to the mechanization of the production of most major goods, such as iron, paper, leather, glass and bricks.
Large-scale machine production was soon applied in many manufacturing sectors and resulted in a reduction of production costs. Yet the widespread use of the novel work-slaves also created new demands concerning the work force's qualifications. The utilization of machines enabled a more differentiated division of labor and resulted in a further specialization of skills. Whereas before, many goods had been produced by skilled craftsmen, the use of modern machinery increased the demand for semiskilled and unskilled workers. The nature of the work process also changed: from one mainly dependent on physical power to one primarily dominated by technology, with an increasing proportion of the labor force employed to operate machines.
TEXTBLOCK 1/3 // URL: http://world-information.org/wio/infostructure/100437611663/100438659368
The 17th Century: The Invention of the First "Computers"
The devices often considered the first "computers" in our present understanding were calculators rather than the sophisticated combination of hardware and software we call computers today.
In 1642 Blaise Pascal, the son of a French tax collector, developed a device to perform additions. His numerical wheel calculator, the Pascaline, was a rectangular brass box that used eight movable dials to add sums up to eight figures long. Although designed to help his father with his duties, the Pascaline had one major disadvantage: it was limited to addition.
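The dial-and-carry principle is simple enough to sketch in modern code. The following Python model is an illustration under stated assumptions - the class name and its methods are hypothetical, and it mimics only the carrying behavior, not Pascal's actual gearwork:

```python
# A toy model of an eight-dial adding register in the spirit of the
# Pascaline: each decimal dial carries one step into its neighbor
# whenever it rolls past 9.

class Pascaline:
    def __init__(self, dials=8):
        self.dials = [0] * dials          # dials[0] is the ones place

    def add(self, number):
        """Add a non-negative integer by turning each dial in turn."""
        place = 0
        while number:
            self._turn(place, number % 10)
            number //= 10
            place += 1

    def _turn(self, place, steps):
        if place >= len(self.dials):
            raise OverflowError("sum exceeds eight figures")
        total = self.dials[place] + steps
        self.dials[place] = total % 10
        if total >= 10:                   # dial rolled past 9: carry
            self._turn(place + 1, 1)

    def reading(self):
        return int("".join(str(d) for d in reversed(self.dials)))

machine = Pascaline()
machine.add(707)
machine.add(295)
print(machine.reading())  # 1002
```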
Gottfried Wilhelm von Leibniz, a German mathematician and philosopher, improved the Pascaline in 1694 by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials. Leibniz also formulated a model that may be considered the theoretical ancestor of some modern computers. In De Arte Combinatoria (1666) he argued that all reasoning, all discovery, verbal or not, is reducible to an ordered combination of elements, such as numbers, words, colors, or sounds.
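The principle behind Leibniz's improvement - reducing multiplication to repeated, shifted addition - can likewise be sketched briefly. The function below is a hypothetical illustration of that principle only; it does not model the stepped-drum mechanism itself:

```python
# Multiplication as repeated, shifted addition: one "crank" per unit of
# each multiplier digit, with the decimal place supplying the shift.

def multiply(multiplicand, multiplier):
    register = 0
    for place, digit in enumerate(str(multiplier)[::-1]):
        for _ in range(int(digit)):       # crank the handle digit times
            register += multiplicand * 10 ** place
    return register

print(multiply(123, 45))  # 5535
```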
Further improvements in the field of early computing devices were made by Charles Xavier Thomas de Colmar, a Frenchman. His arithmometer could not only add and multiply but perform all four basic arithmetic operations, and it was widely used up until the First World War.
TEXTBLOCK 2/3 // URL: http://world-information.org/wio/infostructure/100437611663/100438659397
1940s - Early 1950s: First Generation Computers
Probably the most important contributor to the theoretical basis of the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he conceived the Turing machine, originally intended as a mathematical tool that could infallibly recognize undecidable propositions. Although he instead proved that no such universal method of determination can exist, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
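Turing's abstraction is compact enough to express in a few lines of modern code. The sketch below is an illustrative model - the rule-table encoding and the toy machine (which appends a stroke to a unary number) are assumptions made for the example, not Turing's own notation:

```python
# A minimal Turing machine: a tape, a read/write head, a state register,
# and a table of transition rules.

def run(tape, rules, state="start", head=0):
    """Step the machine until it enters the 'halt' state."""
    cells = dict(enumerate(tape))            # sparse, unbounded tape
    while state != "halt":
        symbol = cells.get(head, "_")        # '_' marks a blank cell
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Example machine: append one stroke to a unary number (increment).
increment = {
    ("start", "1"): ("start", "1", "R"),     # skip existing strokes
    ("start", "_"): ("halt", "1", "R"),      # write one more, halt
}
print(run("111", increment))  # 1111
```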
The onset of the Second World War led to increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.
By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced an electromechanical calculator, the Mark I, whose purpose was to create ballistic charts for the U.S. Navy.
Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania (1943). Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: 1,000 square feet) that it consumed 160 kilowatts of electrical power - enough to dim the lights in an entire section of a large town.
Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. In 1945 he designed the Electronic Discrete Variable Automatic Computer (EDVAC), whose memory could hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of this design was the UNIVAC I (Universal Automatic Computer, 1951), owned by both the U.S. Census Bureau and General Electric.
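The stored-program idea at the heart of this architecture can be illustrated with a short sketch: instructions and data share one memory, and a single processor loop fetches, decodes, and executes them. The instruction format and the tiny opcode set below are assumptions made for the example, not EDVAC's actual order code:

```python
# A toy stored-program machine: one memory holds both the program and
# the data it operates on; the CPU loop coordinates everything.

def run(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, addr = memory[pc]         # fetch
        pc += 1
        if op == "LOAD":              # decode and execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program and data side by side in the same memory: compute 2 + 3.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0),
          2, 3, 0]                    # cells 4-6: operands and result
print(run(memory)[6])  # 5
```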
Characteristic of first-generation computers was that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. This made computers difficult to program and limited their versatility and speed. Another feature of early computers was their use of vacuum tubes and magnetic drums for storage.
TEXTBLOCK 3/3 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
Division of labor
The term refers to the separation of a work process into a number of tasks, with each task performed by a separate person or group of persons. It is most often applied to mass production systems, where it is one of the basic organizing principles of the assembly line. Breaking down work into simple, repetitive tasks eliminates unnecessary motion and limits the handling of tools and parts. The consequent reduction in production time and the ability to replace craftsmen with lower-paid, unskilled workers result in lower production costs and a less expensive final product. The Scottish economist Adam Smith saw this splitting of tasks as a key to economic progress, since it provided a cheaper and more efficient means of producing goods.
INDEXCARD, 1/3
Charles Babbage
b. December 26, 1791, London, England; d. October 18, 1871, London, England
English mathematician and inventor who is credited with having conceived the first automatic digital computer. The idea of mechanically calculating mathematical tables first came to Babbage in 1812 or 1813. Later he made a small calculator that could perform certain mathematical computations to eight decimals. During the mid-1830s Babbage developed plans for the so-called analytical engine, the forerunner of the modern digital computer. In this device he envisioned the capability of performing any arithmetical operation on the basis of instructions from punched cards, a memory unit in which to store numbers, sequential control, and most of the other basic elements of the present-day computer.
INDEXCARD, 2/3
Calculator
Calculators are machines for automatically performing arithmetical operations and certain mathematical functions. Modern calculators are descendants of a digital arithmetic machine devised by Blaise Pascal in 1642. Later in the 17th century Gottfried Wilhelm von Leibniz created a more advanced machine, and, especially in the late 19th century, inventors produced calculating machines that were increasingly smaller and less laborious to use.
INDEXCARD, 3/3