The 19th Century: First Programmable Computing Devices

Until the 19th century, "early computers", probably better described as calculating machines, were essentially mechanical devices operated by hand. Early calculators like the abacus worked with a system of sliding beads arranged on a rack, and the centerpiece of Leibniz's multiplier was a stepped-drum gear design.

Charles Babbage's proposal of the Difference Engine (1822) was therefore a major breakthrough: the machine, which was never completed, was to perform calculations and print the results automatically, suggesting for the first time the automation of computation. The Difference Engine, which was to compute mathematical tables by the method of finite differences, was inspired by Babbage's idea of applying the abilities of machines to the needs of mathematics. Machines, he noted, were best at performing tasks repeatedly and without mistakes, while mathematics often required the simple repetition of steps.

After working on the Difference Engine for ten years, Babbage was inspired to build another machine, which he called the Analytical Engine. Its invention was a major step towards the design of modern computers, as it is considered the first conception of a general-purpose computer. Instrumental to the machine's design was his assistant, Augusta Ada King, Countess of Lovelace, often regarded as the first computer programmer.

The second major breakthrough in the design of computing machines in the 19th century may be attributed to the American inventor Herman Hollerith. He was concerned with finding a faster way to tabulate the U.S. census, which in 1880 had taken nearly seven years. Hollerith therefore invented a method that used punch cards to store data, which were fed into a machine that compiled the results automatically. The punch cards not only served as a storage medium and helped reduce computational errors, but also significantly increased processing speed.

Of extraordinary importance for the evolution of digital computers and artificial intelligence have furthermore been the contributions of the English mathematician and logician George Boole. In An Investigation of the Laws of Thought (1854) he began to theorize about the true/false nature of binary logic. His principles make up what is today known as Boolean algebra, the system of logic built on the AND, OR and NOT operators, on which computer switching theory and procedures are grounded. Boole also assumed that the human mind works according to these laws: it performs logical operations that can be formally reasoned about. Nearly a century later, Boole's principles were applied to electrical circuits, the blueprint for electronic computers, by Claude Shannon.
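Boole's three operators can be sketched in a few lines of Python (a modern illustration, not anything Boole or Shannon wrote), showing how AND, OR and NOT combine true/false values exactly as Shannon showed switching circuits combine open and closed contacts:

```python
# Boolean algebra's three basic operations as Python functions.
# Shannon's insight: a closed switch can stand for True, an open one for False.

def AND(a, b):
    return a and b   # true only if both inputs are true (switches in series)

def OR(a, b):
    return a or b    # true if at least one input is true (switches in parallel)

def NOT(a):
    return not a     # inverts its input

# Print the truth table for AND and OR over all four input combinations
for a in (False, True):
    for b in (False, True):
        print(a, b, AND(a, b), OR(a, b))

# De Morgan's law, one of the identities of Boolean algebra:
# NOT(a AND b) == (NOT a) OR (NOT b) for every input combination
assert all(NOT(AND(a, b)) == OR(NOT(a), NOT(b))
           for a in (False, True) for b in (False, True))
```

The same three operations, wired together in sufficient numbers, are all a digital computer's logic circuits consist of.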

TEXTBLOCK 1/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659426
 
1900 - 2000 A.D.

1904
First broadcast talk

1918
Invention of the short-wave radio

1929
Invention of television in Germany and Russia

1941
Invention of microwave transmission

1946
Long-distance coaxial cable systems and mobile telephone services are introduced in the USA.

1957
Sputnik, the first satellite, is launched by the USSR
First data transmissions over regular phone circuits.

At the beginning of the story of today's global data networks is the story of the development of satellite communication.

In 1955 President Eisenhower announced the USA's intention to launch a satellite. But in the end it was the Soviet Union that launched the first satellite, Sputnik I, in 1957. After Sputnik's launch it became evident that the Cold War was also a race for leadership in the application of state-of-the-art technology to defense. As the US Department of Defense encouraged the formation of high-tech companies, it laid the groundwork for Silicon Valley, the hot spot of the world's computer industry.

The same year the USA launched its first satellite, Explorer I, data was transmitted over regular phone circuits for the first time, thus laying the ground for today's global data networks.

Today's satellites may record weather data, scan the planet with powerful cameras, offer global positioning and monitoring services, and relay high-speed data transmissions. Yet to this day, most satellites are designed for military purposes such as reconnaissance.

1969
ARPAnet online

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network, it mainly served to test the feasibility of wide area networks and the possibility of remote computing. It was created for resource sharing between research institutions, not for messaging services like e-mail. Although the US military sponsored its research, ARPAnet was not designed for directly military use but to support military-related research.

In 1969 ARPANET went online and linked the first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute.

Yet ARPAnet did not become widely accepted until it was demonstrated in action to an audience of computer experts at the First International Conference on Computers and Communication in Washington, D.C. in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers started offering access to NSFnet to the general public. After having become the backbone of the Internet in the USA, NSFnet was turned into a consortium of commercial backbone providers in 1995. This, together with the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA, commercial users already outnumbered military and academic users by 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

1971
Invention of E-Mail

1979
Introduction of fiber-optic cable systems

1992
Launch of the World Wide Web

TEXTBLOCK 2/5 // URL: http://world-information.org/wio/infostructure/100437611796/100438659828
 
"Project Censored"

Project Censored was launched at Sonoma State University (U.S.) in 1976 as an annual review of the systematic withholding of public access to important news facts by the mainstream media. The team, composed of student media researchers and media analysts, annually selects and publishes what they believe are the 25 most important under-covered news stories. "The essential issue raised by the project is the failure of the mass media to provide the people with all the information they need to make informed decisions concerning their own lives and in the voting booth." (Project Censored)

TEXTBLOCK 3/5 // URL: http://world-information.org/wio/infostructure/100437611734/100438658583
 
Private data bunkers

The data bunkers of the private sector, on the other hand, are in a different position. Although these are fast-growing engines of data collection with a much greater degree of dynamism, they may not enjoy the same privileged position, though one has to differentiate according to the general historical and social conditions in which a data bunker is embedded. For example, it can safely be assumed that the databases of a large credit card company or bank are better protected than those of the bureaucracies of small developing countries.

Private data bunkers include:

    Banks

    Building societies

    Credit bureaus

    Credit card companies

    Direct marketing companies

    Insurance companies

    Telecom service providers

    Mail order stores

    Online stores


TEXTBLOCK 4/5 // URL: http://world-information.org/wio/infostructure/100437611761/100438659735
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the Cold War the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancements. Therefore in 1963 the Advanced Research Projects Agency (ARPA, later renamed DARPA) granted the Massachusetts Institute of Technology (MIT) U.S.$ 2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the project was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, MIT's research team showed that, when confined to a small subject matter, computer programs could solve spatial and logic problems. Other advances in the field of AI at the time were the proposal of new theories about machine vision by David Marr, Marvin Minsky's frame theory, the PROLOG language (1972) and the development of expert systems.

TEXTBLOCK 5/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
Robot

The term robot refers to any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner. The term is derived from the Czech word robota, meaning "forced labor." Modern use of the term stems from the play R.U.R., written in 1920 by the Czech author Karel Capek, which depicts a society that has become dependent on mechanical workers called robots, capable of doing any kind of mental or physical work. Modern robot devices descend through two distinct lines of development: the early automata, essentially mechanical toys, and the successive innovations and refinements introduced in the development of industrial machinery.

INDEXCARD, 1/3
 
Java Applets

Java applets are small programs that can be sent along with a Web page to a user. Java applets can perform interactive animations, immediate calculations, or other simple tasks without having to send a user request back to the server. They are written in Java, a platform-independent computer language, which was invented by Sun Microsystems, Inc.

Source: Whatis.com

INDEXCARD, 2/3
 
Memex Animation by Ian Adelman and Paul Kahn


INDEXCARD, 3/3