Late 1970s - Present: Fourth Generation Computers

Following the invention of the first integrated circuits, ever more components could be fitted onto a single chip. LSI (Large Scale Integration) was followed by VLSI (Very Large Scale Integration) and ULSI (Ultra-Large Scale Integration), which pushed the number of components squeezed onto one chip into the millions and helped diminish both the size and the price of computers. The new chips took the idea of the integrated circuit one step further, as they made it possible to manufacture a single microprocessor which could then be programmed to meet any number of demands.

Also, following the introduction of the minicomputer in the mid-1970s, a market for personal computers (PCs) was established by the early 1980s. As computers had become easier to use and cheaper, they were no longer mainly utilized in offices and manufacturing, but also by the average consumer. The number of personal computers in use therefore more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use.

Further developments included the creation of mobile computers (laptops and palmtops) and, especially, networking technology. While mainframes shared time among many terminals running many applications, networking allowed individual computers to cooperate electronically. LANs (Local Area Networks) permitted computers to share memory space, information, and software, and to communicate with each other. Although LANs could already reach enormous proportions, it was the invention of the Internet that for the first time established an information and communication network on a global basis.

TEXTBLOCK 1/6 // URL: http://world-information.org/wio/infostructure/100437611663/100438659451
 
ECHELON Main Stations

Location              Country      Target/Task                                  Relations
----------------------------------------------------------------------------------------
MORWENSTOW            UK           INTELSAT, Atlantic, Europe, Indian Ocean     NSA, GCHQ
SUGAR GROVE           USA          INTELSAT, Atlantic, North and South America  NSA
YAKIMA FIRING CENTER  USA          INTELSAT, Pacific                            NSA
WAIHOPAI              NEW ZEALAND  INTELSAT, Pacific                            NSA, GCSB
GERALDTON             AUSTRALIA    INTELSAT, Pacific                            NSA, DSD
MENWITH HILL          UK           Sat, Groundstation, Microwave (land-based)   NSA, GCHQ
SHOAL BAY             AUSTRALIA    Indonesian Sat                               NSA, DSD
LEITRIM               CANADA       Latin American Sat                           NSA, CSE
BAD AIBLING           GERMANY      Sat, Groundstation                           NSA
MISAWA                JAPAN        Sat                                          NSA
PINE GAP              AUSTRALIA    Groundstation                                CIA
FORT MEADE            USA          Dictionary Processing                        NSA Headquarters
WASHINGTON            USA          Dictionary Processing                        NSA
OTTAWA                CANADA       Dictionary Processing                        CSE
CHELTENHAM            UK           Dictionary Processing                        GCHQ
CANBERRA              AUSTRALIA    Dictionary Processing                        DSD
WELLINGTON            NEW ZEALAND  Dictionary Processing                        GCSB Headquarters



TEXTBLOCK 2/6 // URL: http://world-information.org/wio/infostructure/100437611746/100438659207
 
The 19th Century: First Programmable Computing Devices

Until the 19th century, "early computers", probably better described as calculating machines, were basically mechanical devices operated by hand. Early calculators like the abacus worked with a system of sliding beads arranged on a rack, and the centerpiece of Leibniz's multiplier was a stepped-drum gear design.

Charles Babbage's proposal of the Difference Engine (1822), which would have had a stored program and was to perform calculations and print the results automatically (it was never completed), was therefore a major breakthrough, as it suggested the automation of computation for the first time. The Difference Engine, which was to tabulate polynomial functions by the method of differences, was inspired by Babbage's idea of applying the ability of machines to the needs of mathematics. Machines, he noted, were best at performing tasks repeatedly without mistakes, while mathematics often required the simple repetition of steps.
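
The method of differences can be sketched in a few lines: once the initial differences of a polynomial are known, every further value follows by additions alone, which is exactly the operation Babbage's machine mechanized. A minimal sketch (the sample polynomial is chosen for illustration):

```python
# Tabulating f(x) = 2x^2 + 3x + 1 by the method of finite differences:
# after seeding the initial differences, each new value needs only
# additions, the principle behind Babbage's Difference Engine.

def difference_table(values):
    """Return the leading column of the difference table: f(0), d1, d2, ..."""
    row, firsts = list(values), []
    while row:
        firsts.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return firsts

def tabulate(initial_differences, count):
    """Generate `count` polynomial values using only additions."""
    diffs = list(initial_differences)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):  # each difference absorbs the next
            diffs[i] += diffs[i + 1]
    return out

f = lambda x: 2 * x * x + 3 * x + 1
seed = difference_table([f(x) for x in range(3)])  # degree 2 needs 3 seeds
print(tabulate(seed, 6))  # → [1, 6, 15, 28, 45, 66]
```

Note that `tabulate` contains no multiplication at all: a machine of gears and carry levers only has to add, which is why the design was mechanically feasible in 1822.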

After working on the Difference Engine for ten years, Babbage was inspired to build another machine, which he called the Analytical Engine. Its invention was a major step towards the design of modern computers, as it is considered the first conception of a general-purpose computer. Instrumental to the machine's design was his assistant, Augusta Ada King, Countess of Lovelace, the first female computer programmer.

The second major breakthrough in the design of computing machines in the 19th century may be attributed to the American inventor Herman Hollerith. He was concerned with finding a faster way to compute the U.S. census, which in 1880 had taken nearly seven years. Hollerith therefore invented a method that used punched cards to store data, which he fed into a machine that compiled the results automatically. The punch cards not only served as a storage method and helped reduce computational errors, but also significantly increased speed.
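
The tabulating principle itself is simple counting: each card encodes a record as punched positions, and the machine tallies how often each position appears. A toy sketch of that idea (the field names and records are invented for illustration):

```python
# Toy tabulation in the spirit of Hollerith's machine: each "card" encodes
# a census record, and the tabulator counts how often each (field, value)
# combination occurs. Fields and records here are invented examples.

from collections import Counter

cards = [
    {"sex": "F", "state": "NY"},
    {"sex": "M", "state": "NY"},
    {"sex": "F", "state": "OH"},
]

tally = Counter((field, value) for card in cards
                               for field, value in card.items())
print(tally[("state", "NY")], tally[("sex", "F")])  # → 2 2
```

A single pass over the cards produces every total at once, which is why the 1890 census could be tabulated far faster than the hand count of 1880.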

Of extraordinary importance for the evolution of digital computers and artificial intelligence have furthermore been the contributions of the English mathematician and logician George Boole. In his "An Investigation of the Laws of Thought" (1854) he began to theorize about the true/false nature of binary logic. His principles make up what is today known as Boolean algebra, the collection of logic built on the AND, OR, and NOT operators, on which computer switching theory and procedures are grounded. Boole also assumed that the human mind works according to these laws: it performs logical operations that can be formally reasoned about. Ninety years later, Boole's principles were applied to electrical circuits, the blueprint for electronic computers, by Claude Shannon.
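
Because Boolean variables take only the values true and false, any identity in the algebra can be verified by exhaustively checking every assignment, the same reasoning Shannon applied to switching circuits. A short sketch verifying two classic laws:

```python
# Boole's algebra over {True, False}: verify De Morgan's law and the
# absorption law by checking every possible truth assignment.

from itertools import product

def holds(law):
    """True if a two-variable Boolean identity holds for all assignments."""
    return all(law(a, b) for a, b in product([False, True], repeat=2))

de_morgan  = lambda a, b: (not (a and b)) == ((not a) or (not b))
absorption = lambda a, b: (a and (a or b)) == a

print(holds(de_morgan), holds(absorption))  # → True True
```

Exhaustive checking is feasible precisely because the domain has only two values per variable: n variables mean just 2^n cases.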

TEXTBLOCK 3/6 // URL: http://world-information.org/wio/infostructure/100437611663/100438659426
 
Biometrics applications: gate keeping

Identity has to do with "place". In less mobile societies, the place where a person finds him/herself tells us something about his/her identity. In pre-industrial times, gatekeepers had the function of controlling people's access to particular places, i.e. the gatekeeper's function was to identify people and then decide whether somebody's identity would allow that person to physically occupy another place: a town, a building, a vehicle, etc.

In modern societies, the unambiguous nature of place has been weakened. There is a great amount of physical mobility, and ever since the emergence and spread of electronic communication technologies there has been a "virtualisation" of places in what today we call "virtual space" (unlike place, space has been a virtual reality from the beginning, a mathematical formula). The question of who one is is no longer coupled to a physical abode. Highly mobile and virtualised social contexts require a new generation of gatekeepers, which biometric technology aims to provide.

TEXTBLOCK 4/6 // URL: http://world-information.org/wio/infostructure/100437611729/100438658757
 
Late 1960s - Early 1970s: Third Generation Computers

One of the most important advances in the development of computer hardware in the late 1960s and early 1970s was the invention of the integrated circuit, a solid-state device containing hundreds of transistors, diodes, and resistors on a tiny silicon chip. It made possible the production of large-scale computers (mainframes) of higher operating speeds, capacity, and reliability at significantly lower costs.

Another type of computer developed at the time was the minicomputer. It profited from the advances in microelectronics and was considerably smaller than the standard mainframe, yet, for instance, powerful enough to control the instruments of an entire scientific laboratory. Furthermore, operating systems that allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer's memory, attained widespread use.

TEXTBLOCK 5/6 // URL: http://world-information.org/wio/infostructure/100437611663/100438659498
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the Cold War the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancements. Therefore in 1963 the Advanced Research Projects Agency (ARPA, later DARPA) granted the Massachusetts Institute of Technology (MIT) US$ 2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the project was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, MIT's research team showed that, when confined to a small subject matter, computer programs could solve spatial and logic problems. Other advances in the field of AI at the time were the proposal of new theories about machine vision by David Marr, Marvin Minsky's frame theory, the PROLOG language (1972), and the development of expert systems.

TEXTBLOCK 6/6 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
Transistor

A transistor is a solid-state device for amplifying, controlling, and generating electrical signals. Transistors are used in a wide array of electronic equipment, ranging from pocket calculators and radios to industrial robots and communications satellites.

INDEXCARD, 1/5
 
Satellites

Communications satellites are relay stations for radio signals and provide reliable and distance-independent high-speed connections even at remote locations without high-bandwidth infrastructure.

In point-to-point transmission, the transmission method originally employed on satellites, they face increasing competition from fiber optic cables, so point-to-multipoint transmission is increasingly becoming the ruling satellite technology. Point-to-multipoint transmission enables the quick implementation of private networks consisting of very small aperture terminals (VSATs). Such networks are independent and make mobile access possible.

In the future, satellites will become more powerful and cheaper, and their orbits will be lower; their services might become as common as satellite TV is today.

For more information about satellites, see How Satellites Work (http://octopus.gma.org/surfing/satellites) and the Tech Museum's satellite site (http://www.thetech.org/hyper/satellite).

http://www.whatis.com/vsat.htm
http://octopus.gma.org/surfing/satellites
INDEXCARD, 2/5
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between the facts, achieving a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but still have the advantage of being much cheaper than paying an expert or a team of specialists.
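
The fact/rule/inference-engine split described above can be sketched as a minimal forward-chaining loop: rules fire whenever all their premises are known, until nothing new can be derived. The troubleshooting rules below are invented examples, not taken from any real system:

```python
# Minimal forward-chaining inference engine: a knowledge base of facts and
# if-then rules, plus an engine that fires rules until no new fact appears.
# The equipment-troubleshooting rules are invented for illustration.

def infer(facts, rules):
    """Forward-chain: repeatedly fire rules whose premises are all known."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

rules = [
    ({"no-power-light", "plugged-in"}, "suspect-power-supply"),
    ({"suspect-power-supply", "fuse-intact"}, "replace-power-supply"),
]
facts = infer({"no-power-light", "plugged-in", "fuse-intact"}, rules)
print("replace-power-supply" in facts)  # → True
```

The sketch also illustrates the stated limitation: a symptom with no matching rule simply derives nothing, so unanticipated events fall outside the system's competence.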

INDEXCARD, 3/5
 
Central processing unit

A CPU is the principal part of any digital computer system, generally composed of the main memory, control unit, and arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral devices, including input/output devices and auxiliary storage units...

INDEXCARD, 4/5
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network mainly serving to test the feasibility of wide area networks and the possibility of remote computing, it was created for resource sharing between research institutions, not for messaging services like e-mail. Although the research was sponsored by the US military, ARPAnet was not designed for direct military use but to support military-related research.

In 1969 ARPAnet went online and linked its first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute.

But ARPAnet did not become widely accepted until it was demonstrated in action to an audience of computer experts at the First International Conference on Computers and Communication in Washington, D.C. in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers offered the general public access to NSFnet. Beginning in 1995, after having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, and the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 5/5