1940s - Early 1950s: First Generation Computers
Probably the most important contributor to the theoretical basis of the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he conceived the Turing machine, originally intended as a mathematical tool that could infallibly recognize undecidable propositions. Although he instead proved that no such universal method of determination can exist, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
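To make the idealized model concrete, the following minimal Python sketch simulates a one-tape machine of the kind Turing described; the example machine and its transition table are invented purely for illustration and correspond to no historical device.

# A minimal sketch of a Turing machine: a tape (memory), a read/write head,
# and a finite table of rules that plays the role of the control unit.
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=1000):
    """Run a one-tape Turing machine.

    transitions maps (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left) or +1 (right). Halts when no rule applies.
    """
    tape = dict(enumerate(tape))   # sparse tape indexed by integer positions
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in transitions:
            break                  # no rule for this situation: halt
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    return state, "".join(tape[i] for i in sorted(tape))

# Example machine: flip every bit on the tape, then halt at the first blank.
flip_bits = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

print(run_turing_machine(flip_bits, "10110"))   # ('start', '01001')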
The onset of the Second World War led to increased funding for computer projects and hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.
By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced an electromechanical calculator whose purpose was to create ballistic charts for the U.S. Navy.
Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced through a partnership between the U.S. government and the University of Pennsylvania (1943). Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: 1,000 square feet) that it consumed 160 kilowatts of electrical power, enough to dim the lights in an entire section of a large town.
Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. By 1945 he had designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of the development of the CPU was the UNIVAC I (1951). Both the U.S. Census Bureau and General Electric owned UNIVACs (Universal Automatic Computer).
Characteristic of first-generation computers was that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. This made computers difficult to program and limited their versatility and speed. Another feature of early computers was that they used vacuum tubes for circuitry and magnetic drums for memory.
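The following toy Python sketch illustrates the stored-program idea behind the von Neumann design and the notion of a machine language described above: a single memory holds both instructions and data, and a CPU loop fetches, decodes and executes one instruction at a time. The opcodes and the tiny program are invented for illustration and do not correspond to any historical machine language.

# A toy stored-program machine: one memory for both program and data,
# a program counter and an accumulator in the "CPU".
def run(memory):
    acc = 0        # accumulator register
    pc = 0         # program counter
    while True:
        op, arg = memory[pc]        # fetch the next instruction
        pc += 1
        if op == "LOAD":            # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold instructions, cells 4-6 hold data, side by side in one memory.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])   # 5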
|
TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
|
|
1980s: Artificial Intelligence (AI) - From Lab to Life
Following the commercial success of expert systems, which had begun in the 1970s, other AI technologies also began to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to U.S.$ 425 million. Expert systems in particular remained in demand because of their efficiency, but other fields of AI also proved successful in the corporate world.
Machine vision systems, for example, were used with cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled U.S.$ 80 million. Although a slump in the market for AI systems in 1986-1987 led to a cutback in funding, the industry slowly recovered.
New technologies were being invented in Japan. Fuzzy logic, pioneered in the U.S., as well as neural networks, were being reconsidered as means of achieving artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications such as voice and character recognition systems, or camcorders steadied by fuzzy logic, were made available not only to business and industry but also to the average customer.
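As a rough illustration of the fuzzy-logic idea behind applications such as camcorder steadying, the Python sketch below maps a measured shake amplitude to a correction strength using fuzzy membership functions; the membership functions, thresholds and correction rule are invented for illustration, and real stabilizers are far more involved.

# A minimal fuzzy-logic sketch: fuzzy sets "small shake" and "large shake",
# two rules, and a simple weighted-average defuzzification step.
def triangular(x, left, peak, right):
    """Degree (0..1) to which x belongs to a triangular fuzzy set."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def stabilize(shake):
    """Map a shake amplitude to a correction strength between 0 and 1."""
    small = triangular(shake, -1.0, 0.0, 2.0)    # "shake is small"
    large = triangular(shake, 1.0, 5.0, 9.0)     # "shake is large"
    if small + large == 0:
        return 0.0
    # Rule outputs: small shake -> weak correction (0.1),
    # large shake -> strong correction (0.9), blended by membership degree.
    return (small * 0.1 + large * 0.9) / (small + large)

print(stabilize(0.5))   # mostly "small" -> weak correction
print(stabilize(4.0))   # mostly "large" -> strong correction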
|
TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
|
|
Cooperative Association of Internet Data Analysis (CAIDA)
Based at the University of California's San Diego Supercomputer Center, CAIDA supports cooperative efforts among the commercial, government and research communities aimed at promoting a scalable, robust Internet infrastructure. It is sponsored by the Defense Advanced Research Projects Agency (DARPA) through its Next Generation Internet program, by the National Science Foundation, Cisco, Inc., and Above.net.
|
INDEXCARD, 1/3
|
|
Virtual Private Networks
Virtual Private Networks provide secured connections to a corporate site over a public network such as the Internet. Data transmitted through these secure connections are encrypted and therefore have to be decrypted before they can be read. The networks are called virtual because connections are provided only when you connect to a corporate site; they do not rely on dedicated lines, which also makes mobile use possible.
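A minimal Python sketch of the encrypt-before-send / decrypt-before-read idea described above, assuming the third-party "cryptography" package is installed; a real VPN uses tunneling protocols such as IPsec or TLS, so this only illustrates the principle, and the function names are invented for the example.

# Data crossing the public network is encrypted at one end of the "tunnel"
# and decrypted at the other, using a key shared by both endpoints.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()     # agreed on by both VPN endpoints

def send_over_public_network(payload: bytes) -> bytes:
    """Client side of the tunnel: encrypt before the data leaves the host."""
    return Fernet(shared_key).encrypt(payload)

def receive_at_corporate_site(ciphertext: bytes) -> bytes:
    """Corporate gateway: decrypt before the data can be read."""
    return Fernet(shared_key).decrypt(ciphertext)

packet = send_over_public_network(b"confidential report")
print(packet)                              # unreadable while in transit
print(receive_at_corporate_site(packet))   # b'confidential report'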
|
INDEXCARD, 2/3
|
|
DES
The U.S. Data Encryption Standard (DES) is the most widely used encryption algorithm, employed especially for the protection of financial transactions. It was developed by IBM in 1971. It is a symmetric-key cryptosystem. The DES algorithm uses a 56-bit encryption key, meaning that there are 72,057,594,037,927,936 possible keys.
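The short Python sketch below shows DES used as a symmetric-key cipher, assuming the third-party PyCryptodome package is installed; the key and message are made up, and a 56-bit key is considered far too short to be secure today.

# Symmetric-key encryption with DES: the same 56-bit (effective) key
# both encrypts and decrypts the data.
from Crypto.Cipher import DES

key = b"8bytekey"                 # 8 bytes = 64 bits, 56 of them effective
cipher = DES.new(key, DES.MODE_ECB)

plaintext = b"financial tx... "   # DES works on 8-byte blocks (16 bytes here)
ciphertext = cipher.encrypt(plaintext)

# The same key decrypts -- that is what "symmetric-key cryptosystem" means.
print(DES.new(key, DES.MODE_ECB).decrypt(ciphertext))

# Size of the key space: 2**56 possible keys.
print(2 ** 56)                    # 72057594037927936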
for more information see:
http://www.britannica.com/bcom/eb/article/3/0,5716,117763+5,00.html
http://www.cryptography.com/des/
|
INDEXCARD, 3/3
|
|