Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers came with the invention of the transistor in 1948. It replaced the large, unwieldy vacuum tube and thus allowed electronic machinery to shrink dramatically in size. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable, and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they remained costly and too powerful for the business sector's needs. Consequently, only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages such as FORTRAN (Formula Translator, 1956) and COBOL (Common Business-Oriented Language, 1960), which gave them the flexibility to be cost-effective and productive. The advent of second-generation computers also marked the beginning of an entire industry, the software industry, and the birth of a wide range of new careers.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Virtual body and data body

The result of this informatisation is the creation of a virtual body, which is the exterior of a man or woman's social existence. It plays the same role as the physical body, except that it is located in virtual space (it has no real location). The virtual body holds a certain emancipatory potential. It allows us to go to places and to do things that would be impossible in the physical world. It does not have the weight of the physical body and is less conditioned by physical laws. It therefore allows one to create an identity of one's own, with far fewer restrictions than would apply in the physical world.

But this new freedom has a price. In the shadow of virtualisation, the data body has emerged. The data body is a virtual body composed of the files connected to an individual. As the Critical Art Ensemble observe in their book Flesh Machine, the data body is the "fascist sibling" of the virtual body; it is "a much more highly developed virtual form, and one that exists in complete service to the corporate and police state."

The virtual character of the data body means that the social regulation that applies to the real body is absent. While there are limits to the manipulation and exploitation of the real body (even if these limits are not respected everywhere), there is little regulation concerning the manipulation and exploitation of the data body, even though the data body is much easier to manipulate than the real body. The seizure of the data body by parties other than the individual concerned often goes undetected, as it has become part of the basic structure of an informatised society. Data bodies serve as raw material for the "New Economy", and both business and governments claim access to them. Power can be exercised, and democratic decision-taking procedures bypassed, by seizing data bodies. This totalitarian potential makes the data body a deeply problematic phenomenon and calls for an understanding of data as a social construction rather than as something representative of an objective reality. How data bodies are generated, what happens to them, and who has control over them is therefore a highly relevant political question.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611761/100438659695
 
Leitrim Station

Latitude: 45.33, Longitude: -75.6

Canadian Forces Station Leitrim collects foreign signals intelligence: foreign radio, radar, and other electronic emissions are intercepted and analyzed to provide foreign intelligence to the Canadian government. Leitrim contains a wide variety of antennae, including a Pusher HF-DF circularly disposed antenna array (CDAA), three other large circular arrays, four satellite dishes, and a number of other, smaller antennae. The targets of Leitrim's dishes are probably Mexican and/or Brazilian communications satellites. Both countries' satellite constellations were established in 1985, at about the same time that Leitrim's new dishes began to be installed. A focus on these satellites would also explain CSE's rumoured increase in Spanish-language activities.

Source: http://watserv1.uwaterloo.ca/~brobinso/cse.html

INDEXCARD, 1/2
 
NSA

U.S. intelligence agency within the Department of Defense that is responsible for cryptographic and communications intelligence and security. The NSA grew out of the communications intelligence activities of U.S. military units during World War II. The NSA was established in 1952 by a presidential directive and, not being a creation of the Congress, is relatively immune to Congressional review; it is the most secret of all U.S. intelligence agencies. The agency's mission includes the protection and formulation of codes, ciphers, and other cryptology for the U.S. military and other government agencies, as well as the interception, analysis, and solution of coded transmissions by electronic or other means. The agency conducts research into all forms of electronic transmission. It operates posts for the interception of signals around the world. Being a target of the highest priority for penetration by hostile intelligence services, the NSA maintains no contact with the public or the press.

http://www.nsa.gov/index.html

INDEXCARD, 2/2