The Privatization of Censorship

According to a still widely held conviction, the global data networks constitute the long-desired arena for uncensorable expression. This much is true: because of the Net it has become increasingly difficult to sustain cultural and legal standards. Geographical proximity and territorial boundaries lose their relevance when it makes no difference to a document's availability whether it is stored on your desktop or on a host some thousand kilometers away. There is no international agreement on what content is permissible, so human rights organizations and Nazi groups alike can bypass restrictions. No single authority or organization can impose its rules and standards on all others. This is why the Net is public space, a political arena where free expression is possible.

This freedom is conditioned by the design of the Net. But the Net's design is not a given, as Lawrence Lessig reminds us. Originally the design of the Net allowed a relatively high degree of privacy, and communication was not controlled directly. But now this design is changing, and this invisible agora in electronic space is endangered. Governments - even elected ones - and corporations introduce new technologies that allow us to be identified, monitored and tracked, that identify and block content, and that allow our behaviour to be controlled efficiently.

When the World Wide Web was introduced, small independent media and human rights organizations soon began to use this platform to draw worldwide attention to their publications and causes. It seemed to be the dawning of a new era, with authoritarian regimes and multinational media corporations on the losing side. But now the Net's design is changing according to their needs.

"In every context that it can, the entertaining industry is trying to force the Internet into its own business model: the perfect control of content. From music (fighting MP3) and film (fighting the portability of DVD) to television, the industry is resisting the Net's original design. It was about the free flow of content; Hollywood wants perfect control instead" (Lawrence Lessig, Cyberspace Prosecutor, in: The Industry Standard, February 2000).

In the United States, Hollywood and AT&T - which, after its merger with MediaOne, became the biggest US cable service provider - are returning to the positions they held in the Seventies: the control of content and infrastructure. If most people access the Net via set-top boxes connected to a TV set, it will become a kind of television, at least in the USA.

For small independent media it will become very hard to be heard, especially for those offering streaming video and music. The increasingly faster data transmissions apply only to download capacities; upload capacities are much lower - on average about eight times lower - than download capacities. As an AT&T executive said in response to criticism: "We haven't built a 56 billion dollar cable network to have the blood sucked from our veins" (Lawrence Lessig, The Law in the Code: How the Net is Regulated, Lecture at the Institute for Human Sciences, Vienna, May 29th, 2000).

Consumers, not producers, are preferred.

For corporations, what remains to be done to control the Net is mainly to cope with the fact that because of the Net it has become increasingly difficult to sustain cultural and legal standards. On November 11, 1995, the German prosecuting attorney's office searched CompuServe Germany, the branch of an international Internet service provider, because the company was suspected of having offered access to child pornography. Consequently CompuServe blocked access to more than 200 newsgroups, all containing "sex" or "gay" in their names, for all its customers. But a few days later, instructions for accessing these blocked newsgroups via CompuServe came into circulation. On February 26, 1997, Felix Somm, the Chief Executive Officer of CompuServe Germany, was charged with complicity in the distribution of child and animal pornography in newsgroups. In May 1998 he received a two-year prison sentence, suspended against a payment of about 51,000 euros. The sentence was justified by pointing to the fact that CompuServe Germany offered access to its US parent company's servers hosting child pornography. Felix Somm was held responsible for access to forbidden content he could not have known about.

Also in 1995, in an attack on US Vice-President Al Gore's plan to supply all public schools with Internet access, Republican Senator Charles Grassley warned of the dangers lurking for children on the Net. Referring to a Time magazine cover story on pornography on the Net by Philip Elmer-Dewitt from July 3, he claimed that 83.5% of all images online are pornographic. But Elmer-Dewitt was wrong. Obviously unaware of the difference between Bulletin Board Systems and the Net, he referred misleadingly to Marty Rimm's article Marketing Pornography on the Information Superhighway, published in the prestigious Georgetown Law Journal (vol. 83, June 1995, pp. 1849-1935). Rimm knew of this difference, of course, and stated it clearly. (For further information see Hoffman & Novak, The Cyberporn Debate, http://ecommerce.vanderbilt.edu/cyberporn.debate.html and Franz Wegener, Cyberpornographie: Chronologie einer Hexenjagd, http://www.intro-online.de/c6.html)

Almost inevitably, anxieties accompany the introduction of new technologies. In the 19th century it was said that traveling by train was bad for one's health. The debate produced by Time magazine's cover story and Senator Grassley's attack created the impression that the Net has multiplied the possible dangers for children. The global communication networks seemed to be an inexhaustible source of mushrooming child pornography. Later, would-be bomb recipes found on the Net added to the already prevailing anxieties. As even in industrialized countries most people still have little or no first-hand experience with the Net, anxieties about child pornography or terrorist attacks can be stirred up and exploited easily.

A similar and related debate is going on about the glorification of violence and erotic depictions in the media. Pointing to a "toxic popular culture" shaped by media that "distort children's view of reality and even undermine their character growth", US right-wing social welfare organizations and think tanks call for strong media censorship. (See An Appeal to Hollywood, http://www.media-appeal.org/appeal.htm) Media, especially films and videos, are already censored and rated, so what is being demanded is simply more censorship.

The intentions behind stimulating a debate on child pornography on the Net were manifold: among other things, it served the Republican Party in attacking Democrat Al Gore's initiative to supply all public schools with Internet access; additionally, the big media corporations realized that because of the Net they might have to face new competitors and rushed to press for content regulation. Taking all these intentions together, we can say that this still ongoing debate constitutes the first and best-known attempt to impose content regulation on the Net. Consequently, at least in Western countries, governments and media corporations refer to child pornography to justify legal requirements and the implementation of technologies for the surveillance and monitoring of individuals, the filtering, rating and blocking of content, and the prohibition of anonymous publishing on the Net.

In the name of "cleaning" the Net of child pornography, our basic rights are restricted. It may even seem that it is the insistence on unrestricted basic rights that needs to be justified.

Underlying the campaign to control the Net are several assumptions, among them: that the Net lacks control and needs to be made safe and secure; that we may be exposed inadvertently to pornographic content; and that this content is harmful to children. Remarkably, racism seems not to be an issue.

The Net, especially the World Wide Web, is not like television (although it is to be feared that this is what it might become within the next few years). Say little Mary types "Barbie" into a search engine and looks through the results. It is true that you may sometimes notice that pornography is just a few mouse clicks away, but you are unlikely to be exposed to pornographic content unless you make those clicks deliberately.

In reaction to these anxieties, but in the absence of data on how children actually use the Internet, the US government passed the Communications Decency Act (CDA) in 1996. In response, the Electronic Frontier Foundation (EFF) launched the famous Blue Ribbon Campaign, and, among others, America Online and Microsoft Corporation supported a lawsuit by the American Civil Liberties Union (ACLU) against the Act. On June 26, 1997, the US Supreme Court ruled the CDA unconstitutional under the provisions of the First Amendment to the Constitution: the Communications Decency Act violated the basic right to free expression. After a summit with the US government, industry leaders announced the use of existing rating and blocking systems, and the development of new ones, for "inappropriate" online resources.

So, after the failure of the CDA, the US government shifted its responsibility to the industry by inviting corporations to take on governmental tasks. Bearing in mind the CompuServe case and its possible consequences, the industry welcomed this decision and was quick to call this newly assumed responsibility "self-regulation". Strictly speaking, "self-regulation" as meant by the industry does not amount to the regulation of the behaviour of corporations by themselves. On the contrary, "self-regulation" is to be understood as the regulation of users' behaviour through the rating, filtering and blocking of Internet content considered inappropriate. The Internet industry tries to show that technical solutions are preferable to legislation and wants to make sure that it will not be held responsible and liable for illegal, offensive or harmful content. A new CompuServe case and a new Communications Decency Act are to be averted.

In the Memorandum Self-regulation of Internet Content, released in late 1999 by the Bertelsmann Foundation, it is recommended that the Internet industry join forces with governmental institutions to enforce codes of conduct and to encourage the implementation of filtering and rating systems. For further details on the Memorandum see the study by the Center for Democracy and Technology, An Analysis of the Bertelsmann Foundation Memorandum on Self-Regulation of Internet Content: Concerns from a User Empowerment Perspective.

In fact, the "self-regulation" of the Internet industry is privatized censorship performed by corporations and right-wing NGOs. Censorship has become a business. "Crucially, the lifting of restrictions on market competition hasn't advanced the cause of freedom of expression at all. On the contrary, the privatisation of cyberspace seems to be taking place alongside the introduction of heavy censorship." (Richard Barbrook and Andy Cameron, The Californian Ideology)

While trying to convince us that its technical solutions are appropriate alternatives to government regulation, the Internet industry cannot dispense with governmental backing to enforce the proposed measures. This adds to and reinforces the censorship measures already undertaken by governments. We are encouraged to use today's information and communication technologies, while the flow of information is restricted.

According to a report by Reporters Sans Frontières, quoted by Leonard R. Sussman in his essay Censor Dot Gov. The Internet and Press Freedom 2000, the following countries totally or largely control Internet access: Azerbaijan, Belarus, Burma, China, Cuba, Iran, Iraq, Kazakhstan, Kirghizstan, Libya, North Korea, Saudi Arabia, Sierra Leone, Sudan, Syria, Tajikistan, Tunisia, Turkmenistan, Uzbekistan, and Vietnam.

TEXTBLOCK 1/52 // URL: http://world-information.org/wio/infostructure/100437611742/100438658968
 
Racism on the Internet

The internet can be regarded as a mirror of the variety of interests, attitudes and needs of humankind. Propaganda and disinformation thus inevitably form part of it, whether they struggle for something good or evil. But the old classifications no longer function.
In recent years the internet has opened up a new outlet for racism, as it can be difficult to identify the person who put a certain message onto the net. The anarchy of the internet provides racists with possibilities to reach people which they do not possess in other media, for legal and other reasons.

In the 1980s racist groups used mailboxes to communicate on an international level; the first to do so were supposedly the Ku Klux Klan and mailboxes like the Aryan Nations Liberty Net. In the meantime those mailboxes have moved to the internet. In 1997 about 600 extreme right websites were on the net, most of them coming from the USA, and the number is growing. The shocking element is not the number of racist pages - it is still very small compared to the millions of pages one can find in this medium - but the evidence of intentional disinformation, the language and the hatred that make them dangerous.
A whole network of anti-racist organizations, including a large number of websites, is fighting against racism. For example:

http://motlc.wiesenthal.com/text/x32/xr3257.html
http://www.aranet.org/
http://www.freespeech.org/waronracism/files/allies.htm
http://www.nsdapmuseum.com
http://www.globalissues.org/HumanRights/Racism.asp

TEXTBLOCK 2/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658620
 
World War II ...

Never before had propaganda been as important as in the Second World War. From then on, education became one more field of propaganda: its purpose was to teach how to think, while pure propaganda was supposed to show what to think.
Every nation founded at least one ministry of propaganda - of course without calling it that. The British called it the Ministry of Information (= MOI), the U.S. distinguished between the Office of Strategic Services (= OSS) and the Office of War Information (= OWI), the Germans created a Ministry of Propaganda and Public Enlightenment (= RMVP), and the Japanese called their disinformation and propaganda campaign the "Thought War".
British censorship was so strict that the text of an ordinary propaganda leaflet, which had been dropped from planes several million times, was not given to a journalist who asked for it.

Atrocity stories were no longer used in the same way as in the First World War. Instead, black propaganda was preferred, especially to separate the Germans from their leaders.
German war propaganda had started long before the war. In the mid-1930s Leni Riefenstahl filmed Hitler's best propaganda movies. For the most famous one, "Triumph of the Will" (1935), she was the only professional filmmaker allowed to take close-up pictures of her admirer.

Some of those pictures of fear, hatred and intolerance still exist in people's heads. In this respect the propaganda did its job well; unfortunately, it was the anti-National Socialist propaganda that failed at the time.

TEXTBLOCK 3/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658610
 
The "Corpse-Conversion Factory"-rumor

Supposedly the most famous British atrocity story about the Germans during World War I was the "Corpse-Conversion Factory" rumor: it was said that the Germans produced soap out of corpses. The story was so widely believed that it was repeated for years - without any clear evidence at the time. (Taylor, Munitions of the Mind, p. 180)

TEXTBLOCK 4/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658427
 
Global Data Flows

Fiber-optic cables, coaxial cables, copper wires, electric power lines, microwaves, satellite communication, mobile telephony, computer networks: Various telecommunication networks following a variety of standards with bewildering abbreviations - DSL, WAP, GSM, UMTS, Ipv4 etc. - and carrying endless flows of capital and information are the blood veins of modern societies.

In the space of flows constituted by today's global data networks, the space of places is transcended. Visualizations of these global data flows show arches bridging seas and continents, thereby linking the world's centres of research and development, economics and politics. In the global "Network Society" (Manuel Castells) the traditional centres of power and domination are not discarded; on the contrary, they are strengthened and reinforced by the use of information and communication technologies. Political, economic and symbolic power becomes increasingly linked to the use of modern information and communication technologies. The most sensitive and advanced centres of information and communication technologies are the stock markets. Excluded from the network constituted by modern information and communication technologies, large parts of Africa, Asia and South America, but also the poor of industrialized countries, are becoming increasingly marginal to the world economy.

Cities are centres of communications, trade and power. The higher the percentage of urban population, the more likely it is that the telecommunications infrastructure is good to excellent. This goes hand in hand with lower telecommunications costs. Those parts of the world with the poorest infrastructure are also the world's poorhouse. In Bangladesh a personal computer is, for most of the population, as expensive as a limousine is in Europe: while Europeans pay about one month's salary for a PC, Bangladeshis would have to pay about eight annual salaries. Telecommunications infrastructure is therefore concentrated in the highly industrialized world: most telephone mainlines, mobile telephones, computers, Internet accounts and Internet hosts (computers connected to the global data networks) can be found here. The same applies to media: the daily circulation of newspapers and the use of TV sets and radios. Telecommunication and media services affordable to most parts of the population are largely restricted to industrialized countries.

This situation will not change in the foreseeable future: Most expenditure for telecommunications infrastructure will be restricted to the richest countries in the world. In 1998, the world's richest countries consumed 75% of all cables and wires.

TEXTBLOCK 5/52 // URL: http://world-information.org/wio/infostructure/100437611791/100438658776
 
Movies as a Propaganda- and Disinformation-Tool in World War I and II

Movies produced in Hollywood in 1918/19 were mainly anti-German. They had some influence, but the bigger effect was achieved by World War II movies.
The first propaganda movie of World War II was British.
At that time all films had to pass censorship. Most popular were entertaining movies with propaganda messages. The enemy was shown as a beast, an animal-like creature, a brutal person without a soul, and as an idiot, whereas one's own people were the heroes. That was the new form of atrocity story.
Leni Riefenstahl was a genius in this respect. Her movies still have an incredible power, while the majority of the other movies of that time look ridiculous today. The combination of light and shadow, the dramatic music and the mass scenes that resembled ballet had their effect - and political consequences. Some of the German movies of that period are still on the index.

U.S. President Theodore Roosevelt considered movies the best propaganda instrument, as they are more subtle than other tools.

In the late twenties movies became more and more important, in the USSR too, as Sergei Eisenstein demonstrated with his films. Historic events were turned into symbolism, exactly the way propaganda should function. It was disinformation - but in its most artistic form, especially in comparison with most U.S. and European movies of that time.

TEXTBLOCK 6/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658547
 
Iris recognition

Iris recognition relies upon the fact that every individual's iris has a unique structure. The iris landscape is composed of a corona, crypts, filaments, freckles, pits, radial furrows and striations. Iris scanning is considered a particularly accurate identification technology because the characteristics of the iris do not change during a person's lifetime, and because there are several hundred variables in an iris which can be measured. In addition, iris scanning is fast: it does not take longer than one or two seconds.

These characteristics have made iris scanning an attractive technology for high-security applications such as prison surveillance. Iris technology is also used for online identification, where it can substitute for identification by password. As with other biometric technologies, the use of iris scanning for the protection of privacy is a two-edged sword: the prevention of identity theft applies horizontally but not vertically, i.e. the data retrieval that accompanies identification, and the data body which is created in the process, are threats of their own that have nothing to do with identity theft.
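
To make the idea of template matching concrete, here is a rough Python sketch: it assumes the measured iris characteristics have already been encoded as a fixed-length bit string (an "iris code") and compares two such codes by the fraction of differing bits. The encoding step, the toy 16-bit codes and the threshold are illustrative assumptions, not the method of any particular product.

# Sketch of iris template comparison. The iris pattern is assumed to be
# already encoded as a fixed-length bit string; identity is accepted when
# the fraction of differing bits (the Hamming distance) is small enough.
def hamming_fraction(code_a: str, code_b: str) -> float:
    """Fraction of positions at which two equally long bit strings differ."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

def same_iris(stored_code: str, live_code: str, threshold: float = 0.32) -> bool:
    """Accept the identity claim if the two codes are sufficiently similar."""
    return hamming_fraction(stored_code, live_code) <= threshold

stored = "1011001110001011"   # toy 16-bit template; real codes use thousands of bits
live   = "1011011110001001"   # fresh scan of (presumably) the same eye
print(same_iris(stored, live))   # True: only 2 of 16 bits differ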

TEXTBLOCK 7/52 // URL: http://world-information.org/wio/infostructure/100437611729/100438658334
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the cold war the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancements. Therefore in 1963 the Defense Advanced Research Projects Agency (DARPA) granted the Massachusetts Institute of Technology (MIT) U.S.$ 2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the project was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, the MIT research team showed that, when confined to a small subject matter, computer programs could solve spatial and logic problems. Other advances in the field of AI at the time were: the proposal of new theories about machine vision by David Marr, Marvin Minsky's frame theory, the PROLOG language (1972) and the development of expert systems.

TEXTBLOCK 8/52 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
2000 A.D.

2000
Convergence of telephony, audiovisual technologies and computing

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new communication systems. This is what is meant by today's buzzwords "multimedia" and "convergence".

Classical dichotomies such as that between computing and telephony, and traditional categorizations, no longer apply, because these new services no longer fit the traditional categories.

Convergence and Regulatory Institutions

Digital technology permits the integration of telecommunications with computing and audiovisual technologies. New services that extend existing communication systems emerge. The convergence of communication and media systems corresponds to a convergence of corporations. Recently, America Online, the world's largest online service provider, merged with Time Warner, the world's largest media corporation. For such corporations the classical approach to regulation - separate institutions regulate separate markets - is no longer appropriate, because the institutions' activities necessarily overlap. The current challenges posed to these institutions are not solely due to the convergence of communication and media systems made possible by digital technologies; they are also due to the liberalization and internationalization of the electronic communications sector. For regulation to be successful, new categorizations and supranational agreements are needed.
For further information on this issue see Natascha Just and Michael Latzer, The European Policy Response to Convergence with Special Consideration of Competition Policy and Market Power Control, http://www.soe.oeaw.ac.at/workpap.htm or http://www.soe.oeaw.ac.at/WP01JustLatzer.doc.

TEXTBLOCK 9/52 // URL: http://world-information.org/wio/infostructure/100437611796/100438659802
 
Cartoons

The technique of cartoons is simplicity.
Images are easier to remember than texts.
Frequently cartoons make jokes about politicians, either friendly or directed against the person shown. In the first decades of the 20th century, cartoons were also used for propaganda against artists; remember the famous cartoons portraying Oscar Wilde as a criminal, which aimed to destroy his popularity.
As a tool in politics, the cartoon had fatal consequences by fixing stereotypes which could never again be erased, even when exposed as pure disinformation. The most famous were the cartoons about Jews, which were distributed not only by Germans and Austrians but all over Europe, already in the 1910s and 1920s. Most horrifying is the fact that many of those old fascist and racist cartoons are coming back now, in only slightly different designs.

TEXTBLOCK 10/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658509
 
1500 - 1700 A.D.

1588
Agostino Ramelli's reading wheel

Agostino Ramelli designed a "reading wheel", which allowed browsing through a large number of documents without moving from one spot to another.

The device presented a large number of books - a small library - laid open on lecterns on a kind of ferris wheel. It allowed skipping chapters and browsing through pages by turning the wheel to bring lectern after lectern before the eyes. Ramelli's reading wheel thus linked ideas and texts and is reminiscent of today's browsing software used to navigate the World Wide Web.

1597
The first newspaper is printed in Europe.

TEXTBLOCK 11/52 // URL: http://world-information.org/wio/infostructure/100437611796/100438659704
 
Basics: Infringement and Fair Use

The rights of a copyright holder are infringed when one of the acts requiring the authorization of the owner is done by someone else without his consent. In cases of copyright infringement or the violation of neighboring rights, the remedies for the copyright owner consist of civil redress. The unauthorized copying of protected works for commercial purposes and the unauthorized commercial dealing in copied material are usually referred to as "piracy".

Yet copyright laws also provide that the rights of copyright owners are subject to the doctrine of "fair use". That allows the reproduction and use of a work, notwithstanding the rights of the author, for limited purposes such as criticism, comment, news reporting, teaching, and research. Fair use may be described as the privilege to use the copyrighted material in a reasonable manner without the owner's consent. To determine whether a use is fair or not most copyright laws consider:

- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes (usually certain types of educational copying are allowed)

- the nature of the copyrighted work (mostly originals made for commercial reasons are less protected than their purely artistic counterparts)

- the amount and substantiality of the portion used in relation to the copyrighted work as a whole

- the effect of the use upon the potential market for or value of the copyrighted work (as a general rule copying may be permitted if it is unlikely to cause economic harm to the original author)

Examples of activities that may be excused as fair use include: providing a quotation in a book review; distributing copies of a section of an article in class for educational purposes; and imitating a work for the purpose of parody or social commentary.

TEXTBLOCK 12/52 // URL: http://world-information.org/wio/infostructure/100437611725/100438659569
 
Content as Transport Medium for Values and Ideologies

With the dissemination of their content, commercial media are, among other things, also able to transport values and ideologies. Usually their programming reflects society's dominant social, political, ethical, cultural and economic values. A critical view of the prevalent ideologies is often sacrificed so as not to offend the existing political elites and corporate powers, but rather to satisfy shareholders and advertisers.

With most of the world's content produced by a few commercial media conglomerates, and with the overwhelming majority of companies (in terms of revenue generation) concentrated in Europe, the U.S., Japan and Australia, there is also a strong flow of content from the 'North-West' to the 'South-East'. Popular culture developed in the world's dominant commercial centers and Western values and ideologies are thus disseminated into the most distant corners of the earth, with far less coming back.

TEXTBLOCK 13/52 // URL: http://world-information.org/wio/infostructure/100437611795/100438659066
 
1940s - 1950s: The Development of Early Robotics Technology

During the 1940s and 1950s two major developments enabled the design of modern robots. Robotics generally is based on two related technologies: numerical control and teleoperators.

Numerical control was invented during the late 1940s and early 1950s. It is a method of controlling machine tool axes by means of numbers that have been coded on media. The first numerical control machine was presented in 1952 at the Massachusetts Institute of Technology (MIT), whose subsequent research led to the development of APT (Automatically Programmed Tools). APT, a language for programming machine tools, was designed for use in computer-assisted manufacturing (CAM).
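
To illustrate the principle of numerical control - machine tool axes driven by numbers coded on a medium - here is a toy Python sketch. The coordinate format and the "controller" loop are invented for illustration; they do not follow APT or any real controller syntax.

# Toy illustration of numerical control: the tool path is stored as a plain
# sequence of numbers on some medium, and the controller simply replays them
# axis by axis. The format is invented here; it is not APT or G-code.
coded_program = [
    (0.0, 0.0, 5.0),    # lift the tool
    (10.0, 0.0, 5.0),   # move along the X axis
    (10.0, 20.0, 5.0),  # move along the Y axis
    (10.0, 20.0, 0.0),  # lower the tool onto the workpiece
]

def run_program(program: list[tuple[float, float, float]]) -> None:
    """'Controller' loop: read the coded coordinates and drive the axes there."""
    for step, (x, y, z) in enumerate(program, start=1):
        print(f"step {step}: move axes to X={x} Y={y} Z={z}")

run_program(coded_program)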

The first teleoperators were developed in the early 1940s. Teleoperators are mechanical manipulators which are controlled by a human from a remote location. In a typical application a human moves a mechanical arm and hand, with the moves being duplicated at another location.

TEXTBLOCK 14/52 // URL: http://world-information.org/wio/infostructure/100437611663/100438659348
 
Timeline BC

~ 1900 BC: Egyptian writers use non-standard Hieroglyphs in inscriptions of a royal tomb; supposedly this is not the first but the first documented example of written cryptography

1500 an enciphered formula for the production of pottery is recorded in Mesopotamia

parts of the Hebrew writing of Jeremiah's words are written down in "atbash", which is nothing other than a reversed alphabet and one of the first well-known methods of enciphering

4th century Aeneas Tacticus invents a form of beacons, by introducing a sort of water-clock

487 the Spartans introduce the so-called "skytale" for sending short secret messages to and from the battlefield

170 Polybius develops a system to convert letters into numerical characters, an invention called the Polybius Chequerboard.

50-60 Julius Caesar develops an enciphering method, later called the Caesar cipher, shifting each letter of the alphabet by a fixed, previously agreed amount. Like atbash, this is a monoalphabetic substitution.
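
The Caesar cipher described in the last entry is simple enough to sketch in a few lines of Python; the function names below are our own, chosen for illustration.

# Minimal sketch of the Caesar cipher: a monoalphabetic substitution that
# shifts every letter of the alphabet by a fixed, pre-agreed amount.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def caesar_encipher(plaintext: str, shift: int) -> str:
    """Replace each letter by the letter `shift` positions further on."""
    result = []
    for char in plaintext.upper():
        if char in ALPHABET:
            result.append(ALPHABET[(ALPHABET.index(char) + shift) % 26])
        else:
            result.append(char)  # leave spaces and punctuation untouched
    return "".join(result)

def caesar_decipher(ciphertext: str, shift: int) -> str:
    """Deciphering is just enciphering with the opposite shift."""
    return caesar_encipher(ciphertext, -shift)

print(caesar_encipher("VENI VIDI VICI", 3))   # YHQL YLGL YLFL
print(caesar_decipher("YHQL YLGL YLFL", 3))   # VENI VIDI VICI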

TEXTBLOCK 15/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438659084
 
Biometric technologies

In what follows there is a brief description of the principal biometric technologies, whose respective proponents - producers, research laboratories, think tanks - mostly tend to claim superiority over the others. A frequently used definition of "biometric" is that of a "unique, measurable characteristic or trait of a human being for automatically recognizing or verifying identity" (http://www.icsa.net/services/consortia/cbdc/bg/introduction.shtml); biometrics is the study and application of such measurable characteristics. In IT environments, biometrics are categorised as "security" technologies meant to limit access to information, places and other resources to a specific group of people.

All biometric technologies are made up of the same basic processes:

1. A sample of a biometric is first collected, then transformed into digital information and stored as the "biometric template" of the person in question.

2. At every new identification, a second sample is collected and its identity with the first one is examined.

3. If the two samples are identical, the person's identity is confirmed, i.e. the system knows who the person is.

This means that access to the facility or resource can be granted or denied. It also means that information about the person's behaviour and movements has been collected. The system now knows who passed a certain identification point at which time and how long after the previous passage, and it can combine these data with others, thereby appropriating an individual's data body.
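
The three steps above can be sketched in a few lines of Python. The numeric "sample" and the distance threshold are stand-ins for a real system's feature extraction and matching tolerance; they are illustrative assumptions only.

# Sketch of the generic enrol/verify cycle described above. A real system
# would extract features from a fingerprint, iris or face image; here the
# "sample" is just a list of numbers standing in for such measurements.
import math

def enroll(sample: list[float]) -> list[float]:
    """Step 1: store the first sample as the person's biometric template."""
    return list(sample)

def matches(template: list[float], new_sample: list[float],
            threshold: float = 0.1) -> bool:
    """Steps 2 and 3: compare a fresh sample with the stored template.
    Real samples are never bit-identical, so 'identical' in practice means
    'closer than some tolerance threshold'."""
    return math.dist(template, new_sample) <= threshold

template = enroll([0.42, 0.17, 0.88])          # enrolment
print(matches(template, [0.43, 0.16, 0.88]))   # True  -> access granted, passage logged
print(matches(template, [0.10, 0.55, 0.20]))   # False -> access denied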

TEXTBLOCK 16/52 // URL: http://world-information.org/wio/infostructure/100437611729/100438658188
 
History: Communist Tradition

Following the communist revolutions of the 20th century all "means of production" became the property of the state as representative of "the masses". Private property ceased to exist. While moral rights of the creator were recognized and economic rights acknowledged with a one-time cash award, all subsequent rights reverted to the state.

With the transformation of many communist countries to a market system most of them have now introduced laws establishing markets in intellectual property rights. Still the high rate of piracy reflects a certain lack of legal tradition.

TEXTBLOCK 17/52 // URL: http://world-information.org/wio/infostructure/100437611725/100438659483
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three show a similar design: content is split up and spread over several servers; when a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
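
A deliberately simplified Python sketch of this split-and-reassemble design follows. Real systems such as Publius, Free Haven and Freenet add encryption, redundancy and key splitting on top of this basic idea; the function and variable names here are ours.

# Simplified sketch of the design described above: content is split into
# pieces, the pieces are spread over several servers, and a request
# reassembles them, so that no single host holds the whole document.
def split(content: bytes, n_servers: int) -> list[bytes]:
    """Cut the document into n roughly equal pieces."""
    size = -(-len(content) // n_servers)  # ceiling division
    return [content[i:i + size] for i in range(0, len(content), size)]

def publish(content: bytes, servers: list[dict]) -> None:
    """Store one piece per server."""
    for server, piece in zip(servers, split(content, len(servers))):
        server["store"] = piece

def retrieve(servers: list[dict]) -> bytes:
    """Reassemble the document from the pieces held by the servers."""
    return b"".join(server["store"] for server in servers)

servers = [{}, {}, {}]
publish(b"An uncensorable document.", servers)
assert retrieve(servers) == b"An uncensorable document."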

The most advanced system seems to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position held by radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable? in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html .

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 18/52 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
Challenges for Copyright by ICT: Introduction

Traditional copyright and the practice of paying royalties to the creators of intellectual property emerged with the introduction of the printing press (1456). Early copyright law was therefore tailored to the technology of print and the (re)production of works in analogue form. Over the centuries, legislation concerning the protection of intellectual property has been adapted several times in order to respond to technological changes in the production and distribution of information.

Yet again new technologies have altered the way (copyrighted) works are produced, copied, made obtainable and distributed. The emergence of global electronic networks and the increased availability of digitalized intellectual property confront existing copyright law with a variety of questions and challenges. The combination of several types of works within one larger work or on one data carrier is nothing really new; neither is the digital format (although it has only recently become the object of detailed legal scrutiny) or networking (telephone and cable networks have been in use for a long time, although they do not permit interactivity). What is indeed novel is that recent technologies allow the presentation and storage of text, sound and visual information in digital form, so that the entire information can be generated, altered and used by and on one and the same device, irrespective of whether it is provided online or offline.


TEXTBLOCK 19/52 // URL: http://world-information.org/wio/infostructure/100437611725/100438659517
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe or unbreakable. To decipher a text without the key means going through many - sometimes nearly, but never really, endless - attempts. For today's computers it might take hundreds of years or more to go through all possible keys, but in the end the code remains breakable. Much faster quantum computers may prove that one day.
Therefore the decision to choose a certain method of enciphering is finally a matter of trust.
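
Some back-of-the-envelope arithmetic shows why exhaustive search is "nearly - but never really - endless": the keyspace is finite, but for modern key lengths the expected search time is astronomical. The rate of one trillion keys per second in the sketch below is an illustrative assumption, not a benchmark.

# Rough arithmetic behind the statement above: trying every key is always
# possible in principle, but can take absurdly long. The assumed rate of
# one trillion keys per second is purely illustrative.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits: int, keys_per_second: float = 1e12) -> float:
    """Expected years to try half of all keys of the given length."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / keys_per_second / SECONDS_PER_YEAR

for bits in (40, 56, 128):
    print(f"{bits:3d}-bit key: about {years_to_search(bits):.2e} years")
# At this assumed rate, 40-bit keys fall in a fraction of a second, while
# 128-bit keys would take around 5e18 years - yet the search remains finite,
# which is the point made above: the code stays breakable in principle.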

For the average computer user it is rather difficult to understand or even realize the dangers and/or the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others, the specialists, and relying on the information they provide.
The websites explaining the underlying problems (and also the articles and books on the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial parts of those texts can be understood, whereas the real background remains an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. The media reports mentioned often concentrate on the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and on the danger of losing the money in one's bank account if someone steals a credit card number or other important financial data. The notion of "security" connected to those issues is a completely different one from the security that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

for the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 20/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of networks have to agree on. So there is no way to ever gain ultimate control of the Matrix/Internet.
The in some respects decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and no common standards for sustaining basic operations and administration: there are authorities for IP number and domain name registration, for example.
Over time, the organizational structures for Internet administration have changed according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies make decisions on common guidelines, such as communication protocols, cooperatively, so that compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified, yet widely accepted, consensus about what is allowed and what is forbidden on the Internet. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just state the prevailing consensus expressly. Violations of accepted standards are fiercely rejected, as reactions to misbehaviour in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulation, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.

TEXTBLOCK 21/52 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Biometric applications: surveillance

Biometric technologies are not surveillance technologies in themselves, but as identification technologies they provide an input into surveillance: technologies such as face recognition are combined with camera systems and criminal databases in order to supervise public places and single out individuals.

Another example is the use of biometric technologies in the supervision of probationers, who in this way carry their special hybrid status between imprisonment and freedom with them, so that they can be tracked down easily.

Unlike biometric applications in access control, where one is aware of the biometric data extraction process, what makes biometrics used in surveillance a particularly critical issue is the fact that biometric samples are extracted routinely, unnoticed by the individuals concerned.

TEXTBLOCK 22/52 // URL: http://world-information.org/wio/infostructure/100437611729/100438658740
 
Economic structure; digital euphoria

The dream of a conflict-free capitalism appeals to a diverse audience. No politician can win elections without eulogising the benefits of the information society and promising universal wealth through informatisation. "Europe must not lose track and should be able to make the step into the new knowledge and information society in the 21st century", said Tony Blair.

The US government has declared the construction of a fast information infrastructure network the centerpiece of its economic policies

In Lisbon the EU heads of state agreed to accelerate the informatisation of the European economies

The German Chancellor Schröder has requested the industry to create 20,000 new informatics jobs.

The World Bank understands information as the principal tool for third world development

Electronic classrooms and on-line learning schemes are seen as the ultimate advance in education by politicians and industry leaders alike.

But in the informatised economies, traditional exploitative practices are obscured by the glamour of new technologies. And the nearly universal acceptance of the ICT message has prepared the ground for a revival of 19th century "adapt-or-perish" ideology.

"There is nothing more relentlessly ideological than the apparently anti-ideological rhetoric of information technology"

(Arthur and Marilouise Kroker, media theorists)

TEXTBLOCK 23/52 // URL: http://world-information.org/wio/infostructure/100437611726/100438658999
 
Virtual cartels, oligopolistic structures

Global networks require global technical standards ensuring the compatibility of systems. Being able to define such standards makes a corporation extremely powerful. And it requires the suspension of competitive practices. Competition is relegated to the symbolic realm. Diversity and pluralism become the victims of the globalisation of baroque sameness.

The ICT market is dominated by incomplete competition aimed at short-term market domination. In a very short time, new ideas can turn into best-selling technologies. Innovation cycles are extremely short. But today's state-of-the-art products are embryonic trash.

    According to the Computer and Communications Industry Association, Microsoft is trying to aggressively take over the network market. This would mean that AT&T would control 70 % of all long distance phone calls and 60 % of cable connections.



    AOL and Yahoo are lone leaders in the provider market. AOL has 21 million subscribers in 100 countries. In a single month, AOL registers 94 million visits. Two thirds of all US internet users visited Yahoo in December 1999.



    The world's 13 biggest internet providers are all American.



    AOL and Microsoft have concluded a strategic cross-promotion deal. In the US, the AOL icon is installed on every Windows desktop. AOL has also concluded a strategic alliance with Coca Cola.


TEXTBLOCK 24/52 // URL: http://world-information.org/wio/infostructure/100437611709/100438658963
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille, a card with holes: messages are written on paper through the holes of the card. Afterwards the card is removed and the blanks are filled in, so the message looks like an ordinary letter. The recipient needs to own the same card

- Bishop John Wilkins invents a cryptologic system that looks like musical notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken form of encryption that had already been used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted text that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one suspects that the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary scale which we still use today, more advanced of course, called the ASCII code

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read international diplomatic mail, copy the letters and return them to the post office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse code, which actually is not a code but an enciphered alphabet of short and long signals. The first Morse code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes are absolutely necessary by then

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 for the first time a tomographic cipher is described

1861 Friedrich W. Kasiski publishes a cryptanalysis of the Vigenère ciphers, which had been supposed to be uncrackable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography again and makes them even more important
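
The Vigenère cipher mentioned in the 1861 entry is a polyalphabetic substitution: a repeating keyword selects a different shift for every letter, and it is precisely the repetition of the keyword that Kasiski's analysis exploits. A minimal Python sketch (the keyword is arbitrary):

# Minimal sketch of the Vigenère cipher: each letter of the keyword selects
# a different Caesar-style shift. The keyword's repetition is the weakness
# that Kasiski's 1861 analysis exploits.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def vigenere(text: str, keyword: str, decipher: bool = False) -> str:
    out = []
    key_shifts = [ALPHABET.index(k) for k in keyword.upper()]
    i = 0
    for char in text.upper():
        if char not in ALPHABET:
            out.append(char)       # leave spaces and punctuation untouched
            continue
        shift = key_shifts[i % len(key_shifts)]
        if decipher:
            shift = -shift
        out.append(ALPHABET[(ALPHABET.index(char) + shift) % 26])
        i += 1
    return "".join(out)

ciphertext = vigenere("ATTACK AT DAWN", "LEMON")
print(ciphertext)                                    # LXFOPV EF RNHR
print(vigenere(ciphertext, "LEMON", decipher=True))  # ATTACK AT DAWN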

TEXTBLOCK 25/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Databody convergence

In the phrase "the rise of the citizen as a consumer", to be found on the EDS website, the cardinal political problem posed by the databody industry is summarised: the convergence of commercial and political interest in the data body business, the convergence of bureaucratic and commercial data bodies, the erosion of privacy, and the consequent undermining of democratic politics by private business interest.

When the citizen becomes a consumer, the state must become a business. In the data body business, the key word behind this new identity of government is "outsourcing": functions that are not considered core functions of government activity are put into the hands of private contractors.

There have long been instances where privately owned data companies, e.g. credit card companies, are allowed access to public records such as public registries or electoral rolls. In a normal credit card transaction, for example, credit card companies have had access to public records in order to verify the identity of a customer. In the UK, citizens' personal data stored on the Electoral Roll have been used for commercial purposes for a long time. The new British Data Protection Act now allows people to "opt out" of this kind of commercialisation - a piece of legislation that has prompted protests on the part of the data industry: Experian has claimed to lose LST 500 mn as a consequence of this restriction - a figure that, even if exaggerated, may help one understand what the value of personal data actually is.

While this may serve as an example of an increased public awareness of privacy issues, the trend towards outsourcing government functions seems to lead to a complete breakdown of the barriers between commercial and public use of personal data.

Governments increasingly outsource work that is not considered a core function of government, e.g. cooking meals in hospitals or mowing lawns in public parks. Such peripheral activities marked a first step of outsourcing. In a further step, governmental functions were divided into executive and judgemental functions, and the executive functions were increasingly entrusted to private agencies. For these agencies to be able to carry out the work assigned to them, they need data - data that was once stored in public places and whose handling was therefore subject to democratic accountability. Outsourcing has produced gains in efficiency and a decrease in accountability. Outsourced data are less secure, and what use they are put to is difficult to control.

The world's largest data corporation, EDS, is also among the foremost outsourcing companies. In an article about EDS' involvement in government outsourcing in Britain, Simon Davies shows how the general trend towards outsourcing, combined with advances in computer technology, allows companies like EDS, outside of any public accountability, to create something like blueprints for the societies of the 21st century. But the problem of accountability is not the only one to be considered in this context. As Davies argues, the data business is taking on its own momentum: "a ruthless company could easily hold a government to ransom". As the links between government agencies and citizens thin out, however, the links among the various agencies might increase. Linking the various government information systems would amount to a further increase in efficiency, and a further undermining of democracy. The latter, after all, relies upon the separation of powers - matching government information systems would therefore pave the way to a kind of electronic totalitarianism that has little to do with the ideological bent of George Orwell's 1984 vision, but operates on purely technocratic principles.

Technically, the linking of different systems is already possible. It would also create more efficiency, which means generating more income. The open question is whether concerns about democracy will prevent it from happening.

But what the EDS example shows is something that applies everywhere: the data industry is, whether by intention or by default, a project with profound political implications. The current that drives the global economy deeper and deeper into becoming a global data body economy may be too strong to be stopped by conventional means.

However, the convergence of political and economic data bodies also has technological roots. The problem is that politically motivated surveillance and economically motivated data collection are located in the same area of information and communication technologies. For example, monitoring internet use requires more or less the same technical equipment whether it is done for political or economic purposes. Data mining and data warehousing techniques are almost the same. Creating transparency of citizens and customers is therefore a common objective of intelligence services and the data body industry. Given that data are exchanged in electronic networks, compatibility among the various systems is essential. This is another factor that encourages "leaks" between state-run intelligence networks and the private data body business. And finally, given the secretive nature of state intelligence and commercial data capturing, there is little transparency. Both structures occupy an opaque zone.

TEXTBLOCK 26/52 // URL: http://world-information.org/wio/infostructure/100437611761/100438659769
 
Timeline 1900-1970 AD

1913 the wheel cipher gets re-invented as a strip

1917 William Frederick Friedman starts working as a cryptanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptanalysis

- an AT&T-employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random-keys

1918 the Germans start using the ADFGVX system, which is later broken by the Frenchman Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German Military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine for the German Military

late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizebeth Smith Friedman regularly deciphers the codes of rum smugglers during Prohibition

1929 Lester S. Hill publishes his book Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken by the Polish cryptologist Marian Rejewski and, from 1939 on, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese introduce their so-called Purple machine, developed after the revelations of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they imagined their own codes to be unbreakable as well - and were not careful enough.

1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code breaking computer is put into action at Bletchley Park

1943-1980 the cryptographic Venona project, run by the NSA, takes place over a longer period than any other program of that type

1949 Claude Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops a system of separate public-keys and private-keys
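
The polyalphabetic random-key machine of the 1917/18 Vernam entry survives today as the one-time pad. The byte-wise XOR below is the modern analogue of Vernam's teleprinter implementation; the message and the freshly generated key are of course only illustrative.

# Sketch of the principle behind Vernam's cipher: combine each unit of the
# message with a unit of a random key. With a truly random key used only
# once (a one-time pad), the scheme is unbreakable.
import secrets

def vernam(data: bytes, key: bytes) -> bytes:
    """XOR message and key byte by byte; applying it twice restores the text."""
    assert len(key) >= len(data), "the key must be at least as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"SECRET ORDERS"
key = secrets.token_bytes(len(message))   # random key, as in Vernam's design
ciphertext = vernam(message, key)
print(ciphertext)
print(vernam(ciphertext, key))            # b'SECRET ORDERS'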

TEXTBLOCK 27/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
Late 1970s - Present: Fourth Generation Computers

Following the invention of the first integrated circuits, ever more components could be fitted onto a single chip. LSI (Large Scale Integration) was followed by VLSI (Very Large Scale Integration) and ULSI (Ultra-Large Scale Integration), which increased the number of components squeezed onto one chip into the millions and helped to diminish both the size and the price of computers. The new chips took the idea of the integrated circuit one step further, as they allowed the manufacture of a single microprocessor that could then be programmed to meet any number of demands.

Also, following the introduction of the minicomputer in the mid-1970s, a market for personal computers (PCs) was established by the early 1980s. As computers had become easier to use and cheaper, they were no longer mainly utilized in offices and manufacturing, but also by the average consumer. Accordingly, the number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used.

Further developments included the creation of mobile computers (laptops and palmtops) and especially networking technology. While mainframes shared time with many terminals for many applications, networking allowed individual computers to form electronic co-operations. LANs (Local Area Networks) permitted computers to share memory space, information and software, and to communicate with each other. Although LANs could already reach enormous proportions, it was only with the invention of the Internet that an information and communication network on a global basis was established for the first time.

TEXTBLOCK 28/52 // URL: http://world-information.org/wio/infostructure/100437611663/100438659451
 
ZNet

ZNet provides forum facilities for online discussion and chatting on various topics ranging from culture and ecology to international relations and economics. ZNet also publishes daily commentaries and maintains a Web-zine, which addresses current news and events as well as many other topics, trying to be provocative, informative and inspiring to its readers.

Strategies and Policies

Daily Commentaries: ZNet's commentaries address current news and events, cultural happenings, and organizing efforts, providing context, critique, vision, and analysis, but also references to or reviews of broader ideas, new books, activism, the Internet, and other topics that strike the diverse participating authors as worthy of attention.

Forum System: ZNet provides a private (and soon also a public) forum system. The fora are concerned, among others, with topics such as activism, culture, community/race/religion/ethnicity, ecology, economics/class, gender/kinship/sexuality, government/polity, international relations, ParEcon, vision/strategy and popular culture. Each forum has a set of threaded discussions; this also applies to the fora hosted by commentary writers like Chomsky, Ehrenreich, Cagan, Peters and Wise.

ZNet Daily WebZine: ZNet Daily WebZine offers commentaries in web format.

Z Education Online (planned): The Z Education Online site will provide instructionals and courses of diverse types as well as other university-like, education-aimed features.

TEXTBLOCK 29/52 // URL: http://world-information.org/wio/infostructure/100437611734/100438659288
 
Gait recognition

The fact that an individual's identity is expressed not only by the way he/she looks or sounds, but also by the manner of walking, is a relatively new discovery in biometrics.

Unlike the more fully developed biometric technologies whose scrutiny is directed at stationary parts of the body, gait recognition has the added difficulty of having to sample and identify movement. Scientists at the University of Southampton, UK (http://www.isis.ecs.soton.ac.uk/research/gait/) have developed a model which likens the movement of the legs to that of a pendulum and uses hip inclination as a variable.

Another model considers the shape and length of legs as well as the velocity of joint movements. The objective is to combine both models into one, which would make gait recognition a fully applicable biometric technology.
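To illustrate the general idea only - this is a hypothetical toy sketch in Python, not the Southampton model - one can imagine sampling hip inclination over time: the dominant oscillation frequency and the swing amplitude already yield a crude, person-dependent gait feature that could be compared against stored templates.

import numpy as np

def gait_signature(hip_angle_deg: np.ndarray, sample_rate_hz: float):
    """Return (dominant stride frequency in Hz, swing amplitude in degrees)
    from a time series of hip inclination angles - a toy 'pendulum' feature."""
    centred = hip_angle_deg - hip_angle_deg.mean()
    spectrum = np.abs(np.fft.rfft(centred))
    freqs = np.fft.rfftfreq(len(centred), d=1.0 / sample_rate_hz)
    dominant = freqs[spectrum[1:].argmax() + 1]  # skip the zero-frequency bin
    amplitude = centred.max() - centred.min()
    return float(dominant), float(amplitude)

# Hypothetical usage: a walker whose hips swing about 0.9 times per second.
t = np.arange(0, 10, 0.02)                        # 10 seconds sampled at 50 Hz
walker = 20 * np.sin(2 * np.pi * 0.9 * t)
print(gait_signature(walker, sample_rate_hz=50))  # roughly (0.9, 40.0)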

Given that gait recognition is applied to "moving preambulatory subjects", it is a particularly interesting technology for surveillance. People can no longer hide their identity by covering themselves or moving. Female shoplifters who feign pregnancy can be detected because they walk differently from women who really are pregnant. Potential wrongdoers might resort to walking techniques such as those developed in Monty Python's legendary "Ministry of Silly Walks" (http://www.stone-dead.asn.au/sketches/sillwalk.htm).

TEXTBLOCK 30/52 // URL: http://world-information.org/wio/infostructure/100437611729/100438658388
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers claiming otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the precursor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stressing the importance of measurement for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends and to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, the traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services.
They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network; the private line networks, in contrast, are considerably larger in aggregate capacity than the Internet - about as large as the U.S. voice network (with an effective bandwidth of about 330 Gbps as of December 1997) - but they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications; both consume more bandwidth and account for unanticipated amounts of data traffic.
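A back-of-the-envelope calculation shows why a 100% annual growth rate closes such a gap quickly. The following Python sketch is purely illustrative; the starting voice-to-data ratio of 16 is a hypothetical assumption, not a figure from the paper.

import math

def years_until_overtake(voice_to_data_ratio: float, annual_growth: float = 1.0) -> float:
    """Years until data traffic catches up with voice traffic, assuming data
    traffic grows by `annual_growth` per year (1.0 = +100%) and voice stays flat."""
    return math.log(voice_to_data_ratio) / math.log(1.0 + annual_growth)

# If voice traffic were 16 times larger than data traffic at the end of 1997,
# doubling every year would close the gap in log2(16) = 4 years, i.e. in the
# early 2000s - the order of magnitude of the prediction cited above.
print(years_until_overtake(16.0))  # prints roughly 4 (years)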

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Besides the small sample size, this method has at least one more flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
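The sampling approach itself is simple to sketch. The following Python fragment is a hypothetical illustration of the principle, not ISC's actual survey code; it assumes you already have a list of addresses that have names assigned and a Unix-style ping command.

import random
import subprocess

def responds_to_ping(ip: str) -> bool:
    """Send a single ICMP echo request (Linux 'ping' flags) and report success."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def estimate_pingable_hosts(named_addresses: list, sample_fraction: float = 0.01) -> int:
    """Ping a random sample of the addresses that have names assigned and
    project the share of responding hosts onto the whole list."""
    sample_size = max(1, int(len(named_addresses) * sample_fraction))
    sample = random.sample(named_addresses, sample_size)
    responding = sum(responds_to_ping(ip) for ip in sample)
    return round(len(named_addresses) * responding / sample_size)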

Internet Weather

Like the daily weather, traffic on the Internet, that is, the conditions for data flows, is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare the response times to past ones and to the response times of servers in the same area.
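The principle can be sketched in a few lines of Python; this is a hypothetical illustration, not the code of any actual Internet weather service.

import subprocess
import time

def round_trip_ms(host: str) -> float:
    """Rough round-trip time of a single ping in milliseconds (includes process
    start-up overhead); returns infinity if the host does not answer."""
    start = time.perf_counter()
    result = subprocess.run(
        ["ping", "-c", "1", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    if result.returncode != 0:
        return float("inf")
    return (time.perf_counter() - start) * 1000.0

def weather_rating(current_ms: float, past_ms: list) -> int:
    """Map the current response time onto a 0-100 scale relative to past values:
    100 = as fast as the best past measurement, 0 = as slow as the worst."""
    best, worst = min(past_ms), max(past_ms)
    if worst == best:
        return 100
    score = 100.0 * (worst - current_ms) / (worst - best)
    return int(max(0.0, min(100.0, round(score))))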

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and with them IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may read a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best just slightly better, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to get to know more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.
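How such figures may be derived from a Web server's log file can be sketched as follows. This is a hypothetical illustration in Python, not the method of any particular ratings service; the page-detection rule and the 30-minute session timeout are assumptions. It also shows why hits overcount documents and why page views and visits still cannot tell apart several users sharing one address.

import re
from collections import defaultdict
from datetime import datetime, timedelta

LOG_LINE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?:GET|POST) (?P<path>\S+)')
PAGE_SUFFIXES = (".html", ".htm", "/")  # crude rule: count documents, not embedded graphics
SESSION_GAP = timedelta(minutes=30)     # common but arbitrary session timeout

def analyse(log_lines):
    """Count hits (every requested file), page views (documents only) and
    visits (sessions per client address) from Common-Log-Format lines."""
    hits = page_views = 0
    visits = defaultdict(int)   # client address -> number of sessions
    last_seen = {}              # client address -> time of last request
    for line in log_lines:
        match = LOG_LINE.match(line)
        if not match:
            continue
        hits += 1
        when = datetime.strptime(match["ts"].split()[0], "%d/%b/%Y:%H:%M:%S")
        client, path = match["ip"], match["path"]
        if path.endswith(PAGE_SUFFIXES):
            page_views += 1
        if client not in last_seen or when - last_seen[client] > SESSION_GAP:
            visits[client] += 1
        last_seen[client] = when
    return hits, page_views, sum(visits.values())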

For

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 31/52 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Other biometric technologies

Other biometric technologies not specified here include ear recognition, signature dynamics, keystroke dynamics, vein pattern recognition, retinal scans, body odour recognition, and DNA recognition. These are technologies which are either in early stages of development or used in highly specialised and limited contexts.

TEXTBLOCK 32/52 // URL: http://world-information.org/wio/infostructure/100437611729/100438658399
 
Hill & Knowlton

Although it is generally hard to distinguish between public relations and propaganda, Hill & Knowlton, the world's leading PR agency, represents an extraordinary example of the manipulation of public opinion through public relations activities. Hill & Knowlton not only lobbied for countries accused of human rights abuses, like China, Peru, Israel, Egypt and Indonesia, but also represented the repressive Duvalier regime in Haiti.

It furthermore played a central role in the Gulf War. On behalf of the Kuwaiti government it presented a 15-year-old girl to testify before Congress about human rights violations in a Kuwaiti hospital. The girl was later found to be the daughter of Kuwait's ambassador to the U.S., and her testimony became the centerpiece of a finely tuned PR campaign orchestrated by Hill & Knowlton and co-ordinated with the White House on behalf of the government of Kuwait and the Citizens for a Free Kuwait group. Inflaming public opinion against Iraq and bringing the U.S. Congress in favor of war in the Gulf, this was probably one of the largest and most effective public relations campaigns in history.

Running campaigns against abortion for the Catholic Church and representing the Church of Scientology, large PR firms like Hill & Knowlton scarcely hesitate to manipulate public and congressional opinion and government policy through media campaigns, congressional hearings, and lobbying when necessary. Co-operation with intelligence agencies also seems not to be unknown to Hill & Knowlton.

Accused of pursuing potentially illegal proxy spying operations for intelligence agencies, Richard Cheney, head of Hill & Knowlton's New York office, denied these allegations, but said that "... in such a large organization you never know if there's not some sneak operation going on." On the other hand, former CIA official Robert T. Crowley acknowledged that "Hill & Knowlton's overseas offices were perfect 'cover' for the ever-expanding CIA. Unlike other cover jobs, being a public relations specialist did not require technical training for CIA officers." Furthermore, the CIA, Crowley admitted, used its Hill & Knowlton connections to "... put out press releases and make media contacts to further its positions. ... Hill & Knowlton employees at the small Washington office and elsewhere distributed this material through CIA assets working in the United States news media."

(Source: Carlisle, Johan: Public Relationships: Hill & Knowlton, Robert Gray, and the CIA. http://mediafilter.org/caq/)

TEXTBLOCK 33/52 // URL: http://world-information.org/wio/infostructure/100437611652/100438658088
 
Advertising

Advertising as referred to in most economic books is part of the marketing mix. Therefore advertising usually is closely associated with the aim of selling products and services. Still, developments like "branding" show a tendency towards the marketing of not only products and services, but of ideas and values. While advertising activities are also pursued by political parties, politicians and governmental as well as non-governmental organizations, most of the money flowing into the advertising industry comes from corporations. Although these clients come from such diverse fields, their intentions hardly differ. Attempting to influence the public, their main goal is to sell: Products, services, ideas, values and (political) ideology.

TEXTBLOCK 34/52 // URL: http://world-information.org/wio/infostructure/100437611652/100438658361
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, often through matrilineal descent.


TEXTBLOCK 35/52 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
The Post-World-War II-period

After World War II the importance of propaganda increased still further, on the commercial as well as on the political level, in the era of the Cold War. The propaganda institutions of the different countries wanted their people to think the right way, which meant the national way. In the USA the McCarthy era started, a totalitarian system in its struggle against communism. McCarthy even managed to have critical books that were written about him publicly burned; and every unloved artist was said to be a communist, an outlaw.
The Cold War brought with it the era of spies, the perfect tool of disinformation. The topic still seems popular as a movie genre today, as the continued success of the James Bond movies shows.
A huge net of propaganda was built up around the threat of the nuclear bomb, pretending that the enemy was even more dangerous than the effects of such a bomb.
And later, after the fall of the Iron Curtain, disinformation found other fields of work, as the wars of the 1990s showed us.

TEXTBLOCK 36/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658581
 
Timeline 1970-2000 AD

1971 IBM's work on the Lucifer cipher and the work of the NSA lead to the U.S. Data Encryption Standard (= DES)

1976 Whitfield Diffie and Martin Hellman publish their paper New Directions in Cryptography, introducing the idea of public key cryptography

1977/78 the RSA algorithm is developed by Ron Rivest, Adi Shamir and Leonard M. Adleman and is published

1984 Congress passes Comprehensive Crime Control Act

- The Hacker Quarterly is founded

1986 Computer Fraud and Abuse Act is passed in the USA

- Electronic Communications Privacy Act

1987 Chicago prosecutors found Computer Fraud and Abuse Task Force

1988 U.S. Secret Service covertly videotapes a hacker convention

1989 NuPrometheus League distributes Apple Computer software

1990 - IDEA, using a 128-bit key, is supposed to replace DES

- Charles H. Bennett and Gilles Brassard publish their work on Quantum Cryptography

- Martin Luther King Day Crash strikes AT&T long-distance network nationwide


1991 PGP (= Pretty Good Privacy) is released as freeware on the Internet, soon becoming worldwide state of the art; its creator is Phil Zimmermann

- one of the first conferences for Computers, Freedom and Privacy takes place in San Francisco

- AT&T phone crash; New York City and various airports are affected

1993 the U.S. government announces the introduction of the Clipper Chip, an idea that provokes many political discussions during the following years

1994 Ron Rivest releases another algorithm, the RC5, on the Internet

- the blowfish encryption algorithm, a 64-bit block cipher with a key-length up to 448 bits, is designed by Bruce Schneier

1990s work on quantum computers and quantum cryptography

- work on biometrics for authentication (fingerprints, the iris, smells, etc.)

1996 France liberalizes its cryptography law: one can now use cryptography if registered

- OECD issues Cryptography Policy Guidelines, a paper calling for encryption export standards and unrestricted access to encryption products

1997 April European Commission issues Electronic Commerce Initiative, in favor of strong encryption

1997 June PGP 5.0 Freeware widely available for non-commercial use

1997 June 56-bit DES code cracked by a network of 14,000 computers

1997 August U.S. judge rules that the encryption export regulations are a violation of the First Amendment

1998 February foundation of Americans for Computer Privacy, a broad coalition in opposition to the U.S. cryptography policy

1998 March PGP announces plans to sell encryption products outside the USA

1998 April NSA issues a report about the risks of key recovery systems

1998 July DES code cracked in 56 hours by researchers in Silicon Valley

1998 October Finnish government agrees to unrestricted export of strong encryption

1999 January RSA Data Security establishes worldwide distribution of its encryption products outside the USA

- National Institute of Standards and Technology announces that 56-bit DES is not safe compared to Triple DES

- 56-bit DES code is cracked in 22 hours and 15 minutes

1999 May 27 United Kingdom speaks out against key recovery

1999 September the USA announces that it will stop restricting cryptography exports

2000 as the German government works on a cryptography law, various organizations start a campaign against that law

- computer hackers no longer only visit websites and change small details there, but cause breakdowns of entire systems, producing big economic losses

for further information about the history of cryptography see:
http://www.clark.net/pub/cme/html/timeline.html
http://www.math.nmsu.edu/~crypto/Timeline.html
http://fly.hiwaay.net/~paul/cryptology/history.html
http://www.achiever.com/freehmpg/cryptology/hocryp.html
http://all.net/books/ip/Chap2-1.html
http://cryptome.org/ukpk-alt.htm
http://www.iwm.org.uk/online/enigma/eni-intro.htm
http://www.achiever.com/freehmpg/cryptology/cryptofr.html
http://www.cdt.org/crypto/milestones.shtml

for information about hacker's history see:
http://www.farcaster.com/sterling/chronology.htm

TEXTBLOCK 37/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438658960
 
Individualized Audience Targeting

New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements". Probably the ultimate tool for advertisers. Although it is not yet widely used, companies like Amazon.com have already started to exploit individualized audience targeting for their purposes.

TEXTBLOCK 38/52 // URL: http://world-information.org/wio/infostructure/100437611652/100438658450
 
Feeding the data body

TEXTBLOCK 39/52 // URL: http://world-information.org/wio/infostructure/100437611761/100438659644
 
Key Recovery Systems

As stated before, the point of cryptography is a properly designed cryptosystem that makes it essentially impossible to recover encrypted data without any knowledge of the key used. The issue of lost keys, and of consequently being locked out of one's own data, favors key recovery systems. The counter-argument, on the other hand, is confidentiality: as soon as a possibility to recover a key is provided, the chances for abuse grow.
Finally, it is the state that does not want to allow too much secrecy. On the contrary: during the last 20 years endless discussions about the state's need and right to restrict private cryptography have taken place, as governments rarely care about the benefit of private users if they believe they can catch essential information about any kind of enemy, and hence look for unrestricted access to all keys.

The list of "key recovery", "key escrow", and "trusted third-party" encryption requirements suggested by governmental agencies covers all the latest developments and inventions in digital technology.
At the same time the NSA, one of the world's most advanced and most secretive enterprises for cryptography, worked hard at getting laws passed to forbid the private use of strong encryption in one way or the other. Still, it is also organizations like this one that have to admit that key recovery systems are not without weaknesses, as the U.S. Escrowed Encryption Standard, the basis for the famous and controversially discussed Clipper Chip, showed. The reason for those weaknesses is the high complexity of such systems.

Another aspect is that key recovery systems are more expensive and certainly much less secure than other systems. So, why should anyone use them?

In that context, one has to understand the legal framework for the use of cryptography, a strict framework in fact, which stands in stark contradiction to the globalized flow of communication.

TEXTBLOCK 40/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438659037
 
Introduction

Political and economic agendas change. People leave and are replaced. One of the things that never seems to change, however, is disinformation. Looking at different kinds of cultures and regimes throughout history, you will always find disinformation.
Its use is as variable as its tools. First of all, it does not necessarily need words: it is possible to disinform in any kind of language (sounds, symbols, letters, or with the help of the body). As it seems to have come into existence together with human communication, we should not even hope that it will disappear any time soon.
One could rather say: disinformation has always been there.
Instead of hoping to stop it, we need to learn to live with it, to detect it and to bring it to consciousness. Even this is no insurance against walking into the trap; it is an attempt, nothing else.
For detecting disinformation one needs to know what types of disinformation are possible and how they work. This site gives you some ideas about the history, tendencies and different types of disinformation, with the restriction that it will mostly be about Western types of disinformation, as it is still harder to understand the media of disinformation in other cultures; and in any case, many methods and tools run parallel in different cultures.

TEXTBLOCK 41/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658071
 
The Catholic Church

In the beginnings of Christianity most people were illiterate. Therefore the Bible had to be transformed into pictures and symbols - and not only the stories, but also everybody's moral duties. Images and legends of the Saints turned out to be useful models for human behavior - easy to tell and easy to understand.
Later, when the crusades began, the Christian Church used propaganda against Muslims, creating pictures of evil, pagan and bloodcurdling people. While the knights and others were fighting abroad, people in Europe were told to pray for them. Daily life was connected to the crusades, also through money collections - more for the cause of propaganda than out of a need for money.
During the period of the Counter-Reformation, Catholic propaganda was no longer directed against foreigners but turned against people at home - the Protestants - and against their publications and books, which were prohibited through the creation of the so-called Index. By then both sides were using disinformation for black propaganda about the other side.

TEXTBLOCK 42/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658307
 
The ancient Greeks

Disinformation was seen as an appropriate tool of politics and rhetoric in ancient Greece. Above all, persuasion was used, which was then considered a form of art.
Religion was (and in many ways still is) the best disinformer or manipulator; prophecies were constructed to manipulate the population. The important thing was to use emotions, and more than anything else fear, as a tool for manipulation. If the oracle of Delphi said a war was to be fought and would be won, then the Greek population - for religious motives - was prepared to fight that war.
Propaganda was not only used in wars but also in daily life, to bring people together and create a nation.
But poets, playwrights and other artists manipulated as well. Their works of literature and their plays were full of political messages with different ideologies behind them. Given how much theatre was part of life at that time, it is easy to understand that those messages did not only have entertainment value but also a great deal of political and social influence.
A different and very famous piece of disinformation in ancient Greek history is the story of Themistocles, who won the battle of Salamis against the Persians.

TEXTBLOCK 43/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658386
 
It is always the others

Disinformation is supposed to be something evil, something ethically not correct. Therefore we prefer to connect it to the past or to political systems other than those of the Western hemisphere. It is always the others who work with disinformation. The same is true for propaganda.
Even better if we can refer it to the past: Adolf Hitler, supposedly one of the world's greatest and most horrible propagandists (together with his Reichsminister für Propaganda Joseph Goebbels), did not invent modern propaganda either. He took his knowledge from the British example during World War I, the actual invention of modern propaganda. And it was Hitler's Reich where (racist) propaganda and disinformation were developed into a perfect tool of manipulation, in a way whose consequences are still felt today.
A war loses the support of the people if it is being lost. Therefore it is extremely important to create a feeling of winning the war; never give up the emotion of victory. Governments know this and work hard at keeping the mood up. The Germans worked very hard at this in the last months of World War II.
But in the 1990s the disinformation and propaganda business came back to life (if it had ever gone out of sight) with Iraq's invasion of Kuwait and the reactions of democratic states. After the war, reports made visible that not much had happened the way we had been told it happened. Seen in this light, the Gulf War was the end of the New World Order, a better and geographically broader democratic order that had only pretended to begin.

TEXTBLOCK 44/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658640
 
Further Tools: Photography

Art has always contributed a lot to disinformation.
Many modern tools for disinformation are used in art/photography.
Harold D. Lasswell once stated that propaganda was cheaper than violence. Today this is no longer true. Technology has created new tools for propaganda and disinformation - and they are expensive. But by now our possibilities of manipulating pictures and stories have gone so far that it can become difficult to tell the difference between the original and a manipulation.

Trillions of photographs have been taken in the 20th century. Too many to look at, too many to control them and their use. A paradise for manipulation.
We have to keep in mind: there is the world, and there exist pictures of the world, which does not mean that both are the same thing. Photographs are not objective, because the photographer selects the part of the world which becomes the picture. The rest is left out.

Some tools for manipulation of photography are:



Some of these are digital means of manipulation, which make it possible to change pictures in many ways without the manipulation becoming visible.

Pictures taken from the Internet could be anything and come from anywhere; to prove their source is nearly impossible. Therefore scientists have developed digital watermarks for pictures, which are intended to make it impossible to "steal" or manipulate a picture taken from the Net.
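The idea behind such watermarks can be illustrated with a toy least-significant-bit example in Python. Real watermarking schemes are far more robust and sophisticated; this hypothetical sketch only shows the principle of hiding a mark invisibly in the pixel data.

import numpy as np

def embed_watermark(pixels: np.ndarray, mark_bits: np.ndarray) -> np.ndarray:
    """Hide a bit pattern in the least significant bits of an 8-bit grayscale image."""
    flat = pixels.flatten()  # flatten() returns a copy, the original stays untouched
    flat[: mark_bits.size] = (flat[: mark_bits.size] & 0xFE) | mark_bits
    return flat.reshape(pixels.shape)

def extract_watermark(pixels: np.ndarray, n_bits: int) -> np.ndarray:
    """Read the hidden bit pattern back out of the marked image."""
    return pixels.flatten()[:n_bits] & 1

image = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in picture
mark = np.random.randint(0, 2, size=128, dtype=np.uint8)          # 128-bit owner mark
marked = embed_watermark(image, mark)
assert np.array_equal(extract_watermark(marked, mark.size), mark)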

TEXTBLOCK 45/52 // URL: http://world-information.org/wio/infostructure/100437611661/100438658730
 
Enforcement: Copyright Management and Control Technologies

With the increased ease of reproducing and transmitting unauthorized copies of digital works over electronic networks, concerns have arisen among the copyright holder community. They fear a further growth of copyright piracy and demand adequate protection of their works. A development which started in the mid-1990s and addresses the copyright owners' apprehensions is the creation of copyright management systems. Technological protection of their works, the copyright industry argues, is necessary to prevent widespread infringement, thus giving them the incentive to make their works available online. In their view the ideal technology should be "capable of detecting, preventing, and counting a wide range of operations, including open, print, export, copying, modifying, excerpting, and so on." Additionally such systems could be used to maintain "records indicating which permissions have actually been granted and to whom".

TEXTBLOCK 46/52 // URL: http://world-information.org/wio/infostructure/100437611725/100438659674
 
Anonymity

"Freedom of anonymous speech is an essential component of free speech."

Ian Goldberg/David Wagner, TAZ Servers and the Rewebber Network: Enabling Anonymous Publishing on the World Wide Web, in: First Monday 3,4, 1999

Someone wants to hide his or her identity, to remain anonymous, if s/he fears being held accountable for something, say a publication, that is considered to be prohibited. Anonymous publishing has a long tradition in European history. Writers of erotic literature or pamphlets, for example, preferred to use pseudonyms or to publish anonymously. During the Enlightenment, books such as d'Alembert's and Diderot's famous Encyclopaedia were printed and distributed secretly. Today Book Locker, a company selling electronic books, renews this tradition by allowing authors to post writings anonymously, to publish without the threat of perishing for it. Sometimes anonymity is a precondition for reporting human rights abuses. For example, investigative journalists and regime critics may rely on anonymity. But we do not have to look that far; even you might need or use anonymity sometimes, say, when you are a woman wanting to avoid sexual harassment in chat rooms.

The original design of the Net, as far as it is preserved, offers a relatively high degree of privacy, because due to the client-server model all that is known about you is a report from the machine from which information was, or is, requested. But this design of the Net interferes with the wish of corporations to know you, even to know more about you than you want them to know. What is euphemistically called customer relationship management means the collection, compilation and analysis of personal information about you by others.

In 1997 America Online member Timothy McVeigh, a Navy employee, made his homosexuality publicly known in a short autobiographical sketch. Another Navy employee reading this sketch informed the Navy. America Online revealed McVeigh's identity to the Navy, which discharged McVeigh. As a consequence of a court ruling on that case, Timothy McVeigh was allowed to return to the Navy. Sometimes anonymity really matters.

On the Net you still have several possibilities to remain anonymous. You may visit web sites via an anonymizing service. You might use a Web mail account (provided the personal information given to the Web mail service provider is not true), or you might use an anonymous remailing service, which strips off the headers of your mail to make it impossible to identify the sender and then forwards your message. Used in combination with encryption tools and technologies like FreeHaven or Publius, anonymous messaging services provide a powerful tool for countering censorship.
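The principle of such a remailer can be sketched in Python. This is a hypothetical illustration, not the code of any actual remailing service; real remailers add encryption, batching and random delays, and handle far more message formats.

from email import message_from_string
from email.message import Message

# Headers the remailer keeps; everything else (From, Received, Message-ID, ...)
# is dropped because it could identify the sender or the route the mail took.
KEEP_HEADERS = {"Subject", "Content-Type", "MIME-Version"}

def strip_identifying_headers(raw_message: str, remailer_address: str) -> Message:
    """Rebuild a simple (non-multipart) message with only non-identifying headers."""
    original = message_from_string(raw_message)
    cleaned = Message()
    for name, value in original.items():
        if name in KEEP_HEADERS:
            cleaned[name] = value
    cleaned["From"] = remailer_address   # e.g. "anonymous@remailer.example"
    cleaned.set_payload(original.get_payload())
    return cleaned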

In Germany, in 1515, printers had to swear not to print or distribute any publication bypassing the councilmen. Today repressive regimes, such as China and Burma, and democratic governments, such as France and Great Britain, alike impose or already have imposed laws against anonymous publishing on the Net.

Anonymity might be used for abuses, that is true, but "the burden of proof rests with those who would seek to limit it" (Rob Kling, Ya-ching Lee, Al Teich, Mark S. Frankel, Assessing Anonymous Communication on the Internet: Policy Deliberations, in: The Information Society, 1999).

TEXTBLOCK 47/52 // URL: http://world-information.org/wio/infostructure/100437611742/100438659040
 
Basics: Introduction

Copyright law is a branch of intellectual property law and deals with the rights of intellectual creators in their works. The scope of copyright protection as laid down in Article 2 of the 1996 WIPO Copyright Treaty "... extends to expressions and not to ideas, procedures, methods of operation or mathematical concepts as such." Copyright law protects the creativity concerning the choice and arrangement of words, colors, musical notes etc. It grants the creators of certain specified works exclusive rights relating to the "copying" and use of their original creation.


TEXTBLOCK 48/52 // URL: http://world-information.org/wio/infostructure/100437611725/100438659520
 
0 - 1400 A.D.

150
A smoke signal network covers the Roman Empire

The Roman smoke signal network consisted of towers within visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.
For a similar telegraph network in ancient Greece see Aeneas Tacitus' optical communication system.

About 750
In Japan block printing is used for the first time.

868
In China the world's first dated book, the Diamond Sutra, is printed.

1041-1048
In China moveable types made from clay are invented.

1088
First European medieval university is established in Bologna.

The first of the great medieval universities was established in Bologna. At the beginning universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that possessing a copy of a widely desired book could earn you an invitation to a university. Holding a lecture amounted to reading a book aloud, much as a priest read from the Bible during services. Attending a lecture amounted to copying the lecture word by word, so that you had your own copy of the book, which in turn enabled you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

TEXTBLOCK 49/52 // URL: http://world-information.org/wio/infostructure/100437611796/100438659702
 
Timeline Cryptography - Introduction

Besides oral conversation and written language, many other ways of transporting information are known: the bush telegraph, drums, smoke signals etc. These methods are not cryptography, but they still need encoding and decoding, which means that the history of language, the history of communication and the history of cryptography are closely connected to each other.
The timeline gives an insight into the endless fight between enciphering and deciphering. The reasons for it can be found in public and private matters alike, though they are mostly connected to military maneuvers and/or political tasks.

One of the most important researchers on cryptography through the centuries is David Kahn; many parts of the following timeline originate from his work.

TEXTBLOCK 50/52 // URL: http://world-information.org/wio/infostructure/100437611776/100438658824
 
Identity vs. Identification

It has become a commonplace observation that the history of modernity has been accompanied by what one might call a general weakening of identity, both as a theoretical concept and as a social and cultural reality. This blurring of identity has come to full fruition in the 20th century. As a theoretical concept, identity has lost its metaphysical foundation of "full correspondence" following the destruction of metaphysics by thinkers such as Nietzsche, Heidegger, Wittgenstein or Davidson. Nietzsche's "dead god", his often-quoted metaphor for the demise of metaphysics, has left western cultures not only with the problem of having to learn how to think without permanent foundations; it has left them with both the liberty of constructing identities, and the structural obligation to do so. The dilemmas arising out of this ambivalent situation have given rise to the comment that "god is dead, and man is not doing so well himself". The new promise of freedom is accompanied by the threat of enslavement. Modern, technologically saturated cultures survive and propagate and emancipate themselves by acting as the gatekeepers of their own technological prisons.

On the social and cultural levels, traditional clear-cut identities have become weakened as traditional cultural belonging has been undermined or supplanted by modern socio-technological structures. The question as to "who one is" has become increasingly difficult to answer: hybrid identities are spreading, identities are multiple, temporary, fleeting rather than reflecting an inherited sense of belonging. The war cry of modern culture industry "be yourself" demands the impossible and offers a myriad of tools all outcompeting each other in their promise to fulfil the impossible.

For many, identity has become a matter of choice rather than of cultural or biological heritage, although being able to chose may not have been the result of a choice. A large superstructure of purchasable identification objects caters for an audience finding itself propelled into an ever accelerating and vertiginous spiral of identification and estrangement. In the supermarket of identities, what is useful and cool today is the waste of tomorrow. What is offered as the latest advance in helping you to "be yourself" is as ephemeral as your identification with it; it is trash in embryonic form.

Identity has become both problematic and trivial, causing modern subjects a sense of thrownness and uprootedness as well as granting them the opportunity of overcoming established authoritarian structures. In modern, technologically saturated societies, the general weakening of identities is a prerequisite for emancipation. The return to "strong" clear-cut "real" identities is the way of new fundamentalism demanding a rehabilitation of "traditional values" and protected zones for metaphysical thought, both of which are to be had only at the price of suppression and violence.

It has become difficult to know "who one is", but this difficulty is not merely a private problem. It is also a problem for the exercise of power, for the state and other power institutions also need to know "who you are". With the spread of weak identities, power is exercised in a different manner. Power cannot be exercised without it being clear whom it addresses; note the dual significance of "subject". A weakened, hybrid, undefined subject (in the philosophical sense) cannot be a "good" subject (in the political sense); it is not easy to sub-ject. Without identification, power cannot be exercised. And while identification is itself not a sufficient precondition for authoritarianism, it is certainly a necessary one.

Identities are therefore reconstructed using technologies of identification in order to keep the weakened and hence evasive subjects "sub-jected". States have traditionally employed bureaucratic identification techniques and sanctioned those trying to evade the grip of administration. Carrying several passports has been the privilege of spies and of dubious outlaws, and not possessing an "ID" at all is the fate of millions of refugees fleeing violence or economic destitution. Lack of identification is structurally sanctioned by placelessness.

The technicized acceleration of societies and the weakening of identities make identification a complicated matter. On the one hand, bureaucratic identification techniques can be technologically bypassed: passports and signatures can be forged; data can be manipulated and played with. On the other hand, traditional bureaucratic methods are slow. The requirements resulting from these constraints are met by biometric technology.

TEXTBLOCK 51/52 // URL: http://world-information.org/wio/infostructure/100437611729/100438658075
 
What is the Internet?

Each definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, universities, etc. Generally speaking, every time a user connects to a computer network, a new Internet is created. Technically speaking, the Internet is a wide area network (WAN) that may be connected to local area networks (LANs).

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different to the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not solely constituted by computers connected to other computers, because there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components that are connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than to say what it is. On 24 October 1995 the U.S. Federal Networking Council made the following resolution concerning the definition of the term "Internet": "'Internet' refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally and in a simplifying manner called the Internet may be better referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: the USENET, corporate networks and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means of making information easily accessible worldwide, of sending and receiving messages, videos, texts and audio files, of transferring funds and trading securities, of sharing resources, of collecting weather condition data, of trailing the movements of elephants, of playing games online, of video conferencing, of distance learning, of virtual exhibitions, of jamming with other musicians, of long-distance ordering, of auctions, of tracking packaged goods, of doing business, of chatting, and of remotely accessing computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving, and storing data and for sharing equipment over long distances, eventually in real time, if the telecommunication infrastructure is reliable and of high quality.

For a comprehensive view of uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace"

TEXTBLOCK 52/52 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
Agostino Ramelli's reading wheel, 1588

Agostino Ramelli designed a "reading wheel" which allowed browsing through a large number of documents without moving from one spot.

Presenting a large number of books - a small library - laid open on lecterns on a kind of Ferris wheel, allowing us to skip chapters and to browse through pages by turning the wheel to bring lectern after lectern before our eyes, thus linking ideas and texts together, Ramelli's reading wheel is reminiscent of today's browsing software used to navigate the World Wide Web.

INDEXCARD, 1/57
 
Europe Online

Established in 1998 and privately held, Europe Online created and operates the world's largest broadband "Internet via the Sky" network. The Europe Online "Internet via the Sky" service is available to subscribers in English, French, German, Dutch and Danish with more languages to come.

http://www.europeonline.com

INDEXCARD, 2/57
 
Whitfield Diffie

Whitfield Diffie is an Engineer at Sun Microsystems and co-author of Privacy on the Line (MIT Press) in 1998 with Susan Landau. In 1976 Diffie and Martin Hellman developed public key cryptography, a system to send information without leaving it open to be read by everyone.

INDEXCARD, 3/57
 
Netiquette

Although referred to as a single body of rules, there is not just one Netiquette; there are several, though they overlap largely. Proposing general guidelines for posting messages to newsgroups and mailing lists and for using the World Wide Web and FTP, Netiquettes address civility topics (e.g., avoiding hate speech) and comprise technical advice (e.g., using simple and platform-independent file formats).
Well-known Netiquettes are the Request for Comment #1855 and The Net: User Guidelines and Netiquette by Arlene H. Rinaldi.

ftp://ftp.isi.edu/in-notes/rfc1855.txt
http://www.fau.edu/netiquette/net/index.html
INDEXCARD, 4/57
 
UNIVAC

Built by Remington Rand in 1951 the UNIVAC I (Universal Automatic Computer) was one of the first commercially available computers to take advantage of the development of the central processing unit (CPU). Both the U.S. Census bureau and General Electric owned UNIVACs. Speed: 1,905 operations per second; input/output: magnetic tape, unityper, printer; memory size: 1,000 12-digit words in delay line; technology: serial vacuum tubes, delay lines, magnetic tape; floor space: 943 cubic feet; cost: F.O.B. factory U.S.$ 750,000 plus U.S.$ 185,000 for a high speed printer.

INDEXCARD, 5/57
 
Above.net

Headquartered in San Jose, USA, AboveNet Communications is a backbone service provider. Through its extensive peering relationships, the company has built a network with the largest aggregated bandwidth in the world.

http://www.above.net

INDEXCARD, 6/57
 
Cyrus Reed Teed

C.R. Teed (New York State) was a doctor of alternative medicine in the last century. He worked on alchemy, too. In 1870 he had the idea that the universe was made out of cells, the earth being the biggest one. Thus he imagined the world as a concave system. Out of this thought he founded a religion, calling it Koreshanity.

INDEXCARD, 7/57
 
Java Applets

Java applets are small programs that can be sent along with a Web page to a user. Java applets can perform interactive animations, immediate calculations, or other simple tasks without having to send a user request back to the server. They are written in Java, a platform-independent computer language, which was invented by Sun Microsystems, Inc.

Source: Whatis.com

INDEXCARD, 8/57
 
Black Propaganda

Black propaganda conceals its source: the recipient cannot find out where it really comes from and may even be led to attribute it to someone else. It is very helpful for driving a wedge between two allies.

INDEXCARD, 9/57
 
Proxy Servers

A proxy server is a server that acts as an intermediary between a workstation user and the Internet so that security, administrative control, and caching service can be ensured.

A proxy server receives a request for an Internet service (such as a Web page request) from a user. If it passes filtering requirements, the proxy server, assuming it is also a cache server, looks in its local cache of previously downloaded Web pages. If it finds the page, it returns it to the user without needing to forward the request to the Internet. If the page is not in the cache, the proxy server, acting as a client on behalf of the user, uses one of its own IP addresses to request the page from the server out on the Internet. When the page is returned, the proxy server relates it to the original request and forwards it on to the user.

Source: Whatis.com
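The cache-then-forward logic described above can be summarized in a few lines. The following is a rough Python sketch, not a complete proxy server; the blocked URL is a made-up example of a filtering rule:

```python
from urllib.request import urlopen

cache: dict[str, bytes] = {}                # previously downloaded pages, keyed by URL
BLOCKED = {"http://example.org/banned"}     # filtering rules (hypothetical entry)

def handle_request(url: str) -> bytes:
    """Return a page to the user, serving it from the local cache when possible."""
    if url in BLOCKED:                      # filtering requirements
        raise PermissionError(f"{url} is blocked by proxy policy")
    if url in cache:                        # cache hit: no need to forward the request
        return cache[url]
    page = urlopen(url).read()              # act as a client on the user's behalf
    cache[url] = page                       # keep a copy for later requests
    return page
```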

INDEXCARD, 10/57
 
Amazon.com

Amazon.com is an online shop that serves approx. 17 mn customers in 150 countries. Starting out as a bookshop, Amazon today offers a wide range of other products as well.

Among privacy campaigners, the company's name has become almost synonymous with aggressive online direct marketing practices as well as user profiling and tracking. Amazon has been involved in privacy disputes on numerous occasions.

http://www.amazon.com/
http://www.computeruser.com/newstoday/00/01/0...
INDEXCARD, 11/57
 
Cooperative Association of Internet Data Analysis (CAIDA)

Based at the University of California's San Diego Supercomputer Center, CAIDA supports cooperative efforts among the commercial, government and research communities aimed at promoting a scalable, robust Internet infrastructure. It is sponsored by the Defense Advanced Research Projects Agency (DARPA) through its Next Generation Internet program, by the National Science Foundation, Cisco, Inc., and Above.net.

INDEXCARD, 12/57
 
New World Order

http://www.douzzer.ai.mit.edu:8080/conspiracy...
http://www.geocities.com/CapitolHill/Lobby/18...
INDEXCARD, 13/57
 
Enigma

Device used by the German military command to encode strategic messages before and during World War II. The Enigma code was broken by a British intelligence system known as Ultra.

INDEXCARD, 14/57
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
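As a small illustration, a function-level cache keeps local copies of results so that repeated requests do not have to travel to the slower, more distant source again (a sketch using Python's standard library; the cache size is an arbitrary example):

```python
from functools import lru_cache
from urllib.request import urlopen

@lru_cache(maxsize=128)            # keep up to 128 recently retrieved pages in memory
def get_page(url: str) -> bytes:
    return urlopen(url).read()     # only executed when the page is not already cached
```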

INDEXCARD, 15/57
 
Terrestrial antennas

Microwave transmission systems based on terrestrial antennas are similar to satellite transmission systems. Providing reliable high-speed access, they are used for cellular phone networks.

The implementation of the Wireless Application Protocol (WAP) makes wireless access to Internet services such as e-mail and even the World Wide Web via cellular phones convenient. Microwave transmission systems are therefore becoming increasingly important.

INDEXCARD, 16/57
 
Harold. D. Lasswell

Harold D. Lasswell (* 1902) studied at the London School of Economics and went on to become a professor of social sciences at several universities, among them the University of Chicago, Columbia University, and Yale University. He also served as a consultant for several governments. One of Lasswell's most famous works is Propaganda Technique in the World War, in which he defines propaganda and discusses its major objectives: to mobilize hatred against the enemy, to preserve the friendship of allies, to procure the co-operation of neutrals, and to demoralize the enemy.

INDEXCARD, 17/57
 
IIPA

The International Intellectual Property Alliance, formed in 1984, is a private sector coalition that represents the U.S. copyright-based industries. It comprises seven trade associations: the Association of American Publishers, AFMA, the Business Software Alliance, the Interactive Digital Software Association, the Motion Picture Association of America, the National Music Publishers' Association and the Recording Industry Association of America. IIPA and its members track copyright legislative and enforcement developments in over 80 countries and aim at a legal and enforcement regime for copyright that deters piracy. On the national level IIPA cooperates with the U.S. Trade Representative; on the multilateral level it has been involved in the development of the TRIPS (Trade-Related Aspects of Intellectual Property Rights) agreement of the WTO (World Trade Organization) and also participates in the copyright discussions of the WIPO (World Intellectual Property Organization).

INDEXCARD, 18/57
 
Internet Software Consortium

The Internet Software Consortium (ISC) is a nonprofit corporation dedicated to the production of high-quality reference implementations of Internet standards that meet production standards. Its goal is to ensure that those reference implementations are properly supported and made freely available to the Internet community.

http://www.isc.org

INDEXCARD, 19/57
 
Instinet

Instinet, a wholly owned subsidiary of Reuters Group plc since 1987, is the world's largest agency brokerage firm and the industry brokerage leader in after hours trading. It trades in over 40 global markets daily and is a member of seventeen exchanges in North America, Europe, and Asia. Its institutional clients represent more than 90 percent of the institutional equity funds under management in the United States. Instinet accounts for about 20 percent of the NASDAQ daily trading volume and trades approximately 170 million shares of all U.S. equities daily.

INDEXCARD, 20/57
 
atbash

Atbash is regarded as one of the simplest forms of encryption. It is nothing more than a reversed alphabet: a=z, b=y, c=x and so on. Many different peoples used it in the early times of writing.
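A minimal Python sketch of the reversed-alphabet substitution (letters only; other characters pass through unchanged):

```python
import string

# Map each letter to its counterpart from the other end of the alphabet (a<->z, b<->y, ...).
LOWER = string.ascii_lowercase
UPPER = string.ascii_uppercase
ATBASH = str.maketrans(LOWER + UPPER, LOWER[::-1] + UPPER[::-1])

def atbash(text: str) -> str:
    """Encipher (or, identically, decipher) a message with the Atbash substitution."""
    return text.translate(ATBASH)

assert atbash("attack at dawn") == "zggzxp zg wzdm"
assert atbash(atbash("attack at dawn")) == "attack at dawn"   # applying it twice restores the text
```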

for further explanations see:
http://www.ftech.net/~monark/crypto/crypt/atbash.htm

INDEXCARD, 21/57
 
cryptology

also called "the study of code". It includes both, cryptography and cryptoanalysis

INDEXCARD, 22/57
 
Artificial Intelligence

Artificial Intelligence is concerned with the simulation of human thinking and emotions in information technology. AI develops "intelligent systems" capable, for example, of learning and logical deduction. AI systems are used for creatively handling large amounts of data (as in data mining), as well as in natural speech processing and image recognition. AI is also used to support decision-making in highly complex environments.
Yahoo AI sites: http://dir.yahoo.com/Science/Computer_Science/Artificial_Intelligence/
MIT AI lab: http://www.ai.mit.edu/


INDEXCARD, 23/57
 
Integrated circuit

Also called microcircuit, the integrated circuit is an assembly of electronic components, fabricated as a single unit, in which active semiconductor devices (transistors and diodes) and passive devices (capacitors and resistors) and their interconnections are built up on a chip of material called a substrate (most commonly made of silicon). The circuit thus consists of a unitary structure with no connecting wires. The individual circuit elements are microscopic in size.

INDEXCARD, 24/57
 
Enigma Machine

The Enigma encryption machine was famous as much for its insecurities as for the security it gave to German ciphers. It was broken, first by the Poles in the 1930s, then by the British in World War II.

INDEXCARD, 25/57
 
Critical Art Ensemble

Critical Art Ensemble is a collective of five artists of various specializations dedicated to exploring the intersections between art, technology, radical politics, and critical theory. CAE have published a number of books and carried out innovative art projects containing insightful and ironic theoretical contributions to media art. Projects include Addictionmania, Useless Technology, The Therapeutic State, Diseases of Consciousness, Machineworld, As Above So Below, and Flesh Machine.

http://www.critical-art.net

INDEXCARD, 26/57
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. The expected result is a vast collection of historic video, audio, documents, and software. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 27/57
 
blowfish encryption algorithm

Blowfish is a symmetric-key block cipher with a variable key length. The idea behind it is a simple design that makes the cipher faster than comparable systems.
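A brief sketch of using Blowfish from Python, assuming the third-party PyCryptodome package is installed; the key and message are made-up example values, and ECB mode is chosen only to keep the sketch short:

```python
# Requires the PyCryptodome package: pip install pycryptodome
from Crypto.Cipher import Blowfish
from Crypto.Util.Padding import pad, unpad

key = b"an example variable-length key"        # Blowfish accepts keys from 4 to 56 bytes
cipher = Blowfish.new(key, Blowfish.MODE_ECB)  # ECB is insecure for real data; used here for brevity

plaintext = b"attack at dawn"
ciphertext = cipher.encrypt(pad(plaintext, Blowfish.block_size))   # the block size is 8 bytes

decrypted = unpad(Blowfish.new(key, Blowfish.MODE_ECB).decrypt(ciphertext), Blowfish.block_size)
assert decrypted == plaintext
```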

http://www.counterpane.com/blowfish.html
http://www.counterpane.com/bfsverlag.html

INDEXCARD, 28/57
 
COMECON

The Council for Mutual Economic Aid (COMECON) was set up in 1949, consisting of six East European countries: Bulgaria, Czechoslovakia, Hungary, Poland, Romania, and the USSR, followed later by the German Democratic Republic (1950), Mongolia (1962), Cuba (1972), and Vietnam (1978). Its aim was to develop the member countries' economies on a complementary basis for the purpose of achieving self-sufficiency. In 1991, COMECON was replaced by the Organization for International Economic Cooperation.

INDEXCARD, 29/57
 
IBM

IBM (International Business Machines Corporation) manufactures and develops computer hardware, application and system software, and related equipment.

IBM produced the first PC (Personal Computer), and its decision to make Microsoft DOS the standard operating system initiated Microsoft's rise to global dominance in PC software.

Business indicators:

1999 sales: $ 86,548 mn (+ 7.2 % from 1998)

Market capitalization: $ 181 bn

Employees: approx. 291,000

Corporate website: www.ibm.com

http://www.ibm.com/
INDEXCARD, 30/57
 
Amazon.Com

Amazon.Com was one of the first online bookstores. With thousands of books, CDs and videos ordered via the Internet every year, Amazon.Com is probably the most successful Internet bookstore.

INDEXCARD, 31/57
 
Enochian alphabet

Also "Angelic" language. Archaic language alphabet composed of 21 letters, discovered by John Dee and his partner Edward Kelley. It has its own grammar and syntax, but only a small sample of it has ever been translated to English.

INDEXCARD, 32/57
 
Scientology

Official name Church Of Scientology, religio-scientific movement developed in the United States in the 1950s by the author L. Ron Hubbard (1911-86). The Church of Scientology was formally established in the United States in 1954 and was later incorporated in Great Britain and other countries. The scientific basis claimed by the church for its diagnostic and therapeutic practice is disputed, and the church has been criticized for the financial demands that it makes on its followers. From the 1960s the church and various of its officials or former officials faced government prosecutions as well as private lawsuits on charges of fraud, tax evasion, financial mismanagement, and conspiring to steal government documents, while the church on the other hand claimed it was being persecuted by government agencies and by established medical organizations. Some former Scientology officials have charged that Hubbard used the tax-exempt status of the church to build a profitable business empire.

INDEXCARD, 33/57
 
The Spot

http://www.thespot.com/

INDEXCARD, 34/57
 
Richard Barbrook and Andy Cameron, The Californian Ideology

According to Barbrook and Cameron there is an emerging global orthodoxy concerning the relation between society, technology and politics. In this paper they call this orthodoxy the Californian Ideology, in honor of the state where it originated. By naturalizing and giving a technological proof to a political philosophy, and therefore foreclosing on alternative futures, the Californian ideologues are able to assert that social and political debates about the future have now become meaningless and - horror of horrors - unfashionable. The paper argues instead for an interactive future.

http://www.wmin.ac.uk/media/HRC/ci/calif.html

INDEXCARD, 35/57
 
Philip M. Taylor

Munitions of the Mind. A history of propaganda from the ancient world to the present era. Manchester 1995 (2nd ed.)
This book gives a quite detailed insight into the tools and tasks of propaganda in European and/or Western history. Starting with ancient times, the author goes up to the Gulf War and the meaning of propaganda today. In all those different eras propaganda carried similar messages, even when the technical means were not nearly as widespread as they are today. Taylor's book leads the reader through those different periods, trying to show the typical elements of each one.

INDEXCARD, 36/57
 
Seneca

Lucius Annaeus Seneca (~4 BC - 65 AD), originally from Spain, was a Roman philosopher, statesman, orator and playwright with great influence on the Roman cultural life of his day. He was involved in politics, and his pupil Nero eventually forced him to commit suicide. The French Renaissance brought his dramas back to the stage.

INDEXCARD, 37/57
 
Core copyright industries

Those encompass the industries that create copyrighted works as their primary product. These industries include the motion picture industry (television, theatrical, and home video), the recording industry (records, tapes and CDs), the music publishing industry, the book, journal and newspaper publishing industry, and the computer software industry (including data processing, business applications and interactive entertainment software on all platforms), legitimate theater, advertising, and the radio, television and cable broadcasting industries.

INDEXCARD, 38/57
 
Internet Exchanges

Internet exchanges are intersecting points between major networks.

List of the World's Public Internet exchanges (http://www.ep.net)

http://www.ep.net/
INDEXCARD, 39/57
 
Telephone

The telephone was not invented by Alexander Graham Bell, as is widely believed, but by Philipp Reis, a German teacher. When he demonstrated his invention to important German professors in 1861, it was not enthusiastically received, and because of this dismissal he was given no financial support for further development.

And here Bell comes in: in 1876 he successfully filed a patent for the telephone. Soon afterwards he established the first telephone company.

INDEXCARD, 40/57
 
Vladimir Putin

Vladimir Putin is the Russian President, Boris Yeltsin's successor. Until his appointment as Prime Minister in August 1999 he was nearly unknown. He had worked for the Soviet security service, the KGB; in July 1998 he took charge of the Federal Security Service, the FSB, and in March 1999 he became secretary of the Security Council. He has hardly any experience in politics. Where he has demonstrated power so far is the war in Chechnya: soon after the beginning of this second war in the region his popularity rose.

INDEXCARD, 41/57
 
William Frederick Friedman

Friedman is considered the father of U.S. cryptoanalysis - he was also the one who first used the term.

INDEXCARD, 42/57
 
Themistocles

Themistocles, a Greek politician and general, defeated the Persians in the battle of Salamis in 480 BC. The Persians under their king Xerxes, who were on the verge of winning the battle, were beaten with the help of a disinformation campaign that Themistocles launched: he told the Persians that he was on their side and willing to let them win, arguing that the Greeks were so busy with their quarrels that they were not prepared to fight an aggressive battle and that many of them would change sides once the power of the Persians was shown in a short and cruel fight. In the end Xerxes received the message that parts of the Greek army were fleeing the battlefield. This disinformation led Xerxes to a wrong assessment of the situation, which made it easy for the Greeks to win the war.

For further details see:
http://www.optonline.com/comptons/ceo/31900_Q.html

http://ds.dial.pipex.com/kitson/ESSAYS/Them.htm

INDEXCARD, 43/57
 
Bill Clinton

William J. Clinton (* 1946) studied law at Yale University and then taught at the University of Arkansas. He was elected Arkansas attorney general in 1976 and later served as governor until 1992. That year he became U.S. President, the first Democratic president after a succession of Republicans. His sexual affairs not only nearly cost him his career, they also gave him reason to distract attention from his private life: he considered fighting another war against Saddam Hussein in February 1999. Shortly afterwards he had a more interesting enemy, Slobodan Milosevic - and NATO was most willing to join that fight.

For more information see: http://www.whitehouse.gov/WH/glimpse/presidents/html/bc42.html

INDEXCARD, 44/57
 
Kosov@

The "word" Kosov@ is a compromise between the Serb name KosovO and the Albanian KosovA. It is mostly used by international people who want to demonstrate a certain consciousness about the conflict including some sort of neutrality, believing that neither the one side nor the other (and maybe not even NATO) is totally right. Using the word Kosov@ is seen as a symbol of peace.

For more explanations (in German) see: http://www.zivildienst.at/kosov@.htm

INDEXCARD, 45/57
 
National Laboratory for Applied Network Research

NLANR, initially a collaboration among supercomputer sites supported by the National Science Foundation, was created in 1995 to provide technical and engineering support and overall coordination of the high-speed connections at these five supercomputer centers.

Today NLANR offers support and services to institutions that are qualified to use high performance network service providers - such as Internet 2 and Next Generation Internet.

http://www.nlanr.net

INDEXCARD, 46/57
 
Alan Turing

b. June 23, 1912, London, England
d. June 7, 1954, Wilmslow, Cheshire

English mathematician and logician who pioneered in the field of computer theory and who contributed important logical analyses of computer processes. Many mathematicians in the first decades of the 20th century had attempted to eliminate all possible error from mathematics by establishing a formal, or purely algorithmic, procedure for establishing truth. The mathematician Kurt Gödel threw up an obstacle to this effort with his incompleteness theorem. Turing was motivated by Gödel's work to seek an algorithmic method of determining whether any given propositions were undecidable, with the ultimate goal of eliminating them from mathematics. Instead, he proved in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem [Decision Problem]" (1936) that there cannot exist any such universal method of determination and, hence, that mathematics will always contain undecidable propositions. During World War II he served with the Government Code and Cypher School at Bletchley, Buckinghamshire, where he played a significant role in breaking the codes of the German "Enigma Machine". He also championed the theory that computers eventually could be constructed that would be capable of human thought, and he proposed the Turing test to assess this capability. Turing's papers on the subject are widely acknowledged as the foundation of research in artificial intelligence. In 1954 Turing committed suicide, probably because of the depressing medical treatment that he had been forced to undergo (in lieu of prison) to "cure" him of homosexuality.

INDEXCARD, 47/57
 
International Standardization Organization

ISO (International Organization for Standardization), founded in 1946, is a worldwide federation of national standards bodies from some 100 countries, one from each country. Among the standards it fosters is Open Systems Interconnection (OSI), a universal reference model for communication protocols. Many countries have national standards organizations that participate in and contribute to ISO standards making.

http://www.iso.ch

Source: Whatis.com

INDEXCARD, 48/57
 
to decipher/decode

to turn ciphertext or encoded text back into plaintext

INDEXCARD, 49/57
 
John von Neumann

b. December 3, 1903, Budapest, Hungary
d. February 8, 1957, Washington, D.C., U.S.

Mathematician who made important contributions in quantum physics, logic, meteorology, and computer science. His theory of games had a significant influence upon economics. In computer theory, von Neumann did much of the pioneering work in logical design, in the problem of obtaining reliable answers from a machine with unreliable components, the function of "memory," machine imitation of "randomness," and the problem of constructing automata that can reproduce their own kind.

INDEXCARD, 50/57
 
Leni Riefenstahl

Leni Riefenstahl (* 1902) began her career as a dancer and actress. In parallel she learned how to work with a camera, turning out to be one of the most talented directors and editors of her time - and one of the very few women among them. Adolf Hitler appointed her the top film executive of the Nazi Party. Her two most famous works date from that period: Triumph of the Will (1935) and the two films about the Olympic Games in Berlin in 1936. Later, trying to shed her image as a Nazi filmmaker, she worked as a photographer in Africa, taking pictures of indigenous people and underwater landscapes.

INDEXCARD, 51/57
 
Blue Box

The blue-box system works with a special blue-colored background: the blue parts of the picture are replaced with other footage, so the person in front of the screen can appear to be filmed anywhere, even in the middle of a war.
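A rough sketch of the underlying chroma-key idea in Python, assuming the third-party NumPy package; the colour thresholds and image sizes below are made-up example values:

```python
import numpy as np

def blue_box(foreground: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Composite two RGB images (H x W x 3 arrays): pixels that look like the
    blue screen in the foreground are replaced by the background -- a toy chroma key."""
    r, g, b = foreground[..., 0], foreground[..., 1], foreground[..., 2]
    is_blue = (b > 150) & (r < 100) & (g < 100)       # crude threshold for the blue screen
    result = foreground.copy()
    result[is_blue] = background[is_blue]             # substitute the chosen background
    return result

# Example: a stand-in figure on a pure blue studio frame, keyed over arbitrary footage.
studio = np.zeros((480, 640, 3), dtype=np.uint8)
studio[...] = (0, 0, 255)                             # the blue background
studio[100:400, 250:390] = (180, 140, 120)            # the person in front of it
footage = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
composite = blue_box(studio, footage)
```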

INDEXCARD, 52/57
 
Oscar Wilde

Oscar Fingal O'Flahertie Wills Wilde (1854-1900) was one of the best and most famous poets and novelists of his time in England. His satirical and amusing texts publicly exposed the false morals of the bourgeoisie. Besides, his life as a dandy made him the leader of aestheticism in England, until he was sent to prison because of his homosexuality. Afterwards he lived in Paris, where he died lonely and nearly forgotten in a hotel in 1900. His poems, fairy tales, novels and dramas survived.

INDEXCARD, 53/57
 
Blaise Pascal

b. June 19, 1623, Clermont-Ferrand, France
d. August 19, 1662, Paris, France

French mathematician, physicist, religious philosopher, and master of prose. He laid the foundation for the modern theory of probabilities, formulated what came to be known as Pascal's law of pressure, and propagated a religious doctrine that taught the experience of God through the heart rather than through reason. The establishment of his principle of intuitionism had an impact on such later philosophers as Jean-Jacques Rousseau and Henri Bergson and also on the Existentialists.

INDEXCARD, 54/57
 
skytale

The skytale (pronounced ski-ta-le) was a Spartan tool for encryption. It consisted of a piece of wood and a leather strip; both communicating parties needed a wooden stick of exactly the same size. The secret message was written on the leather strip while it was wound around the wood, then unwound and sent to the recipient by a messenger. The recipient would rewind the leather around his own stick and by doing so decipher the message.
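In modern terms the skytale is a simple transposition cipher. A toy Python sketch, where the `turns` parameter loosely plays the role of the rod's circumference and the underscore padding is just for illustration:

```python
def skytale_encrypt(message: str, turns: int) -> str:
    """Write the message across the windings of the strip, then read the unwound strip."""
    padded = message + "_" * (-len(message) % turns)     # pad to a full rectangle of letters
    return "".join(padded[i::turns] for i in range(turns))

def skytale_decrypt(ciphertext: str, turns: int) -> str:
    """Rewind the strip around a rod of the same size to restore the original order."""
    width = len(ciphertext) // turns
    return "".join(ciphertext[i::width] for i in range(width)).rstrip("_")

secret = skytale_encrypt("attack at dawn", turns=4)
assert skytale_decrypt(secret, turns=4) == "attack at dawn"
```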

INDEXCARD, 55/57
 
Backbone Networks

Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmission capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.

INDEXCARD, 56/57
 
Gottfried Wilhelm von Leibniz

b. July 1, 1646, Leipzig
d. November 14, 1716, Hannover, Hanover

German philosopher, mathematician, and political adviser, important both as a metaphysician and as a logician and distinguished also for his independent invention of the differential and integral calculus. In 1661 he entered the University of Leipzig as a law student; there he came into contact with the thought of men who had revolutionized science and philosophy, such as Galileo, Francis Bacon, Thomas Hobbes, and René Descartes. In 1666 he wrote De Arte Combinatoria ("On the Art of Combination"), in which he formulated a model that is the theoretical ancestor of some modern computers.

INDEXCARD, 57/57