The Privatization of Censorship

According to a still widely held conviction, the global data networks constitute the long desired arena for uncensorable expression. This much is true: because of the Net it has become increasingly difficult to sustain cultural and legal standards. Geographical proximity and territorial boundaries prove less relevant when it makes no difference to a document's availability whether it is stored on your desktop or on a host thousands of kilometers away. There is no international agreement on what content is prohibited, so human rights organizations and Nazi groups alike can bypass restrictions. No single authority or organization can impose its rules and standards on all others. This is why the Net is public space, a political arena where free expression is possible.

This freedom is conditioned by the design of the Net. But the Net's design is not a given, as Lawrence Lessig reminds us. Originally the design of the Net allowed a relatively high degree of privacy, and communication was not controlled directly. Now this design is changing, and the invisible agora in electronic space is endangered. Governments - even elected ones - and corporations introduce new technologies that allow us to be identified, monitored and tracked, that identify and block content, and that allow our behaviour to be controlled efficiently.

Soon after the World Wide Web was introduced, small independent media and human rights organizations began to use this platform to draw worldwide attention to their publications and causes. It seemed to be the dawning of a new era, with authoritarian regimes and multinational media corporations on the losing side. But now the Net's design is changing according to their needs.

"In every context that it can, the entertaining industry is trying to force the Internet into its own business model: the perfect control of content. From music (fighting MP3) and film (fighting the portability of DVD) to television, the industry is resisting the Net's original design. It was about the free flow of content; Hollywood wants perfect control instead" (Lawrence Lessig, Cyberspace Prosecutor, in: The Industry Standard, February 2000).

In the United States, Hollywood and AT&T - which became the biggest US cable service provider after its merger with MediaOne - are returning to the positions they held in the Seventies: control of content and infrastructure. If most people access the Net via set-top boxes connected to a TV set, it will become a kind of television, at least in the USA.

For small independent media it will become very hard to be heard, especially for those offering streaming video and music. Ever faster data transmission applies mainly to downloads; upload capacities are much lower - on average about eight times lower - than download capacities. As an AT&T executive said in response to criticism: "We haven't built a 56 billion dollar cable network to have the blood sucked from our veins" (Lawrence Lessig, The Law in the Code: How the Net is Regulated, Lecture at the Institute for Human Sciences, Vienna, May 29th, 2000).

Consumers, not producers, are preferred.

For corporations, what remains to be done to control the Net is mainly to cope with the fact that because of the Net it has become increasingly difficult to sustain cultural and legal standards. On November 11, 1995, the German prosecuting attorney's office searched Compuserve Germany, the branch of an international Internet service provider, because the company was suspected of having offered access to child pornography. Consequently Compuserve blocked access to more than 200 newsgroups, all containing "sex" or "gay" in their names, for all its customers. But a few days later, instructions for accessing the blocked newsgroups via Compuserve came into circulation. On February 26, 1997, Felix Somm, the Chief Executive Officer of Compuserve Germany, was accused of complicity in the distribution of child and animal pornography in newsgroups. In May 1998 he received a two-year suspended prison sentence and a fine of about 51,000 Euros. The sentence was justified by pointing to the fact that Compuserve Germany offered access to its US parent company's servers, which hosted child pornography. Felix Somm was held responsible for access to forbidden content he could not have known about. (For further information (in German) click here.)

Also in 1995, as an attack on US Vice-President Al Gore's plan to supply all public schools with Internet access, Republican Senator Charles Grassley warned of the dangers lurking for children on the Net. Referring to Philip Elmer-Dewitt's Time magazine cover story on pornography on the Net from July 3, he claimed that 83.5% of all images online are pornographic. But Elmer-Dewitt was wrong. Obviously unaware of the difference between Bulletin Board Systems and the Net, he had referred misleadingly to Marty Rimm's article Marketing Pornography on the Information Superhighway, published in the prestigious Georgetown Law Journal (vol. 83, June 1995, pp. 1849-1935). Rimm knew of this difference, of course, and stated it clearly. (For further information see Hoffman & Novak, The Cyberporn Debate, http://ecommerce.vanderbilt.edu/cyberporn.debate.html and Franz Wegener, Cyberpornographie: Chronologie einer Hexenjagd, http://www.intro-online.de/c6.html)

Almost inevitably, anxieties accompany the introduction of new technologies. In the 19th century it was said that traveling by train was bad for one's health. The debate produced by Time magazine's cover story and Senator Grassley's attack created the impression that the Net had multiplied the possible dangers for children. The global communication networks seemed to be an inexhaustible source of mushrooming child pornography. Later, would-be bomb recipes found on the Net added to the already prevailing anxieties. As most people, even in industrialized countries, still have little or no first-hand experience with the Net, anxieties about child pornography or terrorist attacks can be stirred up and exploited easily.

A similar and related debate is going on about the glorification of violence and erotic depictions in the media. Pointing to a "toxic popular culture" shaped by media that "distort children's view of reality and even undermine their character growth", US right-wing social welfare organizations and think tanks call for strong media censorship. (See An Appeal to Hollywood, http://www.media-appeal.org/appeal.htm) Media, especially films and videos, are already censored and rated, so what is wanted is more censorship.

The motives for stimulating a debate on child pornography on the Net were manifold: inter alia, it served the Republican Party in attacking Democrat Al Gore's initiative to supply all public schools with Internet access; additionally, the big media corporations realized that because of the Net they might have to face new competitors, and rushed to press for content regulation. Taking all these motives together, we can say that this still ongoing debate constitutes the first and best known attempt to impose content regulation on the Net. Consequently, at least in Western countries, governments and media corporations refer to child pornography to justify legal requirements and the implementation of technologies for the surveillance and monitoring of individuals, the filtering, rating and blocking of content, and the prohibition of anonymous publishing on the Net.

In the name of "cleaning" the Net of child pornography, our basic rights are restricted. It now seems that it is the insistence on unrestricted basic rights that needs to be justified.

Underlying the campaign to control the Net are several assumptions, inter alia: the Net lacks control and needs to be made safe and secure; we may be exposed inadvertently to pornographic content; this content is harmful to children. Remarkably, racism seems not to be an issue.

The Net, especially the World Wide Web, is not like television (although it is to be feared that this is what it might become within the next few years). Say little Mary types "Barbie" into a search engine. Click here to see what happens. It is true that you sometimes have the opportunity to see that pornography is just a few mouse clicks away, but you are not likely to be exposed to pornographic content unless you make deliberate mouse clicks.

In reaction to these anxieties, but in the absence of data on how children use the Internet, the US government passed the Communications Decency Act (CDA) in 1996. In response the Electronic Frontier Foundation (EFF) launched the famous Blue Ribbon Campaign and, among others, America Online and Microsoft Corporation supported a lawsuit by the American Civil Liberties Union (ACLU) against the Act. On June 26, 1997, the US Supreme Court ruled the CDA unconstitutional under the provisions of the First Amendment: the Communications Decency Act violated the basic right to free expression. After a summit with the US government, industry leaders announced that they would use existing rating and blocking systems for "inappropriate" online resources and develop new ones.

So, after the failure of the CDA, the US government shifted its responsibility to the industry by inviting corporations to take on governmental tasks. Bearing in mind the Compuserve case and its possible consequences, the industry welcomed this decision and was quick to call this newly assumed responsibility "self-regulation". Strictly speaking, "self-regulation" as meant by the industry does not amount to corporations regulating their own behaviour. On the contrary, "self-regulation" is to be understood as the regulation of users' behaviour by the rating, filtering and blocking of Internet content considered inappropriate. The Internet industry tries to show that technical solutions are preferable to legislation, and wants to make sure that it is not held responsible and liable for illegal, offensive or harmful content. A new Compuserve case and a new Communications Decency Act are to be averted.

The Memorandum Self-regulation of Internet Content, released in late 1999 by the Bertelsmann Foundation, recommends that the Internet industry join forces with governmental institutions to enforce codes of conduct and encourage the implementation of filtering and rating systems. For further details on the Memorandum see the study by the Center for Democracy and Technology, An Analysis of the Bertelsmann Foundation Memorandum on Self-Regulation of Internet Content: Concerns from a User Empowerment Perspective.

In fact, the "self-regulation" of the Internet industry is privatized censorship performed by corporations and right-wing NGOs. Censorship has become a business. "Crucially, the lifting of restrictions on market competition hasn't advanced the cause of freedom of expression at all. On the contrary, the privatisation of cyberspace seems to be taking place alongside the introduction of heavy censorship." (Richard Barbrook and Andy Cameron, The Californian Ideology)

While trying to convince us that its technical solutions are appropriate alternatives to government regulation, the Internet industry cannot dispense with governmental backing to enforce the proposed measures. This adds to and reinforces the censorship measures already undertaken by governments. We are encouraged to use today's information and communication technologies, while the flow of information is restricted.

According to a report by Reporters Sans Frontières, quoted by Leonard R. Sussman in his essay Censor Dot Gov. The Internet and Press Freedom 2000, the following countries totally or largely control Internet access: Azerbaijan, Belarus, Burma, China, Cuba, Iran, Iraq, Kazakhstan, Kirghizstan, Libya, North Korea, Saudi Arabia, Sierra Leone, Sudan, Syria, Tajikistan, Tunisia, Turkmenistan, Uzbekistan, and Vietnam.

TEXTBLOCK 1/28 // URL: http://world-information.org/wio/infostructure/100437611742/100438658968
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services, computers have become communication devices instead of solely computational or typewriter-like devices. Corporate networks have become increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests draw further distinctions by splitting the easily understandable term "proprietary networks" into terms that need explaining, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online sells its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising, because "organizing" is increasingly equated with the "filtering" of seemingly objectionable messages and the "rating" of content. For more information on these issues, click here. If you want to know more about the technical nature of computer networks, here is a link to the corresponding article in the Encyclopaedia Britannica.

Especially for financial transactions, secure proprietary networks are becoming increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated to be above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in equities trading. They offer faster trading at reduced costs and better prices on trades for brokers and institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 2/28 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
The North against the South?

"Faced with this process of globalization, most governments appear to lack the tools required for facing up to the pressure from important media changes. The new global order is viewed as a daunting challenge, and it most often results in reactions of introversion, withdrawal and narrow assertions of national identity. At the same time, many developing countries seize the opportunity represented by globalization to assert themselves as serious players in the global communications market."
(UNESCO, World Communication Report)

The big hope of the South is that the Internet will close the education and economic gaps by making education easier to attain. But in reality the gap is impossible to close, because the North does not keep still; it develops further and further all the time, inventing new technologies, each of which produces another gap. The farmer's boy sitting in the desert, using a cellular telephone and a computer at the same time, is a sarcastic picture - nothing else.

Still, the so-called developing countries regard modern communication technologies as a tremendous opportunity - and indeed, what other choice is left to them?

TEXTBLOCK 3/28 // URL: http://world-information.org/wio/infostructure/100437611730/100438659376
 
Cartoons

The technique of cartoons is simplicity.
Images are easier to remember than texts.
Frequently cartoons make jokes about politicians, either friendly or directed against the person shown. In the first decades of the 20th century, cartoons were also used for propaganda against artists; remember the famous cartoons portraying Oscar Wilde as a criminal, which aimed to destroy his popularity.
As a political tool the cartoon has had fatal consequences by fixing stereotypes that could never again be erased, even when exposed as pure disinformation. The most famous are the cartoons about Jews, which were distributed not only by Germans and Austrians but all over Europe, as early as the 1910s and 1920s. Most horrifying is the fact that many of those old fascist and racist cartoons are coming back now, in only slightly different designs.

TEXTBLOCK 4/28 // URL: http://world-information.org/wio/infostructure/100437611661/100438658509
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille: a card with holes, through which the message is written on the paper beneath. Afterwards the card is removed and the blanks are filled in, so the message looks like an ordinary letter. The recipient needs to own an identical card; a small sketch of the principle follows the entries below.

- Bishop John Wilkins invents a cryptologic system that looks like music notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken form of encryption that was already used by the ancient Indians.

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day
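
A minimal sketch of the grille described above, assuming a 4x4 card with four holes at fixed positions; the hole layout, secret message and filler text are purely illustrative:

```python
# A toy Cardano-style grille: the shared secret is the card, i.e.
# the positions of the holes. The layout below is illustrative.

HOLES = [(0, 1), (1, 3), (2, 0), (3, 2)]  # (row, column) of each hole

def encode(secret, filler):
    grid = [[None] * 4 for _ in range(4)]
    for (r, c), ch in zip(HOLES, secret):   # write through the holes
        grid[r][c] = ch
    filler = iter(filler)
    for r in range(4):                      # fill in the blanks so the
        for c in range(4):                  # result reads like plain text
            if grid[r][c] is None:
                grid[r][c] = next(filler)
    return "".join("".join(row) for row in grid)

def decode(text):
    # The recipient lays the same card on the text and reads the holes.
    return "".join(text[r * 4 + c] for r, c in HOLES)

msg = encode("MEET", "ABCDEFGHIJKL")
print(msg, "->", decode(msg))  # -> MEET
```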

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one suspects the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine; he also develops the binary scale that computers still use today, on which character codes such as ASCII are built

18th century this is the time of the Black Chambers of espionage in Europe, with Vienna having one of the most effective, the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy the letters and return them to the post office the same morning. Supposedly about 100 letters are dealt with each day.

1790s Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found, which later makes it possible to decipher the Egyptian hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which is actually not a code but an enciphered alphabet of short and long signals. The first Morse code message is sent by telegraph in 1844.
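
What this "enciphered alphabet" means can be sketched in a few lines; the miniature lookup table below covers only a handful of letters and is purely illustrative:

```python
# Morse is a fixed substitution: one letter, one pattern of short and
# long signals. It provides no secrecy, only a transmission alphabet.

MORSE = {"E": ".", "T": "-", "S": "...", "O": "---",
         "A": ".-", "N": "-.", "M": "--", "I": ".."}

def to_morse(text):
    # Letters are separated by spaces in the output.
    return " ".join(MORSE[ch] for ch in text.upper())

print(to_morse("sos"))  # ... --- ...
```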

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography profoundly, as codes become absolutely necessary

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski publishes a cryptanalysis of the Vigenère cipher, which had been considered unbreakable for ages
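
A minimal sketch of Kasiski's idea: repeated fragments in a Vigenère ciphertext tend to be enciphered by the same stretch of key, so the distances between repeats are usually multiples of the key length. The toy ciphertext and fragment length below are illustrative:

```python
# Kasiski examination: collect distances between repeated trigrams;
# their greatest common divisor constrains the key length.

from math import gcd
from functools import reduce

def kasiski_distances(ciphertext, n=3):
    seen, distances = {}, []
    for i in range(len(ciphertext) - n + 1):
        frag = ciphertext[i:i + n]
        if frag in seen:
            distances.append(i - seen[frag])
        seen[frag] = i
    return distances

ct = "LXFOPVEFRNHRLXFOPVEFRNHR"  # toy text repeating every 12 letters
d = kasiski_distances(ct)
print(d, "-> the key length divides", reduce(gcd, d))
```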

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography again and makes them even more important

TEXTBLOCK 5/28 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Identification in history

In biometric technology, the subject is reduced to its physical and therefore inseparable properties. The subject is a subject in so far as it is objectified; that is, in so far as it is identified with its own res extensa, Descartes' "extended thing". The subject exists in so far as it can be objectified; if it resists the objectification that comes with measurement, it is rejected or punished. Biometrics therefore provides the ultimate tool for control; in it, the dream of hermetic identity control seems to become a reality, a modern technological reconstruction of traditional identification techniques such as the handshake or the look into somebody's eyes.

The use of identification by states and other institutions of authority is evidently not simply a modern phenomenon. The ancient Babylonians and Chinese already made use of fingerprinting on clay to identify the authors of documents, while the Romans systematically compared handwriting.

Body measurement has long been used by the military. One of the first measures after entering the military is the identification and appropriation of a soldier's body measurements. These measurements are filed, combined with other data, and make up what today we would call the soldier's data body. With his data body in the possession of the authority, a soldier is no longer able to socialise freely and is instead dependent on the disciplinary structure of the military institution. The soldier's social being in the world is defined by the military institution.

However, the military and civilian spheres of modern societies are no longer distinct entities. The very ambivalence of advanced technology (dual-use technologies) has meant that "good" and "bad" uses of technology can no longer be clearly distinguished. The measurement of physical properties and the creation of data bodies is therefore no longer a military prerogative; it has diffused into all areas of modern societies.

If the emancipatory potential of weak identities is to be of use, it is therefore necessary to know how biometric technologies work and what uses they are put to.

TEXTBLOCK 6/28 // URL: http://world-information.org/wio/infostructure/100437611729/100438658096
 
ZNet

ZNet provides forum facilities for online discussion and chatting on various topics ranging from culture and ecology to international relations and economics. ZNet also publishes daily commentaries and maintains a Web-zine, which addresses current news and events as well as many other topics, trying to be provocative, informative and inspiring to its readers.

Strategies and Policies

Daily Commentaries: ZNet's commentaries address current news and events, cultural happenings, and organizing efforts, providing context, critique, vision, and analysis, but also references to or reviews of broader ideas, new books, activism, the Internet, and other topics that strike the diverse participating authors as worthy of attention.

Forum System: ZNet provides a private (and soon also a public) forum system. The fora are concerned with topics such as activism, culture, community/race/religion/ethnicity, ecology, economics/class, gender/kinship/sexuality, government/polity, international relations, ParEcon, vision/strategy and popular culture. Each forum has a set of threaded discussions, as do the fora hosted by commentary writers like Chomsky, Ehrenreich, Cagan, Peters and Wise.

ZNet Daily WebZine: ZNet Daily WebZine offers commentaries in web format.

Z Education Online (planned): The Z Education Online site will provide instructionals and courses of diverse types as well as other university-like, education-aimed features.

TEXTBLOCK 7/28 // URL: http://world-information.org/wio/infostructure/100437611734/100438659288
 
The Kosovo-Crisis

During the Kosovo crisis and the war that followed, and probably also after it, all sides of the conflict manipulated their own people and others as well, whenever they could. Some of the propaganda shown on TV was as primitive as in World War II; some was subtler. This propaganda started with telling the history of the disputed territory from one's own point of view, went on with interpreting the motives of the enemy, and finally came to censorship, manipulation of the numbers of victims (for more information see: http://www.oneworld.org/index_oc/kosovo/kadare.html), the spreading of atrocity stories and so on.
Many journalists and scientists are still working to detect more propaganda and disinformation stories.

An interesting detail about this war is that more people than ever before took their information about the war from the Internet. In part this had to do with the biased TV reports on all sides. All parties put their ideas and perspectives on the Net, so one could get an overview of the different views and types of disinformation.
One of NATO's big lies concerned the number of destroyed military facilities in Serbia. After the war the numbers had to be corrected down to a ridiculously small figure of about 13 destroyed tanks. At the same time the number of civilian victims turned out to be much higher than NATO had admitted at first. European and American people had been persuaded to support the NATO bombings by the promise that only military or military-related facilities would be bombed. Nearly every day NATO had to stretch this interpretation, as many civilian houses were destroyed. A cynical term was coined for this kind of excuse: collateral damage.

The Serbs were no better than the Western governments and media, which worked together closely. Serb TV showed the bombed targets, compared persons like Bill Clinton to Adolf Hitler, and called NATO fascist. On the other hand, pictures of the situation in Kosov@ were left out of their reports.

More:
http://www.voa.gov/editorials/08261.htm (91)
http://www.foreignpolicy-infocus.org/progresp/vol3/prog3n22.html (92)
http://www.serbia-info.com/news (93)
http://www.nyu.edu/globalbeat/syndicate/Belgrade041399.html (94)
http://www.monde-diplomatique.fr/1999/08/SAID/12320.html (95)

TEXTBLOCK 8/28 // URL: http://world-information.org/wio/infostructure/100437611661/100438658714
 
Global Data Flows

Fiber-optic cables, coaxial cables, copper wires, electric power lines, microwaves, satellite communication, mobile telephony, computer networks: various telecommunication networks following a variety of standards with bewildering abbreviations - DSL, WAP, GSM, UMTS, IPv4 etc. - and carrying endless flows of capital and information are the blood veins of modern societies.

In the space of flows constituted by today's global data networks, the space of places is transcended. Visualizations of these global data flows show arcs bridging seas and continents, linking the world's centres of research and development, economics and politics. In the global "Network Society" (Manuel Castells) the traditional centres of power and domination are not discarded; on the contrary, they are strengthened and reinforced by the use of information and communication technologies. Political, economic and symbolic power becomes increasingly linked to the use of modern information and communication technologies. The most sensitive and advanced centres of information and communication technologies are the stock markets. Excluded from the network constituted by modern information and communication technologies, large parts of Africa, Asia and South America, but also the poor of industrialized countries, rank increasingly marginal to the world economy.

Cities are centres of communications, trade and power. The higher the percentage of urban population, the more likely it is that the telecommunications infrastructure is good to excellent. This goes hand in hand with lower telecommunications costs. Those parts of the world with the poorest infrastructure are also the world's poorhouse. In Bangladesh a personal computer is, for most of the population, as expensive as a limousine is in Europe: while Europeans pay about one month's salary for it, they have to pay eight annual salaries. Telecommunications infrastructure is therefore concentrated in the highly industrialized world: most telephone mainlines, mobile telephones, computers, Internet accounts and Internet hosts (computers connected to the global data networks) can be found here. The same applies to the media: the daily circulation of newspapers and the use of TV sets and radios. Telecommunication and media services affordable to most parts of the population are mostly restricted to industrialized countries.

This situation will not change in the foreseeable future: Most expenditure for telecommunications infrastructure will be restricted to the richest countries in the world. In 1998, the world's richest countries consumed 75% of all cables and wires.

TEXTBLOCK 9/28 // URL: http://world-information.org/wio/infostructure/100437611791/100438658776
 
Economic structure; digital euphoria

The dream of a conflict-free capitalism appeals to a diverse audience. No politician can win elections without eulogising the benefits of the information society and promising universal wealth through informatisation. "Europe must not lose track and should be able to make the step into the new knowledge and information society in the 21st century", said Tony Blair.

The US government has declared the construction of a fast information infrastructure network the centerpiece of its economic policies.

In Lisbon the EU heads of state agreed to accelerate the informatisation of the European economies.

The German Chancellor Schröder has requested the industry to create 20,000 new informatics jobs.

The World Bank understands information as the principal tool for third world development.

Electronic classrooms and on-line learning schemes are seen as the ultimate advance in education by politicians and industry leaders alike.

But in the informatised economies, traditional exploitative practices are obscured by the glamour of new technologies. And the nearly universal acceptance of the ICT message has prepared the ground for a revival of 19th century "adapt-or-perish" ideology.

"There is nothing more relentlessly ideological than the apparently anti-ideological rhetoric of information technology"

(Arthur and Marilouise Kroker, media theorists)

TEXTBLOCK 10/28 // URL: http://world-information.org/wio/infostructure/100437611726/100438658999
 
Virtual cartels, oligopolistic structures

Global networks require global technical standards ensuring the compatibility of systems. Being able to define such standards makes a corporation extremely powerful. And it requires the suspension of competitive practices. Competition is relegated to the symbolic realm. Diversity and pluralism become the victims of the globalisation of baroque sameness.

The ICT market is dominated by incomplete competition aimed at short-term market domination. In a very short time, new ideas can turn into best-selling technologies. Innovation cycles are extremely short. But today's state-of-the-art products are embryonic trash.

    According to the Computer and Communications Industry Association, Microsoft is trying to aggressively take over the network market. This would mean that AT&T would control 70% of all long distance phone calls and 60% of cable connections.



    AOL and Yahoo are lone leaders in the provider market. AOL has 21 million subscribers in 100 countries. In a single month, AOL registers 94 million visits. Two thirds of all US internet users visited Yahoo in December 1999.



    The world's 13 biggest internet providers are all American.



    AOL and Microsoft have concluded a strategic cross-promotion deal. In the US, the AOL icon is installed on every Windows desktop. AOL has also concluded a strategic alliance with Coca Cola.


TEXTBLOCK 11/28 // URL: http://world-information.org/wio/infostructure/100437611709/100438658963
 
Hill & Knowlton

Although it is generally hard to distinguish between public relations and propaganda, Hill & Knowlton, the world's leading PR agency, is an extraordinary example of the manipulation of public opinion through public relations activities. Hill & Knowlton not only lobbied for countries accused of human rights abuses, like China, Peru, Israel, Egypt and Indonesia, but also represented the repressive Duvalier regime in Haiti.

It furthermore played a central role in the Gulf War. On behalf of the Kuwaiti government it presented a 15-year-old girl to testify before Congress about human rights violations in a Kuwaiti hospital. The girl, later found out to be the daughter of Kuwait's ambassador to the U.S., and her testimony became the centerpiece of a finely tuned PR campaign orchestrated by Hill & Knowlton and co-ordinated with the White House on behalf of the government of Kuwait and the Citizens for a Free Kuwait group. Inflaming public opinion against Iraq and bringing the U.S. Congress in favor of war in the Gulf, this was probably one of the largest and most effective public relations campaigns in history.

Running campaigns against abortion for the Catholic Church and representing the Church of Scientology, large PR firms like Hill & Knowlton scarcely hesitate to manipulate public and congressional opinion and government policy through media campaigns, congressional hearings, and lobbying when necessary. Co-operation with intelligence agencies also seems not to be unknown to Hill & Knowlton.

Accused of pursuing potentially illegal proxy spying operations for intelligence agencies, Richard Cheney, head of Hill & Knowlton's New York office, denied these allegations, but said that "... in such a large organization you never know if there's not some sneak operation going on." On the other hand, former CIA official Robert T. Crowley acknowledged that "Hill & Knowlton's overseas offices were perfect 'cover' for the ever-expanding CIA. Unlike other cover jobs, being a public relations specialist did not require technical training for CIA officers." Furthermore, Crowley admitted, the CIA used its Hill & Knowlton connections to "... put out press releases and make media contacts to further its positions. ... Hill & Knowlton employees at the small Washington office and elsewhere distributed this material through CIA assets working in the United States news media."

(Source: Carlisle, Johan: Public Relationships: Hill & Knowlton, Robert Gray, and the CIA. http://mediafilter.org/caq/)

TEXTBLOCK 12/28 // URL: http://world-information.org/wio/infostructure/100437611652/100438658088
 
1940s - 1950s: The Development of Early Robotics Technology

During the 1940s and 1950s two major developments enabled the design of modern robots. Robotics is generally based on two related technologies: numerical control and teleoperators.

Numerical control was invented during the late 1940s and early 1950s. It is a method of controlling machine tool axes by means of numbers that have been coded on a medium. The first numerical control machine was presented in 1952 at the Massachusetts Institute of Technology (MIT), whose subsequent research led to the development of APT (Automatically Programmed Tools). APT, a language for programming machine tools, was designed for use in computer-assisted manufacturing (CAM).
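
A minimal sketch of the idea behind numerical control, assuming a toy two-axis machine and a made-up command format (one axis letter plus a target coordinate); this illustrates the principle only, not APT or any historical NC format:

```python
# Numerical control in miniature: axis movements are driven by
# numbers read from a coded medium (here, a list standing in for
# a punched tape). The command format is purely illustrative.

program = ["X10", "Y5", "X12", "Y0"]   # the "punched tape"

position = {"X": 0, "Y": 0}
for command in program:
    axis, target = command[0], int(command[1:])
    position[axis] = target            # move that axis to the coordinate
    print(f"move {axis} -> {target}, position now {position}")
```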

The first teleoperators were developed in the early 1940s. Teleoperators are mechanical manipulators controlled by a human from a remote location. In a typical application a human moves a mechanical arm and hand, and its movements are duplicated at another location.

TEXTBLOCK 13/28 // URL: http://world-information.org/wio/infostructure/100437611663/100438659348
 
Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers which ridicules dreams of increased competition.

Recent mega-mergers and acquisitions include

SBC Communications - Ameritech, $72.3 bn

Bell Atlantic - GTE, $71.3 bn

AT&T - MediaOne, $63.1 bn

AOL - Time Warner, $165 bn

MCI WorldCom - Sprint, $129 bn

The total value of all major mergers since the beginning of the 1990s amounts to 20 trillion dollars, 2.5 times the size of the USA's GDP.

The AOL - Time Warner merger reflects a trend which can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance in complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 14/28 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking of electronic machinery. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both, developed for atomic energy laboratories, could handle enormous amounts of data, but they were costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, which gave them the flexibility to be cost-effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new careers.

TEXTBLOCK 15/28 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Iris recognition

Iris recognition relies upon the fact that every individual's iris has a unique structure. The iris landscape is composed of a corona, crypts, filaments, freckles, pits, radial furrows and striations. Iris scanning is considered a particularly accurate identification technology because the characteristics of the iris do not change during a person's lifetime, and because there are several hundred variables in an iris which can be measured. In addition, iris scanning is fast: it does not take longer than one or two seconds.

These characteristics have made iris scanning an attractive technology for high-security applications such as prison surveillance. Iris technology is also used for online identification, where it can substitute for identification by password. As with other biometric technologies, the use of iris scanning for the protection of privacy is a two-edged sword. The prevention of identity theft applies horizontally but not vertically: the data retrieval that accompanies identification, and the data body which is created in the process, have nothing to do with identity theft.

TEXTBLOCK 16/28 // URL: http://world-information.org/wio/infostructure/100437611729/100438658334
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure, and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of networks have to agree on. So there is no way to ever gain ultimate control of the Matrix/Internet.
The in some respects decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and no common standards sustaining basic operations: there are authorities for IP number and domain name registration, for example.
The organizational structures for Internet administration have changed over time according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies make decisions on common guidelines, such as communication protocols, cooperatively, so that compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consensus about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just express the prevailing consensus. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulations, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.
For a detailed report on Internet governance, click here.

TEXTBLOCK 17/28 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe and unbreakable. Deciphering a text means going through many, sometimes nearly - but never truly - endless attempts. For today's computers it might take hundreds of years or more to go through all possible keys, but in the end the code remains breakable. Much faster quantum computers will prove that one day.
Therefore the choice of a certain method of enciphering is finally a matter of trust.
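
How long "going through all possibilities" takes can be estimated with simple arithmetic: the keyspace doubles with every added key bit. A minimal sketch, assuming an illustrative rate of one billion keys per second:

```python
# Back-of-the-envelope brute-force times. The rate of 10**9 keys
# per second is an assumed, illustrative figure, not a benchmark.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(key_bits, keys_per_second=10**9):
    # 2**key_bits possible keys, tried one after another.
    return 2**key_bits / keys_per_second / SECONDS_PER_YEAR

for bits in (40, 56, 128):
    print(f"{bits}-bit key: ~{years_to_exhaust(bits):.2e} years")
```

The point of the sketch is the exponential jump: a 40-bit keyspace falls in minutes, a 56-bit one in years, while a 128-bit one would take longer than the age of the universe at the assumed rate - yet the code remains breakable in principle, which is the argument made above.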

For the average computer user it is rather difficult to understand or even realize the dangers and/or the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others - the specialists - and relying on the information they provide.
The websites explaining the underlying problems (and also the articles and books concerning the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background appears as an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. Those media reports often specialize in talking about the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing the money in one's bank account if someone steals one's credit card number or other important financial data. The term "security", surely connected to those issues, is a completely different one from the security that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

for the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 18/28 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Changes

Still, disinformation and propaganda are nothing magic. They can change things, but supposedly only if those things/meanings/opinions are not completely fixed. Galileo Galilei was not capable of changing people's conviction that the earth was the centre of the universe until some people began to suspect that this long-believed truth was a mistake. The publicity of his experiments, which made him famous, was not enough. On the other hand, all the later propaganda of his enemies could not turn back the wheel of enlightenment, as people came to think it more logical to believe in a moving earth than in a fixed one.
It is never just a single idea that changes. Society follows the changes.

Thinking about disinformation brings us to the word truth, of course, and to the suspicion that there is no definite truth. Truth can easily be manipulated into another truth: just present some facts that seem logical and there you have a new truth. And if the facts can supposedly be proved by empirical studies, then the quality of that truth definitely rises.
That is what ideologies do all the time. And the media like to do the same thing - as a game with power, or a mere presentation of power?

But of course there also exist bits of disinformation which are more amusing than evil or dangerous:
- the theory of the celestro-centric world ("Hohlwelttheorie", the hollow earth theory)
- the story of the German philosopher who invented an Italian philosopher, wrote books about him and even reprinted "his" texts, which had supposedly been lost 100 years ago - and who finally lost his job and his career when other scholars found out that everything had been made up.

TEXTBLOCK 19/28 // URL: http://world-information.org/wio/infostructure/100437611661/100438658633
 
Global data bodies - intro

- Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity ...

Critical Art Ensemble

Global data bodies

1. Introduction

Informatisation has meant that things that once were "real", i.e. whose existence could be experienced sensually, are becoming virtual. Instead of the real existence of a thing, the virtual refers to its possibility of existence. As this process advances, an increasing identification of the possible with the real occurs. Reality migrates into a dim and dematerialised grey area. In the end the possible counts as the real; virtualisation creates an "as-if" experience.

The experience of the body is also affected by this process. For example, in bio-technology, the human body and its functions are digitised, which prepares an understanding of the body exclusively in terms of its potential manipulation: the body becomes whatever it could be. But digitisation has not only affected the understanding and the social significance of the body, it has also altered the meaning of presence, traditionally identified with the body. The advance of information and communication technologies (ICTs) has meant that for an increasing number of activities we no longer need to be physically present; our "virtual" presence, achieved by logging onto an electronic information network, is sufficient.

This development, trumpeted as the pinnacle of convenience by the ICT industries and governments interested in attracting investment, has deeply problematic aspects as well. For example, when it is no longer "necessary" to be physically present, it may soon no longer be possible or allowed. Online banking, offered to customers as a convenience, also serves as a justification for charging higher fees to those unwilling or unable to add banking to their household chores. Online public administration may be expected to lead to similar effects. The reason for this is that the digitalisation of the economy relies on the production of surplus data. Data has become the most important raw material of modern economies.

In modern economies, informatisation and virtualisation mean that people are structurally forced to carry out their business and live their lives in such a way as to generate data.

Data are the most important resource for the New Economy. By contrast, activities which do not leave behind a trace of data, such as growing your own carrots or paying cash rather than by plastic card, are discouraged and structurally suppressed.

TEXTBLOCK 20/28 // URL: http://world-information.org/wio/infostructure/100437611761/100438659649
 
Legal Protection: National Legislation

Intellectual property - comprising industrial property and copyright - in general is protected by national legislation. Therefore those rights are limited territorially and can be exercised only within the jurisdiction of the country or countries under whose laws they are granted.

TEXTBLOCK 21/28 // URL: http://world-information.org/wio/infostructure/100437611725/100438659540
 
Challenges for Copyright by ICT: Digital Content Providers

Providers of digital information might be confronted with copyright-related problems when using some of the special features of hypertext media, like frames and hyperlinks (which both use third-party content available on the Internet to enhance a webpage or CD-ROM), or when operating a search engine or online directory on their website.

Framing

Frames are often used to help define, and navigate within, a content provider's website. Still, when they are used to present (copyrighted) third-party material from other sites, issues of passing off and misleading or deceptive conduct, as well as copyright infringement, immediately arise.

Hyperlinking

It is generally held that the mere creation of a hyperlink does not, of itself, infringe copyright, as the words indicating a link or the displayed URL are usually unlikely to be considered a "work". Nevertheless, if a link is clicked on, the user's browser will download a full copy of the material at the linked address, creating a copy in the RAM of his computer courtesy of the address supplied by the party that published the link. Although it is widely agreed that permission to download material over the link must be part of an implied license granted by the person who made the material available on the web in the first place, the scope of this implied license is still the subject of debate. Another option that has been discussed is to consider linking fair use.

Furthermore, hyperlinks and other "information location tools", like online directories or search engines, could cause their operators trouble if they refer or link users to a site that contains infringing material. In this case it is as yet unclear whether providers can be held liable for infringement.

TEXTBLOCK 22/28 // URL: http://world-information.org/wio/infostructure/100437611725/100438659590
 
Abstract

Disinformation is part of human communication. Thousands of years ago it was already used as a political medium. In the age of mass-communication and information its possibilities have grown tremendously. It plays an important role in many different fields, together with its "companion" propaganda. Some of these fields are: politics, international relations, the (mass-)media and the internet, but also art and science.
There is no evidence at all for a disappearance of disinformation. On this account it is important to understand where it comes from, what its tools are and how nations (democratic as well as totalitarian systems), international organizations and the media work with it or against it.
This report tries to give a short insight into this topic:
- on a theoretical level
- by demonstrating cases of disinformation, like the 2nd Chechnya War in 1999.

TEXTBLOCK 23/28 // URL: http://world-information.org/wio/infostructure/100437611661/100438658038
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there are server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operation has long depended on global networking.

At the top are credit card companies such as Visa, American Express, MasterCard, and others. It would be erroneous to believe that the primary business of these companies is the provision of credit and the facilitation of credit information for sale transactions. In fact, information means much more than just credit information. In an advertisement of 1982, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the life blood of society, as Bill Gates puts it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within a data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimisation - for almost every decision taken. Data bunkers therefore have an essentially conservative political function and serve to maintain and strengthen the given social structures.

TEXTBLOCK 24/28 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
Disinformation and Science

Disinformation's tools emerged from science and art.
Furthermore, disinformation can happen in politics, of course, but also in science:
for example, by launching ideas which have not been exactly proven at the moment of publication, e.g. the thought that time runs backwards in parts of the universe:
http://www.newscientist.com/ns/19991127/newsstory3.html

TEXTBLOCK 25/28 // URL: http://world-information.org/wio/infostructure/100437611661/100438658699
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage and numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties in measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because since the inception of the ARPANet, the offspring of the Internet, network measurement was an important task. The very first ARPANet site was established at the University of California, Los Angeles, and intended to be the measurement site. There, Leonard Kleinrock further on worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet stated that the measurement of the Internet. This was due to two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet to a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth and performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too, the corporate networks and Intranets, because they are convinced (that is, their assertion is put forward without accompanying empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."

Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, the traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network. The private line networks are considerably larger in aggregate capacity than the Internet - about as large as the U. S. voice network (an effective bandwidth of about 330 Gbps in December 1997) - yet they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from users staying online longer and from multimedia applications, both of which consume more bandwidth and account for unanticipated amounts of data traffic.
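
A rough way to check this projection is to compute the crossover point of two exponential growth curves. In the Python sketch below, only the roughly 100% yearly growth rate of Internet traffic comes from the text; the initial voice-to-data traffic ratio and the growth rate of voice traffic are illustrative assumptions, not figures from Coffman and Odlyzko.

    import math

    # data(t) = D0 * g_data**t overtakes voice(t) = V0 * g_voice**t when
    # t = ln(V0 / D0) / ln(g_data / g_voice).
    D0, V0 = 1.0, 15.0  # end-1997 traffic volumes, normalized; the 15x ratio is an assumption
    g_data = 2.0        # ~100% yearly growth of Internet traffic (from the text)
    g_voice = 1.1       # ~10% yearly growth of voice traffic (an assumption)

    t = math.log(V0 / D0) / math.log(g_data / g_voice)
    print(f"Data traffic overtakes voice around {1997 + t:.0f}")  # -> around 2002

With these inputs the crossover lands near the year 2002 cited above; the result is quite sensitive to the assumed starting ratio, which is one reason such projections vary so widely.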

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to determine the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply; you do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But doing this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection onto all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one other flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
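
The sampling idea is simple enough to sketch. The following Python fragment (a simplification for illustration, not ISC's actual survey code) pings a small random sample of the addresses that have names assigned and projects the response rate onto the full list; it shells out to the ping command of a Linux-like system, and the input file is hypothetical.

    import random
    import subprocess

    def is_alive(address: str) -> bool:
        """Send a single ICMP echo request; True if the host replied."""
        result = subprocess.run(["ping", "-c", "1", "-W", "1", address],
                                capture_output=True)
        return result.returncode == 0

    def estimate_host_count(named_addresses: list, sample_rate: float = 0.01) -> int:
        """Ping a random sample and project the response rate onto all addresses."""
        size = max(1, int(len(named_addresses) * sample_rate))
        sample = random.sample(named_addresses, size)
        alive = sum(is_alive(a) for a in sample)
        return round(len(named_addresses) * alive / size)

    # addresses = open("named_addresses.txt").read().split()  # hypothetical input
    # print(estimate_host_count(addresses))

Note that such a projection counts only pingable hosts: machines behind firewalls that discard ICMP packets simply vanish from the estimate, one more reason to treat host counts with caution.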

Internet Weather

Like the daily weather, the traffic on the Internet - the conditions for data flows - is monitored too, hence the name Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, which displays traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts) and to compare their response times to past ones and to the response times of servers in the same region.
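
The comparison logic can be sketched in a few lines of Python. Two caveats: the rating formula below is my own simplification, not the Internet Traffic Report's actual algorithm, and the timing of a TCP handshake is used as a stand-in for an ICMP ping, since it needs no special privileges.

    import socket
    import time

    def response_time(host: str, port: int = 80, timeout: float = 2.0):
        """Round-trip time of a TCP handshake; None if the host is unreachable."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return time.monotonic() - start
        except OSError:
            return None

    def weather_rating(current, past_average: float) -> int:
        """0 = unreachable; 100 = as fast as (or faster than) the recorded average."""
        if current is None:
            return 0
        return min(100, int(100 * past_average / current))

    # rtt = response_time("www.example.com")         # any reachable server
    # print(weather_rating(rtt, past_average=0.05))  # past average: 50 ms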

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, and said to avoid these flaws. But even page views are not reliable. Users might share computers, and thus IP addresses and host names, with others; they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users have viewed a document.
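
The gap between hits and page views is easy to demonstrate on a Web server log. The Python sketch below counts both from a common-format access log; the log file name and the definition of what counts as a document are assumptions for the sake of the example.

    PAGE_SUFFIXES = (".html", ".htm", "/")  # what counts as a document (an assumption)

    hits = page_views = 0
    with open("access.log") as log:  # hypothetical common-format log file
        for line in log:
            try:
                request = line.split('"')[1]  # e.g. 'GET /index.html HTTP/1.0'
                path = request.split()[1]
            except IndexError:
                continue  # skip malformed lines
            hits += 1  # every requested file is a hit
            if path.endswith(PAGE_SUFFIXES):
                page_views += 1  # only whole documents count as page views

    print(hits, "hits, but only", page_views, "page views")

A document made of one HTML file and twenty graphics yields twenty-one hits but a single page view per visit, which is exactly the inflation described above.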

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something intrinsically qualitative: the importance of a column to its readers, for example. Readers may open a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session (a sketch of such session counting follows below). The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
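
Counting visits amounts to grouping requests into sessions. A widespread convention, sketched here in Python, is to start a new session whenever the same host has been silent for more than thirty minutes; the timeout is merely a convention, not a standard, and shared hosts still conflate different persons.

    from collections import defaultdict

    SESSION_TIMEOUT = 30 * 60  # seconds; a common convention, not a standard

    def count_visits(requests):
        """requests: (host, unix_timestamp) pairs, sorted by timestamp."""
        last_seen = {}
        visits = defaultdict(int)
        for host, ts in requests:
            if host not in last_seen or ts - last_seen[host] > SESSION_TIMEOUT:
                visits[host] += 1  # a new session for this host begins
            last_seen[host] = ts
        return sum(visits.values())

    # Two requests 10 seconds apart form one visit; another an hour later starts a second.
    print(count_visits([("10.0.0.1", 0), ("10.0.0.1", 10), ("10.0.0.1", 3700)]))  # -> 2
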
But there is a different reason for these registration services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system: in a certain sense, it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be certain that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.

For Fun

If you would rather play around with Internet statistics, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.

TEXTBLOCK 26/28 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three show a similar design: content is split up and spread over several servers, and when a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
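
The shared design principle can be illustrated with a toy example: split a document into pieces, each unreadable on its own, so that only reassembling all of them yields the content again. The XOR scheme below is the simplest possible split and purely a sketch of the idea; the real systems use far more elaborate encodings.

    import os

    def xor_bytes(blocks):
        """XOR a list of equal-length byte strings together."""
        acc = bytes(len(blocks[0]))
        for block in blocks:
            acc = bytes(a ^ b for a, b in zip(acc, block))
        return acc

    def split(document: bytes, n: int = 3) -> list:
        """Produce n pieces for n servers; any n-1 of them reveal nothing."""
        pieces = [os.urandom(len(document)) for _ in range(n - 1)]
        pieces.append(xor_bytes(pieces + [document]))
        return pieces

    def reassemble(pieces: list) -> bytes:
        return xor_bytes(pieces)

    shares = split(b"uncensorable text")  # store each piece on a different server
    assert reassemble(shares) == b"uncensorable text"

In this toy scheme the loss of a single piece destroys the document; systems like Publius therefore use threshold schemes, so that the content survives even if some of the servers disappear.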

The most advanced system seems to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position of radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable?, in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 27/28 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
Other biometric technologies

Other biometric technologies not specified here include ear recognition, signature dynamics, keystroke dynamics, vein pattern recognition, retinal scanning, body odour recognition, and DNA recognition. These are technologies which are either in early stages of development or are used in highly specialised and limited contexts.

TEXTBLOCK 28/28 // URL: http://world-information.org/wio/infostructure/100437611729/100438658399
 
Internet Protocol Number (IP Number)

Every computer using TCP/IP has a 32-bit Internet address, an IP number. This number consists of a network identifier and a host identifier. The network identifier is registered with and allocated by a Network Information Center (NIC); the host identifier is allocated by the local network administration.

IP numbers are divided into three classes. Class A is reserved for big organizations, Class B for medium-sized ones such as universities, and Class C is dedicated to small networks.

Because of the increasing number of networks worldwide, networks that belong together, such as the LANs forming a corporate network, are allocated a single network number.
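
The class of a given IP number can be read off the value of its first octet. A minimal Python sketch of the classful rule described above:

    def ip_class(address: str) -> str:
        """Classify a dotted-quad IP number by the value of its first octet."""
        first = int(address.split(".")[0])
        if first < 128:
            return "A"  # 8-bit network id, 24-bit host id: big organizations
        if first < 192:
            return "B"  # 16-bit network id, 16-bit host id: medium-sized networks
        if first < 224:
            return "C"  # 24-bit network id, 8-bit host id: small networks
        return "D/E"    # multicast and experimental ranges

    print(ip_class("18.7.22.69"))  # -> A
    print(ip_class("193.0.0.1"))   # -> C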

INDEXCARD, 1/25
 
Federal Networking Council

Established on behalf of the US government, the Federal Networking Council (FNC) acts as a forum for networking collaborations among Federal agencies to meet their research, education, and operational mission goals and to bridge the gap between the advanced networking technologies being developed by FNC research agencies and the ultimate acquisition of mature versions of these technologies from the commercial sector.

Its members are representatives of agencies such as the National Security Agency, the Department of Energy, and the National Science Foundation.

http://www.fnc.gov

INDEXCARD, 2/25
 
Themistocles

Themistocles, a Greek politician and general, defeated the Persians in the battle of Salamis in 480 BC. The Persians, under their king Xerxes, were on the verge of winning the battle, but were undone by a disinformation campaign that Themistocles launched: he told the Persians that he was on their side and willing to let them win, arguing that the Greeks were so busy with their quarrels that they were not prepared to fight an aggressive battle, and that many of them would change sides if the power of the Persians was demonstrated in a short and cruel fight. In the end Xerxes received the message that parts of the Greek forces were fleeing the battlefield. This disinformation led Xerxes to a wrong assessment of the situation, which made it easy for the Greeks to win the battle.

For further details see:
http://www.optonline.com/comptons/ceo/31900_Q.html

http://ds.dial.pipex.com/kitson/ESSAYS/Them.htm

INDEXCARD, 3/25
 
Calculator

Calculators are machines for automatically performing arithmetical operations and certain mathematical functions. Modern calculators are descendants of a digital arithmetic machine devised by Blaise Pascal in 1642. Later in the 17th century, Gottfried Wilhelm von Leibniz created a more advanced machine, and, especially in the late 19th century, inventors produced calculating machines that were smaller and smaller and less and less laborious to use.

INDEXCARD, 4/25
 
Martin Hellman

Martin Hellman was Whitfield Diffie's colleague in creating public-key cryptography in the 1970s.

INDEXCARD, 5/25
 
William Frederick Friedman

Friedman is considered the father of U.S.-American cryptoanalysis - he was also the first to use this term.

INDEXCARD, 6/25
 
User tracking

User tracking is a generic term that covers all the techniques for monitoring the movements of a user on a Web site. User tracking has become an essential component of online commerce, where no personal contact with customers is established, leaving companies with the predicament of not knowing whom they are talking to. Some companies, such as Red Eye, Cyber Dialogue, and SAS, offer complete technology packages for user tracking and data analysis to online businesses. Technologies include software solutions such as e-mine, e-discovery, or WebHound.

Whenever user tracking is performed without the explicit agreement of the user, or without laying open which data are collected and what is done with them, considerable privacy concerns are raised.
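
The basic mechanism is brief enough to sketch. In the server-side Python fragment below (all names are invented for illustration and describe no particular vendor's product), each new browser is assigned a random ID stored in a cookie, and every request is logged under that ID.

    import uuid
    from http.cookies import SimpleCookie

    def track(request_headers: dict, path: str, log) -> str:
        """Return the visitor's tracking ID, minting one if the browser is new."""
        cookie = SimpleCookie(request_headers.get("Cookie", ""))
        if "visitor" in cookie:
            visitor = cookie["visitor"].value
        else:
            visitor = uuid.uuid4().hex  # a fresh ID for an unknown browser
        log.write(f"{visitor} {path}\n")  # one log line per page request
        return visitor  # to be sent back via "Set-Cookie: visitor=<id>"

Grouping the log lines by ID afterwards reconstructs each visitor's movements through the site - precisely what raises the privacy concerns described above when it happens without the user's knowledge.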

http://www.redeye.co.uk/
http://www.cyberdialogue.com/
http://www.sas.com/
http://www.spss.com/emine/
http://www.sas.com/solutions/e-discovery/inde...
http://www.sas.com/products/webhound/index.ht...
http://www.linuxcare.com.au/mbp/meantime/
INDEXCARD, 7/25
 
Reuters Group plc

Founded in 1851 in London, Reuters is the world's largest news and television agency with 1,946 journalists, photographers and camera operators in 183 bureaus serving newspapers, other news agencies, and radio and television broadcasters in 157 countries.
In addition to its traditional news-agency business, Reuters provides over its network financial information and a wide array of electronic trading and brokerage services to banks, brokerage houses, companies, governments, and individuals worldwide.

http://www.reuters.com

INDEXCARD, 8/25
 
The Spot

http://www.thespot.com/

INDEXCARD, 9/25
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between the facts, thereby achieving a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but they have the advantage of being much cheaper than paying an expert or a team of specialists.
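
A toy version of these three elements in Python (the medical-sounding rules are invented for illustration; real systems hold far larger rule sets): a knowledge base of facts and rules, and an inference engine that keeps applying rules until nothing new can be derived, known as forward chaining.

    # Knowledge base: facts plus rules of the form (conditions -> conclusion).
    facts = {"fever", "cough"}
    rules = [
        ({"fever", "cough"}, "flu_suspected"),
        ({"flu_suspected", "short_breath"}, "see_doctor"),
    ]

    def infer(facts, rules):
        """Forward chaining: fire every rule whose conditions hold, until stable."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in rules:
                if conditions <= derived and conclusion not in derived:
                    derived.add(conclusion)
                    changed = True
        return derived

    print(infer(facts, rules))  # {'fever', 'cough', 'flu_suspected'} (order may vary)

The second rule never fires because "short_breath" is not among the facts - an instance of the limitation noted above: a rule-based system reacts only to conditions its rules explicitly describe.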

INDEXCARD, 10/25
 
Telnet

Telnet allows you to log in remotely to a computer connected to the Internet.

INDEXCARD, 11/25
 
Liability of ISPs

ISPs (Internet Service Providers), BBSs (Bulletin Board Service operators), systems operators and other service providers in the U.S. can usually be held liable for infringing activities that take place through their facilities under three theories: 1) direct liability: to establish direct infringement liability there must be some kind of direct volitional act; 2) contributory liability: a party may be liable for contributory infringement where it, "... with knowledge of the infringing activity, [it] induces, causes or materially contributes to the infringing activity of another." A person must therefore know or have reason to know that the subject matter is copyrighted and that particular uses violate copyright law; for contributory infringement to attach, there must be a direct infringement of which the contributory infringer has knowledge and which it encourages or facilitates; and 3) vicarious liability: a party may be vicariously liable for the infringing acts of another if it a) has the right and ability to control the infringer's acts and b) receives a direct financial benefit from the infringement. Unlike contributory infringement, knowledge is not an element of vicarious liability.


INDEXCARD, 12/25
 
Open Systems Interconnection (OSI)

Open Systems Interconnection (OSI) is a standard reference model for communication between two end users in a network. It is used in developing products and understanding networks.

Source: Whatis.com

INDEXCARD, 13/25
 
Adi Shamir

Adi Shamir was one of the three inventors of the RSA public-key cryptosystem. The other two were Ron Rivest and Leonard M. Adleman.

INDEXCARD, 14/25
 
Sergei Eisenstein

Though Sergei Eisenstein (1898-1948) made only seven films in his entire career, he was the USSR's most important film director in the 1920s and 1930s. His typical style, piling mountains of metaphors and symbols into his films, is called "intellectual montage" and was not always understood or even liked by audiences. Still, he succeeded in mixing ideological and abstract ideas with real stories. His most famous work is The Battleship Potemkin (1925).

INDEXCARD, 15/25
 
Enigma

Device used by the German military command to encode strategic messages before and during World War II. The Enigma code was broken by a British intelligence system known as Ultra.

INDEXCARD, 16/25
 
Society for Worldwide Interbank Financial Telecommunication (SWIFT)

Founded in 1973 by 239 banks from 15 countries, SWIFT is responsible for maintaining the world's most important network dedicated to financial transaction processing.
Although facing competition, from smart cards for example, SWIFT can rely on a growing clientèle. In September 1999 SWIFT served 6,710 live users in 189 countries.

http://www.swift.com

INDEXCARD, 17/25
 
John Dee

b. July 13, 1527, London, England
d. December 1608, Mortlake, Surrey

English alchemist, astrologer, and mathematician who contributed greatly to the revival of interest in mathematics in England. After lecturing and studying on the European continent between 1547 and 1550, Dee returned to England in 1551 and was granted a pension by the government. He became astrologer to the queen, Mary Tudor, and shortly thereafter was imprisoned for being a magician but was released in 1555. Dee later toured Poland and Bohemia (1583-89), giving exhibitions of magic at the courts of various princes. He became warden of Manchester College in 1595.

INDEXCARD, 18/25
 
Internet Relay Chat (IRC)

IRC is a text-based chat system used for live group discussions.

For a history of IRC see Charles A. Gimon, IRC: The Net in Realtime, http://www.skypoint.com/~gimonca/irc2.html

INDEXCARD, 19/25
 
cryptology

also called "the study of codes"; it includes both cryptography and cryptoanalysis.

INDEXCARD, 20/25
 
First Amendment Handbook

The First Amendment to the US Constitution, though short, lists a number of rights. Only a handful of words refer to the freedoms of speech and the press, but those words are of incalculable significance. To understand the current subtleties and controversies surrounding this right, check out this First Amendment site. This detailed handbook of legal information, mostly intended for journalists, should be of interest to anyone who reads or writes. For example, the chapter Invasion of Privacy shows the limits of First Amendment rights and the balance between the rights of the individual and the rights of the public - or, more crudely, the balance of Tabloid vs. Celebrity. Each section is carefully annotated with relevant legal decisions.

http://www.rcfp.org/handbook/viewpage.cgi

INDEXCARD, 21/25
 
Vladimir Putin

Vladimir Putin is the Russian President, Boris Yeltsin's successor. Until his appointment as Prime Minister in August 1999, he was nearly unknown. He had been working for the Soviet security service, the KGB. In July 1998 he took charge of the Federal Security Service, FSB, and in March 1999 he became secretary of the Security Council. He has no experience in governing at all. Where he has demonstrated power so far is the Chechnya war: soon after the beginning of this second war in the region, his popularity rose.

INDEXCARD, 22/25
 
Bill Clinton

William J. Clinton (* 1946) studied law at Yale University, then taught at the University of Arkansas. He was elected Arkansas attorney general in 1976 and later served as governor until 1992. That year he was elected U.S. President, the first Democratic president after a row of Republicans. His sexual affairs not only nearly cost him his career, but he also had to distract attention from his private affairs: he considered fighting another war against Saddam Hussein in February 1999. Shortly afterwards he had a more interesting enemy, Slobodan Milosevic - and NATO was most willing to fight with him.

For more information see: http://www.whitehouse.gov/WH/glimpse/presidents/html/bc42.html

INDEXCARD, 23/25
 
Bruce Schneier

Bruce Schneier is president of Counterpane Systems in Minneapolis, a consulting enterprise specializing in cryptography and computer security. He is the author of the book Applied Cryptography and the inventor of the Blowfish and Twofish encryption algorithms.

INDEXCARD, 24/25
 
Hieroglyphs

Hieroglyphs are pictures that were used for writing in ancient Egypt. At first these pictures were used for the names of kings; later more and more signs were added, until the number reached some 750 pictures.

INDEXCARD, 25/25