Racism on the Internet

The internet can be regarded as a mirror of the variety of interests, attitudes and needs of humankind. Propaganda and disinformation are therefore inevitably part of it, whether they struggle for something good or evil. But the old classifications no longer function.
In recent years the internet has opened up a new outlet for racism, as it can be difficult to identify the person who put a certain message into the net. The anarchy of the internet provides racists with many possibilities to reach people which, for legal and other reasons, they do not possess in other media.

In the 1980s racist groups used electronic bulletin boards ("mailboxes") to communicate on an international level; the first ones to do so were supposedly the Ku Klux Klan and boards like the Aryan Nations Liberty Net. In the meantime those bulletin boards have moved onto the internet. In 1997 about 600 extreme right websites were on the net, most of them from the USA, and the number is growing. The shocking element is not the number of racist pages, which is still very small compared to the millions of pages one can find in this medium; it is the evidence of intentional disinformation, the language and the hatred that makes them dangerous.
A whole network of anti-racist organizations, including a large number of websites, is fighting against racism. For example:

http://motlc.wiesenthal.com/text/x32/xr3257.html

http://www.aranet.org/

http://www.freespeech.org/waronracism/files/allies.htm
http://www.nsdapmuseum.com
http://www.globalissues.org/HumanRights/Racism.asp

TEXTBLOCK 1/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658620
 
Content as Transport Medium for Values and Ideologies

With the dissemination of their content, commercial media are, among other things, also able to transport values and ideologies. Usually their programming reflects society's dominant social, political, ethical, cultural and economic values. A critical view of the prevalent ideologies is often sacrificed so as not to offend the existing political elites and corporate powers, but rather to satisfy shareholders and advertisers.

With most of the world's content produced by a few commercial media conglomerates, and with the overwhelming majority of these companies (in terms of revenue generation) concentrated in Europe, the U.S., Japan and Australia, there is also a strong flow of content from the 'North-West' to the 'South-East'. Popular culture developed in the world's dominant commercial centers, and with it Western values and ideologies, is thus disseminated into the most distant corners of the earth, with far less coming back.

TEXTBLOCK 2/24 // URL: http://world-information.org/wio/infostructure/100437611795/100438659066
 
Timeline 1970-2000 AD

1971 IBM's work on the Lucifer cipher and the work of the NSA leads to the U.S. Data Encryption Standard (DES)

1976 Whitfield Diffie and Martin Hellman publish their paper New Directions in Cryptography, introducing the idea of public key cryptography

1977/78 the RSA algorithm is developed by Ron Rivest, Adi Shamir and Leonard M. Adleman and is published

1984 the U.S. Congress passes the Comprehensive Crime Control Act

- The Hacker Quarterly is founded

1986 Computer Fraud and Abuse Act is passed in the USA

- Electronic Communications Privacy Act

1987 Chicago prosecutors found Computer Fraud and Abuse Task Force

1988 U.S. Secret Service covertly videotapes a hacker convention

1989 the NuPrometheus League distributes stolen Apple Computer source code

1990 - IDEA, using a 128-bit key, is intended to replace DES

- Charles H. Bennett and Gilles Brassard publish their work on Quantum Cryptography

- Martin Luther King Day Crash strikes AT&T long-distance network nationwide


1991 Phil Zimmermann releases PGP (Pretty Good Privacy) as freeware on the Internet; it soon becomes the worldwide state of the art

- the first Computers, Freedom and Privacy conference takes place in San Francisco

- AT&T phone crash; New York City and various airports are affected

1993 the U.S. government announces the Clipper Chip, a proposal that provokes heated political discussion during the following years

1994 Ron Rivest releases another algorithm, the RC5, on the Internet

- the Blowfish encryption algorithm, a 64-bit block cipher with a key length of up to 448 bits, is designed by Bruce Schneier

1990s work proceeds on quantum computers and quantum cryptography

- work on biometrics for authentication (fingerprints, the iris, smells, etc.)

1996 France liberalizes its cryptography law: cryptography may now be used if registered

- OECD issues Cryptography Policy Guidelines, a paper calling for encryption export standards and unrestricted access to encryption products

1997 April European Commission issues Electronic Commerce Initiative, in favor of strong encryption

1997 June PGP 5.0 Freeware widely available for non-commercial use

1997 June 56-bit DES code cracked by a network of 14,000 computers

1997 August a U.S. judge rules that the encryption export regulations violate the First Amendment

1998 February foundation of Americans for Computer Privacy, a broad coalition in opposition to the U.S. cryptography policy

1998 March PGP announces plans to sell encryption products outside the USA

1998 April NSA issues a report about the risks of key recovery systems

1998 July DES code cracked in 56 hours by researchers in Silicon Valley

1998 October Finnish government agrees to unrestricted export of strong encryption

1999 January RSA Data Security establishes worldwide distribution of encryption products outside the USA

- the National Institute of Standards and Technology announces that 56-bit DES is not safe, recommending Triple DES instead

- 56-bit DES code is cracked in 22 hours and 15 minutes

1999 May 27 United Kingdom speaks out against key recovery

1999 Sept: the USA announces that it will lift the restrictions on cryptography exports

2000 as the German government plans to draft a cryptography law, various organizations start a campaign against it

- computer hackers no longer merely visit websites and change small details there; they cause breakdowns of entire systems, producing big economic losses
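The public-key idea that Diffie and Hellman introduced in 1976 underlies much of the timeline above. As an illustration only, their key exchange can be sketched with deliberately tiny toy numbers; real deployments use moduli thousands of bits long, and the parameter values below are arbitrary assumptions chosen for readability:

```python
# Toy Diffie-Hellman key exchange with deliberately tiny parameters.
# These numbers are only meant to show the mechanics, not to be secure.

p = 23   # public prime modulus (toy value)
g = 5    # public generator (toy value)

a = 6    # Alice's secret exponent, never transmitted
b = 15   # Bob's secret exponent, never transmitted

A = pow(g, a, p)  # Alice sends g^a mod p over the open channel
B = pow(g, b, p)  # Bob sends g^b mod p over the open channel

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p

assert shared_alice == shared_bob  # both arrive at the same secret
print(shared_alice)  # -> 2
```

An eavesdropper sees only p, g, A and B; recovering the shared secret from those values is the discrete logarithm problem, which is what makes the scheme practical.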

for further information about the history of cryptography see:
http://www.clark.net/pub/cme/html/timeline.html
http://www.math.nmsu.edu/~crypto/Timeline.html
http://fly.hiwaay.net/~paul/cryptology/history.html
http://www.achiever.com/freehmpg/cryptology/hocryp.html
http://all.net/books/ip/Chap2-1.html
http://cryptome.org/ukpk-alt.htm
http://www.iwm.org.uk/online/enigma/eni-intro.htm
http://www.achiever.com/freehmpg/cryptology/cryptofr.html
http://www.cdt.org/crypto/milestones.shtml

for information about hacker's history see:
http://www.farcaster.com/sterling/chronology.htm

TEXTBLOCK 3/24 // URL: http://world-information.org/wio/infostructure/100437611776/100438658960
 
Economic structure; digital euphoria

The dream of a conflict-free capitalism appeals to a diverse audience. No politician can win elections without eulogising the benefits of the information society and promising universal wealth through informatisation. "Europe must not lose track and should be able to make the step into the new knowledge and information society in the 21st century", said Tony Blair.

The US government has declared the construction of a fast information infrastructure network the centerpiece of its economic policies.

In Lisbon the EU heads of state agreed to accelerate the informatisation of the European economies.

The German Chancellor Schröder has called on industry to create 20,000 new informatics jobs.

The World Bank regards information as the principal tool for third world development.

Electronic classrooms and on-line learning schemes are seen as the ultimate advance in education by politicians and industry leaders alike.

But in the informatised economies, traditional exploitative practices are obscured by the glamour of new technologies. And the nearly universal acceptance of the ICT message has prepared the ground for a revival of 19th century "adapt-or-perish" ideology.

"There is nothing more relentlessly ideological than the apparently anti-ideological rhetoric of information technology"

(Arthur and Marilouise Kroker, media theorists)

TEXTBLOCK 4/24 // URL: http://world-information.org/wio/infostructure/100437611726/100438658999
 
History: Communist Tradition

Following the communist revolutions of the 20th century all "means of production" became the property of the state as representative of "the masses". Private property ceased to exist. While moral rights of the creator were recognized and economic rights acknowledged with a one-time cash award, all subsequent rights reverted to the state.

With the transformation of many communist countries to a market system most of them have now introduced laws establishing markets in intellectual property rights. Still the high rate of piracy reflects a certain lack of legal tradition.

TEXTBLOCK 5/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659483
 
Public Relations and Propaganda

Public relations is usually associated with influencing public opinion, and it has therefore been linked with propaganda. Using one of the many definitions of propaganda, "... the manipulation of symbols as a means of influencing attitudes on controversial matters" (Harold D. Lasswell), the terms propaganda and PR seem to be easily interchangeable.

Still, many authors explicitly distinguish between public relations, advertising and propaganda. Unlike PR, which is often described as the objective and comprehensive informing of the public, advertising and propaganda are associated with manipulative activities. Nevertheless, treating public relations and propaganda as equivalents stands in the tradition of PR itself. Edward L. Bernays, one of the founders of public relations, wrote: "The only difference between propaganda and education, really, is the point of view. The advocacy of what we believe in is education. The advocacy of what we don't believe is propaganda."

Institutions like the German Bundeswehr also use the terms public relations and propaganda synonymously. Following a 1990 decree of the former minister of defense Stoltenberg, the "psychological influencing of the enemy" was discontinued in peacetime and the Academy for Psychological Defense was renamed the Academy for Information and Communication, responsible among other things for scientific research in the field of public relations.

TEXTBLOCK 6/24 // URL: http://world-information.org/wio/infostructure/100437611652/100438658084
 
Biometric technologies

In what follows there is a brief description of the principal biometric technologies, whose respective proponents - producers, research laboratories, think tanks - mostly tend to claim superiority over the others. A frequently used definition of "biometric" is that of a "unique, measurable characteristic or trait of a human being for automatically recognizing or verifying identity" (http://www.icsa.net/services/consortia/cbdc/bg/introduction.shtml); biometrics is the study and application of such measurable characteristics. In IT environments, biometrics are categorised as "security" technologies meant to limit access to information, places and other resources to a specific group of people.

All biometric technologies are made up of the same basic processes:

1. A sample of a biometric is first collected, then transformed into digital information and stored as the "biometric template" of the person in question.

2. At every new identification, a second sample is collected and compared with the stored template.

3. If the two samples match (in practice, if they are sufficiently similar, since two live biometric samples are never bit-identical), the person's identity is confirmed, i.e. the system knows who the person is.

This means that access to the facility or resource can be granted or denied. It also means that information about the person's behaviour and movements has been collected. The system now knows who passed a certain identification point at which time, at what distance from the previous time, and it can combine these data with others, thereby appropriating an individual's data body.
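The enrollment-and-verification loop described above can be sketched as follows. This is a minimal illustration only: the feature extraction step is mocked out as a plain numeric vector, and the threshold value is an arbitrary assumption, not taken from any real biometric system.

```python
import math

# A biometric "template" is modeled here as a plain feature vector.
# Real systems derive such vectors from fingerprint minutiae, iris
# codes, face embeddings, etc.; that extraction step is mocked out.

THRESHOLD = 0.1  # arbitrary illustration value, not from a real system

def enroll(sample):
    """Step 1: store the first sample as the person's template."""
    return list(sample)

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(template, new_sample, threshold=THRESHOLD):
    """Steps 2-3: compare a fresh sample against the stored template.

    Two live samples are never bit-identical, so the system accepts
    the claimed identity when the samples are sufficiently similar.
    """
    return distance(template, new_sample) <= threshold

template = enroll([0.12, 0.55, 0.91])
print(verify(template, [0.13, 0.54, 0.90]))  # similar sample -> True
print(verify(template, [0.80, 0.10, 0.30]))  # different person -> False
```

The choice of threshold is the system designer's trade-off: a low value rejects legitimate users more often, a high value accepts impostors more often.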

TEXTBLOCK 7/24 // URL: http://world-information.org/wio/infostructure/100437611729/100438658188
 
Global Data Flows

Fiber-optic cables, coaxial cables, copper wires, electric power lines, microwaves, satellite communication, mobile telephony, computer networks: Various telecommunication networks following a variety of standards with bewildering abbreviations - DSL, WAP, GSM, UMTS, Ipv4 etc. - and carrying endless flows of capital and information are the blood veins of modern societies.

In the space of flows constituted by today's global data networks the space of places is transcended. Visualizations of these global data flows show arches bridging seas and continents, thereby linking the world's centres of research and development, economics and politics. In the global "Network Society" (Manuel Castells) the traditional centres of power and domination are not discarded; on the contrary, they are strengthened and reinforced by the use of information and communication technologies. Political, economic and symbolic power becomes increasingly linked to the use of modern information and communication technologies. The most sensitive and advanced centres of information and communication technologies are the stock markets. Excluded from the network constituted by modern information and communication technologies, large parts of Africa, Asia and South America, but also the poor of the industrialized countries, are becoming increasingly marginal to the world economy.

Cities are centres of communications, trade and power. The higher the percentage of urban population, the more likely it is that the telecommunications infrastructure is good to excellent. This goes hand in hand with lower telecommunications costs. Those parts of the world with the poorest infrastructure are also the world's poorhouse. In Bangladesh a personal computer is, relative to income, as expensive as a limousine is in Europe: where a European might pay one month's salary for a PC, most Bangladeshis would have to pay eight annual salaries. Telecommunications infrastructure is therefore concentrated in the highly industrialized world: most telephone mainlines, mobile telephones, computers, Internet accounts and Internet hosts (computers connected to the global data networks) can be found here. The same applies to media: the daily circulation of newspapers and the use of TV sets and radios. Telecommunication and media services affordable to most parts of the population are largely restricted to industrialized countries.

This situation will not change in the foreseeable future: Most expenditure for telecommunications infrastructure will be restricted to the richest countries in the world. In 1998, the world's richest countries consumed 75% of all cables and wires.

TEXTBLOCK 8/24 // URL: http://world-information.org/wio/infostructure/100437611791/100438658776
 
Palm recognition

In palm recognition a 3-dimensional image of the hand is collected and compared to the stored sample. Palm recognition devices are cumbersome artefacts (unlike fingerprint and iris recognition devices) but can perform a great number of identification acts in a short time. They are therefore preferably installed in situations where a large number of people have to be identified, as in airports.

TEXTBLOCK 9/24 // URL: http://world-information.org/wio/infostructure/100437611729/100438658375
 
Gait recognition

The fact that an individual's identity is expressed not only by the way he/she looks or sounds, but also by the manner of walking, is a relatively new discovery in biometrics.

Unlike the more fully developed biometric technologies whose scrutiny is directed at stationary parts of the body, gait recognition has the added difficulty of having to sample and identify movement. Scientists at the University of Southampton, UK (http://www.isis.ecs.soton.ac.uk/research/gait/) have developed a model which likens the movement of legs to those of a pendulum and uses hip inclination as a variable.

Another model considers the shape and length of legs as well as the velocity of joint movements. The objective is to combine both models into one, which would make gait recognition a fully applicable biometric technology.

Given that gait recognition is applied to "moving preambulatory subjects", it is a particularly interesting technology for surveillance. People can no longer hide their identity by covering themselves or moving. Female shoplifters who feign pregnancy will be detected because they walk differently than those who are really pregnant. Potential wrongdoers might resort to walking techniques such as those developed in Monty Python's legendary "Ministry of Silly Walks" (http://www.stone-dead.asn.au/sketches/sillwalk.htm).

TEXTBLOCK 10/24 // URL: http://world-information.org/wio/infostructure/100437611729/100438658388
 
The Secret Behind

The secret behind all this is the conviction that nothing bad could ever be attributed to one's own nation. All the bad words belong to the enemy, whereas the "we" is the good one, never the aggressor but always the defender, the savior - not only for one's own sake but also for the others, even if they never asked for it, like the German population during World War I and II.
The spiritualization of such thinking leads to the point where it becomes nearly impossible to believe that this could be untrue, a fake. To imagine injustice committed by one's own nation becomes more and more difficult the longer this kind of propaganda goes on. U.S.-Americans voluntarily believe in their politics, believing also that the USA acts as the police of the world, defending the morally good against those who simply have not yet reached the same level of civilization.
To keep up this image, the enemy must be portrayed as ugly and bad, in black-and-white pictures, as in fairy tales. Any connection between oneself and the enemy must be erased and made impossible. In the cases of Slobodan Milosevic and Saddam Hussein this meant deleting the positive contacts of the previous years from the consciousness of the population. Both had received large amounts of money and material help as long as they kept to the rules of the Western game. Later, when the image of the friend/confederate was destroyed, the contact had to be denied. The media, who had reported on that help, no longer seemed to remember and did not write anything about this strange change of mind. And those who did were not really listened to, because people tend to hear what they want to hear. And who would want to hear that large amounts of his taxes had formerly gone to "a man" (this personification of the war in one single man is the next disinformation) who is now the demon in one's mind.

All of this is not the invention of a few politicians. Huge think tanks and various governmental organizations stand behind it. Part of their work is to hide their own work, or to deny it.

TEXTBLOCK 11/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658637
 
Face recognition

In order to recognize a person, one commonly looks at that person's face, for it is there that the visual features distinguishing one person from another are concentrated. Eyes in particular seem to tell a story not only about who somebody is, but also about how that person feels, where his/her attention is directed, etc. People who do not want to show who they are, or what is going on inside of them, must mask themselves. Consequently, face recognition is a kind of electronic unmasking.

"Real" face-to-face communication is a two-way process. Looking at somebody's face means exposing one's own face and allowing the other to look at oneself. It is a mutual process which is suspended only in extraordinary and voyeuristic situations. Looking at somebody without being looked at places the person who is visually exposed in a vulnerable position vis-à-vis the watcher.

In face recognition this extraordinary situation is normal. Looking at the machine, you only see yourself looking at the machine. Face biometrics are extracted anonymously and painlessly by a mask without a face.

Therefore the resistance against the mass appropriation of biometrical data through surveillance cameras is confronted with particular difficulties. The surveillance structure is largely invisible, it is not evident what the function of a particular camera is, nor whether it is connected to a face recognition system.

In a protest action against the face recognition specialist Visionics, the Surveillance Camera Players therefore adopted the strategy of re-masking: in front of the cameras, they performed the play "The Masque of the Red Death", an adaptation of Edgar Allan Poe's classic short story by Art Toad.

According to Visionics, whose slogan is "enabling technology with a mass appeal", there are already 1.1 billion digitised face images stored in identification databases worldwide. When combined with wide-area surveillance camera networks, face recognition is capable of creating a transparent social space that can be controlled by a depersonalised, undetected and unaccountable centre. It is a technology of which the surveillance engineers of sunken totalitarian regimes may have dreamt, and one that today is being adopted by democratic governments.

TEXTBLOCK 12/24 // URL: http://world-information.org/wio/infostructure/100437611729/100438658118
 
Late 1970s - Present: Fourth Generation Computers

Following the invention of the first integrated circuits, ever more components could be fitted onto one chip. LSI (Large Scale Integration) was followed by VLSI (Very Large Scale Integration) and ULSI (Ultra Large Scale Integration), which increased the number of components squeezed onto one chip into the millions and helped diminish the size as well as the price of computers. The new chips took the idea of the integrated circuit one step further, as they made it possible to manufacture a single microprocessor which could then be programmed to meet any number of demands.

Also, following the introduction of the minicomputer in the mid-1970s, by the early 1980s a market for personal computers (PCs) was established. As computers had become easier to use and cheaper, they were no longer mainly utilized in offices and manufacturing, but also by the average consumer. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used.

Further developments included the creation of mobile computers (laptops and palmtops) and especially networking technology. While mainframes shared time with many terminals for many applications, networking allowed individual computers to form electronic co-operations. LANs (Local Area Networks) permitted computers to share memory space, information and software, and to communicate with each other. Although LANs alone could already reach enormous proportions, with the invention of the Internet an information and communication network on a global basis was established for the first time.

TEXTBLOCK 13/24 // URL: http://world-information.org/wio/infostructure/100437611663/100438659451
 
The Big Five of Commercial Media

After a number of mergers and acquisitions five powerful media conglomerates lead the world's content production and distribution. They operate on an international basis with subsidiaries all around the globe and engage in every imaginable kind of media industry.

Table: The World's Leading Media Companies

Media Company            1998 Revenues (in US$)   Property/Corporate Information

AOL Time Warner (US)     26,838,000,000*          http://www.timewarner.com/corp/about/timewarnerinc/corporate/index.html
Disney (US)              22,976,000,000           http://www.disney.com
Bertelsmann (GER)        16,389,000,000           http://www.bertelsmann.com/facts/report/report.cfm
News Corporation (AUS)   12,841,000,000           http://www.newscorp.com/public/cor/cor_m.htm
Viacom (US)              12,100,000,000           http://www.viacom.com/global.tin

* Revenues of Time Warner only (the merger with AOL took place in January 2000)

TEXTBLOCK 14/24 // URL: http://world-information.org/wio/infostructure/100437611795/100438659010
 
It is always the others

Disinformation is supposed to be something evil, something ethically incorrect. Therefore we prefer to attribute it to the past, or to political systems other than those of the Western hemisphere. It is always the others who work with disinformation. The same is true for propaganda.
Even better if we can refer it to the past: Adolf Hitler, supposedly one of the world's greatest and most horrible propagandists (together with his Reichsminister für Propaganda Josef Goebbels), did not invent modern propaganda either. He took his knowledge from the British example during World War I, the invention of modern propaganda. And it was Hitler's Reich where (racist) propaganda and disinformation were developed into a perfect manipulation tool, in a way whose consequences are still at work today.
A war loses the support of the people if it is being lost. Therefore it is extremely important to create a feeling of winning the war; never give up emotions of victory. Governments know this and work hard on keeping the mood up. The Germans did a very hard job on that in the last months of World War II.
But in the 1990s the disinformation and propaganda business came back to life (if it had ever gone out of sight) through Iraq's invasion of Kuwait and the reactions of democratic states. After the war, reports made visible that not much had happened the way we had been told. Regarded like this, the Gulf War was the end of the New World Order, a better and geographically broader democratic order that had only pretended to begin.

TEXTBLOCK 15/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658640
 
What is the Internet?

Each definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, universities, etc. Generally speaking, every time a user connects to a computer network, a new Internet is created. Technically speaking, the Internet is a wide area network (WAN) that may be connected to local area networks (LANs).

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different from the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not solely constituted by computers connected to other computers, because there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components that are connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than what it is. On 24 October 1995 the U.S. Federal Networking Council passed the following resolution concerning the definition of the term "Internet": "Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally, and in a simplifying manner, called the Internet may be better referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: the USENET, corporate networks and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means of making information easily accessible worldwide, of sending and receiving messages, videos, texts and audio files, of transferring funds and trading securities, sharing resources, collecting weather data, trailing the movements of elephants, playing games online, video conferencing, distance learning, mounting virtual exhibitions, jamming with other musicians, long-distance ordering, holding auctions, tracking packaged goods, doing business, chatting, and remotely accessing computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving and storing data and for sharing equipment over long distances, possibly in real time, if the telecommunication infrastructure is reliable and of high quality.

For a comprehensive view of uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace".

TEXTBLOCK 16/24 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
The 2nd Chechnya-War

In the summer of 1999 between 1,200 and 2,000 Muslim rebels from Chechnya invaded Dagestan. Rumors say that Russian soldiers closed their eyes, pretending not to see anything. In the fighting that soon started, many people were killed. The whole issue was blamed on Chechnya.
At that time there were rumors that there would be heavy bombings in Moscow in September. And there were. Those two things together brought back the hatred against the Chechen rebels. The second war between Russia and the Muslim republic began. While the first war was lost at home, because the Russians, especially mothers, did not understand why their sons should fight against Chechnya, this time the atmosphere was completely different. In the cities 85%, and all over Russia 65%, of the Russian population agreed with the war. This time the war was a national issue, a legitimate defense.
The media emphasized this.
The journalist Alexander Zilin found out that the truth was far from the one presented in the media. First of all, there was no evidence that the Moscow bombings were organized by Chechens. On the contrary, it is more than probable that the crimes were organized by a governmental institution for national security. The disinformation was part of the strategy to make the population support another war with Chechnya. The media were part of the story, maybe without knowing it. They kept to the government's and army's side, showing only selected and patriotic parts of the war. For example, the number of dead Russian soldiers was held back.

The U.S. behavior on this:
The USA would like to intervene, but it is afraid of ruining its fragile relationship with Russia. For years the main topic of U.S. politics has been the struggle against terrorism. Now Russia claims to be fighting terrorism. How could it be criticized for that?

The reason for this war is rather cynical: it worked as a public relations-campaign for Vladimir Putin, candidate for the president's elections in. When Putin came into power as minister-president of Russia in August 1999, opinion polls gave him 2% for the elections in summer 2000. By the end of November he got already 46%! And finally he won. The public relations war worked well.
At the same time a propaganda campaign against his rival Y. Primakov, previously the most popular candidate, spread lies and malicious rumors. Opinion polls showed very quickly, even before the elections took place, that he had lost them because of this black propaganda.

TEXTBLOCK 17/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658639
 
More and more, faster and faster, but...

Since the invention of appropriate means and technologies, communication no longer requires face-to-face meetings.

From writing and reading to using computers, making full use of one's possibilities to communicate relies more and more on skills we have to learn. With the increasing importance of communication technologies, learning to apply them properly becomes a kind of rite of passage.

A Small World

From the very beginning - the first Sumerian pictographs on clay tablets - to today's state-of-the-art technologies - broadband communication via fiber-optic cables and satellites - the amount of information collected, processed and stored, the capabilities to do so, as well as the possible speed of information transmission have grown exponentially.

Since the invention of the electrical telegraph, and especially with today's growing digital communication networks, every location on earth seems close at hand, however distant it may be, and time no longer remains a significant dimension.

Threatened Cultural Memory

More and more information is produced and transmitted faster and faster, but its shelf life grows ever shorter. Sumerian pictographs written on clay tablets have remained intact for more than 4,500 years, but newspapers and books printed just decades ago crumble into pieces; film reels, video tapes and cassettes corrode. Digitization of information is not a cure; on the contrary, it even intensifies the danger of destroying cultural heritage. Data increasingly require specific software and hardware, and regularly converting all available digitized information is an impracticable task.

Compared to the longevity of pictographs on clay tablets, digitized information is produced for instant one-time use. The increasing production and processing of information causes a problem hitherto unknown: the loss of our cultural memory.

For further information see T. Matthew Ciolek, Global Networking Timeline.

For another history of communication systems see Friedrich Kittler, The History of Communication Media.

TEXTBLOCK 18/24 // URL: http://world-information.org/wio/infostructure/100437611796/100438659807
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille: a card with holes, laid over a sheet of paper, into which the message is written. Afterwards he removes the card and fills in the blanks, so the message looks like an ordinary letter. The recipient needs to own an identical card
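The principle of the grille can be sketched in a few lines (a toy model with a fixed square grid; the function names are hypothetical, chosen only for this illustration):

```python
def grille_encode(message, cover, holes, size=6):
    # Lay the message into the hole positions of a size x size grid,
    # then fill the remaining cells with the cover text.
    grid = [list(cover[r * size:(r + 1) * size]) for r in range(size)]
    for ch, (r, c) in zip(message, holes):
        grid[r][c] = ch
    return "".join("".join(row) for row in grid)

def grille_decode(text, holes, size=6):
    # The recipient reads only the characters visible through the holes.
    return "".join(text[r * size + c] for r, c in holes)
```

In practice the "cover" would be innocuous-looking prose rather than filler characters, so that the result resembles an ordinary letter.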

- Bishop John Wilkins invents a cryptologic system that looks like musical notes. In a book he describes several steganographic systems such as secret inks, but also the string cipher. He mentions so-called Pig Latin, a spoken form of encryption that had already been used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted text that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one suspects the text of being enciphered. For this the steganogram was the best method, very often used in poems. Attempts to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary system, the same principle that, in far more advanced form, underlies modern character codes such as ASCII
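The binary representation behind such character codes is easy to demonstrate:

```python
def to_binary(text):
    # Write each character as the 8-bit binary form of its ASCII value.
    return " ".join(format(ord(c), "08b") for c in text)
```

For example, the letter "A" has the ASCII value 65 and is stored as 01000001.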

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective, the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read international diplomatic mail, copy the letters and return them to the post office the same morning. Supposedly about 100 letters are handled each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Samuel Morse develops the Morse code, which strictly speaking is not a code but an enciphered alphabet of short and long signals. The first Morse-code message is sent by telegraph in 1844.
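A minimal encoder makes the "enciphered alphabet" character of Morse plain (the table below is abridged to a few letters for illustration):

```python
# Abridged Morse table, enough for a short demonstration.
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "M": "--", "R": ".-.", "C": "-.-.", "D": "-..",
}

def to_morse(word):
    # Encode a word letter by letter, separating letters with spaces.
    return " ".join(MORSE[c] for c in word.upper())
```

Each letter maps one-to-one onto a fixed signal pattern, which is why Morse is an enciphered alphabet rather than a code in the cryptographic sense.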

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the telegraph changes cryptography considerably, as codes become absolutely necessary

1854 the Playfair cipher is invented by Sir Charles Wheatstone
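The Playfair cipher is simple enough to sketch in full; the following toy implementation follows the classic rules (J merged into I, doubled letters split with X, an odd-length tail padded with X):

```python
def playfair_encrypt(key, plaintext):
    # Build the 5x5 Polybius square: key letters first, then the rest
    # of the alphabet (J is traditionally merged into I).
    alphabet = "ABCDEFGHIKLMNOPQRSTUVWXYZ"
    square = []
    for ch in (key + alphabet).upper().replace("J", "I"):
        if ch in alphabet and ch not in square:
            square.append(ch)
    pos = {ch: divmod(i, 5) for i, ch in enumerate(square)}

    # Split the message into digraphs, separating double letters with X
    # and padding an odd-length tail with X.
    letters = [c for c in plaintext.upper().replace("J", "I") if c in alphabet]
    pairs, i = [], 0
    while i < len(letters):
        a = letters[i]
        if i + 1 < len(letters) and letters[i + 1] != a:
            b, step = letters[i + 1], 2
        else:
            b, step = "X", 1
        pairs.append((a, b))
        i += step

    out = []
    for a, b in pairs:
        (ra, ca), (rb, cb) = pos[a], pos[b]
        if ra == rb:            # same row: take the letter to the right
            out += square[ra * 5 + (ca + 1) % 5], square[rb * 5 + (cb + 1) % 5]
        elif ca == cb:          # same column: take the letter below
            out += square[((ra + 1) % 5) * 5 + ca], square[((rb + 1) % 5) * 5 + cb]
        else:                   # rectangle: swap the column coordinates
            out += square[ra * 5 + cb], square[rb * 5 + ca]
    return "".join(out)
```

With the key "playfair example", the classic test sentence "Hide the gold in the tree stump" encrypts to BMODZBXDNABEKUDMUIXMMOUVIF.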

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski publishes a cryptanalysis of the Vigenère cipher, which had been considered unbreakable for centuries

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of radio changes the tasks of cryptography again and makes them even more important

TEXTBLOCK 19/24 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Changes

Still, disinformation and propaganda are nothing magical. They can change things, but presumably only if those things/meanings/opinions are not completely fixed. Galileo Galilei was not able to change people's view of the heavens until some began to suspect that the long-believed truth might be a mistake. The publicity surrounding his experiments, which made him famous, was not enough. On the other hand, all the later propaganda of his enemies could not turn back the wheel of enlightenment, once people thought it more logical to believe that the earth moves around the sun than the other way round.
It is never just a single idea that changes. Society follows the changes.

Thinking about disinformation inevitably brings us to the word truth, and to the suspicion that there is no definitive truth. Truth can easily be manipulated into another truth: just present some facts that appear logical, and there you have a new truth. And if those facts can supposedly be proved by empirical studies, the standing of that truth rises further.
That is what ideologies do all the time. And the media like to do the same thing - as a game with power, or a mere display of power?

But of course some pieces of disinformation are more amusing than evil or dangerous:
- the theory of the celestro-centric world ("Hohlwelttheorie")
- the story of the German philosopher who invented an Italian philosopher, wrote books about him and even reprinted "his" texts, which had supposedly been lost 100 years ago - and who finally lost his job and his career when other scholars found out that everything had been made up.

TEXTBLOCK 20/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658633
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage and numbers of users, hosts, and domains that seem to exceed all expectations, with growth expected to accelerate exponentially. However, Internet measurement data are anything but reliable, and often quite fantastic constructs, which are nevertheless seized upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is no average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So this statement is a slap in the face of all market researchers claiming otherwise.
In a certain sense this is ironic, because network measurement has been an important task since the inception of the ARPANET, the Internet's predecessor. The very first ARPANET site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock went on to work on the development of measurement techniques used to monitor the performance of the ARPANET (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. Measurement mattered for two reasons: first, it would be critical for future development, evolution and deployment planning; second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet to a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". The authors take a well-informed and refreshing look at the efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that, to say the least, you might better call them estimates. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get a clue to their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too, the corporate networks and intranets, because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity of each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet is currently far smaller, in both capacity and traffic, than the switched voice network (which had an effective bandwidth of 75 Gbps in December 1997), and although the private line networks are considerably larger in aggregate capacity than the Internet (about as large as the voice network in the U.S., with an effective bandwidth of about 330 Gbps in December 1997), they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, both of which consume more bandwidth and so account for unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply, which is what the PING utility does. (For further explanations see: Art. PING, in: Connected: An Internet Encyclopaedia.) But doing this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and projects the result onto all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
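The sampling idea behind the survey can be illustrated with a short sketch. This is a toy model, not the ISC's actual code; `ping` here stands in for any reachability test:

```python
import random

def estimate_hosts(assigned_addresses, ping, sample_share=0.01, seed=0):
    # Ping only a small random sample of addresses and project the
    # observed response rate onto the full address list.
    rng = random.Random(seed)
    k = max(1, int(len(assigned_addresses) * sample_share))
    sample = rng.sample(assigned_addresses, k)
    alive = sum(1 for addr in sample if ping(addr))
    return round(alive / k * len(assigned_addresses))
```

The estimate inherits the usual sampling error, which is one reason such host counts should be read as rough projections rather than exact figures.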
Apart from the small sample, this method has at least one more flaw: the ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.

Internet Weather

Like the daily weather, traffic on the Internet and the conditions for data flows are monitored too, hence the term Internet weather. One of the best-known Internet weather reports comes from The Matrix, Inc. Another is the Internet Traffic Report, which displays traffic as values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used: servers are "pinged" (as for host counts) and the response times are compared with past ones and with those of servers in the same region.
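How such a 0-100 value might be derived from response times can be sketched as follows. This is a guess at the general approach, not the Internet Traffic Report's actual formula:

```python
def weather_index(current_ms, past_ms):
    # Map a server's current response time against its own history onto
    # a 0-100 scale (100 = as fast as or faster than the historical best,
    # 0 = as slow as or slower than the historical worst).
    best, worst = min(past_ms), max(past_ms)
    if current_ms <= best:
        return 100
    if current_ms >= worst:
        return 0
    return round(100 * (worst - current_ms) / (worst - best))
```

Comparing each server against its own history, rather than against an absolute threshold, compensates for the fact that servers in different regions have very different baseline latencies.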

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and thus IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
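The difference between hits and page views can be made concrete with a small sketch (using file extensions as a crude stand-in for real content-type detection):

```python
def hits_and_page_views(requested_files):
    # Every requested file counts as a "hit"; only HTML documents
    # count as page views.
    hits = len(requested_files)
    views = sum(1 for f in requested_files if f.endswith((".html", ".htm")))
    return hits, views
```

A single document made of one HTML page and twenty graphics thus yields twenty-one hits but only one page view per visit.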

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: for example, the importance of a column to its readers. They may read a journal just for a particular column and not care about the journal's other contents. Deleting this column for not receiving enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is another reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be certain that information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.


If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? No global study of the Internet's geographical patterns is available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 21/24 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
The Privatization of Censorship

According to a still widely held conviction, the global data networks constitute the long-desired arena for uncensorable expression. This much is true: because of the Net it has become increasingly difficult to sustain cultural and legal standards. Geographical proximity and territorial boundaries prove to be less relevant when it does not affect a document's availability whether it is stored on your desktop or on a host some thousand kilometers away. There is no international agreement on non-prohibited content, so human rights organizations and Nazi groups alike can bypass restrictions. No single authority or organization can impose its rules and standards on all others. This is why the Net is public space, a political arena where free expression is possible.

This freedom is conditioned by the design of the Net. But the Net's design is not a given, as Lawrence Lessig reminds us. Originally the design of the Net allowed a relatively high degree of privacy and communication was not controlled directly. But now this design is changing and this invisible agora in electronic space is endangered. Governments - even elected ones - and corporations introduce new technologies that allow us to be identified, monitored and tracked, that identify and block content, and that can allow our behaviour to be efficiently controlled.

When the World Wide Web was introduced, small independent media and human rights organizations soon began to use this platform to draw worldwide attention to their publications and causes. It seemed to be the dawning of a new era, with authoritarian regimes and multinational media corporations on the losing side. But now the Net's design is changing according to their needs.

"In every context that it can, the entertainment industry is trying to force the Internet into its own business model: the perfect control of content. From music (fighting MP3) and film (fighting the portability of DVD) to television, the industry is resisting the Net's original design. It was about the free flow of content; Hollywood wants perfect control instead" (Lawrence Lessig, Cyberspace Prosecutor, in: The Industry Standard, February 2000).

In the United States, Hollywood and AT&T, which after its merger with MediaOne became the biggest US cable service provider, are returning to their positions of the Seventies: the control of content and infrastructure. If most people access the Net via set-top boxes connected to a TV set, it will become a kind of television, at least in the USA.

For small independent media it will become very hard to be heard, especially for those offering streaming video and music. Ever faster data transmission applies only to download capacities; upload capacities are much lower, on average about eight times lower, than download capacities. As an AT&T executive said in response to criticism: "We haven't built a 56 billion dollar cable network to have the blood sucked from our veins" (Lawrence Lessig, The Law in the Code: How the Net is Regulated, Lecture at the Institute for Human Sciences, Vienna, May 29th, 2000).

Consumers, not producers, are preferred.

For corporations, what remains to be done to control the Net is mainly to cope with the fact that because of the Net it has become increasingly difficult to sustain cultural and legal standards. On November 11, 1995, the German prosecuting attorney's office searched CompuServe Germany, the branch of an international Internet service provider, because the company was suspected of having offered access to child pornography. Consequently CompuServe blocked access to more than 200 newsgroups, all containing "sex" or "gay" in their names, for all its customers. But a few days later, instructions for accessing these blocked newsgroups via CompuServe came into circulation. On February 26, 1997, Felix Somm, the Chief Executive Officer of CompuServe Germany, was accused of complicity in the distribution of child and animal pornography in newsgroups. In May 1998 he received a two-year prison sentence, suspended against bail of about 51,000 euros. The sentence was justified by pointing to the fact that CompuServe Germany offered access to its US parent company's servers hosting child pornography. Felix Somm was held responsible for access to forbidden content he could not have known of.

Also in 1995, as an attack on US Vice-President Al Gore's intention to supply all public schools with Internet access, Republican Senator Charles Grassley warned of the dangers lurking for children on the Net. Referring to a Time magazine cover story by Philip Elmer-Dewitt from July 3 on pornography on the Net, he claimed that 83.5% of all images online are pornographic. But Elmer-Dewitt was wrong. Obviously unaware of the difference between Bulletin Board Systems and the Net, he referred misleadingly to Marty Rimm's article Marketing Pornography on the Information Superhighway, published in the prestigious Georgetown Law Journal (vol. 83, June 1995, pp. 1849-1935). Rimm knew of this difference, of course, and stated it clearly. (For further information see Hoffman & Novak, The Cyberporn debate, http://ecommerce.vanderbilt.edu/cyberporn.debate.html and Franz Wegener, Cyberpornographie: Chronologie einer Hexenjagd; http://www.intro-online.de/c6.html)

Almost inevitably, anxieties accompany the introduction of new technologies. In the 19th century it was said that traveling by train was bad for one's health. The debate produced by Time magazine's cover story and Senator Grassley's attack created the impression that the Net had multiplied the possible dangers for children. The global communication networks seem to be an inexhaustible source of mushrooming child pornography. Later, would-be bomb recipes found on the Net added to the already prevailing anxieties. As even in industrialized countries most people still have little or no first-hand experience with the Net, anxieties about child pornography or terrorist attacks can easily be stirred up and exploited.

A similar and related debate is going on about the glorification of violence and erotic depictions in media. Pointing to a "toxic popular culture" shaped by media that "distort children's view of reality and even undermine their character growth", US right-wing social welfare organizations and think tanks call for strong media censorship. (See An Appeal to Hollywood, http://www.media-appeal.org/appeal.htm) Media, especially films and videos, are already censored and rated, so what is wanted is still more censorship.

The intentions behind stimulating a debate on child pornography on the Net were manifold: inter alia, it served the Republican Party to attack Democrat Al Gore's initiative to supply all public schools with Internet access; additionally, the big media corporations realized that because of the Net they might have to face new competitors and rushed to press for content regulation. Taking all these intentions together, we can say that this still ongoing debate constitutes the first and best-known attempt to impose content regulation on the Net. Consequently, at least in Western countries, governments and media corporations refer to child pornography to justify legal requirements and the implementation of technologies for the surveillance and monitoring of individuals, the filtering, rating and blocking of content, and the prohibition of anonymous publishing on the Net.

In the name of "cleaning" the Net of child pornography, our basic rights are restricted. It almost seems as if it were the insistence on unrestricted basic rights that needed justification.

Underlying the campaign to control the Net are several assumptions. Inter alia: The Net lacks control and needs to be made safe and secure; we may be exposed inadvertently to pornographic content; this content is harmful to children. Remarkably, racism seems to be not an issue.

The Net, especially the World Wide Web, is not like television (although it is to be feared that this is what it might become within the next few years). Say little Mary types "Barbie" into a search engine. It is true that you sometimes have the opportunity to see that pornography is just a few mouse clicks away, but you are unlikely to be exposed to pornographic content unless you make deliberate mouse clicks.

In reaction to these anxieties, but in the absence of data on how children use the Internet, the US government passed the Communications Decency Act (CDA) in 1996. In consequence the Electronic Frontier Foundation (EFF) launched the famous Blue Ribbon Campaign and, among others, America Online and Microsoft Corporation supported a lawsuit by the American Civil Liberties Union (ACLU) against the Act. On June 26, 1997, the US Supreme Court ruled the CDA unconstitutional under the provisions of the First Amendment: the Communications Decency Act violated the basic right to free expression. After a summit with the US government, industry leaders announced that they would use existing rating and blocking systems and develop new ones for "inappropriate" online resources.

So, after the failure of the CDA, the US government shifted its responsibility to the industry by inviting corporations to take on governmental tasks. Bearing in mind the CompuServe case and its possible consequences, the industry welcomed this decision and was quick to call this newly assumed responsibility "self-regulation". Strictly speaking, "self-regulation" as meant by the industry does not amount to corporations regulating their own behaviour. On the contrary, "self-regulation" is to be understood as the regulation of users' behaviour through the rating, filtering and blocking of Internet content considered inappropriate. The Internet industry tries to show that technical solutions are preferable to legislation and wants to make sure it is not held responsible and liable for illegal, offensive or harmful content. A new CompuServe case and a new Communications Decency Act are to be averted.

In the memorandum Self-regulation of Internet Content, released in late 1999 by the Bertelsmann Foundation, it is recommended that the Internet industry join forces with governmental institutions to enforce codes of conduct and encourage the implementation of filtering and rating systems. For further details on the memorandum see the study by the Center for Democracy and Technology, An Analysis of the Bertelsmann Foundation Memorandum on Self-Regulation of Internet Content: Concerns from a User Empowerment Perspective.

In fact, the "self-regulation" of the Internet industry is privatized censorship performed by corporations and right-wing NGOs. Censorship has become a business. "Crucially, the lifting of restrictions on market competition hasn't advanced the cause of freedom of expression at all. On the contrary, the privatisation of cyberspace seems to be taking place alongside the introduction of heavy censorship." (Richard Barbrook and Andy Cameron, The Californian Ideology)

While trying to convince us that its technical solutions are appropriate alternatives to government regulation, the Internet industry cannot dispense with governmental backing to enforce the proposed measures. This adds to and reinforces the censorship measures already undertaken by governments. We are encouraged to use today's information and communication technologies while the flow of information is restricted.

According to a report by Reporters Sans Frontières, quoted by Leonard R. Sussman in his essay Censor Dot Gov. The Internet and Press Freedom 2000, the following countries totally or largely control Internet access: Azerbaijan, Belarus, Burma, China, Cuba, Iran, Iraq, Kazakhstan, Kirghizstan, Libya, North Korea, Saudi Arabia, Sierra Leone, Sudan, Syria, Tajikistan, Tunisia, Turkmenistan, Uzbekistan, and Vietnam.

TEXTBLOCK 22/24 // URL: http://world-information.org/wio/infostructure/100437611742/100438658968
 
Further Tools: Photography

Art has always contributed a great deal to disinformation, and many modern tools for disinformation are used in art and photography.
Harold D. Lasswell once stated that propaganda was cheaper than violence. Today this is no longer true: technology has created new tools for propaganda and disinformation - and they are expensive. But by now our possibilities to manipulate pictures and stories have gone so far that it can be difficult to tell the difference between an original and a manipulation.

Trillions of photographs have been taken in the 20th century - too many to look at, too many to control together with their uses. A paradise for manipulation.
We have to keep in mind: there is the world, and there are pictures of the world, and the two are not the same thing. Photographs are not objective, because the photographer selects the part of the world that becomes the picture. The rest is left out.

Some tools for manipulation of photography are:



Some of these are digital techniques of manipulation, which make it possible to change pictures in many ways without the manipulation being visible.

Pictures taken from the Internet could be anything and come from anywhere; proving their source is nearly impossible. Therefore scientists have developed digital watermarks for pictures, which make it possible to detect when a picture taken from the net has been "stolen" or manipulated.
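A common family of watermarking techniques hides the mark in the least significant bits of the pixel values, invisible to the eye but readable by software. The following Python sketch is an illustrative toy (the pixel list, the bit layout and the function names are assumptions, not a description of any particular watermarking product):

```python
def embed_watermark(pixels, bits):
    # Overwrite the least significant bit of each pixel value (0-255)
    # with one bit of the watermark. Changing only the lowest bit
    # alters each pixel by at most 1, which the eye cannot see.
    marked = pixels[:]
    for i, bit in enumerate(bits):
        marked[i] = (marked[i] & ~1) | bit
    return marked

def extract_watermark(pixels, length):
    # Read the hidden bits back out of the pixel values.
    return [p & 1 for p in pixels[:length]]

image = [200, 13, 55, 120, 7, 90, 31, 64]   # toy grayscale "image"
mark  = [1, 0, 1, 1, 0, 0, 1, 0]

marked_image = embed_watermark(image, mark)
assert extract_watermark(marked_image, len(mark)) == mark
```

A fragile watermark of this kind reveals manipulation, because editing the picture destroys the hidden bits; robust watermarks that survive editing, and can thus prove ownership, are considerably harder to design.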

TEXTBLOCK 23/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658730
 
Iris recognition

Iris recognition relies upon the fact that every individual's iris has a unique structure. The iris landscape is composed of a corona, crypts, filaments, freckles, pits, radial furrows and striations. Iris scanning is considered a particularly accurate identification technology because the characteristics of the iris do not change during a person's lifetime, and because there are several hundred variables in an iris that can be measured. In addition, iris scanning is fast: it does not take longer than one or two seconds.

These characteristics have made iris scanning an attractive technology for high-security applications such as prison surveillance. Iris technology is also used for online identification, where it can substitute for identification by password. As with other biometric technologies, the use of iris scanning for the protection of privacy is a two-edged sword: while it can prevent identity theft, the data retrieval that accompanies identification, and the data body which is created in the process, raise privacy problems of their own that have nothing to do with identity theft.
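In practice, iris-scanning systems commonly reduce the measured iris variables to a binary "iris code" and compare two codes by their Hamming distance - the fraction of bits in which they differ. The Python sketch below illustrates only this comparison step; the 16-bit codes and the acceptance threshold are illustrative assumptions (real systems use codes of around two thousand bits):

```python
def hamming_distance(code_a, code_b):
    # Fraction of bits that differ between two equal-length iris codes.
    assert len(code_a) == len(code_b)
    diff = sum(1 for a, b in zip(code_a, code_b) if a != b)
    return diff / len(code_a)

# Toy 16-bit codes: the stored template and a fresh scan of the same eye.
enrolled = [1,0,1,1,0,0,1,0, 1,1,0,0,1,0,1,1]
probe    = [1,0,1,1,0,1,1,0, 1,1,0,0,1,0,1,1]

# Accept if the codes differ in less than a third of their bits;
# the threshold here is an assumption for illustration only.
match = hamming_distance(enrolled, probe) < 0.33
```

Two scans of the same iris never match bit for bit (lighting, eyelids and camera angle vary), which is why a distance threshold rather than exact equality decides the match.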

TEXTBLOCK 24/24 // URL: http://world-information.org/wio/infostructure/100437611729/100438658334
 
Internet Software Consortium

The Internet Software Consortium (ISC) is a nonprofit corporation dedicated to producing high-quality, production-ready reference implementations of Internet standards. Its goal is to ensure that those reference implementations are properly supported and made freely available to the Internet community.

http://www.isc.org

INDEXCARD, 1/34
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
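The mechanism can be sketched in a few lines of Python; the dictionary plays the role of the closer location, and the hypothetical fetch_from_origin function stands in for the slow, distant source:

```python
import time

def fetch_from_origin(key):
    # Hypothetical slow source standing in for a remote server.
    time.sleep(0.01)  # simulate network latency
    return key.upper()

cache = {}  # the "closer location": local memory

def fetch(key):
    # Serve from the cache when possible; otherwise retrieve the data
    # from the origin and keep a copy for next time.
    if key not in cache:
        cache[key] = fetch_from_origin(key)
    return cache[key]

fetch("document")   # slow: goes to the origin
fetch("document")   # fast: served from the local copy
```

Real caches additionally limit their size and discard stale entries, but the trade-off is the same: spend local storage to save retrieval time.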

INDEXCARD, 2/34
 
Disney

American corporation that became the best-known purveyor of child and adult entertainment in the 20th century. Its headquarters are in Burbank, Calif. The company was founded in 1929 and produced animated motion-picture cartoons.
In 1955 the company opened the Disneyland amusement park, one of the world's most famous. Under new management in the 1980s, Disney's motion-picture and animated-film production units became among the most successful in the United States. In 1996 the Disney corporation acquired Capital Cities/ABC Inc., which owned the ABC television network. The Disney Company also operates the Disney Channel, a pay television programming service.

INDEXCARD, 3/34
 
Chappe's fixed optical network

Claude Chappe built a fixed optical network between Paris and Lille. Covering a distance of about 240 km, it consisted of fifteen towers with semaphores.

Because this communication system was destined for practical military use, the transmitted messages were encoded. The messages were kept so secret that even those who transmitted them from tower to tower did not grasp their meaning; they just relayed codes they did not understand. Depending on weather conditions, messages could travel at a speed of up to 2,880 km/h.

Forerunners of Chappe's optical network are the Roman smoke signals network and Aeneas Tacitus' optical communication system.

For more information on early communication networks see Gerard J. Holzmann and Bjoern Pehrson, The Early History of Data Networks.

INDEXCARD, 4/34
 
Wireless Application Protocol (WAP)

The WAP (Wireless Application Protocol) is a specification for a set of communication protocols to standardize the way that wireless devices, such as cellular telephones and radio transceivers, can be used for Internet access, including e-mail, the World Wide Web, newsgroups, and Internet Relay Chat (IRC).

While Internet access has been possible in the past, different manufacturers have used different technologies. In the future, devices and service systems that use WAP will be able to interoperate.

Source: Whatis.com

INDEXCARD, 5/34
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network mainly serving to test the feasibility of wide area networks and the possibility of remote computing, it was created for resource sharing between research institutions, not for messaging services like e-mail. Although the research was sponsored by the US military, ARPAnet was not designed for direct military use but to support military-related research.

In 1969 ARPAnet went online, linking the first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute.

But ARPAnet did not become widely accepted until it was demonstrated in action to an audience of computer experts at the First International Conference on Computers and Communication in Washington, D.C., in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers offered the general public access to NSFnet. Beginning in 1995, after having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, together with the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 6/34
 
IIPA

The International Intellectual Property Alliance, formed in 1984, is a private sector coalition representing the U.S. copyright-based industries. It comprises seven trade associations: Association of American Publishers, AFMA, Business Software Alliance, Interactive Digital Software Association, Motion Picture Association of America, National Music Publishers' Association and Recording Industry Association of America. IIPA and its members track copyright legislative and enforcement developments in over 80 countries and aim at a legal and enforcement regime for copyright that deters piracy. On the national level IIPA cooperates with the U.S. Trade Representative; on the multilateral level it has been involved in the development of the TRIPS (Trade-Related Aspects of Intellectual Property Rights) agreement of the WTO (World Trade Organization) and also participates in the copyright discussions of the WIPO (World Intellectual Property Organization).

INDEXCARD, 7/34
 
Alan Turing

b. June 23, 1912, London, England
d. June 7, 1954, Wilmslow, Cheshire

English mathematician and logician who pioneered in the field of computer theory and who contributed important logical analyses of computer processes. Many mathematicians in the first decades of the 20th century had attempted to eliminate all possible error from mathematics by establishing a formal, or purely algorithmic, procedure for establishing truth. The mathematician Kurt Gödel threw up an obstacle to this effort with his incompleteness theorem. Turing was motivated by Gödel's work to seek an algorithmic method of determining whether any given propositions were undecidable, with the ultimate goal of eliminating them from mathematics. Instead, he proved in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem [Decision Problem]" (1936) that there cannot exist any such universal method of determination and, hence, that mathematics will always contain undecidable propositions. During World War II he served with the Government Code and Cypher School, at Bletchley, Buckinghamshire, where he played a significant role in breaking the codes of the German "Enigma Machine". He also championed the theory that computers eventually could be constructed that would be capable of human thought, and he proposed the Turing test, to assess this capability. Turing's papers on the subject are widely acknowledged as the foundation of research in artificial intelligence. In 1952 Alan M. Turing committed suicide, probably because of the depressing medical treatment that he had been forced to undergo (in lieu of prison) to "cure" him of homosexuality.

INDEXCARD, 8/34
 
Theodore Roosevelt

With the assassination of President McKinley, Theodore Roosevelt (1858-1919), not quite 43, became the youngest President in the Nation's history. Roosevelt's youth differed sharply from that of the log cabin Presidents. He was born in New York City in 1858 into a wealthy family. Roosevelt steered the United States more actively into world politics. He liked to quote a favorite proverb, "Speak softly and carry a big stick. . . . "

He won the Nobel Peace Prize for mediating the Russo-Japanese War.

For more information see the official website:

http://www.whitehouse.gov/WH/glimpse/presidents/html/tr26.html

INDEXCARD, 9/34
 
Napoleon

Napoleon I (1769-1821) was Emperor of the French from 1804 to 1815.
He is regarded as the master of propaganda and disinformation of his time. Not only did he play his game with his own people but also with all European nations. And it worked as long as he managed to keep up his propaganda and the image of the winner.
Part of his almost commercial-style advertising was that his initial "N" was painted everywhere.
Napoleon understood that people believe what they want to believe - and he gave them images and stories to believe. He was extraordinarily good at black propaganda.
Censorship was an element of his politics, accompanied by a tremendous amount of positive images about himself.
But his enemies - like the British - used him as a negative image, the reincarnation of evil (a strategy still very popular in the Gulf War and the Kosovo War) (see Taylor, Munitions of the Mind, pp. 156-157).

INDEXCARD, 10/34
 
Internet Engineering Steering Group

On behalf of the Internet Society, the Internet Engineering Steering Group is responsible for the technical management of the evolution of the architecture, the standards and the protocols of the Net.

http://www.ietf.org/iesg.html

INDEXCARD, 11/34
 
Josef Goebbels

Josef Goebbels (1897-1945) was Hitler's Minister for Propaganda and Public Enlightenment. He had unlimited influence on the press, radio, film and all kinds of literary work throughout the Reich. In 1944 he was given full authority over the mobilization for "total war". At the same time he was one of Hitler's most faithful followers - and he followed him into death in 1945.

INDEXCARD, 12/34
 
Proprietary Network

Proprietary networks are computer networks whose standards differ from those proposed by the International Organization for Standardization (ISO) in the Open Systems Interconnection (OSI) model. Designed to conform to standards implemented by the manufacturer, their compatibility with other network standards is not assured.

INDEXCARD, 13/34
 
Enigma Machine

The Enigma encryption machine was as famous for its insecurities as for the security it gave to German ciphers. It was broken, first by the Poles in the 1930s, then by the British in World War II.

INDEXCARD, 14/34
 
The Internet Engineering Task Force

The Internet Engineering Task Force contributes to the evolution of the architecture, the protocols and technologies of the Net by developing new Internet standard specifications. The directors of its functional areas form the Internet Engineering Steering Group.

Internet Society: http://www.ietf.org

INDEXCARD, 15/34
 
Amazon.Com

Amazon.Com was one of the first online bookstores. With thousands of books, CDs and videos ordered via the Internet every year, Amazon.Com probably is the most successful Internet bookstore.

INDEXCARD, 16/34
 
Operating system

An operating system is software that controls the many different operations of a computer and directs and coordinates its processing of programs. It is a remarkably complex set of instructions that schedules the series of jobs (user applications) to be performed by the computer and allocates them to the computer's various hardware systems, such as the central processing unit, main memory, and peripheral systems. The operating system directs the central processor in the loading, storage, and execution of programs and in such particular tasks as accessing files, operating software applications, controlling monitors and memory storage devices, and interpreting keyboard commands. When a computer is executing several jobs simultaneously, the operating system acts to allocate the computer's time and resources in the most efficient manner, prioritizing some jobs over others in a process called time-sharing. An operating system also governs a computer's interactions with other computers in a network.

INDEXCARD, 17/34
 
Viacom

One of the largest and foremost communications and media conglomerates in the
world. Founded in 1971, the present form of the corporation dates from 1994 when Viacom Inc., which owned radio and television stations and cable television programming services and systems, acquired the entertainment and publishing giant Paramount Communications Inc. and then merged with the video and music retailer Blockbuster Entertainment Corp. Headquarters are in New York City.

INDEXCARD, 18/34
 
The Flesh Machine

This is the title of a book by the Critical Art Ensemble which puts the development of artificial life into a critical historical and political context, defining the power vectors that act as the driving force behind this development. The book is available in a print version (New York: Autonomedia, 1998) and in an online version at http://www.critical-art.net/fles/book/index.html

INDEXCARD, 19/34
 
Internet Research Task Force

Being itself under the umbrella of the Internet Society, the Internet Research Task Force is an umbrella organization of small research groups working on topics related to Internet protocols, applications, architecture and technology. It is governed by the Internet Research Steering Group.

http://www.irtf.org

INDEXCARD, 20/34
 
Extranet

An Extranet is an Intranet with limited and controlled access for authenticated outside users - e.g. a business-to-business Intranet.

INDEXCARD, 21/34
 
Adi Shamir

Adi Shamir was one of the three inventors of the RSA public-key cryptosystem; the other two were Ron Rivest and Leonard M. Adleman.
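The cryptosystem the three invented can be illustrated with deliberately tiny numbers - a toy sketch only, since real RSA keys use primes hundreds of digits long:

```python
# Toy RSA with tiny primes; illustrative only, far too small to be secure.
p, q = 61, 53
n = p * q                  # public modulus, part of both keys
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
decrypted = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert decrypted == message
```

The security of the scheme rests on the difficulty of recovering p and q (and thus d) from the public modulus n.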

INDEXCARD, 22/34
 
Seneca

Lucius Annaeus Seneca (c. 4 BC - 65 AD), originally from Spain, was a Roman philosopher, statesman, orator and playwright with great influence on the Roman cultural life of his day. He was involved in politics, and his former pupil Nero forced him to commit suicide. The French Renaissance brought his dramas back to the stage.

INDEXCARD, 23/34
 
Codices, 1st century B.C.

The transformation of writings from scrolls to codices - in essence, the hardcover book as we know it today - is an essential event in European history. Quoting accurately by page number, browsing through pages and skipping chapters, all impossible while reading scrolls, became possible.

In the computer age we are witnessing a kind of revival of the scroll, as we scroll up and down a document of which we see only a portion. Maybe the introduction of hypertext is the beginning of a change similar to the replacement of scrolls by codices.

INDEXCARD, 24/34
 
Blowfish encryption algorithm

Blowfish is a symmetric-key block cipher that accepts a variable-length key.
The idea behind it is a simple design that makes the cipher faster than many others.

http://www.counterpane.com/blowfish.html
http://www.counterpane.com/bfsverlag.html

INDEXCARD, 25/34
 
Scientology

Official name Church Of Scientology, religio-scientific movement developed in the United States in the 1950s by the author L. Ron Hubbard (1911-86). The Church of Scientology was formally established in the United States in 1954 and was later incorporated in Great Britain and other countries. The scientific basis claimed by the church for its diagnostic and therapeutic practice is disputed, and the church has been criticized for the financial demands that it makes on its followers. From the 1960s the church and various of its officials or former officials faced government prosecutions as well as private lawsuits on charges of fraud, tax evasion, financial mismanagement, and conspiring to steal government documents, while the church on the other hand claimed it was being persecuted by government agencies and by established medical organizations. Some former Scientology officials have charged that Hubbard used the tax-exempt status of the church to build a profitable business empire.

INDEXCARD, 26/34
 
Medieval universities and copying of books

The first of the great medieval universities was established at Bologna. At the beginning, universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify you for an invitation to a university. Holding a lecture amounted to reading a book aloud, much as a priest read from the Bible during services. Attending a lecture meant copying it word for word, so that you had your own copy of the book - enabling you, in turn, to hold a lecture.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

INDEXCARD, 27/34
 
Above.net

Headquartered in San Jose, USA, AboveNet Communications is a backbone service provider. Through its extensive peering relationships, the company has built a network with the largest aggregated bandwidth in the world.

http://www.above.net

INDEXCARD, 28/34
 
Richard Barbrook and Andy Cameron, The Californian Ideology

According to Barbrook and Cameron there is an emerging global orthodoxy concerning the relation between society, technology and politics. In this paper they call this orthodoxy the Californian Ideology, in honor of the state where it originated. By naturalizing and giving a technological proof to a political philosophy, and therefore foreclosing on alternative futures, the Californian ideologues are able to assert that social and political debates about the future have now become meaningless and - horror of horrors - unfashionable. - This paper argues for an interactive future.

http://www.wmin.ac.uk/media/HRC/ci/calif.html

INDEXCARD, 29/34
 
William Gibson

American science fiction author. Most famous novel: Neuromancer.

For resources as writings and interviews available on the Internet see http://www.lib.loyno.edu/bibl/wgibson.htm

INDEXCARD, 30/34
 
International Cable Protection Committee (ICPC)

The ICPC aims at reducing the number of incidents of damage to submarine telecommunications cables from hazards.

The Committee also serves as a forum for the exchange of technical and legal information pertaining to submarine cable protection methods and programs, and funds projects and programs beneficial for the protection of submarine cables.

Membership is restricted to authorities (governmental administrations or commercial companies) owning or operating submarine telecommunications cables. As of May 1999, the ICPC had 67 members representing 38 nations.

http://www.iscpc.org

INDEXCARD, 31/34
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 32/34
 
Vacuum tube

The first half of the 20th century was the era of the vacuum tube in electronics. This variety of electron tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer, completed in 1946).

INDEXCARD, 33/34
 
Colouring

In November 1997, after the murder of (above all Swiss) tourists in Egypt, the Swiss newspaper Blick showed a picture of the place where the attack had happened, with a tremendous pool of blood, to emphasize the cruelty of the Muslim terrorists. In other newspapers the same picture could be seen - with a pool of water, as in the original. Of course the manipulated, coloured version in the Blick fitted better with the mindset of the shocked Swiss population. The question of the death penalty arose quickly ....

INDEXCARD, 34/34