History: Communist Tradition

Following the communist revolutions of the 20th century, all "means of production" became the property of the state as the representative of "the masses". Private property ceased to exist. While the moral rights of the creator were recognized and economic rights acknowledged with a one-time cash award, all subsequent rights reverted to the state.

With the transformation of many communist countries to market systems, most of them have now introduced laws establishing markets in intellectual property rights. Still, the high rate of piracy reflects a certain lack of legal tradition.

TEXTBLOCK 1/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659483
 
Face recognition

In order to recognize a person, one commonly looks at this person's face, for it is there that the visual features which distinguish one person from another are concentrated. The eyes in particular seem to tell a story not only about who somebody is, but also about how that person feels, where his or her attention is directed, etc. People who do not want to show who they are or what is going on inside of them must mask themselves. Consequently, face recognition is a kind of electronic unmasking.

"Real" face-to-face communication is a two-way process. Looking at somebody's face means exposing ones own face and allowing the other to look at oneself. It is a mutual process which is only suspended in extraordinary and voyeuristic situations. Looking at somebody without being looked at places the person who is visually exposed in a vulnerable position vis-à-vis the watcher.

In face recognition this extraordinary situation is normal. Looking at the machine, you only see yourself looking at the machine. Face biometrics are extracted anonymously and painlessly by a mask without a face.

The resistance against the mass appropriation of biometric data through surveillance cameras is therefore confronted with particular difficulties. The surveillance structure is largely invisible: it is not evident what the function of a particular camera is, nor whether it is connected to a face recognition system.

In a protest action against the face recognition specialist Visionics, the Surveillance Camera Players therefore adopted the strategy of re-masking: in front of the cameras, they performed the play "The Masque of the Red Death", an adaptation of Edgar Allan Poe's classic short story by Art Toad.

According to Visionics, whose slogan is "enabling technology with a mass appeal", there are already 1.1 bn digitised face images stored in identification databases worldwide. When combined with wide-area surveillance camera networks, face recognition is capable of creating a transparent social space that can be controlled by a depersonalised, undetected and unaccountable centre. It is a technology of which the surveillance engineers of sunken totalitarian regimes may have dreamt, and one that today is being adopted by democratic governments.

TEXTBLOCK 2/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658118
 
Further Tools: Photography

Art has always contributed a lot to disinformation.
Many modern tools for disinformation are used in art/photography.
Harold D. Lasswell once stated that propaganda was cheaper than violence. Today this is no longer true. Technology has created new tools for propaganda and disinformation - and they are expensive. But by now the possibilities for manipulating pictures and stories have gone so far that it can be difficult to tell the difference between the original and a manipulation.

Trillions of photographs have been taken in the 20th century. Too many to look at, too many to control them and their use. A paradise for manipulation.
We have to keep in mind: there is the world, and there exist pictures of the world, which does not mean that both are the same thing. Photographs are not objective, because the photographer selects the part of the world which becomes the picture. The rest is left out.

Some tools for manipulation of photography are:



Some of these are digital methods of manipulation, which make it possible to change pictures in many ways without the manipulation showing.

Pictures taken from the internet could be anything and come from anywhere. Proving their source is nearly impossible. Therefore scientists have worked on watermarks for pictures, which are meant to make it impossible to "steal" or manipulate a picture taken from the net.

TEXTBLOCK 3/29 // URL: http://world-information.org/wio/infostructure/100437611661/100438658730
 
Late 1960s - Early 1970s: Third Generation Computers

One of the most important advances in the development of computer hardware in the late 1960s and early 1970s was the invention of the integrated circuit, a solid-state device containing hundreds of transistors, diodes, and resistors on a tiny silicon chip. It made possible the production of large-scale computers (mainframes) of higher operating speeds, capacity, and reliability at significantly lower costs.

Another type of computer developed at the time was the minicomputer. It profited from the progress in microelectronics and was considerably smaller than the standard mainframe, but, for instance, powerful enough to control the instruments of an entire scientific laboratory. Furthermore, operating systems that allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer's memory, attained widespread use.

TEXTBLOCK 4/29 // URL: http://world-information.org/wio/infostructure/100437611663/100438659498
 
Fingerprint identification

Although fingerprinting smacks of police techniques used long before the dawn of the information age, its digital successor, finger scanning, is the most widely used biometric technology. It relies on the fact that a fingerprint's uniqueness can be defined by analysing the so-called "minutiae" in somebody's fingerprint. Minutiae include sweat pores, distance between ridges, bifurcations, etc. It is estimated that the likelihood of two individuals having the same fingerprint is less than one in a billion.

As an access control device, fingerprint scanning is particularly popular with military institutions, including the Pentagon, and military research facilities. Banks are also among the principal users of this technology, and there are efforts by major credit card companies such as Visa and MasterCard to incorporate fingerprint recognition into the bank card environment.

Problems of inaccuracy resulting from oily, soiled or cracked skin, a major impediment in fingerprint technology, have recently been tackled by the development of a contactless capturing device (http://www.ddsi-cpc.com) which translates the characteristics of a fingerprint into a digitised image.

As in other biometric technologies, fingerprint recognition is an area where the "criminal justice" market meets the "security" market, yet another indication that civilian spheres are becoming indistinguishable from the military. The utopia of a prisonless society seems to come within the reach of a technology capable of undermining freedom through an upward spiral driven by identification needs and identification technologies.

TEXTBLOCK 5/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658358
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services, computers have become communication devices instead of being solely computational or typewriter-like devices. Corporate networks have become increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests draw further distinctions by splitting the easily understandable term "proprietary networks" into terms that need explaining, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online is selling its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising, because "organizing" is increasingly equated with "filtering" of seemingly objectionable messages and "rating" of content.

Especially for financial transactions, secure proprietary networks are becoming increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated to be above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in the trading of equities. They offer faster trading at reduced costs and better prices on trades for brokers and institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 6/29 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
Legal Protection: TRIPS (Trade-Related Aspects of Intellectual Property Rights)

Another important multilateral treaty concerned with intellectual property rights is the TRIPS agreement, which was negotiated during the Uruguay Round and entered into force with the establishment of the WTO in January 1995. It sets minimum standards for the national protection of intellectual property rights and procedures as well as remedies for their enforcement (enforcement measures include the potential for trade sanctions against non-complying WTO members). The TRIPS agreement has been widely criticized for its stipulation that biological organisms be subject to intellectual property protection. In 1999, 44 nations considered it appropriate to treat plant varieties as intellectual property.

The complete TRIPS agreement can be found on: http://www.wto.org/english/tratop_e/trips_e/t_agm1_e.htm

TEXTBLOCK 7/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659758
 
Biometric applications: surveillance

Biometric technologies are not surveillance technologies in themselves, but as identification technologies they provide an input into surveillance, as when technologies such as face recognition are combined with camera systems and criminal databases in order to supervise public places and single out individuals.

Another example is the use of biometric technologies in the supervision of probationers, who in this way carry their special hybrid status between imprisonment and freedom with them, so that they can be tracked down easily.

Unlike biometric applications in access control, where one is aware of the biometric data extraction process, what makes biometrics used in surveillance a particularly critical issue is the fact that biometric samples are extracted routinely, unnoticed by the individuals concerned.

TEXTBLOCK 8/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658740
 
Timeline 1900-1970 AD

1913 the wheel cipher gets re-invented as a strip

1917 William Frederick Friedman starts working as a cryptanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptanalysis

- an AT&T-employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random-keys

1918 the Germans start using the ADFGVX system, which is later broken by the Frenchman Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German Military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine for the German Military

late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizebeth Smith Friedman regularly deciphers the codes of rum-smugglers during Prohibition

1929 Lester S. Hill publishes his book Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken first by the Polish cryptologist Marian Rejewski and, from 1939, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese invent their so-called Purple machine with the help of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they imagined their own codes to be unbreakable as well - and were not careful enough.

1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code-breaking computer, is put into action at Bletchley Park

1943-1980 the cryptographic Venona project, carried out by the NSA, runs for a longer period than any other program of that type

1949 Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops a system of separate public-keys and private-keys

TEXTBLOCK 9/29 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
World War II ...

Never before had propaganda been as important as in the 2nd World War. From then on, education was one more field of propaganda: its purpose was to teach how to think, while pure propaganda was supposed to show what to think.
Every nation founded at least one ministry of propaganda - of course without calling it that. For example, the British called theirs the Ministry of Information (= MOI), the U.S. distinguished between the Office of Strategic Services (= OSS) and the Office of War Information (= OWI), the Germans created the Ministry of Public Enlightenment and Propaganda (= RMVP), and the Japanese called their disinformation and propaganda campaign the "Thought War".
British censorship was so strict that the text of an ordinary propaganda leaflet, which had been dropped from planes several million times, was not given to a journalist who asked for it.

Atrocity stories were no longer used the same way as in the 1st World War. Instead, black propaganda was preferred, especially to separate the Germans from their leaders.
German war propaganda had started long before the war. In the middle of the 1930s Leni Riefenstahl filmed Hitler's best propaganda movies. For the most famous one, "Triumph of the Will" (1935), she was the only professional filmmaker who was allowed to take close-up pictures of her admirer.

Some of the pictures of fear, hatred and intolerance still exist in people's heads. In this respect the propaganda did a good job; unfortunately, it was the anti-National Socialist propaganda that failed at the time.

TEXTBLOCK 10/29 // URL: http://world-information.org/wio/infostructure/100437611661/100438658610
 
Anonymity

"Freedom of anonymous speech is an essential component of free speech."

Ian Goldberg/David Wagner, TAZ Servers and the Rewebber Network: Enabling Anonymous Publishing on the World Wide Web, in: First Monday 3,4, 1999

Someone wants to hide his or her identity, to remain anonymous, if s/he fears being held accountable for something, say, a publication, that is considered to be prohibited. Anonymous publishing has a long tradition in European history. Writers of erotic literature or pamphlets, e.g., preferred to use pseudonyms or publish anonymously. During the Enlightenment, books such as d'Alembert's and Diderot's famous Encyclopaedia were printed and distributed secretly. Today Book Locker, a company selling electronic books, renews this tradition by allowing authors to post writings anonymously, to publish without the threat of being punished for it. Sometimes anonymity is a precondition for reporting human rights abuses. For example, investigative journalists and regime critics may rely on anonymity. But we do not have to look that far; even you might need or use anonymity sometimes, say, when you are a woman wanting to avoid sexual harassment in chat rooms.

The original design of the Net, as far as it is preserved, offers a relatively high degree of privacy, because due to the client-server model all that is known about you is the machine from which information was, or is, requested. But this design of the Net interferes with the wish of corporations to know you, even to know more about you than you want them to know. What is euphemistically called customer relationship management means the collection, compilation and analysis of personal information about you by others.

In 1997 America Online member Timothy McVeigh, a Navy employee, made his homosexuality publicly known in a short autobiographical sketch. Another Navy employee who read this sketch informed the Navy. America Online revealed McVeigh's identity to the Navy, which discharged McVeigh. As a consequence of a court ruling in that case, Timothy McVeigh was allowed to return to the Navy. Sometimes anonymity really matters.

On the Net you still have several possibilities to remain anonymous. You may visit web sites via an anonymizing service. You might use a Web mail account (provided the personal information given to the web mail service provider is not true), or you might use an anonymous remailing service, which strips off the headers of your mail to make it impossible to identify the sender and then forwards your message. Used in combination with encryption tools and technologies like Free Haven or Publius, anonymous messaging services provide a powerful tool for countering censorship.
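
The principle of such a remailer can be sketched in a few lines. The following Python fragment is purely illustrative and is not the software of any actual remailing service: it rebuilds a plain-text mail so that only the recipient and the subject survive, with the remailer's own address as the apparent sender. Real remailers additionally re-encrypt, batch and delay messages, which this sketch omits.

    from email import message_from_string
    from email.message import EmailMessage

    def strip_and_forward(raw_mail: str, remailer_address: str) -> EmailMessage:
        # Rebuild a simple plain-text mail so that only "To" and "Subject"
        # survive; the remailer's own address appears as the sender, so
        # nothing in the forwarded message points back to the original sender.
        original = message_from_string(raw_mail)
        forwarded = EmailMessage()
        forwarded["To"] = original["To"]
        forwarded["Subject"] = original["Subject"]
        forwarded["From"] = remailer_address
        forwarded.set_content(original.get_payload())
        return forwarded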

In Germany, in 1515, printers had to swear not to print or distribute any publication bypassing the councilmen. Today repressive regimes, such as China and Burma, and democratic governments, such as France and Great Britain, alike impose or have already imposed laws against anonymous publishing on the Net.

Anonymity might be used for abuses, that is true, but "the burden of proof rests with those who would seek to limit it." (Rob Kling, Ya-ching Lee, Al Teich, Mark S. Frankel, Assessing Anonymous Communication on the Internet: Policy Deliberations, in: The Information Society, 1999).

TEXTBLOCK 11/29 // URL: http://world-information.org/wio/infostructure/100437611742/100438659040
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stressing the importance of such measurement for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing numbers of customers. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings.
The public Internet (with an effective bandwidth of 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network. The private line networks are considerably larger in aggregate capacity than the Internet: with an effective bandwidth of about 330 Gbps in December 1997 they are about as large as the voice network in the U.S., though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet.
In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications, which consume more bandwidth; both are reasons for unanticipated amounts of data traffic.
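
The arithmetic behind the "overtake" prediction is simple exponential growth. The following Python sketch only illustrates that reasoning; the starting ratio of ten is a hypothetical figure chosen for the example, not Coffman and Odlyzko's actual traffic numbers.

    def years_to_overtake(ratio, fast_growth=1.0, slow_growth=0.1):
        # Years until a volume growing at fast_growth per year overtakes a
        # volume that is `ratio` times larger today but grows at slow_growth.
        years, fast, slow = 0, 1.0, float(ratio)
        while fast < slow:
            fast *= 1 + fast_growth
            slow *= 1 + slow_growth
            years += 1
        return years

    # Hypothetical figures for illustration only: if data traffic were one
    # tenth of voice traffic at the end of 1997 and grew by 100 % per year
    # against 10 % for voice, it would overtake voice after about four years,
    # i.e. around 2001/2002.
    print(years_to_overtake(10))   # -> 4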

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Quite apart from the small sample, this method has at least one flaw: the ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
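
The sample-and-project idea can be sketched roughly as follows. This is not the ISC's actual tooling, just a minimal Python illustration that assumes a list of name-bearing IP addresses and a Linux-style ping command.

    import random
    import subprocess

    def estimate_pingable_hosts(addresses, sample_fraction=0.01, timeout_s=1):
        # Ping a random sample of the addresses and project the share of
        # responding hosts onto the whole list, as in the survey described above.
        sample = random.sample(addresses, max(1, int(len(addresses) * sample_fraction)))
        responding = 0
        for ip in sample:
            # One echo request per address; "-c 1" and "-W" are Linux ping options.
            result = subprocess.run(
                ["ping", "-c", "1", "-W", str(timeout_s), ip],
                stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
            if result.returncode == 0:
                responding += 1
        return int(len(addresses) * responding / len(sample))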

Internet Weather

Like the daily weather, traffic on the Internet, the conditions for data flows, is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method used is to "ping" servers (as for host counts, e.g.) and to compare response times to past ones and to the response times of servers in the same reach.

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of most visited Web sites may be compiled. I say, may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. A user might share a computer, and the corresponding IP address and host name, with others; she/he might access not the site itself, but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
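
The difference between hits and page views can be made concrete with a few lines of log analysis. This is only a sketch, assuming a server log in the common Apache-style format and treating requests for HTML documents as page views.

    import re

    # Matches the request line of an Apache-style access log and captures the
    # requested path (a hypothetical format; adjust the pattern to the real log).
    LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*" \d{3} \S+')

    def hits_and_page_views(log_lines):
        # Count every requested file ("hit") versus requests for HTML
        # documents only ("page views").
        hits = page_views = 0
        for line in log_lines:
            match = LOG_LINE.match(line)
            if not match:
                continue
            hits += 1
            path = match.group(1).split("?")[0]
            if path.endswith((".html", ".htm", "/")):
                page_views += 1
        return hits, page_views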

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. They may read a journal just for a special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but just slightly better at best, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, e.g., establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can just rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.

For Fun

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 12/29 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and the corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three of them show a similar design. Content is split up and spread over several servers. When a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
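
The splitting idea can be sketched in a few lines. The following Python fragment is a bare illustration of the design described above, not the actual mechanism of Freenet, Free Haven or Publius, which additionally encrypt the content, add redundancy and split the key among servers.

    def split_content(data: bytes, n_servers: int):
        # Split a document into roughly equal pieces, one per server.
        chunk = -(-len(data) // n_servers)   # ceiling division
        return [data[i * chunk:(i + 1) * chunk] for i in range(n_servers)]

    def reassemble(pieces):
        # Join the pieces fetched back from the servers.
        return b"".join(pieces)

    document = b"An uncensorable pamphlet ..."
    pieces = split_content(document, 3)      # store each piece on a different server
    assert reassemble(pieces) == document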

The most advanced system seems to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position of radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable? in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html .

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 13/29 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
The 2nd Chechnya-War

In the summer of 1999 between 1,200 and 2,000 Muslim rebels from Chechnya invaded Dagestan. Rumors say that Russian soldiers closed their eyes, pretending not to see anything. In the fighting that soon started, many people were killed. The whole issue was blamed on Chechnya.
At that time there were rumors that there would be heavy bombings in Moscow in September. And there were. Those two things together brought back the hatred against the Chechen rebels. The 2nd war between Russia and the Muslim country began. While the first war was lost at home, because the Russians, especially mothers, did not understand why their sons should fight against Chechnya, this time the atmosphere was completely different. In the cities 85% and all over Russia 65% of the Russian population agreed with the war. This time the war was a national issue, a legitimate defense.
The media emphasized this.
Alexander Zilin, a journalist, found out that the truth was far from the one presented in the media: first of all, there was no evidence that the Moscow bombings were organized by Chechens. On the contrary, it is more than probable that the crimes were organized by a governmental institution for national security. The disinformation was part of the strategy to make the population support another war with Chechnya. The media were part of the story, maybe without knowing it. They kept to the government's and army's side, showing only selected and patriotic parts of the war. For example, the number of dead Russian soldiers was withheld.

The U.S. position on this:
The USA would like to intervene but is afraid of ruining its fragile relationship with Russia. For years the main topic of U.S. politics has been the struggle against terrorism. Now Russia pretends to be fighting terrorism. How could it be criticized for that?

The reason for this war is rather cynical: it worked as a public relations campaign for Vladimir Putin, candidate in the presidential elections of summer 2000. When Putin came to power as prime minister of Russia in August 1999, opinion polls gave him 2% for those elections. By the end of November he already had 46%! And finally he won. The public relations war worked well.
At the same time a propaganda campaign against his rival Y. Primakov, formerly the most popular candidate, was spreading lies and bad rumors. Opinion polls showed very quickly that he had lost the election because of this black propaganda, even before it took place.

TEXTBLOCK 14/29 // URL: http://world-information.org/wio/infostructure/100437611661/100438658639
 
Biometric technologies

In what follows there is a brief description of the principal biometric technologies, whose respective proponents - producers, research laboratories, think tanks - mostly tend to claim superiority over the others. A frequently used definition of "biometric" is that of a "unique, measurable characteristic or trait of a human being for automatically recognizing or verifying identity" (http://www.icsa.net/services/consortia/cbdc/bg/introduction.shtml); biometrics is the study and application of such measurable characteristics. In IT environments, biometrics are categorised as "security" technologies meant to limit access to information, places and other resources to a specific group of people.

All biometric technologies are made up of the same basic processes:

1. A sample of a biometric is first collected, then transformed into digital information and stored as the "biometric template" of the person in question.

2. At every new identification, a second sample is collected and compared with the first one.

3. If the two samples are identical, the person's identity is confirmed, i.e. the system knows who the person is.

This means that access to the facility or resource can be granted or denied. It also means that information about the person's behaviour and movements has been collected. The system now knows who passed a certain identification point at which time, at what distance from the previous time, and it can combine these data with others, thereby appropriating an individual's data body.
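
These steps can be sketched in a few lines of code. The sketch below is purely illustrative: it uses a hash of the raw sample as a stand-in for a "biometric template" and exact matching as a stand-in for the comparison, whereas real systems compare extracted feature vectors against a similarity threshold. It also shows how every successful verification leaves the behavioural trail described above.

    import hashlib
    import time

    templates = {}    # person id -> stored "biometric template" (step 1)
    access_log = []   # who passed which identification point, and when

    def enroll(person_id, sample):
        # Step 1: collect a sample, digitise it and store it as a template.
        templates[person_id] = hashlib.sha256(sample).hexdigest()

    def verify(person_id, new_sample, checkpoint):
        # Steps 2 and 3: collect a second sample and compare it with the stored
        # template; a match confirms the identity and leaves a log entry.
        matched = templates.get(person_id) == hashlib.sha256(new_sample).hexdigest()
        if matched:
            access_log.append((person_id, checkpoint, time.time()))
        return matched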

TEXTBLOCK 15/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658188
 
2000 A.D.

2000
Convergence of telephony, audiovisual technologies and computing

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new communication systems. This is what is meant by today's new buzzwords "multimedia" and "convergence".

Classical dichotomies such as that between computing and telephony, and traditional categorizations, no longer apply, because these new services no longer fit the traditional categories.

Convergence and Regulatory Institutions

Digital technology permits the integration of telecommunications with computing and audiovisual technologies. New services that extend existing communication systems emerge. The convergence of communication and media systems corresponds to a convergence of corporations. Recently, America Online, the world's largest online service provider, merged with Time Warner, the world's largest media corporation. For such corporations the classical approach to regulation - separate institutions regulate separate markets - is no longer appropriate, because the institutions' activities necessarily overlap. The current challenges posed to these institutions are not solely due to the convergence of communication and media systems made possible by digital technologies; they are also due to the liberalization and internationalization of the electronic communications sector. For regulation to be successful, new categorizations and supranational agreements are needed.
For further information on this issue see Natascha Just and Michael Latzer, The European Policy Response to Convergence with Special Consideration of Competition Policy and Market Power Control, http://www.soe.oeaw.ac.at/workpap.htm or http://www.soe.oeaw.ac.at/WP01JustLatzer.doc.

TEXTBLOCK 16/29 // URL: http://world-information.org/wio/infostructure/100437611796/100438659802
 
1940s - 1950s: The Development of Early Robotics Technology

During the 1940s and 1950s two major developments enabled the design of modern robots. Robotics generally is based on two related technologies: numerical control and teleoperators.

Numerical control was invented during the late 1940s and early 1950s. It is a method of controlling machine tool axes by means of numbers that have been coded on media. The first numerical control machine was presented in 1952 at the Massachusetts Institute of Technology (MIT), whose subsequent research led to the development of APT (Automatically Programmed Tools). APT, a language for programming machine tools, was designed for use in computer-assisted manufacturing (CAM).

The first teleoperators were developed in the early 1940s. Teleoperators are mechanical manipulators which are controlled by a human from a remote location. In a typical application a human moves a mechanical arm and hand, with its movements being duplicated at another location.

TEXTBLOCK 17/29 // URL: http://world-information.org/wio/infostructure/100437611663/100438659348
 
Timeline BC

~ 1900 BC: Egyptian writers use non-standard Hieroglyphs in inscriptions of a royal tomb; supposedly this is not the first but the first documented example of written cryptography

1500 an enciphered formula for the production of pottery is written down in Mesopotamia

parts of the Hebrew writing of Jeremiah's words are written down in "atbash", which is nothing other than a reversed alphabet and one of the first famous methods of enciphering

4th century Aeneas Tacticus invents a form of beacons, by introducing a sort of water-clock

487 the Spartans introduce the so-called "skytale" for sending short secret messages to and from the battlefield

170 Polybius develops a system to convert letters into numerical characters, an invention called the Polybius Chequerboard.

50-60 Julius Caesar develops an enciphering method, later called the Caesar cipher, which shifts each letter of the alphabet by a fixed amount. Like atbash, this is a monoalphabetic substitution.
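
To illustrate the monoalphabetic substitutions mentioned in the last entries, here is a minimal Python sketch of the Caesar shift; the shift value of 3 is used purely as an example.

    import string

    def caesar(text, shift=3):
        # Shift every letter of the alphabet by a fixed amount - a
        # monoalphabetic substitution like the ones in the timeline above.
        alphabet = string.ascii_lowercase
        table = str.maketrans(alphabet, alphabet[shift:] + alphabet[:shift])
        return text.lower().translate(table)

    print(caesar("attack at dawn"))       # -> "dwwdfn dw gdzq"
    print(caesar("dwwdfn dw gdzq", -3))   # deciphering: shift back by the same amount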

TEXTBLOCK 18/29 // URL: http://world-information.org/wio/infostructure/100437611776/100438659084
 
Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers which ridicules dreams of increased competition.

Recent mega-mergers and acquisitions include

SBC Communications - Ameritech, $ 72,3 bn

Bell Atlantic - GTE, $ 71,3 bn

AT&T - Media One, $ 63,1

AOL - Time Warner, $ 165 bn

MCI Worldcom - Sprint, $ 129 bn

The total value of all major mergers since the beginning of the 1990s has been 20 trillion dollars, 2,5 times the size of the USA's GDP.

The AOL - Time Warner merger reflects a trend which can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance in complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 19/29 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
ZNet

ZNet provides forum facilities for online discussion and chatting on various topics ranging from culture and ecology to international relations and economics. ZNet also publishes daily commentaries and maintains a Web-zine, which addresses current news and events as well as many other topics, trying to be provocative, informative and inspiring to its readers.

Strategies and Policies

Daily Commentaries: ZNet's commentaries address current news and events, cultural happenings, and organizing efforts, providing context, critique, vision, and analysis, but also references to or reviews of broader ideas, new books, activism, the Internet, and other topics that strike the diverse participating authors as worthy of attention.

Forum System: ZNet provides a private (and soon also a public) forum system. The fora are concerned with, among others, topics such as activism, culture, community/race/religion/ethnicity, ecology, economics/class, gender/kinship/sexuality, government/polity, international relations, ParEcon, vision/strategy and popular culture. Each forum has a set of threaded discussions, as do the fora hosted by commentary writers like Chomsky, Ehrenreich, Cagan, Peters and Wise.

ZNet Daily WebZine: ZNet Daily WebZine offers commentaries in web format.

Z Education Online (planned): The Z Education Online site will provide instructionals and courses of diverse types as well as other university-like, education-aimed features.

TEXTBLOCK 20/29 // URL: http://world-information.org/wio/infostructure/100437611734/100438659288
 
The Secret Behind

The secret behind all this is the conception that nothing bad could ever be attributed to one's own nation. All the bad words belong to the enemy, whereas the "we" is the good one, the one who is never the aggressor but always the defender, the savior - not only for one's own sake but also for the others, even if they never asked for it, like the German population during World War I and II.
The spiritualization of such thinking leads to the point where it becomes nearly impossible to believe that this could be untrue, a fake. To imagine injustice committed by one's own nation gets more and more difficult the longer this kind of propaganda tactic goes on. U.S. Americans willingly believe in their country's politics, believing also that the USA acts as the police of the world, defending the morally good against those who have not reached the same level of civilization to this day.
To keep up this image, the enemy must be portrayed as ugly and bad, as in fairy tales, in black-and-white pictures. Any connection between oneself and the enemy must be erased and made impossible. In the case of Slobodan Milosevic or Saddam Hussein this meant deleting the positive contacts of the preceding years from the consciousness of the population. Both had received large amounts of money and material help as long as they kept to the rules of the Western game. Later, when the image of the friend/confederate was destroyed, the contact had to be denied. The media, who had reported on that help, no longer seemed to remember it and did not write anything about that strange change of mind. And those who did were not really listened to, because people tend to hear what they want to hear. And who would want to hear that large amounts of his taxes had formerly gone to "a man" (this personification of the war in one single man is the next disinformation) who is now the demon in one's mind.

None of this is the invention of a few politicians. Huge think tanks and various governmental organizations stand behind it. Part of their work is to hide their own work, or to deny it.

TEXTBLOCK 21/29 // URL: http://world-information.org/wio/infostructure/100437611661/100438658637
 
Challenges for Copyright by ICT: Introduction

Traditional copyright and the practice of paying royalties to the creators of intellectual property emerged with the introduction of the printing press (1456). Therefore early copyright law was tailored to the technology of print and the (re)production of works in analogue form. Over the centuries legislation concerning the protection of intellectual property has been adapted several times in order to respond to the technological changes in the production and distribution of information.

Yet again new technologies have altered the way (copyrighted) works are produced, copied, made obtainable and distributed. The emergence of global electronic networks and the increased availability of digitalized intellectual property confront existing copyright with a variety of questions and challenges. The combination of several types of works within one larger work or on one data carrier, the digital format (although this may be a recent development, it has been the object of detailed legal scrutiny) and networking (telephone and cable networks have been in use for a long time, although they do not permit interactivity) are nothing really new; what is indeed novel is that recent technologies allow the presentation and storage of text, sound and visual information in digital form. In this way the entire information can be generated, altered and used by and on one and the same device, irrespective of whether it is provided online or offline.


TEXTBLOCK 22/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659517
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, often through matrilineal descent.


TEXTBLOCK 23/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
Racism on the Internet

The internet can be regarded as a mirror of the variety of interests, attitudes and needs of humankind. Propaganda and disinformation are in that way bound to be part of it, whether they struggle for something good or evil. But the old classifications no longer function.
During the last years the internet has opened up a new outlet for racism, as it can be difficult to find the person who put a certain message onto the net. The anarchy of the internet provides racists with many possibilities to reach people which they do not possess in other media, for legal and other reasons.

In the 1980s racist groups used mailboxes to communicate on an international level; the first ones to do so were supposedly the Ku Klux Klan and mailboxes like the Aryan Nations Liberty Net. In the meantime those mailboxes can be found on the internet. In 1997 about 600 extreme right websites were on the net; the number is growing, and most of them come from the USA. The shocking element is not the number of racist pages, which is still very small compared to the millions of pages one can find in this medium; it is the evidence of intentional disinformation, the language and the hatred that make them dangerous.
A complete network of anti-racist organizations, including a high number of websites, is fighting against racism. For example:

http://motlc.wiesenthal.com/text/x32/xr3257.html

http://www.aranet.org/

http://www.freespeech.org/waronracism/files/allies.htm
http://www.nsdapmuseum.com
http://www.globalissues.org/HumanRights/Racism.asp

TEXTBLOCK 24/29 // URL: http://world-information.org/wio/infostructure/100437611661/100438658620
 
Movies as a Propaganda- and Disinformation-Tool in World War I and II

Movies produced in Hollywood in 1918/19 were mainly anti-German. They had some influence, but the bigger effect was achieved by World War II movies.
The first propaganda movie of World War II was British.
At that time all films had to pass the censors. Most popular were entertaining movies with propaganda messages. The enemy was shown as a beast, an animal-like creature, a brutal person without a soul, an idiot, whereas one's own people were the heroes. That was the new form of atrocity story.
Leni Riefenstahl was a genius in this respect. Her movies still have an incredible power, while the majority of the other movies of that time look ridiculous today. The combination of light and shadow, the dramatic music and the mass scenes that resembled ballet had their effect and political consequences. Some of the German movies of that period are still on the index.

U.S. President Theodore Roosevelt considered movies the best propaganda instrument, as they are more subtle than other tools.

In the late twenties movies became more and more important, in the USSR too, as Sergei Eisenstein demonstrated with his films. Historic events were turned into symbolism, exactly the way propaganda should function. It was disinformation - but in its most artistic form, especially in comparison with most U.S. and European movies of that time.

TEXTBLOCK 25/29 // URL: http://world-information.org/wio/infostructure/100437611661/100438658547
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the cold war the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancements. Therefore in 1963 the Defense Advanced Research Projects Agency (DARPA) granted the Massachusetts Institute of Technology (MIT) U.S.$ 2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the project was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, the MIT research team showed that, when confined to a small subject matter, computer programs could solve spatial and logic problems. Other advances in the field of AI at the time were: the proposal of new theories about machine vision by David Marr, Marvin Minsky's frame theory, the PROLOG language (1972) and the development of expert systems.

TEXTBLOCK 26/29 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
Content as Transport Medium for Values and Ideologies

With the dissemination of their content, commercial media are, among other things, also able to transport values and ideologies. Usually their programming reflects society's dominant social, political, ethical, cultural and economic values. A critical view of the prevalent ideologies is often sacrificed so as not to offend the existing political elites and corporate powers, and rather to satisfy shareholders and advertisers.

With most of the world's content produced by a few commercial media conglomerates, and with the overwhelming majority of companies (in terms of revenue generation) concentrated in Europe, the U.S., Japan and Australia, there is also a strong flow of content from the 'North-West' to the 'South-East'. Popular culture developed in the world's dominant commercial centers, and with it Western values and ideologies, is thus disseminated into the most distant corners of the earth, with far less coming back.

TEXTBLOCK 27/29 // URL: http://world-information.org/wio/infostructure/100437611795/100438659066
 
Biometrics applications: physical access

This is the largest area of application of biometric technologies, and the one with the most direct lineage to the feudal gate-keeping system. Initially mainly used in military and other "high security" territories, physical access control by biometric technology is spreading into a much wider field of application. Biometric access control technologies are already being used in schools, supermarkets, hospitals and commercial centres, where they are used to manage the flow of personnel.

Biometric technologies are also used to control access to political territory, as in immigration (airports, Mexico-USA border crossing). In this case, they can be coupled with camera surveillance systems and artificial intelligence in order to identify potential suspects at unmanned border crossings. Examples of such uses in remote video inspection systems can be found at http://www.eds-ms.com/acsd/RVIS.htm

A gate keeping system for airports relying on digital fingerprint and hand geometry is described at http://www.eds-ms.com/acsd/INSPASS.htm. This is another technology which allows separating "low risk" travellers from "other" travellers.

An electronic reconstruction of feudal gate keeping capable of singling out high-risk travellers from the rest is already applied at various border crossing points in the USA. "All enrolees are compared against national lookout databases on a daily basis to ensure that individuals remain low risk". As a side benefit, the economy of time generated by the inspection system has meant that "drug seizures ... have increased since Inspectors are able to spend more time evaluating higher risk vehicles".

However, biometric access control cannot only prevent people from gaining access to a territory or building; it can also prevent them from getting out of buildings, as in the case of prisons.

TEXTBLOCK 28/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658838
 
Virtual cartels, oligopolistic structures

Global networks require global technical standards ensuring the compatibility of systems. Being able to define such standards makes a corporation extremely powerful. And it requires the suspension of competitive practices. Competition is relegated to the symbolic realm. Diversity and pluralism become the victims of the globalisation of baroque sameness.

The ICT market is dominated by imperfect competition aimed at short-term market domination. In a very short time, new ideas can turn into best-selling technologies. Innovation cycles are extremely short. But today's state-of-the-art products are embryonic trash.

    According to the Computer and Communications Industry Association, Microsoft is trying to aggressively take over the network market. This would mean that AT&T would control 70 % of all long distance phone calls and 60 % of cable connections.



    AOL and Yahoo are lone leaders in the provider market. AOL has 21 million subscribers in 100 countries. In a single month, AOL registers 94 million visits. Two thirds of all US internet users visited Yahoo in December 1999.



    The world's 13 biggest internet providers are all American.



    AOL and Microsoft have concluded a strategic cross-promotion deal. In the US, the AOL icon is installed on every Windows desktop. AOL has also concluded a strategic alliance with Coca Cola.


TEXTBLOCK 29/29 // URL: http://world-information.org/wio/infostructure/100437611709/100438658963
 
Next Generation Internet Program

A research and development program funded by the US government. Its goal is the development of advanced networking technologies and of applications requiring advanced networking, with end-to-end capabilities 100 to 1,000 times faster than today's Internet.

http://www.ngi.gov

INDEXCARD, 1/40
 
Karl Neupert

In the 1920s the Hollow Earth Theory was very popular in Germany. With the acceptance and support of the Nazi regime, Karl Neupert wrote the book Geokosmos. With the help of this book the theory became a cult in Germany.

INDEXCARD, 2/40
 
Kessler Marketing Intelligence (KMI)

KMI is the leading source for information on fiber-optics markets. It offers market research, strategic analysis and product planning services to the opto-electronics and communications industries. KMI tracks the worldwide fiber-optic cable system and sells the findings to the industry. KMI says that every fiber-optics corporation with a need for strategic market planning is a subscriber to their services.

http://www.kmicorp.com/

INDEXCARD, 3/40
 
Electronic Messaging (E-Mail)

Electronic messages are transmitted and received by computers through a network. Via e-mail, texts, images, sounds and videos can be sent to a single user or simultaneously to a group of users. Texts can thus be sent and read without ever being printed.

E-Mail is one of the most popular and important services on the Internet.

INDEXCARD, 4/40
 
Machine language

Initially computer programmers had to write instructions in machine language. This coded language, which can be understood and executed directly by the computer without conversion or translation, consists of binary digits representing operation codes and memory addresses. Because it is made up of strings of 1s and 0s, machine language is difficult for humans to use.
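
To make the idea concrete, here is a purely illustrative Python sketch of a hypothetical 16-bit instruction format in which the upper four bits carry an operation code and the lower twelve bits a memory address; real machine languages differ from processor to processor, so the opcodes below are invented.

    # Hypothetical 16-bit machine word: 4-bit opcode, 12-bit memory address.
    # The opcodes are invented for illustration only.
    OPCODES = {"LOAD": 0b0001, "STORE": 0b0010, "ADD": 0b0011}

    def encode(mnemonic, address):
        """Pack an opcode and a 12-bit address into a 16-bit binary string."""
        word = (OPCODES[mnemonic] << 12) | (address & 0xFFF)
        return format(word, "016b")

    def decode(word):
        """Recover mnemonic and address from the 16-bit binary string."""
        value = int(word, 2)
        opcode, address = value >> 12, value & 0xFFF
        name = {v: k for k, v in OPCODES.items()}[opcode]
        return name, address

    print(encode("LOAD", 42))            # 0001000000101010
    print(decode("0001000000101010"))    # ('LOAD', 42)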

INDEXCARD, 5/40
 
Gottfried Wilhelm von Leibniz

b. July 1, 1646, Leipzig
d. November 14, 1716, Hannover, Hanover

German philosopher, mathematician, and political adviser, important both as a metaphysician and as a logician and distinguished also for his independent invention of the differential and integral calculus. In 1661 he entered the University of Leipzig as a law student; there he came into contact with the thought of men who had revolutionized science and philosophy, men such as Galileo, Francis Bacon, Thomas Hobbes, and René Descartes. In 1666 he wrote De Arte Combinatoria ("On the Art of Combination"), in which he formulated a model that is the theoretical ancestor of some modern computers.

INDEXCARD, 6/40
 
Ross Perot

Ross Perot, founder of EDS, is one of the richest individuals in the US and a former presidential candidate of the Reform Party. A staunch patriot, Perot has been known for his aggressive business practices as well as for his close relationships with the military and other US government bodies. Perot reached 19 % in the 1992 presidential elections, but dropped to less than 10 % in 1996.

Official website: http://www.perot.org/

Unofficial website: http://www.realchange.org/perot.htm

http://www.eds.com/
INDEXCARD, 7/40
 
New World Order

http://www.douzzer.ai.mit.edu:8080/conspiracy...
http://www.geocities.com/CapitolHill/Lobby/18...
INDEXCARD, 8/40
 
Core copyright industries

These encompass the industries that create copyrighted works as their primary product. They include the motion picture industry (television, theatrical, and home video), the recording industry (records, tapes and CDs), the music publishing industry, the book, journal and newspaper publishing industry, and the computer software industry (including data processing, business applications and interactive entertainment software on all platforms), as well as legitimate theater, advertising, and the radio, television and cable broadcasting industries.

INDEXCARD, 9/40
 
Clipper Chip

The Clipper Chip is a cryptographic device proposed by the U.S. government that was purportedly intended to protect private communications while at the same time permitting government agents to obtain the "keys" upon presentation of what has been vaguely characterized as "legal authorization". The "keys" are held by two government "escrow agents" and would enable the government to access the encrypted private communication. While Clipper would be used to encrypt voice transmissions, a similar chip known as Capstone would be used to encrypt data. The underlying cryptographic algorithm, known as Skipjack, was developed by the National Security Agency (NSA).

INDEXCARD, 10/40
 
cryptology

also called "the study of code". It includes both, cryptography and cryptoanalysis

INDEXCARD, 11/40
 
Cookie

A cookie is an information package assigned to a client program (mostly a Web browser) by a server. The cookie is saved on your hard disk and is sent back each time this server is accessed. The cookie can contain various kinds of information: preferences for site access, data identifying authorized users, or data for tracking visits.

In online advertising, cookies serve the purpose of changing advertising banners between visits, or identifying a particular direct marketing strategy based on a user's preferences and responses.

Advertising banners can be permanently eliminated from the screen by filtering software such as that offered by Naviscope or Webwash.

Cookies are usually stored in a separate file of the browser, and can be erased or permanently deactivated, although many web sites require cookies to be active.
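
A minimal sketch using Python's standard http.cookies module; the cookie name "session_id" and its value are invented for illustration. It shows how a server builds the Set-Cookie header and later reads back the value the browser returns.

    from http.cookies import SimpleCookie

    # Server side: build the Set-Cookie header that accompanies a response.
    cookie = SimpleCookie()
    cookie["session_id"] = "abc123"          # illustrative name and value
    cookie["session_id"]["path"] = "/"
    print(cookie.output())                   # Set-Cookie: session_id=abc123; Path=/

    # On the next request the browser sends the value back in the Cookie
    # header, which the server parses again.
    returned = SimpleCookie()
    returned.load("session_id=abc123")
    print(returned["session_id"].value)      # abc123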

http://www.naviscope.com/
http://www.webwash.com/
INDEXCARD, 12/40
 
DES

The U.S. Data Encryption Standard (= DES) is the most widely used encryption algorithm, especially used for protection of financial transactions. It was developed by IBM in 1971. It is a symmetric-key cryptosystem. The DES algorithm uses a 56-bit encryption key, meaning that there are 72,057,594,037,927,936 possible keys.
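
A quick back-of-the-envelope check of that key space in Python; the search rate of one billion keys per second used for the brute-force estimate is an assumption chosen purely for illustration.

    # Key space of a 56-bit key, as stated above.
    keyspace = 2 ** 56
    print(keyspace)                          # 72057594037927936

    # Rough brute-force estimate at an assumed rate of 10**9 keys per second.
    keys_per_second = 10 ** 9
    years = keyspace / keys_per_second / (60 * 60 * 24 * 365)
    print(round(years, 1))                   # about 2.3 years to try every key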

for more information see:
http://www.britannica.com/bcom/eb/article/3/0,5716,117763+5,00.html
http://www.cryptography.com/des/

INDEXCARD, 13/40
 
America Online

Founded in 1985, America Online is the world's biggest Internet service provider, serving almost every second Internet user. Additionally, America Online operates CompuServe, the Netscape Netcenter and several AOL.com portals. As the owner of Netscape, Inc., America Online also plays an important role in the Web browser market. In January 2000 America Online merged with Time Warner, the world's leading media conglomerate, in a US$ 243.3 billion deal, making America Online the senior partner with 55 percent of the new company.

http://www.aol.com

INDEXCARD, 14/40
 
Transistor

A transistor is a solid-state device for amplifying, controlling, and generating electrical signals. Transistors are used in a wide array of electronic equipment, ranging from pocket calculators and radios to industrial robots and communications satellites.

INDEXCARD, 15/40
 
RIPE

The RIPE Network Coordination Centre (RIPE NCC) is one of the three Regional Internet Registries (RIRs) existing in the world today. It provides allocation and registration services that support the operation of the Internet globally, mainly the allocation of IP address space for Europe.

http://www.ripe.net

INDEXCARD, 16/40
 
Central processing unit

A CPU is the principal part of any digital computer system, generally composed of the main memory, control unit, and arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral devices, including input/output devices and auxiliary storage units...

INDEXCARD, 17/40
 
Writing

Writing and calculating came into being at about the same time. The first pictographs carved into clay tablets were used for administrative purposes. As an instrument of the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to a few. Having long been more or less separate tasks, writing and calculating converge in today's computers.

Letters are invented so that we might be able to converse even with the absent, says Saint Augustine. The invention of writing made it possible to transmit and store information. The ear no longer predominates; face-to-face communication becomes more and more obsolete for administration and bureaucracy. Standardization and centralization become the constituents of high culture and of vast empires such as Sumer and China.

INDEXCARD, 18/40
 
First Monday

An English-language, peer-reviewed media studies journal based in Denmark.

http://firstmonday.dk

INDEXCARD, 19/40
 
Sergei Eisenstein

Though Sergei Eisenstein (1898-1948) made only seven films in his entire career, he was the USSR's most important film director of the 1920s and 1930s. His typical style, piling metaphors and symbols into his films, is called "intellectual montage" and was not always understood or even liked by audiences. Still, he succeeded in mixing ideological and abstract ideas with real stories. His most famous work was The Battleship Potemkin (1925).

INDEXCARD, 20/40
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor and the communications satellite.

http://www.research.att.com/

INDEXCARD, 21/40
 
Vinton Cerf

Regarded as one of the fathers of the Internet, Vinton Cerf, together with Robert Kahn, developed the TCP/IP protocol suite, up to now the de facto communication standard of the Internet, and also contributed to the development of other important communication standards. The early work on the protocols broke new ground with the realization of a multi-network open architecture.

In 1992, he co-founded the Internet Society where he served as its first President and later Chairman.

Today, Vinton Cerf is Senior Vice President for Internet Architecture and Technology at WorldCom, one of the world's most important ICT companies.

Vinton Cerf's web site: http://www.wcom.com/about_the_company/cerfs_up/

http://www.isoc.org/
http://www.wcom.com/
INDEXCARD, 22/40
 
Vacuum tube

The first half of the 20th century was the era of the vacuum tube in electronics. This variety of electron tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer, completed in 1946).

INDEXCARD, 23/40
 
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet but a subset of it. It is constituted by documents that are linked together in such a way that you can switch from one document to another simply by clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used to create World Wide Web documents. These so-called hypertexts can combine text documents, graphics, videos, sounds and Java applets, thus making multimedia content possible.
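
As a small illustration of how such links tie documents together, the following Python sketch extracts the hyperlinks from a piece of HTML using only the standard library; the HTML snippet and URL are invented for the example.

    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":                   # anchor elements carry the links
                for name, value in attrs:
                    if name == "href":
                        self.links.append(value)

    page = '<p>See the <a href="http://example.org/next.html">next document</a>.</p>'
    parser = LinkExtractor()
    parser.feed(page)
    print(parser.links)                      # ['http://example.org/next.html']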

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index the stored information. (For regularly updated lists of the 100 most popular words that people are entering into search engines, click here). No search engine can retrieve all information on the whole World Wide Web; every search engine covers just a small part of it.

Among other things, that is the reason why the World Wide Web is not simply one huge database, as is sometimes said: it lacks consistency. It is true that there is virtually infinite storage capacity on the Internet, a capacity which might also become an almost everlasting one, a prospect which is consoling but threatening at the same time.

According to the Internet Domain Survey of the Internet Software Consortium, the number of Internet host computers is growing rapidly. In October 1969 the first two computers were connected; this number grew to 376,000 in January 1991 and to 72,398,092 in January 2000.

World Wide Web History Project, http://www.webhistory.org/home.html

http://www.searchwords.com/
http://www.islandnet.com/deathnet/
http://www.salonmagazine.com/21st/feature/199...
INDEXCARD, 24/40
 
Network Information Center (NIC)

Network information centers are organizations responsible for registering and maintaining the domain names on the World Wide Web. Until competition in domain name registration was introduced, they were the only ones responsible. Most countries have their own network information center.

INDEXCARD, 25/40
 
Black Propaganda

Black propaganda does not disclose its source. The recipient cannot find out the real source; rather, he or she is likely to get a wrong idea about the sender. This makes black propaganda very useful for driving a wedge between two allies.

INDEXCARD, 26/40
 
Microsoft Corporation

Founded by Bill Gates and Paul Allen and headquartered in Redmond, USA, Microsoft Corporation is today's world-leading developer of personal-computer software systems and applications. Like MS-DOS, the first operating system released by Microsoft, its successor Windows has become the de facto standard operating system for personal computers. According to critics, and following a recent court ruling, this is due to unfair competition.

http://www.microsoft.com

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/4/0,5716,1524+1+1522,00.html

INDEXCARD, 27/40
 
CNN

CNN is a U.S. TV enterprise, probably the world's most famous one. Its name has become a symbol for the mass media, but also a symbol of a power that can decide which news is important for the world and which is not worth talking about. Every message that is published on CNN goes around the world. The Gulf War has been the best example of this so far, when a CNN reporter was the one person doing the countdown to a war. The moments when he stood on the roof of a hotel in Baghdad, surrounded by green flashes, went around the world.

INDEXCARD, 28/40
 
Napoleon

Napoleon I (1769-1821) was Emperor of the French from 1804 to 1815.
He is regarded as the master of propaganda and disinformation of his time. Not only did he play his game with his own people but also with all European nations. And it worked as long as he managed to keep up his propaganda and the image of the winner.
Part of his almost commercial-style advertising was that the "N" of his name was painted everywhere.
Napoleon understood that people believe what they want to believe - and he gave them images and stories to believe. He was extraordinarily good at black propaganda.
Censorship was an element of his politics, accompanied by a tremendous amount of positive images about himself.
But his enemies - like the British - used him as a negative image, the reincarnation of evil (a strategy still very popular during the Gulf War and the Kosovo War) (see Taylor, Munitions of the Mind, p. 156/157).

INDEXCARD, 29/40
 
Fuzzy logic

A superset of Boolean logic (George Boole), introduced by Lotfi Zadeh in the 1960s as a means to model the uncertainty of natural language. Fuzzy logic is a type of logic that recognizes more than simple true and false values. It represents a departure from classical two-valued sets and logic, using "soft" linguistic system variables (e.g. large, small, hot, cold, warm) and a continuous range of truth values in the interval [0,1], rather than strict binary (true or false) decisions and assignments.
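
A minimal Python sketch of the idea; the temperature thresholds defining "warm" are arbitrary assumptions, while min, max and 1 - x are Zadeh's classical fuzzy operators for AND, OR and NOT.

    def warm(temp_c):
        """Degree to which a temperature counts as 'warm' (0 = not at all, 1 = fully)."""
        if temp_c <= 15:
            return 0.0
        if temp_c >= 25:
            return 1.0
        return (temp_c - 15) / 10            # linear ramp between 15 and 25 degrees

    # Zadeh's classical fuzzy operators.
    fuzzy_and = min
    fuzzy_or = max
    fuzzy_not = lambda x: 1 - x

    t = 21
    print(warm(t))                           # 0.6  -> 'fairly warm'
    print(fuzzy_not(warm(t)))                # 0.4
    print(fuzzy_and(warm(t), 0.8))           # 0.6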

INDEXCARD, 30/40
 
Ron Rivest

Ronald L. Rivest is Webster Professor of Electrical Engineering and Computer Science in MIT's EECS Department. He was one of the three inventors of the RSA public-key cryptosystem; his co-authors were Adi Shamir and Leonard M. Adleman.
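
A toy illustration of the RSA idea in Python, using the familiar textbook-sized primes 61 and 53; real keys are hundreds of digits long, so this is a sketch of the arithmetic, not a usable cryptosystem.

    # Toy RSA with deliberately tiny primes.
    p, q = 61, 53
    n = p * q                        # 3233, the public modulus
    phi = (p - 1) * (q - 1)          # 3120
    e = 17                           # public exponent, coprime to phi
    d = pow(e, -1, phi)              # 2753, private exponent (modular inverse, Python 3.8+)

    message = 65
    ciphertext = pow(message, e, n)      # 2790
    recovered = pow(ciphertext, d, n)    # 65

    print(d, ciphertext, recovered)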

INDEXCARD, 31/40
 
Medieval universities and copying of books

The first of the great medieval universities was established at Bologna. At the beginning, universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify one for an invitation to a university. Holding a lecture amounted to reading a book aloud, much like a priest reading from the Bible during services. Attending a lecture amounted to copying the lecture word by word, so that you had your own copy of the book, which in turn enabled you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

INDEXCARD, 32/40
 
David Kahn

David Kahn can be considered one of the most important historians of cryptography. His book The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet (revised edition published in 1996) is considered the most important work on the history of cryptography.

INDEXCARD, 33/40
 
Terrestrial antennas

Microwave transmission systems based on terrestrial antennas are similar to satellite transmission systems. Providing reliable high-speed access, they are used for cellular phone networks.

The implementation of the Wireless Application Protocol (WAP) makes wireless access to Internet services such as e-mail and even the World Wide Web via cellular phones convenient. Microwave transmission systems are therefore becoming increasingly important.

INDEXCARD, 34/40
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.

In digital systems, bandwidth is data speed in bits per second (bps).

Source: Whatis.com
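
As a back-of-the-envelope illustration of that proportionality (not part of the quoted definition), the following Python sketch assumes a 56 kbps line and rough file sizes chosen only for the example.

    # Illustrative only: assumed file sizes and an assumed 56 kbps connection.
    def seconds_to_download(size_bytes, bits_per_second):
        return size_bytes * 8 / bits_per_second

    line_speed = 56_000                                  # 56 kbps modem, assumed
    print(seconds_to_download(2_000, line_speed))        # text page (~2 KB): ~0.3 s
    print(seconds_to_download(500_000, line_speed))      # photo (~500 KB): ~71 s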

INDEXCARD, 35/40
 
International Cable Protection Committee (ICPC)

The ICPC aims at reducing the number of incidents of damage to submarine telecommunications cables caused by hazards.

The Committee also serves as a forum for the exchange of technical and legal information pertaining to submarine cable protection methods and programs, and it funds projects and programs which are beneficial for the protection of submarine cables.

Membership is restricted to authorities (governmental administrations or commercial companies) owning or operating submarine telecommunications cables. As of May 1999, the Committee had 67 members representing 38 nations.

http://www.iscpc.org

INDEXCARD, 36/40
 
Internet Relay Chat (IRC)

IRC is a text-based chat system used for live group discussions.

For a history of IRC see Charles A. Gimon, IRC: The Net in Realtime, http://www.skypoint.com/~gimonca/irc2.html

INDEXCARD, 37/40
 
Transmission Control Protocol/Internet Protocol (TCP/IP)

TCP and IP are the two most important communication protocols and standards of the Internet. TCP provides a reliable message-transmission service; IP is the key protocol specifying how packets are routed around the Internet.
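
A minimal sketch of TCP's reliable byte-stream service using Python's standard socket module; the host "example.org", port 80 and the HTTP request are placeholders chosen for illustration.

    import socket

    # TCP delivers these bytes reliably and in order; IP underneath decides
    # how the individual packets are routed.
    with socket.create_connection(("example.org", 80), timeout=10) as conn:
        conn.sendall(b"HEAD / HTTP/1.0\r\nHost: example.org\r\n\r\n")
        reply = conn.recv(4096)

    print(reply[:80])        # first bytes of the server's answer, e.g. the status line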

More detailed information can be found here

http://www.anu.edu/people/Roger.Clarke/II/Pri...
INDEXCARD, 38/40
 
Adolf Hitler

Adolf Hitler (1889-1945) was the head of the NSDAP, the National Socialist German Workers' Party. Originally coming from Austria, he started his political career in Germany. As Reichskanzler of Germany he provoked World War II. His hatred of all non-Aryans and of people who thought differently killed millions of human beings. Disinformation about his personality and an unbelievable machinery of propaganda made an entire people close its eyes to the cruelest crimes against humankind.

INDEXCARD, 39/40
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 40/40