some essential definitions

Some essential definitions in the field of cryptography are:
- cryptanalysis
- cryptology
- ciphers

"Few false ideas have more firmly gripped the minds of so many intelligent men than the one that, if they just tried, they could invent a cipher that no one could break." (David Kahn)

codes
plaintext
ciphertext
to encipher/encode
to decipher/decode

The variants of encryption systems are endless.
Deciphering always involves the same game of trial and error: first guessing the encryption method, then the key. Pruning the space of candidates helps. Sooner or later, every code or cipher breaks. Monoalphabetic ciphers can be broken easily and of course are no longer used today except for games.
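Why do monoalphabetic ciphers break so easily? They preserve letter frequencies, so the statistics of the underlying language leak through. The following is a minimal sketch, using the simplest monoalphabetic cipher (the Caesar shift) and a chi-squared comparison against English letter frequencies; the frequency table is approximate and the example is illustrative only:

```python
from string import ascii_lowercase as ABC

# Approximate English letter frequencies in percent, for a..z.
FREQ = [8.2, 1.5, 2.8, 4.3, 12.7, 2.2, 2.0, 6.1, 7.0, 0.15, 0.77, 4.0,
        2.4, 6.7, 7.5, 1.9, 0.095, 6.0, 6.3, 9.1, 2.8, 0.98, 2.4,
        0.15, 2.0, 0.074]

def shift(text, k):
    """Shift every letter of `text` by `k` places; leave other chars alone."""
    return ''.join(ABC[(ABC.index(c) + k) % 26] if c in ABC else c
                   for c in text)

def chi_squared(text):
    """How badly the letter counts of `text` deviate from English."""
    letters = [c for c in text if c in ABC]
    n = len(letters)
    return sum((letters.count(c) - n * f / 100) ** 2 / (n * f / 100)
               for c, f in zip(ABC, FREQ))

def break_caesar(ciphertext):
    """Try all 26 shifts; keep the candidate that looks most like English."""
    return min((shift(ciphertext, k) for k in range(26)), key=chi_squared)

ciphertext = shift("attack at dawn", 3)   # -> "dwwdfn dw gdzq"
print(break_caesar(ciphertext))           # -> "attack at dawn"
```

With only 26 keys and tell-tale letter statistics, the cipher falls to a few lines of trial and error, which is exactly the "game" described above.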

For further information on codes and ciphers etc. see:
http://www.optonline.com/comptons/ceo/01004A.html
http://www.ridex.co.uk/cryptology/#_Toc439908851

TEXTBLOCK 1/20 // URL: http://world-information.org/wio/infostructure/100437611776/100438659070
 
1970s: Computer-Integrated Manufacturing (CIM)

Since the 1970s there has been a growing trend towards the use of computer programs in manufacturing companies. Especially functions related to design and production, but also business functions, were to be facilitated through the use of computers.

Accordingly, CAD/CAM technology, concerned with the use of computer systems for design and production, was developed. CAD (computer-aided design) was created to assist in the creation, modification, analysis, and optimization of designs. CAM (computer-aided manufacturing) was designed to help with the planning, control, and management of production operations. Since the 1970s, CAD/CAM technology has been applied in many industries, including machined components, electronics products, and equipment design and fabrication for chemical processing.

To enable a more comprehensive use of computers in firms, CIM (computer-integrated manufacturing) technology, which also includes applications concerning the business functions of companies, was created. CIM systems can handle order entry, cost accounting, customer billing, and employee time records and payroll. The scope of CIM technology includes all activities that are concerned with production. Therefore, in many ways, CIM represents the highest level of automation in manufacturing.

TEXTBLOCK 2/20 // URL: http://world-information.org/wio/infostructure/100437611663/100438659495
 
Advertising and the Content Industry - The Coca-Cola Case

Attempts to dictate their rules to the media have become a common practice among marketers and the advertising industry. As in the Chrysler case, where the company demanded that magazines give advance notice about controversial articles, recent attempts to put pressure on content providers have been pursued by the Coca-Cola Company.

According to a memo published by the New York Post, Coca-Cola demands a free ad from any publication that publishes a Coke ad adjacent to stories on religion, politics, disease, sex, food, drugs, environmental issues, health, or stories that employ vulgar language. "Inappropriate editorial matter" will result in the publisher being liable for a "full make good," said the memo by Coke advertising agency McCann-Erickson. Asked about this practice, a Coke spokesperson said the policy has long been in effect.

(Source: Odwyerpr.com: Coke Dictates nearby Editorial. http://www.odwyerpr.com)

TEXTBLOCK 3/20 // URL: http://world-information.org/wio/infostructure/100437611652/100438657998
 
biotechnology introduction

One of the most critical trends in western culture is what might be called the "fusion of flesh and machine". Increasingly, technological artifacts such as computers, rather than being used as tools by people, are functioning as parts of the human organism. On the other hand, human functionalities such as intelligence, emotion, adaptability or reproductivity are integrated into technological artifacts: the days when computers were only able to count apples and pears and their intelligence did not even match an insect's are rapidly becoming history. Today, the boundaries between organisms and technology are losing their significance.

As new technologies are no longer mere instruments, or "extensions" of the organism, they acquire the capability of modifying the human organism - body and mind - from within according to certain pre-established principles. The history of the evolution of the human species is hastily being re-written as artificial beings begin to mock the categories of evolution and seem to work their way towards historical subjectivity. The German philosopher Günther Anders, who has extensively reflected on the changes of the human condition provoked by the development of modern technology, speaks of an "antiquatedness of history" at a time when technology itself becomes a historical subject and men become merely co-historical.

However, the softening of the biological definition of the human race is a theme which has accompanied western thinking ever since its origins. Beings half man, half animal crowd the tales of classical mythology and transcend the boundary of the human from below, while divine creatures, temporarily occupying humanoid bodies, relativise humanness from "above". What exactly "being human" means and how "human beings" can be defined is a question with a long history. "Der Mensch ist das nicht festgestellte Thier" (man is the as yet undetermined animal), as Nietzsche commented.

Just as the boundaries between human and non-human are being crossed by technological development, so also the boundaries between the classical epistemes are becoming permeable. Psychology is occupying itself with the emotions of machines, while physics and cybernetics are applied to the human mind and body. The "nicht festgestellte" character of humanness has meant that imagination has become just as relevant a factor in understanding humanness as science. Science fiction as a literary genre is no longer merely a depository of fantasies about a technicized world out of control. As the human monopoly on history seems to dissolve, the baroque narratives of science fiction have also become a resource for understanding history.

However, it is evident that the potentials of the new technologies give rise not only to wild hopes and dreams and to bizarre stories; they also harbour some real ambiguities of a political and ethical nature. Not only does the merging of previously distinct realities - the human and the non-human worlds - unhinge theories and institutions based upon a clear distinction and separation of the two, it is also placing political practice on a different footing. Will artificial life have rights? Will artificial entities have political power? How will social relationships change?

TEXTBLOCK 4/20 // URL: http://world-information.org/wio/infostructure/100437611777/100438658433
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the forerunner of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock went on to work on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stressing that measurement mattered for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT & T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that one might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too - the corporate networks, the Intranets - because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services.

They compare the public Internet to private line networks and offer interesting findings. The public Internet is currently far smaller, in both capacity and traffic, than the switched voice network (with an effective bandwidth of 75 Gbps at December 1997), and the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U. S. (with an effective bandwidth of about 330 Gbps at December 1997), though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications, because both consume more bandwidth and cause unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one more flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
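The sampling arithmetic behind such a survey is straightforward. The sketch below is a toy illustration of the projection step only, not ISC's actual code: the `responds` function is a hypothetical stand-in for a real ICMP echo (ping), and the address population is synthetic.

```python
import random

def estimate_pingable_hosts(addresses, responds, sample_frac=0.01, seed=0):
    """Ping a random sample of assigned addresses and project the
    observed reply rate onto the whole population."""
    random.seed(seed)  # fixed seed keeps the sketch reproducible
    sample = random.sample(addresses, max(1, int(len(addresses) * sample_frac)))
    replies = sum(1 for addr in sample if responds(addr))
    return round(replies / len(sample) * len(addresses))

# Synthetic population: 100,000 assigned addresses, of which roughly
# three quarters answer a ping in this simulation.
addresses = list(range(100_000))
answers = lambda addr: addr % 4 != 0
print(estimate_pingable_hosts(addresses, answers))  # close to 75,000
```

The small sample keeps the work manageable, at the price of a sampling error that grows as the sample shrinks, which is one reason the resulting counts deserve the skepticism voiced above.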

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the name Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare response times to past ones and to the response times of servers within the same reach.
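The exact formula behind such a 0-100 scale is not published; the mapping below is a guessed illustration of the comparison step described above (current ping time against a server's own past average), not the actual index used by any of these services.

```python
def traffic_index(current_ms, past_avg_ms):
    """Rate one server on a 0-100 scale by comparing its current ping
    time to its past average (100 = fast and reliable, 0 = no reply).
    The formula is a guessed illustration, not a published index."""
    if current_ms is None:            # packet lost: worst possible rating
        return 0
    ratio = past_avg_ms / current_ms  # > 1 means faster than usual
    return max(0, min(100, round(50 * ratio)))

print(traffic_index(100, 100))   # as fast as usual -> 50
print(traffic_index(50, 100))    # twice as fast    -> 100
print(traffic_index(None, 100))  # no reply         -> 0
```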

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
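The inflation is simple arithmetic; the toy count below (the function name is ours, purely illustrative) just restates it:

```python
def hits_per_visit(text_files, image_files):
    """Every requested file counts as one 'hit', so one page view
    produces as many hits as the page has component files."""
    return text_files + image_files

# One HTML document plus twenty graphics: twenty-one hits per visit.
print(hits_per_visit(1, 20))  # -> 21
```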
In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. A user might share a computer, and the corresponding IP address and host name, with others, or might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may open a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but just slightly better at best, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, e.g., establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 5/20 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three of them show a similar design: content is split up and spread over several servers, and when a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
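The split-and-reassemble design can be sketched in a few lines. This is a deliberately naive illustration of the idea only, not any of the actual protocols: the real systems add encryption and redundancy (Publius, for instance, also splits the decryption key among the servers), so that no single server operator can read, or delete, the whole content.

```python
def split(content, n):
    """Split content into n pieces, one to be stored on each server."""
    size = -(-len(content) // n)  # ceiling division: bytes per piece
    return [content[i * size:(i + 1) * size] for i in range(n)]

def reassemble(pieces):
    """Fetch the pieces back from the servers and join them."""
    return b"".join(pieces)

pieces = split(b"an uncensorable document", 4)
print(pieces)              # four pieces of six bytes each
print(reassemble(pieces))  # -> b'an uncensorable document'
```

A censor must now suppress several servers, possibly in several jurisdictions, rather than one, which is the point of the design.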

The most advanced system seems to be Publius. Designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position of radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable? in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html .

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 6/20 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
The Role of the Media

To be able to participate in community life and make political choices, citizens heavily rely on information. They need to know what is going on and what options they should weigh, debate and act upon. An essential element of a functioning public sphere therefore is information.

Whereas formerly communication mostly happened on a face-to-face basis, in large and complex societies (mass) media have evolved as the principal source of information. They act as a transport medium for the information necessary for a citizen's participation in the public sphere. Ideally there should be a wide range of media that represent the diverse opinions and viewpoints on issues of public interest existing in a society and which are independent of the state and of society's dominant economic forces.

TEXTBLOCK 7/20 // URL: http://world-information.org/wio/infostructure/100437611734/100438658499
 
Advertising and the Media System

Media systems (especially broadcasting) can be classified in two different types:

Public Media Systems: Government control over broadcasting through ownership, regulation, and partial funding of public broadcasting services.

Private Media System: Ownership and control lies in the hands of private companies and shareholders.

Both systems can exist in various forms, according to the degree of control by governments and private companies, with mixed systems (public and private) as the third main kind.

Whereas public media systems are usually at least partially funded by governments, private broadcasting relies solely on advertising revenue. Still, public media systems too cannot exclude advertising as a source of revenue. Therefore both types are to a certain degree dependent on money coming in from advertisers.

And this has consequences for the content provided by the media. As attracting advertisers becomes critically important, the interests of the advertising industry frequently play a dominant role in the structuring of content, and the creation of environments within the media that are favorable for advertising goods and services becomes more and more common.

TEXTBLOCK 8/20 // URL: http://world-information.org/wio/infostructure/100437611652/100438657942
 
The Advertising Industry

The advertising industry is dominated by three huge advertising networks, which offer their services throughout the world. The gross income of the three leading agencies is twice that of the agencies ranked four to ten.

Table: World's Top 10 Advertising Organizations 1999

(figures in millions of U.S. dollars)

Rank 1999 | Advertising Organization  | Headquarters             | World-Wide Gross Income 1999
1         | Omnicom                   | New York, USA            | $ 5,743.4
2         | Interpublic Group of Cos. | New York, USA            | $ 5,079.3
3         | WPP Group                 | London, UK               | $ 4,819.3
4         | Havas Advertising         | Levallois-Perret, France | $ 2,385.1
5         | Dentsu                    | Tokyo, Japan             | $ 2,106.8
6         | B Com3 Group              | Chicago, USA             | $ 1,933.8
7         | Young & Rubicam Inc.      | New York, USA            | $ 1,870.1
8         | Grey Advertising          | New York, USA            | $ 1,577.9
9         | True North                | Chicago, USA             | $ 1,489.2
10        | Publicis SA               | Paris, France            | $ 1,434.6



Table: Top 10 Global Marketers 1998

(figures in millions of U.S. dollars)

Rank 1998 | Advertiser           | Headquarters                      | World-Wide Media Spending 1998
1         | Procter & Gamble Co. | Cincinnati (US)                   | $ 4,747.6
2         | Unilever             | Rotterdam (NL)/London (UK)        | $ 3,428.5
3         | General Motors Corp. | Detroit (US)                      | $ 3,193.5
4         | Ford Motor Co.       | Dearborn (US)                     | $ 2,229.5
5         | Philip Morris Cos.   | New York (US)                     | $ 1,980.3
6         | Daimler Chrysler     | Stuttgart (GER)/Auburn Hills (US) | $ 1,922.2
7         | Nestle               | Vevey (SUI)                       | $ 1,833.0
8         | Toyota Motor Corp.   | Toyota City (JP)                  | $ 1,692.4
9         | Sony Corp.           | Tokyo (JP)                        | $ 1,337.7
10        | Coca-Cola Co.        | Atlanta (US)                      | $ 1,327.3



On the other hand, the three biggest advertisers spend only about US$ 2 million less than the advertisers ranked four to ten together. Whereas money spent on advertising in traditional media comes from very diverse categories, it is mainly companies offering computer hardware and software, peripherals or Internet services that pay for on-line advertisements.

Table: Top 10 Internet Advertisers 1998

(figures in millions of U.S. dollars)

Rank 1998 | Advertiser            | Internet Spending 1998 | 1998 - 1997 % Change
1         | Microsoft Corp.       | $ 34.9                 | 9.4
2         | IBM Corp.             | $ 28.5                 | 58.6
3         | Compaq Computer Corp. | $ 16.2                 | 169.8
4         | General Motors Corp.  | $ 12.7                 | 84.8
5         | Excite                | $ 12.4                 | 1.5
6         | Infoseek Corp.        | $ 9.3                  | 22.3
7         | AT&T Corp.            | $ 9.3                  | 43.5
8         | Ford Motor Co.        | $ 8.6                  | 46.7
9         | Hewlett-Packard Co.   | $ 8.1                  | 102.9
10        | Barnes & Noble        | $ 7.6                  | 280.2



Source: Advertising Age

TEXTBLOCK 9/20 // URL: http://world-information.org/wio/infostructure/100437611652/100438657954
 
Virtual cartels, introduction

Among the most striking developments of the 1990s has been the emergence of a global commercial media market utilizing new technologies and the global trend toward deregulation.
This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market.

"A global oligopolistic market that covers the spectrum of media is now crystallizing, with very high barriers to entry."

(Robert McChesney, author of "Rich Media, Poor Democracy")

The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks tolerates only a small number of large players. Mergers, strategic alliances, partnerships and cooperation agreements are therefore the daily routine of the ICT business. They bypass competition and create "virtual cartels".

TEXTBLOCK 10/20 // URL: http://world-information.org/wio/infostructure/100437611709/100438658911
 
Timeline 1900-1970 AD

1913 the wheel cipher gets re-invented as a strip

1917 William Frederick Friedman starts working as a cryptanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptanalysis

- an AT&T-employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random-keys

1918 the Germans start using the ADFGVX-system, which is later broken by the Frenchman Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German Military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine for the German Military

late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizebeth Smith Friedman regularly deciphers the codes of rum-smugglers during Prohibition

1929 Lester S. Hill publishes his paper Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken by the Pole Marian Rejewski and, from 1939 on, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese invent their so-called Purple machine with the help of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they imagined their own codes to be unbreakable as well - and were not careful enough.

1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code breaking computer is put into action at Bletchley Park

1943-1980 the cryptographic Venona Project, done by the NSA, is taking place for a longer period than any other program of that type

1948 Claude Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops a system of separate public-keys and private-keys

TEXTBLOCK 11/20 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers which ridicules dreams of increased competition.

Recent mega-mergers and acquisitions include:

SBC Communications - Ameritech, $72.3 bn

Bell Atlantic - GTE, $71.3 bn

AT&T - MediaOne, $63.1 bn

AOL - Time Warner, $165 bn

MCI WorldCom - Sprint, $129 bn

The total value of all major mergers since the beginning of the 1990s has been 20 trillion dollars, 2.5 times the size of the USA's GDP.

The AOL - Time Warner merger reflects a trend which can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance towards complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 12/20 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
The Privatization of Censorship

According to a still widely held conviction, the global data networks constitute the long desired arena for uncensorable expression. This much is true: because of the Net it has become increasingly difficult to sustain cultural and legal standards. Geographical proximity and territorial boundaries prove to be less relevant when it makes no difference to a document's availability whether it is stored on your desktop or on a host thousands of kilometers away. There is no international agreement on non-prohibited contents, so human rights organizations and nazi groups alike can bypass restrictions. No single authority or organization can impose its rules and standards on all others. This is why the Net is public space, a political arena where free expression is possible.

This freedom is conditioned by the design of the Net. But the Net's design is not a given, as Lawrence Lessig reminds us. Originally the design of the Net allowed a relatively high degree of privacy and communication was not controlled directly. But now this design is changing and this invisible agora in electronic space is endangered. Governments - even elected ones - and corporations introduce new technologies that allow us to be identified, monitored and tracked, that identify and block content, and that can allow our behaviour to be efficiently controlled.

When the World Wide Web was introduced, small independent media and human rights organizations soon began to use the new platform to draw worldwide attention to their publications and causes. It seemed to be the dawning of a new era, with authoritarian regimes and multinational media corporations on the losing side. But now the Net's design is changing according to their needs.

"In every context that it can, the entertaining industry is trying to force the Internet into its own business model: the perfect control of content. From music (fighting MP3) and film (fighting the portability of DVD) to television, the industry is resisting the Net's original design. It was about the free flow of content; Hollywood wants perfect control instead" (Lawrence Lessig, Cyberspace Prosecutor, in: The Industry Standard, February 2000).

In the United States, Hollywood and AT&T, which after its merger with MediaOne became the biggest US cable service provider, are returning to their positions of the Seventies: the control of content and infrastructure. If most people come to access the Net via set-top boxes connected to a TV set, it will become a kind of television, at least in the USA.

For small independent media it will become very hard to be heard, especially for those offering streaming video and music. Increasingly fast data transmission applies to download capacities only; upload capacities are much lower, on average about eight times lower than download capacities. As an AT&T executive said in response to criticism: "We haven't built a 56 billion dollar cable network to have the blood sucked from our veins" (Lawrence Lessig, The Law in the Code: How the Net is Regulated, Lecture at the Institute for Human Sciences, Vienna, May 29th, 2000).

Consumers, not producers, are preferred.

For corporations, what remains to be done to control the Net is mainly to cope with the fact that because of the Net it has become increasingly difficult to sustain cultural and legal standards. On November 11, 1995 the German prosecuting attorney's office searched CompuServe Germany, the branch of an international Internet service provider, because the company was suspected of having offered access to child pornography. Consequently CompuServe blocked access to more than 200 newsgroups, all containing "sex" or "gay" in their names, for all its customers. But a few days later, instructions for accessing these blocked newsgroups via CompuServe came into circulation. On February 26, 1997, Felix Somm, the Chief Executive Officer of CompuServe Germany, was accused of complicity in the distribution of child and animal pornography in newsgroups. In May 1998 he received a two-year prison sentence, suspended against a payment of about 51,000 Euro. The sentence was justified by pointing to the fact that CompuServe Germany offered access to its US parent company's servers hosting child pornography. Felix Somm was held responsible for access to forbidden content he could not know of.

Also in 1995, as an attack on US Vice-President Al Gore's intention to supply all public schools with Internet access, Republican Senator Charles Grassley warned of the dangers lurking for children on the Net. Referring to a Time magazine cover story on pornography on the Net by Philip Elmer-Dewitt from July 3, he pointed out that 83.5% of all images online are pornographic. But Elmer-Dewitt was wrong. Obviously unaware of the difference between Bulletin Board Systems and the Net, he referred misleadingly to Marty Rimm's article Marketing Pornography on the Information Superhighway, published in the prestigious Georgetown Law Journal (vol. 83, June 1995, pp. 1849-1935). Rimm knew of this difference, of course, and stated it clearly. (For further information see Hoffman & Novak, The Cyberporn Debate, http://ecommerce.vanderbilt.edu/cyberporn.debate.html and Franz Wegener, Cyberpornographie: Chronologie einer Hexenjagd, http://www.intro-online.de/c6.html)

Almost inevitably, anxieties accompany the introduction of new technologies. In the 19th century it was said that traveling by train was bad for the health. The debate produced by Time magazine's cover story and Senator Grassley's attack created the impression that the Net has multiplied the possible dangers for children. The global communication networks seem to be an inexhaustible source of mushrooming child pornography. Later, would-be bomb recipes found on the Net added to the already prevailing anxieties. As even in industrialized countries most people still have little or no first-hand experience with the Net, anxieties about child pornography or terrorist attacks can be stirred up and exploited easily.

A similar and related debate is going on about the glorification of violence and erotic depictions in the media. Pointing to a "toxic popular culture" shaped by media that "distort children's view of reality and even undermine their character growth", US right-wing social welfare organizations and think tanks call for strong media censorship. (See An Appeal to Hollywood, http://www.media-appeal.org/appeal.htm) Media, especially films and videos, are already censored and rated, so what is wanted is more censorship.

The intentions behind stimulating a debate on child pornography on the Net were manifold: inter alia, it served the Republican Party to attack Democrat Al Gore's initiative to supply all public schools with Internet access; additionally, the big media corporations realized that because of the Net they might have to face new competitors, and rushed to press for content regulation. Taking all these intentions together, we can say that this still ongoing debate constitutes the first and best known attempt to impose content regulation on the Net. Consequently, at least in Western countries, governments and media corporations refer to child pornography to justify legal requirements and the implementation of technologies for the surveillance and monitoring of individuals, the filtering, rating and blocking of content, and the prohibition of anonymous publishing on the Net.

In the name of "cleaning" the Net of child pornography, our basic rights are restricted. It may even seem that it is the insistence on unrestricted basic rights that needs to be justified.

Underlying the campaign to control the Net are several assumptions, inter alia: the Net lacks control and needs to be made safe and secure; we may be exposed inadvertently to pornographic content; this content is harmful to children. Remarkably, racism does not seem to be an issue.

The Net, especially the World Wide Web, is not like television (although it is to be feared that this is what it might become within the next years). Say little Mary types "Barbie" into a search engine. It is true, sometimes you might have the opportunity to see that pornography is just a few mouse clicks away, but it is not likely that you will be exposed to pornographic content unless you make deliberate mouse clicks.

In reaction to these anxieties, but in the absence of data on how children actually use the Internet, the US government passed the Communications Decency Act (CDA) in 1996. In consequence the Electronic Frontier Foundation (EFF) launched the famous Blue Ribbon Campaign, and, among others, America Online and Microsoft Corporation supported a lawsuit of the American Civil Liberties Union (ACLU) against this Act. On June 26, 1997, the US Supreme Court ruled the CDA unconstitutional under the provisions of the First Amendment to the Constitution: the Communications Decency Act violated the basic right to free expression. After a summit with the US government, industry leaders announced the use of existing rating and blocking systems and the development of new ones for "inappropriate" online resources.

So, after the failure of the CDA, the US government shifted its responsibility to the industry by inviting corporations to take on governmental tasks. Bearing in mind the CompuServe case and its possible consequences, the industry welcomed this decision and was quick to call this newly assumed responsibility "self-regulation". Strictly speaking, "self-regulation" as meant by the industry does not amount to the regulation of the behaviour of corporations by themselves. On the contrary, "self-regulation" is to be understood as the regulation of users' behaviour through the rating, filtering and blocking of Internet content considered inappropriate. The Internet industry tries to show that technical solutions are more favourable than legislation and wants to make sure that it is not held responsible and liable for illegal, offensive or harmful content. A new CompuServe case and a new Communications Decency Act shall be averted.

In the Memorandum Self-regulation of Internet Content, released in late 1999 by the Bertelsmann Foundation, it is recommended that the Internet industry join forces with governmental institutions to enforce codes of conduct and encourage the implementation of filtering and rating systems. For further details on the Memorandum see the study by the Center for Democracy and Technology, An Analysis of the Bertelsmann Foundation Memorandum on Self-Regulation of Internet Content: Concerns from a User Empowerment Perspective.

In fact, the "self-regulation" of the Internet industry is privatized censorship performed by corporations and right-wing NGOs. Censorship has become a business. "Crucially, the lifting of restrictions on market competition hasn't advanced the cause of freedom of expression at all. On the contrary, the privatisation of cyberspace seems to be taking place alongside the introduction of heavy censorship." (Richard Barbrook and Andy Cameron, The Californian Ideology)

While trying to convince us that its technical solutions are appropriate alternatives to government regulation, the Internet industry cannot dispense with governmental backing to enforce the proposed measures. This adds to and reinforces the censorship measures already undertaken by governments. We are encouraged to use today's information and communication technologies, while the flow of information is restricted.

According to a report by Reporters Sans Frontières, quoted by Leonard R. Sussman in his essay Censor Dot Gov. The Internet and Press Freedom 2000, the following countries totally or largely control Internet access: Azerbaijan, Belarus, Burma, China, Cuba, Iran, Iraq, Kazakhstan, Kirghizstan, Libya, North Korea, Saudi Arabia, Sierra Leone, Sudan, Syria, Tajikistan, Tunisia, Turkmenistan, Uzbekistan, and Vietnam.

TEXTBLOCK 13/20 // URL: http://world-information.org/wio/infostructure/100437611742/100438658968
 
Sponsorship Models

With new sponsorship models being developed, even further influence over content from the corporate side can be expected. Co-operating with Barnes & Noble Booksellers, the bookish e-zine FEED for instance relies in part on sponsoring. Whenever a specific title is mentioned in the editorial, a link is placed in the margin - under the heading "Commerce" - to an appropriate page on Barnes & Noble. Steve Johnson, editor of FEED, says, "We do not take a cut of any merchandise sold through those links", but admits that the e-zine does indirectly profit from placing those links there.

TEXTBLOCK 14/20 // URL: http://world-information.org/wio/infostructure/100437611652/100438658034
 
Timeline BC

~ 1900 BC: Egyptian scribes use non-standard hieroglyphs in inscriptions of a royal tomb; supposedly this is not the first use of written cryptography, but it is the first documented example

~ 1500 BC: an enciphered formula for the production of pottery is recorded in Mesopotamia

parts of the Hebrew text of Jeremiah's words are written down in "atbash", which is nothing more than a reversed alphabet and one of the first famous methods of enciphering

4th century BC: Aeneas Tacticus invents a form of beacon signalling, introducing a sort of water-clock

~ 487 BC: the Spartans introduce the so-called "skytale" for sending short secret messages to and from the battlefield

~ 170 BC: Polybius develops a system to convert letters into numerical characters, an invention called the Polybius Chequerboard.
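The Polybius Chequerboard arranges the alphabet in a 5x5 grid so that each letter can be replaced by its row and column number. A minimal sketch in Python (illustrative, not from the source; it assumes the common convention of folding J into I so that 25 letters fit the grid):

```python
# 5x5 Polybius square: each letter becomes a row/column digit pair.
# Assumed convention: J is folded into I so 25 letters fit the 25 cells.
ALPHABET = "ABCDEFGHIKLMNOPQRSTUVWXYZ"  # note: no J

def polybius_encode(text):
    pairs = []
    for ch in text.upper().replace("J", "I"):
        if ch in ALPHABET:
            idx = ALPHABET.index(ch)
            row, col = idx // 5 + 1, idx % 5 + 1  # 1-based grid coordinates
            pairs.append(f"{row}{col}")
    return " ".join(pairs)

def polybius_decode(code):
    letters = []
    for pair in code.split():
        row, col = int(pair[0]), int(pair[1])
        letters.append(ALPHABET[(row - 1) * 5 + (col - 1)])
    return "".join(letters)
```

For example, `polybius_encode("ABE")` yields `"11 12 15"`, and decoding reverses the mapping (J, having been folded into I, comes back as I).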

50-60 BC: Julius Caesar develops an enciphering method, later called the Caesar Cipher, shifting each letter of the alphabet by a fixed amount. Like atbash, this is a monoalphabetic substitution.
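As an illustration of how simple monoalphabetic substitution is - which is also why such ciphers are so easy to break - a short Python sketch of both ciphers (illustrative code, not part of the source text):

```python
def caesar(text, shift):
    """Caesar Cipher: shift each letter of the alphabet by a fixed amount."""
    return "".join(
        chr((ord(ch) - 65 + shift) % 26 + 65) if ch.isalpha() else ch
        for ch in text.upper()
    )

def atbash(text):
    """Atbash: replace each letter by its mirror in a reversed alphabet."""
    return "".join(
        chr(90 - (ord(ch) - 65)) if ch.isalpha() else ch
        for ch in text.upper()
    )
```

Deciphering a Caesar message is just shifting back (`caesar(ciphertext, -shift)`), and atbash is its own inverse; the 26 possible shifts can simply be tried one by one, which is the trial-and-error game mentioned above.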

TEXTBLOCK 15/20 // URL: http://world-information.org/wio/infostructure/100437611776/100438659084
 
FREEnet (The Network for Research, Education and Engineering)

FREEnet is an academic and research network, interconnecting computer networks of research institutes of the Russian Academy of Sciences, universities, colleges, and other research and academic institutions. It was established in 1991 by the N.D. Zelinsky Institute of Organic Chemistry at the Center of Computer Assistance to Chemical Research. It provides its more than 350 members of the academic and educational community with all types of basic Internet services and various information services.

Strategies and Policies

FREEnet's general intention is to become a backbone infrastructure providing:

Open networking services for efficient access to the network and information resources located both in Russia and all over the Internet.

Reliable network connectivity for research, academic and educational communities in Russia and abroad.

Worldwide access to science and information resources of the Russian Academy of Sciences, universities and colleges in Russia.

Assistance to the progress of Russian based fundamental research.

Assistance to the development and application of modern information technologies in education.

TEXTBLOCK 16/20 // URL: http://world-information.org/wio/infostructure/100437611734/100438659253
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille: a card with holes, through which messages are written on paper. Afterwards he removes the card and fills in the blanks, so the message looks like an ordinary letter. The recipient needs to own an identical card

- Bishop John Wilkins invents a cryptologic system that looks like music notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken form of encryption that was already used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one gets suspicious that the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary scale, which, in much more advanced form, we still use today, for example in the ASCII code

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790s Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Samuel Morse develops the Morse code, which actually is not a code but an enciphered alphabet of short and long signals. The first Morse-code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes become absolutely necessary

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski publishes a cryptoanalysis of the Vigenère cipher, which had been supposed to be uncrackable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography again and makes them even more important

TEXTBLOCK 17/20 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
On-line Advertising Revenues

Although Internet advertising only really started in 1994, revenues have shown steady and fast growth. In 1997 US$ 906.5 million was spent on on-line advertising. Compared with advertising revenue for the television industry in equivalent dollars for its third year, the Internet was slightly ahead, at US$ 907 million compared to television's US$ 834 million. In 1998 on-line advertising grew by 112 percent to US$ 1.92 billion in revenues, and was on track to hit US$ 4 billion in 1999, which would put Internet advertising at about 2 percent of the US ad market.

Table: Spending on On-Line Advertising by Category (first quarter 1999)

Category              Percent
Consumer-related      27 %
Financial services    21 %
Computing             20 %
Retail/mail order     13 %
New media              8 %



Table: Types of On-Line Advertising (first quarter 1999)

Type of Advertising   Percent
Banners               58 %
Sponsorships          29 %
Interstitials          6 %
E-mail                 1 %
Others                 6 %



Source: Internet Advertising Bureau (IAB).

TEXTBLOCK 18/20 // URL: http://world-information.org/wio/infostructure/100437611652/100438657944
 
ZaMir.net

ZaMir.net started in 1992, trying to enable anti-war and human rights groups of former Yugoslavia to communicate with each other and co-ordinate their activities. Today there are an estimated 1,700 users in 5 different Bulletin Board Systems (Zagreb, Belgrade, Ljubljana, Sarajevo and Pristina). The Za-mir Transnational Network (ZTN) offers e-mail and conferences/newsgroups. The ZTN has its own conferences, which are exchanged between the 5 BBSs, and additionally offers more than 150 international conferences. ZTN's aim is to help set up systems in other cities in the post-Yugoslav countries that have difficulty connecting to the rest of the world.

History

With the war in Yugoslavia, anti-war and human rights groups of former Yugoslavia found it very difficult to organize and faced huge problems co-ordinating their activities due to immense communication difficulties. So in 1992 foreign peace groups, together with institutions in Ljubljana, Zagreb and Belgrade, launched the Communications Aid project. Modems were distributed to peace and anti-war groups in Ljubljana, Zagreb, Belgrade and Sarajevo, and a BBS (Bulletin Board System) was installed.

As after spring 1992 no direct connections could be made, connections were routed indirectly through Austria, Germany or Britain, which also enabled a link to the worldwide networks of BBSs. Nationalist dictators thereby lost their power to prevent their people from communicating. BBSs were installed in Zagreb and Belgrade and connected to the APC Network and associated networks. The Za-mir Transnational Network (ZTN) was born.

Strategies and Policies

With the help of ZaMir's e-mail network it has been possible to find and coordinate humanitarian aid for some of the many refugees of the war. It has become an important means of communication for humanitarian organizations working in the war region and sister organizations from other countries. It helps co-ordinate the work of activists from different countries of former Yugoslavia, and it also helps to coordinate the search for volunteers to aid in post-war reconstruction. ZTN also helped facilitate the exchange of information undistorted by government propaganda between Croatia, Serbia and Bosnia. Independent magazines like Arkzin (Croatia) and Vreme (Serbia) now publish electronic editions on ZTN.

TEXTBLOCK 19/20 // URL: http://world-information.org/wio/infostructure/100437611734/100438659208
 
Commercial vs. Independent Content

Commercial media aim at economies of scale and scope, with the goal of maximizing profits. As advertising money usually is their primary source of revenue, their content very often is attuned to meet the needs of advertisers and marketers. Information necessary for a citizen's participation in the public sphere usually plays only a minor role in their programming, as it does not comply with the demands of an economic system whose principal aim is the generation of profit. They also virtually always are structured in accord with, and help reinforce, society's defining hierarchical social relationships, and are generally controlled by and controlling of other major social institutions, particularly corporations.

Independent content providers, on the other hand, mostly act on a non-profit basis and try to avoid dependence on corporate powers and the state. One of their main concerns is the critical observation of public interest issues. The central aim of independent content providers' activities usually is to bring aspects and standpoints neglected by the (commercial) mainstream media to the public and to subvert society's defining hierarchical social relationships. Promoting public debate and an active civil society, they engage in the organization of alert actions and information campaigns or create subversive art.

TEXTBLOCK 20/20 // URL: http://world-information.org/wio/infostructure/100437611734/100438659280
 
Blaise Pascal

b. June 19, 1623, Clermont-Ferrand, France
d. August 19, 1662, Paris, France

French mathematician, physicist, religious philosopher, and master of prose. He laid the foundation for the modern theory of probabilities, formulated what came to be known as Pascal's law of pressure, and propagated a religious doctrine that taught the experience of God through the heart rather than through reason. The establishment of his principle of intuitionism had an impact on such later philosophers as Jean-Jacques Rousseau and Henri Bergson and also on the Existentialists.

INDEXCARD, 1/25
 
Günther Anders

Born in Germany in 1902, Günther Anders spent the Nazi period exiled in the USA, where he stayed until 1950. His chief work Die Antiquiertheit des Menschen is an extensive analysis of human existence in a technicized world. Among Anders' most outstanding theses is the concept of a permanent gap between the potential of technical artefacts and the human mind's power to imagine the consequences of technology. Humans think of themselves as "antiquated" in comparison to their artefacts and feel "promethean embarrassment". Anders was among the first thinkers to react to the Holocaust and the dropping of the atomic bomb. The ethical quandaries resulting from the latter are documented in an exchange of letters between Anders and Claude Eatherly, the pilot of the Hiroshima plane, in Burning Conscience.

INDEXCARD, 2/25
 
Nadia Thalmann

Nadia Thalmann is director of MIRALab at the University of Geneva, Switzerland. Thalmann has become known as the creator of "virtual Marilyn", an installation which allowed visitors literally to slip into Marilyn's shoes. Thalmann's work is located at the interface between science and art. It is about modelling human bodies for scientific and creative purposes, e.g. as virtual actors in movies. Thalmann insists that artificial beings must be beautiful, in addition to being useful, as we will be living with them at close quarters.

INDEXCARD, 3/25
 
Seagram Company Ltd.

Seagram is the largest producer and marketer of distilled spirits in the world. It is headquartered in Montreal, Que. The company began when Distillers Corp., Ltd., a Montreal distillery owned by Samuel Bronfman, acquired Joseph E. Seagram & Sons in 1928. Under the leadership of the founder's son, Edgar M. Bronfman, who became head of the company in 1971, the firm diversified during the 1950s and '60s from its original base of blended whiskies into the production and marketing of scotch, bourbon, rum, vodka, gin, and many different wines. It also expanded into the European, Latin American, East Asian, and African markets with its products. The company adopted its present name in 1975. It produces more than 400 different brands of distilled spirits and wines. Edgar M. Bronfman, Jr., took over as head of the company in 1989. Seagram in 1995 purchased MCA Inc., a media and entertainment firm, from the Matsushita Electric Industrial Company.

INDEXCARD, 4/25
 
Mark

A mark (trademark or service mark) is "... a sign, or a combination of signs, capable of distinguishing the goods or services of one undertaking from those of other undertakings. The sign may particularly consist of one or more distinctive words, letters, numbers, drawings or pictures, emblems, colors or combinations of colors, or may be three-dimensional..." (WIPO) To be protected, a mark must be registered in a government office; the duration of protection is generally limited in time but can be renewed periodically (usually every 10 years).

INDEXCARD, 5/25
 
Internet Protocol Number (IP Number)

Every computer using TCP/IP has a 32-bit Internet address, an IP number. This number consists of a network identifier and a host identifier. The network identifier is registered at and allocated by a Network Information Center (NIC); the host identifier is allocated by the local network administration.

IP numbers are divided into three classes. Class A is reserved for big organizations, Class B for medium-sized ones such as universities, and Class C is dedicated to small networks.

Because of the increasing number of networks worldwide, networks that belong together, such as LANs forming a corporate network, are allocated a single IP number.
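The classful scheme described above can be read directly off the first octet of a dotted-quad address. A hypothetical sketch in Python (the function name and the network/host split shown are illustrative, not part of any standard library):

```python
def classify_ip(address):
    """Return the class and the network/host octet split of a classful IPv4 address."""
    octets = address.split(".")
    first = int(octets[0])
    if first < 128:        # Class A: 1 network octet, 3 host octets
        return "A", octets[:1], octets[1:]
    elif first < 192:      # Class B: 2 network octets, 2 host octets
        return "B", octets[:2], octets[2:]
    elif first < 224:      # Class C: 3 network octets, 1 host octet
        return "C", octets[:3], octets[3:]
    return "other", octets, []   # multicast and reserved ranges
```

For instance, `classify_ip("140.78.3.8")` identifies a Class B address whose network part is the first two octets.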

INDEXCARD, 6/25
 
CIM

To perform manufacturing firm's functions related to design and production the CAD/CAM technology, for computer-aided design and computer-aided manufacturing, was developed. Today it is widely recognized that the scope of computer applications must extend beyond design and production to include the business functions of the firm. The name given to this more comprehensive use of computers is computer-integrated manufacturing (CIM).

INDEXCARD, 7/25
 
The European Convention on Human Rights and its Five Protocols

As can be read in its preamble, the European Convention on Human Rights, agreed by the member states of the Council of Europe, is intended as a follow-up to the Universal Declaration of Human Rights proclaimed by the General Assembly of the United Nations on 10 December 1948 and as an official act of "securing the universal and effective recognition and observance of the Rights therein declared." Because it is stated "that the aim of the Council of Europe is the achievement of greater unity between its Members and that one of the methods by which the aim is to be pursued is the maintenance and further realization of Human Rights and Fundamental Freedoms", the European Convention on Human Rights can be read as the political sibling to the biblical Ten Commandments, on which effective and legitimate European democratic governments are based. The European Convention on Human Rights is intended to represent the essence of the common heritage of European political traditions and ideals.

Signed in Rome on November 4, 1950, the Convention is supplemented by five protocols dated from March 20, 1952 (Paris), May 6, 1963, September 16, 1963, and January 20, 1966 (Strasbourg).

http://www.hri.org/docs/ECHR50.html

INDEXCARD, 8/25
 
Internet Research Task Force

Being itself under the umbrella of the Internet Society, the Internet Research Task Force is an umbrella organization of small research groups working on topics related to Internet protocols, applications, architecture and technology. It is governed by the Internet Research Steering Group.

http://www.irtf.org/
INDEXCARD, 9/25
 
Vandana Shiva

Vandana Shiva is the Director of the Research Foundation for Science, Technology and Ecology in New Delhi. She has been a tireless campaigner, and one of the most original, for ecological diversity and eco-feminism and against "official" development policies and commercial exploitation. Book publications include Ecofeminism (1993), Monocultures of the Mind (1993) and Biopiracy: The Plunder of Nature and Knowledge (1997).

INDEXCARD, 10/25
 
Robot

Robot relates to any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner. The term is derived from the Czech word robota, meaning "forced labor." Modern use of the term stems from the play R.U.R., written in 1920 by the Czech author Karel Capek, which depicts society as having become dependent on mechanical workers called robots that are capable of doing any kind of mental or physical work. Modern robot devices descend through two distinct lines of development: the early automata, essentially mechanical toys, and the successive innovations and refinements introduced in the development of industrial machinery.

INDEXCARD, 11/25
 
Chrysler Corporation

American automotive company first incorporated in 1925 and reorganized and newly incorporated in 1986. It has long been the third largest automaker in the United States (after General Motors and the Ford Motor Company). Founded by Walter P. Chrysler, it took over the business and properties of Maxwell Motor Company, Inc. (first formed in 1913). Today its major subsidiaries include Chrysler Automotive Operations, Inc., which manufactures Plymouth, Dodge, and Chrysler passenger cars, Dodge trucks, and auto parts and accessories; and the Chrysler Financial Corporation. Headquarters are in Highland Park, Mich., U.S.

INDEXCARD, 12/25
 
Internet Societal Task Force

The Internet Societal Task Force is an organization under the umbrella of the Internet Society dedicated to assure that the Internet is for everyone by identifying and characterizing social and economic issues associated with the growth and use of Internet. It supplements the technical tasks of the Internet Architecture Board, the Internet Engineering Steering Group and the Internet Engineering Task Force.

Topics under discussion are social, economic, regulatory, physical barriers to the use of the Net, privacy, interdependencies of Internet penetration rates and economic conditions, regulation and taxation.

http://www.istf.isoc.org/
INDEXCARD, 13/25
 
Internet Engineering Steering Group

On behalf of the Internet Society, the Internet Engineering Steering Group is responsible for the technical management of the evolution of the architecture, the standards and the protocols of the Net.

http://www.ietf.org/iesg.html
INDEXCARD, 14/25
 
MIRALab

MIRALab is a research laboratory attached to the University of Geneva. Its motto is "where research meets creativity". MIRALab's objective is to model human functionalities, such as movement or facial expression, in a realistic way.

INDEXCARD, 15/25
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor and the communications satellite.

http://www.research.att.com/

INDEXCARD, 16/25
 
Network Information Center (NIC)

Network information centers are organizations responsible for registering and maintaining the domain names on the World Wide Web. Until competition in domain name registration was introduced, they were the only ones responsible. Most countries have their own network information center.

INDEXCARD, 17/25
 
Censorship of Online Content in China

During the Tiananmen massacre, reports and photos transmitted by fax machines gave notice of what was happening with only a short delay. The Chinese government has learned its lesson well and "regulated" Internet access from the beginning. All Internet traffic into and out of China passes through a few gateways, a few entry points, thus making censorship a relatively easy task. Screened out are the web sites of organizations and media which express dissident viewpoints: Taiwan's Democratic Progress Party and Independence Party, The New York Times, CNN, and sites dealing with Tibetan independence and human rights issues.

Users are expected not to "harm" China's national interests and therefore have to apply for permission for Internet access; Web pages have to be approved before being published on the Net. For the development of measures to monitor and control Chinese content providers, China's state police has joined forces with MIT.

For further information on Internet censorship, see Human Rights Watch, World Report 1999.

http://www.dpp.org/
http://www.nytimes.com/
http://www.hrw.org/worldreport99/special/inte...
INDEXCARD, 18/25
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
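The mechanism can be sketched in a few lines: keep a local copy of each retrieved item so that repeated requests are served from the copy instead of the slower original source. This is a minimal illustration, not any particular cache implementation; `slow_fetch` is a hypothetical stand-in for an expensive retrieval.

```python
# A minimal caching sketch: a dictionary holds copies of already
# retrieved data, keyed by URL.
cache = {}

def slow_fetch(url):
    # Placeholder for an expensive retrieval (network, disk, ...).
    return "content of " + url

def cached_fetch(url):
    if url not in cache:              # cache miss: go to the source
        cache[url] = slow_fetch(url)  # store a copy closer by
    return cache[url]                 # cache hit: serve the copy

print(cached_fetch("http://example.org/"))  # first call fills the cache
print(cached_fetch("http://example.org/"))  # second call is served locally
```

Real caches add an eviction policy (discarding old entries when space runs out) and an expiry time, since the stored copy can become stale; the principle stays the same.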

INDEXCARD, 19/25
 
Calculator

Calculators are machines for automatically performing arithmetical operations and certain mathematical functions. Modern calculators are descendants of a digital arithmetic machine devised by Blaise Pascal in 1642. Later in the 17th century, Gottfried Wilhelm von Leibniz created a more advanced machine, and, especially in the late 19th century, inventors produced calculating machines that were progressively smaller and less laborious to use.

INDEXCARD, 20/25
 
Coca-Cola Company

American corporation founded in 1892 and today engaged primarily in the manufacture and sale of syrup and concentrate for Coca-Cola, a sweetened, carbonated beverage that is a cultural institution in the United States and a worldwide symbol of American tastes. The company also produces and sells other soft drinks and citrus beverages. Corporate headquarters are in Atlanta, Ga. The post-World War II years saw diversification in the packaging of Coca-Cola and also in the development or acquisition of new products. In 1946 the company purchased rights to the Fanta soft drink, previously developed in Germany. It introduced the lemon-lime drink Sprite in 1961 and the sugar-free cola Tab in 1963. By purchasing Minute Maid Corporation in 1960, it entered the citrus beverage market. In 1982 the company acquired a controlling interest in Columbia Pictures, a motion picture and entertainment company, but sold its interest to Sony Corporation in 1989.

INDEXCARD, 21/25
 
Moral rights

Authors of copyrighted works enjoy, besides economic rights, moral rights, on the basis of which they have the right to claim their authorship and to require that their names be indicated on copies of the work and in connection with other uses of it. Moral rights are generally inalienable and remain with the creator even after he has transferred his economic rights, although the author may waive their exercise.

INDEXCARD, 22/25
 
About Wines

http://www.aboutwine.com/

INDEXCARD, 23/25
 
Braille

Universally accepted system of writing used by and for blind persons and consisting of a code of 63 characters, each made up of one to six raised dots arranged in a six-position matrix or cell. These Braille characters are embossed in lines on paper and read by passing the fingers lightly over the manuscript. Louis Braille, who was blinded at the age of three, invented the system in 1824 while a student at the Institution Nationale des Jeunes Aveugles (National Institute for Blind Children), Paris.
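The arithmetic behind the 63-character code follows directly from the cell layout: each of the six dot positions is either raised or flat, giving 2^6 = 64 patterns, of which the all-flat cell is not a character. The sketch below renders a cell from a set of raised dots; the letter patterns shown (a = dot 1, b = dots 1-2, c = dots 1-4) are the standard Braille assignments, and the `cell` helper is illustrative only.

```python
# The six-dot Braille cell: dots 1-3 run down the left column,
# dots 4-6 down the right. A character is a set of raised dots.
letters = {"a": {1}, "b": {1, 2}, "c": {1, 4}}

def cell(dots):
    """Render a cell: 'o' for a raised dot, '.' for a flat position."""
    rows = []
    for left, right in ((1, 4), (2, 5), (3, 6)):
        rows.append(("o" if left in dots else ".") +
                    ("o" if right in dots else "."))
    return "\n".join(rows)

print(cell(letters["c"]))  # top row raised on both sides, rest flat
print(2**6 - 1)            # 63 possible non-empty cells
```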

INDEXCARD, 24/25
 
FEED

http://www.feed.com/

INDEXCARD, 25/25