In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the network measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. This was due to two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, members of different departments of AT&T Labs-Research, saying something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, the traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet is currently far smaller, in both capacity and traffic, than the switched voice network (with an effective bandwidth of 75 Gbps as of December 1997). The private line networks, by contrast, are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U. S. (with an effective bandwidth of about 330 Gbps as of December 1997), although they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, which consume more bandwidth; both may produce unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Beyond the small sample size, this method has at least one flaw: the ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
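The ISC's sample-and-project approach can be pictured with a minimal Python sketch. Everything here is an invented stand-in: the address list replaces real DNS survey data, and the reachability callable replaces a real PING probe.

```python
import random

def estimate_host_count(named_addresses, is_reachable, sample_rate=0.01, seed=42):
    """Estimate reachable hosts by probing a random sample and projecting.

    named_addresses: IP addresses that have a name assigned (toy data)
    is_reachable:    callable standing in for a PING probe
    """
    random.seed(seed)
    sample_size = max(1, int(len(named_addresses) * sample_rate))
    sample = random.sample(named_addresses, sample_size)
    responding = sum(1 for addr in sample if is_reachable(addr))
    # Project the sample's response ratio onto the full address list.
    return int(len(named_addresses) * responding / sample_size)

# Toy survey: 10,000 named addresses, of which 60% would answer a "ping".
addresses = [f"10.0.{i // 256}.{i % 256}" for i in range(10_000)]
reachable = set(addresses[:6_000])
print(estimate_host_count(addresses, lambda a: a in reachable))
```

Because only 1% of the addresses are probed, repeated runs with different seeds scatter around the true figure of 6,000, which is exactly the sampling error the survey accepts in exchange for speed.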

Internet Weather

Like daily weather, traffic on the Internet, i.e. the conditions for data flows, is monitored too, hence the name Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method used is to "ping" servers (as for host counts, e. g.) and to compare the response times to past ones and to the response times of servers in the same area.
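A toy version of such a rating might look like the following. The 0-100 scale mirrors the Internet Traffic Report, but the scoring formula and the round-trip times are invented for illustration; a real service would use actual ping probes.

```python
def traffic_index(current_ms, past_ms):
    """Score a server's responsiveness from 0 (bad) to 100 (fast, reliable)
    by comparing the current round-trip time to the server's own history."""
    baseline = sum(past_ms) / len(past_ms)
    # A probe at the historical baseline scores 50; faster probes score
    # higher, slower ones lower, clamped into the 0-100 range.
    score = 50.0 * baseline / current_ms
    return max(0.0, min(100.0, score))

history = [80.0, 90.0, 100.0]          # past round-trip times in ms
print(traffic_index(90.0, history))    # at the baseline: 50.0
print(traffic_index(45.0, history))    # twice as fast: 100.0
print(traffic_index(900.0, history))   # ten times slower: 5.0
```

Comparing each server against its own past, rather than against an absolute threshold, is what makes the metaphor of "weather" apt: the report shows deviations from normal conditions.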

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of most visited Web sites may be compiled. I say "may be", because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others, and a user might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
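The gap between hits and page views can be made concrete with a toy access log. The file names and the count-only-HTML heuristic are illustrative assumptions, not any provider's actual method.

```python
# Toy access log: every requested file counts as one "hit".
access_log = [
    "/article.html",
    "/img/logo.gif", "/img/photo1.gif", "/img/photo2.gif",
    "/article.html",
    "/img/logo.gif", "/img/photo1.gif", "/img/photo2.gif",
]

hits = len(access_log)
# Count only the documents themselves as page views (assumed heuristic:
# one HTML file per document, embedded graphics excluded).
page_views = sum(1 for path in access_log if path.endswith(".html"))

print(hits)        # 8  -- every embedded file inflates the "hits" figure
print(page_views)  # 2
```

Two visits to a single three-graphic document already produce eight hits, which is why adding graphics inflates a hit count without attracting a single extra reader.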

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may read a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best just slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.

If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 1/20 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Credibility

The magic word is credibility.

NATO took away part of its credibility by extending more and more the definition of military facilities which were to be destroyed in order to end Serb power.

Disinformation can mean leaving out important information. Telling lies is not the only method of disinformation. Not telling also creates thoughts and steers them in certain directions, whereas other ways of thinking are left out.

In this way, deaths on one's own side are adjusted downwards, whereas the victims of the enemy are counted proudly - as long as they are not civilians. The post-Gulf War period demonstrated how the population reacts if the number of innocent victims is much higher than expected. It was those numbers that provoked the biggest part of the post-war critique.

The media in democratic states tend to criticize this, which does not mean that they always want to be free of governmental influence. They can choose to help the government in a single case by not writing anything against it or by writing pro-government stories.

At the same time every democracy has undemocratic parts in it - which is already part of democracy itself. There are situations when a democratic government may find it essential to put pressure on the media to inform the population in a certain way; and censorship is not something that can only be connected with dictatorship; just think of the Falklands War, the Gulf War or the Kosovo War.

TEXTBLOCK 2/20 // URL: http://world-information.org/wio/infostructure/100437611661/100438658709
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking in size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with the advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were still costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information by using computers. Decisive for the success of computers in business were the stored program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, that gave them the flexibility to be cost effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 3/20 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Exchange of the Text

One of the easiest tools for disinformation is to exchange the words written below a photograph. The entire meaning of the picture can be varied like this:

- The visit of a school group to a former international camp can change into a camp where children are imprisoned (which happened in the Russian city of Petroskoy in 1944).

- Victims of war can change nationality. The picture of the brutal German soldier in World War II that was shown in many newspapers to demonstrate the so-called typical face of a murderer turned out, in other newspapers, to show a Frenchman and a victim.

- In 1976 a picture of children in a day-nursery in the GDR was taken: the children, coming out of the shower, were dressed in terry cloth suits with stripes. The same year the photograph with the happily laughing boys and girls won the contest "a beautiful picture". Two years later a small part of the photograph could be seen in a Christian magazine in West Germany, supposedly showing children from a concentration camp in the USSR. The smiling faces now seem to scream. (source: Stiftung Haus der Geschichte der Bundesrepublik Deutschland (ed.): Bilder, die lügen. Begleitbuch zur Ausstellung im Haus der Geschichte der Bundesrepublik Deutschland. Bonn 1998, p. 79)

TEXTBLOCK 4/20 // URL: http://world-information.org/wio/infostructure/100437611661/100438658773
 
Public Relations and Propaganda

Public relations usually is associated with the influencing of public opinion. Therefore it has subsequently been linked with propaganda. Using one of the many definitions of propaganda "... the manipulation of symbols as a means of influencing attitudes on controversial matters" (Harold D. Lasswell), the terms propaganda and PR seem to be easily interchangeable.

Still, many authors explicitly distinguish between public relations, advertising and propaganda. Unlike PR, which is often described as objective and extensive information of the public, advertising and propaganda are associated with manipulative activities. Nevertheless, treating public relations and propaganda as equivalents stands in the tradition of PR. Edward L. Bernays, one of the founders of public relations, wrote: "The only difference between propaganda and education, really, is the point of view. The advocacy of what we believe in is education. The advocacy of what we don't believe is propaganda."

Institutions like the German Bundeswehr also use the terms public relations and propaganda synonymously. Following 1990 legislation by the former minister of defense Stoltenberg, the "psychological influence of the enemy" was discontinued in peacetime and the Academy for Psychological Defense was renamed the Academy for Information and Communication, which is, among other things, responsible for scientific research in the field of public relations.

TEXTBLOCK 5/20 // URL: http://world-information.org/wio/infostructure/100437611652/100438658084
 
Intellectual Property and the "Information Society" Metaphor

Today the talk about the so-called "information society" is ubiquitous. By many it is considered the successor of the industrial society and said to represent a new form of societal and economical organization. This claim is based on the argument that the information society uses a new kind of resource, which differs fundamentally from that of its industrial counterpart. Whereas industrial societies focus on physical objects, the information society's raw material is said to be knowledge and information. Yet the conception of the capitalist system, which underlies industrial societies, also continues to exist in an information-based environment. Although there have been changes in the forms of manufacture, the relations of production remain organized on the same basis: the principle of property.

In the context of a capitalist system based on industrial production the term property predominantly relates to material goods. Still, even as the raw materials, resources and products change in an information society, the concept of property persists. It merely is extended and no longer solely considers physical objects as property, but also attempts to put information into a set of property relations. This new kind of knowledge-based property is widely referred to as "intellectual property". Although intellectual property in some ways represents a novel form of property, it has quickly been integrated into the traditional property framework. Whether material or immaterial products, within the capitalist system they are both treated the same - as property.

TEXTBLOCK 6/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659429
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, often through matrilineal descent.


TEXTBLOCK 7/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
Positions Towards the Future of Copyright in the "Digital Age"

With the development of new transmission, distribution and publishing technologies and the increasing digitalization of information copyright has become the subject of vigorous debate. Among the variety of attitudes towards the future of traditional copyright protection two main tendencies can be identified:

Eliminate Copyright

Anti-copyrightists believe that any intellectual property should be in the public domain and available for all to use. "Information wants to be free", and copyright restricts people's possibilities concerning the utilization of digital content. Enforced copyright will lead to a further digital divide, as copyright creates unjust monopolies in the basic commodity of the "information age". Also, the increased ease of copying effectively obviates copyright, which is a relic of the past and should be expunged.

Enlarge Copyright

Realizing the growing economic importance of intellectual property, especially the holders of copyright (in particular the big publishing, distribution and other core copyright industries) - and therefore recipients of the royalties - adhere to the idea of enlarging copyright. In their view the basic foundation of copyright - the response to the need to provide protection to authors so as to give them an incentive to invest the time and effort required to produce creative works - is also relevant in a digital environment.

TEXTBLOCK 8/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659711
 
The Copyright Industry

Copyright is not only about protecting the rights of creators, but has also become a major branch of industry with significant contributions to the global economy. According to the International Intellectual Property Alliance the U.S. copyright industry has grown almost three times as fast as the economy as a whole for the past 20 years. In 1997, the total copyright industries contributed an estimated US$ 529.3 billion to the U.S. economy with the core copyright industries accounting for US$ 348.4 billion. Between 1977 and 1997, the absolute growth rate of value added to the U.S. GDP by the core copyright industries was 241 %. Also the copyright industry's foreign sales in 1997 (US$ 66.85 billion for the core copyright industries) were larger than the U.S. Commerce Department International Trade Administration's estimates of the exports of almost all other leading industry sectors. They exceeded even the combined automobile and automobile parts industries, as well as the agricultural sector.

In an age where knowledge and information become more and more important and with the advancement of new technologies, transmission systems and distribution channels a further increase in the production of intellectual property is expected. Therefore as copyright establishes ownership in intellectual property it is increasingly seen as the key to wealth in the future.

TEXTBLOCK 9/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438658710
 
Linking and Framing: Cases

Mormon Church v. Sandra and Jerald Tanner

In a ruling of December 1999, a federal judge in Utah temporarily barred two critics of the Mormon Church from posting on their website the Internet addresses of other sites featuring pirated copies of a Mormon text. The Judge said that it was likely that Sandra and Jerald Tanner had engaged in contributory copyright infringement when they posted the addresses of three Web sites that they knew, or should have known, contained the copies.

Kaplan, Carl S.: Copyright Decision Threatens Freedom to Link. In: New York Times. December 10, 1999.

Universal Studios v. Movie-List

The website Movie-List, which features links to online, externally hosted movie trailers has been asked to completely refrain from linking to any of Universal Studio's servers containing the trailers as this would infringe copyright.

Cisneros, Oscar S.: Universal: Don't Link to Us. In: Wired. July 27, 1999.

More cases concerned with the issue of linking, framing and the infringement of intellectual property are published in:

Ross, Alexandra: Copyright Law and the Internet: Selected Statutes and Cases.

TEXTBLOCK 10/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659639
 
Identity vs. Identification

It has become a commonplace observation that the history of modernity has been accompanied by what one might call a general weakening of identity, both as a theoretical concept and as a social and cultural reality. This blurring of identity has come to full fruition in the 20th century. As a theoretical concept, identity has lost its metaphysical foundation of "full correspondence" following the destruction of metaphysics by thinkers such as Nietzsche, Heidegger, Wittgenstein or Davidson. Nietzsche's "dead god", his often-quoted metaphor for the demise of metaphysics, has left western cultures not only with the problem of having to learn how to think without permanent foundations; it has left them with both the liberty of constructing identities, and the structural obligation to do so. The dilemmas arising out of this ambivalent situation have given rise to the comment that "god is dead, and man is not doing so well himself". The new promise of freedom is accompanied by the threat of enslavement. Modern, technologically saturated cultures survive and propagate and emancipate themselves by acting as the gatekeepers of their own technological prisons.

On the social and cultural levels, traditional clear-cut identities have become weakened as traditional cultural belonging has been undermined or supplanted by modern socio-technological structures. The question as to "who one is" has become increasingly difficult to answer: hybrid identities are spreading, identities are multiple, temporary, fleeting rather than reflecting an inherited sense of belonging. The war cry of modern culture industry "be yourself" demands the impossible and offers a myriad of tools all outcompeting each other in their promise to fulfil the impossible.

For many, identity has become a matter of choice rather than of cultural or biological heritage, although being able to choose may not have been the result of a choice. A large superstructure of purchasable identification objects caters to an audience finding itself propelled into an ever accelerating and vertiginous spiral of identification and estrangement. In the supermarket of identities, what is useful and cool today is the waste of tomorrow. What is offered as the latest advance in helping you to "be yourself" is as ephemeral as your identification with it; it is trash in embryonic form.

Identity has become both problematic and trivial, causing modern subjects a sense of thrownness and uprootedness as well as granting them the opportunity of overcoming established authoritarian structures. In modern, technologically saturated societies, the general weakening of identities is a prerequisite for emancipation. The return to "strong" clear-cut "real" identities is the way of new fundamentalism demanding a rehabilitation of "traditional values" and protected zones for metaphysical thought, both of which are to be had only at the price of suppression and violence.

It has become difficult to know "who one is", but this difficulty is not merely a private problem. It is also a problem for the exercise of power, for the state and other power institutions also need to know "who you are". With the spread of weak identities, power is exercised in a different manner. Power cannot be exercised without being clear who it addresses; note the dual significance of "subject". A weakened, hybrid undefined subject (in the philosophical sense) cannot be a "good" subject (in the political sense), it is not easy to sub-ject. Without identification, power cannot be exercised. And while identification is itself not a sufficient precondition for authoritarianism, it is certainly a necessary one.

Identities are therefore reconstructed using technologies of identification in order to keep the weakened and hence evasive subjects "sub-jected". States have traditionally employed bureaucratic identification techniques and sanctioned those who try to evade the grip of administration. Carrying several passports has been the privilege of spies and of dubious outlaws, and not possessing an "ID" at all is the fate of millions of refugees fleeing violence or economic destitution. Lack of identification is structurally sanctioned by placelessness.

The technicized acceleration of societies and the weakening of identities make identification a complicated matter. On the one hand, bureaucratic identification techniques can be technologically bypassed: passports and signatures can be forged; data can be manipulated and played with. On the other hand, traditional bureaucratic methods are slow. The requirements resulting from these constraints are met by biometric technology.

TEXTBLOCK 11/20 // URL: http://world-information.org/wio/infostructure/100437611729/100438658075
 
Global hubs of the data body industry

While most data bunkers are restricted to particular areas or contexts, there are others which act as global data nodes. Companies such as EDS (Electronic Data Systems), Experian, First Data Corporation and Equifax operate globally and run giant databases containing personal information. They are the global hubs of the data body economy.

Company                  Sales in USD billions    Size of client database in million datasets
Equifax                  1.7                      360
Experian                 1.5                      779
First Data Corporation   5.5                      260
EDS                      18.5                     (not disclosed)

(Sales and database sizes, 1998)

The size of these data repositories is constantly growing, so it is only a matter of time until everybody living in the technologically saturated part of the world is registered in one of these data bunkers.

Among these companies, EDS, founded by the former US presidential candidate Ross Perot, known for his right-wing views and direct language, is of particular importance. Not only is it the world's largest data body company, it is also secretive about the size of its client database - a figure disclosed by the other companies either in company publications or upon enquiry. After all, the size of such a database makes a company more attractive to potential customers.

For many years, EDS has been surrounded by rumours concerning sinister involvement with intelligence agencies. Beyond the rumours, though, there are also facts. EDS has a special division for government services. EDS does business with all military agencies of the US, as well as law enforcement agencies, justice agencies, and many others. The company also maintains a separate division for military equipment. In 1984, the company became a subsidiary of General Motors, itself a leading manufacturer of military and intelligence systems. EDS is listed by the Federation of American Scientists' intelligence resource program as a contractor to US intelligence agencies, and prides itself, amongst other things, on responding to the "rise of the citizen as a consumer".

TEXTBLOCK 12/20 // URL: http://world-information.org/wio/infostructure/100437611761/100438659778
 
Transparent customers. Direct marketing online



This process works even better on the Internet because of the latter's interactive nature. "The Internet is a dream to direct marketers", said Wil Lansing, CEO of the American retailer Fingerhut Companies. Many services require users to register online, providing as much information about themselves as possible. In addition, the Internet is fast, cheap and used by people who tend to be young and in search of something interesting.

Many web sites are also equipped with user tracking technology that registers a user's behaviour and preferences during a visit. For example, user tracking technology is capable of identifying the equipment and software employed by a user, as well as movements on the website, links visited etc. Normally such information is anonymous, but it can be personalised when it is coupled with online registration, or when personal identification has been obtained from other sources. Registration is often a prerequisite not just for obtaining a free web mail account, but also for other services, such as personalised start pages. Based on the information provided by the user, the start page will then include advertisements and commercial offers that correspond to the user's profile, or to the user's activity on the website.

One frequent way of obtaining such personal information from a user is by offering free web mail accounts, as a great many companies, internet providers and web portals do (e.g. Microsoft, Yahoo, Netscape and many others). In most cases, users get "free" accounts in return for submitting personal information and agreeing to receive marketing mails. Free web mail accounts are a simple and effective direct marketing and data capturing strategy which is, however, rarely understood as such. The alliances formed between direct advertising and marketing agencies on the one hand and web mail providers on the other, such as the one between DoubleClick and Yahoo, show the common logic of data capturing and direct marketing. The alliance between DoubleClick and Yahoo eventually attracted the US's largest direct marketing agency, Abacus Direct, which ended up merging with DoubleClick.

However, the intention of collecting users' personal data and creating consumer profiles based on online behaviour can also take on more creative and playful forms. One such example is sixdegrees.com, a networking site based on the assumption that everybody on the planet is connected to everybody else by a chain of six people at most. The site offers users the chance to get to know a lot of new people, the friends of their friends of their friends, for example, and, if they try hard enough, eventually Warren Beatty or Claudia Schiffer. But of course, in order to make the whole game more useful for marketing purposes, users are encouraged to join groups which share common interests, and these groups are identical with marketing categories ranging from arts and entertainment to travel and holiday. Evidently, the game becomes more interesting the more new people a user brings into the network. What seems to be fun for the 18 to 24 year old college student segment targeted by sixdegrees is, of course, real business. While users entertain themselves, they are being carefully profiled. After all, data on young people who can be expected to be relatively affluent one day are worth more than money.

The particular way in which sites such as sixdegrees.com and others are structured means not only that users provide initial information about themselves, but also that this information is constantly updated and therefore becomes even more valuable. Consequently, many free online services or web mail providers cancel a user's account if it has not been used for some time.

There are also other online services which offer free services in return for personal information that is then used for marketing purposes, e.g. Yahoo's Geocities, where users may maintain their own free websites, or Bigfoot, where people are offered a free e-mail address for life that acts as a relay whenever a customer's residence or e-mail address changes. In this way, of course, the marketers can identify friendship and other social networks, and turn this knowledge into a marketing advantage. People finders such as WhoWhere? operate along similar lines.

A further way of collecting consumer data that has recently become popular is by offering free PCs. Users are provided with a PC for free or for very little money, and in return commit themselves to using certain services rather than others (e.g. a particular internet provider), to providing information about themselves, and to having their online behaviour monitored by the company providing the PC, so that accurate user profiles can be compiled. For example, the Free PC Network offers advertisers user profiles containing "over 60 individual demographics". There are literally thousands of variations of how a user's data are extracted and commercialised when online. Usually this happens quietly in the background.

A good inside view of the world of direct marketing can be gained at the website of the American Direct Marketing Association and the Federation of European Direct Marketing.

TEXTBLOCK 13/20 // URL: http://world-information.org/wio/infostructure/100437611761/100438659667
 
History: European Tradition

Only in Roman times did the first rights referring to artistic works appear. Regulations resembling a lasting exclusive right to copy did not occur until the 17th century. Before that, copyright was a private arrangement between guilds able to reproduce copies in commercial quantities.

In France and other Western European countries, "droits d'auteur", or author's rights, form the core of what in the Anglo-American tradition is called copyright. Such rights are rooted in the republican revolution of the late 18th century and the Rights of Man movement. In today's European system the creator is front and center; later exploiters are only secondary players.

France

During the 18th century France gradually lost the ability to restrict intellectual property. Before the Revolution, all books, printers and booksellers had to have a royal stamp of approval, called a "privilege". In return for their lucrative monopoly, the French guild of printers and booksellers helped the police to suppress anything that upset royal sensibilities or ran contrary to their interests. Still there were also a whole lot of underground printers who flooded the country with pirated, pornographic and seditious literature. And thousands of writers, most at the edge of starvation.

In 1777 the King threatened the monopoly by reducing the duration of publishers' privileges to the lifetime of the authors. Accordingly, a writer's work would go into the public domain after his death and could be printed by anyone. The booksellers fought back by arguing that no authority could take their property from them and give it to someone else. Seven months later, in August 1789, the revolutionary government ended the privilege system, and from that time on anyone could print anything. Early in 1790 Marie-Jean-Antoine-Nicolas de Caritat, Marquis de Condorcet, proposed giving authors power over their own work lasting until ten years after their deaths. The proposal - the basis for France's first modern copyright law - passed in 1793.

TEXTBLOCK 14/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659414
 
Association for Progressive Communication (APC)

The APC is a global federation of 24 non-profit Internet providers serving over 50,000 NGOs in 133 countries. Since 1990, APC has been supporting people and organizations worldwide, working together online for social, environmental and economic justice. The APC's network of members and partners spans the globe, with significant presence in Central and Eastern Europe, Africa, Asia and Latin America.

History

Between 1982 and 1987 several independent, national, non-profit computer networks emerged as viable information and communication resources for activists and NGOs. The networks were founded to make new communication techniques available to movements working for social change.

In 1987, people at GreenNet in England began collaborating with their counterparts at the Institute for Global Communications (IGC) in the United States. These two networks started sharing electronic conference material and demonstrated that transnational electronic communications could serve international as well as domestic communities working for peace, human rights and the environment.

This innovation proved so successful that by late 1989, networks in Sweden, Canada, Brazil, Nicaragua and Australia were exchanging information with each other and with IGC and GreenNet. In the spring of 1990, these seven organizations founded the Association for Progressive Communications to co-ordinate the operation and development of this emerging global network of networks.

Strategies and Policies

The APC defends and promotes non-commercial, productive online space for NGOs and collaborates with like-minded organizations to ensure that the information and communication needs of civil society are considered in telecommunications, donor and investment policy. The APC is committed to freedom of expression and exchange of information on the Internet.

The APC helps to build capacity between existing and emerging communication service providers.

The APC Women's Networking Support Program promotes gender-aware Internet design, implementation and use.

Through its African members, the APC is trying to strengthen indigenous information sharing and independent networking capacity on the continent.

Members of APC develop Internet products, resources and tools to meet the advocacy, collaboration and information publishing and management needs of civil society. Recent APC initiatives have included the APC Toolkit Project: Online Publishing and Collaboration for Activists and the Mission-Driven Business Planning Toolkit.

The APC also runs special projects like the Beijing+5, which shall enable non-governmental organizations to actively participate in the review of the Beijing Platform for Action.

TEXTBLOCK 15/20 // URL: http://world-information.org/wio/infostructure/100437611734/100438659269
 
1900 - 2000 A.D.

1904
First broadcast talk

1918
Invention of the short-wave radio

1929
Invention of television in Germany and Russia

1941
Invention of microwave transmission

1946
Long-distance coaxial cable systems and mobile telephone services are introduced in the USA.

1957
Sputnik, the first satellite, is launched by the USSR
First data transmissions over regular phone circuits.

At the beginning of the story of today's global data networks is the story of the development of satellite communication.

In 1955 President Eisenhower announced the USA's intention to launch a satellite. But in the end it was the Soviet Union which launched the first satellite, Sputnik I, in 1957. After Sputnik's launch it became evident that the Cold War was also a race for leadership in the application of state-of-the-art technology to defense. As the US Department of Defense encouraged the formation of high-tech companies, it laid the ground for Silicon Valley, the hot spot of the world's computer industry.

In the same year as the USA launched its first satellite, Explorer I, data was transmitted over regular phone circuits for the first time, thus laying the ground for today's global data networks.

Today's satellites may record weather data, scan the planet with powerful cameras, offer global positioning and monitoring services, and relay high-speed data transmissions. Yet up to now, most satellites are designed for military purposes such as reconnaissance.

1969
ARPAnet online

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network, it mainly served the purpose of testing the feasibility of wide area networks and the possibility of remote computing. It was created for resource sharing between research institutions, not for messaging services like e-mail. Although the US military sponsored its research, ARPAnet was not designed for directly military use but to support military-related research.

In 1969 ARPANET went online and linked the first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute.

Yet ARPAnet did not become widely accepted before it was demonstrated in action to a public of computer experts at the First International Conference on Computers and Communication in Washington, D. C. in 1972.

In 1986 NSFnet, a network of scientific and academic computers funded by the National Science Foundation, went online together with a separate new military network; ARPAnet itself was decommissioned in 1990. In 1988 the first private Internet service providers started offering access to NSFnet to the general public. After having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers in 1995. This, and the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA it was already in 1994 that commercial users outnumbered military and academic users.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

1971
Invention of E-Mail

1979
Introduction of fiber-optic cable systems

1992
Launch of the World Wide Web

TEXTBLOCK 16/20 // URL: http://world-information.org/wio/infostructure/100437611796/100438659828
 
Basics: Protected Persons

Generally copyright vests in the author of the work. Certain national laws provide for exceptions and, for example, regard the employer as the original owner of a copyright if the author was an employee when the work was created and was employed for the purpose of creating that work. In the case of some types of creations, particularly audiovisual works, several national laws provide for different solutions to the question of who should be the first holder of copyright in such works.

Many countries allow copyright to be assigned, which means that the owner of the copyright transfers it to another person or entity, which then becomes its holder. When the national law does not permit assignment, it usually provides for the possibility of licensing the work to someone else. The owner of the copyright then remains the holder, but authorizes another person or entity to exercise all or some of his rights, subject to possible limitations. Yet in any case the "moral rights" always belong to the author of the work, whoever the owner of the copyright (and therefore of the "economic rights") may be.


TEXTBLOCK 17/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659527
 
Problems of Copyright Management and Control Technologies

Profiling and Data Mining

At their most basic, copyright management and control technologies might simply be used to provide pricing information, negotiate the purchase transaction, and release a copy of a work for downloading to the customer's computer. Still, from a technological point of view, such systems also have the capacity to be employed for digital monitoring. Copyright owners could, for example, use the transaction records generated by their copyright management systems to learn more about their customers. Profiles of the purchasers of copyrighted material, in their crudest form consisting of basic demographic information, might be created. Moreover, copyright owners could use search agents or complex data mining techniques to gather more information about their customers that could either be used to market other works or be sold to third parties.

Fair Use

Through the widespread use of copyright management and control systems, the balance of control could be shifted excessively in favor of the owners of intellectual property. The practice of fair use, currently supported by copyright law, might potentially be restricted or even eliminated. While information in analogue form can easily be reproduced, the protection of digital works through copyright management systems might complicate, or make impossible, the copying of material for purposes which are explicitly exempt under the doctrine of fair use.

Provisions concerning technological protection measures and fair use are stated in the DMCA, which provides that "Since copying of a work may be a fair use under appropriate circumstances, section 1201 does not prohibit the act of circumventing a technological measure that prevents copying. By contrast, since the fair use doctrine is not a defense to the act of gaining unauthorized access to a work, the act of circumventing a technological measure in order to gain access is prohibited." The proposed EU Directive on copyright and related rights in the information society also contains similar clauses. It distinguishes between the circumvention of technical protection systems for lawful purposes (fair use) and circumvention to infringe copyright. Yet besides a still existing lack of legal clarity, very practical problems also arise. Even if the circumvention of technological protection measures under fair use is allowed, how will an average user without specialized technological know-how be able to gain access to or make a copy of a work? Will the producers of copyright management and control systems provide fair use versions that permit the reproduction of copyrighted material? Or will users only be able to access and copy works if they hold a digital "fair use license" ("fair use licenses" have been proposed by Mark Stefik, whereby holders of such licenses could exercise some limited "permissions" to use a digital work without a fee)?

TEXTBLOCK 18/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659629
 
Recent "Digital Copyright" Legislation: U.S.

DMCA (Digital Millennium Copyright Act)

The debates in the House and Senate preceding the signing into law of the DMCA by U.S. President Clinton in October 1998 indicated that the principal object of the Act is to promote the U.S. economy by establishing an efficient Internet marketplace in copyrighted works. The DMCA implements the two 1996 WIPO treaties (WIPO Performances and Phonograms Treaty and WIPO Copyright Treaty) and addresses a variety of issues that arose with the increased availability of content in digital form. The Act 1) creates a series of "safe harbor" defenses (which are subject to a variety of conditions that must be met) for certain common activities of ISPs (Internet Service Provider), 2) bars the circumvention of technological protection measures that protect copyrighted works, 3) prohibits the distribution or provision of false copyright management information with the intent to induce or conceal infringement, 4) establishes an exemption for making a copy of a computer program for purposes of maintenance or repair, and 5) contains provisions concerning the "webcasting" of sound recordings on the Internet and the making of (digital) copies of copyrighted works by nonprofit libraries and archives.

A full-text version of the DMCA is available from:
The Library of Congress: Thomas (Legislative Information on the Internet): http://thomas.loc.gov/cgi-bin/cpquery/z?cp105:hr796:

Moreover the U.S. Copyright Office provides a memorandum, which briefly summarizes each of the five titles of the DMCA (pdf format): http://lcweb.loc.gov/copyright/legislation/dmca.pdf

The DMCA has been criticized for not clarifying the range of legal principles on the liability of ISPs and for creating exceptions to only some of the provisions, thereby giving copyright owners even more rights.

Among the variety of comments on the DMCA are:

Lutzker, Arnold P.: Primer on the Digital Millennium: What the Digital Millennium Copyright Act and the Copyright Term Extension Act Mean for the Library Community. http://www.arl.org/info/frn/copy/primer.html

Lutzker & Lutzker law firm and the Association of Research Libraries: The Digital Millennium Copyright Act: Highlights of New Copyright Provision Establishing Limitation of Liability for Online Service Providers. http://www.arl.org/info/frn/copy/osp.html

TEXTBLOCK 19/20 // URL: http://world-information.org/wio/infostructure/100437611725/100438659614
 
"Project Censored"

Project Censored was launched at Sonoma State University (U.S.) in 1976 as an annual review of the systematic withholding of public access to important news facts by the mainstream media. The team, composed of student media researchers and media analysts, annually selects and publishes what they believe are the 25 most important under-covered news stories. "The essential issue raised by the project is the failure of the mass media to provide the people with all the information they need to make informed decisions concerning their own lives and in the voting booth". (Project Censored)

TEXTBLOCK 20/20 // URL: http://world-information.org/wio/infostructure/100437611795/100438658515
 
Internet Societal Task Force

The Internet Societal Task Force is an organization under the umbrella of the Internet Society dedicated to assuring that the Internet is for everyone, by identifying and characterizing social and economic issues associated with the growth and use of the Internet. It supplements the technical tasks of the Internet Architecture Board, the Internet Engineering Steering Group and the Internet Engineering Task Force.

Topics under discussion include social, economic, regulatory and physical barriers to the use of the Net, privacy, interdependencies of Internet penetration rates and economic conditions, regulation and taxation.

http://www.istf.isoc.org/

http://www.istf.isoc.org/
INDEXCARD, 1/37
 
Framing

Framing is the practice of creating a frame or window within a web page where the content of a different web page can be displayed. Usually, when a link is clicked, the new web page is presented surrounded by the remainder of the originating page.

INDEXCARD, 2/37
 
Digital Subscriber Line (DSL)

DSL connections are high-speed data connections over copper wire telephone lines. As with cable connections, with DSL you can look up information on the Internet and make a phone call at the same time but you do not need to have a new or additional cable or line installed. One of the most prominent DSL services is ISDN (integrated services digital network, for more information click here ( http://www.britannica.com/bcom/eb/article/4/0,5716,129614+15,00.html )).

http://www.britannica.com/bcom/eb/article/4/0...
INDEXCARD, 3/37
 
Cyborg

The word "cyborg" is short for "cybernetic organism", i.e. an entity which is partly biological and partly technical. As the technical seizure of nature progresses, cyborgs are proliferating and pose novel theoretical and social questions. The incorporation of technical components into human bodies is not new, but the bio-chips and nano-computers made possible by advances in information technology give a new quality to the development. Because the technization of the body has its origin in military history, cyborg studies have been connected to a critique of militarism, as in Chris Hables Gray, and to feminist critiques of society, as in Donna Haraway.

INDEXCARD, 4/37
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.

In digital systems, bandwidth is data speed in bits per second (bps).
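The proportionality between bandwidth and data per unit time can be illustrated with a small calculation. The file sizes and the 56 kbps modem speed below are illustrative assumptions, not figures from this card:

```python
# Transfer time over a link: time = (size in bits) / (bandwidth in bits per second).

def transfer_time_seconds(size_bytes: int, bandwidth_bps: int) -> float:
    """Seconds needed to move size_bytes over a link of bandwidth_bps."""
    return (size_bytes * 8) / bandwidth_bps

MODEM_BPS = 56_000  # an illustrative 56 kbps dial-up modem

# A page of text (~5 KB) versus a photograph (~500 KB), both assumed sizes:
print(transfer_time_seconds(5_000, MODEM_BPS))    # ~0.71 s
print(transfer_time_seconds(500_000, MODEM_BPS))  # ~71.4 s
```

On the same link, the photograph takes a hundred times longer than the page of text; to download it in one second instead, a link a hundred times wider would be needed.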

Source: Whatis.com

INDEXCARD, 5/37
 
Electronic Data Interchange (EDI)

EDI is an international standard relating to the exchange of trade goods and services. It enables trading partners to conduct routine business transactions, such as purchase orders, invoices and shipping notices independent of the computer platform used by the trading partners. Standardization by EDI translation software assures the correct interpretation of data.

EDI might become increasingly important to electronic commerce.

INDEXCARD, 6/37
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 7/37
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor and the communications satellite.

http://www.research.att.com/

INDEXCARD, 8/37
 
MIT

The MIT (Massachusetts Institute of Technology) is a privately controlled coeducational institution of higher learning famous for its scientific and technological training and research. It was chartered by the state of Massachusetts in 1861 and became a land-grant college in 1863. During the 1930s and 1940s the institute evolved from a well-regarded technical school into an internationally known center for scientific and technical research. In the days of the Great Depression, its faculty established prominent research centers in a number of fields, most notably analog computing (led by Vannevar Bush) and aeronautics (led by Charles Stark Draper). During World War II, MIT administered the Radiation Laboratory, which became the nation's leading center for radar research and development, as well as other military laboratories. After the war, MIT continued to maintain strong ties with military and corporate patrons, who supported basic and applied research in the physical sciences, computing, aerospace, and engineering. MIT has numerous research centers and laboratories. Among its facilities are a nuclear reactor, a computation center, geophysical and astrophysical observatories, a linear accelerator, a space research center, supersonic wind tunnels, an artificial intelligence laboratory, a center for cognitive science, and an international studies center. MIT's library system is extensive and includes a number of specialized libraries; there are also several museums.

INDEXCARD, 9/37
 
WIPO

The World Intellectual Property Organization is one of the specialized agencies of the United Nations (UN), designed to promote the worldwide protection of both industrial property (inventions, trademarks, and designs) and copyrighted materials (literary, musical, photographic, and other artistic works). It was established by a convention signed in Stockholm in 1967 and came into force in 1970. The aims of WIPO are threefold. Firstly, through international cooperation, WIPO promotes the protection of intellectual property. Secondly, the organization supervises administrative cooperation between the Paris, Berne, and other intellectual property unions regarding agreements on trademarks, patents, and the protection of artistic and literary works. Thirdly, through its registration activities, WIPO provides direct services to applicants for, or owners of, industrial property rights.

INDEXCARD, 10/37
 
Barnes and Noble

Massive online retail bookstore housing more than a million titles. Includes a book recommendation "personalizer", a comprehensive list of The New York Times bestsellers, a "live" community events calendar with a daily survey and several forums, "highlighted" books from 19 subject areas, browsable categories such as antiques, ethnic studies, and pop culture, Books in the News, and weekly features such as reviews, excerpts, recommendations, interviews, events, "roundups" of popular titles, and quizzes.

INDEXCARD, 11/37
 
Neighboring rights

Copyright laws generally provide for three kinds of neighboring rights: 1) the rights of performing artists in their performances, 2) the rights of producers of phonograms in their phonograms, and 3) the rights of broadcasting organizations in their radio and television programs. Neighboring rights attempt to protect those who assist intellectual creators to communicate their message and to disseminate their works to the public at large.

INDEXCARD, 12/37
 
Fair use

Certain acts normally restricted by copyright may, in circumstances specified in the law, be done without the authorization of the copyright owner. Fair use may therefore be described as the privilege to use copyrighted material in a reasonable manner without the owner's consent and allows the reproduction and use of a work for limited purposes such as criticism, comment, news reporting, teaching, and research. To determine whether a use is fair or not most copyright laws consider: 1) purpose and character of the use, 2) nature of the copyrighted work, 3) amount and substantiality of the portion used, and 4) effect of the use on the potential market. Examples of activities that may be excused as fair use include: providing a quotation in a book review; distributing copies of a section of an article in class for educational purposes; and imitating a work for the purpose of parody or social commentary.

INDEXCARD, 13/37
 
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet, but a subset of it. It is constituted by documents that are linked together in such a way that you can switch from one document to another simply by clicking on the link connecting them. This is made possible by the Hypertext Mark-up Language (HTML), the authoring language used in creating World Wide Web-based documents. These so-called hypertexts can combine text documents, graphics, videos, sounds, and Java applets, thus making multimedia content possible.

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index the stored information. (For regularly updated lists of the 100 most popular words that people are entering into search engines, click here). No search engine can retrieve all information on the whole World Wide Web; every search engine covers just a small part of it.

Among other things, that is the reason why the World Wide Web is not simply a very huge database, as is sometimes said: it lacks consistency. It is true that there is virtually infinite storage capacity on the Internet, a capacity which might prove almost everlasting as well, a prospect which is consoling, but threatening too.

According to the Internet domain survey of the Internet Software Consortium, the number of Internet host computers is growing rapidly. In October 1969 the first two computers were connected; this number grew to 376,000 in January 1991 and to 72,398,092 in January 2000.
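Taking the two figures above, a short calculation shows what such growth means as an average annual rate:

```python
# Compound annual growth of Internet host counts, January 1991 - January 2000
hosts_1991 = 376_000
hosts_2000 = 72_398_092
years = 9

rate = (hosts_2000 / hosts_1991) ** (1 / years)
print(f"average growth: about {(rate - 1) * 100:.0f}% per year")  # roughly 79%
```

Nearly 80% growth per year sustained over a decade is exponential growth in the strict sense.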

World Wide Web History Project, http://www.webhistory.org/home.html

http://www.searchwords.com/
http://www.islandnet.com/deathnet/
http://www.salonmagazine.com/21st/feature/199...
INDEXCARD, 14/37
 
Java Applets

Java applets are small programs that can be sent along with a Web page to a user. Java applets can perform interactive animations, immediate calculations, or other simple tasks without having to send a user request back to the server. They are written in Java, a platform-independent computer language, which was invented by Sun Microsystems, Inc.

Source: Whatis.com

INDEXCARD, 15/37
 
Mark

A mark (trademark or service mark) is "... a sign, or a combination of signs, capable of distinguishing the goods or services of one undertaking from those of other undertakings. The sign may particularly consist of one or more distinctive words, letters, numbers, drawings or pictures, emblems, colors or combinations of colors, or may be three-dimensional..." (WIPO) To be protected, a mark must be registered in a government office; the protection is generally limited in time but can be renewed periodically (usually every 10 years).

INDEXCARD, 16/37
 
RIPE

The RIPE Network Coordination Centre (RIPE NCC) is one of the three Regional Internet Registries (RIR) which exist in the world today, providing allocation and registration services which support the operation of the Internet globally, mainly the allocation of IP address space for Europe.

http://www.ripe.net

INDEXCARD, 17/37
 
Internet Research Task Force

Itself under the umbrella of the Internet Society, the Internet Research Task Force is in turn an umbrella organization of small research groups working on topics related to Internet protocols, applications, architecture and technology. It is governed by the Internet Research Steering Group.

http://www.irtf.org

INDEXCARD, 18/37
 
Intranet

An Intranet is a secured local area network (LAN) of computers based on the IP protocol, with restricted access.

INDEXCARD, 19/37
 
Total copyright industries

The total copyright industries encompass the "core copyright industries" and portions of many other industries that either create, distribute, or depend upon copyrighted works. Examples include retail trade (a portion of which is sales of video, audio, software, and books, for example), the doll and toy industry, and computer manufacturing.


INDEXCARD, 20/37
 
AT&T

AT&T Corporation provides voice, data and video communications services to large and small businesses, consumers and government entities. AT&T and its subsidiaries furnish domestic and international long distance, regional, local and wireless communications services, cable television and Internet communications services. AT&T also provides billing, directory and calling card services to support its communications business. AT&T's primary lines of business are business services, consumer services, broadband services and wireless services. In addition, AT&T's other lines of business include network management and professional services through AT&T Solutions and international operations and ventures. In June 2000, AT&T completed the acquisition of MediaOne Group. With the addition of MediaOne's 5 million cable subscribers, AT&T becomes the country's largest cable operator, with about 16 million customers on the systems it owns and operates, which pass nearly 28 million American homes. (source: Yahoo)

Slogan: "It's all within your reach"

Business indicators:

Sales 1999: $62.391 bn (+17.2% from 1998)

Market capitalization: $ 104 bn

Employees: 107,800

Corporate website: http://www.att.com/
INDEXCARD, 21/37
 
Invention

According to the WIPO an invention is a "... novel idea which permits in practice the solution of a specific problem in the field of technology." Concerning its protection by law the idea "... must be new in the sense that it has not already been published or publicly used; it must be non-obvious in the sense that it would not have occurred to any specialist in the particular industrial field, had such a specialist been asked to find a solution to the particular problem; and it must be capable of industrial application in the sense that it can be industrially manufactured or used." Protection can be obtained through a patent (granted by a government office) and typically is limited to 20 years.

INDEXCARD, 22/37
 
1996 WIPO Copyright Treaty (WCT)

The 1996 WIPO Copyright Treaty, which focused on taking steps to protect copyright "in the digital age", among other provisions 1) makes clear that computer programs are protected as literary works, 2) requires the contracting parties to protect databases that constitute intellectual creations, 3) affords authors the new right of making their works "available to the public", 4) gives authors the exclusive right to authorize "any communication to the public of their works, by wire or wireless means ... in such a way that members of the public may access these works from a place and at a time individually chosen by them", and 5) requires the contracting states to protect anti-copying technology and copyright management information embedded in any work covered by the treaty. The WCT is available at: http://www.wipo.int/documents/en/diplconf/distrib/94dc.htm



INDEXCARD, 23/37
 
Critical Art Ensemble

Critical Art Ensemble is a collective of five artists of various specializations dedicated to exploring the intersections between art, technology, radical politics, and critical theory. CAE have published a number of books and carried out innovative art projects containing insightful and ironic theoretical contributions to media art. Projects include Addictionmania, Useless Technology, The Therapeutic State, Diseases of Consciousness, Machineworld, As Above So Below, and Flesh Machine.

http://www.critical-art.net

INDEXCARD, 24/37
 
National Laboratory for Applied Network Research

NLANR, initially a collaboration among five supercomputer sites supported by the National Science Foundation, was created in 1995 to provide technical and engineering support and overall coordination of the high-speed connections at these centers.

Today NLANR offers support and services to institutions that are qualified to use high performance network service providers - such as Internet 2 and Next Generation Internet.

http://www.nlanr.net

INDEXCARD, 25/37
 
Backbone Networks

Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmitting capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.

INDEXCARD, 26/37
 
Cryptology

Also called "the study of codes", cryptology includes both cryptography and cryptanalysis.
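The distinction between the two branches can be illustrated with a toy example: cryptography turns a message into a secret, while cryptanalysis tries to recover it without knowing the key. A Caesar shift, a cipher already broken in antiquity, serves as a minimal sketch:

```python
def caesar_encrypt(text, key):
    """Cryptography: shift each lowercase letter forward by `key` positions."""
    return "".join(
        chr((ord(c) - ord("a") + key) % 26 + ord("a")) if c.islower() else c
        for c in text
    )

def caesar_break(ciphertext, known_word):
    """Cryptanalysis: brute-force all 26 keys, looking for a known word."""
    for key in range(26):
        candidate = caesar_encrypt(ciphertext, -key)  # shifting back decrypts
        if known_word in candidate:
            return key, candidate
    return None

secret = caesar_encrypt("attack at dawn", 3)   # "dwwdfn dw gdzq"
print(caesar_break(secret, "attack"))          # (3, "attack at dawn")
```

With only 26 possible keys, exhaustive search breaks the cipher instantly; modern cryptanalysis confronts key spaces vastly larger than that.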

INDEXCARD, 27/37
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. The expected result is a vast collection of historic video, audio, documents, and software. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 28/37
 
Intellectual property

Intellectual property, very generally, relates to the outputs that result from intellectual activity in the industrial, scientific, literary and artistic fields. Traditionally intellectual property is divided into two branches: 1) industrial property (inventions, marks, industrial designs, unfair competition and geographical indications), and 2) copyright. The protection of intellectual property is guaranteed through a variety of laws, which grant the creators of intellectual goods and services certain time-limited rights to control the use made of their products.

INDEXCARD, 29/37
 
Internet Architecture Board

On behalf of the Internet Society, the Internet Architecture Board oversees the evolution of the architecture, the standards and the protocols of the Net.

Internet Society: http://www.isoc.org/iab

http://www.isoc.org/
INDEXCARD, 30/37
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network, it mainly served to test the feasibility of wide area networks and the possibility of remote computing, and it was created for resource sharing between research institutions, not for messaging services like e-mail. Although the research was sponsored by the US military, ARPAnet was not designed for direct military use but to support military-related research.

In 1969 ARPAnet went online, linking the first two computers, one of them located at the University of California, Los Angeles, the other at the Stanford Research Institute.

But ARPAnet did not become widely accepted until it was demonstrated in action to an audience of computer experts at the First International Conference on Computers and Communication in Washington, D.C., in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers began offering the general public access to NSFnet. Beginning in 1995, after it had become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, together with the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 31/37
 
Electronic Messaging (E-Mail)

Electronic messages are transmitted and received by computers through a network. By e-mail, texts, images, sounds and videos can be sent to single users or simultaneously to a group of users. Texts can now be sent and read without ever being printed.

E-Mail is one of the most popular and important services on the Internet.

INDEXCARD, 32/37
 
Gateway

A gateway is a computer supplying point-to-multipoint connections between computer networks.

INDEXCARD, 33/37
 
Network Information Center (NIC)

Network information centers are organizations responsible for registering and maintaining Internet domain names. Until competition in domain name registration was introduced, they alone were responsible. Most countries have their own network information center.

INDEXCARD, 34/37
 
Amazon.com

Amazon.com is an online shop that serves approx. 17 mn customers in 150 countries. Starting out as a bookshop, Amazon today offers a wide range of other products as well.

Among privacy campaigners, the company's name has become almost synonymous with aggressive online direct marketing practices as well as user profiling and tracking. Amazon has been involved in privacy disputes on numerous occasions.

http://www.amazon.com/
http://www.computeruser.com/newstoday/00/01/0...
INDEXCARD, 35/37
 
Server

A server is a program, not a computer, as is sometimes said, dedicated to storing files, managing printers and network traffic, or processing database queries.

Web sites, the nodes of the World Wide Web (WWW), for example, are stored on servers.

INDEXCARD, 36/37
 
Sputnik

At the beginning of the story of today's global data networks is the story of the development of satellite communication.

In 1955 President Eisenhower announced the USA's intention to launch a satellite. But it was the Soviet Union which launched the first satellite in 1957: Sputnik I. After Sputnik's launch it became evident that the Cold War was also a race for leadership in the application of state-of-the-art technology to defence. As the US Department of Defence encouraged the formation of high-tech companies, it laid the ground for Silicon Valley, the hot spot of the world's computer industry.

In 1958 the USA launched its first satellite, Explorer I. In the same year data were transmitted over regular phone circuits for the first time, thus laying the ground for today's global data networks.

Today's satellites may record weather data, scan the planet with powerful cameras, offer global positioning and monitoring services, and relay high-speed data transmissions. Up to now, however, most satellites have been designed for military purposes such as reconnaissance.

INDEXCARD, 37/37