Identification in history

In biometric technology, the subject is reduced to its physical and therefore inseparable properties. The subject is a subject in so far as it is objectified; that is, in so far as it is identified with its own res extensa, Descartes' "extended thing". The subject exists in so far as it can be objectified; if it resists the objectification that comes with measurement, it is rejected or punished. Biometrics therefore provides the ultimate tool for control; in it, the dream of hermetic identity control seems to become a reality, a modern technological reconstruction of traditional identification techniques such as the handshake or the look into somebody's eyes.

The use of identification by states and other institutions of authority is evidently not simply a modern phenomenon. The ancient Babylonians and Chinese already made use of fingerprinting on clay to identify the authors of documents, while the Romans systematically compared handwriting.

Body measurement has long been used by the military. One of the first measures taken after entering the military is the identification and appropriation of a soldier's body measurements. These measurements are filed and combined with other data, making up what today we would call the soldier's data body. With his data body in the possession of the authority, a soldier is no longer able to socialise freely and is instead dependent on the disciplinary structure of the military institution. The soldier's social being in the world is defined by the military institution.

However, the military and civilian spheres of modern societies are no longer distinct entities. The very ambivalence of advanced technology (dual use technologies) has meant that "good" and "bad" uses of technology can no longer be clearly distinguished. The measurement of physical properties and the creation of data bodies is therefore no longer a military prerogative; it has become diffused into all areas of modern societies.

If the emancipatory potential of weak identities is to be of use, it is therefore necessary to know how biometric technologies work and what uses they are put to.

TEXTBLOCK 1/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658096
 
Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers which makes a mockery of dreams of increased competition.

Recent mega-mergers and acquisitions include

SBC Communications - Ameritech, $72.3 bn

Bell Atlantic - GTE, $71.3 bn

AT&T - MediaOne, $63.1 bn

AOL - Time Warner, $165 bn

MCI WorldCom - Sprint, $129 bn

The total value of all major mergers since the beginning of the 1990s has been $20 trillion, 2.5 times the size of the USA's GDP.

The AOL - Time Warner merger reflects a trend which can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance towards complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 2/29 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
Legal Protection: TRIPS (Trade-Related Aspects of Intellectual Property Rights)

Another important multilateral treaty concerned with intellectual property rights is the TRIPS agreement, which was negotiated during the Uruguay Round and entered into force with the establishment of the WTO in January 1995. It sets minimum standards for the national protection of intellectual property rights and procedures, as well as remedies for their enforcement (enforcement measures include the potential for trade sanctions against non-complying WTO members). The TRIPS agreement has been widely criticized for its stipulation that biological organisms be subject to intellectual property protection. In 1999, 44 nations considered it appropriate to treat plant varieties as intellectual property.

The complete TRIPS agreement can be found on: http://www.wto.org/english/tratop_e/trips_e/t_agm1_e.htm

TEXTBLOCK 3/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659758
 
Transparent customers. Direct marketing online



This process works even better on the Internet because of the latter's interactive nature. "The Internet is a dream to direct marketers", said Wil Lansing, CEO of the American retailer Fingerhut Companies. Many services require online registration, asking users to provide as much information about themselves as possible. In addition, the Internet is fast, cheap and used by people who tend to be young and on the lookout for something interesting.

Many websites are also equipped with user tracking technology that registers a user's behaviour and preferences during a visit. For example, user tracking technology is capable of identifying the equipment and software employed by a user, as well as movements on the website, the links visited, etc. Normally such information is anonymous, but it can be personalised when it is coupled with online registration, or when personal identification has been obtained from other sources. Registration is often a prerequisite not just for obtaining a free web mail account, but also for other services, such as personalised start pages. Based on the information provided by the user, the start page will then include advertisements and commercial offers that correspond to the user's profile, or to the user's activity on the website.
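
To make the mechanics of such tracking more concrete, the following sketch shows how a visitor's equipment and movements can be reconstructed from an ordinary web server log. It is an illustration only: the log lines, the regular expression and the field names are assumptions based on the common Apache "combined" log format, not on any particular tracking product.

import re
from collections import defaultdict

# Pattern for the common Apache "combined" log format (an assumption; other
# servers use different layouts).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d+) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

# Two invented log lines standing in for a real server log.
sample_log = [
    '10.0.0.1 - - [01/Jan/2000:12:00:00 +0000] "GET /index.html HTTP/1.0" '
    '200 1043 "-" "Mozilla/4.7 (Windows 98)"',
    '10.0.0.1 - - [01/Jan/2000:12:00:05 +0000] "GET /products.html HTTP/1.0" '
    '200 2310 "http://example.com/index.html" "Mozilla/4.7 (Windows 98)"',
]

# Group the requested pages (the visitor's path through the site) and the
# self-reported browser/OS string by client IP address.
visits = defaultdict(lambda: {"pages": [], "user_agent": None})
for line in sample_log:
    match = LOG_PATTERN.match(line)
    if not match:
        continue
    record = match.groupdict()
    visits[record["ip"]]["pages"].append(record["path"])
    visits[record["ip"]]["user_agent"] = record["user_agent"]

for ip, info in visits.items():
    print(ip, "|", info["user_agent"], "|", " -> ".join(info["pages"]))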

One frequent way of obtaining such personal information about a user is through free web mail accounts, offered by a great many companies, internet providers and web portals (e.g. Microsoft, Yahoo, Netscape and many others). In most cases, users get "free" accounts in return for submitting personal information and agreeing to receive marketing mails. Free web mail accounts are a simple and effective direct marketing and data capturing strategy which is, however, rarely understood as such. The alliances formed between direct advertising and marketing agencies on the one hand, and web mail providers on the other, such as the one between DoubleClick and Yahoo, show the common logic of data capturing and direct marketing. The alliance between DoubleClick and Yahoo eventually attracted the largest US direct marketing agency, Abacus Direct, which ended up being acquired by DoubleClick.

However, the intention of collecting users' personal data and creating consumer profiles based on online behaviour can also take on more creative and playful forms. One such example is sixdegrees.com. This is a networking site based on the assumption that everybody on the planet is connected to everybody else by a chain of at most six people. The site offers users the chance to get to know a lot of new people, the friends of their friends of their friends, for example, and if they try hard enough, eventually Warren Beatty or Claudia Schiffer. But of course, in order to make the whole game more useful for marketing purposes, users are encouraged to join groups which share common interests, groups which are identical to marketing categories ranging from arts and entertainment to travel and holiday. Evidently, the game becomes more interesting the more new people a user brings into the network. What seems to be fun for the 18 to 24 year old college student segment targeted by sixdegrees is, of course, real business. While users entertain themselves they are being carefully profiled. After all, data on young people who can be expected to be relatively affluent one day are worth more than money.

The particular way in which sites such as sixdegrees.com and others are structured means not only that users provide initial information about themselves, but also that this information is constantly updated and therefore becomes even more valuable. Consequently, many free online services or web mail providers cancel a user's account if it has not been used for some time.

There are also other online services which offer free services in return for personal information that is then used for marketing purposes, e.g. Yahoo's Geocities, where users may maintain their own free websites, or Bigfoot, where people are offered a free e-mail address for life that acts as a relay whenever a customer's residence or e-mail address changes. In this way, of course, the marketers can identify friendship and other social networks, and turn this knowledge into a marketing advantage. People finders such as WhoWhere? operate along similar lines.

A further way of collecting consumer data that has recently become popular is offering free PCs. Users are provided with a PC for free or for very little money, and in return commit themselves to using certain services rather than others (e.g. a particular internet provider), to providing information about themselves, and to having their online behaviour monitored by the company providing the PC, so that accurate user profiles can be compiled. For example, the Free PC Network offers advertisers user profiles containing "over 60 individual demographics". There are literally thousands of variations of how a user's data are extracted and commercialised when online. Usually this happens quietly in the background.

A good inside view of the world of direct marketing can be gained at the website of the American Direct Marketing Association and the Federation of European Direct Marketing.

TEXTBLOCK 4/29 // URL: http://world-information.org/wio/infostructure/100437611761/100438659667
 
Legal Protection: WIPO (World Intellectual Property Organization)

Arguably the major player in the field of international intellectual property protection, and the administrator of various multilateral treaties dealing with the legal and administrative aspects of intellectual property, is the WIPO.

Information on WIPO administered agreements in the field of industrial property (Paris Convention for the Protection of Industrial Property (1883), Madrid Agreement Concerning the International Registration of Marks (1891) etc.) can be found on: http://www.wipo.org/eng/general/index3.htm

Information on treaties concerning copyright and neighboring rights (Berne Convention for the Protection of Literary and Artistic Works (1886) etc.) is published on: http://www.wipo.org/eng/general/index5.htm

The most recent multilateral agreement on copyright is the 1996 WIPO Copyright Treaty. Among other things it provides that computer programs are protected as literary works and also introduces the protection of databases, which "... by reason of the selection or arrangement of their content constitute intellectual creations." Furthermore, the 1996 WIPO Copyright Treaty contains provisions concerning technological measures and rights management information, and establishes a new "right of communication to the public". It is available on: http://www.wipo.org/eng/diplconf/distrib/treaty01.htm

TEXTBLOCK 5/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659588
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there are server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operation has long depended on global networking.

At the top are credit card companies such as Visa, American Express, MasterCard, and others. It would be erroneous to believe that the primary business of these companies is the provision of credit, and the facilitation of credit information for sales transactions. In fact, information means much more than just credit information. In an advertisement of 1982, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the life blood of society, as Bill Gates expresses it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within the data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimisation - for almost every decision taken. Data bunkers therefore have an essentially conservative political function, and serve to maintain and strengthen the given social structures.

TEXTBLOCK 6/29 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
Basics: Infringement and Fair Use

The rights of a copyright holder are infringed when one of the acts requiring the authorization of the owner is done by someone else without his consent. In the case of copyright infringement or the violation of neighboring rights the remedies for the copyright owner consist of civil redress. The unauthorized copying of protected works for commercial purposes and the unauthorized commercial dealing in copied material is usually referred to as "piracy".

Yet copyright laws also provide that the rights of copyright owners are subject to the doctrine of "fair use". That allows the reproduction and use of a work, notwithstanding the rights of the author, for limited purposes such as criticism, comment, news reporting, teaching, and research. Fair use may be described as the privilege to use the copyrighted material in a reasonable manner without the owner's consent. To determine whether a use is fair or not most copyright laws consider:

- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes (usually certain types of educational copying are allowed)

- the nature of the copyrighted work (mostly originals made for commercial reasons are less protected than their purely artistic counterparts)

- the amount and substantiality of the portion used in relation to the copyrighted work as a whole

- the effect of the use upon the potential market for or value of the copyrighted work (as a general rule copying may be permitted if it is unlikely to cause economic harm to the original author)

Examples of activities that may be excused as fair use include: providing a quotation in a book review; distributing copies of a section of an article in class for educational purposes; and imitating a work for the purpose of parody or social commentary.

TEXTBLOCK 7/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659569
 
Biometric applications: surveillance

Biometric technologies are not surveillance technologies in themselves, but as identification technologies they provide an input into surveillance: technologies such as face recognition are combined with camera systems and criminal databases in order to supervise public places and single out individuals.

Another example is the use of biometric technologies in the supervision of probationers, who in this way can carry their special hybrid status between imprisonment and freedom with them, so that they can be tracked down easily.

Unlike biometric applications in access control, where one is aware of the biometric data extraction process, what makes biometrics used in surveillance a particularly critical issue is the fact that biometric samples are extracted routinely, unnoticed by the individuals concerned.

TEXTBLOCK 8/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658740
 
Private data bunkers

On the other hand, there are the data bunkers of the private sector, whose position is different. Although these are fast-growing engines of data collection with a much greater degree of dynamism, they may not have the same privileged position - although one has to differentiate according to the general historical and social conditions into which a data bunker is embedded. For example, it can safely be assumed that the databases of a large credit card company or bank are better protected than those of the bureaucracies of small developing countries.

Private data bunkers include

    Banks

    Building societies

    Credit bureaus

    Credit card companies

    Direct marketing companies

    Insurance companies

    Telecom service providers

    Mail order stores

    Online stores


TEXTBLOCK 9/29 // URL: http://world-information.org/wio/infostructure/100437611761/100438659735
 
Anonymity

"Freedom of anonymous speech is an essential component of free speech."

Ian Goldberg/David Wagner, TAZ Servers and the Rewebber Network: Enabling Anonymous Publishing on the World Wide Web, in: First Monday 3,4, 1999

Someone wants to hide his or her identity, to remain anonymous, if s/he fears being held accountable for something, say a publication, that is considered prohibited. Anonymous publishing has a long tradition in European history. Writers of erotic literature or pamphlets, for example, preferred to use pseudonyms or to publish anonymously. During the Enlightenment, books such as d'Alembert's and Diderot's famous Encyclopaedia were printed and distributed secretly. Today Book Locker, a company selling electronic books, renews this tradition by allowing authors to post writings anonymously, to publish without the threat of perishing for it. Sometimes anonymity is a precondition for reporting human rights abuses. For example, investigative journalists and regime critics may rely on anonymity. But we do not have to look that far; even you might need or use anonymity sometimes, say, when you are a woman wanting to avoid sexual harassment in chat rooms.

The original design of the Net, as far as it is preserved, offers a relatively high degree of privacy, because due to the client-server model all that is known about you is a report from the machine from which information was, or is, requested. But this design of the Net interferes with the wish of corporations to know you, even to know more about you than you want them to know. What is euphemistically called customer relationship management means the collection, compilation and analysis of personal information about you by others.

In 1997, America Online member Timothy McVeigh, a Navy employee, made his homosexuality publicly known in a short autobiographical sketch. Another Navy employee who read this sketch informed the Navy. America Online revealed McVeigh's identity to the Navy, which discharged McVeigh. As a consequence of a court ruling on that case, Timothy McVeigh was allowed to return to the Navy. Sometimes anonymity really matters.

On the Net you still have several possibilities to remain anonymous. You may visit websites via an anonymizing service. You might use a web mail account (provided the personal information given to the web mail service provider is not true), or you might use an anonymous remailing service, which strips the headers off your mail to make it impossible to identify the sender and then forwards your message. Used in combination with encryption tools and technologies like FreeHaven or Publius, anonymous messaging services provide a powerful tool for countering censorship.
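
The header-stripping idea behind such remailers can be illustrated with a short Python sketch. This is a minimal illustration only: the remailer address is hypothetical, the list of headers to remove is an assumption, nothing is actually sent, and real remailers add encryption, reordering and padding on top of this.

from email import message_from_string

REMAILER_ADDRESS = "remailer@example.org"   # hypothetical remailer address

def strip_identifying_headers(raw_message, final_recipient):
    """Remove headers that identify the original sender and re-address the mail."""
    msg = message_from_string(raw_message)
    for header in ("From", "Sender", "Reply-To", "Return-Path",
                   "Received", "Message-ID", "X-Mailer"):
        del msg[header]                     # deletes every occurrence, if present
    msg["From"] = REMAILER_ADDRESS          # mail now appears to come from the remailer
    del msg["To"]
    msg["To"] = final_recipient
    return msg.as_string()

original = (
    "From: alice@example.net\r\n"
    "To: remailer@example.org\r\n"
    "Subject: hello\r\n"
    "\r\n"
    "The message body stays untouched.\r\n"
)
print(strip_identifying_headers(original, "bob@example.com"))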

In Germany, in 1515, printers had to swear not to print or distribute any publication that bypassed the councilmen. Today repressive regimes, such as China and Burma, and democratic governments, such as France and Great Britain, alike impose or have already imposed laws against anonymous publishing on the Net.

Anonymity might be used for abuses, that is true, but "the burden of proof rests with those who would seek to limit it." (Rob Kling, Ya-ching Lee, Al Teich, Mark S. Frankel, Assessing Anonymous Communication on the Internet: Policy Deliberations, in: The Information Society, 1999).

TEXTBLOCK 10/29 // URL: http://world-information.org/wio/infostructure/100437611742/100438659040
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille, a card with holes: the message is written on paper through the holes of the card. Afterwards the card is removed and the blanks are filled in, so that the message looks like an ordinary letter. The recipient needs to own an identical card

- Bishop John Wilkins invents a cryptologic system that looks like music notes. In a book he describes several forms of steganographic systems, like secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken way of encryption that had already been used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one gets suspicious that the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary scale which we still use today - in a much more advanced form, of course, for example in the ASCII code

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which actually is no code but an enciphered alphabet of short and long sounds. The first Morse code-message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes are absolutely necessary by then

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 for the first time a tomographic cipher gets described

1861 Friedrich W. Kasiski publishes a cryptanalysis of the Vigenère cipher, which had been considered uncrackable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography again and makes them even more important

TEXTBLOCK 11/29 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless seized upon by many media and decision makers because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques all but impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because ever since the inception of the ARPANet, the predecessor of the Internet, network measurement has been an important task. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock went on to work on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. Measurement was considered important for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at the efforts undertaken to measure the Internet and dismantle several misunderstandings that lead to incorrect measurements and estimations. Some measurements have such large error margins that one might better call them estimates, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet is currently far smaller, in both capacity and traffic, than the switched voice network (with an effective bandwidth of 75 Gbps as of December 1997), but the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U.S. (with an effective bandwidth of about 330 Gbps as of December 1997), although they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, which consume more bandwidth; both are potential sources of unanticipated amounts of data traffic.
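
The arithmetic behind the claim that data traffic will overtake voice traffic around 2002 can be made concrete with a small Python sketch. Only the roughly 100% annual growth of data traffic is taken from the text; the starting volumes and the growth rate assumed for voice traffic are purely illustrative.

def crossover_year(start_year, data, voice, data_growth=1.0, voice_growth=0.1):
    """Return the first year in which data traffic exceeds voice traffic."""
    year = start_year
    while data < voice:
        data *= 1 + data_growth      # data traffic roughly doubles every year
        voice *= 1 + voice_growth    # voice traffic grows far more slowly (assumed rate)
        year += 1
    return year

# Hypothetical 1997 volumes in arbitrary units (voice assumed ~10x data traffic).
print(crossover_year(1997, data=1.0, voice=10.0))   # prints 2001 with these assumptions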

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one flaw: the ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
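
The sample-and-project method itself is simple; the following Python sketch shows the idea. The probe is stubbed out (a real survey would ping each sampled address), and all numbers are invented for illustration.

import random

def estimate_host_count(assigned_addresses, probe, sample_fraction=0.01):
    """Probe a random sample of addresses and project the result onto all of them."""
    sample_size = max(1, int(len(assigned_addresses) * sample_fraction))
    sample = random.sample(assigned_addresses, sample_size)
    responding = sum(1 for address in sample if probe(address))
    # Scale the share of responding hosts in the sample up to the whole list.
    return int(len(assigned_addresses) * responding / sample_size)

# Stand-in probe: pretend that roughly 70% of sampled addresses answer a ping.
fake_probe = lambda address: random.random() < 0.7

addresses = ["10.0.%d.%d" % (i // 256, i % 256) for i in range(50000)]
print(estimate_host_count(addresses, fake_probe))    # roughly 35000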

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, which displays traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare response times to past ones and to the response times of servers in the same reach.
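
A rough sketch of such response-time monitoring might look as follows. Since a real ICMP ping requires raw-socket privileges, the sketch times a TCP connection to port 80 as a stand-in; the host list is an arbitrary example.

import socket
import time

def response_time(host, port=80, timeout=2.0):
    """Return the TCP connect time in milliseconds, or None if unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None
    return (time.monotonic() - start) * 1000

hosts = ["www.example.com", "www.example.org"]      # arbitrary example hosts
for host in hosts:
    rtt = response_time(host)
    print(host, "unreachable" if rtt is None else "%.1f ms" % rtt)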

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited websites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a website, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your website) you generate.
In the meantime, page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others; and a user might access not the site itself, but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
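
The difference between hits and page views can be shown in a few lines of Python. The request list and the rule that only HTML documents count as page views are illustrative assumptions; real auditing rules are more elaborate.

# Every requested file is a "hit"; only requests for HTML documents count as
# "page views" here (an assumed rule - real audit rules are more elaborate).
requests = [
    "/index.html", "/logo.gif", "/banner.gif", "/style.css",   # one page, four hits
    "/products.html", "/photo1.jpg", "/photo2.jpg",            # one page, three hits
]

hits = len(requests)
page_views = sum(1 for path in requests if path.endswith((".html", ".htm", "/")))

print("hits:", hits)              # 7
print("page views:", page_views)  # 2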

The editors of some electronic journals (e-journals) especially rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. They may read a journal just for a special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but only slightly better at best, is counting visits, the access of several pages of a website during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared and the Internet is a client-server system; in a certain sense, it is in fact computers that communicate with each other. Therefore many content providers are eager to get to know more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your website. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative, and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London, have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 12/29 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
The Piracy "Industry"

Until recent years, the problem of piracy (the unauthorized reproduction or distribution of copyrighted works for commercial purposes) was largely confined to the copying and physical distribution of tapes, disks and CDs. Yet the emergence and increased use of global data networks and the WWW have added a new dimension to the piracy of intellectual property by permitting still easier copying, electronic sales and transmission of illegally reproduced copyrighted works on a grand scale.

This new development, often referred to as Internet piracy, broadly relates to the use of global data networks to 1) transmit and download digitized copies of pirated works, 2) advertise and market pirated intellectual property that is delivered on physical media through the mails or other traditional means, and 3) offer and transmit codes or other technologies which can be used to circumvent copy-protection security measures.

Lately the International Intellectual Property Alliance has published a new report on the estimated trade losses due to piracy. (The IIPA assumes that their report actually underestimates the loss of income due to the unlawful copying and distribution of copyrighted works. Yet it should be taken into consideration that the IIPA is the representative of the U.S. core copyright industries (business software, films, videos, music, sound recordings, books and journals, and interactive entertainment software).)

Table: IIPA 1998 - 1999 Estimated Trade Loss due to Copyright Piracy (in millions of US$)

Total Losses                  1999      1998
Motion Pictures               1323      1421
Records & Music               1684      1613
Business Applications         3211      3437
Entertainment Software        3020      2952
Books                          673       619

Total Losses (core copyright industries): 9910.0 (1999), 10041.5 (1998)

TEXTBLOCK 13/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659531
 
Basics: Protected Persons

Generally, copyright vests in the author of the work. Certain national laws provide for exceptions and, for example, regard the employer as the original owner of a copyright if the author was, when the work was created, an employee employed for the purpose of creating that work. In the case of some types of creations, particularly audiovisual works, several national laws provide for different solutions to the question of who should be the first holder of copyright in such works.

Many countries allow copyright to be assigned, which means that the owner of the copyright transfers it to another person or entity, which then becomes its holder. When the national law does not permit assignment it usually provides the possibility to license the work to someone else. Then the owner of the copyright remains the holder, but authorizes another person or entity to exercise all or some of his rights subject to possible limitations. Yet in any case the "moral rights" always belong to the author of the work, whoever may be the owner of the copyright (and therefore of the "economic rights").


TEXTBLOCK 14/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659527
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of networks have to agree upon. So there is no way to ever gain ultimate control of the Matrix/Internet.
The partly decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and no common standards for sustaining basic operations: there are authorities for IP number and domain name registration, for example.
Ever since, the organizational structures for Internet administration have changed according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies cooperatively make decisions on common guidelines, such as communication protocols, so that the compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consensus about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just express the prevailing consensus. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulations, as it was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is self-regulating, indeed.
For a detailed report on Internet governance, click here.

TEXTBLOCK 15/29 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Legal Protection: European Union

Within the EU's goal of establishing a European single market, intellectual property rights are also of significance. Therefore the European Commission aims at the harmonization of the respective national laws of the EU member states and at a generally more effective protection of intellectual property on an international level. Over the years it has adopted a variety of Conventions and Directives concerned with different aspects of the protection of industrial property as well as copyright and neighboring rights.

An overview of EU activities relating to intellectual property protection is available on the website of the European Commission (DG Internal Market): http://www.europa.eu.int/comm/internal_market/en/intprop/intprop/index.htm

TEXTBLOCK 16/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659574
 
2000 A.D.

2000
Convergence of telephony, audiovisual technologies and computing

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new communication systems. This is what is meant by today's buzzwords "multimedia" and "convergence".

Classical dichotomies, such as the one between computing and telephony, and traditional categorizations no longer apply, because these new services no longer fit the traditional categories.

Convergence and Regulatory Institutions

Digital technology permits the integration of telecommunications with computing and audiovisual technologies. New services that extend existing communication systems emerge. The convergence of communication and media systems corresponds to a convergence of corporations. Recently, America Online, the world's largest online service provider, merged with Time Warner, the world's largest media corporation. For such corporations the classical approach to regulation - separate institutions regulate separate markets - is no longer appropriate, because the institutions' activities necessarily overlap. The current challenges posed to these institutions are not solely due to the convergence of communication and media systems made possible by digital technologies; they are also due to the liberalization and internationalization of the electronic communications sector. For regulation to be successful, new categorizations and supranational agreements are needed.
For further information on this issue see Natascha Just and Michael Latzer, The European Policy Response to Convergence with Special Consideration of Competition Policy and Market Power Control, http://www.soe.oeaw.ac.at/workpap.htm or http://www.soe.oeaw.ac.at/WP01JustLatzer.doc.

TEXTBLOCK 17/29 // URL: http://world-information.org/wio/infostructure/100437611796/100438659802
 
Intellectual Property: A Definition

Intellectual property, very generally, relates to the output resulting from intellectual activity in the industrial, scientific, literary and artistic fields. Traditionally, intellectual property is divided into two branches:

1) Industrial Property

a) Inventions
b) Marks (trademarks and service marks)
c) Industrial designs
d) Unfair competition (trade secrets)
e) Geographical indications (indications of source and appellations of origin)

2) Copyright

The protection of intellectual property is guaranteed through a variety of laws, which grant the creators of intellectual goods and services certain time-limited rights to control the use made of their products. Those rights apply to the intellectual creation as such, and not to the physical object in which the work may be embodied.

TEXTBLOCK 18/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659434
 
Product Placement

With television still being very popular, commercial entertainment has transferred the concept of soap operas onto the Web. The first of this new species of "cybersoaps" was "The Spot", a story about the ups and downs of an American commune. The Spot not only attracted a large audience within a short time, but also pioneered the field of online product placement. Besides Sony banners, the company's logo is also placed on nearly every electronic product appearing in the story. Appearing as a site for light entertainment, The Spot's main goal is to make the name Sony and its product range well known within the target audience.

TEXTBLOCK 19/29 // URL: http://world-information.org/wio/infostructure/100437611652/100438658026
 
Feeding the data body

TEXTBLOCK 20/29 // URL: http://world-information.org/wio/infostructure/100437611761/100438659644
 
Virtual body and data body



The result of this informatisation is the creation of a virtual body which is the exterior of a man's or woman's social existence. It plays the same role as the physical body, except that it is located in virtual space (it has no real location). The virtual body holds a certain emancipatory potential. It allows us to go to places and do things which would be impossible in the physical world. It does not have the weight of the physical body, and is less conditioned by physical laws. It therefore allows one to create an identity of one's own, with far fewer restrictions than would apply in the physical world.

But this new freedom has a price. In the shadow of virtualisation, the data body has emerged. The data body is a virtual body which is composed of the files connected to an individual. As the Critical Art Ensemble observe in their book Flesh Machine, the data body is the "fascist sibling" of the virtual body; it is " a much more highly developed virtual form, and one that exists in complete service to the corporate and police state."

The virtual character of the data body means that social regulation that applies to the real body is absent. While there are limits to the manipulation and exploitation of the real body (even if these limits are not respected everywhere), there is little regulation concerning the manipulation and exploitation of the data body, although the manipulation of the data body is much easier to perform than that of the real body. The seizure of the data body from outside the concerned individual is often undetected as it has become part of the basic structure of an informatised society. But data bodies serve as raw material for the "New Economy". Both business and governments claim access to data bodies. Power can be exercised, and democratic decision-taking procedures bypassed by seizing data bodies. This totalitarian potential of the data body makes the data body a deeply problematic phenomenon that calls for an understanding of data as social construction rather than as something representative of an objective reality. How data bodies are generated, what happens to them and who has control over them is therefore a highly relevant political question.

TEXTBLOCK 21/29 // URL: http://world-information.org/wio/infostructure/100437611761/100438659695
 
Challenges for Copyright by ICT: Internet Service Providers

ISPs (Internet Service Providers) (and to a certain extent also telecom operators) are involved in the copyright debate primarily because of their role in the transmission and storage of digital information. Problems arise particularly concerning caching, information residing on systems or networks of ISPs at the direction of users, and transitory communication.

Caching

Caching, it is argued, could cause damage because the copies in the cache are not necessarily the most current ones, and the delivery of outdated information to users could deprive website operators of accurate "hit" information (information about the number of requests for a particular material on a website) from which advertising revenue is frequently calculated. Similarly, harms such as defamation or infringement that existed on the original page may propagate for years until they are flushed from each cache where they have been replicated.

Although different concepts, similar issues to caching arise with mirroring (establishing an identical copy of a website on a different server), archiving (providing a historical repository for information, such as with newsgroups and mailing lists), and full-text indexing (the copying of a document for loading into a full-text or nearly full-text database which is searchable for keywords or concepts).

Under a literal reading of some copyright laws, caching constitutes an infringement of copyright. Yet recent legislation like the DMCA or the proposed EU Directive on copyright and related rights in the information society (amended version) has provided exceptions for ISPs concerning particular acts of reproduction that are considered technical copies (caching). Nevertheless, the exemption of liability for ISPs only applies if they meet a variety of specific conditions. In the course of the debate about caching, suggestions have also been made to subject it to an implied license or fair use defense, or to make it (at least theoretically) actionable.

Information Residing on Systems or Networks at the Direction of Users

ISPs may be confronted with problems if infringing material on websites (of users) is hosted on their systems. Although some copyright laws like the DMCA provide for limitations on the liability of ISPs if certain conditions are met, it is as yet unclear whether ISPs should generally be held accountable for the storage of infringing material (even if they do not have actual knowledge) or whether exceptions should be established under specific circumstances.

Transitory Communication

In the course of transmitting digital information from one point on a network to another, ISPs act as a data conduit. If a user requests information, ISPs engage in its transmission, the provision of a connection, or its routing. In the case of a person sending infringing material over a network, with the ISP merely providing the facilities for the transmission, it is widely held that the ISP should not be liable for infringement. Yet some copyright laws like the DMCA provide for a limitation of liability (which also covers the intermediate and transient copies that are made automatically in the operation of a network) only if the ISP's activities meet certain conditions.

For more information on copyright (intellectual property) related problems of ISPs (BBSs (Bulletin Board Service Operators), systems operators and other service providers) see:

Harrington, Mark E.: On-line Copyright Infringement Liability for Internet Service Providers: Context, Cases & Recently Enacted Legislation. In: Intellectual Property and Technology Forum. June 4, 1999.

Teran, G.: Who is Vulnerable to Suit? ISP Liability for Copyright Infringement. November 2, 1999.

TEXTBLOCK 22/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659550
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe or unbreakable. To decipher a text means to go through many, sometimes nearly - but never truly - endless attempts. For the computers of today it might take hundreds of years or even more to go through all possible codes, but in the end the code remains breakable. The much faster quantum computers will prove that one day.
Therefore the decision to elect a certain method of enciphering finally is a matter of trust.
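
The point can be illustrated with a toy example in Python: deciphering without the key means trying keys until a plausible text appears. With a Caesar shift the key space has only 26 entries, so the exhaustive search finishes instantly; with a modern cipher the same search would take an impractically long time, but it never becomes impossible in principle. The ciphertext and the plausibility test below are invented for illustration.

import string

ALPHABET = string.ascii_uppercase

def caesar_decrypt(ciphertext, shift):
    """Shift every letter back by the given amount, leaving other characters alone."""
    return "".join(
        ALPHABET[(ALPHABET.index(c) - shift) % 26] if c in ALPHABET else c
        for c in ciphertext
    )

ciphertext = "WKLV LV QRW VHFXUH"    # "THIS IS NOT SECURE" shifted by 3
for shift in range(26):               # exhaustive search over the whole key space
    candidate = caesar_decrypt(ciphertext, shift)
    if "THIS" in candidate:           # crude plausibility test standing in for language analysis
        print("key", shift, ":", candidate)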

For the average user of computers it is rather difficult to understand or even realize the dangers and/or the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others, the specialists, and relying on the information they provide.
The websites explaining the problems behind it (and also the articles and books concerning the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background appears as an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. The media reports mentioned often specialize in talking about the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing control over the money drawn from one's bank account if someone steals one's credit card number or other important financial data. The term "security", surely connected to those issues, is a completely different one from the security that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

For the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 23/29 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Biometrics applications: gate keeping

Identity has to do with "place". In less mobile societies, the place where a person finds him/herself tells us something about his/her identity. In pre-industrial times, gatekeepers had the function of controlling access of people to particular places, i.e. the gatekeeper's function was to identify people and then decide whether somebody's identity would allow that person to physically occupy another place - a town, a building, a vehicle, etc.

In modern societies, the unambiguous nature of place has been weakened. There is a great amount of physical mobility, and ever since the emergence and spread of electronic communication technologies there has been a "virtualisation" of places in what today we call "virtual space" (unlike place, space has been a virtual reality from the beginning, a mathematical formula). The question as to who one is is no longer coupled to the physical abode. Highly mobile and virtualised social contexts require a new generation of gatekeepers, which biometric technology aims to provide.

TEXTBLOCK 24/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658757
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endured well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Rights are often exercised by only one individual in each generation, frequently through matrilineal descent.


TEXTBLOCK 25/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
Legal Protection: National Legislation

Intellectual property - comprising industrial property and copyright - in general is protected by national legislation. Therefore those rights are limited territorially and can be exercised only within the jurisdiction of the country or countries under whose laws they are granted.

TEXTBLOCK 26/29 // URL: http://world-information.org/wio/infostructure/100437611725/100438659540
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking in size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with the advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

IBM's Stretch and Sperry-Rand's LARC (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were costly and too powerful for the needs of the business sector. As a result, only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information by computer. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, which gave them the flexibility to be cost-effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 27/29 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Other biometric technologies

Other biometric technologies not specified here include ear recognition, signature dynamics, key stroke dynamics, vein pattern recognition, retinal scan, body odour recognition, and DNA recognition. These are technologies which are either in early stages of development or used in highly specialised and limited contexts.

TEXTBLOCK 28/29 // URL: http://world-information.org/wio/infostructure/100437611729/100438658399
 
Global hubs of the data body industry

While most data bunkers are restricted to particular areas or contexts, there are others which act as global data nodes. Companies such as EDS (Electronic Data Systems), Experian, First Data Corporation and Equifax operate globally and run giant databases containing personal information. They are the global hubs of the data body economy.

Company                    Sales in USD billions    Size of client database in million datasets
Equifax                    1,7                      360
Experian                   1,5                      779
First Data Corporation     5,5                      260
EDS                        18,5                     (not disclosed)

(Sales and database sizes, 1998)

The size of these data repositories is constantly growing, so it is only a matter of time before everybody living in the technologically saturated parts of the world is registered in one of these data bunkers.

Among these companies, EDS, founded by the former US presidential candidate Ross Perot, who is known for his right-wing views and direct language, is of particular importance. Not only is it the world's largest data body company, it is also secretive about the size of its client database - a figure the other companies disclose either in company publications or upon enquiry. After all, the size of such a database makes a company more attractive to potential customers.

For many years, EDS has been surrounded by rumours concerning sinister involvement with intelligence agencies. Beyond the rumours, though, there are also facts. EDS has a special division for government services. EDS does business with all military agencies of the US, as well as law enforcement agencies, justice agencies, and many others. The company also maintains a separate division for military equipment. In 1984, the company became a subsidiary of General Motors, itself a leading manufacturer of military and intelligence systems. EDS is listed by the Federation of American Scientists' intelligence resource program as a contractor to US intelligence agencies, and prides itself, amongst other things, on responding to the "rise of the citizen as a consumer".

TEXTBLOCK 29/29 // URL: http://world-information.org/wio/infostructure/100437611761/100438659778
 
File Transfer Protocol (FTP)

FTP enables the transfer of files (text, image, video, sound) to and from other remote computers connected to the Internet.
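As a minimal illustration of the protocol in use, the following sketch retrieves a file with Python's standard ftplib module; the host name and file name are placeholders, not references from this text.

from ftplib import FTP

# Connect to a (hypothetical) FTP server, log in anonymously and download a file.
with FTP("ftp.example.org") as ftp:
    ftp.login()                                    # anonymous login
    with open("example.txt", "wb") as local_file:
        ftp.retrbinary("RETR example.txt", local_file.write)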

INDEXCARD, 1/36
 
WTO

An international organization designed to supervise and liberalize world trade. The WTO (World Trade Organization) is the successor to the General Agreement on Tariffs and Trade (GATT), which was created in 1947 and liberalized the world's trade over the next five decades. The WTO came into being on Jan. 1, 1995, with 104 countries as its founding members. The WTO is charged with policing member countries' adherence to all prior GATT agreements, including those of the last major GATT trade conference, the Uruguay Round (1986-94), at whose conclusion GATT had formally gone out of existence. The WTO is also responsible for negotiating and implementing new trade agreements. The WTO is governed by a Ministerial Conference, which meets every two years; a General Council, which implements the conference's policy decisions and is responsible for day-to-day administration; and a director-general, who is appointed by the Ministerial Conference. The WTO's headquarters are in Geneva, Switzerland.



INDEXCARD, 2/36
 
François Duvalier

b. April 14, 1907, Port-au-Prince, Haiti
d. April 21, 1971, Port-au-Prince

Known as PAPA DOC, Duvalier was president of Haiti, and his 14-year regime was of unprecedented duration in that country. A supporter of President Dumarsais Estimé, Duvalier was appointed director general of the National Public Health Service in 1946. He was appointed underminister of labour in 1948 and the following year became minister of public health and labour, a post that he retained until May 10, 1950, when President Estimé was overthrown by a military junta under Paul E. Magloire, who was subsequently elected president. By 1954 he had become the central opposition figure and went underground. Duvalier was elected president in September 1957. Setting out to consolidate his power, he reduced the size of the army and organized the Tontons Macoutes ("Bogeymen"), a private force responsible for terrorizing and assassinating alleged foes of the regime. Late in 1963 Duvalier moved further toward an absolutist regime, promoting a cult of his person as the semi-divine embodiment of the Haitian nation. In April 1964 he was declared president for life. Although diplomatically almost completely isolated, excommunicated by the Vatican until 1966 for harassing the clergy, and threatened by conspiracies against him, Duvalier was able to stay in power longer than any of his predecessors.

INDEXCARD, 3/36
 
Cisco, Inc.

As the worldwide leader in networking for the Internet, Cisco Systems is one of the most prominent companies of the Internet industry.

http://www.cisco.com

INDEXCARD, 4/36
 
Internet Architecture Board

On behalf of the Internet Society, the Internet Architecture Board oversees the evolution of the architecture, the standards and the protocols of the Net.

Internet Society: http://www.isoc.org/iab

http://www.isoc.org/
INDEXCARD, 5/36
 
Kosov@

The "word" Kosov@ is a compromise between the Serb name KosovO and the Albanian KosovA. It is mostly used by international people who want to demonstrate a certain consciousness about the conflict including some sort of neutrality, believing that neither the one side nor the other (and maybe not even NATO) is totally right. Using the word Kosov@ is seen as a symbol of peace.

For more explanations (in German) see: http://www.zivildienst.at/kosov@.htm

http://www.zivildienst.at/kosov@.htm
INDEXCARD, 6/36
 
America Online

Founded in 1985, America Online is the world's biggest Internet service provider, serving almost every second Internet user. Additionally, America Online operates CompuServe, the Netscape Netcenter and several AOL.com portals. As the owner of Netscape, Inc., America Online also plays an important role in the Web browser market. In January 2000 America Online merged with Time Warner, the world's leading media conglomerate, in a US$ 243,3 billion deal, making America Online the senior partner with 55 percent of the new company.

http://www.aol.com

http://www.aol.com/
INDEXCARD, 7/36
 
Netiquette

Although referred to as a single body of rules, there is not just one Netiquette but several, though they overlap to a large extent. Proposing general guidelines for posting messages to newsgroups and mailing lists and for using the World Wide Web and FTP, Netiquettes address civility topics (e.g., avoiding hate speech) and offer technical advice (e.g., using simple and platform-independent file formats).
Well-known Netiquettes are the Request for Comments 1855 and The Net: User Guidelines and Netiquette by Arlene H. Rinaldi.

ftp://ftp.isi.edu/in-notes/rfc1855.txt
http://www.fau.edu/netiquette/net/index.html
INDEXCARD, 8/36
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between the facts, to achieve a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but they still have the advantage of being much cheaper than paying an expert or a team of specialists.
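A minimal sketch of the rule-based approach described above, with made-up facts and troubleshooting rules purely for illustration: the "knowledge base" is a list of condition/conclusion rules, and the "inference engine" applies them until no new conclusions can be derived.

# Knowledge base: each rule pairs a set of required facts with a conclusion.
rules = [
    ({"printer_on", "no_output"}, "check_paper_tray"),
    ({"check_paper_tray", "tray_empty"}, "advise_refill_paper"),
]

def infer(facts, rules):
    """Forward-chaining inference engine: apply rules until nothing new follows."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(infer({"printer_on", "no_output", "tray_empty"}, rules))

A real expert system would add the third element mentioned above, an interface that asks the user for missing facts instead of having them supplied up front.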

INDEXCARD, 9/36
 
Virtual Private Networks

Virtual Private Networks provide secured connections to a corporate site over a public network such as the Internet. Data transmitted through these secure connections are encrypted and therefore have to be decrypted before they can be read.
These networks are called virtual because connections are provided only when you connect to a corporate site; they do not rely on dedicated lines and therefore also support mobile use.
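The underlying principle - data are encrypted before they cross the public network and decrypted only at the corporate site - can be illustrated with a short sketch using the third-party Python package cryptography. This shows symmetric encryption only, not an actual VPN protocol such as IPsec.

from cryptography.fernet import Fernet

key = Fernet.generate_key()          # shared secret between client and corporate gateway
cipher = Fernet(key)

# Sender side: encrypt before the data travel over the public network.
token = cipher.encrypt(b"confidential corporate data")

# Receiver side: only the holder of the key can decrypt and read the data.
print(cipher.decrypt(token))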

INDEXCARD, 10/36
 
Telnet

Telnet allows you to log in remotely to a computer connected to the Internet.

INDEXCARD, 11/36
 
Amazon.com

Amazon.com is an online shop that serves approx. 17 mn customers in 150 countries. Starting out as a bookshop, Amazon today offers a wide range of other products as well.

Among privacy campaigners, the company's name has become almost synonymous with aggressive online direct marketing practices as well as user profiling and tracking. Amazon has been involved in privacy disputes on numerous occasions.

http://www.amazon.com/
http://www.computeruser.com/newstoday/00/01/0...
INDEXCARD, 12/36
 
Satellites

Communications satellites are relay stations for radio signals and provide reliable and distance-independent high-speed connections even at remote locations without high-bandwidth infrastructure.

In point-to-point transmission, the transmission method originally employed on satellites, they face increasing competition from fiber-optic cables, so point-to-multipoint transmission is increasingly becoming the dominant satellite technology. Point-to-multipoint transmission enables the quick implementation of private networks consisting of very small aperture terminals (VSAT). Such networks are independent and make mobile access possible.

In the future, satellites will become stronger, cheaper and their orbits will be lower; their services might become as common as satellite TV is today.

For more information about satellites, see How Satellites Work (http://octopus.gma.org/surfing/satellites) and the Tech Museum's satellite site (http://www.thetech.org/hyper/satellite).

http://www.whatis.com/vsat.htm
http://octopus.gma.org/surfing/satellites
INDEXCARD, 13/36
 
CIA

CIA's mission is to support the President, the National Security Council, and all officials who make and execute U.S. national security policy by: providing accurate, comprehensive, and timely foreign intelligence on national security topics; conducting counterintelligence activities, special activities, and other functions related to foreign intelligence and national security, as directed by the President. To accomplish its mission, the CIA engages in research, development, and deployment of high-leverage technology for intelligence purposes. As a separate agency, CIA serves as an independent source of analysis on topics of concern and works closely with the other organizations in the Intelligence Community to ensure that the intelligence consumer - whether Washington policymaker or battlefield commander - receives adequate intelligence information.

http://www.cia.gov

INDEXCARD, 14/36
 
Critical Art Ensemble

Critical Art Ensemble is a collective of five artists of various specializations dedicated to exploring the intersections between art, technology, radical politics, and critical theory. CAE have published a number of books and carried out innovative art projects containing insightful and ironic theoretical contributions to media art. Projects include Addictionmania, Useless Technology, The Therapeutic State, Diseases of Consciousness, Machineworld, As Above So Below, and Flesh Machine.

http://www.critical-art.net

INDEXCARD, 15/36
 
The Spot

http://www.thespot.com/

http://www.thespot.com/
INDEXCARD, 16/36
 
Gateway

A gateway is a computer supplying point-to-multipoint connections between computer networks.

INDEXCARD, 17/36
 
Industrial design

Industrial design refers to the ornamental aspect of a useful article, which may consist of two- or three-dimensional elements. To qualify for intellectual property protection the design must be novel or original. Protection can be obtained through registration in a government office and is usually granted for 10 to 15 years.

INDEXCARD, 18/36
 
Convergence, 2000-

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new communication systems. This is what is meant by today's buzzwords "multimedia" and "convergence".

Classical dichotomies such as the one between computing and telephony, and traditional categorisations in general, no longer apply, because these new services no longer fit the traditional categories.

INDEXCARD, 19/36
 
VISA

Visa International's more than 21,000 member financial institutions have made VISA one of the world's leading full-service payment networks. Visa's products and services include the Visa Classic card, Visa Gold card, Visa debit cards, Visa commercial cards and the Visa Global ATM Network. VISA operates in 300 countries and territories and also provides a large consumer payments processing system.

INDEXCARD, 20/36
 
Fiber-optic cable networks

Fiber-optic cable networks may become the dominant method for high-speed Internet connections. Since the first fiber-optic cable was laid across the Atlantic in 1988, the demand for faster Internet connections has been growing, fuelled by rising network traffic, partly due to the increasing implementation of corporate networks spanning the globe and to the use of graphics-heavy content on the World Wide Web.

Fiber-optic cables have not much more in common with copper wires than the capacity to transmit information. Like copper wires, they can form terrestrial and submarine connections, but they allow much higher transmission rates. A copper wire carries 32 telephone calls at the same time, whereas a fiber-optic cable can carry 40,000 - a capacity Alexander Graham Bell could hardly have envisioned when he transmitted the first words - "Mr. Watson, come here. I want you" - over a copper wire.

Copper wires will not go out of use in the foreseeable future, thanks to technologies such as DSL that speed up access drastically. But with the technology to transmit signals at more than one wavelength on fiber-optic cables, their bandwidth is increasing, too.

For technical information from the Encyclopaedia Britannica on telecommunication cables, click here. For technical information from the Encyclopaedia Britannica focusing on fiber-optic cables, click here.

Neal Stephenson has written an entertaining report for Wired on the laying of the FLAG submarine cable, up to now the longest fiber-optic cable on earth, including detailed background information on the cable industry and its history: Mother Earth Mother Board. Click here to read it.

Susan Dumett has written a short history of undersea cables for Pretext magazine, Evolution of a Wired World. Click here to read it.

A timeline history of submarine cables and a detailed list of seemingly all submarine cables of the world, operational, planned and out of service, can be found on the Web site of the International Cable Protection Committee.

For maps of fiber-optic cable networks see the website of Kessler Marketing Intelligence, Inc.

http://www.britannica.com/bcom/eb/article/4/0...
http://www.britannica.com/bcom/eb/article/4/0...
http://www.wired.com/wired/archive/4.12/ffgla...
http://www.pretext.com/mar98/features/story3....
INDEXCARD, 21/36
 
1996 WIPO Copyright Treaty (WCT)

The 1996 WIPO Copyright Treaty, which focuses on protecting copyright "in the digital age", among other provisions 1) makes clear that computer programs are protected as literary works, 2) requires the contracting parties to protect databases that constitute intellectual creations, 3) affords authors the new right of making their works "available to the public", 4) gives authors the exclusive right to authorize "any communication to the public of their works, by wire or wireless means ... in such a way that members of the public may access these works from a place and at a time individually chosen by them", and 5) requires the contracting states to protect anti-copying technology and copyright management information embedded in any work covered by the treaty. The WCT is available at: http://www.wipo.int/documents/en/diplconf/distrib/94dc.htm



http://www.wipo.int/documents/en/diplconf/dis...
INDEXCARD, 22/36
 
Terrestrial antennas

Microwave transmission systems based on terrestrial antennas are similar to satellite transmission systems. Providing reliable high-speed access, they are used for cellular phone networks.

The implementation of the Wireless Application Protocol (WAP) makes wireless access to Internet services such as e-mail and even the World Wide Web via cellular phones convenient. Microwave transmission systems are therefore becoming increasingly important.

INDEXCARD, 23/36
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. As a result, a vast collection of historic video, audio, documents, and software is expected. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 24/36
 
Java Applets

Java applets are small programs that can be sent along with a Web page to a user. Java applets can perform interactive animations, immediate calculations, or other simple tasks without having to send a user request back to the server. They are written in Java, a platform-independent computer language, which was invented by Sun Microsystems, Inc.

Source: Whatis.com

INDEXCARD, 25/36
 
Server

A server is a program, not a computer as is sometimes said, dedicated to storing files, managing printers and network traffic, or processing database queries.

Web sites, the nodes of the World Wide Web (WWW), for example, are stored on servers.

INDEXCARD, 26/36
 
Next Generation Internet Program

A research and development program funded by the US government. Its goal is the development of advanced networking technologies and applications requiring advanced networking, with capabilities that are 100 to 1,000 times faster end-to-end than today's Internet.

http://www.ngi.gov

INDEXCARD, 27/36
 
Total copyright industries

The total copyright industries encompass the "core copyright industries" and portions of many other industries that either create, distribute, or depend upon copyrighted works. Examples include retail trade (a portion of which is sales of video, audio, software, and books, for example), the doll and toy industry, and computer manufacturing.


INDEXCARD, 28/36
 
The Flesh Machine

This is the title of a book by the Critical Art Ensemble which puts the development of artificial life into a critical historical and political context, defining the power vectors which act as the driving force behind this development. The book is available in a print version (New York, Autonomedia 1998) and in an online version at http://www.critical-art.net/fles/book/index.html

INDEXCARD, 29/36
 
DMCA

The DMCA (Digital Millennium Copyright Act) was signed into law by U.S. President Clinton in 1998 and implements the two 1996 WIPO treaties (WIPO Performances and Phonograms Treaty and WIPO Copyright Treaty). Among other issues, the DMCA addresses the influence of new technologies on traditional copyright. Of special interest in the context of the digitalization of intellectual property are title no. 2, which refers to the limitation of the liability of online service providers for copyright infringement (when certain conditions are met), title no. 3, which creates an exemption for making a copy of a computer program for maintenance and repair, and title no. 4, which is concerned with the status of libraries and webcasting. The DMCA has been widely criticized for giving copyright holders even more power and for damaging the rights and freedom of consumers, technological innovation, and the free market for information.

INDEXCARD, 30/36
 
Backbone Networks

Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmitting capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.

INDEXCARD, 31/36
 
Internet Society

Founded in 1992, the Internet Society is an umbrella organization of several, mostly self-organized, organizations dedicated to addressing the social, political, and technical issues that arise as a result of the evolution and growth of the Net. Its most important subsidiary organizations are the Internet Architecture Board, the Internet Engineering Steering Group, the Internet Engineering Task Force, the Internet Research Task Force, and the Internet Societal Task Force.

Its members comprise companies, government agencies, foundations, corporations and individuals. The Internet Society is governed by elected trustees.

http://www.isoc.org

http://www.isoc.org/
INDEXCARD, 32/36
 
Medieval universities and copying of books

The first of the great medieval universities was established at Bologna. At the beginning, universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify you for an invitation to a university. Holding a lecture amounted to reading a book aloud, much as a priest read from the Bible during services. Attending a lecture amounted to copying the lecture word for word, so that you ended up with your own copy of the book, which in turn enabled you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

http://quarles.unbc.edu/ideas/net/history/his...
INDEXCARD, 33/36
 
Electronic Messaging (E-Mail)

Electronic messages are transmitted and received by computers through a network. Via e-mail, texts, images, sounds and videos can be sent to single users or simultaneously to a group of users. Texts can now be sent and read without ever being printed.

E-Mail is one of the most popular and important services on the Internet.

INDEXCARD, 34/36
 
IIPA

The International Intellectual Property Alliance, formed in 1984, is a private sector coalition and represents the U.S. copyright-based industries. It is comprised of seven trade associations: Association of American Publishers, AFMA, Business Software Alliance, Interactive Digital Software Association, Motion Picture Association of America, National Music Publishers' Association and Recording Industry Association of America. IIPA and its members track copyright legislative and enforcement developments in over 80 countries and aim at a legal and enforcement regime for copyright that deters piracy. On the national level IIPA cooperates with the U.S. Trade Representative, and on the multilateral level it has been involved in the development of the TRIPS (Trade-Related Aspects of Intellectual Property Rights) agreement of the WTO (World Trade Organization) and also participates in the copyright discussions of the WIPO (World Intellectual Property Organization).

INDEXCARD, 35/36
 
Machine language

Initially computer programmers had to write instructions in machine language. This coded language, which can be understood and executed directly by the computer without conversion or translation, consists of binary digits representing operation codes and memory addresses. Because it is made up of strings of 1s and 0s, machine language is difficult for humans to use.
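As a purely illustrative example (the instruction format below is invented, not a real machine language), the following sketch packs an operation code and a memory address into the string of 1s and 0s that the hardware would execute directly:

# Hypothetical 24-bit instruction format: 8-bit operation code + 16-bit address.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03}   # made-up operation codes

def encode(mnemonic, address):
    """Return the binary string for one (invented) machine instruction."""
    word = (OPCODES[mnemonic] << 16) | (address & 0xFFFF)
    return format(word, "024b")

print(encode("LOAD", 0x002A))   # -> 000000010000000000101010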

INDEXCARD, 36/36