Global data bodies - intro

- Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity ...

Critical Art Ensemble

Global data bodies

1. Introduction

Informatisation has meant that things that once were "real", i.e. whose existence could be experienced sensually, are becoming virtual. Instead of the real existence of a thing, the virtual refers to its possibility of existence. As this process advances, an increasing identification of the possible with the real occurs. Reality migrates into a dim and dematerialised grey area. In the end, the possible counts as the real: virtualisation creates an "as-if" experience.

The experience of the body is also affected by this process. For example, in bio-technology, the human body and its functions are digitised, which prepares an understanding of the body exclusively in terms of its potential manipulation: the body becomes whatever it could be. But digitisation has not only affected the understanding and the social significance of the body, it has also altered the meaning of presence, traditionally identified with the body. The advance of information and communication technologies (ICTs) has meant that for an increasing number of activities we no longer need to be physically present; our "virtual" presence, achieved by logging onto an electronic information network, is sufficient.

This development, trumpeted as the pinnacle of convenience by the ICT industries and governments interested in attracting investment, has deeply problematic aspects as well. For example, when it is no longer "necessary" to be physically present, it may soon no longer be possible or allowed. Online banking, offered to customers as a convenience, also serves as a justification for charging higher fees to those unwilling or unable to add banking to their household chores. Online public administration may be expected to lead to similar effects. The reason for this is that the digitalisation of the economy relies on the production of surplus data. Data have become the most important raw material of modern economies.

In modern economies, informatisation and virtualisation mean that people are structurally forced to carry out their business and live their lives in such a way as to generate data.

Data are the most important resource for the New Economy. By contrast, activities which do not leave behind a trace of data, as for example growing your own carrots or paying cash rather than by plastic card, are discouraged and structurally suppressed.

TEXTBLOCK 1/23 // URL: http://world-information.org/wio/infostructure/100437611761/100438659649
 
Basics: Protected Persons

Generally copyright vests in the author of the work. Certain national laws provide for exceptions and, for example, regard the employer as the original owner of a copyright if the author was, when the work was created, an employee and employed for the purpose of creating that work. In the case of some types of creations, particularly audiovisual works, several national laws provide for different solutions to the question of who should be the first holder of copyright in such works.

Many countries allow copyright to be assigned, which means that the owner of the copyright transfers it to another person or entity, which then becomes its holder. Where national law does not permit assignment, it usually provides for the possibility of licensing the work to someone else. The owner of the copyright then remains the holder, but authorizes another person or entity to exercise all or some of his rights, subject to possible limitations. Yet in any case the "moral rights" always belong to the author of the work, whoever may be the owner of the copyright (and therefore of the "economic rights").


TEXTBLOCK 2/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659527
 
The Copyright Industry

Copyright is not only about protecting the rights of creators, but has also become a major branch of industry with significant contributions to the global economy. According to the International Intellectual Property Alliance, the U.S. copyright industry has grown almost three times as fast as the economy as a whole for the past 20 years. In 1997, the total copyright industries contributed an estimated US$ 529.3 billion to the U.S. economy, with the core copyright industries accounting for US$ 348.4 billion. Between 1977 and 1997, the absolute growth rate of value added to the U.S. GDP by the core copyright industries was 241%. The copyright industry's foreign sales in 1997 (US$ 66.85 billion for the core copyright industries) were also larger than the U.S. Commerce Department International Trade Administration's estimates of the exports of almost all other leading industry sectors. They even exceeded the combined automobile and automobile parts industries, as well as the agricultural sector.

In an age where knowledge and information become ever more important, and with the advancement of new technologies, transmission systems and distribution channels, a further increase in the production of intellectual property is expected. Therefore, as copyright establishes ownership in intellectual property, it is increasingly seen as a key to wealth in the future.

TEXTBLOCK 3/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438658710
 
Linking and Framing: Cases

Mormon Church v. Sandra and Jerald Tanner

In a ruling of December 1999, a federal judge in Utah temporarily barred two critics of the Mormon Church from posting on their website the Internet addresses of other sites featuring pirated copies of a Mormon text. The judge said that it was likely that Sandra and Jerald Tanner had engaged in contributory copyright infringement when they posted the addresses of three websites that they knew, or should have known, contained the copies.

Kaplan, Carl S.: Copyright Decision Threatens Freedom to Link. In: New York Times. December 10, 1999.

Universal Studios v. Movie-List

The website Movie-List, which features links to online, externally hosted movie trailers, has been asked to refrain completely from linking to any of Universal Studios' servers containing the trailers, as this would infringe copyright.

Cisneros, Oscar S.: Universal: Don't Link to Us. In: Wired. July 27, 1999.

More cases concerned with the issue of linking, framing and the infringement of intellectual property are published in:

Ross, Alexandra: Copyright Law and the Internet: Selected Statutes and Cases.

TEXTBLOCK 4/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659639
 
2000 A.D.

2000
Convergence of telephony, audiovisual technologies and computing

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new communication systems. This is what is meant by today's buzzwords "multimedia" and "convergence".

Classical dichotomies, such as that between computing and telephony, and traditional categorizations no longer apply, because these new services no longer fit the traditional categories.

Convergence and Regulatory Institutions

Digital technology permits the integration of telecommunications with computing and audiovisual technologies. New services that extend existing communication systems emerge. The convergence of communication and media systems corresponds to a convergence of corporations. Recently, America Online, the world's largest online service provider, merged with Time Warner, the world's largest media corporation. For such corporations the classical approach to regulation - separate institutions regulate separate markets - is no longer appropriate, because the institutions' activities necessarily overlap. The current challenges posed to these institutions are not solely due to the convergence of communication and media systems made possible by digital technologies; they are also due to the liberalization and internationalization of the electronic communications sector. For regulation to be successful, new categorizations and supranational agreements are needed.
For further information on this issue see Natascha Just and Michael Latzer, The European Policy Response to Convergence with Special Consideration of Competition Policy and Market Power Control, http://www.soe.oeaw.ac.at/workpap.htm or http://www.soe.oeaw.ac.at/WP01JustLatzer.doc.

TEXTBLOCK 5/23 // URL: http://world-information.org/wio/infostructure/100437611796/100438659802
 
Intellectual Property: A Definition

Intellectual property, very generally, relates to the output that results from intellectual activity in the industrial, scientific, literary and artistic fields. Traditionally, intellectual property is divided into two branches:

1) Industrial Property

a) Inventions
b) Marks (trademarks and service marks)
c) Industrial designs
d) Unfair competition (trade secrets)
e) Geographical indications (indications of source and appellations of origin)

2) Copyright

The protection of intellectual property is guaranteed through a variety of laws which grant the creators of intellectual goods and services certain time-limited rights to control the use made of their products. Those rights apply to the intellectual creation as such, and not to the physical object in which the work may be embodied.

TEXTBLOCK 6/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659434
 
Challenges for Copyright by ICT: Digital Content Providers

Providers of digital information might be confronted with copyright-related problems when using some of the special features of hypertext media like frames and hyperlinks (both of which use third-party content available on the Internet to enhance a webpage or CD-ROM), or when operating a search engine or online directory on their website.

Framing

Frames are often used to help define, and navigate within, a content provider's website. Still, when they are used to present (copyrighted) third-party material from other sites, issues of passing off and misleading or deceptive conduct, as well as copyright infringement, immediately arise.

Hyperlinking

It is generally held that the mere creation of a hyperlink does not, of itself, infringe copyright, as the words indicating a link or the displayed URL are usually unlikely to be considered a "work". Nevertheless, if a link is clicked on, the user's browser will download a full copy of the material at the linked address, creating a copy in the RAM of the user's computer courtesy of the address supplied by the party that published the link. Although it is widely agreed that permission to download material over the link must be part of an implied license granted by the person who made the material available on the web in the first place, the scope of this implied license is still the subject of debate. Another option that has been discussed is to consider linking fair use.

Furthermore, hyperlinks and other "information location tools", like online directories or search engines, could cause their operators trouble if they refer or link users to a site that contains infringing material. In this case it is as yet unclear whether providers can be held liable for infringement.

TEXTBLOCK 7/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659590
 
Product Placement

With television still being very popular, commercial entertainment has transferred the concept of the soap opera onto the Web. The first of this new species of "cybersoaps" was "The Spot", a story about the ups and downs of an American commune. The Spot not only attracted a large audience within a short time, but also pioneered online product placement. Besides Sony banners, the company's logo is placed on nearly every electronic product appearing in the story. While appearing to be a site for light entertainment, The Spot's main goal is to make the name Sony and its product range well known within the target audience.

TEXTBLOCK 8/23 // URL: http://world-information.org/wio/infostructure/100437611652/100438658026
 
Other biometric technologies

Other biometric technologies not specified here include ear recognition, signature dynamics, key stroke dynamics, vein pattern recognition, retinal scan, body odour recognition, and DNA recognition. These are technologies which are either in early stages of development or used in highly specialised and limited contexts.

TEXTBLOCK 9/23 // URL: http://world-information.org/wio/infostructure/100437611729/100438658399
 
Definition

During the last 20 years the old Immanuel Wallerstein paradigm of center, periphery and semi-periphery has found a new costume: ICTs. After colonialism, neo-colonialism and neoliberalism, a new method of marginalization is emerging: the digital divide.

"Digital divide" describes the fact that the world can be divided into people who
do and people who do not have access to (or the education to handle with) modern information technologies, e.g. cellular telephone, television, Internet. This digital divide is concerning people all over the world, but as usually most of all people in the formerly so called third world countries and in rural areas suffer; the poor and less-educated suffer from that divide.
More than 80% of all computers with access to the Internet are situated in larger cities.

"The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these `extra' services ....Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century of millennium."
(Izumi Aizi)

for more information see:
http://www.whatis.com/digital_divide.htm

TEXTBLOCK 10/23 // URL: http://world-information.org/wio/infostructure/100437611730/100438659300
 
Transparent customers. Direct marketing online



This process works even better on the Internet because of the latter's interactive nature. "The Internet is a dream to direct marketers", said Wil Lansing, CEO of the American retailer Fingerhut Companies. Many services require online registration, asking users to provide as much information about themselves as possible. In addition, the Internet is fast, cheap, and used by people who tend to be young and on the lookout for something interesting.

Many websites are also equipped with user tracking technology that registers a user's behaviour and preferences during a visit. For example, user tracking technology is capable of identifying the equipment and software employed by a user, as well as movements on the website, the links visited, etc. Normally such information is anonymous, but it can be personalised when it is coupled with online registration, or when personal identification has been obtained from other sources. Registration is often a prerequisite not just for obtaining a free web mail account, but also for other services, such as personalised start pages. Based on the information provided by the user, the start page will then include advertisements and commercial offers that correspond to the user's profile, or to the user's activity on the website.
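
To make this concrete, here is a minimal sketch (in Python, with an invented sample entry) of how much of this tracking information an ordinary web server log in the widespread "combined" format already contains:

    import re

    # One NCSA "combined" log entry records the visitor's address, the page
    # requested, the page they came from (referer) and their browser and
    # operating system (user-agent) -- enough to follow equipment, software
    # and movements through a site.
    LOG_PATTERN = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
        r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"')

    def parse_entry(line):
        """Return the tracking-relevant fields of one log line, or None."""
        match = LOG_PATTERN.match(line)
        return match.groupdict() if match else None

    sample = ('192.0.2.17 - - [10/Oct/1999:13:55:36 +0200] '
              '"GET /products/index.html HTTP/1.0" 200 2326 '
              '"http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I)"')

    entry = parse_entry(sample)
    print(entry["ip"], entry["path"], entry["referer"], entry["agent"])

Coupled with a registration database keyed to the same address or to a cookie, such anonymous entries become a personalised profile.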

One frequent way of obtaining such personal information about a user is the free web mail accounts offered by a great many companies, internet providers and web portals (e.g. Microsoft, Yahoo, Netscape and many others). In most cases, users get "free" accounts in return for submitting personal information and agreeing to receive marketing mails. Free web mail accounts are a simple and effective direct marketing and data capturing strategy which is, however, rarely understood as such. The alliances formed between direct advertising and marketing agencies on the one hand, and web mail providers on the other, such as the one between DoubleClick and Yahoo, show the common logic of data capturing and direct marketing. The alliance between DoubleClick and Yahoo eventually attracted the largest US direct marketing agency, Abacus Direct, which ended up merging with DoubleClick.

However, the project of collecting users' personal data and creating consumer profiles based on online behaviour can also take on more creative and playful forms. One such example is sixdegrees.com. This is a networking site based on the assumption that everybody on the planet is connected to everybody else by a chain of at most six people. The site offers users the chance to get to know a lot of new people, the friends of their friends of their friends, for example, and, if they try hard enough, eventually Warren Beatty or Claudia Schiffer. But of course, in order to make the whole game more useful for marketing purposes, users are encouraged to join groups which share common interests, groups which are identical with marketing categories ranging from arts and entertainment to travel and holiday. Evidently, the game becomes more interesting the more new people a user brings into the network. What seems to be fun for the 18 to 24 year old college student segment targeted by sixdegrees is, of course, real business. While users entertain themselves, they are being carefully profiled. After all, data on young people who can be expected to be relatively affluent one day are worth more than money.

The particular way in which sites such as sixdegrees.com are structured means that users not only provide initial information about themselves, but that this information is constantly updated and therefore becomes even more valuable. Consequently, many free online services or web mail providers cancel a user's account if it has not been used for some time.

There are also other online services which offer free services in return for personal information that is then used for marketing purposes, e.g. Yahoo's GeoCities, where users may maintain their own free websites, or Bigfoot, where people are offered a free e-mail address for life that acts as a relay whenever a customer's residence or e-mail address changes. In this way, of course, the marketers can identify friendship and other social networks, and turn this knowledge into a marketing advantage. People finders such as WhoWhere? operate along similar lines.

A further way of collecting consumer data that has recently become popular is offering free PCs. Users are provided with a PC for free or for very little money, and in return commit themselves to using certain services rather than others (e.g. a particular internet provider), to providing information about themselves, and to having their online behaviour monitored by the company providing the PC, so that accurate user profiles can be compiled. For example, the Free PC Network offers advertisers user profiles containing "over 60 individual demographics". There are literally thousands of variations on how a user's data are extracted and commercialised online. Usually this happens quietly in the background.

A good inside view of the world of direct marketing can be gained at the website of the American Direct Marketing Association and the Federation of European Direct Marketing.

TEXTBLOCK 11/23 // URL: http://world-information.org/wio/infostructure/100437611761/100438659667
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there are server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operation has long depended on global networking.

At the top are credit card companies such as Visa, American Express, MasterCard, and others. It would be erroneous to believe that the primary business of these companies is the provision of credit, and the facilitation of credit information for sales transactions. In fact, information means much more than just credit information. In an advertisement of 1982, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the life blood of society, as Bill Gates expresses it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within a data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimisation - of almost every decision taken. Data bunkers therefore have an essentially conservative political function, and serve to maintain and strengthen the given social structures.

TEXTBLOCK 12/23 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
Problems of Copyright Management and Control Technologies

Profiling and Data Mining

At their most basic, copyright management and control technologies might simply be used to provide pricing information, negotiate the purchase transaction, and release a copy of a work for downloading to the customer's computer. Still, from a technological point of view, such systems also have the capacity to be employed for digital monitoring. Copyright owners could, for example, use the transaction records generated by their copyright management systems to learn more about their customers. Profiles of the purchasers of copyrighted material might be created, in their crudest form consisting of basic demographic information. Moreover, copyright owners could use search agents or complex data mining techniques to gather further information about their customers that could either be used to market other works or be sold to third parties.
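
A toy sketch of such profiling, assuming hypothetical transaction records of the kind a copyright management system might log (all names and fields are invented):

    from collections import defaultdict

    # Hypothetical transaction records: (customer id, work, genre, price).
    transactions = [
        ("c1001", "Song A", "pop", 0.99),
        ("c1001", "Song B", "pop", 0.99),
        ("c1001", "Essay X", "politics", 2.50),
        ("c1002", "Song C", "classical", 0.99),
    ]

    def build_profiles(records):
        """Aggregate raw purchase records into crude customer profiles."""
        profiles = defaultdict(lambda: {"purchases": 0, "spent": 0.0,
                                        "genres": defaultdict(int)})
        for customer, work, genre, price in records:
            profile = profiles[customer]
            profile["purchases"] += 1
            profile["spent"] += price
            profile["genres"][genre] += 1
        return profiles

    for customer, profile in build_profiles(transactions).items():
        favourite = max(profile["genres"], key=profile["genres"].get)
        print(customer, profile["purchases"], round(profile["spent"], 2), favourite)

Even this crude aggregation already yields a purchase count, total spending and a "favourite genre" per customer, i.e. exactly the kind of profile that could be used to market other works or be sold on.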

Fair Use

Through the widespread use of copyright management and control systems, the balance of control could be shifted excessively in favor of the owners of intellectual property. The practice of fair use, currently supported by copyright law, might potentially be restricted or even eliminated. While information in analogue form can easily be reproduced, the protection of digital works through copyright management systems might complicate or render impossible the copying of material for purposes which are explicitly exempt under the doctrine of fair use.

Provisions concerning technological protection measures and fair use are stated in the DMCA, which provides that "Since copying of a work may be a fair use under appropriate circumstances, section 1201 does not prohibit the act of circumventing a technological measure that prevents copying. By contrast, since the fair use doctrine is not a defense to the act of gaining unauthorized access to a work, the act of circumventing a technological measure in order to gain access is prohibited." The proposed EU Directive on copyright and related rights in the information society also contains similar clauses. It distinguishes between the circumvention of technical protection systems for lawful purposes (fair use) and circumvention to infringe copyright. Yet besides a continuing lack of legal clarity, very practical problems arise. Even if the circumvention of technological protection measures under fair use is allowed, how will an average user without specialized technological know-how be able to gain access to or make a copy of a work? Will the producers of copyright management and control systems provide fair use versions that permit the reproduction of copyrighted material? Or will users only be able to access and copy works if they hold a digital "fair use license" ("fair use licenses" have been proposed by Mark Stefik, whereby holders of such licenses could exercise some limited "permissions" to use a digital work without a fee)?
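
How such a "fair use license" might work in practice can only be guessed at; the following toy sketch (all names, fields and quota logic invented, not Stefik's actual design) merely illustrates the idea of limited fee-free "permissions":

    from dataclasses import dataclass

    @dataclass
    class FairUseLicense:
        holder: str
        copies_allowed: int   # the limited fee-free "permissions"
        copies_made: int = 0

    def request_copy(lic, work):
        """Release a fee-free copy only while the license's quota lasts."""
        if lic.copies_made < lic.copies_allowed:
            lic.copies_made += 1
            return f"copy of {work} released to {lic.holder}"
        return "quota exhausted: further copies require a paid transaction"

    lic = FairUseLicense(holder="library-42", copies_allowed=2)
    for _ in range(3):
        print(request_copy(lic, "Essay X"))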

TEXTBLOCK 13/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659629
 
Databody convergence

In the phrase "the rise of the citizen as a consumer", to be found on the EDS website, the cardinal political problem posed by the databody industry is summarised: the convergence of commercial and political interest in the data body business, the convergence of bureaucratic and commercial data bodies, the erosion of privacy, and the consequent undermining of democratic politics by private business interest.

When the citizen becomes a consumer, the state must become a business. In the data body business, the key word behind this new identity of government is "outsourcing". Functions that are not considered core functions of government activity are put into the hands of private contractors.

There have long been instances where privately owned data companies, e.g. credit card companies, are allowed access to public records, e.g. public registries or electoral rolls. In a normal credit card transaction, for example, credit card companies have had access to public records in order to verify the identity of a customer. In the UK, citizens' personal data stored on the Electoral Roll have been used for commercial purposes for a long time. The new British Data Protection Act now allows people to "opt out" of this kind of commercialisation - a piece of legislation that has prompted protests on the part of the data industry: Experian has claimed to lose 500 million pounds as a consequence of this restriction - a figure that, even if exaggerated, may help one understand what the value of personal data actually is.

While this may serve as an example of increased public awareness of privacy issues, the trend towards outsourcing seems to lead to a complete breakdown of the barriers between commercial and public use of personal data - a trend that can be summarised as the "outsourcing" of government functions.

Governments increasingly outsource work that is not considered a core function of government, e.g. cooking meals in hospitals or mowing lawns in public parks. Such peripheral activities marked a first step of outsourcing. In a further step, governmental functions were divided between executive and judgemental functions, and executive functions were increasingly entrusted to private agencies. For these agencies to be able to carry out the work assigned to them, they need data - data that were once stored in public places, and whose handling was therefore subject to democratic accountability. Outsourcing has produced gains in efficiency and a decrease in accountability. Outsourced data are less secure, and what use they are put to is difficult to control.

The world's largest data corporation, EDS, is also among the foremost outsourcing companies. In an article about EDS' involvement in government outsourcing in Britain, Simon Davies shows how the general trend towards outsourcing, combined with advances in computer technology, allows companies like EDS, outside of any public accountability, to create something like blueprints for the societies of the 21st century. But the problem of accountability is not the only one to be considered in this context. As Davies argues, the data business is taking on its own momentum: "a ruthless company could easily hold a government to ransom". As the links between government agencies and citizens thin out, however, the links among the various agencies might increase. Linking the various government information systems would amount to a further increase in efficiency, and a further undermining of democracy. The latter, after all, relies upon the separation of powers - linking government information systems would therefore pave the way to a kind of electronic totalitarianism that has little to do with the ideological bent of George Orwell's 1984 vision, but operates on purely technocratic principles.

Technically, the linking of different systems is already possible. It would also create more efficiency, which means generate more income. The question, then, is whether concerns about democracy will prevent it from happening.

But what the EDS example shows is something that applies everywhere: the data industry is, whether by intention or by default, a project with profound political implications. The current that drives the global economy deeper and deeper into becoming a global data body economy may be too strong to be stopped by conventional means.

However, the convergence of political and economic data bodies also has technological roots. The problem is that politically motivated surveillance and economically motivated data collection are located in the same area of information and communication technologies. For example, monitoring internet use requires more or less the same technical equipment whether done for political or economic purposes. Data mining and data warehousing techniques are almost the same. Creating transparency of citizens and customers is therefore a common objective of intelligence services and the data body industry. Given that data are exchanged in electronic networks, compatibility among the various systems is essential. This is another factor that encourages "leaks" between state-run intelligence networks and the private data body business. And finally, given the secretive nature of state intelligence and commercial data capturing, there is little transparency. Both structures occupy an opaque zone.

TEXTBLOCK 14/23 // URL: http://world-information.org/wio/infostructure/100437611761/100438659769
 
Private data bunkers

On the other hand are the data bunkers of the private sector, whose position is different. Although these are fast-growing engines of data collection with a much greater degree of dynamism, they may not enjoy the same privileged position - although one has to differentiate among the general historical and social conditions into which a data bunker is embedded. For example, it can safely be assumed that the databases of a large credit card company or bank are better protected than the bureaucracies of small developing countries.

Private data bunkers include

    Banks

    Building societies

    Credit bureaus

    Credit card companies

    Direct marketing companies

    Insurance companies

    Telecom service providers

    Mail order stores

    Online stores


TEXTBLOCK 15/23 // URL: http://world-information.org/wio/infostructure/100437611761/100438659735
 
Feeding the data body

TEXTBLOCK 16/23 // URL: http://world-information.org/wio/infostructure/100437611761/100438659644
 
Legal Protection: WIPO (World Intellectual Property Organization)

Arguably the major player in the field of international intellectual property protection, and the administrator of various multilateral treaties dealing with the legal and administrative aspects of intellectual property, is the WIPO.

Information on WIPO administered agreements in the field of industrial property (Paris Convention for the Protection of Industrial Property (1883), Madrid Agreement Concerning the International Registration of Marks (1891) etc.) can be found on: http://www.wipo.org/eng/general/index3.htm

Information on treaties concerning copyright and neighboring rights (Berne Convention for the Protection of Literary and Artistic Works (1886) etc.) is published on: http://www.wipo.org/eng/general/index5.htm

The most recent multilateral agreement on copyright is the 1996 WIPO Copyright Treaty. Among other things it provides that computer programs are protected as literary works and also introduces the protection of databases, which "... by reason of the selection or arrangement of their content constitute intellectual creations." Furthermore the 1996 WIPO Copyright Treaty contains provisions concerning technological measures, rights management information and establishes a new "right of communication to the public". It is available on: http://www.wipo.org/eng/diplconf/distrib/treaty01.htm

TEXTBLOCK 17/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659588
 
Biometrics applications: privacy issues

All biometric technologies capture biometric data from individuals. Once these data have been captured by a system, they can, in principle, be forwarded to other locations and put to many different uses which are capable of compromising an individual's privacy.

Technically it is easy to match biometric data with other personal data stored in government or corporate files, and to come a step closer to the counter-utopia of the transparent citizen and customer whose data body is under outside control.

While biometric technologies are often portrayed as protectors of personal data and safeguards against identity theft, they can thus contribute to an advance in "Big Brother" technology.

The combination of personalised data files with biometric data would amount to an enormous control potential. While nobody in government and industry would admit to such intentions, leading data systems companies such as EDS (Electronic Data Systems; http://www.eds.com) are also suppliers of biometric systems to the intelligence agencies of government and industry.

Biometric technologies have the function of identification. Historically, identification has been a prerequisite for the exercise of power, and it serves as a protection only to those who are in no conflict with this power. If the digitalisation of the body by biometric technologies becomes as widespread as its proponents hope, a new electronic feudal system could be emerging, in which people are reduced to subjects dispossessed of the rights to their bodies, even if these bodies, unlike in the previous system, are data bodies. Unlike the gatekeepers of medieval towns, biometric technologies wear no uniforms by which they might be identified; they are pure masks.

TEXTBLOCK 18/23 // URL: http://world-information.org/wio/infostructure/100437611729/100438658826
 
The Piracy "Industry"

Until recent years, the problem of piracy (the unauthorized reproduction or distribution of copyrighted works, often for commercial purposes) was largely confined to the copying and physical distribution of tapes, disks and CDs. Yet the emergence and increased use of global data networks and the WWW have added a new dimension to the piracy of intellectual property by permitting still easier copying, and electronic sales and transmission of illegally reproduced copyrighted works on a grand scale.

This new development, often referred to as Internet piracy, broadly relates to the use of global data networks to 1) transmit and download digitized copies of pirated works, 2) advertise and market pirated intellectual property that is delivered on physical media through the mails or other traditional means, and 3) offer and transmit codes or other technologies which can be used to circumvent copy-protection security measures.

Recently the International Intellectual Property Alliance published a new report on the estimated trade losses due to piracy. (The IIPA assumes that its report actually underestimates the loss of income due to the unlawful copying and distribution of copyrighted works. Yet it should be taken into consideration that the IIPA is the representative of the U.S. core copyright industries (business software, films, videos, music, sound recordings, books and journals, and interactive entertainment software).)

Table: IIPA 1998 - 1999 Estimated Trade Loss due to Copyright Piracy (in millions of US$)

                                              1999        1998
    Motion Pictures                           1323        1421
    Records & Music                           1684        1613
    Business Applications                     3211        3437
    Entertainment Software                    3020        2952
    Books                                      673         619

    Total Losses (core copyright industries)  9910.0      10041.5




TEXTBLOCK 19/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659531
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking in size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with the advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information by using computers. Decisive for the success of computers in business were the stored program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, that gave them the flexibility to be cost-effective and productive. The invention of second generation computers also marked the beginning of an entire branch of industry, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 20/23 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Legal Protection: National Legislation

Intellectual property - comprising industrial property and copyright - in general is protected by national legislation. Therefore those rights are limited territorially and can be exercised only within the jurisdiction of the country or countries under whose laws they are granted.

TEXTBLOCK 21/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659540
 
Challenges for Copyright by ICT: Internet Service Providers

ISPs (Internet Service Providers) (and to a certain extent also telecom operators) are involved in the copyright debate primarily because of their role in the transmission and storage of digital information. Problems arise particularly concerning caching, information residing on systems or networks of ISPs at the direction of users, and transitory communication.

Caching

Caching, it is argued, could cause damage because the copies in the cache are not necessarily the most current ones, and the delivery of outdated information to users could deprive website operators of accurate "hit" information (information about the number of requests for a particular material on a website), from which advertising revenue is frequently calculated. Similarly, harms such as defamation or infringement that existed on the original page may propagate for years until flushed from each cache where they have been replicated.

Although they are different concepts, issues similar to caching arise with mirroring (establishing an identical copy of a website on a different server), archiving (providing a historical repository for information, as with newsgroups and mailing lists), and full-text indexing (the copying of a document for loading into a full-text or nearly full-text database which is searchable for keywords or concepts).

Under a literal reading of some copyright laws, caching constitutes an infringement of copyright. Yet recent legislation like the DMCA or the proposed EU Directive on copyright and related rights in the information society (amended version) has provided exceptions for ISPs concerning particular acts of reproduction that are considered technical copies (caching). Nevertheless, the exemption of liability for ISPs only applies if they meet a variety of specific conditions. In the course of the debate about caching, suggestions have also been made to subject it to an implied license or fair use defense, or to make it (at least theoretically) actionable.

Information Residing on Systems or Networks at the Direction of Users

ISPs may be confronted with problems if infringing material on websites (of users) is hosted on their systems. Although some copyright laws like the DMCA provide for limitations on the liability of ISPs if certain conditions are met, it is as yet unclear whether ISPs should generally be accountable for the storage of infringing material (even if they do not have actual knowledge of it) or whether exceptions should be established under specific circumstances.

Transitory Communication

In the course of transmitting digital information from one point on a network to another, ISPs act as a data conduit. If a user requests information, ISPs engage in its transmission, the provision of a connection, or routing. In the case of a person sending infringing material over a network with the ISP merely providing facilities for the transmission, it is widely held that the ISP should not be liable for infringement. Yet some copyright laws like the DMCA provide for a limitation of liability (which also covers the intermediate and transient copies that are made automatically in the operation of a network) only if the ISP's activities meet certain conditions.

For more information on copyright (intellectual property) related problems of ISPs (BBS (Bulletin Board System) operators, systems operators and other service providers) see:

Harrington, Mark E.: On-line Copyright Infringement Liability for Internet Service Providers: Context, Cases & Recently Enacted Legislation. In: Intellectual Property and Technology Forum. June 4, 1999.

Teran, G.: Who is Vulnerable to Suit? ISP Liability for Copyright Infringement. November 2, 1999.

TEXTBLOCK 22/23 // URL: http://world-information.org/wio/infostructure/100437611725/100438659550
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANET, the forerunner of the Internet. The very first ARPANET site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANET (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stating that measurement of the Internet was important for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing numbers of customers. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the intranets, because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."
Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, while the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U.S. (with an effective bandwidth of about 330 Gbps in December 1997), though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications, because these consume more bandwidth; both are the reason for unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations see: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Despite the small sample, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
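
The sampling arithmetic itself is simple; here is a sketch of the survey logic as described above (with a simulated PING, since a real one needs network access and, on many systems, special privileges):

    import random

    def estimate_host_count(assigned_addresses, is_reachable, sample_share=0.01):
        """Ping a small random sample of assigned addresses and project the
        observed response rate onto the whole list, as the ISC survey is
        described to do."""
        sample_size = max(1, int(len(assigned_addresses) * sample_share))
        sample = random.sample(assigned_addresses, sample_size)
        responding = sum(1 for address in sample if is_reachable(address))
        return int(responding / sample_size * len(assigned_addresses))

    # Stand-in for the PING utility: here 60% of all addresses "answer".
    fake_ping = lambda address: random.random() < 0.6
    addresses = ["10.0.%d.%d" % (i // 256, i % 256) for i in range(50000)]
    print(estimate_host_count(addresses, fake_ping))  # roughly 30000

Note that the projection inherits both the sampling error and the systematic biases mentioned above: existing but unreachable hosts, and networks missing from the IN-ADDR.ARPA tables, simply drop out of the estimate.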

Internet Weather

Like daily weather, traffic on the Internet, the conditions for data flows, is monitored too, hence called Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e.g.) and to compare response times with past ones and with response times of servers in the same reach.
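
A rating of this kind can be produced with very little machinery; the following is a guessed-at sketch (the actual formulas of The Matrix or the Internet Traffic Report are not disclosed in detail):

    def traffic_rating(current_ms, history_ms):
        """Map a server's current round-trip time onto a 0-100 scale by
        comparing it with that server's own recent history
        (100 = fast and reliable, 0 = no response)."""
        if current_ms is None:        # server did not answer the ping
            return 0
        baseline = sum(history_ms) / len(history_ms)
        # at or below the historical average -> top rating; the score
        # degrades as the response time grows beyond the average
        return round(100 * min(1.0, baseline / current_ms))

    print(traffic_rating(80, [90, 100, 110]))   # 100: faster than usual
    print(traffic_rating(400, [90, 100, 110]))  # 25: congestion likely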

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited websites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a website, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your website) you generate.
In the meantime, page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers and the corresponding IP addresses and host names with others, or they might access not the site itself, but a cached copy from the web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
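
The difference between the two measures is easy to demonstrate; a minimal sketch (file names invented, with the usual crude heuristic of counting only HTML documents as pages):

    # Every file fetched during one visit to one document:
    requests = ["/article.html", "/logo.gif", "/photo1.jpg",
                "/photo2.jpg", "/banner.gif", "/style.css"]

    hits = len(requests)
    page_views = sum(1 for path in requests if path.endswith((".html", ".htm")))
    print(hits, page_views)   # 6 hits, but only 1 page view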

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may read a journal just for a special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but only slightly better at best, is counting visits: the access of several pages of a website during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
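
Technically, a "visit" is reconstructed from single page requests with a timeout heuristic; here is a sketch of this common sessionization rule (the 30-minute timeout is a convention, and, as noted, shared addresses and caches distort the result):

    def count_visits(requests, timeout=1800):
        """Group timestamped page requests into visits: consecutive requests
        from the same address with gaps of no more than `timeout` seconds
        count as one visit."""
        last_seen = {}
        visits = 0
        for address, timestamp in sorted(requests, key=lambda r: r[1]):
            if address not in last_seen or timestamp - last_seen[address] > timeout:
                visits += 1
            last_seen[address] = timestamp
        return visits

    log = [("192.0.2.1", 0), ("192.0.2.1", 60),
           ("192.0.2.9", 120), ("192.0.2.1", 4000)]
    print(count_visits(log))  # 3 visits reconstructed from 4 page views
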
But there is a different reason for these registration services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system: in a certain sense, it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your website. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.


If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 23/23 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Moral rights

Besides economic rights, authors of copyrighted works enjoy moral rights, on the basis of which they have the right to claim their authorship and require that their names be indicated on the copies of the work and in connection with other uses thereof. Moral rights are generally inalienable and remain with the creator even after he has transferred his economic rights, although the author may waive their exercise.

INDEXCARD, 1/30
 
Copyright management information

Copyright management information refers to information which identifies a work, the author of a work, the owner of any right in a work, or information about the terms and conditions of the use of a work, and any numbers or codes that represent such information, when any of these items of information are attached to a copy of a work or appear in connection with the communication of a work to the public.

INDEXCARD, 2/30
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. As a result, a vast collection of historic video, audio, documents, and software is expected. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 3/30
 
Invention

According to the WIPO an invention is a "... novel idea which permits in practice the solution of a specific problem in the field of technology." Concerning its protection by law, the idea "... must be new in the sense that it has not already been published or publicly used; it must be non-obvious in the sense that it would not have occurred to any specialist in the particular industrial field, had such a specialist been asked to find a solution to the particular problem; and it must be capable of industrial application in the sense that it can be industrially manufactured or used." Protection can be obtained through a patent (granted by a government office) and typically is limited to 20 years.

INDEXCARD, 4/30
 
Bill Clinton

William J. Clinton (* 1946) studied law at Yale University, then taught at the University of Arkansas. He was elected Arkansas attorney general in 1976 and later served as the state's governor until 1992. That year he was elected U.S. President, the first Democrat after a row of Republicans. His sexual affairs nearly cost him his career, and he also needed to distract attention from them: he considered fighting another war against Saddam Hussein in February 1999. Shortly afterwards he had a more interesting enemy, Slobodan Milosevic - and NATO was most willing to fight with him.

For more information see: http://www.whitehouse.gov/WH/glimpse/presidents/html/bc42.html

http://www.whitehouse.gov/WH/glimpse/presiden...
INDEXCARD, 5/30
 
Wide Area Network (WAN)

A Wide Area Network is a wide-area proprietary network or a network of local area networks. Usually it consists of computers, but it may include cellular phones, too.

INDEXCARD, 6/30
 
Microsoft Corporation

Founded by Bill Gates and Paul Allen and headquartered in Redmond, USA, Microsoft Corporation is today's world-leading developer of personal-computer software systems and applications. Like MS-DOS, the first operating system released by Microsoft, its successor Windows has become the de facto standard operating system for personal computers. According to critics, and following a recent court ruling, this is due to unfair competition.

http://www.microsoft.com

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/4/0,5716,1524+1+1522,00.html

http://www.microsoft.com/
http://www.britannica.com/bcom/eb/article/4/0...
INDEXCARD, 7/30
 
Internet Architecture Board

On behalf of the Internet Society, the Internet Architecture Board oversees the evolution of the architecture, the standards and the protocols of the Net.

Internet Architecture Board: http://www.isoc.org/iab

http://www.isoc.org/
INDEXCARD, 8/30
 
Netiquette

Although referred to as a single body of rules, there is not just one Netiquette; there are several, though they overlap largely. Proposing general guidelines for posting messages to newsgroups and mailing lists and for using the World Wide Web and FTP, Netiquettes address civility topics (e.g., avoiding hate speech) and comprise technical advice (e.g., using simple and platform-independent file formats).
Well-known Netiquettes are the Request for Comment #1855 and The Net: User Guidelines and Netiquette by Arlene H. Rinaldi.

ftp://ftp.isi.edu/in-notes/rfc1855.txt
http://www.fau.edu/netiquette/net/index.html
INDEXCARD, 9/30
 
Immanuel Wallerstein

Immanuel Wallerstein (* 1930) is director of the Fernand Braudel Center for the Study of Economies, Historical Systems, and Civilizations. He is one of the most famous sociologists in the Western world. With his book The Modern World-System: Capitalist Agriculture and the Origins of the European World-Economy in the Sixteenth Century (1976), which gave rise to the expression World-System Theory, describing centers, peripheries and semi-peripheries in the capitalist world system, he not only influenced a whole generation of scholars; due to globalization, his theory seems to be becoming popular again.

INDEXCARD, 10/30
 
Vinton Cerf

Addressed as one of the fathers of the Internet, Vinton Cerf, together with Robert Kahn, developed the TCP/IP protocol suite, to this day the de facto communication standard of the Internet, and also contributed to the development of other important communication standards. The early work on the protocols broke new ground with the realization of a multi-network open architecture.

In 1992, he co-founded the Internet Society where he served as its first President and later Chairman.

Today, Vinton Cerf is Senior Vice President for Internet Architecture and Technology at WorldCom, one of the world's most important ICT companies.

Vinton Cerf's web site: http://www.wcom.com/about_the_company/cerfs_up/

http://www.isoc.org/
http://www.wcom.com/
INDEXCARD, 11/30
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network mainly serving to test the feasibility of wide area networks and the possibility of remote computing, it was created for resource sharing between research institutions, not for messaging services like e-mail. Although its research was sponsored by the US military, ARPAnet was not designed for directly military use but to support military-related research.

In 1969 ARPAnet went online, linking the first two computers, one of them located at the University of California, Los Angeles, the other at the Stanford Research Institute.

But ARPAnet did not become widely accepted until it was demonstrated in action to a public of computer experts at the First International Conference on Computers and Communication in Washington, D. C. in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers offered the general public access to NSFnet. Beginning in 1995, after having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, together with the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 12/30
 
Mark

A mark (trademark or service mark) is "... a sign, or a combination of signs, capable of distinguishing the goods or services of one undertaking from those of other undertakings. The sign may particularly consist of one or more distinctive words, letters, numbers, drawings or pictures, emblems, colors or combinations of colors, or may be three-dimensional..." (WIPO) To be protected, a mark must be registered with a government office; registration is generally limited in duration but can be renewed periodically (usually every 10 years).

INDEXCARD, 13/30
 
Memex Animation by Ian Adelman and Paul Kahn


INDEXCARD, 14/30
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor and the communications satellite.

http://www.research.att.com/

INDEXCARD, 15/30
 
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet, but a subset of it. It is constituted by documents linked together in such a way that you can switch from one document to another by simply clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used in creating World Wide Web documents. These so-called hypertexts can combine text documents, graphics, videos, sounds, and Java applets, thus making multimedia content possible.

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index the stored information. (For regularly updated lists of the 100 most popular words that people are entering into search engines, click here). No search engine can retrieve all information on the whole World Wide Web; every search engine covers just a small part of it.
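The indexing step mentioned above can be shown in miniature. This Python sketch (documents and URLs are invented) builds an inverted index mapping each keyword to the addresses of the documents containing it - essentially what allows a search engine to answer keyword queries without re-reading every document:

    # Invented sample "documents" keyed by hypothetical URLs.
    documents = {
        "http://example.org/carrots": "growing your own carrots leaves no data trace",
        "http://example.org/cookies": "cookies identify returning visitors",
        "http://example.org/cables": "submarine cables carry data across oceans",
    }

    # Build the inverted index: keyword -> set of URLs containing it.
    index = {}
    for url, text in documents.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)

    def search(keyword):
        """Return the URLs of all indexed documents containing the keyword."""
        return sorted(index.get(keyword.lower(), set()))

    print(search("data"))  # matches two of the three sample documents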

Among other things, this partial coverage is why the World Wide Web is not simply one huge database, as is sometimes said: it lacks consistency. True, there is virtually infinite storage capacity on the Internet, a capacity which might prove almost everlasting, too - a prospect which is consoling, but threatening as well.

According to the Internet domain survey of the Internet Software Consortium, the number of Internet host computers is growing rapidly. In October 1969 the first two computers were connected; this number grew to 376,000 in January 1991 and to 72,398,092 in January 2000.

World Wide Web History Project, http://www.webhistory.org/home.html

http://www.searchwords.com/
http://www.islandnet.com/deathnet/
http://www.salonmagazine.com/21st/feature/199...
INDEXCARD, 16/30
 
French laws against anonymity on the Net

Since the end of June, anonymous publishing - on the World Wide Web, in newsgroups, mailing lists or chat rooms - has been prohibited in France. The use of pseudonyms, so popular in chat rooms, for example, is not restricted, but the true identities of those who "publish" on the Net must be known to the users' Internet service and Internet content providers. Additionally, Internet providers are obliged to point out to their customers the possibility of blocking access to material and to offer them appropriate blocking technology.

Loi sur la communication audiovisuelle, http://www.legalis.net/jnet/2000/loi-audio/projetloi-fin.htm

Source: Florian Rötzer, Frankreich hat mit der Anonymität im Internet Schluss gemacht (France has put an end to anonymity on the Internet), in: Telepolis, July 2, 2000

http://www.heise.de/tp
INDEXCARD, 17/30
 
International Cable Protection Committee (ICPC)

The ICPC aims at reducing the number of incidents of damage to submarine telecommunications cables caused by hazards.

The Committee also serves as a forum for the exchange of technical and legal information pertaining to submarine cable protection methods and programs, and it funds projects and programs beneficial to the protection of submarine cables.

Membership is restricted to authorities (governmental administrations or commercial companies) owning or operating submarine telecommunications cables. As of May 1999, the ICPC had 67 members representing 38 nations.

http://www.iscpc.org

INDEXCARD, 18/30
 
Gateway

A gateway is a computer supplying point-to-multipoint connections between computer networks.

INDEXCARD, 19/30
 
Internet Research Task Force

Being itself under the umbrella of the Internet Society, the Internet Research Task Force is an umbrella organization of small research groups working on topics related to Internet protocols, applications, architecture and technology. It is governed by the Internet Research Steering Group.

http://www.irtf.org

http://www.irtf.org/
INDEXCARD, 20/30
 
Ku Klux Klan

The Ku Klux Klan has a long history of violence. It emerged out of the resentment and hatred felt by many white Southerners, to whom Black Americans were not human beings. While the menace of the KKK has peaked and waned over the years, it has never vanished.

INDEXCARD, 21/30
 
Amazon.com

Amazon.com is an online shop that serves approx. 17 mn customers in 150 countries. Starting out as a bookshop, Amazon today offers a wide range of other products as well.

Among privacy campaigners, the company's name has become almost synonymous with aggressive online direct marketing practices as well as user profiling and tracking. Amazon has been involved in privacy disputes on numerous occasions.

http://www.amazon.com/
http://www.computeruser.com/newstoday/00/01/0...
INDEXCARD, 22/30
 
Server

A server is a program, not a computer, as is sometimes said, dedicated to storing files, managing printers and network traffic, or processing database queries.

Web sites, the nodes of the World Wide Web (WWW), e.g., are stored on servers.

INDEXCARD, 23/30
 
The Spot

http://www.thespot.com/

http://www.thespot.com/
INDEXCARD, 24/30
 
Royalties

Royalties refer to the payment made to the owners of certain types of rights by those who are permitted by the owners to exercise the rights. The rights concerned are literary, musical, and artistic copyright and patent rights in inventions and designs (as well as rights in mineral deposits, including oil and natural gas). The term originated from the fact that in Great Britain for centuries gold and silver mines were the property of the crown and such "royal" metals could be mined only if a payment ("royalty") were made to the crown.

INDEXCARD, 25/30
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.

In digital systems, bandwidth is data speed in bits per second (bps).
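The proportionality can be made concrete with a small worked example. The sketch below (in Python, with assumed, purely illustrative file sizes) computes the bandwidth needed to move a page of text versus a photograph in one second:

    # Illustrative file sizes in bytes (assumed, not measured).
    text_page = 4000        # a short page of plain text
    photograph = 400000     # a modestly compressed photograph

    def required_bps(size_bytes, seconds=1):
        """Bits per second needed to move the file in the given time."""
        return size_bytes * 8 / seconds

    print(required_bps(text_page))   # 32,000 bps - roughly a dial-up modem
    print(required_bps(photograph))  # 3,200,000 bps - a hundred times more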

Source: Whatis.com

INDEXCARD, 26/30
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs as they combine facts with rules that state relations between the facts to achieve a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but still have the advantage of being much lower in costs compared with paying an expert or a team of specialists.
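As a toy illustration of the rule-based approach described above, the following Python sketch (with an invented troubleshooting knowledge base) implements a crude forward-chaining inference engine: a rule fires whenever all its conditions are among the known facts, adding its conclusion, until nothing new can be derived:

    # Invented knowledge base for a toy troubleshooting "expert".
    facts = {"no dial tone", "cable unplugged"}

    # Each rule: (set of conditions, conclusion added when they all hold).
    rules = [
        ({"no dial tone", "cable unplugged"}, "plug in cable"),
        ({"plug in cable"}, "retest line"),
    ]

    # Forward chaining: apply rules until no new fact can be derived.
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(facts)  # now includes the derived advice

The sketch also makes the stated limitation visible: a condition that no rule anticipates can never trigger any conclusion.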

INDEXCARD, 27/30
 
Digital Subscriber Line (DSL)

DSL connections are high-speed data connections over copper wire telephone lines. As with cable connections, with DSL you can look up information on the Internet and make a phone call at the same time but you do not need to have a new or additional cable or line installed. One of the most prominent DSL services is ISDN (integrated services digital network, for more information click here ( http://www.britannica.com/bcom/eb/article/4/0,5716,129614+15,00.html )).

http://www.britannica.com/bcom/eb/article/4/0...
INDEXCARD, 28/30
 
Fiber-optic cable networks

Fiber-optic cable networks may become the dominant method for high-speed Internet connections. Since the first fiber-optic cable was laid across the Atlantic in 1988, the demand for faster Internet connections has been growing, fuelled by growing network traffic, due partly to the increasing implementation of corporate networks spanning the globe and partly to the use of graphics-heavy content on the World Wide Web.

Fiber-optic cables have not much more in common with copper wires than the capacity to transmit information. Like copper wires, they can serve as terrestrial and submarine connections, but they allow much higher transmission rates: copper wires allow 32 telephone calls at the same time, while a fiber-optic cable can carry 40,000 - a capacity Alexander Graham Bell might not have envisioned when he transmitted the first words - "Mr. Watson, come here. I want you" - over a copper wire.

Copper wires will not fall out of use in the foreseeable future, because technologies such as DSL speed up access over them drastically. But with the technology to transmit signals at more than one wavelength on fiber-optic cables, their bandwidth is increasing, too.
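The effect of multi-wavelength transmission (wavelength-division multiplexing) is easy to quantify in a sketch. Assuming, purely for illustration, a rate of 2.5 gigabits per second per wavelength, total capacity scales with the number of wavelengths carried:

    # Illustrative figures only; real per-wavelength rates and channel
    # counts vary from system to system.
    RATE_PER_WAVELENGTH_GBPS = 2.5

    for wavelengths in (1, 8, 32):
        capacity = wavelengths * RATE_PER_WAVELENGTH_GBPS
        print(wavelengths, "wavelength(s):", capacity, "Gbps")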

For technical information from the Encyclopaedia Britannica on telecommunication cables, click here. For technical information from the Encyclopaedia Britannica focusing on fiber-optic cables, click here.

An entertaining report on the laying of the FLAG submarine cable, up to now the longest fiber-optic cable on earth, including detailed background information on the cable industry and its history, was written by Neal Stephenson for Wired: Mother Earth Mother Board. Click here to read it.

Susan Dumett has written a short history of undersea cables, Evolution of a Wired World, for Pretext magazine. Click here to read it.

A timeline history of submarine cables and a detailed list of seemingly all submarine cables of the world, operational, planned and out of service, can be found on the Web site of the International Cable Protection Committee.

For maps of fiber-optic cable networks see the website of Kessler Marketing Intelligence, Inc.

http://www.britannica.com/bcom/eb/article/4/0...
http://www.britannica.com/bcom/eb/article/4/0...
http://www.wired.com/wired/archive/4.12/ffgla...
http://www.pretext.com/mar98/features/story3....
INDEXCARD, 29/30
 
Roman smoke telegraph network, 150 A.D.

The Roman smoke signals network consisted of towers within visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.

For a similar telegraph network in ancient Greece see Aeneas Tacitus' optical communication system.

INDEXCARD, 30/30