Challenges for Copyright by ICT: Internet Service Providers

ISPs (Internet Service Providers) (and to a certain extent also telecom operators) are involved in the copyright debate primarily because of their role in the transmission and storage of digital information. Problems arise particularly concerning caching, information residing on systems or networks of ISPs at the direction of users, and transitory communication.

Caching

Caching, it is argued, could cause damage because the copies in a cache are not necessarily the most current ones, and the delivery of outdated information to users could deprive website operators of accurate "hit" information (information about the number of requests for particular material on a website), from which advertising revenue is frequently calculated. Similarly, harms such as defamation or infringement that existed on the original page may propagate for years until flushed from each cache where they have been replicated.
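
The mechanism behind both complaints can be sketched in a few lines of Python. This is a deliberately simplified model, not a description of any particular ISP's cache; the page, path and hit counter are purely illustrative:

    # Once a page is stored, later requests are answered from the cache and
    # never reach the origin server, so updates go unseen and the operator's
    # hit count stays flat.
    origin = {"/news": "version 1"}   # the website operator's current page
    cache = {}                        # the ISP's cache
    origin_hits = 0                   # the count advertising revenue is based on

    def fetch(path):
        global origin_hits
        if path in cache:             # served from the cache: origin sees no hit
            return cache[path]
        origin_hits += 1              # only cache misses reach the origin
        cache[path] = origin[path]    # the copy is kept until flushed
        return cache[path]

    fetch("/news")                      # first request reaches the origin
    origin["/news"] = "version 2"       # the operator updates the page ...
    print(fetch("/news"), origin_hits)  # ... but users still get "version 1", hits == 1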

Although they are distinct concepts, issues similar to those of caching arise with mirroring (establishing an identical copy of a website on a different server), archiving (providing a historical repository for information, as with newsgroups and mailing lists), and full-text indexing (the copying of a document for loading into a full-text or near-full-text database that is searchable for keywords or concepts).

Under a literal reading of some copyright laws, caching constitutes an infringement of copyright. Yet recent legislation such as the DMCA and the proposed EU Directive on copyright and related rights in the information society (amended version) provides exceptions for ISPs concerning particular acts of reproduction that are considered technical copies (caching). Nevertheless, the exemption from liability only applies if ISPs meet a variety of specific conditions. In the course of the debate about caching, suggestions have also been made to subject it to an implied license or fair use defense, or to make it (at least theoretically) actionable.

Information Residing on Systems or Networks at the Direction of Users

ISPs may be confronted with problems if infringing material on (users') websites is hosted on their systems. Although some copyright laws, such as the DMCA, provide for limitations on the liability of ISPs if certain conditions are met, it is still unclear whether ISPs should generally be held accountable for the storage of infringing material (even if they have no actual knowledge of it) or whether exceptions should be established for specific circumstances.

Transitory Communication

In the course of transmitting digital information from one point on a network to another, ISPs act as data conduits. When a user requests information, ISPs engage in its transmission, the provision of a connection, or its routing. Where a person sends infringing material over a network and the ISP merely provides the facilities for the transmission, it is widely held that the ISP should not be liable for infringement. Yet some copyright laws, such as the DMCA, provide for a limitation of liability (which also covers the intermediate and transient copies made automatically in the operation of a network) only if the ISP's activities meet certain conditions.

For more information on copyright (intellectual property) related problems of ISPs (bulletin board system (BBS) operators, systems operators and other service providers) see:

Harrington, Mark E.: On-line Copyright Infringement Liability for Internet Service Providers: Context, Cases & Recently Enacted Legislation. In: Intellectual Property and Technology Forum. June 4, 1999.

Teran, G.: Who is Vulnerable to Suit? ISP Liability for Copyright Infringement. November 2, 1999.

TEXTBLOCK 1/6 // URL: http://world-information.org/wio/infostructure/100437611725/100438659550
 
Bureaucratic data bunkers



Among the foremost of the data bunkers are government bureaucracies. Bureaucracies are the oldest form of bunker and are today deeply ingrained in modern societies. They have always had the function of collecting and administering the data of subjects. What makes this process more problematic in the age of ICT is that far more data can be collected, that data can be collected in clandestine ways (e.g. in surveillance situations), and that it can be combined and merged using advanced data mining technologies. In addition, there is a greater rationale for official data collection, as far more data is required for the functioning of public administration than in previous periods, as societies rush to adopt increasingly complex technologies, above all ICTs. The increasing complexity of modern societies means that an increasing number of bureaucratic decisions are taken, all of which require a calculation process. Complexity, viewed through government spectacles, generates insecurity - a great deal of bureaucratic activity therefore revolves around the topic of security.

In spite of the anti-bureaucratic rhetoric of most governments, these factors provide the bureaucracies with an increased hold on society. Foremost bureaucratic data bunkers include the following:

    Law enforcement agencies

    Fiscal agencies

    Intelligence agencies

    Social welfare agencies

    Social insurance institutions

    Public health agencies

    Educational institutions



These are agencies that enjoy the privileged protection of the state. Those among them that operate in the field of security are further protected against public scrutiny, as they operate in an area to which democratic reason has no access.

What makes the data repositories of these institutions different from private data bunkers is their "official", i.e. politically binding and definitive, character. The Critical Art Ensemble (CAE) speak of the bureaucracy as a "concrete form of uninterruptible, official and legitimised memory."

TEXTBLOCK 2/6 // URL: http://world-information.org/wio/infostructure/100437611761/100438659721
 
Virtual body and data body



The result of this informatisation is the creation of a virtual body, which is the exterior of a man or woman's social existence. It plays the same role as the physical body, except that it is located in virtual space (it has no real location). The virtual body holds a certain emancipatory potential. It allows us to go to places and to do things which would be impossible in the physical world. It does not have the weight of the physical body and is less conditioned by physical laws. It therefore allows one to create an identity of one's own, with far fewer restrictions than would apply in the physical world.

But this new freedom has a price. In the shadow of virtualisation, the data body has emerged. The data body is a virtual body which is composed of the files connected to an individual. As the Critical Art Ensemble observe in their book Flesh Machine, the data body is the "fascist sibling" of the virtual body; it is "a much more highly developed virtual form, and one that exists in complete service to the corporate and police state."

The virtual character of the data body means that the social regulation that applies to the real body is absent. While there are limits to the manipulation and exploitation of the real body (even if these limits are not respected everywhere), there is little regulation concerning the manipulation and exploitation of the data body, although the data body is much easier to manipulate than the real body. The seizure of the data body from outside often goes undetected by the individual concerned, as it has become part of the basic structure of an informatised society. Yet data bodies serve as raw material for the "New Economy": both business and governments claim access to them. Power can be exercised, and democratic decision-making procedures bypassed, by seizing data bodies. This totalitarian potential makes the data body a deeply problematic phenomenon, one that calls for an understanding of data as a social construction rather than as something representative of an objective reality. How data bodies are generated, what happens to them and who has control over them is therefore a highly relevant political question.

TEXTBLOCK 3/6 // URL: http://world-information.org/wio/infostructure/100437611761/100438659695
 
The Copyright Industry

Copyright is not only about protecting the rights of creators; it has also become a major branch of industry, contributing significantly to the global economy. According to the International Intellectual Property Alliance, the U.S. copyright industry has grown almost three times as fast as the economy as a whole over the past 20 years. In 1997, the total copyright industries contributed an estimated US$ 529.3 billion to the U.S. economy, with the core copyright industries accounting for US$ 348.4 billion. Between 1977 and 1997, the absolute growth of value added to the U.S. GDP by the core copyright industries was 241%. The copyright industry's foreign sales in 1997 (US$ 66.85 billion for the core copyright industries) were also larger than the U.S. Commerce Department International Trade Administration's export estimates for almost all other leading industry sectors, exceeding even the combined automobile and automobile parts industries, as well as the agricultural sector.

In an age in which knowledge and information are becoming ever more important, and with the advancement of new technologies, transmission systems and distribution channels, a further increase in the production of intellectual property is expected. As copyright establishes ownership in intellectual property, it is therefore increasingly seen as a key to future wealth.

TEXTBLOCK 4/6 // URL: http://world-information.org/wio/infostructure/100437611725/100438658710
 
Intellectual Property and the "Information Society" Metaphor

Today talk about the so-called "information society" is ubiquitous. Many consider it the successor of the industrial society and claim that it represents a new form of societal and economic organization. This claim rests on the argument that the information society uses a new kind of resource, one which differs fundamentally from that of its industrial counterpart: whereas industrial societies focus on physical objects, the information society's raw material is said to be knowledge and information. Yet the conception of the capitalist system, which underlies industrial societies, continues to exist in an information-based environment. Although the forms of manufacture have changed, the relations of production remain organized on the same basis: the principle of property.

In the context of a capitalist system based on industrial production, the term property predominantly relates to material goods. Still, even as the raw materials, resources and products change in an information society, the concept of property persists. It is merely extended: no longer confined to physical objects, it also attempts to place information within a set of property relations. This new kind of knowledge-based property is widely referred to as "intellectual property". Although intellectual property in some ways represents a novel form of property, it has quickly been integrated into the traditional property framework. Whether material or immaterial, within the capitalist system products are treated the same - as property.

TEXTBLOCK 5/6 // URL: http://world-information.org/wio/infostructure/100437611725/100438659429
 
How the Internet works

On the Internet, when you want to retrieve a document from another computer, you request a service from that computer. Your computer is the client; the computer on which the information you want to access is stored is called the server. The Internet's architecture is therefore called a client-server architecture.
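
The request-response relationship can be illustrated with a short Python sketch of the client side. It is a minimal sketch, assuming a reachable public web server; the hostname is illustrative and any web server would do:

    # The client opens a connection to a server and requests a document;
    # the server answers with the requested data.
    import socket

    HOST = "example.org"   # the server storing the document (illustrative)
    PORT = 80              # standard HTTP port

    with socket.create_connection((HOST, PORT)) as s:
        # The client sends a request following the protocol ...
        s.sendall(b"GET / HTTP/1.1\r\nHost: example.org\r\nConnection: close\r\n\r\n")
        # ... and reads the server's reply until the connection closes.
        response = b""
        while chunk := s.recv(4096):
            response += chunk

    print(response.decode("latin-1")[:200])  # first lines of the server's reply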

A common set of standards allows the exchange of data and commands through the Internet independent of location, time, and operating system. These standards are called communication protocols, or the Internet Protocol Suite, and are implemented in Internet software. The Internet Protocol Suite is sometimes loosely identified with TCP/IP (Transmission Control Protocol / Internet Protocol), although, strictly speaking, TCP and IP are only two of its many protocols.

Any information to be transferred is broken down into pieces, so-called packets, and the Internet Protocol determines how the data gets from A to B by passing through routers.

Each packet is "pushed" from router to router via gateways and might take a different route. It is not possible to determine in advance which routes the packets will take. At the receiving end, the packets are checked and reassembled.
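
The principle can be illustrated with a short Python sketch. It models only packetisation, out-of-order arrival and reassembly, not the actual IP and TCP mechanics; the message and packet size are illustrative:

    # A message is split into numbered packets, which may arrive out of
    # order over different routes; the receiver reorders and reassembles.
    import random

    MESSAGE = b"Any information to be transferred is broken into packets."
    PACKET_SIZE = 8  # bytes of payload per packet (illustrative)

    # Sender: split the message and tag each piece with a sequence number.
    packets = [(seq, MESSAGE[i:i + PACKET_SIZE])
               for seq, i in enumerate(range(0, len(MESSAGE), PACKET_SIZE))]

    random.shuffle(packets)  # different routes: packets arrive out of order

    # Receiver: sort by sequence number and reassemble the original message.
    reassembled = b"".join(payload for _, payload in sorted(packets))
    assert reassembled == MESSAGE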

The technique of breaking down all messages and requests into packets has the advantage that a large data bundle (e.g. video) sent by a single user cannot block a whole network, because the required bandwidth is spread over many packets sent along different routes. Detailed information about routing in the Internet can be obtained at http://www.scit.wlv.ac.uk/~jphb/comms/iproute.html.

One of the Internet's (and of the Matrix's) beginnings was the ARPANet, whose design was intended to withstand disruption, for example by military attack. The ARPANet was able to route data around damaged areas, so that disruption would not impede communication. This design, with its origin in strategic and military considerations, remained unchanged for the Internet. Yet the ARPANet's design cannot be completely applied to the Internet.

Routing around damage depends on the location of the interruption and on the availability of intersecting points between networks. If, for example, an e-mail message is sent from Brussels to Athens and a channel is down in Germany, access will not be affected very much: the message will be routed around the damage, as long as no major Internet exchange is affected. However, if access depends on a single backbone connection to the Internet and this connection is cut off, there is no way to route around it.
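
The dependence on intersecting points can be shown with a toy network graph in Python. The topology is invented for the example, loosely following the Brussels-to-Athens illustration above, and does not reflect real backbone geography:

    # If one link goes down, a path still exists as long as the graph stays
    # connected; cutting a node's only links isolates it entirely.
    from collections import deque

    links = {  # invented topology: each node's neighbors
        "Brussels": {"Germany", "France"},
        "Germany": {"Brussels", "Austria"},
        "France": {"Brussels", "Italy"},
        "Austria": {"Germany", "Athens"},
        "Italy": {"France", "Athens"},
        "Athens": {"Austria", "Italy"},
    }

    def reachable(src, dst, down=frozenset()):
        # Breadth-first search, skipping links marked as down.
        seen, queue = {src}, deque([src])
        while queue:
            node = queue.popleft()
            if node == dst:
                return True
            for nxt in links[node]:
                if nxt not in seen and frozenset((node, nxt)) not in down:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    # A channel down in Germany: the message is rerouted via France and Italy.
    print(reachable("Brussels", "Athens", down={frozenset(("Germany", "Austria"))}))  # True
    # All of Athens' links cut, like a severed single backbone connection.
    print(reachable("Brussels", "Athens",
                    down={frozenset(("Austria", "Athens")), frozenset(("Italy", "Athens"))}))  # False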

In most parts of the world the Internet is therefore vulnerable to disruption. "The idea of the Internet as a highly distributed, redundant global communications system is a myth. Virtually all communications between countries take place through a very small number of bottlenecks, and the available bandwidth isn't that great," says Douglas Barnes. These bottlenecks are the network connections to neighboring countries. Many countries rely on a single connection to the Net, and in some places, such as the Suez Canal, there is a concentration of fiber-optic cables of critical importance.

TEXTBLOCK 6/6 // URL: http://world-information.org/wio/infostructure/100437611791/100438659870
 
The Spot

http://www.thespot.com/

INDEXCARD, 1/6
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between the facts, achieving a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) consisting of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but they have the advantage of costing much less than paying an expert or a team of specialists.
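
A minimal Python sketch illustrates the principle of such a rule-based inference engine, here using forward chaining; the facts and rules are invented for the example, not taken from a real diagnostic system:

    # Facts come from the user via the interface; rules form the knowledge base.
    facts = {"printer does not print", "printer power light off"}

    rules = [  # (conditions, conclusion): relations between facts
        ({"printer power light off"}, "printer has no power"),
        ({"printer has no power"}, "check power cable"),
        ({"printer does not print", "printer has no power"}, "fault is electrical"),
    ]

    # Inference engine: repeatedly fire rules whose conditions are all met,
    # until no new facts can be derived.
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(facts)  # now includes the derived conclusions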

INDEXCARD, 2/6
 
Backbone Networks

Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmitting capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.

INDEXCARD, 3/6
 
The Internet Engineering Task Force

The Internet Engineering Task Force contributes to the evolution of the architecture, the protocols and technologies of the Net by developing new Internet standard specifications. The directors of its functional areas form the Internet Engineering Steering Group.

http://www.ietf.org/

INDEXCARD, 4/6
 
First Monday

An English-language, peer-reviewed media studies journal based in Denmark.

http://firstmonday.dk

INDEXCARD, 5/6
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. As a result, a vast collection of historic video, audio, documents, and software is expected. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 6/6