Some Essential Definitions

Some essential definitions in the field of cryptography are:
- cryptanalysis
- cryptology
- ciphers

"Few false ideas have more firmly gripped the minds of so many intelligent men than the one that, if they just tried, they could invent a cipher that no one could break." (David Kahn)

- codes
- plaintext
- ciphertext
- to encipher/encode
- to decipher/decode

The variants of encryption systems are endless.
Deciphering always involves the same game of trial and error: first guessing the encryption method, then the key. Pruning the search space helps. Sooner or later, a code or cipher is broken. Monoalphabetic ciphers can be broken easily and of course are no longer used today except for games.
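Frequency analysis is the classic reason monoalphabetic ciphers fall so easily. As a rough illustration (a minimal sketch, not any historical tool), the following Python snippet breaks a Caesar shift, the simplest monoalphabetic cipher, by trying every key and keeping the candidate whose letter distribution is closest to typical English values:

```python
from collections import Counter

# Approximate English letter frequencies in percent (rounded reference values).
ENGLISH = {'a': 8.2, 'b': 1.5, 'c': 2.8, 'd': 4.3, 'e': 12.7, 'f': 2.2,
           'g': 2.0, 'h': 6.1, 'i': 7.0, 'j': 0.15, 'k': 0.77, 'l': 4.0,
           'm': 2.4, 'n': 6.7, 'o': 7.5, 'p': 1.9, 'q': 0.1, 'r': 6.0,
           's': 6.3, 't': 9.1, 'u': 2.8, 'v': 0.98, 'w': 2.4, 'x': 0.15,
           'y': 2.0, 'z': 0.074}

def shift(text, k):
    """Caesar-shift the letters of a lowercase text by k positions."""
    return ''.join(chr((ord(c) - 97 + k) % 26 + 97) if c.isalpha() else c
                   for c in text.lower())

def chi_squared(text):
    """How far the text's letter distribution deviates from English."""
    counts = Counter(c for c in text if c.isalpha())
    total = sum(counts.values()) or 1
    return sum((counts.get(c, 0) - ENGLISH[c] / 100 * total) ** 2
               / (ENGLISH[c] / 100 * total) for c in ENGLISH)

def break_caesar(ciphertext):
    """Try all 26 shifts and keep the candidate closest to English."""
    return min((shift(ciphertext, k) for k in range(26)), key=chi_squared)

print(break_caesar(shift("attack at dawn", 3)))  # -> attack at dawn
```

The same trial-and-error-with-pruning idea scales to general monoalphabetic substitution, only with a larger key space.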

For further information on codes, ciphers etc. see:
http://www.optonline.com/comptons/ceo/01004A.html
http://www.ridex.co.uk/cryptology/#_Toc439908851

TEXTBLOCK 1/26 // URL: http://world-information.org/wio/infostructure/100437611776/100438659070
 
Challenges for Copyright by ICT: Copyright Owners

The main concern of copyright owners, as the principal beneficiaries (in terms of income generation) of intellectual property protection, is the facilitation of pirate activities in digital environments.

Reproduction and Distribution

Unlike copies of works made with analog copiers (photocopy machines, video recorders etc.), digital information can be reproduced extremely fast, at low cost and without any loss in quality. Since each copy is a perfect copy, no quality-related limits inhibit pirates from making as many copies as they please, and recipients of these copies have no incentive to return to authorized sources for another product of equal quality. Additionally, the cost of making one extra copy of intellectual property online is insignificant, as are the distribution costs if the copy is moved to the end user over the Internet.

Control and Manipulation

In cross-border, global data networks it is almost impossible to control the exploitation of protected works. In particular, the use of anonymous remailers and other existing technologies complicates the prosecution of pirates. Digital files are also especially vulnerable to manipulation, both of the work itself and of any copyright management information embedded in it.

TEXTBLOCK 2/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659526
 
Public Relations and Propaganda

Public relations is usually associated with the influencing of public opinion, and has therefore been linked with propaganda. Using one of the many definitions of propaganda, "... the manipulation of symbols as a means of influencing attitudes on controversial matters" (Harold D. Lasswell), the terms propaganda and PR seem easily interchangeable.

Still, many authors explicitly distinguish between public relations, advertising and propaganda. Unlike PR, which is often described as objective and comprehensive information of the public, advertising and propaganda are associated with manipulative activities. Nevertheless, treating public relations and propaganda as equivalents stands in the tradition of PR itself. Edward L. Bernays, one of the founders of public relations, wrote: "The only difference between propaganda and education, really, is the point of view. The advocacy of what we believe in is education. The advocacy of what we don't believe is propaganda."

Institutions like the German Bundeswehr also use the terms public relations and propaganda synonymously. Following 1990 legislation under the former minister of defense Stoltenberg, the "psychological influencing of the enemy" was discontinued in peacetime and the Academy for Psychological Defense was renamed the Academy for Information and Communication, which is, among other things, responsible for scientific research in the field of public relations.

TEXTBLOCK 3/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438658084
 
Basics: Protected Persons

Generally copyright vests in the author of the work. Certain national laws provide for exceptions and, for example, regard the employer as the original owner of a copyright if the author was, when the work was created, an employee employed for the purpose of creating that work. For some types of creations, particularly audiovisual works, several national laws provide for different solutions to the question of who should be the first holder of copyright in such works.

Many countries allow copyright to be assigned, meaning that the owner of the copyright transfers it to another person or entity, which then becomes its holder. Where national law does not permit assignment, it usually provides the possibility of licensing the work to someone else: the owner of the copyright remains the holder but authorizes another person or entity to exercise all or some of his rights, subject to possible limitations. In any case the "moral rights" always belong to the author of the work, whoever the owner of the copyright (and therefore of the "economic rights") may be.


TEXTBLOCK 4/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659527
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, frequently through matrilineal descent.


TEXTBLOCK 5/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
What is the Internet?

Each definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, universities, etc. Generally speaking, every time a new computer network is connected, a new Internet is created. Technically speaking, the Internet is a wide area network (WAN) to which local area networks (LANs) may be connected.

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different from the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not constituted solely by computers connected to other computers: there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than to say what it is. On 24 October 1995 the U.S. Federal Networking Council passed the following resolution concerning the definition of the term "Internet": ""Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally, and in a simplifying manner, called the Internet may better be referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: the USENET, corporate networks and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means for making information easily accessible worldwide, for sending and receiving messages, videos, texts and audio files, for transferring funds and trading securities, for sharing resources, for collecting weather condition data, for trailing the movements of elephants, for playing games online, for video conferencing, for distance learning, for virtual exhibitions, for jamming with other musicians, for long-distance ordering, for auctions, for tracking packaged goods, for doing business, for chatting, and for remote access to computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving, and storing data and for sharing equipment over long distances, even in real time if the telecommunication infrastructure is reliable and of high quality.

For a comprehensive view of uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace".

TEXTBLOCK 6/26 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
Intellectual Property and the "Information Society" Metaphor

Today talk about the so-called "information society" is ubiquitous. Many consider it the successor of the industrial society and say it represents a new form of societal and economic organization. This claim is based on the argument that the information society uses a new kind of resource, one which differs fundamentally from that of its industrial counterpart. Whereas industrial societies focus on physical objects, the information society's raw material is said to be knowledge and information. Yet the conception of the capitalist system, which underlies industrial societies, also continues to exist in an information-based environment. Although there have been changes in the forms of manufacture, the relations of production remain organized on the same basis: the principle of property.

In the context of a capitalist system based on industrial production, the term property predominantly relates to material goods. Still, even as the raw materials, resources and products change in an information society, the concept of property persists. It is merely extended: it no longer considers only physical objects as property, but also attempts to place information in a set of property relations. This new kind of knowledge-based property is widely referred to as "intellectual property". Although intellectual property in some ways represents a novel form of property, it has quickly been integrated into the traditional property framework. Whether material or immaterial, products within the capitalist system are treated the same: as property.

TEXTBLOCK 7/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659429
 
Commercial vs. Independent Content: Power and Scope

In terms of financial and human resources, commercial media companies are at any rate much more powerful players than their independent counterparts. Still, the latter reply with extreme multiplicity and diversity. Today thousands of newsgroups, mailing lists and e-zines covering a wide range of issues, from the environment to politics, social and human rights, culture, art and democracy, are run by alternative groups.

Moreover, independent content providers started to use digital media for communication, information and co-ordination long before these were discovered by corporate interests. They regularly use the Internet and other networks to further public discourse and mount civic resistance, and in many cases are very successful with their work, as initiatives like widerst@ndMUND's (AT) co-ordination of the critics of the Freedom Party's participation in the Austrian government via mailing lists, an online magazine and discussion forums show.

TEXTBLOCK 8/26 // URL: http://world-information.org/wio/infostructure/100437611795/100438659058
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations, with growth rates expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless seized upon by many media and decision makers, because the technical difficulties involved make reliable measurement of Internet growth or usage nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in the paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is no average researcher, but the founder of the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So this statement is a slap in the face of all market researchers claiming otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task since the inception of the ARPANET, the forerunner of the Internet. The very first ARPANET site was established at the University of California, Los Angeles, and intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANET (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. This was due to two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth and performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. Some sources contain seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". The authors take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that they might better be called estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko also consider the long-distance private line networks, the corporate networks and Intranets, because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, the traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network. The private line networks are considerably larger in aggregate capacity than the Internet, about as large as the voice network in the U. S. (with an effective bandwidth of about 330 Gbps in December 1997), though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications, both of which consume more bandwidth and account for unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply, which is done with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But doing this for every registered IP address is an arduous task, so ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Even apart from the small sample, this method has at least one flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
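The sample-and-project method described above can be sketched in a few lines. This is a rough illustration under stated assumptions: the function names are invented, and a simple callable stands in for an actual PING probe; it is not ISC's implementation.

```python
import random

def estimate_hosts(assigned_addresses, is_reachable, sample_rate=0.01, seed=42):
    """Estimate reachable hosts by probing only a sample and projecting up.

    is_reachable stands in for an actual PING probe; names and parameters
    here are illustrative assumptions, not ISC's code.
    """
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    k = max(1, int(len(assigned_addresses) * sample_rate))
    sample = rng.sample(assigned_addresses, k)
    reachable = sum(1 for addr in sample if is_reachable(addr))
    # Project the sample's reachability rate onto all assigned addresses.
    return round(len(assigned_addresses) * reachable / k)

# Toy data: 10,000 named addresses, of which every fourth one answers a probe.
addresses = list(range(10_000))
estimate = estimate_hosts(addresses, lambda addr: addr % 4 == 0)
print(estimate)  # a projection in the neighborhood of 2,500, not an exact count
```

The projection inherits the sample's statistical error, which is one reason such host counts should be read as estimates rather than census figures.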

Internet Weather

Like daily weather, traffic on the Internet (the conditions for data flows) is monitored too, hence the term Internet weather. One of the most famous Internet weather reports is from The Matrix, Inc. Another is the Internet Traffic Report, which displays traffic as values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used: servers are "pinged" (as for host counts) and their response times compared to past ones and to the response times of servers in the same reach.
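The Internet Traffic Report's actual rating formula is not published, so the mapping below is purely an illustrative assumption: a toy function that scores a server's current ping time against its own history on a 0-100 scale, where 100 means as fast as the historical average or better.

```python
def traffic_index(current_ms, history_ms):
    """Map a server's current ping time against its history to a 0-100 score.

    The real services' formulas are unpublished; this linear mapping
    (100 = at or below the historical average response time) is an assumption.
    """
    baseline = sum(history_ms) / len(history_ms)
    return max(0, min(100, round(100 * baseline / current_ms)))

print(traffic_index(80, [80, 75, 85]))   # -> 100: as fast as usual
print(traffic_index(160, [80, 75, 85]))  # -> 50: twice as slow as the baseline
```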

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
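The arithmetic behind hit inflation is easy to sketch. In this toy function (its data layout is invented for illustration), each embedded file adds one hit on top of the request for the document itself:

```python
def hits_per_visit(page):
    """Count the 'hits' one visit generates: one request per file served."""
    return 1 + len(page["embedded_files"])  # the document plus each graphic

# A page made of some text and twenty flashy graphics files:
page = {"embedded_files": [f"img{i}.gif" for i in range(20)]}
print(hits_per_visit(page))  # -> 21 hits for a single visit
```

So a hit count measures page composition at least as much as it measures popularity.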
In the meantime, page views, also called page impressions, are preferred and are said to avoid these flaws. But even page views are not reliable. A user might share a computer and the corresponding IP address and host name with others, or might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
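A tiny log-counting sketch (the log format here is invented) makes the distortion visible: the requests a server can count need not match the number of actual readers, because shared addresses merge users and caches hide them entirely.

```python
# A toy server log: (client address, requested page).
requests = [
    ("10.0.0.1", "/index.html"),  # two readers behind one proxy address ...
    ("10.0.0.1", "/index.html"),  # ... look identical to one repeat visitor
    ("10.0.0.2", "/index.html"),  # and a reader served from a cache never appears
]

page_views = len(requests)                        # what the server can count
apparent_users = len({ip for ip, _ in requests})  # distinct addresses, not persons
print(page_views, apparent_users)  # -> 3 2
```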

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may pick up a journal just for a particular column and not care about the journal's other contents; deleting this column for not receiving enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared and the Internet is a client-server system; in a certain sense, it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be certain that information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.


If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? No global study of the Internet's geographical patterns is available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 9/26 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
The Secret Behind

The secret behind all this is the conception that nothing bad could ever be attributed to one's own nation. All the bad words belong to the enemy, whereas "we" are the good ones, never the aggressor but always the defender, the savior: not only for one's own sake but also for the others, even if they never asked for it, like the German population during World War I and II.
The spiritualization of such thinking leads to the point where it becomes nearly impossible to believe that this could be untrue, a fake. Imagining injustice committed by one's own nation becomes more and more difficult the longer this kind of propaganda goes on. U.S. Americans voluntarily believe in their country's politics, believing also that the USA acts as the police of the world, defending the morally good against those who simply have not reached the same level of civilization to this day.
To keep up this image, the enemy must be portrayed as ugly and bad, as in the black-and-white pictures of fairy tales. Any connection between oneself and the enemy must be erased and made impossible. In the cases of Slobodan Milosevic and Saddam Hussein this meant deleting the positive contacts of previous years from the consciousness of the population. Both had received large amounts of money and material help as long as they kept to the rules of the Western game. Later, when the image of the friend and confederate was destroyed, the contact had to be denied. The media, which had reported that help, no longer seemed to remember and did not write anything about that strange change of mind. And if any did, they were not really listened to, because people tend to hear what they want to hear. And who would want to hear that large amounts of his taxes had formerly gone to "a man" (this personification of the war in one single man is the next disinformation) who is now the demon in one's mind?

None of this is the invention of individual politicians. Huge think tanks and various governmental organizations stand behind it, and part of their work is to hide, or deny, their own work.

TEXTBLOCK 10/26 // URL: http://world-information.org/wio/infostructure/100437611661/100438658637
 
Sponsorship Models

With new sponsorship models being developed, even further influence over content from the corporate side can be expected. Co-operating with Barnes & Noble Booksellers, the bookish e-zine FEED, for instance, relies in part on sponsorship. Whenever a specific title is mentioned in the editorial content, a link is placed in the margin, under the heading "Commerce", to an appropriate page on Barnes & Noble. Steve Johnson, editor of FEED, says, "We do not take a cut of any merchandise sold through those links", but admits that the e-zine does indirectly profit from putting those links there.

TEXTBLOCK 11/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438658034
 
Individualized Audience Targeting

New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements": probably the ultimate tool for advertisers. Although not yet widely used, companies like Amazon.com have already started to exploit individualized audience targeting for their purposes.

TEXTBLOCK 12/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438658450
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a reduction in the size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and they also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were still costly and too powerful for the business sector's needs; therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, that gave them the flexibility to be cost-effective and productive. Second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 13/26 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
RTMark and Adbusters at the WTO Conference in Seattle

The 1999 WTO (World Trade Organization) Conference in Seattle not only attracted a multitude of demonstrators, but also artistic and cultural activists like RTMark and Adbusters.

Adbusters, well known as fighters against corporate disinformation, injustices in the global economy and "physical and mental pollution", purchased three billboards in downtown Seattle in time for the WTO Conference. Featuring an image with the text "System Error - Type 2000 (progress)", the billboards were meant to challenge "... the WTO's agenda of global corporate growth and expose what isn't reflected in the United States' GNP - human and environmental capital."

At the same time RTMark went on-line with its spoof WTO website http://gatt.org. Shortly after its release, WTO Director-General Mike Moore accused RTMark of attempting to "undermine WTO transparency" by copying the WTO website's design and using "domain names such as `www.gatt.org` and page titles such as 'World Trade Organization / GATT Home Page' which make it difficult for visitors to realize that these are fake pages." http://gatt.org is not the first time RTMark has used website imitation to render an entity more transparent: it has performed the same "service" for George W. Bush, Rudy Giuliani, Shell Oil, and others, with the principal purpose of publicizing corporate abuses of democratic processes.

TEXTBLOCK 14/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438658922
 
Legal Protection: WIPO (World Intellectual Property Organization)

The WIPO is presumably the major player in the field of international intellectual property protection and the administrator of various multilateral treaties dealing with the legal and administrative aspects of intellectual property.

Information on WIPO administered agreements in the field of industrial property (Paris Convention for the Protection of Industrial Property (1883), Madrid Agreement Concerning the International Registration of Marks (1891) etc.) can be found on: http://www.wipo.org/eng/general/index3.htm

Information on treaties concerning copyright and neighboring rights (Berne Convention for the Protection of Literary and Artistic Works (1886) etc.) is published on: http://www.wipo.org/eng/general/index5.htm

The most recent multilateral agreement on copyright is the 1996 WIPO Copyright Treaty. Among other things it provides that computer programs are protected as literary works and also introduces the protection of databases, which "... by reason of the selection or arrangement of their content constitute intellectual creations." Furthermore the 1996 WIPO Copyright Treaty contains provisions concerning technological measures, rights management information and establishes a new "right of communication to the public". It is available on: http://www.wipo.org/eng/diplconf/distrib/treaty01.htm

TEXTBLOCK 15/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659588
 
Satyrs, Frankenstein, Machine Men, Cyborgs

The idea of hybrid beings between man and non-human entities can be traced back to mythology: mythologies, European and non-European, are populated with beings which are both human and non-human and which, because of this non-humanness, have served as reference points in the human endeavour of understanding what it means to be human. Perhaps "being human" is not even a meaningful phrase without the possibility of also identifying ourselves with the negation of humanness, that is, of being human through the very possibility of identification with the non-human.

While in classical mythology such beings were usually situated between the human and animal kingdoms, or between the human and the divine, the advent of modern technology in the past two centuries has countered any such irrational representations of humanness. The very same supremacy of rationality which deposited the hybrid beings of mythology (and of religion) on the garbage heap of the modern period and which attempted a "pure" understanding of humanness has also been responsible for the rapid advance of technology, which in turn prepared a "technical" understanding of the human.

The only non-human world which remains beyond the animal and divine worlds is the world of technology. The very attempt at a purist definition of the human ran into difficulty: the theories of Darwin and Freud undermined the belief that there was something essentially human in human beings, something that could be defined without reference to the non-human.

Early representations of half-man, half-machine creatures echo the fear of the violent use of machinery, as in wars. Mary Shelley published Frankenstein in 1818, only a few years after the end of the Napoleonic wars. But machines are not only a source of fear exploited in fiction; their power also turns their non-humanness into super-humanness. The French philosopher and doctor Julien de La Mettrie argues in his famous Machine Man that human beings are essentially constructed like machines and that they obey the same principles. Machine Man provides a good example of how the Enlightenment ideas of human autonomy are interwoven with a technical discourse of perfection.

What human minds have later dreamed up about - usually hostile - artificial beings has coalesced into the literary genre of science fiction. Science fiction seems to have provided the "last" protected zone for the strong emotions and hard values which in standard fiction literature would relegate a story to the realm of kitsch. Violent battles, strong heroes, daring explorations, infinity and solitude, clashes of right and wrong and whatever else makes up the aesthetic repertoire of metaphysics have survived unscathed in science fiction.

However, science fiction also seems to mark the final sequence of pure fiction: the cyborg heroes populating this genre have transcended the boundary between fact and fiction, ridiculing most established social theories of technology based on technological instrumentalism. Donna Haraway has gone a long way in coming to terms with the cultural and social implications of this development. "By the late twentieth century, our time, a mythic time, we are all chimeras, theorized and fabricated hybrids of machine and organism; in short, we are cyborgs", Haraway states in her Cyborg Manifesto. In cyber culture, the boundaries between organisms and machines, between nature and culture, become as ambivalent as the borderline between the physical and the non-physical: "Our best machines are made of sunshine; they are all light and clean because they are nothing but signals".

In the Flesh Machine the Critical Art Ensemble analyses the mapping of the body, as in genetics, as one aspect of keeping state power in place, the other two aspects being the "war machine" and the "sight machine". The mapping of the flesh machine is a logical and necessary consequence of the development of the other two "machines". Cyborgisation is, in the words of CAE, the "coming of age of the flesh machine", which, although it has "intersected both the sight and war machine since ancient times ... is the slowest to develop." Representation is a necessary preliminary to violence, since "Any successful offensive military action begins with visualization and representation. The significant principle here ... is that vision equals control."

TEXTBLOCK 16/26 // URL: http://world-information.org/wio/infostructure/100437611777/100438658891
 
Internet Content Providers' Perspective

As within the traditional media landscape, Internet content providers have two primary means of generating revenue: direct sales or subscriptions, and advertising. Especially as charging Internet users for access to content - with all the free material available - has proven problematic, advertising is seen as the best solution for creating revenues in the short term. Therefore intense competition to attract advertising money has started among Internet content providers and access services.

Table: Web-Sites Seeking Advertising

Period            Number of Web-Sites
June 1999         2111
July 1999         2174
August 1999       2311
September 1999    2560

Source: Adknowledge eAnalytics, Online Advertising Report

TEXTBLOCK 17/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438657986
 
The Copyright Industry

Copyright is not only about protecting the rights of creators, but has also become a major branch of industry with significant contributions to the global economy. According to the International Intellectual Property Alliance the U.S. copyright industry has grown almost three times as fast as the economy as a whole for the past 20 years. In 1997, the total copyright industries contributed an estimated US$ 529.3 billion to the U.S. economy, with the core copyright industries accounting for US$ 348.4 billion. Between 1977 and 1997, the absolute growth rate of value added to the U.S. GDP by the core copyright industries was 241%. The copyright industry's foreign sales in 1997 (US$ 66.85 billion for the core copyright industries) were also larger than the U.S. Commerce Department International Trade Administration's estimates of the exports of almost all other leading industry sectors. They exceeded even the combined automobile and automobile parts industries, as well as the agricultural sector.

In an age where knowledge and information become more and more important, and with the advancement of new technologies, transmission systems and distribution channels, a further increase in the production of intellectual property is expected. Therefore, as copyright establishes ownership in intellectual property, it is increasingly seen as the key to wealth in the future.

TEXTBLOCK 18/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438658710
 
Basics: Protected Works

Usually the subject matter of copyright is described as "literary and artistic works" - original creations in the fields of literature and arts. Such works may be expressed in words, symbols, pictures, music, three-dimensional objects, or combinations thereof. Practically all national copyright laws provide for the protection of the following types of works:

Literary works: novels, poems, dramatic works and any other writings, whether published or unpublished; in most countries also computer programs and "oral works"

Musical works

Artistic works: whether two-dimensional or three-dimensional; irrespective of their content and destination

Maps and technical drawings

Photographic works: irrespective of the subject matter and the purpose for which made

Audiovisual works: irrespective of their purpose, genre, length, method employed or technical process used

Some copyright laws also provide for the protection of choreographic works, derivative works (translations, adaptations), and collections (compilations) of works and of mere data (databases), where they, by reason of the selection and arrangement of the contents, constitute intellectual creations. Furthermore, in some countries "works of applied art" (furniture, wallpaper etc.) and computer programs (either as literary works or independently) also constitute copyrightable matter.

Under certain national legislations the notion "copyright" has a wider meaning than "author's rights" and, in addition to literary and artistic works, also extends to the producers of sound recordings, the broadcasters of broadcasts and the creators of distinctive typographical arrangements of publications.


TEXTBLOCK 19/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659538
 
Biometrics applications: gate keeping

Identity has to do with "place". In less mobile societies, the place where a person finds him/herself tells us something about his/her identity. In pre-industrial times, gatekeepers had the function of controlling access of people to particular places, i.e. the gatekeeper's function was to identify people and then decide whether somebody's identity would allow that person to physically occupy another place - a town, a building, a vehicle, etc.

In modern societies, the unambiguous nature of place has been weakened. There is a great amount of physical mobility, and ever since the emergence and spread of electronic communication technologies there has been a "virtualisation" of places in what today we call "virtual space" (unlike place, space has been a virtual reality from the beginning, a mathematical formula). The question as to who one is is no longer coupled to the physical abode. Highly mobile and virtualised social contexts require a new generation of gatekeepers, which biometric technology aims to provide.

TEXTBLOCK 20/26 // URL: http://world-information.org/wio/infostructure/100437611729/100438658757
 
Disinformation and Science

Disinformation's tools emerged from science and art. Disinformation occurs not only in politics, of course, but also in science: for example, by launching ideas which have not been rigorously proven by the moment of publication, e.g. the thought that time runs backwards in parts of the universe:
http://www.newscientist.com/ns/19991127/newsstory3.html

TEXTBLOCK 21/26 // URL: http://world-information.org/wio/infostructure/100437611661/100438658699
 
Late 1960s - Early 1970s: Third Generation Computers

One of the most important advances in the development of computer hardware in the late 1960s and early 1970s was the invention of the integrated circuit, a solid-state device containing hundreds of transistors, diodes, and resistors on a tiny silicon chip. It made possible the production of large-scale computers (mainframes) of higher operating speeds, capacity, and reliability at significantly lower costs.

Another type of computer developed at the time was the minicomputer. It profited from the progress in microelectronics and was considerably smaller than the standard mainframe but, for instance, powerful enough to control the instruments of an entire scientific laboratory. Furthermore, operating systems that allowed machines to run many different programs at once, with a central program monitoring and coordinating the computer's memory, attained widespread use.

TEXTBLOCK 22/26 // URL: http://world-information.org/wio/infostructure/100437611663/100438659498
 
Legal Protection: European Union

Within the EU's goal of establishing a European single market, intellectual property rights are also of significance. Therefore the European Commission aims at harmonizing the respective national laws of the EU member states and at a generally more effective protection of intellectual property on an international level. Over the years it has adopted a variety of Conventions and Directives concerned with different aspects of the protection of industrial property as well as copyright and neighboring rights.

An overview of EU activities relating to intellectual property protection is available on the website of the European Commission (DG Internal Market): http://www.europa.eu.int/comm/internal_market/en/intprop/intprop/index.htm

TEXTBLOCK 23/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659574
 
Commercial vs. Independent Content: Human and Financial Resources

Concerning their human and financial resources, commercial media and independent content providers are an extremely unequal pair. While the 1998 revenues of the world's leading media conglomerates (AOL Time Warner, Disney, Bertelsmann, Viacom and the News Corporation) amounted to US$ 91,144,000,000, providers of independent content usually act on a non-profit basis and to a considerable extent depend on donations and contributions.

The human resources they have at their disposal also differ considerably. Viacom, for example, employs 112,000 people. Alternative media, conversely, are mostly run by a small group of activists, most of them volunteers. Moreover, the majority of the commercial media giants have a multitude of subsidiaries (Bertelsmann, for instance, has operations in 53 countries), while independent content providers in some cases do not even have proper office space. Asked about the number of square meters of their offices, Frank Guerrero from RTMark comments: "We have no square meters at all, because we are only on the web. I guess if you add up all of our servers and computers we would take up about one or two square meters."

TEXTBLOCK 24/26 // URL: http://world-information.org/wio/infostructure/100437611734/100438659145
 
Problems of Copyright Management and Control Technologies

Profiling and Data Mining

At their most basic, copyright management and control technologies might simply be used to provide pricing information, negotiate the purchase transaction, and release a copy of a work for downloading to the customer's computer. Still, from a technological point of view, such systems also have the capacity to be employed for digital monitoring. Copyright owners could for example use the transaction records generated by their copyright management systems to learn more about their customers. Profiles of the purchasers of copyrighted material, in their crudest form consisting of basic demographic information, might be created. Moreover, copyright owners could use search agents or complex data mining techniques to gather more information about their customers that could either be used to market other works or be sold to third parties.
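How little machinery such monitoring requires can be illustrated with a short sketch. The transaction fields and customer records below are purely illustrative assumptions, not the format of any actual copyright management system; the point is only that per-customer profiles fall out of routine transaction logs almost for free.

```python
from collections import defaultdict

# Hypothetical transaction log, as a copyright management system might
# record it when releasing copies of works (field names are assumed).
transactions = [
    {"customer": "c1", "work": "Song A", "genre": "jazz",  "price": 0.99},
    {"customer": "c1", "work": "Song B", "genre": "jazz",  "price": 0.99},
    {"customer": "c2", "work": "Film X", "genre": "drama", "price": 3.99},
]

def build_profiles(records):
    """Aggregate raw transaction records into crude per-customer profiles."""
    profiles = defaultdict(lambda: {"purchases": 0, "spent": 0.0, "genres": set()})
    for r in records:
        p = profiles[r["customer"]]
        p["purchases"] += 1          # how often the customer buys
        p["spent"] += r["price"]     # how much the customer spends
        p["genres"].add(r["genre"])  # what the customer's tastes are
    return dict(profiles)

profiles = build_profiles(transactions)
print(profiles["c1"])  # a marketable profile, built purely as a side effect
```

A few dozen lines more would suffice to join such profiles with demographic data or sell them on, which is precisely the concern raised above.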

Fair Use

Through the widespread use of copyright management and control systems the balance of control could be excessively shifted in favor of the owners of intellectual property. The practice of fair use, currently supported by copyright law, might potentially be restricted or even eliminated. While information in analogue form can easily be reproduced, the protection of digital works through copyright management systems might complicate or make impossible the copying of material for purposes which are explicitly exempt under the doctrine of fair use.

Provisions concerning technological protection measures and fair use are stated in the DMCA, which provides that "Since copying of a work may be a fair use under appropriate circumstances, section 1201 does not prohibit the act of circumventing a technological measure that prevents copying. By contrast, since the fair use doctrine is not a defense to the act of gaining unauthorized access to a work, the act of circumventing a technological measure in order to gain access is prohibited." The proposed EU Directive on copyright and related rights in the information society also contains similar clauses. It distinguishes between the circumvention of technical protection systems for lawful purposes (fair use) and circumvention to infringe copyright. Yet besides a still existing lack of legal clarity, very practical problems also arise. Even if the circumvention of technological protection measures under fair use is allowed, how will an average user without specialized technological know-how be able to gain access to or make a copy of a work? Will the producers of copyright management and control systems provide fair use versions that permit the reproduction of copyrighted material? Or will users only be able to access and copy works if they hold a digital "fair use license" ("fair use licenses" have been proposed by Mark Stefik, whereby holders of such licenses could exercise some limited "permissions" to use a digital work without a fee)?

TEXTBLOCK 25/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659629
 
Exchange of the Text

One of the easiest tools of disinformation is to exchange the words written below a photograph. The entire meaning of the picture can be changed in this way:

- The visit of a school group to a former international camp can change into a camp where children are imprisoned (which happened in the Russian city of Petroskoy in 1944).

- Victims of war can change nationality: the picture of a brutal German soldier in World War II, shown in many newspapers to demonstrate the so-called typical face of a murderer, turned out in other newspapers to be a Frenchman and a victim.

- In 1976 a picture of children in a day-nursery in the GDR was taken: the children, coming out of the shower, were dressed in terry cloth suits with stripes. The same year the photograph of the happily laughing boys and girls won the contest "a beautiful picture". Two years later a small part of the photograph appeared in a Christian magazine in West Germany, supposedly showing children from a concentration camp in the USSR. The smiling faces now seemed to scream. (source: Stiftung Haus der Geschichte der Bundesrepublik Deutschland (ed.): Bilder, die lügen. Begleitbuch zur Ausstellung im Haus der Geschichte der Bundesrepublik Deutschland. Bonn 1998, p. 79)

TEXTBLOCK 26/26 // URL: http://world-information.org/wio/infostructure/100437611661/100438658773
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. Being an experimental network mainly serving the purpose of testing the feasibility of wide area networks and the possibility of remote computing, it was created for resource sharing between research institutions, not for messaging services like e-mail. Although the research was sponsored by the US military, ARPAnet was not designed for directly military use but to support military-related research.

In 1969 ARPAnet went online, linking the first two computers, one of them located at the University of California, Los Angeles, the other at the Stanford Research Institute.

But ARPAnet did not become widely accepted before it was demonstrated in action to a public of computer experts at the First International Conference on Computers and Communication in Washington, D.C., in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers offered the general public access to NSFnet. Beginning in 1995, after having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, and the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 1/8
 
WIPO

The World Intellectual Property Organization is one of the specialized agencies of the United Nations (UN), designed to promote the worldwide protection of both industrial property (inventions, trademarks, and designs) and copyrighted materials (literary, musical, photographic, and other artistic works). It was established by a convention signed in Stockholm in 1967 and came into force in 1970. The aims of WIPO are threefold. First, through international cooperation, WIPO promotes the protection of intellectual property. Second, the organization supervises administrative cooperation between the Paris, Berne, and other intellectual property unions regarding agreements on trademarks, patents, and the protection of artistic and literary works. Third, through its registration activities WIPO provides direct services to applicants for, or owners of, industrial property rights.

INDEXCARD, 2/8
 
Neighboring rights

Copyright laws generally provide for three kinds of neighboring rights: 1) the rights of performing artists in their performances, 2) the rights of producers of phonograms in their phonograms, and 3) the rights of broadcasting organizations in their radio and television programs. Neighboring rights attempt to protect those who assist intellectual creators to communicate their message and to disseminate their works to the public at large.

INDEXCARD, 3/8
 
Gottfried Wilhelm von Leibniz

b. July 1, 1646, Leipzig
d. November 14, 1716, Hannover, Hanover

German philosopher, mathematician, and political adviser, important both as a metaphysician and as a logician and distinguished also for his independent invention of the differential and integral calculus. In 1661 he entered the University of Leipzig as a law student; there he came into contact with the thought of men who had revolutionized science and philosophy - men such as Galileo, Francis Bacon, Thomas Hobbes, and René Descartes. In 1666 he wrote De Arte Combinatoria ("On the Art of Combination"), in which he formulated a model that is the theoretical ancestor of some modern computers.

INDEXCARD, 4/8
 
Royal Dutch/Shell Group

One of the world's largest corporate entities in sales, consisting of companies in more than 100 countries, whose shares are owned by NV Koninklijke Nederlandsche Petroleum Maatschappij (Royal Dutch Petroleum Company Ltd.) of The Hague and by the "Shell" Transport and Trading Company, PLC, of London. Below these two parent companies are two holding companies, Shell Petroleum NV and the Shell Petroleum Company Limited, whose shares are owned 60 percent by Royal Dutch and 40 percent by "Shell" Transport and Trading. The holding companies, in turn, hold shares in and administer the subsidiary service companies and operating companies around the world, which engage in oil, petrochemical, and associated industries, from research and exploration to production and marketing. Several companies also deal in metals, nuclear energy, solar energy, coal, and consumer products.

INDEXCARD, 5/8
 
Automation

Automation is concerned with the application of machines to tasks once performed by humans or, increasingly, to tasks that would otherwise be impossible. Although the term mechanization is often used to refer to the simple replacement of human labor by machines, automation generally implies the integration of machines into a self-governing system. Automation has revolutionized those areas in which it has been introduced, and there is scarcely an aspect of modern life that has been unaffected by it. Nearly all industrial installations of automation, and in particular robotics, involve a replacement of human labor by an automated system. Therefore, one of the direct effects of automation in factory operations is the dislocation of human labor from the workplace. The long-term effects of automation on employment and unemployment rates are debatable. Most studies in this area have been controversial and inconclusive. As of the early 1990s, there were fewer than 100,000 robots installed in American factories, compared with a total work force of more than 100 million persons, about 20 million of whom work in factories.

INDEXCARD, 6/8
 
Moral rights

Authors of copyrighted works (besides economic rights) enjoy moral rights on the basis of which they have the right to claim their authorship and require that their names be indicated on the copies of the work and in connection with other uses thereof. Moral rights are generally inalienable and remain with the creator even after he has transferred his economic rights, although the author may waive their exercise.

INDEXCARD, 7/8
 
water-clocks

The water-clocks were an early long-distance communication system. Each communicating party had exactly the same jar, with a same-sized hole that was initially closed, and the same amount of water in it. In the jar stood a stick with different messages written on it. When one party wanted to tell something to the other, it made a fire sign. When the other answered, both of them opened the hole at the same time and, with the help of another fire sign, closed it again at the same time. In the end the water had sunk to the level of the intended message on the stick.
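The signalling principle described above can be sketched in a few lines: because both jars are identical, draining for the same agreed duration leaves the water at the same level, which points to the same message on the stick. The message list, drain rate and message spacing below are illustrative assumptions, not historical values.

```python
# Messages written on the stick, from top (drains first) to bottom.
MESSAGES = ["enemy approaching", "send supplies", "all is well", "retreat"]
DRAIN_RATE = 1.0     # water-level units drained per time unit (assumed)
LEVEL_PER_MSG = 2.0  # vertical spacing of messages on the stick (assumed)

def message_at(open_duration):
    """Which message the water level indicates after draining for a while."""
    drained = DRAIN_RATE * open_duration
    index = int(drained // LEVEL_PER_MSG)
    return MESSAGES[min(index, len(MESSAGES) - 1)]

# Both parties open and close their holes for the same duration,
# coordinated by the fire signs, and thus read off the same message.
sender_reads = message_at(5.0)
receiver_reads = message_at(5.0)
assert sender_reads == receiver_reads
print(sender_reads)  # "all is well"
```

The scheme transmits no water and no words, only timing: the shared jar plays the role of a pre-agreed codebook, and the two fire signs synchronize the clock.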

INDEXCARD, 8/8