Global Data Flows

Fiber-optic cables, coaxial cables, copper wires, electric power lines, microwaves, satellite communication, mobile telephony, computer networks: telecommunication networks of many kinds, following a variety of standards with bewildering abbreviations - DSL, WAP, GSM, UMTS, IPv4 etc. - and carrying endless flows of capital and information, are the veins of modern societies.

In the space of flows constituted by today's global data networks the space of places is transcended. Visualizations of these global data flows show arches bridging seas and continents, thereby linking the world's centres of research and development, economics and politics. In the global "Network Society" (Manuel Castells) the traditional centres of power and domination are not discarded; on the contrary, they are strengthened and reinforced by the use of information and communication technologies. Political, economic and symbolic power becomes increasingly linked to the use of modern information and communication technologies. The most sensitive and advanced centres of information and communication technologies are the stock markets. Excluded from the network constituted by modern information and communication technologies, large parts of Africa, Asia and South America, but also the poor of industrialized countries, are becoming increasingly marginal to the world economy.

Cities are centres of communication, trade and power. The higher the percentage of urban population, the more likely it is that the telecommunications infrastructure is good to excellent; this goes hand in hand with lower telecommunications costs. The parts of the world with the poorest infrastructure are also the world's poorhouse. For most of the population of Bangladesh a personal computer is as unaffordable as a limousine is for Europeans: where a European pays roughly one month's salary for a computer, they have to pay eight annual salaries. Telecommunications infrastructure is therefore concentrated in the highly industrialized world: most telephone mainlines, mobile telephones, computers, Internet accounts and Internet hosts (computers connected to the global data networks) can be found here. The same applies to media: the daily circulation of newspapers and the use of TV sets and radios. Telecommunication and media services affordable to most of the population are largely restricted to industrialized countries.

This situation will not change in the foreseeable future: Most expenditure for telecommunications infrastructure will be restricted to the richest countries in the world. In 1998, the world's richest countries consumed 75% of all cables and wires.

TEXTBLOCK 1/31 // URL: http://world-information.org/wio/infostructure/100437611791/100438658776
 
1500 - 1700 A.D.

1588
Agostino Ramelli's reading wheel

Agostino Ramelli designed a "reading wheel", which allowed browsing through a large number of documents without moving from one spot to another.

The device presented a large number of books - a small library - laid open on lecterns mounted on a kind of Ferris wheel. It allowed skipping chapters and browsing through pages by turning the wheel to bring lectern after lectern before the eyes. Ramelli's reading wheel thus linked ideas and texts and is reminiscent of today's browsing software used to navigate the World Wide Web.

1597
The first newspaper is printed in Europe.

TEXTBLOCK 2/31 // URL: http://world-information.org/wio/infostructure/100437611796/100438659704
 
Content as Transport Medium for Values and Ideologies

With the dissemination of their content, commercial media are, among other things, also able to transport values and ideologies. Usually their programming reflects society's dominant social, political, ethical, cultural and economic values. A critical view of the prevalent ideologies is often sacrificed so as not to offend the existing political elites and corporate powers, and to satisfy shareholders and advertisers instead.

With most of the world's content produced by a few commercial media conglomerates, and with the overwhelming majority of these companies (in terms of revenue generation) concentrated in Europe, the U.S., Japan and Australia, there is also a strong flow of content from the 'North-West' to the 'South-East'. Popular culture developed in the world's dominant commercial centers, together with Western values and ideologies, is thus disseminated into the most distant corners of the earth, with far less coming back.

TEXTBLOCK 3/31 // URL: http://world-information.org/wio/infostructure/100437611795/100438659066
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of the networks have to agree on. So there is no way to ever gain ultimate control of the Matrix/Internet.
The partly decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and no common standards for sustaining basic operations: there are, for example, authorities for IP number and domain name registration.
Over time, the organizational structures for Internet administration have changed according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies decide cooperatively on common guidelines, such as communication protocols, so that compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consensus about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are the result of individual efforts and mostly just state the prevailing consensus explicitly. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulations, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.
For a detailed report on Internet governance, click here.

TEXTBLOCK 4/31 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille: a card with holes, through which the secret message is written onto a sheet of paper. Afterwards the card is removed and the blanks are filled in, so the message looks like an ordinary letter. The recipient needs to own the same card to read it.

- Bishop John Wilkins invents a cryptologic system that looks like musical notes. In a book he describes several steganographic systems, such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken form of encryption that was already used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted manuscript that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one suspects that the text could be enciphered. For this the steganogram, very often used in poems, was the best method. Attempts to decipher Shakespeare's sonnets in the 20th century led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine; he also works out the binary number system which - in far more advanced form - today's computers and character codes such as ASCII still build on

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which actually is not a code but an alphabet enciphered as short and long signals. The first Morse code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes are absolutely necessary by then

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski carries out a cryptanalysis of the Vigenère ciphers, which had been supposed to be uncrackable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes cryptographic tasks once again and makes them even more important

TEXTBLOCK 5/31 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
ZNet

ZNet provides forum facilities for online discussion and chatting on various topics ranging from culture and ecology to international relations and economics. ZNet also publishes daily commentaries and maintains a Web-zine, which addresses current news and events as well as many other topics, trying to be provocative, informative and inspiring to its readers.

Strategies and Policies

Daily Commentaries: Znet's commentaries address current news and events, cultural happenings, and organizing efforts, providing context, critique, vision, and analysis, but also references to or reviews of broader ideas, new books, activism, the Internet, and other topics that strike the diverse participating authors as worthy of attention.

Forum System: ZNet provides a private (and soon also a public) forum system. The fora are concerned with topics such as activism, culture, community/race/religion/ethnicity, ecology, economics/class, gender/kinship/sexuality, government/polity, international relations, ParEcon, vision/strategy and popular culture. Each forum has a set of threaded discussions; this also applies to the fora hosted by commentary writers such as Chomsky, Ehrenreich, Cagan, Peters and Wise.

ZNet Daily WebZine: ZNet Daily WebZine offers commentaries in web format.

Z Education Online (planned): The Z Education Online site will provide instructionals and courses of diverse types as well as other university-like, education-aimed features.

TEXTBLOCK 6/31 // URL: http://world-information.org/wio/infostructure/100437611734/100438659288
 
Further Tools: Photography

Art has always contributed a lot to disinformation.
Many modern tools for disinformation are used in art/photography.
Harold D. Lasswell once stated that propaganda was cheaper than violence. Today this is no longer true. Technology has created new tools for propaganda and disinformation - and they are expensive. But by now the possibilities for manipulating pictures and stories have gone so far that it can be difficult to tell the difference between an original and a manipulation.

Trillions of photographs have been taken in the 20th century - too many to look at, too many to control them and their use. A paradise for manipulation.
We have to keep in mind: there is the world, and there exist pictures of the world, which does not mean that both are the same thing. Photographs are not objective, because the photographer selects the part of the world that becomes the picture. The rest is left out.

Some tools for manipulation of photography are:



Some of these are digital means of manipulation, which allow pictures to be changed in many ways without the manipulation showing.

Pictures taken from the Internet could be anything and come from anywhere; proving the source is nearly impossible. Therefore scientists have been working on watermarks for pictures, which are meant to make it impossible to "steal" or manipulate a picture taken from the net unnoticed.
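
A very small sketch can make the watermarking idea more concrete. The following Python fragment is purely illustrative and assumes 8-bit grayscale pixel values: it hides a few bits in the least significant bit of each pixel (so-called LSB watermarking), one simple technique behind such marks. The real watermarking systems referred to above are far more elaborate and are designed to survive cropping, scaling and recompression.

    # Toy least-significant-bit (LSB) watermark: hide a short bit sequence in
    # the lowest bit of each 8-bit grayscale pixel value. Illustrative only.
    def embed_watermark(pixels, mark_bits):
        marked = list(pixels)
        for i, bit in enumerate(mark_bits):
            marked[i] = (marked[i] & 0xFE) | bit   # clear the lowest bit, then set it
        return marked

    def extract_watermark(pixels, length):
        return [p & 1 for p in pixels[:length]]

    image = [200, 13, 57, 255, 34, 90, 128, 7]     # hypothetical pixel values
    mark = [1, 0, 1, 1, 0, 0, 1, 0]
    stamped = embed_watermark(image, mark)
    assert extract_watermark(stamped, len(mark)) == mark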

TEXTBLOCK 7/31 // URL: http://world-information.org/wio/infostructure/100437611661/100438658730
 
Legal Protection: TRIPS (Trade-Related Aspects of Intellectual Property Rights)

Another important multilateral treaty concerned with intellectual property rights is the TRIPS agreement, which was negotiated during the Uruguay Round and entered into force with the establishment of the WTO in January 1995. It sets minimum standards for the national protection of intellectual property rights and procedures as well as remedies for their enforcement (enforcement measures include the potential for trade sanctions against non-complying WTO members). The TRIPS agreement has been widely criticized for its stipulation that biological organisms be subject to intellectual property protection. In 1999, 44 nations considered it appropriate to treat plant varieties as intellectual property.

The complete TRIPS agreement can be found on: http://www.wto.org/english/tratop_e/trips_e/t_agm1_e.htm

TEXTBLOCK 8/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659758
 
Biometrics applications: physical access

This is the largest area of application of biometric technologies, and the one with the most direct lineage to the feudal gate-keeping system. Initially used mainly in military and other "high security" territories, physical access control by biometric technology is spreading into a much wider field of application. Biometric access control technologies are already being used in schools, supermarkets, hospitals and commercial centres, where they are used to manage the flow of personnel.

Biometric technologies are also used to control access to political territory, as in immigration (airports, Mexico-USA border crossing). In this case, they can be coupled with camera surveillance systems and artificial intelligence in order to identify potential suspects at unmanned border crossings. Examples of such uses in remote video inspection systems can be found at http://www.eds-ms.com/acsd/RVIS.htm

A gate keeping system for airports relying on digital fingerprint and hand geometry is described at http://www.eds-ms.com/acsd/INSPASS.htm. This is another technology which allows separating "low risk" travellers from "other" travellers.

An electronic reconstruction of feudal gate keeping capable of singling out high-risk travellers from the rest is already applied at various border crossing points in the USA. "All enrolees are compared against national lookout databases on a daily basis to ensure that individuals remain low risk". As a side benefit, the economy of time generated by the inspection system has meant that "drug seizures ... have increased since Inspectors are able to spend more time evaluating higher risk vehicles".

However, biometric access control can not only prevent people from gaining access to a territory or building; it can also prevent them from getting out of buildings, as in the case of prisons.

TEXTBLOCK 9/31 // URL: http://world-information.org/wio/infostructure/100437611729/100438658838
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking in size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with the advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both, developed for atomic energy laboratories, could handle enormous amounts of data, but were still costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, which gave them the flexibility to be cost effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 10/31 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, often through matrilineal descent.


TEXTBLOCK 11/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services computers have become communication devices instead of being solely computational or typewriter-like devices. Corporate networks become increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests draw further distinctions by splitting the easily understandable term "proprietary networks" up into terms that require explanation, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online is selling its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising because "organizing" is increasingly equated with "filtering" of seemingly objectionable messages and "rating" of content. For more information on these issues, click here. If you want to know more about the technical nature of computer networks, here is a link to the corresponding article in the Encyclopaedia Britannica.

Especially for financial transactions, secure proprietary networks are becoming increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated to be above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in the trading of equities. They offer faster trading at reduced costs and better prices on trades for brokers and institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 12/31 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
The 2nd Chechnya-War

In the summer of 1999 between 1,200 and 2,000 Muslim rebels from Chechnya invaded Dagestan. Rumors say that Russian soldiers closed their eyes, pretending not to see anything. During the fighting that soon broke out, many people were killed. The whole issue was blamed on Chechnya.
At that time there were rumors that there would be heavy bombings in Moscow in September. And there were. Those two things together brought back the hatred against the Chechen rebels. The second war between Russia and the Muslim republic began. While the first war was lost at home, because the Russians, especially mothers, did not understand why their sons should fight against Chechnya, this time the atmosphere was completely different: in the cities 85%, and across Russia 65%, of the population agreed with the war. This time the war was a national issue, a legitimate defense.
The media emphasized this.
Alexander Zilin, a journalist, found out that the truth was far from the one presented in the media. First of all, there was no evidence that the Moscow bombings were organized by Chechens. On the contrary, it is more than probable that the crimes were organized by a governmental institution for national security. The disinformation was part of the strategy to make the population support another war with Chechnya. The media were part of the story, perhaps without knowing it. They kept to the government's and the army's side, showing only selected and patriotic parts of the war. For example, the number of dead Russian soldiers was held back.

The U.S. position on this:
The USA would like to intervene but is afraid of ruining its fragile relationship with Russia. For years the main topic of U.S. politics has been the struggle against terrorism. Now Russia claims to be fighting terrorism. How could it be criticized for that?

The reason for this war is rather cynical: it worked as a public relations campaign for Vladimir Putin, candidate in the presidential elections of 2000. When Putin came to power as prime minister of Russia in August 1999, opinion polls gave him 2% for the elections in summer 2000. By the end of November he already had 46%! And finally he won. The public relations war had worked well.
At the same time a propaganda campaign against his rival Y. Primakov, formerly the most popular candidate, spread lies and bad rumors. Opinion polls showed very quickly, even before the elections took place, that he had lost them because of this black propaganda.

TEXTBLOCK 13/31 // URL: http://world-information.org/wio/infostructure/100437611661/100438658639
 
Timeline 1900-1970 AD

1913 the wheel cipher gets re-invented as a strip

1917 William Frederick Friedman starts working as a cryptanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptanalysis

- an AT&T employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random keys
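
As a rough illustration of the Vernam principle (not of the original teletype machine), the following Python sketch combines every byte of the message with a freshly generated random key byte; applying the same key again restores the text. All names in it are made up for the example.

    # Minimal sketch of the Vernam / one-time pad idea: XOR each byte with a
    # random key byte of the same length; XORing again with the key decrypts.
    import secrets

    def vernam_encrypt(plaintext):
        key = secrets.token_bytes(len(plaintext))
        ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
        return ciphertext, key

    def vernam_decrypt(ciphertext, key):
        return bytes(c ^ k for c, k in zip(ciphertext, key))

    ciphertext, key = vernam_encrypt(b"ATTACK AT DAWN")
    assert vernam_decrypt(ciphertext, key) == b"ATTACK AT DAWN"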

1918 the Germans start using the ADFGVX system, which is later broken by the Frenchman Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German Military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine for the German Military

late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizebeth Smith Friedman regularly deciphers the codes of rum-smugglers during Prohibition

1929 Lester S. Hill publishes his paper Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken by the Polish mathematician Marian Rejewski and, from 1939, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese invent their so-called Purple machine with the help of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they assumed their own codes to be unbreakable as well - and were not careful enough.

1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code breaking computer is put into action at Bletchley Park

1943-1980 the cryptographic Venona Project, run by the NSA, lasts longer than any other program of that type

1948 Claude Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops a system of separate public and private keys

TEXTBLOCK 14/31 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
Legal Protection: European Union

Intellectual property rights are also of significance for the EU's goal of establishing a European single market. The European Commission therefore aims at the harmonization of the respective national laws of the EU member states and at a generally more effective protection of intellectual property at the international level. Over the years it has adopted a variety of Conventions and Directives concerned with different aspects of the protection of industrial property as well as copyright and neighboring rights.

An overview of EU activities relating to intellectual property protection is available on the website of the European Commission (DG Internal Market): http://www.europa.eu.int/comm/internal_market/en/intprop/intprop/index.htm

TEXTBLOCK 15/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659574
 
Enforcement: Copyright Management and Control Technologies

The increased ease with which unauthorized copies of digital works can be reproduced and transmitted over electronic networks has raised concerns among the copyright holder community. Copyright holders fear a further growth of copyright piracy and demand adequate protection for their works. A development that started in the mid-1990s and addresses the copyright owners' apprehensions is the creation of copyright management systems. Technological protection for their works, the copyright industry argues, is necessary to prevent widespread infringement, thus giving them the incentive to make their works available online. In their view the ideal technology should be "capable of detecting, preventing, and counting a wide range of operations, including open, print, export, copying, modifying, excerpting, and so on." Additionally such systems could be used to maintain "records indicating which permissions have actually been granted and to whom".
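
To make the quoted requirements a little more tangible, here is a deliberately naive Python sketch of such a record: it remembers which permissions have been granted to whom and counts the operations actually performed. It is an assumption-laden toy (the work identifier, user names and operations are invented), not a description of any real copyright management system.

    # Toy rights-management record: granted permissions plus operation counters.
    from collections import Counter

    class UsageRecord:
        def __init__(self, work_id):
            self.work_id = work_id
            self.permissions = {}            # user -> set of allowed operations
            self.operation_counts = Counter()

        def grant(self, user, operation):
            self.permissions.setdefault(user, set()).add(operation)

        def request(self, user, operation):
            allowed = operation in self.permissions.get(user, set())
            if allowed:
                self.operation_counts[operation] += 1   # count open, print, copy, excerpt, ...
            return allowed

    record = UsageRecord("essay-42")            # hypothetical work identifier
    record.grant("alice", "open")
    assert record.request("alice", "open") is True
    assert record.request("alice", "print") is False   # never granted, so refused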

TEXTBLOCK 16/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659674
 
Legal Protection: National Legislation

Intellectual property - comprising industrial property and copyright - in general is protected by national legislation. Therefore those rights are limited territorially and can be exercised only within the jurisdiction of the country or countries under whose laws they are granted.

TEXTBLOCK 17/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659540
 
Basics: Infringement and Fair Use

The rights of a copyright holder are infringed when one of the acts requiring the authorization of the owner is done by someone else without his consent. In the case of copyright infringement or the violation of neighboring rights the remedies for the copyright owner consist of civil redress. The unauthorized copying of protected works for commercial purposes and the unauthorized commercial dealing in copied material is usually referred to as "piracy".

Yet copyright laws also provide that the rights of copyright owners are subject to the doctrine of "fair use". That allows the reproduction and use of a work, notwithstanding the rights of the author, for limited purposes such as criticism, comment, news reporting, teaching, and research. Fair use may be described as the privilege to use the copyrighted material in a reasonable manner without the owner's consent. To determine whether a use is fair or not most copyright laws consider:

- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes (usually certain types of educational copying are allowed)

- the nature of the copyrighted work (mostly originals made for commercial reasons are less protected than their purely artistic counterparts)

- the amount and substantiality of the portion used in relation to the copyrighted work as a whole

- the effect of the use upon the potential market for or value of the copyrighted work (as a general rule copying may be permitted if it is unlikely to cause economic harm to the original author)

Examples of activities that may be excused as fair use include: providing a quotation in a book review; distributing copies of a section of an article in class for educational purposes; and imitating a work for the purpose of parody or social commentary.

TEXTBLOCK 18/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659569
 
Challenges for Copyright by ICT: Introduction

Traditional copyright and the practice of paying royalties to the creators of intellectual property emerged with the introduction of the printing press (1456). Early copyright law was therefore tailored to the technology of print and the (re)production of works in analogue form. Over the centuries legislation concerning the protection of intellectual property has been adapted several times in order to respond to the technological changes in the production and distribution of information.

Yet again, new technologies have altered the way in which (copyrighted) works are produced, copied, made obtainable and distributed. The emergence of global electronic networks and the increased availability of digitized intellectual property confront existing copyright with a variety of questions and challenges. The combination of several types of works within one larger work or on one data carrier, the digital format (although a recent development, it has already been the object of detailed legal scrutiny), and networking (telephone and cable networks have been in use for a long time, although they do not permit interactivity) are nothing really new. What is novel is that recent technologies allow text, sound and visual information to be presented and stored in digital form, so that all of this information can be generated, altered and used by and on one and the same device, irrespective of whether it is provided online or offline.


TEXTBLOCK 19/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659517
 
Transparent customers. Direct marketing online



This process works even better on the Internet because of the latter's interactive nature. "The Internet is a dream to direct marketers", said Wil Lansing, CEO of the American retailer Fingerhut Companies. Many services require users to register online and to provide as much information about themselves as possible. And in addition, the Internet is fast, cheap and used by people who tend to be young and on the lookout for something interesting.

Many web sites are also equipped with user tracking technology that registers a user's behaviour and preferences during a visit. For example, user tracking technology is capable of identifying the equipment and software employed by a user, as well as movements on the website, links visited, etc. Normally such information is anonymous, but it can be personalised when it is coupled with online registration, or when personal identification has been obtained from other sources. Registration is often a prerequisite not just for obtaining a free web mail account, but also for other services, such as personalised start pages. Based on the information provided by the user, the start page will then include advertisements and commercial offers that correspond to the user's profile, or to the user's activity on the website.
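
A minimal sketch may show how little code such tracking needs. The small WSGI application below is hypothetical and heavily simplified: it merely records the requested page and the browser's User-Agent header under a cookie-based visitor ID. Real tracking systems add time stamps, referrers, registration data and much more.

    # Minimal server-side tracking sketch: log page and User-Agent per visitor cookie.
    import uuid
    from http.cookies import SimpleCookie

    profiles = {}   # visitor id -> list of (page, user agent)

    def track(environ, start_response):
        cookie = SimpleCookie(environ.get("HTTP_COOKIE", ""))
        visitor = cookie["vid"].value if "vid" in cookie else uuid.uuid4().hex
        profiles.setdefault(visitor, []).append(
            (environ.get("PATH_INFO", "/"), environ.get("HTTP_USER_AGENT", "unknown")))
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Set-Cookie", "vid=" + visitor)])
        return [b"tracked"]

    # To try it locally: from wsgiref.simple_server import make_server
    # make_server("", 8000, track).serve_forever()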

One frequent way of obtaining such personal information is through the free web mail accounts offered by a great many companies, internet providers and web portals (e.g. Microsoft, Yahoo, Netscape and many others). In most cases, users get "free" accounts in return for submitting personal information and agreeing to receive marketing mails. Free web mail accounts are a simple and effective direct marketing and data capturing strategy which is, however, rarely understood as such. The alliances formed between direct advertising and marketing agencies on the one hand, and web mail providers on the other, such as the one between DoubleClick and Yahoo, show the common logic of data capturing and direct marketing. The alliance between DoubleClick and Yahoo eventually attracted the USA's largest direct marketing agency, Abacus Direct, which ended up merging with DoubleClick.

However, the project of collecting users' personal data and creating consumer profiles based on online behaviour can also take on more creative and playful forms. One such example is sixdegrees.com. This is a networking site based on the assumption that everybody on the planet is connected to everybody else by a chain of at most six people. The site offers users the chance to get to know a lot of new people - the friends of their friends of their friends, for example - and, if they try hard enough, eventually Warren Beatty or Claudia Schiffer. But of course, in order to make the whole game more useful for marketing purposes, users are encouraged to join groups which share common interests, groups which are identical with marketing categories ranging from arts and entertainment to travel and holiday. Evidently, the game becomes more interesting the more new people a user brings into the network. What seems to be fun for the 18 to 24 year old college student segment targeted by sixdegrees is, of course, real business. While users entertain themselves they are being carefully profiled. After all, data about young people who can be expected to be relatively affluent one day are worth more than money.

The particular way in which sites such as sixdegrees.com and others are structured means not only that users provide initial information about themselves, but also that this information is constantly updated and therefore becomes even more valuable. Consequently, many free online services or web mail providers cancel a user's account if it has not been used for some time.

There are also other online services which offer free services in return for personal information which is then used for marketing purposes, e.g. Yahoo's Geocities, where users may maintain their own free websites, or Bigfoot, where people are offered a free e-mail address for life that acts as a relay whenever a customer's residence or e-mail address changes. In this way, of course, the marketers can identify friendship and other social networks, and turn this knowledge into a marketing advantage. People finders such as WhoWhere? operate along similar lines.

A further way of collecting consumer data that has recently become popular is offering free PCs. Users are provided with a PC for free or for very little money, and in return commit themselves to using certain services rather than others (e.g. a particular internet provider), to providing information about themselves, and to having their online behaviour monitored by the company providing the PC, so that accurate user profiles can be compiled. For example, the Free PC Network offers advertisers user profiles containing "over 60 individual demographics". There are literally thousands of variations of how a user's data are extracted and commercialised when online. Usually this happens quietly in the background.

A good inside view of the world of direct marketing can be gained at the website of the American Direct Marketing Association and the Federation of European Direct Marketing.

TEXTBLOCK 20/31 // URL: http://world-information.org/wio/infostructure/100437611761/100438659667
 
Global hubs of the data body industry

While most data bunkers are restricted to particular areas or contexts, there are others which act as global data nodes. Companies such as EDS (Electronic Data Systems), Experian, First Data Corporation and Equifax operate globally and run giant databases containing personal information. They are the global hubs of the data body economy.

Company                     Sales in USD billions     Size of client database in million datasets

Equifax                     1.7                       360
Experian                    1.5                       779
First Data Corporation      5.5                       260
EDS                         18.5                      (not disclosed)

(Sales and database sizes, 1998)

The size of these data repositories is constantly growing, so it is only a matter of time until everybody living in the technologically saturated part of the world is registered in one of these data bunkers.

Among these companies, EDS, founded by the former US presidential candidate Ross Perot, known for his right-wing views and direct language, is of particular importance. Not only is it the world's largest data body company, it is also secretive about the size of its client database - a figure disclosed by the other companies either in company publications or upon enquiry. After all, the size of such a database makes a company more attractive to potential customers.

For many years, EDS has been surrounded by rumours concerning sinister involvement with intelligence agencies. Beyond the rumours, though, there are also facts. EDS has a special division for government services. EDS does business with all military agencies of the US, as well as law enforcement agencies, justice agencies, and many others. The company also maintains a separate division for military equipment. In 1984, the company became a subsidiary of General Motors, itself a leading manufacturer of military and intelligence systems. EDS is listed by the Federation of American Scientists' intelligence resource program as a contractor to US intelligence agencies, and prides itself, amongst other things, on responding to the "rise of the citizen as a consumer".

TEXTBLOCK 21/31 // URL: http://world-information.org/wio/infostructure/100437611761/100438659778
 
It is always the others

Disinformation is supposed to be something evil, something ethically incorrect, and therefore we prefer to connect it with the past or with political systems other than those of the Western hemisphere. It is always the others who work with disinformation. The same is true for propaganda.
Even better if we can assign it to the past: Adolf Hitler, supposedly one of the world's greatest and most horrible propagandists (together with his Reichsminister für Propaganda Josef Goebbels), did not invent modern propaganda either. He took his knowledge from the British example during World War I, the actual invention of modern propaganda. And it was in Hitler's Reich that (racist) propaganda and disinformation were developed into a perfect tool of manipulation, in a way whose consequences are still at work today.
A war loses the support of the people if it is being lost. It is therefore extremely important to create a feeling that the war is being won and never to give up the emotion of victory. Governments know this and work hard at keeping the mood up. The Germans worked very hard at this in the last months of World War II.
But in the 1990s the disinformation and propaganda business came back to life (if it had ever gone out of sight) with Iraq's invasion of Kuwait and the reactions of democratic states. After the war, reports made it visible that not much had happened the way we had been told it had. Regarded in this light, the Gulf War was the end of the New World Order, a better and geographically broader democratic order that had only pretended to begin.

TEXTBLOCK 22/31 // URL: http://world-information.org/wio/infostructure/100437611661/100438658640
 
What is the Internet?

Each definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, universities, etc. Generally speaking, every time computer networks are connected to one another, a new internet is created. Technically speaking, the Internet is a wide area network (WAN) that may be connected to local area networks (LANs).

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different to the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not solely constituted by computers connected to other computers, because there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components that are connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than to say what it is. On 24 October 1995 the U.S. Federal Networking Council made the following resolution concerning the definition of the term "Internet": "Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally and in a simplifying manner called the Internet may be better referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: the USENET, corporate networks, and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means of making information easily accessible worldwide, of sending and receiving messages, videos, texts and audio files, of transferring funds and trading securities, of sharing resources, of collecting weather condition data, of trailing the movements of elephants, of playing games online, of video conferencing, of distance learning, of virtual exhibitions, of jamming with other musicians, of long distance ordering, of auctions, of tracking packaged goods, of doing business, of chatting, and of remote access to computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving, and storing data and for sharing equipment over long distances and, if the telecommunication infrastructure is reliable and of high quality, even in real time.

For a comprehensive view of uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace".

TEXTBLOCK 23/31 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
Global data bodies - intro

- Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity ...

Critical Art Ensemble

Global data bodies

1. Introduction

Informatisation has meant that things that once were "real", i.e. whose existence could be experienced sensually, are becoming virtual. Instead of the real existence of a thing, the virtual refers to its possibility of existence. As this process advances, an increasing identification of the possible with the real occurs. Reality migrates into a dim and dematerialised grey area. In the end, the possible counts as the real; virtualisation creates an "as-if" experience.

The experience of the body is also affected by this process. For example, in bio-technology, the human body and its functions are digitised, which prepares an understanding of the body exclusively in terms of its potential manipulation: the body becomes whatever it could be. But digitisation has not only affected the understanding and the social significance of the body, it has also altered the meaning of presence, traditionally identified with the body. The advance of information and communication technologies (ICTs) has meant that for an increasing number of activities we no longer need to be physically present; our "virtual" presence, achieved by logging onto an electronic information network, is sufficient.

This development, trumpeted as the pinnacle of convenience by the ICT industries and governments interested in attracting investment, has deeply problematic aspects as well. For example, when it is no longer "necessary" to be physically present, it may soon no longer be possible or allowed. Online banking, offered to customers as a convenience, also serves as a justification for charging higher fees to those unwilling or unable to add banking to their household chores. Online public administration may be expected to lead to similar effects. The reason for this is that the digitalisation of the economy relies on the production of surplus data. Data have become the most important raw material of modern economies.

In modern economies, informatisation and virtualisation mean that people are structurally forced to carry out their business and live their lives in such a way as to generate data.

Data are the most important resource for the New Economy. By contrast, activities which do not leave behind a trace of data, as for example growing your own carrots or paying cash rather than by plastic card, are discouraged and structurally suppressed.

TEXTBLOCK 24/31 // URL: http://world-information.org/wio/infostructure/100437611761/100438659649
 
Timeline BC

~ 1900 BC: Egyptian writers use non-standard Hieroglyphs in inscriptions of a royal tomb; supposedly this is not the first but the first documented example of written cryptography

1500 an enciphered formula for the production of pottery is written down in Mesopotamia

parts of the Hebrew text of Jeremiah's words are written down in "atbash", which is nothing other than a reversed alphabet and one of the first famous methods of enciphering

4th century Aeneas Tacticus improves signalling by beacons by introducing a sort of water clock

487 the Spartans introduce the so called "skytale" for sending short secret messages to and from the battle field

170 Polybius develops a system to convert letters into numerical characters, an invention called the Polybius Chequerboard.

50-60 Julius Caesar develops an enciphering method, later called the Caesar cipher, which shifts each letter of the alphabet by a fixed amount. Like atbash, this is a monoalphabetic substitution.
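
A small Python sketch shows how simple such a monoalphabetic substitution is; the same structure, with a reversed alphabet instead of a shifted one, yields atbash. The example is illustrative and handles upper-case Latin letters only.

    # Caesar cipher: shift every letter of the alphabet by a fixed amount.
    import string

    def caesar(text, shift):
        alphabet = string.ascii_uppercase
        shifted = alphabet[shift:] + alphabet[:shift]
        return text.upper().translate(str.maketrans(alphabet, shifted))

    assert caesar("VENI VIDI VICI", 3) == "YHQL YLGL YLFL"
    assert caesar(caesar("VENI VIDI VICI", 3), -3) == "VENI VIDI VICI"   # shifting back decrypts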

TEXTBLOCK 25/31 // URL: http://world-information.org/wio/infostructure/100437611776/100438659084
 
0 - 1400 A.D.

150
A smoke signals network covers the Roman Empire

The Roman smoke signals network consisted of towers within a visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.
For a similar telegraph network in ancient Greece see Aeneas Tacticus' optical communication system.

About 750
In Japan block printing is used for the first time.

868
In China the world's first dated book, the Diamond Sutra, is printed.

1041-1048
In China moveable types made from clay are invented.

1088
First European medieval university is established in Bologna.

The first of the great medieval universities was established in Bologna. At the beginning universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify one for an invitation to a university. Holding a lecture amounted to reading a book aloud, the way a priest read from the Bible during services. Attending a lecture amounted to copying the lecture word by word, so that you had your own copy of the book, which in turn enabled you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

TEXTBLOCK 26/31 // URL: http://world-information.org/wio/infostructure/100437611796/100438659702
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques almost impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because ever since the inception of the ARPANet, the predecessor of the Internet, network measurement has been an important task. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. This was done for two reasons: first, measurement would be critical for future development, evolution and deployment planning; second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends and to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, and the private line networks are considerably larger in aggregate capacity than the Internet - about as large as the U.S. voice network (with an effective bandwidth of about 330 Gbps in December 1997) - although they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications, because these consume more bandwidth; both lead to unanticipated amounts of data traffic.
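
The "overtaking" argument is simple compound growth, as the following toy calculation illustrates. The starting values are placeholders chosen for the example, not the figures from Coffman and Odlyzko's paper: data traffic is assumed to double every year (about 100% growth) while voice traffic grows slowly.

    # Toy compound-growth comparison behind the "data overtakes voice" reasoning.
    data, voice, year = 10.0, 100.0, 1997    # hypothetical starting values, arbitrary units
    while data < voice:
        data *= 2.0      # ~100% annual growth of Internet traffic
        voice *= 1.1     # slow growth of voice traffic
        year += 1
    print(year)          # with these placeholders, data traffic passes voice in 2001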

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection for all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
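
The projection step itself is plain proportional arithmetic. The Python sketch below uses invented figures - the ISC publishes its actual numbers and methodology on its site - and merely illustrates how a 1% ping sample is scaled up to an estimate of all reachable hosts.

    def estimate_reachable_hosts(named_addresses, sampled, responded):
        """Project the number of reachable ('pingable') hosts from a small sample.

        named_addresses -- total number of IP addresses that have a name assigned
        sampled         -- number of addresses actually pinged (e.g. a 1% sample)
        responded       -- number of sampled addresses that answered the ping
        """
        response_rate = responded / sampled
        return int(named_addresses * response_rate)

    # Invented figures: 70 million named addresses, a 1% sample, 80% of which answered.
    print(estimate_reachable_hosts(70_000_000, 700_000, 560_000))   # -> 56000000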

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, the response times of servers all over the world are used. The method is to "ping" servers (as for host counts) and to compare the response times with past ones and with those of servers in the same area.
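
The scoring formulas of these services are not given here, so the following Python fragment is only a rough illustration of the principle: a server's current round-trip time is compared with its own recent history and mapped onto a 0-100 scale, where higher values mean better-than-average conditions.

    def weather_index(current_rtt_ms, past_rtts_ms):
        """Return a 0-100 score; higher means faster than the recent average."""
        baseline = sum(past_rtts_ms) / len(past_rtts_ms)
        ratio = baseline / current_rtt_ms          # >1 means faster than usual
        return max(0, min(100, int(50 * ratio)))   # 50 = exactly average

    # Invented measurements in milliseconds.
    print(weather_index(120, [100, 110, 95, 105]))   # slower than usual -> below 50
    print(weather_index(80,  [100, 110, 95, 105]))   # faster than usual -> above 50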

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic you generate (which does not automatically mean more visitors to your Web site).
In the meantime page views, also called page impressions, are preferred; they are said to avoid these flaws. But even page views are not reliable. Users might share computers - and the corresponding IP addresses and host names - with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
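
A tiny example makes the difference concrete. The miniature "server log" below is invented; the sketch simply counts every requested file as a hit and only requests for HTML documents as page views.

    requests = [                     # (client address, requested file) - invented data
        ("10.0.0.1", "/article.html"),
        ("10.0.0.1", "/logo.gif"),
        ("10.0.0.1", "/photo1.jpg"),
        ("10.0.0.2", "/article.html"),
        ("10.0.0.2", "/logo.gif"),
    ]

    hits = len(requests)
    page_views = sum(1 for _, path in requests if path.endswith(".html"))

    print(hits, "hits, but only", page_views, "page views")
    # Even the page view count may misstate real readership: a cached copy never
    # reaches the server, and several users may share one address.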

The editors of some electronic journals (e-journals) in particular rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: for example, the importance of a column to its readers. Readers may read a journal just for one particular column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore constitute a kind of access barrier.
But there is another reason for these services. To content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, but you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.
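
The following sketch, again with invented data, shows why cookies are attractive to content providers: counted by IP address, two users behind one proxy collapse into a single visitor, while a cookie identifies the individual browser.

    log = [  # (client address, cookie value) - invented data
        ("10.0.0.1", "visitor-a"),
        ("10.0.0.1", "visitor-b"),   # a second user behind the same proxy address
        ("10.0.0.2", "visitor-c"),
        ("10.0.0.2", "visitor-c"),   # the same browser returning later
    ]

    unique_by_address = len({addr for addr, _ in log})
    unique_by_cookie = len({cookie for _, cookie in log})

    print(unique_by_address, "addresses,", unique_by_cookie, "cookies")  # 2 vs. 3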

For

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative as well as Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.
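
In principle the aggregation behind such maps is straightforward, although the real registry records are far richer than this. The Python sketch below uses invented records and simply sums the registered address space per area:

    from collections import Counter

    registrations = [            # (area, number of IP addresses in the block) - invented
        ("London", 65536),
        ("London", 4096),
        ("Manchester", 8192),
        ("Manchester", 256),
    ]

    density = Counter()
    for area, addresses in registrations:
        density[area] += addresses

    for area, addresses in density.most_common():
        print(area, addresses)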





TEXTBLOCK 27/31 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Other biometric technologies

Other biometric technologies not specified here include ear recognition, signature dynamics, key stroke dynamics, vein pattern recognition, retinal scan, body odour recognition, and DNA recognition. These are technologies which are either in early stages of development or used in highly specialised and limited contexts.

TEXTBLOCK 28/31 // URL: http://world-information.org/wio/infostructure/100437611729/100438658399
 
Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers which ridicules dreams of increased competition.

Recent mega-mergers and acquisitions include

SBC Communications - Ameritech, $72.3 bn

Bell Atlantic - GTE, $71.3 bn

AT&T - Media One, $63.1 bn

AOL - Time Warner, $165 bn

MCI WorldCom - Sprint, $129 bn

The total value of all major mergers since the beginning of the 1990s amounts to 20 trillion dollars, 2.5 times the size of the USA's GDP.

The AOL - Time Warner merger reflects a trend which can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance towards complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 29/31 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
Basics: Protected Persons

Generally copyright vests in the author of the work. Certain national laws provide for exceptions and, for example, regard the employer as the original owner of a copyright if the author was, when the work was created, an employee and employed for the purpose of creating that work. In the case of some types of creations, particularly audiovisual works, several national laws provide for different solutions to the question of who should be the first holder of copyright in such works.

Many countries allow copyright to be assigned, which means that the owner of the copyright transfers it to another person or entity, which then becomes its holder. When the national law does not permit assignment, it usually provides for the possibility of licensing the work to someone else. In that case the owner of the copyright remains the holder, but authorizes another person or entity to exercise all or some of his rights, subject to possible limitations. Yet in any case the "moral rights" always belong to the author of the work, whoever the owner of the copyright (and therefore of the "economic rights") may be.


TEXTBLOCK 30/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659527
 
Challenges for Copyright by ICT: Digital Content Providers

Providers of digital information might be confronted with copyright-related problems when they use some of the special features of hypertext media, such as frames and hyperlinks (both of which use third-party content available on the Internet to enhance a web page or CD-ROM), or when they operate a search engine or online directory on their website.

Framing

Frames are often used to help define, and navigate within, a content provider's website. Still, when they are used to present (copyrighted) third-party material from other sites, issues of passing off and misleading or deceptive conduct, as well as copyright infringement, immediately arise.

Hyperlinking

It is generally held that the mere creation of a hyperlink does not, of itself, infringe copyright, as the words indicating a link or the displayed URL are usually unlikely to be considered a "work". Nevertheless, if a link is clicked on, the user's browser will download a full copy of the material at the linked address, creating a copy in the RAM of his computer courtesy of the address supplied by the party that published the link. Although it is widely agreed that the permission to download material over the link must be part of an implied license granted by the person who made the material available on the web in the first place, the scope of this implied license is still the subject of debate. Another option that has been discussed is to consider linking fair use.

Furthermore, hyperlinks and other "information location tools", such as online directories or search engines, could cause their operators trouble if they refer or link users to a site that contains infringing material. In this case it is as yet unclear whether providers can be held liable for infringement.

TEXTBLOCK 31/31 // URL: http://world-information.org/wio/infostructure/100437611725/100438659590
 
Enigma Machine

The Enigma encryption machine was as famous for its insecurities as for the security it gave to German ciphers. It was broken, first by the Poles in the 1930s, then by the British in World War II.

INDEXCARD, 1/44
 
Economic rights

The economic rights (besides moral rights and in some cases also neighboring rights) granted to the owners of copyright usually include 1) copying or reproducing a work, 2) performing a work in public, 3) making a sound recording of a work, 4) making a motion picture of a work, 5) broadcasting a work, 6) translating a work and 7) adapting a work. Under certain national laws some of these rights are not exclusive rights of authorization but in specific cases, merely rights to remuneration.

INDEXCARD, 2/44
 
Center for Democracy and Technology

The Center for Democracy and Technology works to promote democratic values and constitutional liberties in the digital age. With expertise in law, technology, and policy, the Center seeks practical solutions to enhance free expression and privacy in global communications technologies. The Center is dedicated to building consensus among all parties interested in the future of the Internet and other new communications media.

http://www.cdt.org

INDEXCARD, 3/44
 
Vacuum tube

The first half of the 20th century was the era of the vacuum tube in electronics. This variety of electron tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer, completed in 1946).

INDEXCARD, 4/44
 
Gaius Julius Caesar

Gaius Julius Caesar (100-44 BC) was a Roman statesman who came to power through a military career and by buying votes. His army won the civil war and overran Spain, Sicily and Egypt, where he made Cleopatra queen. To gain even more power he increased the number of senators, but he also organized social measures to improve the people's food supply. In February 44 BC he refused the kingship offered by Mark Antony, which made him even more popular. One month later he was murdered during a sitting of the senate.

INDEXCARD, 5/44
 
Sperry

Formerly (1955 - 1979) Sperry Rand Corporation, American corporation that merged with the Burroughs Corporation in 1986 to form Unisys Corporation, a large computer manufacturer.

INDEXCARD, 6/44
 
Fiber-optic cable networks

Fiber-optic cable networks may become the dominant method for high-speed Internet connections. Since the first fiber-optic cable was laid across the Atlantic in 1988, the demand for faster Internet connections has been growing, fuelled by increasing network traffic, due in part to the spread of corporate networks spanning the globe and to the use of graphics-heavy content on the World Wide Web.

Fiber-optic cables have little more in common with copper wires than the capacity to transmit information. Like copper wires, they can be used for terrestrial and submarine connections, but they allow much higher transmission rates. Copper wires carry 32 telephone calls at the same time; a fiber-optic cable can carry 40,000 calls at the same time - a capacity Alexander Graham Bell might not have envisioned when he transmitted the first words - "Mr. Watson, come here. I want you" - over a copper wire.

Copper wires will not go out of use in the foreseeable future, thanks to technologies such as DSL that speed up access drastically. But with the technology to transmit signals at more than one wavelength over fiber-optic cables, their bandwidth is increasing, too.

For technical information from the Encyclopaedia Britannica on telecommunication cables, click here. For technical information from the Encyclopaedia Britannica focusing on fiber-optic cables, click here.

Neal Stephenson has written an entertaining report for Wired on the laying of the FLAG submarine cable, up to now the longest fiber-optic cable on earth, including detailed background information on the cable industry and its history: Mother Earth Mother Board. Click here to read it.

Susan Dumett has written a short history of undersea cables for Pretext magazine, Evolution of a Wired World. Click here to read it.

A timeline history of submarine cables and a detailed list of seemingly all submarine cables of the world, operational, planned and out of service, can be found on the Web site of the International Cable Protection Committee.

For maps of fiber-optic cable networks see the website of Kessler Marketing Intelligence, Inc.

http://www.britannica.com/bcom/eb/article/4/0...
http://www.britannica.com/bcom/eb/article/4/0...
http://www.wired.com/wired/archive/4.12/ffgla...
http://www.pretext.com/mar98/features/story3....
INDEXCARD, 7/44
 
François Duvalier

b. April 14, 1907, Port-au-Prince, Haiti
d. April 21, 1971, Port-au-Prince

Byname PAPA DOC, president of Haiti whose 14-year regime was of unprecedented duration in that country. A supporter of President Dumarsais Estimé, Duvalier was appointed director general of the National Public Health Service in 1946. He was appointed underminister of labour in 1948 and the following year became minister of public health and labour, a post that he retained until May 10, 1950, when President Estimé was overthrown by a military junta under Paul E. Magloire, who was subsequently elected president. By 1954 Duvalier had become the central opposition figure and went underground. He was elected president in September 1957. Setting about to consolidate his power, he reduced the size of the army and organized the Tontons Macoutes ("Bogeymen"), a private force responsible for terrorizing and assassinating alleged foes of the regime. Late in 1963 Duvalier moved further toward an absolutist regime, promoting a cult of his person as the semidivine embodiment of the Haitian nation. In April 1964 he was declared president for life. Although diplomatically almost completely isolated, excommunicated by the Vatican until 1966 for harassing the clergy, and threatened by conspiracies against him, Duvalier was able to stay in power longer than any of his predecessors.

INDEXCARD, 8/44
 
User tracking

User tracking is a generic term that covers all the techniques for monitoring the movements of a user on a web site. User tracking has become an essential component of online commerce, where no personal contact with customers is established, leaving companies with the predicament of not knowing who they are talking to. Some companies, such as Red Eye, Cyber Dialogue, and SAS, offer complete technology packages for user tracking and data analysis to online businesses. Technologies include software solutions such as e-mine, e-discovery, or WebHound.

Whenever user tracking is performed without the explicit agreement of the user, or without laying open which data are collected and what is done with them, it raises considerable privacy concerns.

http://www.redeye.co.uk/
http://www.cyberdialogue.com/
http://www.sas.com/
http://www.spss.com/emine/
http://www.sas.com/solutions/e-discovery/inde...
http://www.sas.com/products/webhound/index.ht...
http://www.linuxcare.com.au/mbp/meantime/
INDEXCARD, 9/44
 
Open Systems Interconnection (OSI)

Open Systems Interconnection (OSI) is a standard reference model for communication between two end users in a network. It is used in developing products and understanding networks.

Source: Whatis.com

INDEXCARD, 10/44
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
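
A minimal Python illustration of the principle; the slow "network retrieval" below is only a stand-in:

    cache = {}

    def fetch(url):
        print("retrieving", url, "over the network")   # the slow path
        return "<contents of %s>" % url

    def get(url):
        if url not in cache:
            cache[url] = fetch(url)    # store a copy at the "closer location"
        return cache[url]              # later requests are served locally

    get("http://example.org/page.html")  # retrieved over the network
    get("http://example.org/page.html")  # served from the cache, no retrieval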

INDEXCARD, 11/44
 
Machine vision

A branch of artificial intelligence and image processing concerned with the identification of graphic patterns or images that involves both cognition and abstraction. In such a system, a device linked to a computer scans, senses, and transforms images into digital patterns, which in turn are compared with patterns stored in the computer's memory. The computer processes the incoming patterns in rapid succession, isolating relevant features, filtering out unwanted signals, and adding to its memory new patterns that deviate beyond a specified threshold from the old and are thus perceived as new entities.
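
The comparison-and-threshold logic described above can be sketched in a few lines of Python; the example is, of course, a drastic simplification with invented patterns:

    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def classify(pattern, memory, threshold=1.0):
        """Return the index of the closest stored pattern, or store a new one."""
        if memory:
            best = min(range(len(memory)), key=lambda i: distance(pattern, memory[i]))
            if distance(pattern, memory[best]) <= threshold:
                return best
        memory.append(pattern)            # deviates beyond the threshold: a new entity
        return len(memory) - 1

    memory = [(0.0, 0.0), (5.0, 5.0)]     # two known patterns
    print(classify((0.2, 0.1), memory))   # close to the first pattern -> 0
    print(classify((9.0, 1.0), memory))   # unlike anything stored -> new entry 2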

INDEXCARD, 12/44
 
Robot

Robot refers to any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner. The term is derived from the Czech word robota, meaning "forced labor". Modern use of the term stems from the play R.U.R., written in 1920 by the Czech author Karel Capek, which depicts society as having become dependent on mechanical workers called robots that are capable of doing any kind of mental or physical work. Modern robot devices descend through two distinct lines of development - the early automata, essentially mechanical toys, and the successive innovations and refinements introduced in the development of industrial machinery.

INDEXCARD, 13/44
 
Convergence, 2000-

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new ones. This is what is meant by today's buzzwords "multimedia" and "convergence".

Classical dichotomies, such as that between computing and telephony, and traditional categorisations no longer apply, because these new services no longer fit the traditional categories.

INDEXCARD, 14/44
 
water-clocks

Water-clocks are an early long-distance communication system. Each communicating party had an identical jar with a hole of the same size, kept closed, and the same amount of water in it. In the jar stood a stick with different messages written on it. When one party wanted to tell the other something, it made a fire signal. When the other answered, both opened the hole at the same time and, on another fire signal, closed it again at the same time. In the end the water level had sunk to the point on the stick marking the intended message.

INDEXCARD, 15/44
 
VISA

Visa International's more than 21,000 member financial institutions have made VISA one of the world's leading full-service payment networks. Visa's products and services include the Visa Classic card, Visa Gold card, Visa debit cards, Visa commercial cards and the Visa Global ATM Network. VISA operates in 300 countries and territories and also provides a large consumer payments processing system.

INDEXCARD, 16/44
 
atbash

Atbash is regarded as the simplest form of encryption. It is nothing more than a reversed alphabet: a=z, b=y, c=x and so on. Many different nations used it in the early days of writing.
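
One possible way to write the cipher down in Python - each letter is simply replaced by its mirror image in the alphabet:

    import string

    ATBASH = str.maketrans(string.ascii_lowercase, string.ascii_lowercase[::-1])

    def atbash(text):
        return text.lower().translate(ATBASH)

    print(atbash("secret"))           # -> hvxivg
    print(atbash(atbash("secret")))   # applying it twice restores the original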

for further explanations see:
http://www.ftech.net/~monark/crypto/crypt/atbash.htm

http://www.ftech.net/~monark/crypto/crypt/atb...
INDEXCARD, 17/44
 
Kosov@

The "word" Kosov@ is a compromise between the Serb name KosovO and the Albanian KosovA. It is mostly used by international people who want to demonstrate a certain consciousness about the conflict including some sort of neutrality, believing that neither the one side nor the other (and maybe not even NATO) is totally right. Using the word Kosov@ is seen as a symbol of peace.

For more explanations (in German) see: http://www.zivildienst.at/kosov@.htm

http://www.zivildienst.at/kosov@.htm
INDEXCARD, 18/44
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor and the communications satellite.

http://www.research.att.com/

INDEXCARD, 19/44
 
Intranet

As a local area network (LAN), an Intranet is a secured network of computers based on the IP protocol and with restricted access.

INDEXCARD, 20/44
 
Ron Rivest

Ronald L. Rivest is Webster Professor of Electrical Engineering and Computer Science in MIT's EECS Department. He was one of three persons in a team to invent the RSA public-key cryptosystem. The co-authors were Adi Shamir and Leonard M. Adleman.

INDEXCARD, 21/44
 
NATO

The North Atlantic Treaty was signed in Washington on 4 April 1949, creating NATO (= North Atlantic Treaty Organization). It was an alliance of 12 independent nations, originally committed to each other's defense. Between 1952 and 1982 four more members were welcomed and in 1999, the first ex-members of COMECON became members of NATO (the Czech Republic, Hungary and Poland), which makes 19 members now. Around its 50th anniversary NATO changed its goals and tasks by intervening in the Kosovo Crisis.

INDEXCARD, 22/44
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 23/44
 
Cooperative Association of Internet Data Analysis (CAIDA)

Based at the University of California's San Diego Supercomputer Center, CAIDA supports cooperative efforts among the commercial, government and research communities aimed at promoting a scalable, robust Internet infrastructure. It is sponsored by the Defense Advanced Research Projects Agency (DARPA) through its Next Generation Internet program, by the National Science Foundation, Cisco, Inc., and Above.net.

INDEXCARD, 24/44
 
Internet Relay Chat (IRC)

IRC is a text-based chat system used for live group discussions.

For a history of IRC see Charles A. Gimon, IRC: The Net in Realtime, http://www.skypoint.com/~gimonca/irc2.html

http://www.skypoint.com/~gimonca/irc2.html
INDEXCARD, 25/44
 
Internet Software Consortium

The Internet Software Consortium (ISC) is a nonprofit corporation dedicated to the production of high-quality reference implementations of Internet standards that meet production standards. Its goal is to ensure that those reference implementations are properly supported and made freely available to the Internet community.

http://www.isc.org

INDEXCARD, 26/44
 
CIA

CIA's mission is to support the President, the National Security Council, and all officials who make and execute U.S. national security policy by: providing accurate, comprehensive, and timely foreign intelligence on national security topics; conducting counterintelligence activities, special activities, and other functions related to foreign intelligence and national security, as directed by the President. To accomplish its mission, the CIA engages in research, development, and deployment of high-leverage technology for intelligence purposes. As a separate agency, CIA serves as an independent source of analysis on topics of concern and works closely with the other organizations in the Intelligence Community to ensure that the intelligence consumer - whether Washington policymaker or battlefield commander - receives adequate intelligence information.

http://www.cia.gov

INDEXCARD, 27/44
 
Satellites

Communications satellites are relay stations for radio signals and provide reliable and distance-independent high-speed connections even at remote locations without high-bandwidth infrastructure.

In point-to-point transmission, the transmission method originally employed, satellites face increasing competition from fiber-optic cables, so point-to-multipoint transmission is increasingly becoming the dominant satellite technology. Point-to-multipoint transmission enables the quick implementation of private networks consisting of very small aperture terminals (VSAT). Such networks are independent and make mobile access possible.

In the future, satellites will become stronger, cheaper and their orbits will be lower; their services might become as common as satellite TV is today.

For more information about satellites, see How Satellites Work (http://octopus.gma.org/surfing/satellites) and the Tech Museum's satellite site (http://www.thetech.org/hyper/satellite).

http://www.whatis.com/vsat.htm
http://octopus.gma.org/surfing/satellites
INDEXCARD, 28/44
 
National Science Foundation (NSF)

Established in 1950, the National Science Foundation is an independent agency of the U.S. government dedicated to the funding of basic research and education in a wide range of sciences and in mathematics and engineering. Today, the NSF supplies about one quarter of total federal support of basic scientific research at academic institutions.

http://www.nsf.gov

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/0/0,5716,2450+1+2440,00.html

http://www.nsf.gov/
INDEXCARD, 29/44
 
Galileo Galilei

Galileo Galilei (1564-1642), the Italian mathematician and physicist, is called the father of the Enlightenment. He proved the laws of free fall, improved the telescope, and so on. Galilei is still famous for his fights against the Catholic Church. He published his writings in Italian instead of Latin, so that everybody could understand him, which made him popular. As he did not stop defending the heliocentric world system, the Inquisition put him on trial twice and forbade him to continue his experiments.

INDEXCARD, 30/44
 
Technological measures

As laid down in the proposed EU Directive on copyright and related rights in the information society, technological measures mean "... any technology, device, or component that, in the normal course of its operations, is designed to prevent or inhibit the infringement of any copyright..." The U.S. DMCA (Digital Millennium Copyright Act) divides technological measures into two categories: 1) measures that prevent unauthorized access to a copyrighted work, and 2) measures that prevent unauthorized copying of a copyrighted work. The making or selling of devices or services that can be used to circumvent either category of technological measures is also prohibited under certain circumstances in the DMCA. Furthermore the 1996 WIPO Copyright Treaty states that the "... contracting parties shall provide adequate legal protection and effective legal remedies against the circumvention of effective technological measures that are used by authors..."

INDEXCARD, 31/44
 
RIPE

The RIPE Network Coordination Centre (RIPE NCC) is one of the three Regional Internet Registries (RIRs) which exist in the world today, providing allocation and registration services which support the operation of the Internet globally, mainly the allocation of IP address space for Europe.

http://www.ripe.net

INDEXCARD, 32/44
 
Bulletin Board Systems

A BBS (bulletin board system) is a computer that can be reached by computer modem dialing (you need to know the phone number) or, in some cases, by Telnet for the purpose of sharing or exchanging messages or other files. Some BBSs are devoted to specific interests; others offer a more general service. The definitive BBS List says that there are 40,000 BBSs worldwide.

Bulletin board systems originated and generally operate independently of the Internet.

Source: Whatis.com

INDEXCARD, 33/44
 
Aeneas Tacticus

Supposedly his real name was Aeneas of Stymphalus. He was a Greek military scientist and cryptographer. He invented an optical system for communication similar to a telegraph: the water-clocks.

INDEXCARD, 34/44
 
Royalties

Royalties refer to the payment made to the owners of certain types of rights by those who are permitted by the owners to exercise the rights. The rights concerned are literary, musical, and artistic copyright and patent rights in inventions and designs (as well as rights in mineral deposits, including oil and natural gas). The term originated from the fact that in Great Britain for centuries gold and silver mines were the property of the crown and such "royal" metals could be mined only if a payment ("royalty") were made to the crown.

INDEXCARD, 35/44
 
Proprietary Network

Proprietary networks are computer networks with standards different from the ones proposed by the International Organization for Standardization (ISO), the Open Systems Interconnection (OSI). As they are designed to conform to standards implemented by the manufacturer, compatibility with other network standards is not assured.

INDEXCARD, 36/44
 
John Dee

b. July 13, 1527, London, England
d. December 1608, Mortlake, Surrey

English alchemist, astrologer, and mathematician who contributed greatly to the revival of interest in mathematics in England. After lecturing and studying on the European continent between 1547 and 1550, Dee returned to England in 1551 and was granted a pension by the government. He became astrologer to the queen, Mary Tudor, and shortly thereafter was imprisoned for being a magician but was released in 1555. Dee later toured Poland and Bohemia (1583-89), giving exhibitions of magic at the courts of various princes. He became warden of Manchester College in 1595.

INDEXCARD, 37/44
 
Network Information Center (NIC)

Network information centers are organizations responsible for registering and maintaining the domain names on the World Wide Web. Until competition in domain name registration was introduced, they were the only ones responsible. Most countries have their own network information center.

INDEXCARD, 38/44
 
Telephone

The telephone was not invented by Alexander Graham Bell, as is widely held to be true, but by Philipp Reis, a German teacher. When he demonstrated his invention to important German professors in 1861, it was not greeted enthusiastically. Because of this dismissal, no financial support for further development was provided to him.

And here Bell comes in: In 1876 he successfully filed a patent for the telephone. Soon afterwards he established the first telephone company.

INDEXCARD, 39/44
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.

In digital systems, bandwidth is data speed in bits per second (bps).

Source: Whatis.com
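
A small worked example - with illustrative file sizes and an assumed 56 kbps modem line, not figures taken from the text above - shows how transfer time follows from data size and bandwidth:

    def transfer_time_seconds(size_bytes, bandwidth_bps):
        return size_bytes * 8 / bandwidth_bps     # 8 bits per byte

    modem = 56_000        # a 56 kbps modem line
    print(transfer_time_seconds(5_000, modem))    # a short page of text: under a second
    print(transfer_time_seconds(500_000, modem))  # a photograph: over a minute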

INDEXCARD, 40/44
 
Copyright management information

Copyright management information refers to information which identifies a work, the author of a work, the owner of any right in a work, or information about the terms and conditions of the use of a work, and any numbers or codes that represent such information, when any of these items of information are attached to a copy of a work or appear in connection with the communication of a work to the public.

INDEXCARD, 41/44
 
Transistor

A transistor is a solid-state device for amplifying, controlling, and generating electrical signals. Transistors are used in a wide array of electronic equipment, ranging from pocket calculators and radios to industrial robots and communications satellites.

INDEXCARD, 42/44
 
Richard Barbrook and Andy Cameron, The Californian Ideology

According to Barbrook and Cameron there is an emerging global orthodoxy concerning the relation between society, technology and politics. In this paper they are calling this orthodoxy the Californian Ideology in honor of the state where it originated. By naturalizing and giving a technological proof to a political philosophy, and therefore foreclosing on alternative futures, the Californian ideologues are able to assert that social and political debates about the future have now become meaningless and - horror of horrors - unfashionable. - This paper argues for an interactive future.

http://www.wmin.ac.uk/media/HRC/ci/calif.html

INDEXCARD, 43/44
 
Cutting

The cutting of pictures in movies or photographs is highly manipulative: it is easy to produce a new video out of an already existing one. The result is a form of manipulation that is difficult to contradict. A reputation destroyed in this way is nearly impossible to repair.

INDEXCARD, 44/44