Virtual Cartels: Introduction

"Among the most striking developments of the 1990s has been the emergence of a global commercial media market utilizing new technologies and the global trend toward deregulation.
This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market.

A global oligopolistic market that covers the spectrum of media is now crystallizing, with very high barriers to entry."

(Robert McChesney, author of "Rich Media, Poor Democracy")

The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks tolerates only a small number of large players. Mergers, strategic alliances, partnerships and cooperation agreements are therefore the daily routine of the ICT business. They bypass competition and create "virtual cartels".

TEXTBLOCK 1/9 // URL: http://world-information.org/wio/infostructure/100437611709/100438658911
 
The Concept of the Public Sphere

According to social critic and philosopher Jürgen Habermas, "public sphere" first of all means "... a domain of our social life in which such a thing as public opinion can be formed. Access to the public sphere is open in principle to all citizens. A portion of the public sphere is constituted in every conversation in which private persons come together to form a public. They are then acting neither as business or professional people conducting their private affairs, nor as legal consociates subject to the legal regulations of a state bureaucracy and obligated to obedience. Citizens act as a public when they deal with matters of general interest without being subject to coercion; thus with the guarantee that they may assemble and unite freely, and express and publicize their opinions freely."

The system of the public sphere is extremely complex, consisting of spatial and communicational publics of different sizes, which can overlap, exclude, and cover one another, but also mutually influence each other. The public sphere is not something that just happens; it is also produced through social norms and rules, and channeled via the construction of spaces and the media. Ideally, the public sphere is transparent and accessible to all citizens, issues, and opinions. For democratic societies the public sphere constitutes an extremely important element in the process of public opinion formation.

TEXTBLOCK 2/9 // URL: http://world-information.org/wio/infostructure/100437611734/100438658403
 
Economic Structure: Transparent Customers

Following the dynamics of informatised economies, the consumption habits and lifestyles of customers are of great interest. New technologies make it possible to store and combine the collected data of an enormous number of people.

User profiling helps companies understand what potential customers might want. Often enough, such data collection takes place without the customer's knowledge and amounts to spying.

"Much of the information collection that occurs on the Internet is invisible to the consumer, which raises serious questions of fairness and informed consent."

(David Sobel, Electronic Privacy Information Center)

TEXTBLOCK 3/9 // URL: http://world-information.org/wio/infostructure/100437611726/100438658925
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth and usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the precursor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock went on to develop the measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, arguing that measurement mattered for two reasons. First, measurement would be critical for future development, evolution, and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the National Science Foundation handed its stewardship role over the Internet to a competitive industry (bluntly spoken: the Internet was privatized), leaving no framework for adequate tracking and monitoring. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth and performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets. They are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."

Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, the traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network. The private line networks (with an effective bandwidth of about 330 Gbps in December 1997) are considerably larger in aggregate capacity than the Internet, about as large as the voice network in the U.S., but they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications; both consume more bandwidth and can lead to unanticipated amounts of data traffic.
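
To get a feel for this arithmetic, here is a minimal sketch in Python. The 100% annual growth rate for Internet traffic comes from the paper as quoted above; the starting share of 5% of voice traffic and the 8% voice growth rate are purely illustrative assumptions, not figures from Coffman and Odlyzko.

    # Illustrative crossover projection: Internet traffic doubling yearly
    # versus slowly growing voice traffic. The starting share and the voice
    # growth rate are assumed values for the sake of the example.
    share = 0.05   # assumed ratio of Internet to voice traffic, end of 1997
    year = 1997
    while share < 1.0:
        year += 1
        share *= 2.0 / 1.08   # Internet doubles (~100%/year), voice grows ~8%
    print(f"data traffic overtakes voice traffic around {year}")   # -> 2002

Under these assumptions the crossover indeed lands around 2002, which illustrates how quickly a 100% growth rate compounds: traffic multiplies 32-fold in five years.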

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Beyond the small sample size, this method has at least one flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
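
The sample-and-project step is simple enough to sketch in a few lines of Python. This is not ISC's actual code, just a minimal sketch of the idea; the total of one million name-assigned addresses and the simulated 80% reply rate are invented for illustration, and a stand-in function replaces the real ICMP echo request.

    import random

    random.seed(42)

    def is_pingable(address):
        # Placeholder for a real ICMP echo request ("ping"); here we
        # simply simulate an 80% reply rate.
        return random.random() < 0.80

    total_named = 1_000_000   # addresses that have a DNS name (assumed)
    sample = random.sample(range(total_named), total_named // 100)  # 1% sample
    replies = sum(is_pingable(a) for a in sample)
    estimate = replies / len(sample) * total_named
    print(f"estimated reachable hosts: {estimate:,.0f}")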

Internet Weather

Just like the daily weather, traffic on the Internet, that is, the conditions for data flows, is monitored too, hence the name Internet weather. One of the most famous Internet weather reports is from The Matrix, Inc. Another one is the Internet Traffic Report, which displays traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare the response times to past ones and to the response times of servers in the same area.
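
A crude version of such a measurement can be sketched in Python, assuming a Unix-like system where the ping command is available. The host and the historical average below are hypothetical, and timing the whole subprocess includes start-up overhead, so this is only a rough sketch of the idea.

    import subprocess
    import time

    def ping_ms(host):
        """Round-trip time in ms for one ICMP echo, or None if unreachable."""
        start = time.monotonic()
        result = subprocess.run(["ping", "-c", "1", host], capture_output=True)
        if result.returncode != 0:
            return None
        return (time.monotonic() - start) * 1000   # crude: includes overhead

    past_average = {"example.com": 40.0}   # hypothetical historical averages (ms)
    for host, average in past_average.items():
        rtt = ping_ms(host)
        if rtt is None:
            print(f"{host}: unreachable")
        else:
            print(f"{host}: {rtt:.0f} ms ({rtt / average:.1f}x the past average)")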

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. Users might share computers, and with them IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
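
The difference between hits and page views is easy to demonstrate on a hypothetical server log; the Python sketch below uses invented log lines in the style of the Common Log Format. Counting every requested file yields hits; counting only the HTML documents approximates page views.

    # Every request counts as a "hit"; only requests for HTML documents
    # approximate page views.
    log_lines = [
        '10.0.0.1 - - [06/Dec/1999:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 5120',
        '10.0.0.1 - - [06/Dec/1999:10:00:01 +0000] "GET /logo.gif HTTP/1.0" 200 800',
        '10.0.0.1 - - [06/Dec/1999:10:00:02 +0000] "GET /photo.jpg HTTP/1.0" 200 9000',
        '10.0.0.2 - - [06/Dec/1999:10:05:00 +0000] "GET /index.html HTTP/1.0" 200 5120',
    ]
    hits = len(log_lines)
    page_views = sum(1 for line in log_lines
                     if line.split('"')[1].split()[1].endswith(".html"))
    print(f"hits: {hits}, page views: {page_views}")   # hits: 4, page views: 2

Note that the cached copies mentioned above never reach the server at all, so even the page-view count can only be a lower bound.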

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something intrinsically qualitative: for example, the importance of a column to its readers. Readers may visit a journal just for one special column and not care about the journal's other contents; deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, e.g., establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system: in a certain sense it is computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.

For

If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.
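
The core of such a density study is simple aggregation: sum the allocated address space per region and rank the regions. A minimal Python sketch, using invented allocation records in the spirit of a registry's published data (the country codes and address counts below are hypothetical):

    from collections import Counter

    # (country code, number of IPv4 addresses allocated) - hypothetical data
    allocations = [
        ("GB", 65536), ("GB", 16384), ("DE", 32768),
        ("NL", 8192), ("FR", 16384),
    ]
    density = Counter()
    for country, addresses in allocations:
        density[country] += addresses
    for country, total in density.most_common():
        print(f"{country}: {total:,} addresses")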





TEXTBLOCK 4/9 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
4000 - 1000 B.C.

4th millennium B.C.
In Sumer writing is invented.

Writing and calculating came into being at about the same time. The first pictographs carved into clay tablets were used for administrative purposes. As an instrument for the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to very few. Having been more or less separate tasks, writing and calculating converge in today's computers.

Letters are invented so that we might be able to converse even with the absent, says Saint Augustine. The invention of writing made it possible to transmit and store information. No longer does the ear predominate; face-to-face communication becomes more and more obsolete for administration and bureaucracy. Standardization and centralization become the constituents of high culture and of vast empires such as Sumer and China.

3200 B.C.
In Sumer the seal is invented.

About 3000 B.C.
In Egypt papyrus scrolls and hieroglyphs are used.

About 1350 B.C.
In Assyria the cuneiform script is invented.

1200 B.C.
According to Aeschylus, the news of the conquest of the town of Troy was transmitted via torch signals.

About 1100 B.C.
Egyptians use homing pigeons to deliver military information.

TEXTBLOCK 5/9 // URL: http://world-information.org/wio/infostructure/100437611796/100438659725
 
Cryptography's Terms and Background

"All nature is merely a cipher and a secret writing."
Blaise de Vigenère

In the (dis-)information age, getting information while at the same time excluding others from it is part of a power game (keeping the other uneducated). The justification for it has eventually found an argument called security.
Judging by the frequency of its presence in articles, the news, and political speeches, security seems to be one of the most popular words of the 90's. It seems a long time ago that the word was used only for and by the military and the police. Today one can find it as part of every political issue. Even development assistance and nutrition programs consider it part of their work.
The so-called, but also real, need for information security is widespread and concerns everybody, whether they use information technology or not. In any case, information about individuals is moving globally, much of it sensitive: bank records, insurance and medical data, credit card transactions, and much, much more. Any kind of personal or business communication is concerned, including telephone conversations, fax messages, and of course e-mail, not to forget financial transactions and business information. Almost every aspect of modern life is affected.
We want to communicate with everybody - but do not want anybody to know.

Whereas the market already depends on the electronic flow of information and the digital tools get faster and more sophisticated all the time, privacy and security concerns are rising as well.
With the increase of digital communication, its vulnerability is increasing just as fast. Two (or three) competing elements give the term digital security a rather bitter taste: on the one hand, the growing possibility for criminals to use modern technology not only to hide their tracks and work secretly but also to manipulate financial and other transfers. On the other hand, the governments of many states tell the population that they need access to any kind of data to keep those criminals under control. And finally there are those people who live between enlightening security gaps and at the same time harm other private people with their work: computer hackers.
While the potential of global information is regarded as endless, it is these elements that reduce it.

There is no definite solution, but at least some tools have been developed to improve the situation: cryptography, the freedom to encode those data that one does not want everybody to know, while giving those who are meant to know the data a possibility to decode them.

During the last 80 years cryptography has changed from a merely political tool into a private and economic, but still political, one. At the same time it was necessary to improve the tools themselves, eventually basing them on mathematics. Hence cryptography is generally regarded as something very complicated. And in many ways this is true, as the modern ways of enciphering are all about mathematics.

"Crypto is not mathematics, but crypto can be highly mathematical, crypto can use mathematics, but good crypto can be done without a great reliance on complex mathematics." (W.T. Shaw)

For an introduction into cryptography and the mathematical tasks see:
http://www.sbox.tu-graz.ac.at/home/j/jonny/projects/crypto/index.htm
http://www.ccc.de/CCC-CA/policy.html

TEXTBLOCK 6/9 // URL: http://world-information.org/wio/infostructure/100437611776/100438658895
 
Accessing the Internet

Connections to the Net can be based on wire-line and wireless access technologies.

Wire-line access                 Wireless access

copper wires                     satellites
coaxial cables                   mobile terrestrial antennas
electric power lines             fixed terrestrial antennas
fiber-optic cables

Usually several kinds of network connections are employed at once. Generally speaking, when an e-mail message is sent it travels from the user's computer via copper wires or coaxial cables, ISDN lines, etc., to an Internet Service Provider; from there, via fiber-optic cables, to the nearest Internet exchange, and on into a backbone network, tunneling across the continent and diving through submarine fiber-optic cables across the Atlantic to another Internet exchange; from there, via another backbone network and across another regional network, to the Internet Service Provider of the intended recipient; and from there, via cables and wires of different bandwidths, it arrives at its destination, a workstation permanently connected to the Internet. Finally a sound or a flashing icon informs your virtual neighbor that a new message has arrived.
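
You can glimpse such a path yourself with the traceroute utility. A minimal Python wrapper, assuming a Unix-like system where traceroute is installed (the destination host is just an example):

    import subprocess

    # Lists the routers a packet passes on its way to the destination,
    # one hop per line.
    result = subprocess.run(["traceroute", "example.com"],
                            capture_output=True, text=True)
    print(result.stdout)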

Satellite communication

Although facing competition from fiber-optic cables as cost-effective solutions for broadband data transmission services, the space industry is gaining increasing importance in global communications. As computing, telephony, and audiovisual technologies converge, new wireless technologies are rapidly being deployed, occupying an increasing market share and accelerating the construction of high-speed networks.

Privatization of satellite communication

Until recently, transnational satellite communication was provided exclusively by intergovernmental organizations such as Intelsat, Intersputnik and Inmarsat.

Scheduled privatization of intergovernmental satellite consortia:

Satellite consortium | Year of foundation | Members | Scheduled date for privatization
Intelsat | 1964 | 200 nations under the leadership of the USA | 2001
Intersputnik | 1971 | 23 nations under the leadership of Russia | ?
Inmarsat | 1979 | 158 nations (all members of the International Maritime Organization) | privatized since 1999
Eutelsat | 1985 | nearly 50 European nations | 2001



When Intelsat began to accumulate losses because of management failures and the increasing market share of fiber-optic cables, this organizational scheme came under attack. Led by the USA, the Western industrialized countries successfully pressed for the privatization of all satellite consortia they are members of and for competition by private carriers.

As of February 2000, there are 2680 satellites in service. Within the next four years a few hundred will be added by the new private satellite systems. Most of these will be so-called Low Earth Orbit satellite systems, capable of providing high-speed global mobile data services at low cost.

Because of such technological improvements and increasing competition, experts expect satellite-based broadband communication to be as common, cheap, and ubiquitous as satellite TV today within the next five or ten years.

Major satellite communication projects

Project name | Main investors | Expected cost | Number of satellites | Date of service start-up
Astrolink | Lockheed Martin, TRW, Telespazio, Liberty Media Group | US$ 3.6 billion | 9 | 2003
Globalstar | 13 investors including Loral Space & Communications, Qualcomm, Hyundai, Alcatel, France Telecom, China Telecom, Daimler Benz and Vodafone/Airtouch | US$ 3.26 billion | 48 | 1998
ICO | 57 investors including British Telecom, Deutsche Telecom, Inmarsat, TRW and Telefonica | US$ 4.5 billion | 10 | 2001
Skybridge | 9 investors including Alcatel Space, Loral Space & Communications, Toshiba, Mitsubishi and Sharp | US$ 6.7 billion | 80 | 2002
Teledesic | Bill Gates, Craig McCaw, Prince Alwaleed Bin Talal Bin Abdul Aziz Alsaud, Abu Dhabi Investment Company | US$ 9 billion | 288 | 2004


Source: Analysys Satellite Communications Database

TEXTBLOCK 7/9 // URL: http://world-information.org/wio/infostructure/100437611791/100438659839
 
Missing Labeling of Online Ads

One of the most crucial issues in on-line advertising is the blurring of the line between editorial content and ads. Unlike on TV and in the print media, where guidelines on the labeling of advertisements exist, enabling the customer to distinguish between editorial content and ads, similar conventions have not yet evolved for Internet content. Labeling of online advertisements has up to now remained the rare exception, with only a few sites (e.g. http://www.orf.at) explicitly indicating non-editorial content.

TEXTBLOCK 8/9 // URL: http://world-information.org/wio/infostructure/100437611652/100438657963
 
Internet Content Providers' Perspective

As within the traditional media landscape, Internet content providers have two primary means of generating revenue: Direct sales or subscriptions, and advertising. Especially as charging Internet users for access to content - with all the free material available - has proven problematic, advertising is seen as the best solution for creating revenues in the short term. Therefore intense competition has started among Internet content providers and access services to attract advertising money.

Table: Web-Sites Seeking Advertising


Period | Number of Web-Sites
June 1999 | 2111
July 1999 | 2174
August 1999 | 2311
September 1999 | 2560



Source: Adknowledge eAnalytics. Online Advertising Report

TEXTBLOCK 9/9 // URL: http://world-information.org/wio/infostructure/100437611652/100438657986
 
Galileo Galilei

Galileo Galilei (1564-1642), the Italian mathematician and physicist, is called the father of the Enlightenment. He proved the laws of the free fall, improved the technique of the telescope, and so on. Galilei is still famous for his fights against the Catholic Church. He published his writings in Italian instead of Latin; like this, everybody could understand him, which made him popular. As he did not stop talking about the world as a ball (the heliocentric world system) instead of a disk, the Inquisition put him on trial twice and forbade him to go on working on his experiments.

INDEXCARD, 1/14
 
Multiple User Dungeons

MUDs are virtual spaces, usually adventure-like ones, that you can log into, enabling you to chat with others, to explore, and sometimes to create rooms. Each user takes on the identity of an avatar, a computerized character.

INDEXCARD, 2/14
 
codes

an algorithm for bringing a legible message into an illegible form; some sort of code book has to exist to encode and decode it

INDEXCARD, 3/14
 
CNN

CNN is a U.S. TV enterprise, probably the world's most famous one. Its name has become a symbol for the mass media, but also a symbol of a power that can decide which news is important for the world and which is not worth talking about. Every message that is published on CNN goes around the world. The Gulf War has been the best example of this so far, when a CNN reporter was the one person to do the countdown to a war. The moments when he stood on the roof of a hotel in Baghdad, surrounded by green flashes, went around the world.

INDEXCARD, 4/14
 
Gopher

Gopher is a menu system with hierarchically structured lists of files that predates the World Wide Web.

Today Gopher is of diminishing importance and mostly replaced by the World Wide Web.

INDEXCARD, 5/14
 
Avatar

Traditionally, an avatar is a mythical figure, half man, half god. In Hindu mythology, avatars are the forms that deities assume when they descend to earth. Greek and Roman mythologies also contain avatars in animal form or half animal, half man. In virtual space, the word avatar refers to a "virtual identity" that a user can construct for him- or herself, e.g. in a chat room. Avatars have also been a preferred object of media art.

INDEXCARD, 6/14
 
Invention

According to the WIPO an invention is a "... novel idea which permits in practice the solution of a specific problem in the field of technology." Concerning its protection by law, the idea "... must be new in the sense that it has not already been published or publicly used; it must be non-obvious in the sense that it would not have occurred to any specialist in the particular industrial field, had such a specialist been asked to find a solution to the particular problem; and it must be capable of industrial application in the sense that it can be industrially manufactured or used." Protection can be obtained through a patent (granted by a government office) and typically is limited to 20 years.

INDEXCARD, 7/14
 
ciphertext

the enciphered/encoded and primarily illegible text

INDEXCARD, 8/14
 
ciphers

the word "cipher" comes from the Hebrew word "saphar", meaning "to number". Ciphers are mere substitutions. Each letter of the alphabet gets substituted; maybe by one letter or two or more.

an example:
PLAINTEXT a b c d e f g h i j k l m n o p q r s t u v w x y z
CIPHERTEXT D E F G H I J K L M N O P Q R S T U V W X Y Z A B C
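
The table above shifts each letter three places down the alphabet, so it can be implemented in a few lines. A minimal Python sketch of this particular substitution (the message text is invented for illustration):

    import string

    plain = string.ascii_lowercase
    cipher = (plain[3:] + plain[:3]).upper()   # DEF...ZABC, as in the table
    encode = str.maketrans(plain, cipher)
    decode = str.maketrans(cipher, plain)

    message = "attack at dawn"
    secret = message.translate(encode)
    print(secret)                      # DWWDFN DW GDZQ
    print(secret.translate(decode))    # attack at dawn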

INDEXCARD, 9/14
 
cryptanalysis

the study of breaking others' codes, i.e. of transforming a message back into a legible form without knowing the key in advance

INDEXCARD, 10/14
 
plaintext

the original, legible text

INDEXCARD, 11/14
 
WIPO

The World Intellectual Property Organization is one of the specialized agencies of the United Nations (UN), designed to promote the worldwide protection of both industrial property (inventions, trademarks, and designs) and copyrighted materials (literary, musical, photographic, and other artistic works). It was established by a convention signed in Stockholm in 1967 that came into force in 1970. The aims of WIPO are threefold: through international cooperation, WIPO promotes the protection of intellectual property; secondly, the organization supervises administrative cooperation between the Paris, Berne, and other intellectual property unions regarding agreements on trademarks, patents, and the protection of artistic and literary works; and thirdly, through its registration activities, WIPO provides direct services to applicants for, or owners of, industrial property rights.

INDEXCARD, 12/14
 
Transistor

A transistor is a solid-state device for amplifying, controlling, and generating electrical signals. Transistors are used in a wide array of electronic equipment, ranging from pocket calculators and radios to industrial robots and communications satellites.

INDEXCARD, 13/14
 
Industrial design

Industrial design refers to the ornamental aspect of a useful article, which may consist of two- or three-dimensional elements. To qualify for intellectual property protection the design must be novel or original. Protection can be obtained through registration in a government office and usually is granted for 10 to 15 years.

INDEXCARD, 14/14