Economic structure; digital euphoria

The dream of a conflict-free capitalism appeals to a diverse audience. No politician can win elections without eulogising the benefits of the information society and promising universal wealth through informatisation. "Europe must not lose track and should be able to make the step into the new knowledge and information society in the 21st century", said Tony Blair.

The US government has declared the construction of a fast information infrastructure network the centerpiece of its economic policies.

In Lisbon the EU heads of state agreed to accelerate the informatisation of the European economies.

The German Chancellor Schröder has requested the industry to create 20,000 new informatics jobs.

The World Bank understands information as the principal tool for third world development.

Electronic classrooms and on-line learning schemes are seen as the ultimate advance in education by politicians and industry leaders alike.

But in the informatised economies, traditional exploitative practices are obscured by the glamour of new technologies. And the nearly universal acceptance of the ICT message has prepared the ground for a revival of 19th century "adapt-or-perish" ideology.

"There is nothing more relentlessly ideological than the apparently anti-ideological rhetoric of information technology"

(Arthur and Marilouise Kroker, media theorists)

TEXTBLOCK 1/10 // URL: http://world-information.org/wio/infostructure/100437611726/100438658999
 
Advertising and the Media System

Media systems (especially broadcasting) can be classified in two different types:

Public Media Systems: Government control over broadcasting through ownership, regulation, and partial funding of public broadcasting services.

Private Media Systems: Ownership and control lie in the hands of private companies and shareholders.

Both systems can exist in various forms, according to the degree of control by governments and private companies, with mixed systems (public and private) as the third main kind.

Whereas public media systems are usually at least partially funded by governments, private broadcasting relies solely on advertising revenue. Yet public media systems, too, cannot exclude advertising as a source of revenue. Both types are therefore to a certain degree dependent on money coming in from advertisers.

And this has consequences for the content provided by the media. As attracting advertisers becomes critically important, the interests of the advertising industry frequently play a dominant role in shaping the structure of content, and the creation of media environments favorable for advertising goods and services becomes more and more common.

TEXTBLOCK 2/10 // URL: http://world-information.org/wio/infostructure/100437611652/100438657942
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers because the technical difficulties in measuring Internet growth or usage make reliable measurement techniques very hard to come by.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers who state otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock subsequently worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. Measurement was considered important for two reasons. First, it would be critical for the Internet's future development, evolution and deployment planning. Second, Internet-wide measurement activities have the potential to interfere with normal operation and must therefore be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends and to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the National Science Foundation handed its stewardship role over the Internet to a competitive industry (bluntly put: the Internet was privatized), leaving no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko also consider the long-distance private line networks - the corporate networks, the Intranets - because they are convinced (that is, their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, while the private line networks are considerably larger in aggregate capacity than the Internet: with an effective bandwidth of about 330 Gbps as of December 1997 they are about as large as the voice network in the U.S., though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications; both consume more bandwidth and are the reason for unanticipated amounts of data traffic.
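
The claim about the crossover year can be made concrete with a small back-of-the-envelope calculation. The sketch below is illustrative only: the starting traffic volumes and the growth rate assumed for voice traffic are placeholder assumptions, not figures taken from Coffman and Odlyzko; only the roughly 100% annual growth of Internet traffic comes from the text above.

```python
# Back-of-the-envelope sketch: when does data traffic overtake voice traffic
# if Internet traffic keeps growing at roughly 100% per year?
# Starting volumes and the voice growth rate are illustrative assumptions.

voice_traffic = 12.0      # assumed starting volume (arbitrary units), end of 1997
internet_traffic = 1.0    # assumed to be a small fraction of voice traffic
voice_growth = 0.10       # assumed 10% annual growth of voice traffic
internet_growth = 1.00    # ~100% annual growth, as cited above

year = 1997
while internet_traffic < voice_traffic:
    year += 1
    voice_traffic *= 1 + voice_growth
    internet_traffic *= 1 + internet_growth

print(f"With these assumptions, data traffic overtakes voice around {year}.")
```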

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
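
The sample-and-project step is simple enough to sketch in a few lines. The code below is only an illustration of the general idea, not the ISC's actual survey code; it assumes a system ping command is available (flags as on Linux), and the addresses in the usage comment are placeholders.

```python
# Minimal sketch of the sample-and-project idea behind host counting:
# ping a small random sample of assigned addresses and extrapolate.
# Illustration only, not the ISC's actual survey methodology.
import random
import subprocess

def is_reachable(address, timeout_s=1):
    """Return True if a single ping to the address gets a reply (Linux ping flags)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), address],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def estimate_pingable_hosts(assigned_addresses, sample_fraction=0.01):
    """Ping a 1% sample and project the result onto all assigned addresses."""
    sample_size = max(1, int(len(assigned_addresses) * sample_fraction))
    sample = random.sample(assigned_addresses, sample_size)
    reachable = sum(is_reachable(addr) for addr in sample)
    return int(reachable / sample_size * len(assigned_addresses))

# Usage (placeholder addresses):
# print(estimate_pingable_hosts(["192.0.2.1", "192.0.2.2", "198.51.100.7"]))
```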

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e.g.) and to compare response times to past ones and to the response times of servers within the same reach.
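
A crude version of such a weather index can be sketched as follows. The scoring rule here is an assumption made for the sake of illustration; it is not the formula actually used by The Matrix or the Internet Traffic Report.

```python
# Crude sketch of an "Internet weather" index: compare current ping
# round-trip times to a historical baseline and map the result to 0-100
# (higher = faster and more reliable). The scoring rule is illustrative only.

def weather_index(current_rtts_ms, baseline_rtt_ms):
    """current_rtts_ms: recent round-trip times (None = lost probe);
    baseline_rtt_ms: historical mean round-trip time for this server."""
    replies = [rtt for rtt in current_rtts_ms if rtt is not None]
    if not replies:
        return 0  # no replies at all: worst possible weather
    loss_penalty = 1 - len(replies) / len(current_rtts_ms)
    slowdown = (sum(replies) / len(replies)) / baseline_rtt_ms
    score = 100 / slowdown * (1 - loss_penalty)
    return max(0, min(100, round(score)))

# Example: three probes, one lost, current RTTs roughly at the baseline.
print(weather_index([80, None, 120], baseline_rtt_ms=100))  # -> about 67
```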

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others; and a user might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
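
The difference between hits and page views is easy to see in a server log. The sketch below counts both from an Apache-style access log; the log format and the decision to count only HTML documents and directory requests as pages are simplifying assumptions.

```python
# Sketch: counting "hits" (every requested file) versus "page views"
# (HTML documents only) from an Apache-style access log.
# The log format and the definition of "page" are simplifying assumptions.
import re

request_re = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+"')

def hits_and_page_views(log_lines):
    hits = page_views = 0
    for line in log_lines:
        match = request_re.search(line)
        if not match:
            continue
        hits += 1  # every requested file counts as a hit
        path = match.group("path").split("?")[0]
        if path.endswith((".html", ".htm", "/")):
            page_views += 1  # only documents count as page views
    return hits, page_views

sample_log = [
    '127.0.0.1 - - [01/Jan/2000:00:00:00 +0000] "GET /index.html HTTP/1.0" 200 512',
    '127.0.0.1 - - [01/Jan/2000:00:00:01 +0000] "GET /logo1.gif HTTP/1.0" 200 100',
    '127.0.0.1 - - [01/Jan/2000:00:00:01 +0000] "GET /logo2.gif HTTP/1.0" 200 100',
]
print(hits_and_page_views(sample_log))  # -> (3, 1): three hits, one page view
```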

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: for example, the importance of a column to its readers. Readers may pick up a journal just for one particular column and ignore the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
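
Counting visits amounts to grouping individual page requests into sessions. A common, though itself imperfect, heuristic groups requests by IP address and starts a new visit after a period of inactivity; the sketch below uses a 30-minute timeout, which is an assumption, not a universal standard.

```python
# Sketch: counting "visits" by grouping requests per IP address and
# starting a new session after 30 minutes of inactivity. Both the grouping
# key (the IP address) and the timeout are simplifying assumptions.
SESSION_TIMEOUT = 30 * 60  # seconds

def count_visits(requests):
    """requests: iterable of (ip_address, unix_timestamp), in any order."""
    last_seen = {}
    visits = 0
    for ip, timestamp in sorted(requests, key=lambda r: r[1]):
        if ip not in last_seen or timestamp - last_seen[ip] > SESSION_TIMEOUT:
            visits += 1  # first request, or a long pause: a new visit
        last_seen[ip] = timestamp
    return visits

# Two users; the first returns after more than an hour, so three visits in total.
print(count_visits([("10.0.0.1", 0), ("10.0.0.1", 60),
                    ("10.0.0.2", 120), ("10.0.0.1", 4000)]))  # -> 3
```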
But there is a different reason for these registration services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.

For Fun

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative as well as Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.
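
At its core, such a mapping is an aggregation exercise: attach a location to each registered address block and add up the address space per area. The sketch below assumes registry records have already been geocoded to an area name; it is only an illustration, not Dodge and Shiode's actual procedure.

```python
# Sketch: measuring the "density" of registered address space per area.
# Assumes each registry record has already been geocoded to an area name;
# this is an illustration, not Dodge and Shiode's actual procedure.
from collections import Counter

def address_density(records):
    """records: iterable of (area_name, number_of_addresses_in_block)."""
    totals = Counter()
    for area, block_size in records:
        totals[area] += block_size
    return totals.most_common()

# Placeholder data: three /24 blocks (256 addresses each) and one /16 block.
print(address_density([("London", 256), ("London", 65536),
                       ("Manchester", 256), ("London", 256)]))
# -> [('London', 66048), ('Manchester', 256)]
```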





TEXTBLOCK 3/10 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
The Concept of the Public Sphere

According to social critic and philosopher Jürgen Habermas "public sphere" first of all means "... a domain of our social life in which such a thing as public opinion can be formed. Access to the public sphere is open in principle to all citizens. A portion of the public sphere is constituted in every conversation in which private persons come together to form a public. They are then acting neither as business or professional people conducting their private affairs, nor as legal consociates subject to the legal regulations of a state bureaucracy and obligated to obedience. Citizens act as a public when they deal with matters of general interest without being subject to coercion; thus with the guarantee that they may assemble and unite freely, and express and publicize their opinions freely."

The system of the public sphere is extremely complex, consisting of spatial and communicational publics of different sizes, which can overlap, exclude and cover, but also mutually influence each other. The public sphere is not something that just happens; it is also produced through social norms and rules, and channeled via the construction of spaces and the media. In the ideal situation the public sphere is transparent and accessible to all citizens, issues and opinions. For democratic societies the public sphere constitutes an extremely important element within the process of public opinion formation.

TEXTBLOCK 4/10 // URL: http://world-information.org/wio/infostructure/100437611734/100438658403
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and the corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three show a similar design: content is split up and spread over several servers, and when a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
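
The split-and-reassemble idea described above can be sketched in a few lines. This is only an illustration of the general design; the real systems add encryption, key splitting and anonymous routing, which are omitted here.

```python
# Minimal sketch of the split-and-reassemble idea: content is cut into
# pieces stored on different servers and put back together on request.
# This illustrates the general design only, not the actual Publius,
# Freenet or Free Haven protocols.

def split(content: bytes, n_servers: int):
    """Cut content into n_servers pieces (the last one may be shorter)."""
    size = -(-len(content) // n_servers)  # ceiling division
    return [content[i:i + size] for i in range(0, len(content), size)]

def reassemble(pieces):
    return b"".join(pieces)

document = b"Protecting speech on global data networks."
servers = split(document, 4)          # each piece would live on another server
assert reassemble(servers) == document
print([len(p) for p in servers])      # -> [11, 11, 11, 9]
```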

The most advanced system seems to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position of radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable? in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html .

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 5/10 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
Virtual cartels, introduction

Among the most striking developments of the 1990s has been the emergence of a global commercial media market utilizing new technologies and the global trend toward deregulation.
This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market.

"A global oligopolistic market that covers the spectrum of media is now crystallizing with very high barriers to entry."

(Robert McChesney, author of "Rich Media, Poor Democracy")

The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks only tolerates a small number of large players. Mergers, strategic alliances, partnerships and cooperation agreements are therefore the daily routine of the ICT business. They bypass competition and create "virtual cartels".

TEXTBLOCK 6/10 // URL: http://world-information.org/wio/infostructure/100437611709/100438658911
 
"Stealth Sites"

"Stealth sites" account for a particular form of hidden advertisement. Stealth sites look like magazines, nicely designed and featuring articles on different topics, but in reality are set up for the sole purpose of featuring a certain companies products and services. "About Wines" for example is a well-done online magazine, featuring articles on food and travel and also publishes articles on wine, which surprisingly all happen to be from Seagram.

TEXTBLOCK 7/10 // URL: http://world-information.org/wio/infostructure/100437611652/100438657995
 
Sponsorship Models

With new sponsorship models being developed, even further influence over content from the corporate side can be expected. Co-operating with Barnes & Noble Booksellers, the bookish e-zine FEED, for instance, relies in part on sponsoring. Whenever a specific title is mentioned in the editorial content, a link is placed in the margin - under the heading "Commerce" - to an appropriate page on Barnes & Noble. Steve Johnson, editor of FEED, says, "We do not take a cut of any merchandise sold through those links," but admits that the e-zine does indirectly profit from placing those links there.

TEXTBLOCK 8/10 // URL: http://world-information.org/wio/infostructure/100437611652/100438658034
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille, a card with holes through which a message is written onto the paper beneath. Afterwards he removes the card and fills in the blanks, so the message looks like an ordinary letter. The recipient needs to own the same card

- Bishop John Wilkins invents a cryptologic system that looks like music notes. In a book he describes several forms of steganographic systems such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken way of encryption that was already used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no-one becomes suspicious that the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine; he also works on the binary number system that, in much more advanced forms such as the ASCII code, is still used today

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which actually is not a code but an enciphered alphabet of short and long signals. The first Morse code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes become absolutely necessary from then on

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski publishes a cryptanalysis of the Vigenère cipher, which had been considered uncrackable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography once again and makes them even more important

TEXTBLOCK 9/10 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Beautiful bodies

However, artificial beings need not be invisible or look like Arnold Schwarzenegger in "Terminator". "My dream would be to create an artificial man that does not look like a robot but like a beautiful, graceful human being. The artificial man should be beautiful". Nadia Thalmann's hopes for beautiful robots may become reality in the work of MIRALab, a research laboratory attached to the University of Geneva dedicated to realistic modelling of human functionalities. The laboratory has produced an artificial Marilyn Monroe showing just how beautiful artificial creatures can be, and there is a biography featuring details of her career and her - however virtual - love life. Yet beautiful creatures have been made before, at least on the movie screen. Frank-N-Furter, the protagonist of the Rocky Horror Picture Show ("I've been making a man / with blond hair and a tan / and he is good for relieving my / tension"), did set remarkable aesthetic standards.

While in Hindu mythology avatars are bodies chosen by gods for their representation on earth, often animals such as swans or horses, the avatars populating cyberspace have a different function. They are the cyber bodies of real people, often three-dimensional images of creatures whose aesthetics reflect both the tastes prevalent in the entertainment and advertising industries and the state of the art in visual representation.

TEXTBLOCK 10/10 // URL: http://world-information.org/wio/infostructure/100437611777/100438658861
 
Calculator

Calculators are machines for automatically performing arithmetical operations and certain mathematical functions. Modern calculators are descendants of a digital arithmetic machine devised by Blaise Pascal in 1642. Later in the 17th century, Gottfried Wilhelm von Leibniz created a more advanced machine, and, especially in the late 19th century, inventors produced calculating machines that were smaller and smaller and less and less laborious to use.

INDEXCARD, 1/12
 
Roman smoke telegraph network, 150 A.D.

The Roman smoke signals network consisted of towers within visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.

For a similar telegraph network in ancient Greece see Aeneas Tacitus' optical communication system.

INDEXCARD, 2/12
 
Amazon.Com

Amazon.Com was one of the first online bookstores. With thousands of books, CDs and videos ordered via the Internet every year, Amazon.Com probably is the most successful Internet bookstore.

INDEXCARD, 3/12
 
Human Genome Project

The Human Genome Project is an international collaborative research project that aims to map the human genome. Its goal is to identify the 100,000 genes of the human DNA as well as to sequence the 3 billion chemical base pairs that make up the DNA. The HGP is designed on an open source basis, i.e. the information that is obtained and stored in databases should, in principle, be available to researchers and businesses all over the world. However, the HGP's work has been challenged by private businesses such as Celera, whose objective is the private exploitation of genome information.

INDEXCARD, 4/12
 
Gottfried Wilhelm von Leibniz

b. July 1, 1646, Leipzig
d. November 14, 1716, Hannover, Hanover

German philosopher, mathematician, and political adviser, important both as a metaphysician and as a logician and distinguished also for his independent invention of the differential and integral calculus. In 1661 he entered the University of Leipzig as a law student; there he came into contact with the thought of men who had revolutionized science and philosophy, men such as Galileo, Francis Bacon, Thomas Hobbes, and René Descartes. In 1666 he wrote De Arte Combinatoria ("On the Art of Combination"), in which he formulated a model that is the theoretical ancestor of some modern computers.

INDEXCARD, 5/12
 
Internet Research Task Force

Operating under the umbrella of the Internet Society, the Internet Research Task Force is itself an umbrella organization of small research groups working on topics related to Internet protocols, applications, architecture and technology. It is governed by the Internet Research Steering Group.

http://www.irtf.org

INDEXCARD, 6/12
 
Internet Engineering Steering Group

On behalf of the Internet Society, the Internet Engineering Steering Group is responsible for the technical management of the evolution of the architecture, the standards and the protocols of the Net.

http://www.ietf.org/iesg.html

INDEXCARD, 7/12
 
John Dee

b. July 13, 1527, London, England
d. December 1608, Mortlake, Surrey

English alchemist, astrologer, and mathematician who contributed greatly to the revival of interest in mathematics in England. After lecturing and studying on the European continent between 1547 and 1550, Dee returned to England in 1551 and was granted a pension by the government. He became astrologer to the queen, Mary Tudor, and shortly thereafter was imprisoned for being a magician but was released in 1555. Dee later toured Poland and Bohemia (1583-89), giving exhibitions of magic at the courts of various princes. He became warden of Manchester College in 1595.

INDEXCARD, 8/12
 
WIPO

The World Intellectual Property Organization is one of the specialized agencies of the United Nations (UN). It was designed to promote the worldwide protection of both industrial property (inventions, trademarks, and designs) and copyrighted materials (literary, musical, photographic, and other artistic works), and was established by a convention signed in Stockholm in 1967 which came into force in 1970. The aims of WIPO are threefold. First, through international cooperation, WIPO promotes the protection of intellectual property. Second, the organization supervises administrative cooperation between the Paris, Berne, and other intellectual property unions regarding agreements on trademarks, patents, and the protection of artistic and literary work. Third, through its registration activities WIPO provides direct services to applicants for, or owners of, industrial property rights.

INDEXCARD, 9/12
 
Internet Societal Task Force

The Internet Societal Task Force is an organization under the umbrella of the Internet Society dedicated to assuring that the Internet is for everyone by identifying and characterizing social and economic issues associated with the growth and use of the Internet. It supplements the technical tasks of the Internet Architecture Board, the Internet Engineering Steering Group and the Internet Engineering Task Force.

Topics under discussion include social, economic, regulatory and physical barriers to the use of the Net, privacy, the interdependence of Internet penetration rates and economic conditions, regulation, and taxation.

http://www.istf.isoc.org/

INDEXCARD, 10/12
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
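
As a minimal illustration of the idea, a cache can be sketched as a local lookup table placed in front of a slower data source; real caches (browser caches, proxy servers) add expiry rules and size limits, which are omitted here.

```python
# Minimal sketch of the caching idea: keep a local copy of data that is
# expensive to fetch, and return the copy on repeated requests.
# Real caches add expiry and size limits, which are omitted here.
cache = {}

def fetch_with_cache(key, fetch_from_origin):
    """fetch_from_origin: a function that retrieves the data the slow way."""
    if key not in cache:
        cache[key] = fetch_from_origin(key)  # slow path, done only once
    return cache[key]                        # fast path on every later request

# Usage: the "origin" is consulted once; the second request is served locally.
print(fetch_with_cache("/index.html", lambda k: f"contents of {k}"))
print(fetch_with_cache("/index.html", lambda k: f"contents of {k}"))
```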

INDEXCARD, 11/12
 
Critical Art Ensemble

Critical Art Ensemble is a collective of five artists of various specializations dedicated to exploring the intersections between art, technology, radical politics, and critical theory. CAE have published a number of books and carried out innovative art projects containing insightful and ironic theoretical contributions to media art. Projects include Addictionmania, Useless Technology, The Therapeutic State, Diseases of Consciousness, Machineworld, As Above So Below, and Flesh Machine.

http://www.critical-art.net

INDEXCARD, 12/12