Advertising and the Media System

Media systems (especially broadcasting systems) can be classified into two main types:

Public Media Systems: Government control over broadcasting through ownership, regulation, and partial funding of public broadcasting services.

Private Media Systems: Ownership and control lie in the hands of private companies and shareholders.

Both systems can exist in various forms, according to the degree of control by governments and private companies, with mixed systems (public and private) as the third main kind.

Whereas public media systems are usually at least partially funded by governments, private broadcasting relies solely on advertising revenue. Still, even public media systems cannot exclude advertising as a source of revenue. Both types are therefore, to a certain degree, dependent on money coming in from advertisers.

This has consequences for the content provided by the media. As attracting advertisers becomes critically important, the interests of the advertising industry frequently play a dominant role in how content is structured, and the creation of environments within the media that are favorable for advertising goods and services becomes more and more common.

TEXTBLOCK 1/25 // URL: http://world-information.org/wio/infostructure/100437611652/100438657942
 
Virtual cartels, oligopolistic structures

Global networks require global technical standards ensuring the compatibility of systems. Being able to define such standards makes a corporation extremely powerful. And it requires the suspension of competitive practices. Competition is relegated to the symbolic realm. Diversity and pluralism become the victims of the globalisation of baroque sameness.

The ICT market is dominated by imperfect competition aimed at short-term market domination. In a very short time, new ideas can turn into best-selling technologies. Innovation cycles are extremely short. But today's state-of-the-art products are embryonic trash.

- According to the Computer and Communications Industry Association, Microsoft is trying aggressively to take over the network market. Through its investment in AT&T, this would mean control of 70% of all long distance phone calls and 60% of cable connections.

- AOL and Yahoo are the undisputed leaders in the provider market. AOL has 21 million subscribers in 100 countries. In a single month, AOL registers 94 million visits. Two thirds of all US internet users visited Yahoo in December 1999.

- The world's 13 biggest internet providers are all American.

- AOL and Microsoft have concluded a strategic cross-promotion deal: in the US, the AOL icon is installed on every Windows desktop. AOL has also concluded a strategic alliance with Coca-Cola.


TEXTBLOCK 2/25 // URL: http://world-information.org/wio/infostructure/100437611709/100438658963
 
What is the Internet?

Any definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, and universities. Generally speaking, every time computer networks are connected to one another, a new internet is created. Technically speaking, the Internet is a wide area network (WAN) that may be connected to local area networks (LANs).

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different from the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not constituted solely by computers connected to other computers: there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components that are connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than to say what it is. On October 24, 1995 the U.S. Federal Networking Council made the following resolution concerning the definition of the term "Internet": "Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally, and in a simplifying manner, called the Internet may be better referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: the USENET, corporate networks and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means for making information easily accessible worldwide, for sending and receiving messages, videos, texts and audio files, for transferring funds and trading securities, for sharing resources, for collecting weather condition data, for trailing the movements of elephants, for playing games online, for video conferencing, for distance learning, for virtual exhibitions, for jamming with other musicians, for long distance ordering, for auctions, for tracking packaged goods, for doing business, for chatting, and for remote access to computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving, and storing data and for sharing equipment over long distances, even in real time, provided the telecommunication infrastructure is reliable and of high quality.

For a comprehensive view of uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace".

TEXTBLOCK 3/25 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille: a card with holes, through which the message is written onto the paper beneath. Afterwards he removes the card and fills in the blanks, so the message looks like an ordinary letter. The recipient needs to own an identical card

- Bishop John Wilkins invents a cryptologic system that looks like music notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions so-called Pig Latin, a spoken form of encryption that was already used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one suspects that the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary system, which - in far more advanced form, as in the ASCII code - we still use today

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which is actually not a code but an enciphered alphabet of short and long signals. The first Morse-code message is sent by telegraph in 1844.
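
The distinction matters: a code replaces whole words or phrases, whereas Morse merely substitutes each letter with a fixed pattern of short and long signals. A minimal sketch in Python (alphabet truncated for brevity):

```python
# Minimal sketch: Morse as an enciphered alphabet, not a code -
# each letter maps to a fixed pattern of short and long signals.
MORSE = {"E": ".", "M": "--", "O": "---", "R": ".-.", "S": "..."}  # excerpt only

def to_morse(text: str) -> str:
    return " ".join(MORSE[ch] for ch in text.upper())

print(to_morse("SOS"))     # ... --- ...
print(to_morse("MORSE"))   # -- --- .-. ... .
```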

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography considerably, as codes become absolutely necessary from then on

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski does a cryptanalysis of the Vigenère ciphers, which had long been supposed to be unbreakable
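
The Vigenère cipher shifts each letter by the corresponding letter of a repeating key, and it is precisely this repetition that Kasiski's analysis exploits. A minimal sketch (A-Z only, simplified for illustration; not a historical reconstruction):

```python
# Minimal sketch of the Vigenère cipher. Each plaintext letter is
# shifted by the corresponding key letter; the key repeats, which is
# exactly the regularity Kasiski's analysis exploited.

def vigenere(text: str, key: str, decrypt: bool = False) -> str:
    result = []
    for i, ch in enumerate(text.upper()):
        if not ch.isalpha():
            result.append(ch)
            continue
        shift = ord(key[i % len(key)].upper()) - ord("A")
        if decrypt:
            shift = -shift
        result.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
    return "".join(result)

ciphertext = vigenere("ATTACKATDAWN", "LEMON")
print(ciphertext)                                    # LXFOPVEFRNHR
print(vigenere(ciphertext, "LEMON", decrypt=True))   # ATTACKATDAWN
```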

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography again and makes them even more important

TEXTBLOCK 4/25 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Biometrics applications: physical access

This is the largest area of application of biometric technologies, and the one with the most direct lineage to the feudal gate-keeping system. Initially used mainly in military and other "high security" territories, physical access control by biometric technology is spreading into a much wider field of application. Biometric access control technologies are already being used in schools, supermarkets, hospitals and commercial centres, where they are used to manage the flow of personnel.

Biometric technologies are also used to control access to political territory, as in immigration (airports, Mexico-USA border crossing). In this case, they can be coupled with camera surveillance systems and artificial intelligence in order to identify potential suspects at unmanned border crossings. Examples of such uses in remote video inspection systems can be found at http://www.eds-ms.com/acsd/RVIS.htm

A gate-keeping system for airports relying on digital fingerprints and hand geometry is described at http://www.eds-ms.com/acsd/INSPASS.htm. This is another technology which allows the separation of "low risk" travellers from "other" travellers.

An electronic reconstruction of feudal gate keeping capable of singling out high-risk travellers from the rest is already applied at various border crossing points in the USA. "All enrolees are compared against national lookout databases on a daily basis to ensure that individuals remain low risk". As a side benefit, the economy of time generated by the inspection system has meant that "drug seizures ... have increased since Inspectors are able to spend more time evaluating higher risk vehicles".

However, biometric access control can not only prevent people from gaining access to a territory or building; it can also prevent them from getting out of buildings, as in the case of prisons.

TEXTBLOCK 5/25 // URL: http://world-information.org/wio/infostructure/100437611729/100438658838
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking in size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with the advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information by using computers. Decisive for the success of computers in business were the stored program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, that gave them the flexibility to be cost effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 6/25 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Basics: Infringement and Fair Use

The rights of a copyright holder are infringed when one of the acts requiring the authorization of the owner is done by someone else without his consent. In the case of copyright infringement or the violation of neighboring rights the remedies for the copyright owner consist of civil redress. The unauthorized copying of protected works for commercial purposes and the unauthorized commercial dealing in copied material is usually referred to as "piracy".

Yet copyright laws also provide that the rights of copyright owners are subject to the doctrine of "fair use". This allows the reproduction and use of a work, notwithstanding the rights of the author, for limited purposes such as criticism, comment, news reporting, teaching, and research. Fair use may be described as the privilege to use copyrighted material in a reasonable manner without the owner's consent. To determine whether a use is fair or not, most copyright laws consider:

- the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes (usually certain types of educational copying are allowed)

- the nature of the copyrighted work (mostly originals made for commercial reasons are less protected than their purely artistic counterparts)

- the amount and substantiality of the portion used in relation to the copyrighted work as a whole

- the effect of the use upon the potential market for or value of the copyrighted work (as a general rule copying may be permitted if it is unlikely to cause economic harm to the original author)

Examples of activities that may be excused as fair use include: providing a quotation in a book review; distributing copies of a section of an article in class for educational purposes; and imitating a work for the purpose of parody or social commentary.

TEXTBLOCK 7/25 // URL: http://world-information.org/wio/infostructure/100437611725/100438659569
 
Identification in history

In biometric technology, the subject is reduced to its physical and therefore inseparable properties. The subject is a subject in so far as it is objectified; that is, in so far as it is identified with its own res extensa, Descartes' "extended thing". The subject exists in so far as it can be objectified; if it resists the objectification that comes with measurement, it is rejected or punished. Biometrics therefore provides the ultimate tool for control; in it, the dream of hermetic identity control seems to become reality, a modern technological reconstruction of traditional identification techniques such as the handshake or the look into somebody's eyes.

The use of identification by states and other institutions of authority is evidently not simply a modern phenomenon. The ancient Babylonians and Chinese already made use of finger printing on clay to identify authors of documents, while the Romans already systematically compared handwritings.

Body measurement has long been used by the military. One of the first measures after entering the military is the identification and appropriation of a soldier's body measurements. These measurements are filed, combined with other data, and make up what today we would call the soldier's data body. With his data body in the possession of the authority, a soldier is no longer able to socialise freely and is instead dependent on the disciplinary structure of the military institution. The soldier's social being in the world is defined by the military institution.

However, the military and civilian spheres of modern societies are no longer distinct entities. The very ambivalence of advanced technology (dual-use technologies) has meant that "good" and "bad" uses of technology can no longer be clearly distinguished. The measurement of physical properties and the creation of data bodies is therefore no longer a military prerogative; it has become diffused into all areas of modern societies.

If the emancipatory potential of weak identities is to be of use, it is therefore necessary to know how biometric technologies work and what uses they are put to.

TEXTBLOCK 8/25 // URL: http://world-information.org/wio/infostructure/100437611729/100438658096
 
Biometric applications: surveillance

Biometric technologies are not surveillance technologies in themselves, but as identification technologies they provide an input into surveillance systems, as when technologies such as face recognition are combined with camera systems and criminal databases in order to supervise public places and single out individuals.

Another example is the use of biometric technologies in the supervision of probationers, who in this way carry their special hybrid status between imprisonment and freedom with them, so that they can be tracked down easily.

Unlike biometric applications in access control, where one is aware of the biometric data extraction process, what makes biometrics used in surveillance a particularly critical issue is the fact that biometric samples are extracted routinely, unnoticed by the individuals concerned.

TEXTBLOCK 9/25 // URL: http://world-information.org/wio/infostructure/100437611729/100438658740
 
Global data bodies - intro

- Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity ...

Critical Art Ensemble

Global data bodies

1. Introduction

Informatisation has meant that things that once were "real", i.e. whose existence could be experienced sensually, are becoming virtual. Instead of the real existence of a thing, the virtual refers to its possibility of existence. As this process advances, an increasing identification of the possible with the real occurs. Reality migrates into a dim and dematerialised grey area. In the end, the possible counts as the real: virtualisation creates an "as-if" experience.

The experience of the body is also affected by this process. In bio-technology, for example, the human body and its functions are digitised, which promotes an understanding of the body exclusively in terms of its potential manipulation: the body becomes whatever it could be. But digitisation has not only affected the understanding and the social significance of the body; it has also altered the meaning of presence, traditionally identified with the body. The advance of information and communication technologies (ICTs) has meant that for an increasing number of activities we no longer need to be physically present; our "virtual" presence, achieved by logging onto an electronic information network, is sufficient.

This development, trumpeted as the pinnacle of convenience by the ICT industries and by governments interested in attracting investment, has deeply problematic aspects as well. For example, when it is no longer "necessary" to be physically present, it may soon no longer be possible or allowed. Online banking, offered to customers as a convenience, also serves as a justification for charging higher fees to those unwilling or unable to add banking to their household chores. Online public administration may be expected to lead to similar effects. The reason for this is that the digitalisation of the economy relies on the production of surplus data. Data have become the most important raw material of modern economies.

In modern economies, informatisation and virtualisation mean that people are structurally forced to carry out their business and live their lives in such a way as to generate data.

Data are the most important resource for the New Economy. By contrast, activities which do not leave behind a trace of data, such as growing your own carrots or paying cash rather than by plastic card, are discouraged and structurally suppressed.

TEXTBLOCK 10/25 // URL: http://world-information.org/wio/infostructure/100437611761/100438659649
 
Enforcement: Copyright Management and Control Technologies

With the increased ease of reproducing and transmitting unauthorized copies of digital works over electronic networks, concerns have arisen among the copyright holder community. They fear a further growth of copyright piracy and demand adequate protection for their works. One development, which started in the mid-1990s and addresses the copyright owners' apprehensions, is the creation of copyright management systems. Technological protection of their works, the copyright industry argues, is necessary to prevent widespread infringement, thus giving them the incentive to make their works available online. In their view the ideal technology should be "capable of detecting, preventing, and counting a wide range of operations, including open, print, export, copying, modifying, excerpting, and so on." Additionally such systems could be used to maintain "records indicating which permissions have actually been granted and to whom".
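
To make these quoted requirements concrete, the record-keeping such a system performs might be sketched as follows (a hypothetical illustration only; all names and fields are invented here, and actual copyright management systems are proprietary and far more elaborate):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the record-keeping side of a copyright
# management system: which operations a licensee may perform on a
# work, plus a count of the operations actually performed.

@dataclass
class Permission:
    licensee: str
    allowed_ops: set                          # e.g. {"open", "print"}
    usage_count: dict = field(default_factory=dict)

    def request(self, op: str) -> bool:
        """Grant or deny an operation, and count it if granted."""
        if op not in self.allowed_ops:
            return False                      # e.g. "copy" or "export" denied
        self.usage_count[op] = self.usage_count.get(op, 0) + 1
        return True

p = Permission(licensee="library-42", allowed_ops={"open", "print"})
print(p.request("open"))   # True, and the operation is counted
print(p.request("copy"))   # False, not permitted
print(p.usage_count)       # {'open': 1}
```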

TEXTBLOCK 11/25 // URL: http://world-information.org/wio/infostructure/100437611725/100438659674
 
World War II ...

Never before had propaganda been as important as in the Second World War. From then on, education became one more field of propaganda: its purpose was to teach how to think, while pure propaganda was supposed to show what to think.
Every nation founded at least one ministry of propaganda - without, of course, calling it that. The British called theirs the Ministry of Information (= MOI), the U.S. distinguished between the Office of Strategic Services (= OSS) and the Office of War Information (= OWI), the Germans created the Ministry of Public Enlightenment and Propaganda (= RMVP), and the Japanese called their disinformation and propaganda campaign the "Thought War".
British censorship was so strict that the text of an ordinary propaganda leaflet, which had been dropped from planes several million times, was not given to a journalist who asked for it.

Atrocity stories were no longer used in the same way as in the First World War. Instead, black propaganda was preferred, especially to separate the Germans from their leaders.
German war propaganda had started long before the war. In the middle of the 1930s Leni Riefenstahl filmed Hitler's best propaganda movies. For the most famous one, "Triumph of the Will" (1935), she was the only professional filmmaker allowed to take close-up pictures of him.

Some of those pictures of fear, hatred and intolerance still exist in people's heads. Judged by this, propaganda did a good job; unfortunately, it was the anti-National Socialist propaganda that failed at the time.

TEXTBLOCK 12/25 // URL: http://world-information.org/wio/infostructure/100437611661/100438658610
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there are server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operation has long depended on global networking.

At the top are credit card companies such as Visa, American Express, Master Card, and others. It would be erroneous to believe that the primary business of these companies is the provision of credit and the facilitation of credit information for sale transactions. In fact, information means much more than just credit information. In a 1982 advertisement, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the life blood of society, as Bill Gates expresses it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within the data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimisation - for almost every decision taken. Data bunkers therefore have an essentially conservative political function, and serve to maintain and strengthen the given social structures.

TEXTBLOCK 13/25 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
2000 A.D.

2000
Convergence of telephony, audiovisual technologies and computing

Digital technologies are used to combine previously separate communication and media systems such as telephony, audiovisual technologies and computing into new services and technologies, thus forming extensions of existing communication systems and resulting in fundamentally new communication systems. This is what is meant by today's buzzwords "multimedia" and "convergence".

Classical dichotomies, such as that between computing and telephony, and traditional categorizations no longer apply, because these new services no longer fit the traditional categories.

Convergence and Regulatory Institutions

Digital technology permits the integration of telecommunications with computing and audiovisual technologies. New services that extend existing communication systems emerge. The convergence of communication and media systems corresponds to a convergence of corporations. Recently, America Online, the world's largest online service provider, merged with Time Warner, the world's largest media corporation. For such corporations the classical approach to regulation - separate institutions regulate separate markets - is no longer appropriate, because the institutions' activities necessarily overlap. The current challenges posed to these institutions are not solely due to the convergence of communication and media systems made possible by digital technologies; they are also due to the liberalization and internationalization of the electronic communications sector. For regulation to be successful, new categorizations and supranational agreements are needed.
For further information on this issue see Natascha Just and Michael Latzer, The European Policy Response to Convergence with Special Consideration of Competition Policy and Market Power Control, http://www.soe.oeaw.ac.at/workpap.htm or http://www.soe.oeaw.ac.at/WP01JustLatzer.doc.

TEXTBLOCK 14/25 // URL: http://world-information.org/wio/infostructure/100437611796/100438659802
 
Challenges for Copyright by ICT: Introduction

Traditional copyright and the practice of paying royalties to the creators of intellectual property emerged with the introduction of the printing press (1456). Early copyright law was therefore tailored to the technology of print and the (re)production of works in analogue form. Over the centuries, legislation concerning the protection of intellectual property has been adapted several times in order to respond to technological changes in the production and distribution of information.

Yet again, new technologies have altered the way (copyrighted) works are produced, copied, made obtainable and distributed. The emergence of global electronic networks and the increased availability of digitalized intellectual property confront existing copyright with a variety of questions and challenges. Neither the combination of several types of works within one larger work or on one data carrier, nor the digital format (although a recent development, it has been the object of detailed legal scrutiny), nor networking (telephone and cable networks have been in use for a long time, although they do not permit interactivity) is anything really new. What is a novel fact is that recent technologies allow the presentation and storage of text, sound and visual information in digital form, so that the entire information can be generated, altered and used by and on one and the same device, irrespective of whether it is provided online or offline.


TEXTBLOCK 15/25 // URL: http://world-information.org/wio/infostructure/100437611725/100438659517
 
Other biometric technologies

Other biometric technologies not specified here include ear recognition, signature dynamics, key stroke dynamics, vein pattern recognition, retinal scan, body odour recognition, and DNA recognition. These are technologies which are either in early stages of development or used in highly specialised and limited contexts.

TEXTBLOCK 16/25 // URL: http://world-information.org/wio/infostructure/100437611729/100438658399
 
Feeding the data body

TEXTBLOCK 17/25 // URL: http://world-information.org/wio/infostructure/100437611761/100438659644
 
Online data capturing

Hardly a firm today can afford not to engage in electronic commerce if it does not want to be swept out of business by competitors. "Information is everything" has become something like the Lord's Prayer of the New Economy. But how do you get information about your customers online? Who are the people who visit a website, where do they come from, what are they looking for? How much money do they have, what might they want to buy? These are key questions for a company doing electronic business. Obviously not all of this information can be obtained by monitoring the online behaviour of web users, but there are always little gimmicks that, when combined with common tracking technologies, can help to get more detailed information about a potential customer. These are usually online registration forms, either required for entry to a site, or competitions, sometimes a combination of the two. Obviously, if you want to win that weekend trip to New York, you will provide your contact details.

The most common way of obtaining information about a user online is the cookie. However, a cookie by itself is not sufficient to identify a user personally; it merely identifies the browser to the server by returning an identification value the server stored there earlier. Only combined with other data extraction techniques, such as online registration, can a user be identified personally ("Register now to get the full benefit of xy.com. It's free!").
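
As a minimal sketch of the mechanism (using the Python Flask micro-framework; the cookie name and the greetings are invented for illustration), a server hands the browser an identifier on the first visit and recognizes it on every later request:

```python
from uuid import uuid4
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    # Read the identifier the browser sends back on repeat visits.
    visitor_id = request.cookies.get("visitor_id")
    if visitor_id:
        return f"Welcome back, visitor {visitor_id}"
    # First visit: assign a new identifier and store it in a cookie.
    resp = make_response("Hello, new visitor")
    resp.set_cookie("visitor_id", uuid4().hex)
    return resp

# Note: the cookie identifies the browser, not the person; linking it
# to a name requires e.g. a registration form, as described above.
```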

But cookies record enough information to fine-tune advertising strategies according to a user's preferences and interests, e.g. by displaying certain commercial banners rather than others. For example, if a user is found to respond to a banner of a particular kind, he/she may find two of them at the next visit. Customizing the offers on a website to the particular user is part of one-to-one marketing, a type of direct marketing. But one-to-one marketing can go further than this: it can also offer different prices to different users. This was done by Amazon.com in September 2000, when first-time visitors were offered cheaper prices than regular customers.

One-to-one marketing can create very different realities that undermine traditional concepts of demand and supply. The ideal is a "frictionless market", where the differential between demand and supply is progressively eliminated. If a market is considered a structure within which demand/supply differentials are negotiated, this amounts to the abolition of the established notion of a market. Demand and supply converge; desire and its fulfilment coincide. In the end, there is profit without labour. However, such a structure is a hermetic structure of unfreedom.

It can only function when payment is substituted by credit, and the exploitation of labour power by the exploitation of data. In fact, in modern economies there is great pressure to increase spending on credit. Using credit cards and taking out loans generates a lot of data around a person's economic behaviour, while at the same time restricting the scope of social activity and increasing dependence. On the global level, the consequences of credit spirals can be observed in many of the developing countries that have had to abandon most of their political autonomy. As the data body economy advances, this is also the fate of people in western societies when they are structurally driven into credit spending. It shows that data bodies are not politically neutral.

The interrelation between data, profit and unfreedom is frequently overlooked by citizens and customers. Any company in a modern economy will apply data collecting strategies for profit, with dependence and unfreedom as a "secondary effect". The hunger for data has made IT companies eager to profit from e-business rather resourceful. "Getting to know the customer" is a catchphrase heard frequently, and it suggests that there are no limits to what a company may want to know about a customer. In large online shops such as amazon.com, where the customer's identity is accurately established by the practice of paying with credit cards and all business happens online, it is easy for the company to profile its customers accurately.

But there are more advanced and effective ways of identification. The German company Sevenval has developed a way of customer tracking which works with "virtual domains". Every visitor to a website is assigned a 33-digit identification number which the browser understands as part of the www address, which will then read something like http://XCF49BEB7E97C00A328BF562BAAC75FB2.sevenval.com. This tracking method, which Sevenval advertises as a revolutionary method capable of tracking the exact and complete path of a user on a website, therefore cannot simply be switched off. In addition, the method makes it possible for the identity of a user to travel with him/her when visiting one of the other companies linked to the site in question. As in the case of cookies, this tracking method is by itself not sufficient to identify a user personally; such an identification only occurs once a customer pays with a credit card, decides to participate in a draw, or voluntarily completes a registration form.
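
The general mechanism can be sketched as follows (a simplified guess at the scheme described above; the identifier format and the domain are invented for illustration): the server assigns each new visitor an identifier and embeds it in the hostname of every link it serves, so the identifier comes back with each request even when cookies are disabled.

```python
from uuid import uuid4

# Sketch of URL-based tracking: the visitor ID becomes part of the
# hostname, so every subsequent request carries it back to the server.
# (Identifier format and domain invented for illustration.)

def tracked_url(path: str, visitor_id: str, base_domain: str = "example.com") -> str:
    return f"http://{visitor_id}.{base_domain}{path}"

visitor_id = uuid4().hex.upper()
print(tracked_url("/products/index.html", visitor_id))
# e.g. http://9AD2...F1.example.com/products/index.html
```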

But there are much less friendly ways of extracting data from a user and feeding the data body. Less friendly means: these methods monitor users in situations where the latter are likely not to want to be monitored. Monitoring therefore takes place in a concealed manner. One of these monitoring methods is the so-called web bug. Web bugs are tiny graphics, no more than 1 x 1 pixel in size and therefore invisible on a screen, capable of monitoring an unsuspecting user's e-mails or movements on a website. Leading corporations such as Barnes and Noble, eToys, Cooking.com, and Microsoft have all used web bugs in advertising campaigns. Richard Smith has compiled a web bug FAQ site that contains detailed information and examples of web bugs in use.
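
A minimal sketch of how the server side of such a web bug might work, again using Flask (the endpoint name is invented, and the "logging" is reduced to a print statement): the page embeds an invisible 1 x 1 image whose URL points at the tracker, and every browser that renders the page notifies the tracker's server.

```python
import io
from flask import Flask, request, send_file

app = Flask(__name__)

# A transparent 1x1 GIF: the classic "web bug".
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

@app.route("/bug.gif")
def bug():
    # Every page (or HTML e-mail) embedding
    #   <img src="http://tracker.example.com/bug.gif" width=1 height=1>
    # triggers this request; the tracker learns the visitor's IP address,
    # the time, and the page that embedded the image (the referer).
    print(request.remote_addr, request.headers.get("Referer"))
    return send_file(io.BytesIO(PIXEL), mimetype="image/gif")
```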

Bugs monitoring users have also been packaged in seemingly harmless toys made available on the Internet. For example, Comet Systems offers cursor images which have been shown to collect user data and send them back to the company's server. These little images replace the customary white arrow of the mouse with a little image of a baseball, a cat, a UFO, etc., large enough to carry a bug collecting user information. The technology is offered as a marketing tool to companies looking for a "fun, new way to interact with their audience".

The cursor image technology relies on what is called a GUID (globally unique identifier). This is an identification number assigned to a customer at the time of registration, or when downloading a product. Many in the online community were alarmed when, in 1999, it was discovered that Microsoft assigned GUIDs without its customers' knowledge. Following protests, the company was forced to change the registration procedure, assuring that under no circumstances would these identification numbers be used for tracking or marketing.

However, in the meantime another possible infringement of user anonymity by Microsoft was discovered, when it was found out that MS Office documents, such as Word, Excel or PowerPoint files, contain a bug that is capable of tracking the documents as they are sent through the net. The bug sends information about the user who opens the document back to the originating server. A document that contains the bug can be tracked across the globe, through thousands of stopovers. A detailed description of the bug and how it works can be found at the Privacy Foundation's website. Also, there is an example of such a bug at the Privacy Center of the University of Denver.

Of course there are many other ways of collecting users' data and creating and appropriating data bodies which can then be used for economic purposes. Indeed, as Bill Gates commented, "information is the lifeblood of business". The electronic information networks are becoming the new frontier of capitalism.

TEXTBLOCK 18/25 // URL: http://world-information.org/wio/infostructure/100437611761/100438659686
 
Individualized Audience Targeting

New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements" - probably the ultimate tool for advertisers. Although not yet widely used, companies like Amazon.com have already started to exploit individualized audience targeting for their purposes.
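
Reduced to its core, such targeting might look like the following sketch (a deliberately naive illustration; the ads, keywords and profile are invented, and real profiling agents weigh far more signals than keyword overlap):

```python
# Naive sketch of one-to-one ad targeting: score each available ad
# against the interests recorded in a user's profile and serve the
# ad with the largest keyword overlap.

ads = [
    {"id": "ad-books",  "keywords": {"books", "reading"}},
    {"id": "ad-travel", "keywords": {"travel", "hotels", "flights"}},
    {"id": "ad-sports", "keywords": {"football", "fitness"}},
]

def select_ad(profile: set) -> str:
    scored = [(len(ad["keywords"] & profile), ad["id"]) for ad in ads]
    return max(scored)[1]   # best-matching ad wins

print(select_ad({"reading", "travel", "hotels"}))   # ad-travel
```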

TEXTBLOCK 19/25 // URL: http://world-information.org/wio/infostructure/100437611652/100438658450
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of networks have to agree on. So there is no way to ever gain ultimate control of the Matrix/Internet.
The in some respects decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and common standards for sustaining basic operations: there are, for example, authorities for IP number and domain name registration.
Over time, the organizational structures for Internet administration have changed according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies make decisions on common guidelines, such as communication protocols, cooperatively, so that compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consensus about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just state the prevailing consensus explicitly. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulation, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.
For a detailed report on Internet governance, click here.

TEXTBLOCK 20/25 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Legal Protection: WIPO (World Intellectual Property Organization)

Presumably the major player in the field of international intellectual property protection and administrator of various multilateral treaties dealing with the legal and administrative aspects of intellectual property is the WIPO.

Information on WIPO administered agreements in the field of industrial property (Paris Convention for the Protection of Industrial Property (1883), Madrid Agreement Concerning the International Registration of Marks (1891) etc.) can be found on: http://www.wipo.org/eng/general/index3.htm

Information on treaties concerning copyright and neighboring rights (Berne Convention for the Protection of Literary and Artistic Works (1886) etc.) is published on: http://www.wipo.org/eng/general/index5.htm

The most recent multilateral agreement on copyright is the 1996 WIPO Copyright Treaty. Among other things it provides that computer programs are protected as literary works and also introduces the protection of databases, which "... by reason of the selection or arrangement of their content constitute intellectual creations." Furthermore the 1996 WIPO Copyright Treaty contains provisions concerning technological measures, rights management information and establishes a new "right of communication to the public". It is available on: http://www.wipo.org/eng/diplconf/distrib/treaty01.htm

TEXTBLOCK 21/25 // URL: http://world-information.org/wio/infostructure/100437611725/100438659588
 
Challenges for Copyright by ICT: Digital Content Providers

Providers of digital information might be confronted with copyright-related problems when using some of the special features of hypertext media like frames and hyperlinks (which both use third-party content available on the Internet to enhance a webpage or CD-ROM), or when they operate a search engine or online directory on their website.

Framing

Frames are often used to help define, and navigate within, a content provider's website. Still, when they are used to present (copyrighted) third-party material from other sites, issues of passing off and misleading or deceptive conduct, as well as copyright infringement, immediately arise.

Hyperlinking

It is generally held that the mere creation of a hyperlink does not, of itself, infringe copyright, as the words indicating a link or the displayed URL are usually unlikely to be considered a "work". Nevertheless, if a link is clicked on, the user's browser will download a full copy of the material at the linked address, creating a copy in the RAM of his or her computer courtesy of the address supplied by the party that published the link. Although it is widely agreed that permission to download material over the link must be part of an implied license granted by the person who made the material available on the web in the first place, the scope of this implied license is still the subject of debate. Another option that has been discussed is to consider linking fair use.

Furthermore, hyperlinks and other "information location tools", like online directories or search engines, could cause their operators trouble if they refer or link users to a site that contains infringing material. In this case it is as yet unclear whether providers can be held liable for infringement.

TEXTBLOCK 22/25 // URL: http://world-information.org/wio/infostructure/100437611725/100438659590
 
Timeline 1900-1970 AD

1913 the wheel cipher is re-invented as a strip cipher

1917 William Frederick Friedman starts working as a cryptanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptanalysis

- an AT&T employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random keys
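
Vernam's principle - combining every character of the message with a character of a random key - survives today as the XOR one-time pad. A minimal modern sketch (byte-wise XOR rather than Vernam's original teleprinter-tape implementation):

```python
import os

# Vernam's principle in modern form: XOR every byte of the message
# with a truly random key of the same length. Applying the same
# operation twice restores the plaintext. If the key is random, used
# only once and kept secret, the scheme is provably unbreakable.

def vernam(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "key must be at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"attack at dawn"
key = os.urandom(len(message))    # one-time random key
ciphertext = vernam(message, key)
print(vernam(ciphertext, key))    # b'attack at dawn'
```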

1918 the Germans start using the ADFGVX system, which is later broken by the French cryptanalyst Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German Military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine for the German Military

late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizebeth Smith Friedman regularly deciphers the codes of rum smugglers during Prohibition

1929 Lester S. Hill publishes his paper Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken first by the Polish mathematician Marian Rejewski and, from 1939, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese invent their so-called Purple machine, with the help of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they imagined their own codes to be unbreakable as well - and were not careful enough.

1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code-breaking computer, is put into action at Bletchley Park

1943-1980 the cryptographic Venona Project, run by the NSA, operates over a longer period than any other program of its type

1948 Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops a system of separate public and private keys

TEXTBLOCK 23/25 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage and numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques close to impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the forerunner of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stating that measurement of the Internet mattered for two reasons: first, measurement would be critical for future development, evolution and deployment planning; second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth and performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, while the private line networks are considerably larger in aggregate capacity than the Internet - about as large as the voice network in the U.S. (with an effective bandwidth of about 330 Gbps as of December 1997) - but carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, since both consume more bandwidth and account for unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey, the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanation see: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
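
The sampling arithmetic behind such a projection is simple; the following sketch assumes a Unix ping command and glosses over the timeouts, firewalls and silent-but-existing hosts that a real survey has to cope with, which is exactly why these numbers remain estimates:

```python
import random
import subprocess

def ping(addr: str) -> bool:
    """Send one echo request, as with the PING utility (Unix 'ping -c 1')."""
    result = subprocess.run(["ping", "-c", "1", "-W", "1", addr],
                            capture_output=True)
    return result.returncode == 0

def estimate_hosts(named_addresses: list, sample_rate: float = 0.01) -> int:
    # Ping only a 1% sample of all addresses that have names assigned,
    # then scale the result back up to the whole list (ISC-style).
    sample = random.sample(named_addresses,
                           max(1, int(len(named_addresses) * sample_rate)))
    responding = sum(1 for addr in sample if ping(addr))
    return round(responding * len(named_addresses) / len(sample))
```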

Internet Weather

Like daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence called Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts) and to compare response times with past ones and with response times of servers in the same reach.

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic you generate (though not automatically more visitors to your Web site).
In the meantime, page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others, or might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, as Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may pick up a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore constitute a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be certain that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly capture such data. As with registration services, this is where cookies come into play.
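How far apart these measures lie becomes obvious once you compute them. A minimal sketch that derives hits, page views, and visits from server log entries (the log format and the 30-minute session timeout are assumptions of this sketch, not a disclosed industry method):

    SESSION_TIMEOUT = 30 * 60   # assumption: a new visit after 30 idle minutes

    def count_metrics(entries):
        """entries: (client_ip, unix_timestamp, requested_path) tuples."""
        hits = len(entries)                                  # every file request
        page_views = sum(1 for _, _, path in entries
                         if path.endswith((".html", "/")))   # documents only
        visits, last_seen = 0, {}
        for ip, ts, _ in sorted(entries, key=lambda e: e[1]):
            if ip not in last_seen or ts - last_seen[ip] > SESSION_TIMEOUT:
                visits += 1                                  # new session for this address
            last_seen[ip] = ts
        return hits, page_views, visits

    entries = [("10.0.0.1", 0, "/index.html"),
               ("10.0.0.1", 1, "/logo.gif"),    # same document: extra hit, no page view
               ("10.0.0.1", 2, "/news.html"),
               ("10.0.0.2", 5, "/index.html")]  # second person, or one proxy serving many
    print(count_metrics(entries))               # -> (4, 3, 2)

Note how one small session already yields three different figures, and none of them says how many unique persons were actually reading.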

For Fun

If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative, as well as Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London, have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.
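The computation behind such maps is essentially an aggregation of registry records by location. A hedged sketch (the records and the postcode field are hypothetical stand-ins for RIPE ownership data):

    from collections import Counter

    # Hypothetical registry records: (network_prefix, address_count, postcode)
    records = [
        ("194.60.0.0/16", 65536, "EC1"),   # e.g. City of London
        ("193.32.0.0/18", 16384, "EC1"),
        ("194.80.0.0/17", 32768, "M1"),    # e.g. Manchester
    ]

    density = Counter()
    for _prefix, count, postcode in records:
        density[postcode] += count        # addresses registered per area

    for postcode, total in density.most_common():
        print(f"{postcode}: {total} registered addresses")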





TEXTBLOCK 24/25 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
0 - 1400 A.D.

150
A smoke-signal network covers the Roman Empire

The Roman smoke-signal network consisted of towers within visible range of each other and had a total length of about 4,500 kilometers. It was used for military signaling.
For a similar telegraph network in ancient Greece see Aeneas Tacitus' optical communication system.

About 750
In Japan block printing is used for the first time.

868
In China the world's first dated printed book, the Diamond Sutra, is produced.

1041-1048
In China movable type made from clay is invented.

1088
The first European medieval university is established in Bologna.

The first of the great medieval universities was established in Bologna. At the beginning, universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify one for an invitation to a university. Holding a lecture equaled reading a book aloud, just as a priest read from the Bible during services. Attending a lecture equaled copying the lecture word for word, so that you had your own copy of the book, which in turn enabled you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

TEXTBLOCK 25/25 // URL: http://world-information.org/wio/infostructure/100437611796/100438659702
 
Digital Subscriber Line (DSL)

DSL connections are high-speed data connections over copper telephone lines. As with cable connections, DSL lets you look up information on the Internet and make a phone call at the same time, and you do not need to have a new or additional cable or line installed. An earlier, related digital service over the same copper lines, though technically distinct from DSL, is ISDN (Integrated Services Digital Network; for more information click here ( http://www.britannica.com/bcom/eb/article/4/0,5716,129614+15,00.html )).

http://www.britannica.com/bcom/eb/article/4/0...
INDEXCARD, 1/34
 
Above.net

Headquartered in San Jose, USA, AboveNet Communications is a backbone service provider. Through its extensive peering relationships, the company has built a network with the largest aggregated bandwidth in the world.

http://www.above.net

INDEXCARD, 2/34
 
Chappe's fixed optical network

Claude Chappe built a fixed optical network between Paris and Lille. Covering a distance of about 240 kilometers, it consisted of fifteen towers with semaphores.

Because this communication system was intended for military use, the transmitted messages were encoded. They were kept so secret that even the operators who relayed them from tower to tower did not grasp their meaning; they simply passed on codes they did not understand. Depending on weather conditions, messages could travel at up to 2,880 km/h, which means a signal could traverse the whole 240 km line in about five minutes.

Forerunners of Chappe's optical network are the Roman smoke signals network and Aeneas Tacitus' optical communication system.

For more information on early communication networks see Gerard J. Holzmann and Bjoern Pehrson, The Early History of Data Networks.

INDEXCARD, 3/34
 
Saddam Hussein

Saddam Hussein joined the revolutionary Baath party as a university student. In 1959 he took part in a failed attempt to assassinate the Iraqi ruler Abdul-Karim Qassim. He has been President of Iraq since 1979. Under his reign Iraq fought a decade-long war with Iran. Because of his steady enmity with radical Islamic leaders, the West at first supported him, until his army invaded Kuwait in August 1990, an incident that led to the Gulf War. Since then many rumors of a coup d'état have circulated, but Saddam Hussein remains in unrestricted power.

INDEXCARD, 4/34
 
DMCA

The DMCA (Digital Millennium Copyright Act) was signed into law by U.S. President Clinton in 1998 and implements two 1996 WIPO treaties (the WIPO Performances and Phonograms Treaty and the WIPO Copyright Treaty). Among other issues, the DMCA addresses the influence of new technologies on traditional copyright. Of special interest in the context of the digitization of intellectual property are Title II, which limits the liability of online service providers for copyright infringement (when certain conditions are met); Title III, which creates an exemption for making a copy of a computer program for maintenance and repair; and Title IV, which is concerned with the status of libraries and with webcasting. The DMCA has been widely criticized for giving copyright holders even more power and for damaging the rights and freedom of consumers, technological innovation, and the free market for information.

INDEXCARD, 5/34
 
CIM

CAD/CAM technology (computer-aided design and computer-aided manufacturing) was developed to support a manufacturing firm's design and production functions. Today it is widely recognized that the scope of computer applications must extend beyond design and production to include the business functions of the firm. The name given to this more comprehensive use of computers is computer-integrated manufacturing (CIM).

INDEXCARD, 6/34
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor to the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by commercial backbone networks. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 7/34
 
Optical communication system by Aeneas Tacitus, 4th century B.C.

Aeneas Tacitus, a Greek military scientist and cryptographer, invented an optical communication system that combines water and beacon telegraphy. Torches indicated the beginnings and the ends of message transmissions, while water jars were used to transmit the messages themselves. These jars had a plugged, standard-size hole drilled in the bottom and were filled with water. When the sender and the receiver unplugged their jars simultaneously, the water drained out. Because the transmitted messages corresponded to water levels, the sender indicated by torch signal that the appropriate water level had been reached. A disadvantage was that the possible messages were restricted to a given code, but as this system was mainly used for military purposes, this was offset by the advantage that it was almost impossible for outsiders to understand the messages unless they possessed the codebook.

With communication separated from transportation, the distant became near.

Tacitus' telegraph system was very fast and was not excelled until the end of the 18th century.
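A toy model of the protocol may help: both stations hold identical jars and an identical codebook, and the message is encoded in how long the water is allowed to drain. The drain rate and the codebook entries below are invented for illustration:

    # Toy model of Aeneas Tacitus' water telegraph.
    DRAIN_RATE_CM_PER_S = 0.5            # assumed, identical at both ends

    codebook = {10.0: "enemy in sight",  # remaining water level (cm) -> message
                15.0: "send cavalry",
                20.0: "all quiet"}

    def transmit(message: str, start_level_cm: float = 25.0) -> float:
        """Sender: unplug at the torch signal, re-plug when the level
        matching the message is reached; return the drain time signaled."""
        target = next(level for level, msg in codebook.items() if msg == message)
        return (start_level_cm - target) / DRAIN_RATE_CM_PER_S

    def receive(drain_seconds: float, start_level_cm: float = 25.0) -> str:
        """Receiver: drain for the same interval, then read off the level."""
        level = start_level_cm - drain_seconds * DRAIN_RATE_CM_PER_S
        return codebook[round(level, 1)]

    print(receive(transmit("send cavalry")))  # -> send cavalry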

For further information see Joanne Chang & Anna Soellner, Decoding Device, http://www.smith.edu/hsc/museum/ancient_inventions/decoder2.html

http://www.smith.edu/hsc/museum/ancient_inven...
INDEXCARD, 8/34
 
George Boole

b. Nov. 2, 1815, Lincoln, Lincolnshire, England
d. Dec. 8, 1864, Ballintemple, County Cork, Ireland

English mathematician who helped establish modern symbolic logic and whose algebra of logic, now called Boolean algebra, is basic to the design of digital computer circuits. One of the first Englishmen to write on logic, Boole pointed out the analogy between the algebraic symbols and those that can represent logical forms and syllogisms, showing how the symbols of quantity can be separated from those of operation. With Boole in 1847 and 1854 began the algebra of logic, or what is now called Boolean algebra. It is basically two-valued in that it involves a subdivision of objects into separate classes, each with a given property. Different classes can then be treated as to the presence or absence of the same property.
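As a small aside (not part of the card), the two-valued subdivision into classes that the card describes maps directly onto modern set operations; a sketch with invented classes:

    # Sketch: Boole's two-valued algebra of classes, rendered with Python
    # sets (the classes and their members are invented for illustration).
    universe = {"cat", "dog", "sparrow", "trout"}
    mammals  = {"cat", "dog"}
    pets     = {"cat", "dog", "sparrow"}

    print(mammals & pets)       # class product xy: members of both classes
    print(mammals | pets)       # class sum x + y
    print(universe - mammals)   # complement 1 - x: everything lacking the property
    print((mammals & mammals) == mammals)   # Boole's index law xx = x -> True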


INDEXCARD, 9/34
 
Nero

Nero's full name was Nero Claudius Caesar Augustus Germanicus (37-68 A.D.). He was Roman Emperor from 54 to 68 A.D.; during his first years in power he stood under the influence of his teacher Seneca. In this period he was very successful in domestic politics and abroad, for example in Britannia. But he soon turned into a selfish dictator: he had his brother, mother and wife killed, and, according to ancient rumor, burnt Rome and blamed the Christians for it. He was more interested in the arts than in political affairs. When he was deposed in 68, he committed suicide.

INDEXCARD, 10/34
 
Alan Turing

b. June 23, 1912, London, England
d. June 7, 1954, Wilmslow, Cheshire

English mathematician and logician who pioneered the field of computer theory and contributed important logical analyses of computer processes. Many mathematicians in the first decades of the 20th century had attempted to eliminate all possible error from mathematics by establishing a formal, purely algorithmic procedure for establishing truth. The mathematician Kurt Gödel threw up an obstacle to this effort with his incompleteness theorem. Turing was motivated by Gödel's work to seek an algorithmic method of determining whether any given proposition was undecidable, with the ultimate goal of eliminating such propositions from mathematics. Instead, he proved in his seminal paper "On Computable Numbers, with an Application to the Entscheidungsproblem [Decision Problem]" (1936) that there cannot exist any such universal method of determination and, hence, that mathematics will always contain undecidable propositions. During World War II he served with the Government Code and Cypher School at Bletchley, Buckinghamshire, where he played a significant role in breaking the codes of the German "Enigma" machine. He also championed the theory that computers could eventually be constructed that would be capable of human thought, and he proposed the Turing test to assess this capability. Turing's papers on the subject are widely acknowledged as the foundation of research in artificial intelligence. In 1954 Turing committed suicide, probably because of the depressing medical treatment that he had been forced to undergo (in lieu of prison) after his 1952 conviction, to "cure" him of homosexuality.

INDEXCARD, 11/34
 
Hill & Knowlton

John W. Hill opened the doors of his first public relations office in 1927 in Cleveland, Ohio. His early clients were banks, steel manufacturers, and other industrial companies in the Midwest. Hill managed the firm until 1962 and remained active in it until shortly before his death in New York City in 1977. In 1952, Hill and Knowlton became the first American public relations consultancy to recognize the business communication implications of the formation of the European Economic Community. Hill and Knowlton established a network of affiliates across Europe and by the middle of the decade had become the first American public relations firm to have wholly owned offices in Europe. Hill and Knowlton, a member of the WPP Group family of integrated communications services, has extensive resources and geographic coverage with its 59 offices in 34 countries. It is known for its hard-hitting tactics and is said to have connections with intelligence services.

INDEXCARD, 12/34
 
Machine vision

A branch of artificial intelligence and image processing concerned with the identification of graphic patterns or images that involves both cognition and abstraction. In such a system, a device linked to a computer scans, senses, and transforms images into digital patterns, which in turn are compared with patterns stored in the computer's memory. The computer processes the incoming patterns in rapid succession, isolating relevant features, filtering out unwanted signals, and adding to its memory new patterns that deviate beyond a specified threshold from the old and are thus perceived as new entities.
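A bare-bones sketch of the matching-and-learning step just described (the distance measure and the threshold are illustrative assumptions, not any particular product's algorithm):

    def mean_abs_diff(a: list[float], b: list[float]) -> float:
        """Average element-wise difference between two digitized patterns."""
        return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

    def classify(pattern: list[float], memory: dict[str, list[float]],
                 threshold: float = 0.2) -> str:
        """Compare an incoming pattern with stored ones; if it deviates
        beyond the threshold from all of them, store it as a new entity."""
        best_name, best_dist = None, float("inf")
        for name, stored in memory.items():
            d = mean_abs_diff(pattern, stored)
            if d < best_dist:
                best_name, best_dist = name, d
        if best_dist > threshold:
            name = f"entity_{len(memory)}"   # hypothetical naming scheme
            memory[name] = pattern           # learn the new pattern
            return name
        return best_name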

INDEXCARD, 13/34
 
Scientology

Official name Church Of Scientology, religio-scientific movement developed in the United States in the 1950s by the author L. Ron Hubbard (1911-86). The Church of Scientology was formally established in the United States in 1954 and was later incorporated in Great Britain and other countries. The scientific basis claimed by the church for its diagnostic and therapeutic practice is disputed, and the church has been criticized for the financial demands that it makes on its followers. From the 1960s the church and various of its officials or former officials faced government prosecutions as well as private lawsuits on charges of fraud, tax evasion, financial mismanagement, and conspiring to steal government documents, while the church on the other hand claimed it was being persecuted by government agencies and by established medical organizations. Some former Scientology officials have charged that Hubbard used the tax-exempt status of the church to build a profitable business empire.

INDEXCARD, 14/34
 
Medieval universities and copying of books

The first of the great medieval universities was established at Bologna. At the beginning, universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify one for an invitation to a university. Holding a lecture equaled reading a book aloud, just as a priest read from the Bible during services. Attending a lecture equaled copying the lecture word for word, so you had your own copy of the book, which in turn enabled you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

http://quarles.unbc.edu/ideas/net/history/his...
INDEXCARD, 15/34
 
Newsgroups

Newsgroups are on-line discussion groups on the Usenet. Over 20,000 newsgroups exist, organized by subject into hierarchies, with each subject hierarchy further broken down into subcategories. Covering an incredibly wide range of interests and used intensively every day, they are an important part of the Internet.

For more information, click here ( http://www.terena.nl/libr/gnrt/group/usenet.html ).

http://www.terena.nl/libr/gnrt/group/usenet.h...
INDEXCARD, 16/34
 
Agostino Ramelli's reading wheel, 1588

Agostino Ramelli designed a "reading wheel" which allowed browsing through a large number of documents without moving from one spot.

Ramelli's wheel presented a large number of books, a small library, laid open on lecterns mounted on a kind of Ferris wheel. Turning the wheel brought lectern after lectern before the reader's eyes, allowing one to skip chapters, browse through pages, and link ideas and texts together. In this respect, Ramelli's reading wheel is reminiscent of today's browsing software used to navigate the World Wide Web.

INDEXCARD, 17/34
 
WTO

An international organization designed to supervise and liberalize world trade. The WTO (World Trade Organization) is the successor to the General Agreement on Tariffs and Trade (GATT), which was created in 1947 and liberalized the world's trade over the next five decades. The WTO came into being on Jan. 1, 1995, with 104 countries as its founding members. The WTO is charged with policing member countries' adherence to all prior GATT agreements, including those of the last major GATT trade conference, the Uruguay Round (1986-94), at whose conclusion GATT had formally gone out of existence. The WTO is also responsible for negotiating and implementing new trade agreements. The WTO is governed by a Ministerial Conference, which meets every two years; a General Council, which implements the conference's policy decisions and is responsible for day-to-day administration; and a director-general, who is appointed by the Ministerial Conference. The WTO's headquarters are in Geneva, Switzerland.



INDEXCARD, 18/34
 
Polybius Checkerboard


 

      1   2   3   4   5
  1   A   B   C   D   E
  2   F   G   H   I   K
  3   L   M   N   O   P
  4   Q   R   S   T   U
  5   V   W   X   Y   Z



It is a system in which each letter is converted into a pair of numbers: the row and the column of its position in the square.
The numbers were not written down and sent but signaled with torches.

For example:
A = 1-1
B = 1-2
C = 1-3
W = 5-2
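As a quick illustration, a sketch of the encoding in Python (using the square above, in which J is folded into I, which is why the grid has no J):

    # Sketch of Polybius encoding; J is treated as I, as in the square above.
    SQUARE = ["ABCDE", "FGHIK", "LMNOP", "QRSTU", "VWXYZ"]

    def encode(text: str) -> str:
        pairs = []
        for ch in text.upper().replace("J", "I"):
            for row, letters in enumerate(SQUARE, start=1):
                col = letters.find(ch)
                if col != -1:
                    pairs.append(f"{row}-{col + 1}")
        return " ".join(pairs)

    print(encode("ACW"))  # -> 1-1 1-3 5-2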

For more information see:
http://www.ftech.net/~monark/crypto/crypt/polybius.htm

http://www.ftech.net/~monark/crypto/crypt/pol...
INDEXCARD, 19/34
 
Division of labor

The term refers to the separation of a work process into a number of tasks, with each task performed by a separate person or group of persons. It is most often applied to mass production systems, where it is one of the basic organizing principles of the assembly line. Breaking down work into simple, repetitive tasks eliminates unnecessary motion and limits the handling of tools and parts. The consequent reduction in production time and the ability to replace craftsmen with lower-paid, unskilled workers result in lower production costs and a less expensive final product. The Scottish economist Adam Smith saw in this splitting of tasks a key to economic progress by providing a cheaper and more efficient means of producing economic goods.

INDEXCARD, 20/34
 
Citicorp/Citibank

American holding company (formerly (1967-74) First National City Corporation), incorporated in 1967, with the City Bank of New York, National Association (a bank tracing its origins to 1812), as its principal subsidiary. The latter's name changed successively to First National City Bank in 1968 and to Citibank, N.A. (i.e., National Association) in 1976. Citicorp was the holding company's popular and trade name from its inception but became its legal name only in 1974. Headquarters are in New York City.

INDEXCARD, 21/34
 
Apple

Founded by Steve Jobs and Steve Wozniak and headquartered in Cupertino, USA, Apple Computer was the first commercially successful personal computer company.

In 1977 Wozniak designed the Apple II, one of the first commercially successful personal computers. IBM countered its successful introduction to the market with a personal computer running MS-DOS, the operating system supplied by Microsoft Corporation, and thereby regained market leadership. Although Apple introduced the first graphical user interface affordable to consumers and thus started the desktop publishing revolution, it could not regain the lead.

http://www.apple.com

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/6/0,5716,115726+1+108787,00.html

http://www.apple.com/
INDEXCARD, 22/34
 
François Duvalier

b. April 14, 1907, Port-au-Prince, Haiti
d. April 21, 1971, Port-au-Prince

Known as PAPA DOC, Duvalier was the president of Haiti whose 14-year regime was of unprecedented duration in that country. A supporter of President Dumarsais Estimé, he was appointed director general of the National Public Health Service in 1946. He was appointed under-minister of labour in 1948 and the following year became minister of public health and labour, a post that he retained until May 10, 1950, when President Estimé was overthrown by a military junta under Paul E. Magloire, who was subsequently elected president. By 1954 Duvalier had become the central opposition figure and went underground. He was elected president in September 1957. Setting about consolidating his power, he reduced the size of the army and organized the Tontons Macoutes ("Bogeymen"), a private force responsible for terrorizing and assassinating alleged foes of the regime. Late in 1963 Duvalier moved further toward an absolutist regime, promoting a cult of his person as the semi-divine embodiment of the Haitian nation. In April 1964 he was declared president for life. Although diplomatically almost completely isolated, excommunicated by the Vatican until 1966 for harassing the clergy, and threatened by conspiracies against him, Duvalier was able to stay in power longer than any of his predecessors.

INDEXCARD, 23/34
 
Clipper Chip

The Clipper Chip is a cryptographic device proposed by the U.S. government, purportedly intended to protect private communications while at the same time permitting government agents to obtain the "keys" upon presentation of what has been vaguely characterized as "legal authorization." The "keys" are held by two government "escrow agents" and would enable the government to access the encrypted private communication. While Clipper would be used to encrypt voice transmissions, a similar chip known as Capstone would be used to encrypt data. The underlying cryptographic algorithm, known as Skipjack, was developed by the National Security Agency (NSA).

INDEXCARD, 24/34
 
Internet Society

Founded in 1992, the Internet Society is an umbrella organization of several, mostly self-organized, organizations dedicated to addressing the social, political, and technical issues that arise from the evolution and growth of the Net. Its most important subsidiary organizations are the Internet Architecture Board, the Internet Engineering Steering Group, the Internet Engineering Task Force, the Internet Research Task Force, and the Internet Societal Task Force.

Its members comprise companies, government agencies, foundations, corporations and individuals. The Internet Society is governed by elected trustees.

http://www.isoc.org

http://www.isoc.org/
INDEXCARD, 25/34
 
Instinet

Instinet, a wholly owned subsidiary of Reuters Group plc since 1987, is the world's largest agency brokerage firm and the industry brokerage leader in after hours trading. It trades in over 40 global markets daily and is a member of seventeen exchanges in North America, Europe, and Asia. Its institutional clients represent more than 90 percent of the institutional equity funds under management in the United States. Instinet accounts for about 20 percent of the NASDAQ daily trading volume and trades approximately 170 million shares of all U.S. equities daily.

INDEXCARD, 26/34
 
Sputnik

At the beginning of the story of today's global data networks is the story of the development of satellite communication.

In 1955 President Eisenhower announced the USA's intention to launch a satellite. But it was the Soviet Union that launched the first satellite in 1957: Sputnik I. After Sputnik's launch it became evident that the Cold War was also a race for leadership in applying state-of-the-art technology to defence. As the US Department of Defense encouraged the formation of high-tech companies, it laid the ground for Silicon Valley, the hot spot of the world's computer industry.

In 1958, the year the USA launched its first satellite, Explorer I, data were transmitted over regular phone circuits for the first time, thus laying the ground for today's global data networks.

Today's satellites may record weather data, scan the planet with powerful cameras, offer global positioning and monitoring services, and relay high-speed data transmissions. But up to now, most satellites are designed for military purposes such as reconnaissance.

INDEXCARD, 27/34
 
Internet Software Consortium

The Internet Software Consortium (ISC) is a nonprofit corporation dedicated to producing high-quality, production-ready reference implementations of Internet standards. Its goal is to ensure that those reference implementations are properly supported and made freely available to the Internet community.

http://www.isc.org

INDEXCARD, 28/34
 
Harold D. Lasswell

Harold D. Lasswell (b. 1902) studied at the London School of Economics and later became a professor of social sciences at several universities, including the University of Chicago, Columbia University, and Yale University. He was also a consultant to several governments. One of Lasswell's most famous works is Propaganda Technique in the World War, in which he defines propaganda and discusses its major objectives: to mobilize hatred against the enemy, to preserve the friendship of allies, to procure the cooperation of neutrals, and to demoralize the enemy.

INDEXCARD, 29/34
 
Wireless Application Protocol (WAP)

The WAP (Wireless Application Protocol) is a specification for a set of communication protocols to standardize the way that wireless devices, such as cellular telephones and radio transceivers, can be used for Internet access, including e-mail, the World Wide Web, newsgroups, and Internet Relay Chat (IRC).

While Internet access has been possible in the past, different manufacturers have used different technologies. In the future, devices and service systems that use WAP will be able to interoperate.

Source: Whatis.com

INDEXCARD, 30/34
 
Cookie

A cookie is an information package assigned to a client program (usually a Web browser) by a server. The cookie is saved on the user's hard disk and is sent back each time that server is accessed. Cookies can contain various kinds of information: preferences for site access, data identifying authorized users, or records of visits.

In online advertising, cookies serve the purpose of changing advertising banners between visits, or identifying a particular direct marketing strategy based on a user's preferences and responses.

Advertising banners can be permanently eliminated from the screen by filtering software such as that offered by Naviscope or Webwash.

Cookies are usually stored in a separate file of the browser, and can be erased or permanently deactivated, although many web sites require cookies to be active.
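Technically, the whole exchange is a pair of HTTP headers. A minimal sketch using Python's standard library (the cookie name and value are invented):

    from http import cookies

    # Server side: attach a cookie to an HTTP response.
    jar = cookies.SimpleCookie()
    jar["visitor_id"] = "abc123"          # hypothetical identifying value
    jar["visitor_id"]["max-age"] = 86400  # keep it for one day
    print(jar.output())
    # -> Set-Cookie: visitor_id=abc123; Max-Age=86400

    # Browser side: on every subsequent request to this server, the stored
    # value is sent back in a "Cookie: visitor_id=abc123" header, which is
    # what lets the server recognize the returning client.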

http://www.naviscope.com/
http://www.webwash.com/
INDEXCARD, 31/34
 
Amoco Corporation

American petroleum corporation that was founded in 1889 by the Standard Oil trust to direct the refining and marketing of oil in the Midwestern states. By the late 20th century, American operations still accounted for more than half of Amoco's total assets, though the company has been active in some 40 other countries in the areas of production, refining, and marketing. In addition to refining crude oil, Amoco is one of the largest producers of natural gas in the North American continent.

INDEXCARD, 32/34
 
Roman smoke telegraph network, 150 A.D.

The Roman smoke signals network consisted of towers within visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.

For a similar telegraph network in ancient Greece see Aeneas Tacitus' optical communication system.

INDEXCARD, 33/34
 
The Flesh Machine

This is the title of a book by the Critical Art Ensemble which puts the development of artificial life into a critical historical and political context, defining the power vectors that act as the driving force behind this development. The book is available in a print version (New York: Autonomedia, 1998) and in an online version at http://www.critical-art.net/fles/book/index.html

INDEXCARD, 34/34