Advertising and the Content Industry - The Coca-Cola Case

Attempting to dictate their rules to the media has become a common practice among marketers and the advertising industry. As in the Chrysler case, where the company demanded that magazines give advance notice of controversial articles, recent attempts to put pressure on content providers have been made by the Coca-Cola Company.

According to a memo published by the New York Post, Coca-Cola demands a free ad from any publication that runs a Coke ad adjacent to stories on religion, politics, disease, sex, food, drugs, environmental issues, or health, or to stories that employ vulgar language. "Inappropriate editorial matter" will result in the publisher being liable for a "full make good," said the memo by Coke advertising agency McCann-Erickson. Asked about this practice, a Coke spokesperson said the policy has long been in effect.

(Source: Odwyerpr.com: Coke Dictates nearby Editorial. http://www.odwyerpr.com)

TEXTBLOCK 1/9 // URL: http://world-information.org/wio/infostructure/100437611652/100438657998
 
Advertising and the Media System

Media systems (especially broadcasting) can be classified into two basic types:

Public Media Systems: Control lies with the government, through ownership, regulation, and partial funding of public broadcasting services.

Private Media Systems: Ownership and control lie in the hands of private companies and shareholders.

Both systems can exist in various forms, according to the degree of control by governments and private companies, with mixed systems (public and private) as the third main kind.

Whereas public media systems are usually at least partially funded by governments, private broadcasting relies solely on advertising revenue. Still, even public media systems cannot exclude advertising as a source of revenue. Both types are therefore to a certain degree dependent on money coming in from advertisers.

This has consequences for the content provided by the media. As attracting advertisers becomes critically important, the interests of the advertising industry frequently play a dominant role in how content is structured, and the creation of media environments favorable to the advertising of goods and services becomes more and more common.

TEXTBLOCK 2/9 // URL: http://world-information.org/wio/infostructure/100437611652/100438657942
 
RTMark

RTMark is a group of culture jammers applying a brokerage system that benefits from "limited liability" like any other corporation. Using this principle, RTMark supports the sabotage (informative alteration) of corporate products, from dolls and children's learning tools to electronic action games, by channeling funds from investors to workers. RTMark searches for solutions that go beyond public relations and defines its "bottom line" as improving culture. It seeks cultural, not financial, profit.

Strategies and Policies

RTMark is engaged in a wide range of projects designed to lead to positive social change. Projects with roughly similar intent, risk, or likelihood of accomplishment are grouped into "fund families", such as "The Frontier Fund". This fund is dedicated to challenging naive, utopian visions of the "global village", focusing on the implications of allowing corporations and other multinational interests to operate free of social context.

RTMark pursues its projects through donations by individuals, who can invest in a certain fund and specify exactly how the donated money should be used. RTMark has repeatedly gained attention through its projects, especially its spoof websites, like those of Rudy Giuliani and the WTO, and its campaign against eToys, the company that sought to prevent the Internet art group etoy from using the domain etoy.com.

TEXTBLOCK 3/9 // URL: http://world-information.org/wio/infostructure/100437611734/100438659283
 
Another Question of Security

Even with the best techniques it is impossible to design a cryptographic system that is absolutely safe/unbreakable. Deciphering a text means going through many, sometimes nearly, but never truly, endless attempts. For the computers of today it might take hundreds of years or even more to go through all possible keys, but in the end the code remains breakable. Much faster quantum computers may one day prove that.
Therefore the decision to choose a certain method of encryption is finally a matter of trust.
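
A back-of-the-envelope calculation illustrates why exhaustive search is impractical rather than impossible. The following minimal sketch assumes, purely for illustration, a machine that can test one billion keys per second:

```python
# Rough estimate of brute-force key search times.
# The rate of 10^9 keys per second is an illustrative assumption.

KEYS_PER_SECOND = 1e9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for key_bits in (40, 56, 128):
    keyspace = 2 ** key_bits
    years = keyspace / KEYS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{key_bits}-bit key: about {years:.1e} years to try every key")
```

Even at that rate, a 128-bit keyspace takes on the order of 10^22 years, yet the search does terminate eventually; such a system is computationally, not absolutely, secure.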

For the average computer user it is rather difficult to understand, or even to recognize, the dangers and the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption means first of all trusting others, the specialists, and relying on the information they provide.
The websites explaining the underlying problems (and also the articles and books concerning the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background remains an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, while the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. For many people the connection between cryptography and democracy is simply invisible. The media reports mentioned often concentrate on the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and on the danger of losing the money in one's bank account if someone steals a credit card number or other important financial data. The "security" connected to those issues is quite different from the security that is connected to privacy.
It is especially the latter that touches the core elements of democracy.

for the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 4/9 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations, and growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth and usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task since the inception of the ARPANet, the forerunner of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. Measurement was considered important for two reasons. First, it would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at the efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."

Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet is currently far smaller, in both capacity and traffic, than the switched voice network (with an effective bandwidth of 75 Gbps at December 1997), but the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U. S. (with an effective bandwidth of about 330 Gbps at December 1997), though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications; both consume more bandwidth and are the reason for unanticipated amounts of data traffic.
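
The overtaking date follows from simple compounding. The minimal sketch below takes the voice figure from the comparison above, while the Internet's end-1997 traffic level is an assumed placeholder chosen only to illustrate the arithmetic:

```python
# When does data traffic overtake voice traffic if data doubles yearly?
# voice_gbps comes from the text above; internet_gbps for end-1997 is
# a placeholder assumption, not a measured value.

internet_gbps = 3.0   # assumed public Internet traffic level, Dec 1997
voice_gbps = 75.0     # switched voice network, effective bandwidth, Dec 1997

year = 1997
while internet_gbps < voice_gbps:
    year += 1
    internet_gbps *= 2            # 100% growth per year
print(f"Data traffic overtakes voice around {year}")  # -> 2002
```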

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved not to be very useful because a significant number of hosts restrict download access to their domain data.
Beyond the small sample, this method has at least one more flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
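
In outline, the sample-and-project step works roughly as follows. This is a minimal sketch, not ISC's actual procedure; the ping() function here merely simulates reachability:

```python
import random

def ping(ip):
    # Stand-in for a real ICMP echo request: simulate that about 90%
    # of named addresses belong to a reachable host.
    return random.random() < 0.9

def estimate_host_count(named_ips, sample_fraction=0.01):
    # Ping only a small random sample of all addresses with names...
    sample_size = max(1, int(len(named_ips) * sample_fraction))
    sample = random.sample(named_ips, sample_size)
    responding = sum(1 for ip in sample if ping(ip))
    # ...and project the sample's response rate onto the whole list.
    return int(len(named_ips) * responding / sample_size)

named = [f"10.0.{i // 256}.{i % 256}" for i in range(50_000)]
print(estimate_host_count(named))  # roughly 45,000
```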

Internet Weather

Like the daily weather, traffic on the Internet, i.e. the conditions for data flows, is monitored too, hence the name Internet weather. One of the most famous Internet weather reports is from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare response times with past ones and with the response times of servers in the same region.
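
A sketch of that idea follows. The measurement via the system ping utility is real (Unix-style flags assumed), but the 0-100 rating formula is an invented illustration, not the Internet Traffic Report's actual scoring:

```python
import statistics
import subprocess
import time

def response_time_ms(host):
    # Rough round-trip time via the system `ping` utility (one packet);
    # includes process start-up overhead, which a real monitor would avoid.
    start = time.perf_counter()
    ok = subprocess.run(["ping", "-c", "1", host],
                        capture_output=True).returncode == 0
    return (time.perf_counter() - start) * 1000 if ok else None

def weather_rating(current_ms, history_ms):
    # Score 100 when the current RTT matches the server's own past median,
    # lower as the connection degrades relative to that baseline.
    baseline = statistics.median(history_ms)
    return max(0, min(100, round(100 * baseline / current_ms)))

history = [40.0, 42.0, 45.0]                  # earlier measurements, in ms
now = response_time_ms("example.com") or 999.0  # treat no reply as very slow
print(weather_rating(now, history))
```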

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred and said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others; they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
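
The difference is easy to see in a toy access log; a minimal sketch in which the log format and the convention that only .html files count as pages are assumptions:

```python
# Count raw hits versus page views in a simplified access log.
log = [
    "192.0.2.1 GET /article.html",
    "192.0.2.1 GET /logo.gif",      # same visit, extra file
    "192.0.2.1 GET /photo1.jpg",    # same visit, extra file
    "192.0.2.2 GET /article.html",
]

hits = len(log)                     # every requested file counts
page_views = sum(1 for line in log if line.endswith(".html"))
print(f"{hits} hits, but only {page_views} page views")  # 4 hits, 2 page views
```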

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may read a journal just for a special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.
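
Reconstructing visits from a log is itself guesswork. A common convention, assumed here rather than taken from any of the services above, is to close a visit after 30 minutes of inactivity from the same address:

```python
from collections import defaultdict

SESSION_GAP = 30 * 60   # assumed inactivity cutoff, in seconds

def count_visits(requests):
    """requests: list of (ip_address, unix_timestamp) tuples."""
    by_ip = defaultdict(list)
    for ip, ts in requests:
        by_ip[ip].append(ts)
    visits = 0
    for timestamps in by_ip.values():
        timestamps.sort()
        visits += 1                       # first request opens a visit
        visits += sum(1 for a, b in zip(timestamps, timestamps[1:])
                      if b - a > SESSION_GAP)
    return visits

print(count_visits([("192.0.2.1", 0), ("192.0.2.1", 60),
                    ("192.0.2.1", 4000), ("192.0.2.2", 120)]))  # -> 3
```

Note that a shared IP address makes two people look like one visitor here, and a proxy cache makes a request disappear entirely; the sketch reproduces exactly the flaws described above.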

If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 5/9 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Individualized Audience Targeting

New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements" - probably the ultimate tool for advertisers. Although the technique is not yet widely used, companies like Amazon.Com have already started to exploit individualized audience targeting for their purposes.

TEXTBLOCK 6/9 // URL: http://world-information.org/wio/infostructure/100437611652/100438658450
 
4000 - 1000 B.C.

4th millennium B.C.
In Sumer writing is invented.

Writing and calculating came into being at about the same time. The first pictographs carved into clay tablets were used for administrative purposes. As an instrument for the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to only very few. Once more or less separate tasks, writing and calculating converge in today's computers.

"Letters are invented so that we might be able to converse even with the absent," says Saint Augustine. The invention of writing made it possible to transmit and store information. No longer does the ear predominate; face-to-face communication becomes more and more obsolete for administration and bureaucracy. Standardization and centralization become the constituents of high culture and of vast empires such as Sumer and China.

3200 B.C.
In Sumer the seal is invented.

About 3000 B.C.
In Egypt papyrus scrolls and hieroglyphs are used.

About 1350 B.C.
In Assyria the cuneiform script is in use.

1200 B.C.
According to Aeschylus, the conquest of the town of Troy was transmitted via torch signals.

About 1100 B.C.
Egyptians use homing pigeons to deliver military information.

TEXTBLOCK 7/9 // URL: http://world-information.org/wio/infostructure/100437611796/100438659725
 
Definition

During the last 20 years the old Immanuel Wallerstein paradigm of center, semi-periphery and periphery has found a new costume: ICTs. After Colonialism, Neo-Colonialism and Neoliberalism, a new method of marginalization is emerging: the digital divide.

"Digital divide" describes the fact that the world can be divided into people who do and people who do not have access to (or the education to handle) modern information technologies, e.g. the cellular telephone, television or the Internet. The digital divide concerns people all over the world, but, as usual, it is above all people in the formerly so-called Third World countries and in rural areas who suffer; the poor and the less-educated suffer most from that divide.
More than 80% of all computers with access to the Internet are situated in larger cities.

"The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these 'extra' services. ... Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century or millennium."
(Izumi Aizu)

for more information see:
http://www.whatis.com/digital_divide.htm

TEXTBLOCK 8/9 // URL: http://world-information.org/wio/infostructure/100437611730/100438659300
 
ZaMir.net

ZaMir.net started in 1992, trying to enable anti-war and human rights groups in former Yugoslavia to communicate with each other and co-ordinate their activities. Today there are an estimated 1,700 users on 5 different Bulletin Board Systems (Zagreb, Belgrade, Ljubljana, Sarajevo and Pristina). The Za-mir Transnational Network (ZTN) offers e-mail and conferences/newsgroups. The ZTN has its own conferences, which are exchanged between the 5 BBSs, and additionally offers more than 150 international conferences. ZTN's aim is to help set up systems in other cities in the post-Yugoslav countries that have difficulty connecting to the rest of the world.

History

With the war in Yugoslavia, anti-war and human rights groups in former Yugoslavia found it very difficult to organize and co-ordinate their activities due to immense communication problems. So in 1992 foreign peace groups, together with institutions in Ljubljana, Zagreb and Belgrade, launched the Communications Aid project. Modems were distributed to peace and anti-war groups in Ljubljana, Zagreb, Belgrade and Sarajevo, and a BBS (Bulletin Board System) was installed.

As no direct connections could be made after spring 1992, they were routed indirectly through Austria, Germany or Britain, which also enabled a connection with the worldwide networks of BBSs. Nationalist dictators thereby lost their power to prevent their people from communicating. BBSs were installed in Zagreb and Belgrade and connected to the APC Network and associated networks. The Za-mir Transnational Network (ZTN) was born.

Strategies and Policies

With the help of ZaMir's e-mail network it has been possible to find and coordinate humanitarian aid for some of the many refugees of the war. It has become an important means of communication for humanitarian organizations working in the war region and for sister organizations from other countries. It helps co-ordinate the work of activists from different countries of former Yugoslavia, and it also helps to coordinate the search for volunteers to aid in post-war reconstruction. ZTN also helped facilitate the exchange of information undistorted by government propaganda between Croatia, Serbia and Bosnia. Independent magazines like Arkzin (Croatia) and Vreme (Serbia) now publish electronic editions on ZTN.

TEXTBLOCK 9/9 // URL: http://world-information.org/wio/infostructure/100437611734/100438659208
 
Edward L. Bernays

Born in Vienna in 1891, Bernays was one of the founders of modern public relations. An enigmatic character, he was a master of mise en scène with far-reaching contacts in the world of business and politics. A nephew of Sigmund Freud and related to Heinrich Heine, he was also among the first to pursue PR for governments and to produce pseudo-events. Bernays considered the manipulation of public opinion an important element of mass democracies and was of the opinion that only through PR could a society's order be maintained.

INDEXCARD, 1/27
 
Fiber-optic cable networks

Fiber-optic cable networks may become the dominant method for high-speed Internet connections. Since the first fiber-optic cable was laid across the Atlantic in 1988, the demand for faster Internet connections has been growing, fuelled by growing network traffic, partly due to the increasing implementation of corporate networks spanning the globe and to the use of graphics-heavy content on the World Wide Web.

Fiber-optic cables have little more in common with copper wires than the capacity to transmit information. Like copper wires, they can serve as terrestrial and submarine connections, but they allow much higher transmission rates: copper wires allow 32 telephone calls at the same time, whereas a fiber-optic cable can carry 40,000 calls at the same time - a capacity Alexander Graham Bell might not have envisioned when he transmitted the first words - "Mr. Watson, come here. I want you" - over a copper wire.

Copper wires will not fall out of use in the foreseeable future because of technologies such as DSL that speed up access drastically. But with the technology to transmit signals at more than one wavelength on fiber-optic cables, their bandwidth is increasing, too.

For technical information from the Encyclopaedia Britannica on telecommunication cables, click here. For technical information from the Encyclopaedia Britannica focusing on fiber-optic cables, click here.

Neal Stephenson has written an entertaining report for Wired on the laying of the FLAG submarine cable, up to now the longest fiber-optic cable on earth, including detailed background information on the cable industry and its history: Mother Earth Mother Board. Click here for reading.

Susan Dumett has written a short history of undersea cables for Pretext magazine, Evolution of a Wired World. Click here for reading.

A timeline history of submarine cables and a detailed list of seemingly all submarine cables of the world, operational, planned and out of service, can be found on the Web site of the International Cable Protection Committee.

For maps of fiber-optic cable networks see the website of Kessler Marketing Intelligence, Inc.

http://www.britannica.com/bcom/eb/article/4/0...
http://www.britannica.com/bcom/eb/article/4/0...
http://www.wired.com/wired/archive/4.12/ffgla...
http://www.pretext.com/mar98/features/story3....
INDEXCARD, 2/27
 
Intelsat

Intelsat, the world's biggest communication satellite services provider, is still mainly owned by governments, but will be privatized during 2001, like Eutelsat - a measure already discussed at an OECD competition policy roundtable in 1996. Signatory of the Intelsat treaty for the United States of America is Comsat, a private company listed on the New York Stock Exchange. Additionally, Comsat is one of the United Kingdom's signatories. In aggregate, Comsat already owns about 20.5% of Intelsat and is Intelsat's biggest shareholder. In September 1998 Comsat agreed to merge with Lockheed Martin. After the merger, Lockheed Martin will hold at least 49% of Comsat's share capital.

http://www.intelsat.int/index.htm

http://www.eutelsat.org/
http://www.oecd.org//daf/clp/roundtables/SATS...
http://www.comsat.com/
http://www.nyse.com/
INDEXCARD, 3/27
 
Kessler Marketing Intelligence (KMI)

KMI is the leading source for information on fiber-optics markets. It offers market research, strategic analysis and product planning services to the opto-electronics and communications industries. KMI tracks the worldwide fiber-optic cable system and sells the findings to the industry. KMI says that every fiber-optics corporation with a need for strategic market planning is a subscriber to their services.

http://www.kmicorp.com/

INDEXCARD, 4/27
 
Punch card, 1801

Invented by Joseph-Marie Jacquard, an engineer and architect in Lyon, France, the punch card laid the groundwork for automatic information processing. For the first time information was stored in binary format on perforated cardboard cards. In 1890 Hermann Hollerith used Jacquard's punch card technology to process statistical data retrieved from the US census, thus cutting the time needed for data analysis from eight years to three. Applications of Jacquard's invention were also used for programming computers and data processing until electronic data processing was introduced in the 1960s. - As with writing and calculating, administrative purposes account for the beginning of modern automatic data processing.

Paper tapes are a medium similar to Jacquard's punch cards. In 1857 Sir Charles Wheatstone applied them as a medium for the preparation, storage, and transmission of data for the first time. By their means, telegraph messages could be prepared off-line, sent ten times quicker (up to 400 words per minute), and stored. Later similar paper tapes were used for programming computers.

INDEXCARD, 5/27
 
United Brands Company

American corporation formed in 1970 in the merger of United Fruit Company and AMK Corporation. United Fruit Company, the main company, was founded in 1899, producing and marketing bananas grown in the Caribbean islands, Central America, and Colombia. The principal founder was Minor C. Keith, who had begun to acquire banana plantations and to build a railroad in Costa Rica as early as 1872. In 1884 he contracted with the Costa Rican government to fund the national debt and to lay about 50 more miles of track. In return he received, for 99 years, full rights to these rail lines and 800,000 acres of virgin land, tax exempt for 20 years. By 1930 it had absorbed 20 rival firms and become the largest employer in Central America. As a foreign corporation of conspicuous size, United Fruit sometimes became the target of popular attacks. The Latin-American press often referred to it as el pulpo ("the octopus"), accusing it of exploiting labourers, bribing officials, and influencing governments during the period of Yankee "dollar diplomacy" in the first decades of the 20th century.

INDEXCARD, 6/27
 
Writing

Writing and calculating came into being at about the same time. The first pictographs carved into clay tablets were used for administrative purposes. As an instrument for the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to a few. Once more or less separate tasks, writing and calculating converge in today's computers.

"Letters are invented so that we might be able to converse even with the absent," says Saint Augustine. The invention of writing made it possible to transmit and store information. No longer does the ear predominate; face-to-face communication becomes more and more obsolete for administration and bureaucracy. Standardization and centralization become the constituents of high culture and of vast empires such as Sumer and China.

INDEXCARD, 7/27
 
Alexander Graham Bell

b. March 3, 1847, Edinburgh

d. Aug. 2, 1922, Beinn Bhreagh, Cape Breton Island, Nova Scotia, Canada

American audiologist and inventor, wrongly remembered for having invented the telephone in 1876. Although Bell introduced the first commercial application of the telephone, it was in fact invented by a German teacher called Reiss.

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/1/0,5716,15411+1+15220,00.html

INDEXCARD, 8/27
 
RIPE

The RIPE Network Coordination Centre (RIPE NCC) is one of the three Regional Internet Registries (RIR) existing in the world today. It provides allocation and registration services that support the operation of the Internet globally, mainly the allocation of IP address space for Europe.

http://www.ripe.net

INDEXCARD, 9/27
 
Telnet

Telnet allows you to log in remotely to a computer connected to the Internet.

INDEXCARD, 10/27
 
Telephone

The telephone was not invented by Alexander Graham Bell, as is widely held to be true, but by Philipp Reiss, a German teacher. When he demonstrated his invention to important German professors in 1861, it was not greeted enthusiastically. Because of this dismissal, he was given no financial support for further development.

And here Bell comes in: In 1876 he successfully filed a patent for the telephone. Soon afterwards he established the first telephone company.

INDEXCARD, 11/27
 
Digital Subscriber Line (DSL)

DSL connections are high-speed data connections over copper wire telephone lines. As with cable connections, with DSL you can look up information on the Internet and make a phone call at the same time, but you do not need to have a new or additional cable or line installed. One of the most prominent DSL services is ISDN (integrated services digital network; for more information click here: http://www.britannica.com/bcom/eb/article/4/0,5716,129614+15,00.html).

http://www.britannica.com/bcom/eb/article/4/0...
INDEXCARD, 12/27
 
Samuel Thomas Soemmering's electric telegraph, 1809

With Samuel Thomas Soemmering's invention of the electric telegraph, the telegraphic transmission of messages was no longer tied to visibility, as is the case with smoke and light signal networks. Economical and reliable, the electric telegraph became the state-of-the-art communication system for fast data transmissions, even over long distances.

Click here for an image of Soemmering's electric telegraph.

http://www.heise.de/tp/deutsch/inhalt/co/2335...
INDEXCARD, 13/27
 
Hieroglyphs

Hieroglyphs are pictures used for writing in ancient Egypt. At first those pictures were used for the names of kings; later more and more signs were added, until they numbered about 750.

INDEXCARD, 14/27
 
Integrated circuit

Also called microcircuit, the integrated circuit is an assembly of electronic components, fabricated as a single unit, in which active semiconductor devices (transistors and diodes) and passive devices (capacitors and resistors) and their interconnections are built up on a chip of material called a substrate (most commonly made of silicon). The circuit thus consists of a unitary structure with no connecting wires. The individual circuit elements are microscopic in size.

INDEXCARD, 15/27
 
Internet Architecture Board

On behalf of the Internet Society, the Internet Architecture Board oversees the evolution of the architecture, the standards and the protocols of the Net.

Internet Society: http://www.isoc.org/iab

http://www.isoc.org/
INDEXCARD, 16/27
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor and the communications satellite.

http://www.research.att.com/

INDEXCARD, 17/27
 
Proxy Servers

A proxy server is a server that acts as an intermediary between a workstation user and the Internet so that security, administrative control, and caching service can be ensured.

A proxy server receives a request for an Internet service (such as a Web page request) from a user. If it passes filtering requirements, the proxy server, assuming it is also a cache server, looks in its local cache of previously downloaded Web pages. If it finds the page, it returns it to the user without needing to forward the request to the Internet. If the page is not in the cache, the proxy server, acting as a client on behalf of the user, uses one of its own IP addresses to request the page from the server out on the Internet. When the page is returned, the proxy server relates it to the original request and forwards it on to the user.

Source: Whatis.com
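
That request flow can be condensed into a toy sketch; a minimal illustration in which fetch_from_origin() is an assumed stand-in for a real HTTP request, and filtering, cache expiry and security checks are omitted:

```python
# Toy caching-proxy logic: answer from the local cache when possible,
# otherwise fetch on the client's behalf and keep a copy.
cache = {}

def fetch_from_origin(url):
    # Placeholder for a real HTTP GET issued with the proxy's own address.
    return f"<content of {url}>"

def proxy_request(url):
    if url in cache:                 # cache hit: no trip to the Internet
        return cache[url]
    page = fetch_from_origin(url)    # cache miss: act as a client for the user
    cache[url] = page                # store a copy for the next request
    return page

print(proxy_request("http://example.com/"))  # fetched from the origin server
print(proxy_request("http://example.com/"))  # served from the cache
```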

INDEXCARD, 18/27
 
International Cable Protection Committee (ICPC)

The ICPC aims at reducing the number of incidents of damages to submarine telecommunications cables by hazards.

The Committee also serves as a forum for the exchange of technical and legal information pertaining to submarine cable protection methods and programs and funds projects and programs, which are beneficial for the protection of submarine cables.

Membership is restricted to authorities (governmental administrations or commercial companies) owning or operating submarine telecommunications cables. As of May 1999, the ICPC had 67 members representing 38 nations.

http://www.iscpc.org

INDEXCARD, 19/27
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.

In digital systems, bandwidth is data speed in bits per second (bps).

Source: Whatis.com
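
A worked example of that proportionality; the file sizes and the 56 kbps line speed are illustrative assumptions:

```python
# Transfer time grows with the amount of data for a fixed bandwidth.
def seconds_to_transfer(size_bytes, bandwidth_bps):
    return size_bytes * 8 / bandwidth_bps   # 8 bits per byte

modem_bps = 56_000                          # a late-1990s dial-up modem
for name, size_bytes in [("page of text", 4_000), ("photograph", 400_000)]:
    t = seconds_to_transfer(size_bytes, modem_bps)
    print(f"{name}: {t:.1f} seconds at 56 kbps")  # ~0.6 s vs ~57 s
```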

INDEXCARD, 20/27
 
National Laboratory for Applied Network Research

NLANR, initially a collaboration among supercomputer sites supported by the National Science Foundation, was created in 1995 to provide technical and engineering support and overall coordination of the high-speed connections at the five NSF supercomputer centers.

Today NLANR offers support and services to institutions that are qualified to use high performance network service providers - such as Internet 2 and Next Generation Internet.

http://www.nlanr.net

INDEXCARD, 21/27
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network mainly serving to test the feasibility of wide-area networking and remote computing, it was created for resource sharing between research institutions, not for messaging services like e-mail. Although the research was sponsored by the US military, ARPAnet was not designed for directly military use but to support military-related research.

In 1969 ARPAnet went online and linked the first two computers, one of them located at the University of California, Los Angeles, the other at the Stanford Research Institute.

But ARPAnet did not become widely accepted until it was demonstrated in action to a public of computer experts at the First International Conference on Computers and Communication in Washington, D. C., in 1972.

In 1986 NSFnet, a network of scientific and academic computers funded by the National Science Foundation, went online, together with a separate new military network; ARPAnet itself was decommissioned in 1990. In 1988 the first private Internet service providers began to offer the general public access to NSFnet. In 1995, after having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, and the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 22/27
 
Backbone Networks

Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmitting capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.

INDEXCARD, 23/27
 
Internet Exchanges

Internet exchanges are intersecting points between major networks.

List of the World's Public Internet exchanges (http://www.ep.net)

http://www.ep.net/
INDEXCARD, 24/27
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.

INDEXCARD, 25/27
 
Intranet

As a local area network (LAN), an Intranet is a secured network of computers based on the IP protocol and with restricted access.

INDEXCARD, 26/27
 
Public Relations Consultants Association (PRCA)

The PRCA was formed in November 1969 as an association limited by guarantee of up to £5 per member and therefore has no share capital. The PRCA tries to encourage and promote the advancement of companies and firms engaged in public relations consultancy.

INDEXCARD, 27/27