4000 - 1000 B.C.

4th millennium B.C.
In Sumer writing is invented.

Writing and calculating came into being at about the same time. The first pictographs, carved into clay tablets, served administrative purposes. As an instrument of the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to a very few. Having remained more or less separate tasks for millennia, writing and calculating finally converge in today's computers.

"Letters are invented so that we might be able to converse even with the absent," says Saint Augustine. The invention of writing made it possible to transmit and store information. The ear no longer predominates; face-to-face communication becomes increasingly dispensable for administration and bureaucracy. Standardization and centralization become the constituents of high cultures and vast empires such as Sumer and China.

3200 B.C.
In Sumer the seal is invented.

About 3000 B.C.
In Egypt papyrus scrolls and hieroglyphs are used.

About 1350 B.C.
In Assyria the cuneiform script, developed in Sumer centuries earlier, is in use.

1200 B.C.
According to Aeschylus, news of the conquest of the town of Troy was transmitted via torch signals.

About 1100 B.C.
Egyptians use homing pigeons to deliver military information.

TEXTBLOCK 1/12 // URL: http://world-information.org/wio/infostructure/100437611796/100438659725
 
Virtual cartels, introduction

"Among the most striking developments of the 1990s has been the emergence of a global commercial media market utilizing new technologies and the global trend toward deregulation.
This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market.

A global oligopolistic market that covers the spectrum of media is now crystallizing, with very high barriers to entry."

(Robert McChesney, author of "Rich Media, Poor Democracy")

The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks tolerates only a small number of large players. Mergers, strategic alliances, partnerships and cooperation agreements are therefore the daily routine of the ICT business. They bypass competition and create "virtual cartels".

TEXTBLOCK 2/12 // URL: http://world-information.org/wio/infostructure/100437611709/100438658911
 
Commercial vs. Independent Content

Commercial media aim at economies of scale and scope, with the goal of maximizing profits. As advertising money is usually their primary source of revenue, their content is very often attuned to the needs of advertisers and marketers. Information necessary for citizens' participation in the public sphere usually plays only a minor role in their programming, as it does not comply with the demands of an economic system whose principal aim is the generation of profit. Commercial media are also virtually always structured in accord with, and help reinforce, society's defining hierarchical social relationships, and are generally controlled by, and controlling of, other major social institutions, particularly corporations.

Independent content providers, on the other hand, mostly act on a non-profit basis and try to avoid dependence on corporate powers and the state. One of their main concerns is the critical observation of public interest issues. The central aim of independent content providers' activities is usually to bring aspects and standpoints neglected by the (commercial) mainstream media to the public and to subvert society's defining hierarchical social relationships. Promoting public debate and an active civil society, they engage in the organization of alert actions and information campaigns or create subversive art.

TEXTBLOCK 3/12 // URL: http://world-information.org/wio/infostructure/100437611734/100438659280
 
Cryptography's Terms and background

"All nature is merely a cipher and a secret writing."
Blaise de Vigenère

In the (dis-)information age, getting information while at the same time excluding others from it is part of a power game (keeping the other uneducated). The justification for it has eventually found a name: security.
Judging by the frequency of its appearance in articles, the news and political speeches, security seems to be one of the most popular words of the 1990s. It is long since the word was used only for and by the military and the police. Today one finds it as part of every political issue; even development assistance and nutrition programs consider it part of their work.
The so-called, but also real, need for information security is widespread and concerns everybody, whether or not they use information technology. In any case, information about individuals is moving globally, much of it sensitive: bank records, insurance and medical data, credit card transactions, and much more. Any kind of personal or business communication is concerned, including telephone conversations, fax messages, and of course e-mail, not to forget financial transactions and business information. Almost every aspect of modern life is affected.
We want to communicate with everybody - but do not want anybody to know.

While the market already depends on the electronic flow of information and the digital tools keep getting faster and more sophisticated, privacy and security concerns are rising just as fast.
With the increase of digital communication, its vulnerability increases as well. Two (or three) competing elements give the term digital security a rather bitter taste. On the one hand, there is the growing possibility for criminals to use modern technology, not only to hide their tracks and work secretly but also to manipulate financial and other transfers. On the other hand, the governments of many states tell the population that they need access to any kind of data to keep those criminals under control. And finally there are those who live between the security gaps they expose and at the same time harm other private people with their work: computer hackers.
While the potential of global information is regarded as endless, it is these elements that reduce it.

There is no definitive solution, but at least some tools have been developed to improve the situation. Cryptography is one of them: the freedom to encode data that one does not want everybody to know, and to give the ability to decode them to those who are meant to know.
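
To make the encode/decode pair concrete, here is a minimal sketch of a classical cipher, the one named after Blaise de Vigenère, who is quoted above. It is for illustration only: ciphers of this kind are easily broken, and modern systems rely on mathematical algorithms such as DES instead.

# Vigenere cipher sketch: shift each letter of an A-Z text by the
# corresponding letter of a repeating key; shifting back deciphers.
def vigenere(text, key, decrypt=False):
    result = []
    key = key.upper()
    for i, char in enumerate(text.upper()):
        if not char.isalpha():
            result.append(char)  # keep spaces and punctuation as-is
            continue
        shift = ord(key[i % len(key)]) - ord("A")
        if decrypt:
            shift = -shift
        result.append(chr((ord(char) - ord("A") + shift) % 26 + ord("A")))
    return "".join(result)

ciphertext = vigenere("ALL NATURE IS MERELY A CIPHER", "SECRET")
print(ciphertext)
print(vigenere(ciphertext, "SECRET", decrypt=True))  # recovers the text

For simplicity the key index advances on every character, spaces included; since enciphering and deciphering use the same rule, the message round-trips exactly.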

During the last 80 years cryptography has changed from a merely political tool into a private and economic - but still political - one. At the same time the tools had to be improved, which ultimately meant relying on mathematics. Hence cryptography is generally regarded as something very complicated, and in many ways this is true, as the modern ways of enciphering are all about mathematics.

"Crypto is not mathematics, but crypto can be highly mathematical, crypto can use mathematics, but good crypto can be done without a great reliance on complex mathematics." (W.T. Shaw)

For an introduction to cryptography and its mathematical foundations, see:
http://www.sbox.tu-graz.ac.at/home/j/jonny/projects/crypto/index.htm
http://www.ccc.de/CCC-CA/policy.html

TEXTBLOCK 4/12 // URL: http://world-information.org/wio/infostructure/100437611776/100438658895
 
Beautiful bodies

However, artificial beings need not be invisible or look like Arnold Schwarzenegger in "Terminator". "My dream would be to create an artificial man that does not look like a robot but like a beautiful, graceful human being. The artificial man should be beautiful", says Nadia Thalmann, whose hopes for beautiful robots may become reality in the work of MIRALab, a research laboratory attached to the University of Geneva dedicated to realistic modelling of human functionalities. The laboratory has produced an artificial Marilyn Monroe showing just how beautiful artificial creatures can be, complete with a biography featuring details of her career and her - however virtual - love life. Yet beautiful creatures have been made before, at least on the movie screen. Frank-N-Furter, the protagonist of the Rocky Horror Picture Show ("I've been making a man / with blond hair and a tan / and he's good for relieving my / tension"), did set remarkable aesthetic standards.

While in Hindu mythology avatars are bodies chosen by gods for their representation on earth, often animals such as swans or horses, the avatars populating cyberspace have a different function. They are the cyber bodies of real people, often three-dimensional images of creatures whose aesthetics reflect both the tastes prevalent in the entertainment and advertising industries and the state of the art in visual representation.

TEXTBLOCK 5/12 // URL: http://world-information.org/wio/infostructure/100437611777/100438658861
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless seized upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers claiming otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANET, the Internet's predecessor. The very first ARPANET site was established at the University of California, Los Angeles, and intended to be the measurement site. There, Leonard Kleinrock went on to work on the development of measurement techniques used to monitor the performance of the ARPANET (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stating that measurement of the Internet mattered for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet to a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too - the corporate networks, the intranets - because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."
Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network. The private line networks are considerably larger in aggregate capacity than the Internet - about as large as the voice network in the U. S. (with an effective bandwidth of about 330 Gbps as of December 1997) - but they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, which consume more bandwidth; both make for unanticipated amounts of data traffic.
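
As a rough illustration of the compounding behind that claim, the following toy calculation lets one traffic stream double every year (100% growth) while the other grows slowly. The starting values are made up for illustration; they are not figures from Coffman and Odlyzko's paper.

# Traffic growing 100% per year doubles annually and overtakes a
# slower-growing stream within a few years, almost regardless of
# the starting gap. All numbers here are hypothetical.
data, voice, year = 2.0, 30.0, 1997  # arbitrary traffic units
while data < voice:
    year += 1
    data *= 2.0   # 100% annual growth
    voice *= 1.1  # modest growth assumed for voice traffic
print(year)       # 2002 under these assumptions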

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements - a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one more flaw: the ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
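
A minimal sketch of this sample-and-project approach might look as follows. The ping_host() helper and the address list are hypothetical stand-ins, not the ISC's actual tooling; the helper simply shells out to the system ping command.

import random
import subprocess

def ping_host(address, timeout_s=1):
    """Return True if the host answers a single ICMP echo request.
    (The -W timeout flag is in seconds on Linux; other systems differ.)"""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), address],
        capture_output=True,
    )
    return result.returncode == 0

def estimate_host_count(addresses, sample_share=0.01):
    """Ping a random 1% sample and scale the reply count up to an
    estimate for the whole address list."""
    sample = random.sample(addresses, max(1, int(len(addresses) * sample_share)))
    replies = sum(ping_host(a) for a in sample)
    return replies / len(sample) * len(addresses)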

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the name Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare response times to past ones and to the response times of servers in the same area.
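
How ping times might be turned into such a 0-100 index can be sketched as follows; the scoring formula is an illustrative assumption, not the Internet Traffic Report's actual algorithm.

def weather_index(current_ms, past_ms_samples):
    """Score a server 0-100 by comparing its current ping time to
    the average of past measurements (higher = faster)."""
    baseline = sum(past_ms_samples) / len(past_ms_samples)
    ratio = baseline / current_ms  # above 1 means faster than usual
    return max(0, min(100, round(50 * ratio)))

print(weather_index(120.0, [100.0, 110.0, 130.0]))  # 47: slightly slow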

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, and are said to avoid these flaws. But even page views are not reliable. Users might share computers, and hence IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
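
The hit inflation described above is easy to reproduce. The toy access log below is made up for illustration: one page with four embedded files yields five hits but only one page view.

# Count raw requests ("hits") versus HTML page requests.
log = [
    "GET /article.html", "GET /logo.gif", "GET /photo1.jpg",
    "GET /photo2.jpg", "GET /style.css",
]
hits = len(log)
page_views = sum(1 for line in log if line.endswith(".html"))
print(hits, page_views)  # 5 hits, 1 page view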

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may open a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be certain that information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.


If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? No global study of the Internet's geographical patterns is available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.

TEXTBLOCK 6/12 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
"Stealth Sites"

"Stealth sites" account for a particular form of hidden advertisement. Stealth sites look like magazines, nicely designed and featuring articles on different topics, but in reality are set up for the sole purpose of featuring a certain companies products and services. "About Wines" for example is a well-done online magazine, featuring articles on food and travel and also publishes articles on wine, which surprisingly all happen to be from Seagram.

TEXTBLOCK 7/12 // URL: http://world-information.org/wio/infostructure/100437611652/100438657995
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three show a similar design: content is split up and spread over several servers; when a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
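
The bare idea of that design can be sketched in a few lines. Real systems such as Publius add encryption and secret sharing; this toy is not any of the three systems' actual protocol.

# Cut a document into chunks, scatter the chunks over several
# servers, and gather them back on request.
def split(data, n_chunks):
    size = -(-len(data) // n_chunks)  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

servers = {"A": [], "B": [], "C": []}  # stand-ins for real hosts
document = b"uncensorable content"
for name, chunk in zip(servers, split(document, len(servers))):
    servers[name].append(chunk)  # spread the pieces

# A request reassembles the pieces in order; censoring the file
# would require taking down every server that holds a chunk.
reassembled = b"".join(servers[name][0] for name in ("A", "B", "C"))
assert reassembled == document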

The most advanced system seems to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position of radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable? in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html .

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 8/12 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
Product Placement

With television still being very popular, commercial entertainment has transferred the concept of the soap opera onto the Web. The first of this new species of "cybersoaps" was "The Spot", a story about the ups and downs of an American commune. The Spot not only attracted a large audience within a short time, but also pioneered the field of online product placement. Besides Sony banners, the company's logo is placed on nearly every electronic product appearing in the story. While appearing to be a site for light entertainment, The Spot's main goal is to make the name Sony and its product range well known within the target audience.

TEXTBLOCK 9/12 // URL: http://world-information.org/wio/infostructure/100437611652/100438658026
 
On-line Advertising and the Internet Content Industry

Applied to online content, the advertising model leads to similar problems as in the traditional media. Dependence on advertising revenue puts pressure on content providers to consider advertising interests. In addition, new difficulties appear, caused by the technical structure of online media, missing legal regulation and not yet established ethical rules.

TEXTBLOCK 10/12 // URL: http://world-information.org/wio/infostructure/100437611652/100438658181
 
acceleration

TEXTBLOCK 11/12 // URL: http://world-information.org/wio/infostructure/100437611777/100438658418
 
The third industrial revolution. Life as a product.

Many years ago, the German philosopher Günther Anders described the historical situation in which the homo creator and the homo materia coincide as the "third industrial revolution". Anders, who spent many years exiled in the USA after fleeing from the Nazis, made an issue of the ambivalence of modern science and technology as early as the 1950s, and many of the concerns which today form part of the debates around the implications of computer technology are already polemically discussed in his work.

The "third industrial revolution" is characterized by men becoming the "raw material" of their own industries. Product and producer, production and consumption, technology and nature are no longer meaningful pairs of opposites. The third is also the last revolution, as it is difficult to think of further revolutions when the distinction between subject and object becomes blurred. The world is becoming a Bestand and the human body and mind are no protected zones. They are something like the last safety zone of human being which is now itself becoming a basis for technological innovation. When the subject is weakened by its technical environment, the use of technical crooks for body and mind becomes an obvious "solution", even if the technically strengthened subject is strengthened at the cost of no longer being a "subject" in the traditional, metaphysical sense. Biological processes are dissected and subjected to technical control. This technical control is technical in two senses: it is not only control through technology but by ttechnology itsself, since it is not carried out by unaided human minds, but increasingly by intelligent machines.

The point where this Andersian third industrial revolution reaches an unprecedented logic seems to lie within the realm of genetic engineering. This example shows that the dissection of humanness - the decoding of genetic information - is tantamount to commodification. The purpose of commercial genetic research projects is the use of genetic information as a resource for the development of new products, e.g. in pharmaceutics. Genetic products carry the promise of offering a solution to so far incurable diseases such as cancer, Alzheimer's, heart disorders, schizophrenia and others, but they also open up the possibility of "breaking the chains of evolution", of actively manipulating the genetic structure of human beings and of "designing" healthy, long-living, beautiful, hard-working etc. beings. Here the homo creator and the homo materia finally become indistinguishable, and we are beginning to merge with our products in such a way that "we" loses what remains of its meaning.

Since 1990 research on human genetics has been organised in the Human Genome Project, in which universities from various countries cooperate in transcribing the entire genetic information of homo sapiens, composed of some 80,000 genes and more than 3 billion DNA base pairs. The objective of the project is to complete the transcription process by the year 2003. One of the rationales for organising genome research in an international fashion has been its extremely high cost; another is an ethical consideration, according to which human genetic information must not be private property, which would be the case if genetic information were patented.

But exactly this patenting is of paramount importance in the emerging "post-industrial" society, in which knowledge becomes the most important resource. A patent is nothing else than a property title to a piece of "know-how", and its necessary consequence is commodification. When life is no longer simply a natural creation but a product, it, too, will be patented and become a commodity.

Against the idea of the human genome as a public good, or an "open source", there is growing competition on the part of private industry. Companies such as Celera have developed deciphering technologies which may allow an earlier completion of the project. If human genetic information actually becomes patented, the technical possibility of interfering in human evolution would at least partly be in the hands of private business. What has been called a "quintessentially public resource" is already being commercially exploited in Iceland. In this Nordic country, the government decided to allow the American genetics company DeCode to access and commercially exploit the anonymised genetic information of the entire population of Iceland. The Icelandic population provides a particularly good "sample" for research, because there has been almost no immigration since the times of the Vikings, and therefore genetic variations can be more easily detected than in populations with a more diverse genome. Also, Iceland possesses a wealth of genealogical information - many families are able to trace their origins back to the 12th century. Here modern science has found optimal laboratory conditions. Perhaps, had European history taken a different course in the 1930s and 40s, the frontier of commercial genetic research would have found optimal conditions in an "ethnically clean" centre of Europe? The requirement of "purity", of "eliminating" difference prior to constructing knowledge, inscribed in modern science since its beginnings, also applies to genome research. Except that in this kind of research humankind itself needs to fulfill laboratory standards of cleanliness, and the biological transcription of humanness, the biological "nucleus" of the species, becomes the object of research, much like the nucleus of matter, the atom, in the 1940s and 50s.

But the commodification of life is not limited to the human species. Genetically altered animals and plants are suffering the same fate, and in most industrialised nations it is now possible to patent genetically engineered species and crops. The promises of the "Green Revolution" of the 1960s are now repeated in the genetic revolution. Genetic engineering, so it is argued, will be able to breed animals and plants which resist disease and yield more "food", and will therefore help to tackle problems of undernutrition and starvation. Companies such as Monsanto are at the forefront of developing genetically altered ("enhanced") food crops and promise to solve not only the problem of world hunger, but to improve the safety and even the taste of food. Convinced of the opposite of such high-flown promises, Vandana Shiva from the Indian Research Foundation for Science, Technology and Ecology emphasises the relationship between genetic engineering and the post-colonial style exploitation of so-called "third world" countries. She also stresses the adverse ecological impact of biotechnology: "Today, the world is on the brink of a biological diversity crisis. The constantly diminishing store of biodiversity on our planet poses an enormous environmental threat."

http://www.cnn.com/bioethics/9902/iceland.dna/template.html, 22 February 1999

http://www.indiaserver.com/betas/vshiva/title.htm, 9 February 2000

TEXTBLOCK 12/12 // URL: http://world-information.org/wio/infostructure/100437611777/100438658827
 
cryptanalysis

the study of breaking others' codes, i.e. transforming a message back into a legible form without knowing the key in advance

INDEXCARD, 1/11
 
File Transfer Protocol (FTP)

FTP enables the transfer of files (text, image, video, sound) to and from other remote computers connected to the Internet.

INDEXCARD, 2/11
 
Internet Exchanges

Internet exchanges are intersecting points between major networks.

List of the World's Public Internet exchanges (http://www.ep.net)

http://www.ep.net/
INDEXCARD, 3/11
 
CIM

To perform a manufacturing firm's functions related to design and production, CAD/CAM technology - computer-aided design and computer-aided manufacturing - was developed. Today it is widely recognized that the scope of computer applications must extend beyond design and production to include the business functions of the firm. The name given to this more comprehensive use of computers is computer-integrated manufacturing (CIM).

INDEXCARD, 4/11
 
Samuel Thomas Soemmering's electric telegraph, 1809

With Samuel Thomas Soemmering's invention of the electric telegraph, the telegraphic transmission of messages is no longer tied to visibility, as is the case with smoke and light signal networks. Economical and reliable, the electric telegraph became the state-of-the-art communication system for fast data transmissions, even over long distances.

Click here for an image of Soemmering's electric telegraph.

http://www.heise.de/tp/deutsch/inhalt/co/2335...
INDEXCARD, 5/11
 
DES

The U.S. Data Encryption Standard (DES) is the most widely used encryption algorithm, especially for the protection of financial transactions. It was developed by IBM in 1971. It is a symmetric-key cryptosystem. The DES algorithm uses a 56-bit encryption key, meaning that there are 72,057,594,037,927,936 possible keys.
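
The key count follows directly from the key length - each of the 56 key bits doubles the number of possibilities - which is easy to verify:

# 56 independent key bits allow 2**56 distinct keys.
print(2 ** 56)  # 72057594037927936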

for more information see:
http://www.britannica.com/bcom/eb/article/3/0,5716,117763+5,00.html
http://www.cryptography.com/des/

http://www.britannica.com/bcom/eb/article/3/0...
http://www.cryptography.com/des/
INDEXCARD, 6/11
 
Gopher

Gopher is a menu system with a hierarchically structured list of files that predates the World Wide Web.

Today Gopher is of diminishing importance, having mostly been replaced by the World Wide Web.

INDEXCARD, 7/11
 
ciphertext

the enciphered/encoded and thus illegible form of a text

INDEXCARD, 8/11
 
to encipher/encode

to put a word or text into ciphers/codes

INDEXCARD, 9/11
 
Gateway

A gateway is a computer supplying point-to-multipoint connections between computer networks.

INDEXCARD, 10/11
 
Intelsat

Intelsat, the world's biggest communication satellite services provider, is still mainly owned by governments, but will be privatised during 2001, like Eutelsat - a measure already discussed at an OECD competition policy roundtable in 1996. The signatory of the Intelsat treaty for the United States of America is Comsat, a private company listed on the New York Stock Exchange. Additionally, Comsat is one of the United Kingdom's signatories. In aggregate, Comsat already owns about 20.5% of Intelsat and is Intelsat's biggest shareholder. In September 1998 Comsat agreed to merge with Lockheed Martin. After the merger, Lockheed Martin will hold at least 49% of Comsat's share capital.

http://www.intelsat.int/index.htm

http://www.eutelsat.org/
http://www.oecd.org//daf/clp/roundtables/SATS...
http://www.comsat.com/
http://www.nyse.com/
INDEXCARD, 11/11