The third industrial revolution. Life as a product.

Many years ago, the German philosopher Günther Anders described the historical situation in which homo creator and homo materia coincide as the "third industrial revolution". Anders, who spent many years in exile in the USA after fleeing from the Nazis, drew attention to the ambivalence of modern science and technology as early as the 1950s, and many of the concerns which today form part of the debates around the implications of computer technology were already polemically discussed in his work.

The "third industrial revolution" is characterized by men becoming the "raw material" of their own industries. Product and producer, production and consumption, technology and nature are no longer meaningful pairs of opposites. The third is also the last revolution, as it is difficult to think of further revolutions when the distinction between subject and object becomes blurred. The world is becoming a Bestand and the human body and mind are no protected zones. They are something like the last safety zone of human being which is now itself becoming a basis for technological innovation. When the subject is weakened by its technical environment, the use of technical crooks for body and mind becomes an obvious "solution", even if the technically strengthened subject is strengthened at the cost of no longer being a "subject" in the traditional, metaphysical sense. Biological processes are dissected and subjected to technical control. This technical control is technical in two senses: it is not only control through technology but by ttechnology itsself, since it is not carried out by unaided human minds, but increasingly by intelligent machines.

The point where this Andersian third industrial revolution reaches an unprecedented logic seems to lie within the realm of genetic engineering. This example shows that the dissection of humanness - the decoding of genetic information - is tantamount to commodification. The purpose of commercial genetic research projects is the use of genetic information as a resource for the development of new products, e.g. in pharmaceutics. Genetic products carry the promise of offering a solution to so-far incurable diseases such as cancer, Alzheimer's, heart disorders, schizophrenia, and others, but they also open up the possibility of "breaking the chains of evolution", of actively manipulating the genetic structure of human beings and of "designing" healthy, long-living, beautiful, hard-working etc. beings. Here, the homo creator and the homo materia finally become indistinguishable, and we are beginning to merge with our products in such a way that the word "we" loses what remains of its meaning.

Since 1990 research on human genetics has been organised in the Human Genome Project, in which universities from various countries cooperate in transcribing the entire genetic information of homo sapiens, composed of some 80,000 genes and more than 3 billion DNA sequences. The objective of the project is to complete the transcription process by the year 2003. One of the rationales of organising genome research in an international fashion has been its extremely high cost; another has been an ethical consideration, according to which human genetic information must not become private property, which would be the case if genetic information were patented.

But precisely this patenting is of paramount importance in the emerging "post-industrial" society, in which knowledge becomes the most important resource. A patent is nothing else than a property title to a piece of "know-how", and commodification is its necessary consequence. When life is no longer simply a natural creation but a product, it, too, will be patented and become a commodity.

Against the idea of the human genome as a public good, or an "open source", there is growing competition on the part of private industry. Companies such as Celera have developed deciphering technologies which may allow an earlier completion of the project. If human genetic information actually is patented, the technical possibility of interfering in human evolution will at least partly be in the hands of private business. What has been called a "quintessentially public resource" has already been handed over to commercial exploitation in one country: Iceland. In this Nordic country, the government decided to allow the American genetics company DeCode to access and commercially exploit the anonymised genetic information of the entire population of Iceland. The Icelandic population provides a particularly good "sample" for research, because there has been almost no immigration since the times of the Vikings, and therefore genetic variations can be detected more easily than in populations with a more diverse genome. Also, Iceland possesses a wealth of genealogical information - many families are able to trace their origins back to the 12th century. Here modern science has found optimal laboratory conditions. Perhaps, had European history taken a different course in the 1930s and 40s, the frontier of commercial genetic research would have found optimal conditions in an "ethnically clean" centre of Europe? The requirement of "purity", of "eliminating" difference prior to constructing knowledge, inscribed in modern science since its beginnings, also applies to genome research. Except that in this kind of research humankind itself needs to fulfil laboratory standards of cleanliness, and that the biological transcription of humanness, the biological "nucleus" of the species, becomes the object of research, much like the nucleus of matter, the atom, in the 1940s and 50s.

But the commodification of life is not limited to the human species. Genetically altered animals and plants are suffering the same fate, and in most industrialised nations it is now possible to patent genetically engineered species and crops. The promises of the "Green Revolution" of the 1960s are now repeated in the genetic revolution. Genetic engineering, so it is argued, will be able to breed animals and plants which resist disease and yield more "food" and will therefore help to tackle problems of undernutrition and starvation. Companies such as Monsanto are at the forefront of developing genetically altered ("enhanced") food crops and promise not only to solve the problem of world hunger, but to improve the safety and even the taste of food. Convinced of the opposite of such high-flown promises, Vandana Shiva from the Indian Research Foundation for Science, Technology and Ecology emphasises the relationship between genetic engineering and the post-colonial style exploitation of so-called "third world" countries. She also stresses the adverse ecological impact of biotechnology: "Today, the world is on the brink of a biological diversity crisis. The constantly diminishing store of biodiversity on our planet poses an enormous environmental threat."

http://www.cnn.com/bioethics/9902/iceland.dna/template.html, 22 February 1999

http://www.indiaserver.com/betas/vshiva/title.htm, 9 February 2000

TEXTBLOCK 1/12 // URL: http://world-information.org/wio/infostructure/100437611777/100438658827
 
ZaMir.net

ZaMir.net started in 1992 with the aim of enabling anti-war and human rights groups of former Yugoslavia to communicate with each other and co-ordinate their activities. Today there are an estimated 1,700 users on 5 different Bulletin Board Systems (Zagreb, Belgrade, Ljubljana, Sarajevo and Pristina). The Za-mir Transnational Network (ZTN) offers e-mail and conferences/newsgroups. The ZTN has its own conferences, which are exchanged between the 5 BBSs, and additionally offers more than 150 international conferences. ZTN's aim is to help set up systems in other cities in the post-Yugoslav countries that have difficulty connecting to the rest of the world.

History

With the war in Yugoslavia, anti-war and human rights groups of former Yugoslavia found it very difficult to organize and co-ordinate their activities due to immense communication difficulties. So in 1992 foreign peace groups, together with institutions in Ljubljana, Zagreb and Belgrade, launched the Communications Aid project. Modems were distributed to peace and anti-war groups in Ljubljana, Zagreb, Belgrade and Sarajevo, and a BBS (Bulletin Board System) was installed.

As no direct connections could be made after spring 1992, connections were routed indirectly through Austria, Germany or Britain, which also enabled a connection with the worldwide networks of BBSs. Nationalist dictators thereby lost their power to prevent their people from communicating. BBSs were installed in Zagreb and Belgrade and connected to the APC Network and associated networks. The Za-mir Transnational Network (ZTN) was born.

Strategies and Policies

With the help of ZaMir's e-mail network it has been possible to find and coordinate humanitarian aid for some of the many refugees of the war. It has become an important means of communication for humanitarian organizations working in the war region and for sister organizations from other countries. It helps co-ordinate the work of activists from different countries of former Yugoslavia, and it also helps to coordinate the search for volunteers to aid in post-war reconstruction. ZTN also helped facilitate the exchange of information undistorted by government propaganda between Croatia, Serbia and Bosnia. Independent magazines like Arkzin (Croatia) and Vreme (Serbia) now publish electronic editions on ZTN.

TEXTBLOCK 2/12 // URL: http://world-information.org/wio/infostructure/100437611734/100438659208
 
1970s: Computer-Integrated Manufacturing (CIM)

Since the 1970s there has been a growing trend towards the use of computer programs in manufacturing companies. Not only functions related to design and production, but also business functions were to be facilitated through the use of computers.

Accordingly the CAD/CAM technology, related to the use of computer systems for design and production, was developed. CAD (computer-aided design) was created to assist in the creation, modification, analysis, and optimization of design. CAM (computer-aided manufacturing) was designed to help with the planning, control, and management of production operations. CAD/CAM technology, since the 1970s, has been applied in many industries, including machined components, electronics products, equipment design and fabrication for chemical processing.

To enable a more comprehensive use of computers in firms the CIM (computer-integrated manufacturing) technology, which also includes applications concerning the business functions of companies, was created. CIM systems can handle order entry, cost accounting, customer billing and employee time records and payroll. The scope of CIM technology includes all activities that are concerned with production. Therefore in many ways CIM represents the highest level of automation in manufacturing.

TEXTBLOCK 3/12 // URL: http://world-information.org/wio/infostructure/100437611663/100438659495
 
Epilogue

As scientists are working hard on quantum computers and also on quantum cryptography, one can expect another revolution in the study of encryption within the next years. By then, today's hardware and software tools will look extraordinarily dull. At the moment it is impossible to foresee the effects of those means on cryptography and on democratic developments; the best and the worst can be expected at the same time. A certain dose of pessimism and a touch of persecution mania are probably the right mixture of emotions towards those tendencies, as the idea of Big Brother came into existence long ago.

At the same time it will - in part - be the people's decision whether to let science work against them or not. The acceleration of data transmission calls for an acceleration of encryption methods. And this again falls back on us, as an acceleration of daily life, blurring the private and the public once more.
We live at an intersection where job and private life grow together. Cryptography cannot help us there. The privacy in our minds, the virtuality of everything private and public, lies in the field of democracy, or at least in what is - by reference to human rights - regarded as democracy.

TEXTBLOCK 4/12 // URL: http://world-information.org/wio/infostructure/100437611776/100438658875
 
acceleration

TEXTBLOCK 5/12 // URL: http://world-information.org/wio/infostructure/100437611777/100438658418
 
Advertising and the Content Industry - The Coca-Cola Case

Attempts to dictate their rules to the media have become common practice among marketers and the advertising industry. As in the Chrysler case, where the company demanded that magazines give advance notice of controversial articles, recent attempts to put pressure on content providers have been made by the Coca-Cola Company.

According to a memo published by the New York Post, Coca-Cola demands a free ad from any publication that runs a Coke ad adjacent to stories on religion, politics, disease, sex, food, drugs, environmental issues, health, or stories that employ vulgar language. "Inappropriate editorial matter" will result in the publisher being liable for a "full make good," said the memo by Coke advertising agency McCann-Erickson. Asked about this practice, a Coke spokesperson said the policy has long been in effect.

(Source: Odwyerpr.com: Coke Dictates nearby Editorial. http://www.odwyerpr.com)

TEXTBLOCK 6/12 // URL: http://world-information.org/wio/infostructure/100437611652/100438657998
 
some essential definitions

some essential definitions in the field of cryptography are:
- cryptanalysis
- cryptology
- ciphers

"Few false ideas have more firmly gripped the minds of so many intelligent men than the one that, if they just tried, they could invent a cipher that no one could break." (David Kahn)

codes
plaintext
ciphertext
to encipher/encode
to decipher/decode

The variants of encryption systems are endless.
Deciphering always comes down to the same game of trial and error: first guessing the encryption method, then the key. Pruning the space of candidates helps. Sooner or later, after a longer or shorter period, a code or cipher breaks. Monoalphabetic ciphers can be broken easily and are of course no longer used today, except for games.
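
To illustrate why a monoalphabetic cipher breaks so easily, consider a minimal sketch in Python that enciphers text with a simple alphabet shift and then recovers the shift purely by trial and error and letter-frequency scoring; the frequency table and scoring rule are rough simplifications chosen for the sake of the example.

import string
from collections import Counter

def caesar_encipher(plaintext: str, shift: int) -> str:
    # Shift each letter by a fixed amount: the simplest monoalphabetic cipher.
    result = []
    for ch in plaintext.lower():
        if ch in string.ascii_lowercase:
            result.append(chr((ord(ch) - ord('a') + shift) % 26 + ord('a')))
        else:
            result.append(ch)
    return ''.join(result)

def break_caesar(ciphertext: str):
    # Trial and error over all 26 shifts, scoring each candidate by how closely
    # its letter frequencies match typical English text.
    english_freq = {'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0,
                    'n': 6.7, 's': 6.3, 'h': 6.1, 'r': 6.0}
    best_shift, best_text, best_score = 0, '', float('-inf')
    for shift in range(26):
        candidate = caesar_encipher(ciphertext, -shift)
        counts = Counter(c for c in candidate if c.isalpha())
        total = sum(counts.values()) or 1
        score = sum(english_freq.get(c, 0) * n / total for c, n in counts.items())
        if score > best_score:
            best_shift, best_text, best_score = shift, candidate, score
    return best_shift, best_text

secret = caesar_encipher("few false ideas have gripped so many minds", 3)
print(break_caesar(secret))   # (3, the recovered plaintext)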

for further information on codes and ciphers etc. see:
http://www.optonline.com/comptons/ceo/01004A.html
http://www.ridex.co.uk/cryptology/#_Toc439908851

TEXTBLOCK 7/12 // URL: http://world-information.org/wio/infostructure/100437611776/100438659070
 
"Stealth Sites"

"Stealth sites" account for a particular form of hidden advertisement. Stealth sites look like magazines, nicely designed and featuring articles on different topics, but in reality are set up for the sole purpose of featuring a certain companies products and services. "About Wines" for example is a well-done online magazine, featuring articles on food and travel and also publishes articles on wine, which surprisingly all happen to be from Seagram.

TEXTBLOCK 8/12 // URL: http://world-information.org/wio/infostructure/100437611652/100438657995
 
Steganography

Ciphers as well as codes are transmitted openly. Everyone can see that they exist. Not so with steganograms.
Steganography is the art and science of communicating in a way which hides the existence of the secret part in that communication. During the Italian Renaissance and the time of the Elizabethan Age in England cryptography was very popular, for political reasons as well as for amusements (see John Dee).
In literature steganography played an important role. Many steganograms of that period have only been deciphered recently, like some of the Shakespearean sonnets, which now seem to suggest that the actor William Shakespeare was not the author of the famous poems and dramas, but that the true author was Francis Bacon, or even Francis Tudor, whom some ciphers and other sources describe as Queen Elizabeth I's secret son.

for further details see:
http://home.att.net/~tleary/
http://www.thur.de/ulf/stegano/
http://www2.prestel.co.uk/littleton/gm2_rw.htm

One kind of steganogram is digital watermarking:
Watermarks protect digital images and videos, but also audio and multimedia products. They are digital signals embedded into other digital signals. They are meant to be invisible at first sight and should be nearly impossible to remove. Watermarks are produced by overlaying some sort of identifying signal onto the original image (non-digital watermarks, like those on banknotes, can be seen by holding the paper against the light). A watermark cannot simply be copied: copying the image destroys it, and any alteration of the original destroys the watermark, too.
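
The underlying idea of putting one digital signal into another can be illustrated with a minimal least-significant-bit sketch in Python. It is a toy steganogram, not the watermarking scheme of any commercial product; the carrier pixel values are invented for the example.

def embed_lsb(pixels, message: bytes):
    # Hide a message in the least significant bit of each 8-bit pixel value.
    # The carrier must provide at least 8 pixels per message byte.
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("carrier too small for message")
    stego = pixels[:]
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit   # overwrite only the lowest bit
    return stego

def extract_lsb(pixels, length: int) -> bytes:
    # Recover `length` bytes from the least significant bits of the carrier.
    out = bytearray()
    for byte_index in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_index * 8 + i] & 1) << i
        out.append(value)
    return bytes(out)

carrier = list(range(200, 256)) * 2      # a stand-in for real pixel data
stego = embed_lsb(carrier, b"hidden")
print(extract_lsb(stego, 6))             # b'hidden'

The sketch also shows why such marks are fragile: any later change to the carrier's low-order bits, for instance through compression or format conversion, erases the hidden signal.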

Watermarking is one of the typical inventions of cryptography made to assist the biggest content owners, but it is advertised as something necessary and helpful for everybody. Who in fact gets any advantage out of watermarking? The private user will most of the time not really need it, except perhaps for a small collection of pictures.
But the big enterprises do. There is a tendency to watermark more and more information on the Internet, which until now has been considered a free and cheap way to receive information. Watermarking could stop this democratic development.

for further information see:
http://www.isse.gmu.edu/~njohnson/Steganography

TEXTBLOCK 9/12 // URL: http://world-information.org/wio/infostructure/100437611776/100438659021
 
Commercial vs. Independent Content

Commercial media aim at economies of scale and scope, with the goal of maximizing profits. As advertising money usually is their primary source of revenue, their content is very often attuned to meet the needs of advertisers and marketers. Information necessary for citizens' participation in the public sphere usually plays only a minor role in their programming, as it does not comply with the demands of an economic system whose principal aim is the generation of profit. They are also virtually always structured in accord with, and to help reinforce, society's defining hierarchical social relationships, and are generally controlled by and controlling of other major social institutions, particularly corporations.

Independent content providers, on the other hand, mostly act on a non-profit basis and try to avoid dependence on corporate powers and the state. One of their main concerns is the critical observation of public interest issues. The central aim of independent content providers' activities usually is to bring aspects and standpoints neglected by the (commercial) mainstream media to the public and to subvert society's defining hierarchical social relationships. Promoting public debate and an active civil society, they engage in the organization of alert actions and information campaigns or create subversive art.

TEXTBLOCK 10/12 // URL: http://world-information.org/wio/infostructure/100437611734/100438659280
 
Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and the corresponding abilities of governmental authorities, corporations and copyright enforcement agencies are countered by similar efforts and abilities of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three of them show a similar design. Content is split up and spread on several servers. When a file is requested, the pieces are reassembled. This design makes it difficult to censor content. None of these systems is a commercial product.
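
The common design - splitting content up and spreading it over several servers, then reassembling the pieces on request - can be illustrated with a minimal sketch. It is a toy, not the actual protocol of Freenet, Free Haven or Publius (which add encryption, redundancy and key splitting); the server names and the chunking scheme are invented for the example.

def split_content(data: bytes, servers):
    # Split a document into as many pieces as there are servers,
    # so that no single server holds the whole file.
    n = len(servers)
    chunk = -(-len(data) // n)               # ceiling division
    return {server: data[i * chunk:(i + 1) * chunk]
            for i, server in enumerate(servers)}

def reassemble(pieces, servers):
    # Fetch the pieces back in the original server order and concatenate them.
    return b"".join(pieces[server] for server in servers)

servers = ["node-a.example", "node-b.example", "node-c.example"]
pieces = split_content(b"uncensorable document", servers)
print(reassemble(pieces, servers))           # b'uncensorable document'

A censor who takes down one of the three servers removes only a fragment; in the real systems the pieces are additionally encrypted and stored redundantly, so the document remains reconstructable as long as enough pieces stay reachable.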

The most advanced system seems to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is a strong statement against online censorship. No longer can it be said that taking a firm stand against the use of technologies limiting the freedom of individuals is a position of radical leftists only.

For more information on Publius, see John Schwartz, Online and Unidentifiable? in: The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html .

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 11/12 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable; they are often quite fantastic constructs that are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock went on to work on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stressing that measurement of the Internet mattered for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get a clue to their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."
Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity of each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet is currently far smaller, in both capacity and traffic, than the switched voice network (with an effective bandwidth of 75 Gbps at December 1997), while the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U.S. (with an effective bandwidth of about 330 Gbps at December 1997), although they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications; because these consume more bandwidth, both will lead to unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Beyond the small sample size, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
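
The sampling idea behind such host counts can be sketched roughly as follows; this is a toy, assuming a Linux-style ping command and an invented list of assigned addresses, and it skips the corrections a real survey applies.

import random
import subprocess

def host_responds(address: str, timeout_s: int = 1) -> bool:
    # Send a single ICMP echo request with the system ping utility.
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), address],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

def estimate_host_count(assigned_addresses, sample_share: float = 0.01) -> int:
    # Ping a small random sample of all assigned addresses and project
    # the share of responding hosts onto the whole address list.
    sample_size = max(1, int(len(assigned_addresses) * sample_share))
    sample = random.sample(assigned_addresses, sample_size)
    responding = sum(host_responds(a) for a in sample)
    return round(len(assigned_addresses) * responding / sample_size)

# hypothetical input: every IP address that has been assigned a name
addresses = ["192.0.2." + str(i) for i in range(1, 255)]
print(estimate_host_count(addresses, sample_share=0.05))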

Internet Weather

Like daily weather, traffic on the Internet, the conditions for data flows, is monitored too, hence called Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings of servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare the response times to past ones and to the response times of servers in the same region.
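
A rough sketch of how such a 0-100 value could be derived from ping response times follows; the scoring rule is invented for illustration and is not the formula used by The Matrix, Inc. or the Internet Traffic Report.

def weather_index(current_rtts_ms, past_average_ms: float) -> int:
    # Condense current round-trip times into a 0-100 value: 100 means responses
    # are as fast as (or faster than) the historical average, 0 means no replies.
    scores = []
    for rtt in current_rtts_ms:
        if rtt is None:                      # lost packet, no reply at all
            scores.append(0.0)
        else:
            scores.append(min(1.0, past_average_ms / rtt))
    return round(100 * sum(scores) / len(scores))

# hypothetical measurements against one server, in milliseconds
print(weather_index([80.0, 95.0, None, 120.0], past_average_ms=90.0))   # about 67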

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visits to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and thus IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
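
The difference between hits and page views can be made concrete with a small sketch that scans Web server log lines in the common log format; the sample lines and the rule of counting only HTML documents as page views are assumptions made for this example.

import re

LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

def hits_and_page_views(log_lines):
    # Count every requested file as a hit, but only HTML documents
    # (not images, stylesheets or scripts) as page views.
    hits = page_views = 0
    for line in log_lines:
        match = LOG_LINE.search(line)
        if not match:
            continue
        hits += 1
        path = match.group("path").split("?")[0]
        if path.endswith((".html", ".htm", "/")):
            page_views += 1
    return hits, page_views

sample = [
    '10.0.0.1 - - [01/Feb/2000:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 4321',
    '10.0.0.1 - - [01/Feb/2000:10:00:01 +0000] "GET /logo.gif HTTP/1.0" 200 812',
    '10.0.0.1 - - [01/Feb/2000:10:00:01 +0000] "GET /banner.gif HTTP/1.0" 200 930',
]
print(hits_and_page_views(sample))   # (3, 1): three hits, but only one page view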

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: for example, the importance of a column to its readers. They may read a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.
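
Counting visits amounts to grouping page requests into sessions. A minimal sketch of such grouping follows; the 30-minute inactivity timeout is a common convention assumed here, and the request records are invented for the example.

from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)      # assumed inactivity limit

def count_visits(requests):
    # Group (client, timestamp) request records into visits: a new visit
    # starts whenever a client has been inactive longer than the timeout.
    last_seen = {}
    visits = 0
    for client, timestamp in sorted(requests, key=lambda r: r[1]):
        previous = last_seen.get(client)
        if previous is None or timestamp - previous > SESSION_TIMEOUT:
            visits += 1
        last_seen[client] = timestamp
    return visits

requests = [
    ("10.0.0.1", datetime(2000, 2, 1, 10, 0)),
    ("10.0.0.1", datetime(2000, 2, 1, 10, 5)),    # same visit
    ("10.0.0.1", datetime(2000, 2, 1, 12, 0)),    # new visit after a long pause
    ("10.0.0.2", datetime(2000, 2, 1, 10, 2)),    # a different client
]
print(count_visits(requests))   # 3

Note that two different people sharing the address 10.0.0.1 would still be counted as one client here, which is exactly the ambiguity described above.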


If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative as well as Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 12/12 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Electronic Data Interchange (EDI)

EDI is an international standard for the electronic exchange of business documents relating to the trade of goods and services. It enables trading partners to conduct routine business transactions, such as purchase orders, invoices and shipping notices, independently of the computer platform used by the trading partners. Standardization by EDI translation software assures the correct interpretation of data.

EDI might become increasingly important to electronic commerce.

INDEXCARD, 1/9
 
Seagram Company Ltd.

Seagram is the largest producer and marketer of distilled spirits in the world. It is headquartered in Montreal, Que. The company began when Distillers Corp., Ltd., a Montreal distillery owned by Samuel Bronfman, acquired Joseph E. Seagram & Sons in 1928. Under the leadership of the founder's son, Edgar M. Bronfman, who became head of the company in 1971, the firm diversified during the 1950s and '60s from its original base of blended whiskies into the production and marketing of scotch, bourbon, rum, vodka, gin, and many different wines. It also expanded into the European, Latin American, East Asian, and African markets with its products. The company adopted its present name in 1975. It produces more than 400 different brands of distilled spirits and wines. Edgar M. Bronfman, Jr., took over as head of the company in 1989. Seagram in 1995 purchased MCA Inc., a media and entertainment firm, from the Matsushita Electric Industrial Company.

INDEXCARD, 2/9
 
Virtual Private Networks

Virtual Private Networks provide secured connections to a corporate site over a public network such as the Internet. Data transmitted through these secure connections are encrypted and therefore have to be decrypted before they can be read.
These networks are called virtual because connections are provided only when you connect to a corporate site; they do not rely on dedicated lines and support mobile use.

INDEXCARD, 3/9
 
Internet Exchanges

Internet exchanges are intersecting points between major networks.

List of the World's Public Internet exchanges (http://www.ep.net)

http://www.ep.net/
INDEXCARD, 4/9
 
The Spot

http://www.thespot.com/

http://www.thespot.com/
INDEXCARD, 5/9
 
Galileo Galilei

Galileo Galilei (1564-1642), the Italian mathematician and physicist, is called the father of the Enlightenment. He proved the laws of free fall, improved the telescope, and much more. Galilei is still famous for his fights against the Catholic Church. He published his writings in Italian instead of Latin, so that everybody could understand him, which made him popular. As he did not stop defending the heliocentric world system, the Inquisition put him on trial twice and forbade him to continue working on his experiments.

INDEXCARD, 6/9
 
Cooperative Association for Internet Data Analysis (CAIDA)

Based at the University of California's San Diego Supercomputer Center, CAIDA supports cooperative efforts among the commercial, government and research communities aimed at promoting a scalable, robust Internet infrastructure. It is sponsored by the Defense Advanced Research Projects Agency (DARPA) through its Next Generation Internet program, by the National Science Foundation, Cisco, Inc., and Above.net.

INDEXCARD, 7/9
 
Integrated circuit

Also called microcircuit, the integrated circuit is an assembly of electronic components, fabricated as a single unit, in which active semiconductor devices (transistors and diodes) and passive devices (capacitors and resistors) and their interconnections are built up on a chip of material called a substrate (most commonly made of silicon). The circuit thus consists of a unitary structure with no connecting wires. The individual circuit elements are microscopic in size.

INDEXCARD, 8/9
 
Terrestrial antennas

Microwave transmission systems based on terrestrial antennas are similar to satellite transmission systems. Providing reliable high-speed access, they are used for cellular phone networks.

The implementation of the Wireless Application Protocol (WAP) makes wireless access to Internet services such as e-mail and even the World Wide Web via cellular phones convenient. Microwave transmission systems are therefore becoming increasingly important.

INDEXCARD, 9/9