Challenges for Copyright by ICT: Digital Content Providers

Providers of digital information might be confronted with copyright-related problems when they use some of the special features of hypertext media, like frames and hyperlinks (both of which draw on third-party content available on the Internet to enhance a webpage or CD-ROM), or when they operate a search engine or online directory on their website.

Framing

Frames are often used to help define, and navigate within, a content provider's website. Still, when they are used to present (copyrighted) third-party material from other sites, issues of passing off and misleading or deceptive conduct, as well as copyright infringement, immediately arise.

Hyperlinking

It is generally held that the mere creation of a hyperlink does not, of itself, infringe copyright, as the words indicating a link or the displayed URL are usually unlikely to be considered a "work". Nevertheless, if a link is clicked on, the user's browser downloads a full copy of the material at the linked address, creating a copy in the RAM of the user's computer courtesy of the address supplied by the party that published the link. Although it is widely agreed that the permission to download material over the link must be part of an implied license granted by the person who made the material available on the web in the first place, the scope of this implied license is still the subject of debate. Another option that has been discussed is to consider linking fair use.

Furthermore, hyperlinks and other "information location tools", like online directories or search engines, could cause their operators trouble if they refer or link users to a site that contains infringing material. In this case it is as yet unclear whether providers can be held liable for infringement.

TEXTBLOCK 1/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659590
 
Enforcement: Copyright Management and Control Technologies

With the increased ease of reproducing and transmitting unauthorized copies of digital works over electronic networks, concerns have arisen among the copyright holder community. They fear a further growth of copyright piracy and demand adequate protection of their works. A development which started in the mid-1990s and addresses the copyright owners' apprehensions is the creation of copyright management systems. Technological protection for their works, the copyright industry argues, is necessary to prevent widespread infringement, thus giving them the incentive to make their works available online. In their view the ideal technology should be "capable of detecting, preventing, and counting a wide range of operations, including open, print, export, copying, modifying, excerpting, and so on." Additionally such systems could be used to maintain "records indicating which permissions have actually been granted and to whom".

TEXTBLOCK 2/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659674
 
0 - 1400 A.D.

150
A smoke signals network covers the Roman Empire

The Roman smoke signals network consisted of towers within a visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.
For a similar telegraph network in ancient Greece see Aeneas Tacitus' optical communication system.

About 750
In Japan block printing is used for the first time.

868
In China the world's first dated book, the Diamond Sutra, is printed.

1041-1048
In China movable type made from clay is invented.

1088
First European medieval university is established in Bologna.

The first of the great medieval universities was established in Bologna. At the beginning universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that owning a copy of a widely desired book could qualify one for an invitation to a university. Holding a lecture amounted to reading a book aloud, much as a priest read from the Bible during services. Attending a lecture meant copying the lecture word by word, so that you had your own copy of the book, thus enabling you to hold a lecture, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

TEXTBLOCK 3/24 // URL: http://world-information.org/wio/infostructure/100437611796/100438659702
 
Biometric applications: surveillance

Biometric technologies are not surveillance technologies in themselves, but as identification technologies they provide an input into surveillance. Technologies such as face recognition are combined with camera systems and criminal data banks in order to supervise public places and single out individuals.

Another example is the use of biometric technologies in the supervision of probationers, who in this way carry their special hybrid status between imprisonment and freedom with them, so that they can be tracked down easily.

Unlike biometric applications in access control, where one is aware of the biometric data extraction process, what makes biometrics used in surveillance a particularly critical issue is the fact that biometric samples are extracted routinely, unnoticed by the individuals concerned.

TEXTBLOCK 4/24 // URL: http://world-information.org/wio/infostructure/100437611729/100438658740
 
Global data bodies - intro

- Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity ...

Critical Art Ensemble

Global data bodies

1. Introduction

Informatisation has meant that things that once were "real", i.e. whose existence could be experienced sensually, are becoming virtual. Instead of the real existence of a thing, the virtual refers to its possibility of existence. As this process advances, an increasing identification of the possible with the real occurs. Reality migrates into a dim and dematerialised grey area. In the end, the possible counts as the real; virtualisation creates an "as-if" experience.

The experience of the body is also affected by this process. For example, in bio-technology, the human body and its functions are digitised, which prepares an understanding of the body exclusively in terms of its potential manipulation: the body becomes whatever it could be. But digitisation has not only affected the understanding and the social significance of the body, it has also altered the meaning of presence, traditionally identified with the body. The advance of information and communication technologies (ICTs) has meant that for an increasing number of activities we no longer need be physically present; our "virtual" presence, achieved by logging onto an electronic information network, is sufficient.

This development, trumpeted as the pinnacle of convenience by the ICT industries and governments interested in attracting investment, has deeply problematic aspects as well. For example, when it is no longer "necessary" to be physically present, it may soon no longer be possible or allowed. Online banking, offered to customers as a convenience, also serves as a justification for charging higher fees to those unwilling or unable to add banking to their household chores. Online public administration may be expected to lead to similar effects. The reason for this is that the digitalisation of the economy relies on the production of surplus data. Data has become the most important raw material of modern economies.

In modern economies, informatisation and virtualisation mean that people are structurally forced to carry out their business and live their lives in such a way as to generate data.

Data are the most important resource for the New Economy. By contrast, activities which do not leave behind a trace of data, as for example growing your own carrots or paying cash rather than by plastic card, are discouraged and structurally suppressed.

TEXTBLOCK 5/24 // URL: http://world-information.org/wio/infostructure/100437611761/100438659649
 
Hill & Knowlton

Although it is generally hard to distinguish between public relations and propaganda, Hill & Knowlton, the world's leading PR agency, represents an extraordinary example of the manipulation of public opinion through public relations activities. Hill & Knowlton not only lobbied for countries accused of human rights abuses, like China, Peru, Israel, Egypt and Indonesia, but also represented the repressive Duvalier regime in Haiti.

It furthermore played a central role in the Gulf War. On behalf of the Kuwaiti government it presented a 15-year-old girl to testify before Congress about human rights violations in a Kuwaiti hospital. The girl was later found to be the daughter of Kuwait's ambassador to the U.S., and her testimony became the centerpiece of a finely tuned PR campaign orchestrated by Hill & Knowlton and co-ordinated with the White House on behalf of the government of Kuwait and the Citizens for a Free Kuwait group. Inflaming public opinion against Iraq and swinging the U.S. Congress in favor of war in the Gulf, this was probably one of the largest and most effective public relations campaigns in history.

Running campaigns against abortion for the Catholic Church and representing the Church of Scientology, large PR firms like Hill & Knowlton scarcely hesitate to manipulate public and congressional opinion and government policy through media campaigns, congressional hearings, and lobbying, when necessary. Co-operation with intelligence agencies also seems not to be unknown to Hill & Knowlton.

Accused of pursuing potentially illegal proxy spying operations for intelligence agencies, Richard Cheney, head of Hill & Knowlton's New York office, denied these allegations, but said that "... in such a large organization you never know if there's not some sneak operation going on." On the other hand former CIA official Robert T. Crowley acknowledged that "Hill & Knowlton's overseas offices were perfect 'cover` for the ever-expanding CIA. Unlike other cover jobs, being a public relations specialist did not require technical training for CIA officers." Furthermore the CIA, Crowley admitted, used its Hill & Knowlton connections to "... put out press releases and make media contacts to further its positions. ... Hill & Knowlton employees at the small Washington office and elsewhere distributed this material through CIA assets working in the United States news media."

(Source: Carlisle, Johan: Public Relationships: Hill & Knowlton, Robert Gray, and the CIA. http://mediafilter.org/caq/)

TEXTBLOCK 6/24 // URL: http://world-information.org/wio/infostructure/100437611652/100438658088
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, often through matrilineal descent.


TEXTBLOCK 7/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of networks have to agree on. So there is no way to ever gain ultimate control of the Matrix/Internet.
The in some respects decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and no common standards for sustaining basic operations: there are, for example, authorities for IP number and domain name registrations.
Ever since, the organizational structures for Internet administration have changed according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies make decisions on common guidelines, such as communication protocols, cooperatively, so that compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consensus about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just express the prevailing consensus. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulations, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.
For a detailed report on Internet governance, click here.

TEXTBLOCK 8/24 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe or unbreakable. To decipher a text means to go through many, sometimes nearly - but never really - endless attempts. For the computers of today it might take hundreds of years or even more to go through all possible keys, but in the end the code remains breakable. Much faster quantum computers may one day prove that.
Therefore the decision to elect a certain method of enciphering finally is a matter of trust.
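
To give a rough sense of the magnitudes involved, the following back-of-the-envelope sketch estimates the worst-case time of an exhaustive key search. The key lengths and the rate of one billion guesses per second are assumptions chosen purely for illustration; they are not figures taken from this text.

```python
# Illustrative estimate of a brute-force key search.
# Assumed: an attacker can test one billion keys per second.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(key_bits, guesses_per_second):
    """Worst-case time, in years, to try every key of the given length."""
    keyspace = 2 ** key_bits
    return keyspace / guesses_per_second / SECONDS_PER_YEAR

rate = 1e9  # assumed guess rate, keys per second
for bits in (40, 56, 128):
    print(f"{bits:>3}-bit key: about {years_to_exhaust(bits, rate):.2e} years")
```

Under these assumed numbers a 40-bit key falls within minutes and a 56-bit key within a few years, whereas a 128-bit key would take many orders of magnitude longer than the age of the universe; in practice, the question therefore shifts from absolute unbreakability to trust in the chosen method and key length.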

For the average user of computers it is rather difficult to understand or even realize the dangers and/or the technological background of electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others, the specialists, and relying on the information they provide.
The websites explaining the problems behind it (and also the articles and books concerning the topic) are of course written by experts as well, very often in their typical scientific language, hardly understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background remains an untouchable sphere of knowledge.

The fact that the dangers are hard to see through and that the need for security measures is something most people know only from media reports leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. The media reports mentioned often specialize in talking about the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing control over the money in one's bank account if someone steals the credit card number or other important financial data. The term "security", surely connected to those issues, is a completely different one from the one that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

for the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 9/24 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Definition

During the last 20 years the old Immanuel Wallerstein paradigm of center, periphery and semi-periphery has found a new costume: ICTs. After colonialism, neo-colonialism and neoliberalism, a new method of marginalization is emerging: the digital divide.

"Digital divide" describes the fact that the world can be divided into people who
do and people who do not have access to (or the education to handle) modern information technologies, e.g. cellular telephones, television, the Internet. This digital divide concerns people all over the world, but as usual it is above all people in the formerly so-called third world countries and in rural areas who suffer; the poor and less-educated suffer most from that divide.
More than 80% of all computers with access to the Internet are situated in larger cities.

"The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these `extra' services ....Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century of millennium."
(Izumi Aizi)

for more information see:
http://www.whatis.com/digital_divide.htm

TEXTBLOCK 10/24 // URL: http://world-information.org/wio/infostructure/100437611730/100438659300
 
Intellectual Property: A Definition

Intellectual property, very generally, relates to the output that results from intellectual activity in the industrial, scientific, literary and artistic fields. Traditionally intellectual property is divided into two branches:

1) Industrial Property

a) Inventions
b) Marks (trademarks and service marks)
c) Industrial designs
d) Unfair competition (trade secrets)
e) Geographical indications (indications of source and appellations of origin)

2) Copyright

The protection of intellectual property is guaranteed through a variety of laws, which grant the creators of intellectual goods and services certain time-limited rights to control the use made of their products. Those rights apply to the intellectual creation as such, and not to the physical object in which the work may be embodied.

TEXTBLOCK 11/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659434
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a shrinking in size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with the advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of the transistor technology (and also used assembly language instead of the difficult machine language). Both were developed for atomic energy laboratories and could handle enormous amounts of data, but they were costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information by using computers. Decisive for the success of computers in business was the stored-program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, that gave them the flexibility to be cost-effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 12/24 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services, computers have become communication devices instead of being solely computational or typewriter-like devices. Corporate networks have become increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests draw some further distinctions by splitting up the easily understandable term "proprietary networks" into terms needing explanation, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online is selling its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising, because "organizing" is increasingly equated with "filtering" of seemingly objectionable messages and "rating" of content. For more information on these issues, click here. If you want to know more about the technical nature of computer networks, here is a link to the corresponding article in the Encyclopaedia Britannica.

Especially for financial transactions, secure proprietary networks become increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the SWIFT network, the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated to be above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in the trading of equities. They offer faster trading at reduced costs and better prices on trades for brokers and institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 13/24 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties in measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because since the inception of the ARPANet, the forerunner of the Internet, network measurement has been an important task. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked further on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stating that measurement of the Internet was important. This was due to two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends and identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet to a competitive industry (bluntly spoken: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing numbers of customers. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of roughly 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, while the private line networks are considerably larger in aggregate capacity than the Internet: with an effective bandwidth of about 330 Gbps as of December 1997, they are about as large as the voice network in the U.S., though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will predominantly derive from people staying online longer and from multimedia applications; because these consume more bandwidth, both will be responsible for unanticipated amounts of data traffic.

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best known efforts to count the number of hosts on the Internet. Happily the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Quite apart from the small sample, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
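
The sampling idea described above can be illustrated with a short, hypothetical sketch: ping a small random sample of the registered addresses and project the observed response rate onto the whole list. The address list, the 1% sample rate and the is_reachable() check are placeholders for illustration, not the ISC's actual survey code.

```python
import random

def estimate_reachable_hosts(addresses, is_reachable, sample_rate=0.01):
    """Estimate how many addresses answer a ping, based on a small random sample."""
    sample_size = max(1, int(len(addresses) * sample_rate))
    sample = random.sample(addresses, sample_size)
    answering = sum(1 for addr in sample if is_reachable(addr))
    response_rate = answering / sample_size
    return round(response_rate * len(addresses))  # projection to all registered addresses

# Toy usage: pretend roughly 60% of 100,000 registered addresses actually answer.
addresses = [f"10.0.{i // 256 % 256}.{i % 256}" for i in range(100_000)]
estimate = estimate_reachable_hosts(addresses, lambda addr: random.random() < 0.6)
print(estimate)  # roughly 60,000
```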

Internet Weather

Like the daily weather, traffic on the Internet, that is the conditions for data flows, is monitored too, hence called Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts) and to compare response times to past ones and to the response times of servers in the same reach.
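
As a hedged sketch of the comparison step, the function below maps a freshly measured round-trip time against a stored history of past measurements and turns it into a 0-100 rating. The rating formula and the sample values are invented for illustration; the services named above do not publish their exact algorithms.

```python
import statistics

def weather_rating(current_ms, history_ms):
    """Map a current round-trip time to a 0-100 rating (100 = fast and reliable)."""
    baseline = statistics.mean(history_ms)   # typical past round-trip time
    if current_ms <= 0:
        return 0
    ratio = baseline / current_ms            # greater than 1 means faster than usual
    return max(0, min(100, round(100 * ratio)))

history = [42.0, 45.5, 40.2, 44.1]    # past round-trip times in milliseconds (invented)
print(weather_rating(41.0, history))  # close to the baseline -> rating near 100
print(weather_rating(120.0, history)) # much slower than usual -> low rating
```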

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate.
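
A toy example makes this inflation concrete: one HTML document plus twenty graphics yields twenty-one hits but only a single page view. The request paths below are invented; the point is only the counting rule.

```python
# One visit to a page built from one HTML file and twenty graphics files.
requests = ["/article.html"] + [f"/img/photo{i}.gif" for i in range(20)]

hits = len(requests)                                            # every requested file counts
page_views = sum(1 for path in requests if path.endswith(".html"))  # only the document counts

print(f"hits: {hits}, page views: {page_views}")  # hits: 21, page views: 1
```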
In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. Users might share computers and the corresponding IP addresses and host names with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may read a journal just for a special column and not care about the journal's other contents. Deleting this column for not receiving enough visits may cause these readers to turn their backs on the journal.
More advanced, but just slightly better at best, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to get to know more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, cookies come into play here.
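
As a rough illustration of why cookies help here, the sketch below assigns each new browser a random identifier and counts unique visitors separately from page views. The identifiers and requests are invented; real tracking systems layer far more on top of this.

```python
import uuid

class VisitorCounter:
    """Toy counter that tells page views and cookie-identified visitors apart."""
    def __init__(self):
        self.page_views = 0
        self.known_ids = set()

    def handle_request(self, cookie_id=None):
        if cookie_id is None:                  # first visit: issue a new cookie value
            cookie_id = str(uuid.uuid4())
        self.known_ids.add(cookie_id)
        self.page_views += 1
        return cookie_id                       # sent back to the browser as a cookie

counter = VisitorCounter()
returned = counter.handle_request()            # new browser, receives a cookie
counter.handle_request(cookie_id=returned)     # same browser, second page view
counter.handle_request()                       # a different browser
print(counter.page_views, len(counter.known_ids))  # 3 page views, 2 visitors
```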


If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study about the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 14/24 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Legal Protection: TRIPS (Trade-Related Aspects of Intellectual Property Rights)

Another important multilateral treaty concerned with intellectual property rights is the TRIPS agreement, which was negotiated during the Uruguay Round and entered into force with the establishment of the WTO in January 1995. It sets minimum standards for the national protection of intellectual property rights and procedures as well as remedies for their enforcement (enforcement measures include the potential for trade sanctions against non-complying WTO members). The TRIPS agreement has been widely criticized for its stipulation that biological organisms be subject to intellectual property protection. In 1999, 44 nations considered it appropriate to treat plant varieties as intellectual property.

The complete TRIPS agreement can be found on: http://www.wto.org/english/tratop_e/trips_e/t_agm1_e.htm

TEXTBLOCK 15/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659758
 
Challenges for Copyright by ICT: Introduction

Traditional copyright and the practice of paying royalties to the creators of intellectual property emerged with the introduction of the printing press (1456). Therefore early copyright law was tailored to the technology of print and the (re)production of works in analogue form. Over the centuries legislation concerning the protection of intellectual property has been adapted several times in order to respond to the technological changes in the production and distribution of information.

Yet again new technologies have altered the way (copyrighted) works are produced, copied, made obtainable and distributed. The emergence of global electronic networks and the increased availability of digitalized intellectual property confront existing copyright with a variety of questions and challenges. The combination of several types of works within one larger work or on one data carrier, the digital format (although a recent development, it has been the object of detailed legal scrutiny), and networking (telephone and cable networks have been in use for a long time, although they do not permit interactivity) are nothing really new. What is a novel fact is that recent technologies allow the presentation and storage of text, sound and visual information in digital form, so that the entire information can be generated, altered and used by and on one and the same device, irrespective of whether it is provided online or offline.


TEXTBLOCK 16/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659517
 
Legal Protection: European Union

Within the EU's goal of establishing a European single market, intellectual property rights are also of significance. Therefore the European Commission aims at the harmonization of the respective national laws of the EU member states and at a generally more effective protection of intellectual property on an international level. Over the years it has adopted a variety of Conventions and Directives concerned with different aspects of the protection of industrial property as well as copyright and neighboring rights.

An overview of EU activities relating to intellectual property protection is available on the website of the European Commission (DG Internal Market): http://www.europa.eu.int/comm/internal_market/en/intprop/intprop/index.htm

TEXTBLOCK 17/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659574
 
Timeline Cryptography - Introduction

Besides oral conversations and written language many other ways of transporting information are known: the bush telegraph, drums, smoke signals etc. Those methods are not cryptography, but they still need encoding and decoding, which means that the history of language, the history of communication and the history of cryptography are closely connected to each other.
The timeline gives an insight into the endless fight between enciphering and deciphering. The reasons for it can be found in public and private issues at the same time, though they are mostly connected to military maneuvers and/or political tasks.

One of the most important researchers on cryptography through the centuries is David Kahn; many parts of the following timeline originate from his work.

TEXTBLOCK 18/24 // URL: http://world-information.org/wio/infostructure/100437611776/100438658824
 
Anonymity

"Freedom of anonymous speech is an essential component of free speech."

Ian Goldberg/David Wagner, TAZ Servers and the Rewebber Network: Enabling Anonymous Publishing on the World Wide Web, in: First Monday 3,4, 1999

Someone wants to hide one's identity, to remain anonymous, if s/he fears being held accountable for something, say, a publication, that is considered to be prohibited. Anonymous publishing has a long tradition in European history. Writers of erotic literature or pamphlets, for example, preferred to use pseudonyms or publish anonymously. During the Enlightenment books such as d'Alembert's and Diderot's famous Encyclopaedia were printed and distributed secretly. Today Book Locker, a company selling electronic books, renews this tradition by allowing authors to post writings anonymously, to publish without the threat of perishing for it. Sometimes anonymity is a precondition for reporting human rights abuses. For example, investigative journalists and regime critics may rely on anonymity. But we do not have to look that far; even you might need or use anonymity sometimes, say, when you are a woman wanting to avoid sexual harassment in chat rooms.

The original design of the Net, as far as it is preserved, offers a relatively high degree of privacy, because due to the client-server model all that is known about you is the machine from which information was, or is, requested. But this design of the Net interferes with the wish of corporations to know you, even to know more about you than you want them to know. What are euphemistically called customer relationship management systems mean the collection, compilation and analysis of personal information about you by others.

In 1997 America Online member Timothy McVeigh, a Navy employee, made his homosexuality publicly known in a short autobiographical sketch. Another Navy employee reading this sketch informed the Navy. America Online revealed McVeigh's identity to the Navy, who discharged McVeigh. As the consequence of a court ruling on that case, Timothy McVeigh was allowed to return to the Navy. Sometimes anonymity really matters.

On the Net you still have several possibilities to remain anonymous. You may visit web sites via an anonymizing service. You might use a Web mail account (provided the personal information given to the web mail service provider is not true), or you might use an anonymous remailing service, which strips off the headers of your mail to make it impossible to identify the sender and then forwards your message. Used in combination with encryption tools and technologies like FreeHaven or Publius, anonymous messaging services provide a powerful tool for countering censorship.

In Germany, in 1515, printers had to swear not to print or distribute any publication bypassing the councilmen. Today repressive regimes, such as China and Burma, and democratic governments, such as France and Great Britain, alike impose or have already imposed laws against anonymous publishing on the Net.

Anonymity might be used for abuses, that is true, but "the burden of proof rests with those who would seek to limit it." (Rob Kling, Ya-ching Lee, Al Teich, Mark S. Frankel, Assessing Anonymous Communication on the Internet: Policy Deliberations, in: The Information Society, 1999)

TEXTBLOCK 19/24 // URL: http://world-information.org/wio/infostructure/100437611742/100438659040
 
Virtual cartels, oligopolistic structures

Global networks require global technical standards ensuring the compatibility of systems. Being able to define such standards makes a corporation extremely powerful. And it requires the suspension of competitive practices. Competition is relegated to the symbolic realm. Diversity and pluralism become the victims of the globalisation of baroque sameness.

The ICT market is dominated by incomplete competition aimed at short-term market domination. In a very short time, new ideas can turn into best-selling technologies. Innovation cycles are extremely short. But today's state-of-the-art products are embryonic trash.

    According to the Computer and Communications Industry Association, Microsoft is trying to aggressively take over the network market. This would mean that AT&T would control 70 % of all long distance phone calls and 60 % of cable connections.



    AOL and Yahoo are lone leaders in the provider market. AOL has 21 million subscribers in 100 countries. In a single month, AOL registers 94 million visits. Two thirds of all US internet users visited Yahoo in December 1999.



    The world's 13 biggest internet providers are all American.



    AOL and Microsoft have concluded a strategic cross-promotion deal. In the US, the AOL icon is installed on every Windows desktop. AOL has also concluded a strategic alliance with Coca Cola.


TEXTBLOCK 20/24 // URL: http://world-information.org/wio/infostructure/100437611709/100438658963
 
Atrocity Stories

Atrocity stories are nothing other than lies; the term "atrocity stories" simply pretends to be more diplomatic.
Their purpose is to destroy an image of the enemy and to create a new one, mostly a bad one. The story creating the image is not necessarily made up completely. It can also be an existing story changed in a certain direction.
The most important thing about atrocity stories is to stay within the bounds of possibility. Even if the whole story is made up, it must be probable or at least possible, following rumors. It may be most successful if a rumor is spread on purpose some time before the atrocity story is launched, because as soon as something seems familiar, it is easier to believe.

TEXTBLOCK 21/24 // URL: http://world-information.org/wio/infostructure/100437611661/100438658524
 
Basics: Introduction

Copyright law is a branch of intellectual property law and deals with the rights of intellectual creators in their works. The scope of copyright protection as laid down in Article 2 of the 1996 WIPO Copyright Treaty "... extends to expressions and not to ideas, procedures, methods of operation or mathematical concepts as such." Copyright law protects the creativity concerning the choice and arrangement of words, colors, musical notes etc. It grants the creators of certain specified works exclusive rights relating to the "copying" and use of their original creation.


TEXTBLOCK 22/24 // URL: http://world-information.org/wio/infostructure/100437611725/100438659520
 
Timeline 1900-1970 AD

1913 the wheel cipher gets re-invented as a strip

1917 William Frederick Friedman starts working as a cryptoanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptoanalysis

- an AT&T-employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random-keys

1918 the Germans start using the ADFGVX-system, which is later broken by the Frenchman Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German Military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine for the German Military

late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizabeth Smith Friedman regularly deciphers the codes of rum-smugglers during Prohibition

1929 Lester S. Hill publishes his book Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken by the Polish mathematician Marian Rejewski and, from 1939 on, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese invent their so-called Purple machine, with the help of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they imagined their own codes to be unbreakable as well - and were not careful enough.

1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code breaking computer is put into action at Bletchley Park

1943-1980 the cryptographic Venona Project, carried out by the NSA, runs for a longer period than any other program of that type

1948 Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops a system of separate public-keys and private-keys

TEXTBLOCK 23/24 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
Timeline BC

~ 1900 BC: Egyptian writers use non-standard Hieroglyphs in inscriptions of a royal tomb; supposedly this is not the first but the first documented example of written cryptography

1500 an enciphered formula for the production of pottery is written down in Mesopotamia

parts of the Hebrew writing of Jeremiah's words are written down in "atbash", which is nothing other than a reversed alphabet and one of the first famous methods of enciphering

4th century Aeneas Tacticus invents a form of beacon signaling by introducing a sort of water-clock

487 the Spartans introduce the so-called "skytale" for sending short secret messages to and from the battlefield

170 Polybius develops a system to convert letters into numerical characters, an invention called the Polybius Chequerboard.

50-60 Julius Caesar develops an enciphering method, later called the Caesar cipher, shifting each letter of the alphabet by an amount fixed in advance. Like atbash, this is a monoalphabetic substitution.
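
Both substitutions mentioned in this timeline are simple enough to sketch in a few lines. The snippet below, restricted to the letters A-Z for brevity, is an illustrative reconstruction rather than a historical artifact.

```python
import string

ALPHABET = string.ascii_uppercase

def atbash(text):
    """Replace each letter with its mirror image in the alphabet (A<->Z, B<->Y, ...)."""
    table = str.maketrans(ALPHABET, ALPHABET[::-1])
    return text.upper().translate(table)

def caesar(text, shift=3):
    """Shift each letter by a fixed, previously agreed amount."""
    shifted = ALPHABET[shift:] + ALPHABET[:shift]
    table = str.maketrans(ALPHABET, shifted)
    return text.upper().translate(table)

print(atbash("JEREMIAH"))           # -> QVIVNRZS
print(caesar("VENI VIDI VICI", 3))  # -> YHQL YLGL YLFL
```

Because each plaintext letter always maps to the same ciphertext letter, both schemes fall to simple frequency analysis, which is exactly what makes them monoalphabetic substitutions.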

TEXTBLOCK 24/24 // URL: http://world-information.org/wio/infostructure/100437611776/100438659084
 
Gottfried Wilhelm von Leibniz

b. July 1, 1646, Leipzig
d. November 14, 1716, Hannover, Hanover

German philosopher, mathematician, and political adviser, important both as a metaphysician and as a logician and distinguished also for his independent invention of the differential and integral calculus. In 1661 he entered the University of Leipzig as a law student; there he came into contact with the thought of men who had revolutionized science and philosophy, men such as Galileo, Francis Bacon, Thomas Hobbes, and René Descartes. In 1666 he wrote De Arte Combinatoria ("On the Art of Combination"), in which he formulated a model that is the theoretical ancestor of some modern computers.

INDEXCARD, 1/30
 
Network Information Center (NIC)

Network information centers are organizations responsible for registering and maintaining domain names on the Internet. Until competition in domain name registration was introduced, they were the only ones responsible for it. Most countries have their own network information center.

INDEXCARD, 2/30
 
Liability of ISPs

ISPs (Internet Service Providers), BBSs (Bulletin Board Service operators), systems operators and other service providers (in the U.S.) can usually be held liable for infringing activities that take place through their facilities under three theories: 1) direct liability: to establish direct infringement liability there must be some kind of a direct volitional act; 2) contributory liability: a party may be liable for contributory infringement where "... with knowledge of the infringing activity, [it] induces, causes or materially contributes to the infringing activity of another." Therefore a person must know or have reason to know that the subject matter is copyrighted and that particular uses violate copyright law. For contributory infringement to attach, there must be a direct infringement of which the contributory infringer has knowledge and which it encourages or facilitates; and 3) vicarious liability: a party may be vicariously liable for the infringing acts of another if it a) has the right and ability to control the infringer's acts and b) receives a direct financial benefit from the infringement. Unlike contributory infringement, knowledge is not an element of vicarious liability.


INDEXCARD, 3/30
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. The primary research methods used are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. As a result a vast collection of historic video, audio, documents, and software is expected. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 4/30
 
User tracking

User tracking is a generic term that covers all the techniques of monitoring the movements of a user on a web site. User tracking has become an essential component in online commerce, where no personal contact with customers is established, leaving companies with the predicament of not knowing who they are talking to. Some companies, such as Red Eye, Cyber Dialogue, and SAS, offer complete technology packages for user tracking and data analysis to online businesses. Technologies include software solutions such as e-mine, e-discovery, or WebHound.

Whenever user tracking is performed without the explicit agreement of the user, or without laying open which data are collected and what is done with them, considerable privacy concerns have been raised.

http://www.redeye.co.uk/
http://www.cyberdialogue.com/
http://www.sas.com/
http://www.spss.com/emine/
http://www.sas.com/solutions/e-discovery/inde...
http://www.sas.com/products/webhound/index.ht...
http://www.linuxcare.com.au/mbp/meantime/
INDEXCARD, 5/30
 
Leonard M. Adleman

Leonard M. Adleman was one of three persons in a team to invent the RSA public-key cryptosystem. The co-authors were Adi Shamir and Ron Rivest.

INDEXCARD, 6/30
 
File Transfer Protocol (FTP)

FTP enables the transfer of files (text, image, video, sound) to and from other remote computers connected to the Internet.

INDEXCARD, 7/30
 
VISA

Visa International's over 21,000 member financial institutions have made VISA one of the world's leading full-service payment networks. Visa's products and services include the Visa Classic card, Visa Gold card, Visa debit cards, Visa commercial cards and the Visa Global ATM Network. VISA operates in 300 countries and territories and also provides a large consumer payments processing system.

INDEXCARD, 8/30
 
Neighboring rights

Copyright laws generally provide for three kinds of neighboring rights: 1) the rights of performing artists in their performances, 2) the rights of producers of phonograms in their phonograms, and 3) the rights of broadcasting organizations in their radio and television programs. Neighboring rights attempt to protect those who assist intellectual creators to communicate their message and to disseminate their works to the public at large.

INDEXCARD, 9/30
 
Telephone

The telephone was not invented by Alexander Graham Bell, as is widely held to be true, but by Philipp Reiss, a German teacher. When he demonstrated his invention to important German professors in 1861, it was not enthusiastically greeted. Because of this dismissal, no financial support for further development was provided to him.

And here Bell comes in: In 1876 he successfully filed a patent for the telephone. Soon afterwards he established the first telephone company.

INDEXCARD, 10/30
 
Enigma

Device used by the German military command to encode strategic messages before and during World War II. The Enigma code was broken by a British intelligence system known as Ultra.

INDEXCARD, 11/30
 
Copyright management information

Copyright management information refers to information which identifies a work, the author of a work, the owner of any right in a work, or information about the terms and conditions of the use of a work, and any numbers or codes that represent such information, when any of these items of information are attached to a copy of a work or appear in connection with the communication of a work to the public.

INDEXCARD, 12/30
 
Proxy Servers

A proxy server is a server that acts as an intermediary between a workstation user and the Internet so that security, administrative control, and caching service can be ensured.

A proxy server receives a request for an Internet service (such as a Web page request) from a user. If it passes filtering requirements, the proxy server, assuming it is also a cache server, looks in its local cache of previously downloaded Web pages. If it finds the page, it returns it to the user without needing to forward the request to the Internet. If the page is not in the cache, the proxy server, acting as a client on behalf of the user, uses one of its own IP addresses to request the page from the server out on the Internet. When the page is returned, the proxy server relates it to the original request and forwards it on to the user.

Source: Whatis.com
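
Reduced to its core, the cache-or-fetch decision described above can be sketched in a few lines of Python; this toy version ignores filtering, cache expiry, headers, and error handling, and the URL is only an example.

from urllib.request import urlopen

cache = {}                                     # URL -> page body, kept in memory

def proxy_fetch(url):
    if url in cache:                           # cache hit: answer locally, no
        return cache[url]                      # request goes out to the Internet
    body = urlopen(url).read()                 # cache miss: fetch on the user's behalf
    cache[url] = body                          # keep a copy for later requests
    return body

page = proxy_fetch("http://example.com/")      # first call fetches from the origin
page = proxy_fetch("http://example.com/")      # second call is served from the cache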

INDEXCARD, 13/30
 
Europe Online

Established in 1998 and privately held, Europe Online created and operates the world's largest broadband "Internet via the Sky" network. The Europe Online "Internet via the Sky" service is available to subscribers in English, French, German, Dutch and Danish with more languages to come.

http://www.europeonline.com

INDEXCARD, 14/30
 
Proprietary Network

Proprietary networks are computer networks whose standards differ from those proposed by the International Organization for Standardization (ISO) in its Open Systems Interconnection (OSI) model. Designed to conform to standards implemented by the manufacturer, they do not assure compatibility with other network standards.

INDEXCARD, 15/30
 
cryptology

Also called "the study of codes", cryptology includes both cryptography and cryptanalysis.

INDEXCARD, 16/30
 
Framing

Framing is the practice of creating a frame or window within a web page in which the content of a different web page can be displayed. Usually, when a link is clicked, the new web page is presented within the frame while the remainder of the originating page stays visible.

INDEXCARD, 17/30
 
IBM

IBM (International Business Machines Corporation) manufactures and develops computer hardware equipment, application and system software, and related equipment.

IBM produced the first PC (Personal Computer), and its decision to make Microsoft DOS the standard operating system initiated Microsoft's rise to global dominance in PC software.

Business indicators:

1999 Sales: $ 86,548 million (+ 7.2 % from 1998)

Market capitalization: $ 181 bn

Employees: approx. 291,000

Corporate website: www.ibm.com

http://www.ibm.com/
INDEXCARD, 18/30
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 19/30
 
Vacuum tube

The first half of the 20th century was the era of the vacuum tube in electronics. This variety of electron tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer, completed in 1946).

INDEXCARD, 20/30
 
Total copyright industries

The total copyright industries encompass the "core copyright industries" and portions of many other industries that either create, distribute, or depend upon copyrighted works. Examples include retail trade (a portion of which is sales of video, audio, software, and books, for example), the doll and toy industry, and computer manufacturing.


INDEXCARD, 21/30
 
water-clocks

Water-clocks formed an early long-distance communication system. Every communicating party had an identical jar with a hole of the same size, kept closed, and the same amount of water in it. In each jar stood a stick with various messages written on it. When one party wanted to tell the other something, it made a fire signal. When the other answered, both opened the hole at the same time and, on another fire signal, closed it again simultaneously. The water had then sunk to the point on the stick marking the intended message.

INDEXCARD, 22/30
 
CIA

CIA's mission is to support the President, the National Security Council, and all officials who make and execute U.S. national security policy by: Providing accurate, comprehensive, and timely foreign intelligence on national security topics; Conducting counterintelligence activities, special activities, and other functions related to foreign intelligence and national security, as directed by the President. To accomplish its mission, the CIA engages in research, development, and deployment of high-leverage technology for intelligence purposes. As a separate agency, CIA serves as an independent source of analysis on topics of concern and works closely with the other organizations in the Intelligence Community to ensure that the intelligence consumer, whether Washington policymaker or battlefield commander, receives adequate intelligence information.

http://www.cia.gov

INDEXCARD, 23/30
 
Satellites

Communications satellites are relay stations for radio signals and provide reliable and distance-independent high-speed connections even at remote locations without high-bandwidth infrastructure.

In point-to-point transmission, the transmission method originally employed, satellites face increasing competition from fiber optic cables, so point-to-multipoint transmission is increasingly becoming the dominant satellite technology. Point-to-multipoint transmission enables the quick implementation of private networks consisting of very small aperture terminals (VSAT). Such networks are independent and make mobile access possible.

In the future, satellites will become more powerful and cheaper, and their orbits will be lower; their services might become as common as satellite TV is today.

For more information about satellites, see How Satellites Work (http://octopus.gma.org/surfing/satellites) and the Tech Museum's satellite site (http://www.thetech.org/hyper/satellite).

http://www.whatis.com/vsat.htm
http://octopus.gma.org/surfing/satellites
INDEXCARD, 24/30
 
AT&T

AT&T Corporation provides voice, data and video communications services to large and small businesses, consumers and government entities. AT&T and its subsidiaries furnish domestic and international long distance, regional, local and wireless communications services, cable television and Internet communications services. AT&T also provides billing, directory and calling card services to support its communications business. AT&T's primary lines of business are business services, consumer services, broadband services and wireless services. In addition, AT&T's other lines of business include network management and professional services through AT&T Solutions and international operations and ventures. In June 2000, AT&T completed the acquisition of MediaOne Group. With the addition of MediaOne's 5 million cable subscribers, AT&T becomes the country's largest cable operator, with about 16 million customers on the systems it owns and operates, which pass nearly 28 million American homes. (source: Yahoo)

Slogan: "It's all within your reach"

Business indicators:

Sales 1999: $ 62.391 bn (+ 17.2 % from 1998)

Market capitalization: $ 104 bn

Employees: 107,800

Corporate website: http://www.att.com/
INDEXCARD, 25/30
 
Chappe's fixed optical network

Claude Chappe built a fixed optical network between Paris and Lille. Covering a distance of about 240 km, it consisted of fifteen towers with semaphores.

Because this communication system was destined for practical military use, the transmitted messages were encoded. The messages were kept so secret that even those who transmitted them from tower to tower did not grasp their meaning; they just relayed codes they did not understand. Depending on weather conditions, messages could be sent at a speed of up to 2,880 km per hour.

Forerunners of Chappe's optical network are the Roman smoke signals network and Aeneas Tacitus' optical communication system.

For more information on early communication networks see Gerard J. Holzmann and Bjoern Pehrson, The Early History of Data Networks.

INDEXCARD, 26/30
 
National Science Foundation (NSF)

Established in 1950, the National Science Foundation is an independent agency of the U.S. government dedicated to funding basic research and education in a wide range of sciences as well as in mathematics and engineering. Today, the NSF supplies about one quarter of total federal support for basic scientific research at academic institutions.

http://www.nsf.gov

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/0/0,5716,2450+1+2440,00.html

INDEXCARD, 27/30
 
National Laboratory for Applied Network Research

NLANR, initially a collaboration among supercomputing sites supported by the National Science Foundation, was created in 1995 to provide technical and engineering support and overall coordination of the high-speed connections at the five NSF supercomputer centers.

Today NLANR offers support and services to institutions that are qualified to use high performance network service providers - such as Internet 2 and Next Generation Internet.

http://www.nlanr.net

INDEXCARD, 28/30
 
COMECON

The Council for Mutual Economic Aid (COMECON) was set up in 1949, consisting of six East European countries: Bulgaria, Czechoslovakia, Hungary, Poland, Romania, and the USSR, later joined by the German Democratic Republic (1950), Mongolia (1962), Cuba (1972), and Vietnam (1978). Its aim was to develop the member countries' economies on a complementary basis for the purpose of achieving self-sufficiency. In 1991, COMECON was replaced by the Organization for International Economic Cooperation.

INDEXCARD, 29/30
 
Caching

Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
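
A minimal sketch of the principle in Python, where the one-second delay stands in for a slow, remote retrieval and the cached copy is kept in local memory:

import time
from functools import lru_cache

@lru_cache(maxsize=None)
def fetch(key):
    time.sleep(1)                  # stands in for an expensive, remote retrieval
    return f"data for {key}"

fetch("a")                         # slow: ~1 second, result is stored locally
fetch("a")                         # fast: answered from the cache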

INDEXCARD, 30/30