Internet services

The Internet can be used in different ways: for distributing and retrieving information, for one-to-one, one-to-many and many-to-many communication, and for accessing services. Accordingly, there are different services on offer. The most important of these are listed below.

Telnet

FTP (File Transfer Protocol)

Electronic Messaging (E-Mail)

World Wide Web (WWW)

Bulletin Board Systems (BBS)

Electronic Data Interchange (EDI)

Internet Relay Chat (IRC)

Multiple User Dimensions (MUDs)

Gopher

TEXTBLOCK 1/13 // URL: http://world-information.org/wio/infostructure/100437611791/100438659819
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs that are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make truly reliable measurement techniques impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, Vinton Cerf, in the name of the Internet Activities Board, proposed guidelines for researchers considering measurement experiments on the Internet, stating that such measurement matters for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko also consider the long-distance private line networks: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity of each of the major US providers of backbone services.
They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network. The private line networks are considerably larger in aggregate capacity than the Internet: about as large as the U.S. voice network (with an effective bandwidth of about 330 Gbps in December 1997), though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications; both consume more bandwidth and produce unanticipated amounts of data traffic.
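
Their crossover claim can be reproduced with back-of-the-envelope arithmetic. The Python sketch below is illustrative only: the starting data-to-voice ratio and the voice growth rate are assumptions of mine, not figures from the paper; only the roughly 100% annual growth of Internet traffic comes from the text above.

    # Illustrative crossover estimate. Assumed: data traffic is 5% of voice
    # traffic at year-end 1997, and voice traffic grows ~8% per year.
    # From the text above: Internet traffic grows ~100% per year (doubles).
    voice = 1.00   # voice traffic, normalized to 1.0 at year-end 1997
    data = 0.05    # assumed data/voice ratio at year-end 1997
    year = 1997
    while data < voice:
        year += 1
        data *= 2.00     # ~100% annual growth of Internet traffic
        voice *= 1.08    # assumed modest growth of voice traffic
    print(f"data traffic overtakes voice traffic in {year}")  # -> 2002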

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
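
In outline, the sample-and-project step is easy to sketch. The following Python fragment is a toy version under stated assumptions: it shells out to a Linux-style ping command, and the address list merely stands in for the millions of name-assigned addresses a real survey collects.

    # Minimal sketch of ISC-style host counting: ping a 1% random sample of
    # all name-assigned IP addresses and project the reply rate onto the
    # whole list. The addresses passed in are illustrative placeholders.
    import random
    import subprocess

    def is_pingable(ip):
        """Send a single ICMP echo request; True if a reply came back."""
        result = subprocess.run(["ping", "-c", "1", "-W", "2", ip],
                                stdout=subprocess.DEVNULL,
                                stderr=subprocess.DEVNULL)
        return result.returncode == 0

    def estimate_host_count(addresses, sample_share=0.01):
        """Ping a random sample and project to the full address list."""
        sample_size = max(1, int(len(addresses) * sample_share))
        sample = random.sample(addresses, sample_size)
        replies = sum(is_pingable(ip) for ip in sample)
        return round(len(addresses) * replies / sample_size)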

Internet Weather

Like the daily weather, traffic on the Internet, that is, the conditions for data flows, is monitored too, hence the name Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e. g.) and to compare response times to past ones and to the response times of servers in the same reach.
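
A toy version of such a rating might look as follows; the 0-100 scale mirrors the Internet Traffic Report, but the scoring formula itself is invented here for illustration.

    # Rate a server by comparing its current round-trip time (ms) with its
    # own history: 100 = as fast as usual, lower values = "worse weather".
    from statistics import mean

    def weather_rating(past_rtts_ms, current_rtt_ms):
        baseline = mean(past_rtts_ms)
        return round(100 * baseline / max(current_rtt_ms, baseline))

    # A server that usually answers in ~40 ms now takes 120 ms:
    print(weather_rating([38.0, 41.0, 40.0, 42.0], 120.0))  # -> 34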

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, and are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
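
The difference between hits and page views is easy to see in a simplified access log. In this hypothetical Python example, every requested file counts as a hit, but only the HTML document counts as a page view, and a reader served from a proxy cache never reaches the server at all.

    # One visitor loading a document made of one HTML file and three images.
    log = [
        ("10.0.0.7", "/article.html"),
        ("10.0.0.7", "/logo.gif"),
        ("10.0.0.7", "/photo1.jpg"),
        ("10.0.0.7", "/photo2.jpg"),
        ("10.0.0.9", "/article.html"),  # a second reader behind a proxy
    ]
    hits = len(log)
    page_views = sum(1 for _, path in log if path.endswith(".html"))
    print(f"{hits} hits, but only {page_views} page views")  # -> 5 hits, 2 page views
    # A third reader served from the proxy's cache adds nothing to either count.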

The editors of some electronic journals (e-journals) especially rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may pick up a journal just for one special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
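
Counting visits means grouping page requests into sessions. Below is a minimal sketch, assuming timestamped requests per client and the common convention that a gap of more than 30 minutes opens a new visit; the client identifier is still just an IP address, so the caveats above apply unchanged.

    # Group each client's requests into visits separated by >30-minute gaps.
    from collections import defaultdict

    SESSION_GAP = 30  # minutes

    def count_visits(requests):
        """requests: (client_ip, minutes_since_midnight) pairs."""
        by_client = defaultdict(list)
        for ip, t in requests:
            by_client[ip].append(t)
        visits = 0
        for times in by_client.values():
            times.sort()
            visits += 1  # the first request always opens a visit
            visits += sum(1 for a, b in zip(times, times[1:])
                          if b - a > SESSION_GAP)
        return visits

    # Requests at 10:00, 10:04 and 11:00 from one address make two visits:
    print(count_visits([("10.0.0.7", 600), ("10.0.0.7", 604), ("10.0.0.7", 660)]))  # -> 2
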
But there is another reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system: in a certain sense it is computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.

For Fun

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.
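
In outline, such a mapping boils down to aggregating registry records by location. The Python sketch below is hypothetical: RIPE's actual database is far richer, and the networks and postal areas shown are invented for illustration.

    # Count how many registered IP addresses fall into each postal area.
    from collections import Counter

    registrations = [            # (network, postal area) - invented examples
        ("193.0.0.0/21", "EC1"),
        ("193.0.8.0/22", "EC1"),
        ("194.1.0.0/16", "M1"),
        ("194.2.0.0/24", "EC1"),
    ]

    def address_count(cidr):
        """Number of IP addresses in a CIDR block, e.g. /24 -> 256."""
        prefix = int(cidr.split("/")[1])
        return 2 ** (32 - prefix)

    density = Counter()
    for network, area in registrations:
        density[area] += address_count(network)

    for area, n in density.most_common():
        print(area, n)   # densest areas first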





TEXTBLOCK 2/13 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
RTMark

RTMark is a group of culture jammers applying a brokerage system that benefits from "limited liability" like any other corporation. Using this principle, RTMark supports the sabotage (informative alteration) of corporate products, from dolls and children's learning tools to electronic action games, by channeling funds from investors to workers. RTMark searches for solutions that go beyond public relations and defines its "bottom line" as improving culture. It seeks cultural, not financial profit.

Strategies and Policies

RTMark is engaged in a whole range of projects designed to lead to positive social change. Projects with roughly similar intent, risk, or likelihood of accomplishment are grouped into "fund families", such as "The Frontier Fund". This fund is dedicated to challenging naive, utopian visions of the "global village", focusing on the implications of allowing corporations and other multinational interests to operate free of social context.

RTMark pursues its projects through donations by individuals, who can invest in a certain fund and specify exactly how the donated money should be used. RTMark has repeatedly gained attention through its projects, especially with its spoof websites, like the ones on Rudy Giuliani and the WTO, or its campaign against eToys, the company that tried to prevent the Internet art group etoy from using the domain etoy.com.

TEXTBLOCK 3/13 // URL: http://world-information.org/wio/infostructure/100437611734/100438659283
 
Advertisers and Marketers Perspective

With the rapid growth of the Internet and its audience, advertisers now have a new medium at their disposal. The placement of the first banner ads in 1994 marks the birth of Internet advertising. Although the advertising industry at first hesitated to adopt the new medium, two facts brushed away its doubts:

Migrating Television Audiences: The increased use of the Internet led people to redistribute their time budget. Whereas some cut down on eating and sleeping, more than a third reduced their television viewing and instead use the WWW.

Interesting Internet Demographics: While methodologies and approaches of research organizations studying the demographic composition of the Internet vary, the findings are relatively consistent: Internet users are young, well educated and earn high incomes.

Considering those findings, the Internet seems bound to be included in media planning: first, because part of the audience is shifting from TV to the WWW, and secondly, because the demographics of the Internet user population are irresistible to marketers.

TEXTBLOCK 4/13 // URL: http://world-information.org/wio/infostructure/100437611652/100438657907
 
FAIR (Fairness & Accuracy In Reporting)

FAIR is a national media watch group that offers criticism of media bias and censorship. It seeks to invigorate the First Amendment by advocating for greater diversity in the press and scrutinizes media practices that marginalize public interest, minority and dissenting viewpoints.

FAIR believes that independent, aggressive and critical media are essential to an informed democracy, but thinks that the mainstream media are increasingly cosy with the economic and political powers. With mergers in the news industry limiting the spectrum of viewpoints, and with U.S. media outlets overwhelmingly owned by for-profit conglomerates and supported by corporate advertisers, FAIR sees independent journalism compromised.

FAIR was established in 1986 to shake up the establishment-dominated media. As an anti-censorship organization, FAIR exposes important news stories that are neglected and defends journalists when they are muzzled.

Strategies and Policies

Research and Monitoring: FAIR monitors a wide range of national news media - newspapers, magazines, television and radio - and publishes regular reports documenting pro-establishment, pro-corporate tilt in major news outlets.

Media Outreach: In its efforts to challenge bias and censorship, FAIR maintains a regular dialogue with journalists at news media outlets across the country. FAIR makes recommendations to media professionals on how to expand, diversify and improve coverage of a wide range of issues.

Media Activism: FAIR encourages media consumers to become media activists and regularly puts out activist alerts. It works with a nationwide network of local activists and groups that focus on key issues in their communities and participate in national campaigns coordinated by FAIR.

Media Watch Desks: FAIR operates specialized research and advocacy desks that work with activists and media professionals on specific issues. The Women's Desk analyses the effects of sexism and homophobia in the media and works to get feminist perspectives included in the public debate. The Labor Desk scrutinizes and confronts class bias in news coverage that favors moneyed interests and slights workers and unions. The Racism Watch Desk monitors and combats the media's marginalization, misrepresentation and exclusion of people of color.

CounterSpin: FAIR runs a radio program that draws on a network of experts, analysts, activists and artists to expose and highlight censored stories, biased and inaccurate news, and the corporatisation of public broadcasting.

TEXTBLOCK 5/13 // URL: http://world-information.org/wio/infostructure/100437611734/100438659294
 
Virtual cartels, introduction

Among the most striking developments of the 1990s has been the emergence of a global commercial media market, utilizing new technologies and driven by the global trend toward deregulation.
This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market.

"A global oligopolistic market that covers the spectrum of media is now crystallizing, with very high barriers to entry."

(Robert McChesney, author of "Rich Media, Poor Democracy")

The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks only tolerates a small number of large players. Mergers, strategic alliances, partnerships and cooperations are therefore the daily routine of the ICT business. They bypass competition and create "virtual cartels".

TEXTBLOCK 6/13 // URL: http://world-information.org/wio/infostructure/100437611709/100438658911
 
1970s: Computer-Integrated Manufacturing (CIM)

Since the 1970s there has been a growing trend towards the use of computer programs in manufacturing companies. Not only functions related to design and production, but also business functions were to be facilitated through the use of computers.

Accordingly, CAD/CAM technology, relating to the use of computer systems for design and production, was developed. CAD (computer-aided design) was created to assist in the creation, modification, analysis, and optimization of design. CAM (computer-aided manufacturing) was designed to help with the planning, control, and management of production operations. Since the 1970s, CAD/CAM technology has been applied in many industries, including machined components, electronics products, and equipment design and fabrication for chemical processing.

To enable a more comprehensive use of computers in firms the CIM (computer-integrated manufacturing) technology, which also includes applications concerning the business functions of companies, was created. CIM systems can handle order entry, cost accounting, customer billing and employee time records and payroll. The scope of CIM technology includes all activities that are concerned with production. Therefore in many ways CIM represents the highest level of automation in manufacturing.

TEXTBLOCK 7/13 // URL: http://world-information.org/wio/infostructure/100437611663/100438659495
 
Advertising and the Media System

Media systems (especially broadcasting) can be classified into two different types:

Public Media Systems: Government control over broadcasting through ownership, regulation, and partial funding of public broadcasting services.

Private Media Systems: Ownership and control lie in the hands of private companies and shareholders.

Both systems can exist in various forms, according to the degree of control by governments and private companies, with mixed systems (public and private) as the third main kind.

Whereas public media systems are usually at least partially funded by governments, private broadcasting relies solely on advertising revenue. Still, even public media systems cannot exclude advertising as a source of revenue. Therefore both types are to a certain degree dependent on money coming in from advertisers.

And this has consequences for the content provided by the media. As attracting advertisers becomes critically important, the interests of the advertising industry frequently play a dominant role in how content is structured, and the creation of environments within the media that are favorable for advertising goods and services becomes more and more common.

TEXTBLOCK 8/13 // URL: http://world-information.org/wio/infostructure/100437611652/100438657942
 
Late 1970s - Present: Fourth Generation Computers

Following the invention of the first integrated circuits, ever more components could be fitted onto one chip. LSI (Large Scale Integration) was followed by VLSI (Very Large Scale Integration) and ULSI (Ultra-Large Scale Integration), which increased the number of components squeezed onto one chip into the millions and helped diminish the size as well as the price of computers. The new chips took the idea of the integrated circuit one step further, as they allowed the manufacture of a single microprocessor that could then be programmed to meet any number of demands.

Also, following the introduction of the minicomputer in the mid-1970s, a market for personal computers (PCs) was established by the early 1980s. As computers became easier to use and cheaper, they were no longer mainly utilized in offices and manufacturing, but also by the average consumer. The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used.

Further developments included the creation of mobile computers (laptops and palmtops) and especially networking technology. While mainframes shared time with many terminals for many applications, networking allowed individual computers to form electronic co-operations. LANs (Local Area Networks) permitted computers to share memory space, information and software, and to communicate with each other. Although LANs could already reach enormous proportions, it was the invention of the Internet that established an information and communication network on a global basis for the first time.

TEXTBLOCK 9/13 // URL: http://world-information.org/wio/infostructure/100437611663/100438659451
 
Advertising and the Content Industry - The Coca-Cola Case

Attempts to dictate their rules to the media have become common practice among marketers and the advertising industry. Similar to the Chrysler case, where the company demanded that magazines give advance notice of controversial articles, recent attempts to put pressure on content providers have been pursued by the Coca-Cola Company.

According to a memo published by the New York Post, Coca-Cola demands a free ad from any publication that runs a Coke ad adjacent to stories on religion, politics, disease, sex, food, drugs, environmental issues, health, or stories that employ vulgar language. "Inappropriate editorial matter" will result in the publisher being liable for a "full make good," said the memo by Coke advertising agency McCann-Erickson. Asked about this practice, a Coke spokesperson said the policy has long been in effect.

(Source: Odwyerpr.com: Coke Dictates nearby Editorial. http://www.odwyerpr.com)

TEXTBLOCK 10/13 // URL: http://world-information.org/wio/infostructure/100437611652/100438657998
 
"Attention Brokerage"

"Attention Brokerage" is one of the latest developments in the field of online advertising. The first Web-site applying the concept of selling and buying attention is Cybergold. Users, who want to earn money have to register and then look at ads, which, of course, they have to prove by e.g. downloading software. Attention, according to this idea, represents a good, which is worth being paid for.

TEXTBLOCK 11/13 // URL: http://world-information.org/wio/infostructure/100437611652/100438658064
 
ZaMir.net

ZaMir.net started in 1992, trying to enable anti-war and human rights groups in former Yugoslavia to communicate with each other and co-ordinate their activities. Today there are an estimated 1,700 users in 5 different Bulletin Board Systems (Zagreb, Belgrade, Ljubljana, Sarajevo and Pristina). The Za-mir Transnational Network (ZTN) offers e-mail and conferences/newsgroups. The ZTN has its own conferences, which are exchanged between the 5 BBS, and additionally offers more than 150 international conferences. ZTN's aim is to help set up systems in other cities in the post-Yugoslav countries that have difficulty connecting to the rest of the world.

History

With the war in Yugoslavia, anti-war and human rights groups in former Yugoslavia found it very difficult to organize and faced huge problems co-ordinating their activities due to immense communication difficulties. So in 1992 foreign peace groups, together with institutions in Ljubljana, Zagreb and Belgrade, launched the Communications Aid project. Modems were distributed to peace and anti-war groups in Ljubljana, Zagreb, Belgrade and Sarajevo, and a BBS (Bulletin Board System) was installed.

As no direct connections could be made after spring 1992, connections were routed indirectly through Austria, Germany or Britain, which also enabled a link with the worldwide networks of BBSs. Nationalist dictators thereby lost their power to prevent their people from communicating. BBSs were installed in Zagreb and Belgrade and connected to the APC Network and associated networks. The Za-mir Transnational Network (ZTN) was born.

Strategies and Policies

With the help of ZaMir's e-mail network it has been possible to find and coordinate humanitarian aid for some of the many refugees of the war. The network has become an important means of communication for humanitarian organizations working in the war region and for sister organizations from other countries. It helps coordinate the work of activists from different countries of former Yugoslavia, and it also helps to coordinate the search for volunteers to aid in postwar reconstruction. ZTN also helped facilitate the exchange of information undistorted by government propaganda between Croatia, Serbia and Bosnia. Independent magazines like Arkzin (Croatia) and Vreme (Serbia) now publish electronic editions on ZTN.

TEXTBLOCK 12/13 // URL: http://world-information.org/wio/infostructure/100437611734/100438659208
 
Anonymity

"Freedom of anonymous speech is an essential component of free speech."

Ian Goldberg/David Wagner, TAZ Servers and the Rewebber Network: Enabling Anonymous Publishing on the World Wide Web, in: First Monday 3,4, 1999

Someone wants to hide his or her identity, to remain anonymous, if s/he fears being held accountable for something, say a publication, that is considered prohibited. Anonymous publishing has a long tradition in European history. Writers of erotic literature or pamphlets, for example, preferred to use pseudonyms or to publish anonymously. During the Enlightenment, books such as d'Alembert's and Diderot's famous Encyclopaedia were printed and distributed secretly. Today Book Locker, a company selling electronic books, renews this tradition by allowing authors to post writings anonymously, to publish without the threat of being punished for it. Sometimes anonymity is a precondition for reporting human rights abuses: investigative journalists and regime critics, for example, may rely on anonymity. But we do not have to look that far; even you might need or use anonymity sometimes, say, when you are a woman wanting to avoid sexual harassment in chat rooms.

The original design of the Net, as far as it is preserved, offers a relatively high degree of privacy, because due to the client-server model all that is known about you is a report of the machine from which information was, or is being, requested. But this design of the Net interferes with the wish of corporations to know you, even to know more about you than you want them to know. What is euphemistically called customer relationship management means the collection, compilation and analysis of personal information about you by others.

In 1997 America Online member Timothy McVeigh, a Navy employee, made his homosexuality publicly known in a short autobiographical sketch. Another Navy employee who read this sketch informed the Navy. America Online revealed McVeigh's identity to the Navy, which discharged McVeigh. As a consequence of a court ruling on that case, Timothy McVeigh was allowed to return to the Navy. Sometimes anonymity really matters.

On the Net you still have several possibilities to remain anonymous. You may visit web sites via an anonymizing service. You might use a Web mail account (provided that the personal information given to the Web mail service provider is not true), or you might use an anonymous remailing service, which strips off the headers of your mail to make it impossible to identify the sender, and forwards your message. Used in combination with encryption tools and technologies like FreeHaven or Publius, anonymous messaging services provide a powerful tool for countering censorship.

In Germany, in 1515, printers had to swear not to print or distribute any publication bypassing the councilmen. Today repressive regimes, such as China and Burma, and democratic governments, such as France and Great Britain, alike impose or have already imposed laws against anonymous publishing on the Net.

Anonymity might be used for abuses, that is true, but "the burden of proof rests with those who would seek to limit it." (Rob Kling, Ya-ching Lee, Al Teich, Mark S. Frankel, Assessing Anonymous Communication on the Internet: Policy Deliberations, in: The Information Society, 1999)

TEXTBLOCK 13/13 // URL: http://world-information.org/wio/infostructure/100437611742/100438659040
 
Electronic Data Interchange (EDI)

EDI is an international standard for the exchange of data relating to trade in goods and services. It enables trading partners to conduct routine business transactions, such as purchase orders, invoices and shipping notices, independently of the computer platforms used by the trading partners. Standardization by EDI translation software assures the correct interpretation of data.

EDI might become increasingly important to electronic commerce.

INDEXCARD, 1/9
 
Virtual Private Networks

Virtual Private Networks provide secured connections to a corporate site over a public network such as the Internet. Data transmitted through these secure connections are encrypted and therefore have to be decrypted before they can be read.
These networks are called virtual because connections are provided only when you connect to a corporate site; they do not rely on dedicated lines, and they support mobile use.

INDEXCARD, 2/9
 
National Science Foundation (NSF)

Established in 1950, the National Science Foundation is an independent agency of the U.S. government dedicated to the funding of basic research and education in a wide range of sciences and in mathematics and engineering. Today, the NSF supplies about one quarter of total federal support of basic scientific research at academic institutions.

http://www.nsf.gov

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/0/0,5716,2450+1+2440,00.html

INDEXCARD, 3/9
 
Multiple User Dungeons

MUDs are virtual spaces, usually of an adventurous kind, that you can log into, enabling you to chat with others, to explore, and sometimes to create rooms. Each user takes on the identity of an avatar, a computerized character.

INDEXCARD, 4/9
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet and as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 5/9
 
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet but a subset of it. It is constituted by documents that are linked together in such a way that you can switch from one document to another simply by clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used to create World Wide Web documents. These so-called hypertexts can combine text documents, graphics, videos, sounds, and Java applets, thus making multimedia content possible.

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index the stored information. (For regularly updated lists of the 100 most popular words that people are entering into search engines, click here). No search engine can retrieve all information on the whole World Wide Web; every search engine covers just a small part of it.

Among other things, that is the reason why the World Wide Web is not simply one very huge database, as is sometimes said: it lacks consistency. There is virtually almost infinite storage capacity on the Internet, that is true, a capacity which might prove almost everlasting too, a prospect which is sometimes consoling, but threatening as well.

According to the Internet Domain Survey of the Internet Software Consortium, the number of Internet host computers is growing rapidly. In October 1969 the first two computers were connected; this number grew to 376,000 in January 1991 and 72,398,092 in January 2000.

World Wide Web History Project, http://www.webhistory.org/home.html

http://www.searchwords.com/
http://www.islandnet.com/deathnet/
http://www.salonmagazine.com/21st/feature/199...
INDEXCARD, 6/9
 
Telnet

Telnet allows you to log in remotely to a computer connected to the Internet.

INDEXCARD, 7/9
 
Electronic Messaging (E-Mail)

Electronic messages are transmitted and received by computers through a network. By E-Mail texts, images, sounds and videos can be sent to single users or simultaneously to a group of users. Now texts can be sent and read without having them printed.

E-Mail is one of the most popular and important services on the Internet.

INDEXCARD, 8/9
 
Censorship of Online Content in China

During the Tiananmen massacre, reports and photos transmitted by fax machines gave notice of what was happening with only a short delay. The Chinese government has learned its lesson well and has "regulated" Internet access from the beginning. All Internet traffic into and out of China passes through a few gateways, a few entry points, thus making censorship a relatively easy task. Screened out are the web sites of organizations and media which express dissident viewpoints: Taiwan's Democratic Progress Party and Independence Party, The New York Times, CNN, and sites dealing with Tibetan independence and human rights issues.

Users are expected not to "harm" China's national interests and therefore have to apply for permission to access the Internet; Web pages have to be approved before being published on the Net. For the development of measures to monitor and control Chinese content providers, China's state police has joined forces with the MIT.

For further information on Internet censorship, see Human Rights Watch, World Report 1999.

http://www.dpp.org/
http://www.nytimes.com/
http://www.hrw.org/worldreport99/special/inte...
INDEXCARD, 9/9