0 - 1400 A.D.

150
A smoke signals network covers the Roman Empire

The Roman smoke signals network consisted of towers within visible range of each other and had a total length of about 4,500 kilometers. It was used for military signaling.
For a similar telegraph network in ancient Greece see Aeneas Tacticus' optical communication system.

About 750
In Japan block printing is used for the first time.

868
In China the world's first dated printed book, the Diamond Sutra, is produced.

1041-1048
In China movable type made from clay is invented.

1088
The first European medieval university is established in Bologna.

The first of the great medieval universities was established in Bologna. At the beginning, universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that possessing a copy of a widely desired book could qualify you for an invitation to a university. Holding a lecture amounted to reading a book aloud, much as a priest read from the Bible during services. Attending a lecture meant copying the lecture word for word, so that you ended up with your own copy of the book, which in turn enabled you to hold lectures, too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

TEXTBLOCK 1/11 // URL: http://world-information.org/wio/infostructure/100437611796/100438659702
 
Challenges for Copyright by ICT: Copyright Owners

The main concern of copyright owners, as the primary financial beneficiaries of intellectual property protection, is that digital environments facilitate pirate activities.

Reproduction and Distribution

Unlike copies of works made using analog copiers (photocopy machines, video recorders etc.), digital information can be reproduced extremely fast, at low cost and without any loss in quality. Since each copy is a perfect copy, no quality-related limits inhibit pirates from making as many copies as they please, and recipients of these copies have no incentive to return to authorized sources to get another qualitatively equal product. Additionally, the cost of making one extra copy of intellectual property online is insignificant, as are the distribution costs if the copy is moved to the end user over the Internet.

Control and Manipulation

In cross-border, global data networks it is almost impossible to control the exploitation of protected works. In particular, the use of anonymous remailers and other existing technologies complicates the prosecution of pirates. Digital files are also especially vulnerable to manipulation, both of the work itself and of any copyright management information embedded in it.

TEXTBLOCK 2/11 // URL: http://world-information.org/wio/infostructure/100437611725/100438659526
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers because the technical difficulties of measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the forerunner of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stating that measurement of the Internet was important for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services.

They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, and the private line networks are considerably larger in aggregate capacity than the Internet - about as large as the U.S. voice network (with an effective bandwidth of about 330 Gbps as of December 1997) - although they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications; both consume more bandwidth and account for unanticipated amounts of data traffic.
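
A back-of-the-envelope extrapolation makes this crossover claim easy to check. The following sketch (Python) uses illustrative starting volumes and a voice growth rate chosen only for the example; they are assumptions in the spirit of, not taken from, Coffman and Odlyzko's figures:

    # Illustrative extrapolation: when does data traffic overtake voice traffic?
    # Starting volumes and the voice growth rate are assumptions for the example;
    # the 100% yearly data growth is the figure cited above.
    data_tb_per_month = 3_000      # assumed US data traffic, late 1997 (TB/month)
    voice_tb_per_month = 40_000    # assumed US voice traffic, late 1997 (TB/month)

    year = 1997
    while data_tb_per_month < voice_tb_per_month:
        year += 1
        data_tb_per_month *= 2.0    # ~100% growth per year
        voice_tb_per_month *= 1.1   # ~10% growth per year (assumption)

    print(f"Data traffic overtakes voice traffic around {year}.")  # -> around 2002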

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict access to their domain data.
Beyond the small sample, this method has at least one flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
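
A minimal sketch of this sample-and-project method might look as follows (Python). The 1% sampling rate mirrors the description above; the host list and the single-ping reachability test are simplified assumptions, and the ping flags are those of the common Unix utility:

    # ISC-style host estimation: ping a random sample of the addresses found
    # in the DNS and project the sample's response rate onto the whole.
    import random
    import subprocess

    def is_pingable(ip: str) -> bool:
        """Send one ping (Unix flags: one packet, 1 s timeout) and check the answer."""
        result = subprocess.run(["ping", "-c", "1", "-W", "1", ip],
                                capture_output=True)
        return result.returncode == 0

    def estimate_hosts(assigned_ips: list[str], sample_rate: float = 0.01) -> int:
        """Project the reachable share of a small random sample onto all IPs."""
        size = max(1, int(len(assigned_ips) * sample_rate))
        sample = random.sample(assigned_ips, size)
        reachable = sum(is_pingable(ip) for ip in sample)
        return int(len(assigned_ips) * reachable / len(sample))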

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts) and to compare response times with past ones and with the response times of servers in the same region.
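
Reduced to its core, such a weather index is a comparison of current round-trip times against a historical baseline. The sketch below (Python) is a simplified illustration; the 0-100 formula is an assumption, not the one used by the services named above:

    # Toy "Internet weather" index: compare current round-trip times (RTTs)
    # to a historical baseline; high values mean fast, reliable connections.
    from statistics import mean

    def weather_index(current_rtts_ms: list[float],
                      baseline_rtts_ms: list[float]) -> int:
        """Score 0-100, where 100 means as fast as or faster than the baseline."""
        ratio = mean(baseline_rtts_ms) / mean(current_rtts_ms)
        return max(0, min(100, round(100 * ratio)))

    print(weather_index([120.0, 130.0], [100.0, 110.0]))  # slower than usual -> 84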

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of most visited Web sites may be compiled. I say may be because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically more visits to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
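
The difference between hits and page views is easy to demonstrate on Web server log lines (Python sketch below; the sample log entries and the file-extension heuristic for what counts as a "page" are assumptions for illustration):

    # Counting hits vs. page views: every requested file is a hit, but only
    # requests for HTML documents count as page views, so embedded images
    # inflate the hit count without adding a single page view.
    PAGE_ENDINGS = (".html", ".htm", "/")

    def count_hits_and_page_views(log_lines: list[str]) -> tuple[int, int]:
        hits, page_views = 0, 0
        for line in log_lines:
            path = line.split('"')[1].split()[1]  # request: 'GET /path HTTP/1.0'
            hits += 1
            if path.endswith(PAGE_ENDINGS):
                page_views += 1
        return hits, page_views

    log = [
        '1.2.3.4 - - [10/Oct/1999] "GET /index.html HTTP/1.0" 200 1043',
        '1.2.3.4 - - [10/Oct/1999] "GET /logo.gif HTTP/1.0" 200 2326',
        '1.2.3.4 - - [10/Oct/1999] "GET /photo.jpg HTTP/1.0" 200 5120',
    ]
    print(count_hits_and_page_views(log))  # (3, 1): three hits, one page view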

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature, such as the importance of a column to its readers. Readers may visit a journal just for one particular column and not care about the journal's other contents; deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits, i.e. the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, to be sure. But you cannot be certain that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative, and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London, have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 3/11 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Legal Protection: WIPO (World Intellectual Property Organization)

Probably the major player in the field of international intellectual property protection, and administrator of various multilateral treaties dealing with the legal and administrative aspects of intellectual property, is the WIPO.

Information on WIPO administered agreements in the field of industrial property (Paris Convention for the Protection of Industrial Property (1883), Madrid Agreement Concerning the International Registration of Marks (1891) etc.) can be found on: http://www.wipo.org/eng/general/index3.htm

Information on treaties concerning copyright and neighboring rights (Berne Convention for the Protection of Literary and Artistic Works (1886) etc.) is published on: http://www.wipo.org/eng/general/index5.htm

The most recent multilateral agreement on copyright is the 1996 WIPO Copyright Treaty. Among other things it provides that computer programs are protected as literary works and also introduces the protection of databases, which "... by reason of the selection or arrangement of their content constitute intellectual creations." Furthermore the 1996 WIPO Copyright Treaty contains provisions concerning technological measures, rights management information and establishes a new "right of communication to the public". It is available on: http://www.wipo.org/eng/diplconf/distrib/treaty01.htm

TEXTBLOCK 4/11 // URL: http://world-information.org/wio/infostructure/100437611725/100438659588
 
Individualized Audience Targeting

New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements" - probably the ultimate tool for advertisers. Although not yet widely used, companies like Amazon.com have already started to exploit individualized audience targeting for their purposes.

TEXTBLOCK 5/11 // URL: http://world-information.org/wio/infostructure/100437611652/100438658450
 
Timeline 1900-1970 AD

1913 the wheel cipher is re-invented in the form of a strip cipher

1917 William Frederick Friedman starts working as a cryptanalyst at Riverbank Laboratories, which also works for the U.S. Government. Later he creates a school for military cryptanalysis

- an AT&T-employee, Gilbert S. Vernam, invents a polyalphabetic cipher machine that works with random-keys

1918 the Germans start using the ADFGVX system, which is later broken by the Frenchman Georges Painvin

- Arthur Scherbius patents a ciphering machine and tries to sell it to the German military, but is rejected

1919 Hugo Alexander Koch invents a rotor cipher machine

1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded

1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine to the German military

late 1920s/30s increasingly it is criminals who use cryptology for their purposes (e.g. for smuggling). Elizebeth Smith Friedman regularly deciphers the codes of rum smugglers during Prohibition

1929 Lester S. Hill publishes his paper Cryptography in an Algebraic Alphabet, which contains enciphered parts

1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is first broken by the Pole Marian Rejewski and, from 1939 on, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England

1937 the Japanese invent their so-called Purple machine with the help of Herbert O. Yardley. The machine works with telephone stepping relays. It is broken by a team led by William Frederick Friedman. As the Japanese were unable to break the US codes, they imagined their own codes to be unbreakable as well - and were not careful enough.

1930s the Sigaba machine is invented in the USA, either by W. F. Friedman or his colleague Frank Rowlett

- at the same time the British develop the Typex machine, similar to the German Enigma machine

1943 Colossus, a code-breaking computer, is put into action at Bletchley Park

1943-1980 the cryptographic Venona Project, run by the NSA, takes place over a longer period than any other program of that type

1949 Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems

1960's the Communications-Electronics Security Group (= CESG) is founded as a section of Government Communications Headquarters (= GCHQ)

late 1960's the IBM Watson Research Lab develops the Lucifer cipher

1969 James Ellis develops the idea of separate public keys and private keys

TEXTBLOCK 6/11 // URL: http://world-information.org/wio/infostructure/100437611776/100438658921
 
Definition

During the last 20 years the old Immanuel Wallerstein paradigm of center, periphery and semi-periphery has found a new costume: ICTs. After colonialism, neo-colonialism and neoliberalism, a new method of marginalization is emerging: the digital divide.

"Digital divide" describes the fact that the world can be divided into people who
do and people who do not have access to (or the education to handle with) modern information technologies, e.g. cellular telephone, television, Internet. This digital divide is concerning people all over the world, but as usually most of all people in the formerly so called third world countries and in rural areas suffer; the poor and less-educated suffer from that divide.
More than 80% of all computers with access to the Internet are situated in larger cities.

"The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these `extra' services ....Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century of millennium."
(Izumi Aizi)

for more information see:
http://www.whatis.com/digital_divide.htm

TEXTBLOCK 7/11 // URL: http://world-information.org/wio/infostructure/100437611730/100438659300
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe and unbreakable. To decipher a text means to go through many - sometimes nearly, but never truly, endless - attempts. For the computers of today it might take hundreds of years or even more to go through all possible keys, yet in the end the code remains breakable. The much faster quantum computers will prove that one day.
Therefore the decision for a certain method of enciphering is finally a matter of trust.
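
A toy example makes the "finally breakable" point concrete. The sketch below (Python) exhaustively searches the key space of a deliberately weak single-byte XOR cipher; every additional key byte would multiply the work by 256, which is why brute force becomes impractical, but never impossible:

    # Brute-force key search against a toy single-byte XOR cipher. Real ciphers
    # have vastly larger key spaces, but the principle is the same: try every
    # key until one yields plausible plaintext.
    def xor_crypt(data: bytes, key: int) -> bytes:
        return bytes(b ^ key for b in data)   # XOR is its own inverse

    def brute_force(ciphertext: bytes, crib: bytes) -> list[int]:
        """Return every key whose trial decryption contains the expected crib."""
        return [key for key in range(256) if crib in xor_crypt(ciphertext, key)]

    secret = xor_crypt(b"attack at dawn", key=0x5A)
    print(brute_force(secret, crib=b"attack"))  # [90] -> the key, found by trial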

For the average computer user it is rather difficult to understand or even recognize the dangers and/or the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others - the specialists - and relying on the information they provide.
The websites explaining the underlying problems (and also the articles and books on the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial parts of those articles can be understood, whereas the real background remains an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. The media reports mentioned often specialize in the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing control over one's bank account when someone steals a credit card number or other important financial data. The term "security" that is connected to those issues is a completely different one from the "security" that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

for the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 8/11 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there have to be server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operations have long depended on global networking.

At the top are credit card companies such as Visa, American Express, MasterCard, and others. It would be erroneous to believe that the primary business of these companies is the provision of credit and the facilitation of credit information for sales transactions. In fact, information means much more than just credit information. In an advertisement of 1982, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the life blood of society, as Bill Gates expresses it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within a data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimization - for almost every decision taken. Data bunkers therefore have an essentially conservative political function and serve to maintain and strengthen the given social structures.

TEXTBLOCK 9/11 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
Accessing the Internet

Net connections can be based on wire-line and wireless access technologies.

Wire-line access: copper wires, coaxial cables, electric power lines, fiber-optic cables

Wireless access: satellites, mobile terrestrial antennas, fixed terrestrial antennas

Usually several kinds of network connections are employed at once. Generally speaking, when an e-mail message is sent, it travels from the user's computer via copper wires, coaxial cables, ISDN lines, etc., to an Internet Service Provider; from there via fiber-optic cables to the nearest Internet exchange; on into a backbone network, tunneling across the continent and diving through submarine fiber-optic cables across the Atlantic to another Internet exchange; from there via another backbone network and across another regional network to the Internet Service Provider of the message's recipient; and finally via cables and wires of different bandwidths to its destination, a workstation permanently connected to the Internet. At last a sound or a flashing icon informs your virtual neighbor that a new message has arrived.

Satellite communication

Although facing competition from fiber-optic cables as cost-effective solutions for broadband data transmission services, the space industry is gaining increasing importance in global communications. As computing, telephony, and audiovisual technologies converge, new wireless technologies are rapidly deployed occupying an increasing market share and accelerating the construction of high-speed networks.

Privatization of satellite communication

Until recently, transnational satellite communication was provided exclusively by intergovernmental organizations such as Intelsat, Intersputnik and Inmarsat.

Scheduled privatization of intergovernmental satellite consortia:

Satellite consortium | Year of foundation | Members | Scheduled date for privatization
Intelsat | 1964 | 200 nations under the leadership of the USA | 2001
Intersputnik | 1971 | 23 nations under the leadership of Russia | ?
Inmarsat | 1979 | 158 nations (all members of the International Maritime Organization) | privatized since 1999
Eutelsat | 1985 | nearly 50 European nations | 2001



When Intelsat began to accumulate losses because of management failures and the increasing market share of fiber-optic cables, this organizational scheme came under attack. Led by the USA, the Western industrialized countries successfully pressed for the privatization of all satellite consortia they are members of, and for competition by private carriers.

As of February 2000, there are 2680 satellites in service. Within the next four years a few hundred will be added by the new private satellite systems. Most of these systems will be so-called Low Earth Orbit satellite systems, which are capable of providing global mobile data services on a high-speed level at low cost.

Because of such technological improvements and increasing competition, experts expect satellite-based broadband communication to be as common, cheap, and ubiquitous as satellite TV today within the next five or ten years.

Major satellite communication projects

Project name | Main investors | Expected cost | Number of satellites | Date of service start-up
Astrolink | Lockheed Martin, TRW, Telespazio, Liberty Media Group | US$ 3.6 billion | 9 | 2003
Globalstar | 13 investors including Loral Space & Communications, Qualcomm, Hyundai, Alcatel, France Telecom, China Telecom, Daimler Benz and Vodafone/Airtouch | US$ 3.26 billion | 48 | 1998
ICO | 57 investors including British Telecom, Deutsche Telekom, Inmarsat, TRW and Telefonica | US$ 4.5 billion | 10 | 2001
Skybridge | 9 investors including Alcatel Space, Loral Space & Communications, Toshiba, Mitsubishi and Sharp | US$ 6.7 billion | 80 | 2002
Teledesic | Bill Gates, Craig McCaw, Prince Alwaleed Bin Talal Bin Abdul Aziz Alsaud, Abu Dhabi Investment Company | US$ 9 billion | 288 | 2004


Source: Analysys Satellite Communications Database

TEXTBLOCK 10/11 // URL: http://world-information.org/wio/infostructure/100437611791/100438659839
 
Linking and Framing: Cases

Mormon Church v. Sandra and Jerald Tanner

In a ruling of December 1999, a federal judge in Utah temporarily barred two critics of the Mormon Church from posting on their website the Internet addresses of other sites featuring pirated copies of a Mormon text. The judge said that it was likely that Sandra and Jerald Tanner had engaged in contributory copyright infringement when they posted the addresses of three Web sites that they knew, or should have known, contained the copies.

Kaplan, Carl S.: Copyright Decision Threatens Freedom to Link. In: New York Times. December 10, 1999.

Universal Studios v. Movie-List

The website Movie-List, which features links to externally hosted movie trailers, has been asked to completely refrain from linking to any of Universal Studios' servers containing the trailers, as this would infringe copyright.

Cisneros, Oscar S.: Universal: Don't Link to Us. In: Wired. July 27, 1999.

More cases concerned with the issue of linking, framing and the infringement of intellectual property are published in:

Ross, Alexandra: Copyright Law and the Internet: Selected Statutes and Cases.

TEXTBLOCK 11/11 // URL: http://world-information.org/wio/infostructure/100437611725/100438659639
 
Virtual Private Networks

Virtual Private Networks provide secured connections to a corporate site over a public network such as the Internet. Data transmitted through these secure connections are encrypted and therefore have to be decrypted before they can be read.
These networks are called virtual because connections are provided only when you connect to a corporate site; they do not rely on dedicated lines and support mobile use.
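
A minimal illustration of this encrypt-before-transit principle, using the symmetric Fernet scheme from the third-party Python package cryptography (the ad-hoc shared key is an assumption for the example; real VPNs negotiate keys and encrypt at the packet level):

    # Data entering the virtual tunnel are encrypted; only the endpoint
    # holding the shared key can decrypt and read them.
    from cryptography.fernet import Fernet  # pip install cryptography

    key = Fernet.generate_key()     # in a real VPN this key is negotiated
    tunnel = Fernet(key)

    packet = tunnel.encrypt(b"GET /intranet/report HTTP/1.0")
    print(packet)                   # unreadable ciphertext on the public network
    print(tunnel.decrypt(packet))   # readable again behind the corporate gateway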

INDEXCARD, 1/12
 
User tracking

User tracking is a generic term that covers all the techniques for monitoring the movements of a user on a Web site. User tracking has become an essential component in online commerce, where no personal contact with customers is established, leaving companies with the predicament of not knowing who they are talking to. Some companies, such as Red Eye, Cyber Dialogue, and SAS, offer complete technology packages for user tracking and data analysis to online businesses. Technologies include software solutions such as e-mine, e-discovery, or WebHound.

Whenever user tracking is performed without the explicit agreement of the user, or without laying open which data are collected and what is done with them, considerable privacy concerns have been raised.

http://www.redeye.co.uk/
http://www.cyberdialogue.com/
http://www.sas.com/
http://www.spss.com/emine/
http://www.sas.com/solutions/e-discovery/inde...
http://www.sas.com/products/webhound/index.ht...
http://www.linuxcare.com.au/mbp/meantime/
INDEXCARD, 2/12
 
Sony Corporation

Sony Corporation (Japanese: Sony KK) is a major Japanese manufacturer of consumer electronics products, headquartered in Tokyo. The company was incorporated in 1946 and spearheaded Japan's drive to become the world's dominant consumer electronics manufacturer in the late 20th century. The company was one of the first to recognize the potential of the consumer videotape market. In 1972 it formed an affiliate to market its Betamax colour videocassette system. In 1987-88 Sony purchased the CBS Records Group from CBS Inc., thus acquiring the world's largest record company. It followed that purchase with the purchase in 1989 of Columbia Pictures Entertainment Inc.

INDEXCARD, 3/12
 
Sun Microsystems

Founded in 1982 and headquartered in Palo Alto, USA, Sun Microsystems manufactures computer workstations, servers, and software.

http://www.sun.com

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/9/0,5716,108249+1+105909,00.html .

INDEXCARD, 4/12
 
Cookie

A cookie is an information package assigned to a client program (mostly a Web browser) by a server. The cookie is saved on your hard disk and is sent back each time this server is accessed. The cookie can contain various kinds of information: preferences for site access, the identification of authorized users, or tracking data about visits.

In online advertising, cookies serve the purpose of changing advertising banners between visits, or identifying a particular direct marketing strategy based on a user's preferences and responses.

Advertising banners can be permanently eliminated from the screen by filtering software such as that offered by Naviscope or Webwash.

Cookies are usually stored in a separate file of the browser, and can be erased or permanently deactivated, although many web sites require cookies to be active.
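
At the protocol level a cookie is just a pair of HTTP headers. A minimal sketch using Python's standard http.cookies module (the cookie names and values are made up for illustration):

    # A cookie is set by the server via Set-Cookie headers and returned by
    # the browser in a Cookie header on every later request to that server.
    from http.cookies import SimpleCookie

    # Server side, first visit: assign an ID and a banner-rotation counter.
    cookie = SimpleCookie()
    cookie["visitor_id"] = "abc123"   # identifies the returning browser
    cookie["banner_seen"] = "4"       # lets the ad server rotate banners
    print(cookie.output())            # -> two Set-Cookie header lines

    # Server side, next visit: parse the Cookie header the browser sent back.
    returned = SimpleCookie("visitor_id=abc123; banner_seen=4")
    print(returned["banner_seen"].value)  # -> '4'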

http://www.naviscope.com/
http://www.webwash.com/
INDEXCARD, 5/12
 
Gaius Julius Caesar

Gaius Julius Caesar (100-44 BC) was a Roman statesman who came to power through a military career and by buying votes. His army won the civil war and overran Spain, Sicily and Egypt, where he made Cleopatra queen. To gain even more power he increased the number of senators. But he also organized social measures to improve the people's food supply. In February 44 BC he did not accept the kingship offered by Mark Antony, which made him even more popular. One month later he was murdered during a senate sitting.

INDEXCARD, 6/12
 
Internet Engineering Steering Group

On behalf of the Internet Society, the Internet Engineering Steering Group is responsible for the technical management of the evolution of the architecture, the standards and the protocols of the Net.

http://www.ietf.org/iesg.html

INDEXCARD, 7/12
 
Chappe's fixed optical network

Claude Chappe built a fixed optical network between Paris and Lille. Covering a distance of about 240 km, it consisted of fifteen towers with semaphores.

Because this communication system was destined for practical military use, the transmitted messages were encoded. The messages were kept so secret that even those who transmitted them from tower to tower did not grasp their meaning; they just relayed codes they did not understand. Depending on weather conditions, messages could be sent at a speed of 2880 km/h at best.

Forerunners of Chappe's optical network are the Roman smoke signals network and Aeneas Tacticus' optical communication system.

For more information on early communication networks see Gerard J. Holzmann and Bjoern Pehrson, The Early History of Data Networks.

INDEXCARD, 8/12
 
Digital Subscriber Line (DSL)

DSL connections are high-speed data connections over copper-wire telephone lines. As with cable connections, DSL lets you look up information on the Internet and make a phone call at the same time, but you do not need to have a new or additional cable or line installed. One of the most prominent DSL services is ISDN (integrated services digital network; for more information click here: http://www.britannica.com/bcom/eb/article/4/0,5716,129614+15,00.html).

http://www.britannica.com/bcom/eb/article/4/0...
INDEXCARD, 9/12
 
Immanuel Wallerstein

Immanuel Wallerstein (born 1930) is director of the Fernand Braudel Center for the Study of Economies, Historical Systems, and Civilizations. He is one of the most famous sociologists in the Western world. With his book The Modern World-System: Capitalist Agriculture and the Origins of the European World-Economy in the Sixteenth Century (1974), which gave rise to world-system theory and its vocabulary of centers, peripheries and semi-peripheries in the capitalist world system, he influenced a whole generation of scientists; due to globalization, this theory now seems to be becoming popular again.

INDEXCARD, 10/12
 
Proxy Servers

A proxy server is a server that acts as an intermediary between a workstation user and the Internet so that security, administrative control, and caching service can be ensured.

A proxy server receives a request for an Internet service (such as a Web page request) from a user. If it passes filtering requirements, the proxy server, assuming it is also a cache server, looks in its local cache of previously downloaded Web pages. If it finds the page, it returns it to the user without needing to forward the request to the Internet. If the page is not in the cache, the proxy server, acting as a client on behalf of the user, uses one of its own IP addresses to request the page from the server out on the Internet. When the page is returned, the proxy server relates it to the original request and forwards it on to the user.

Source: Whatis.com
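
The look-up logic described above reduces to a few lines. A minimal sketch (Python), with the filtering rules and the fetch step simplified to stand-ins:

    # Core decision of a caching proxy: filter, serve from cache if possible,
    # otherwise fetch on the client's behalf and remember the result.
    from urllib.request import urlopen

    cache: dict[str, bytes] = {}
    BLOCKED = ("forbidden.example",)   # stand-in for real filtering rules

    def proxy_request(url: str) -> bytes:
        if any(host in url for host in BLOCKED):
            raise PermissionError("blocked by proxy filter")
        if url in cache:               # cache hit: no Internet round trip needed
            return cache[url]
        body = urlopen(url).read()     # cache miss: fetch as a client
        cache[url] = body              # store for the next request
        return body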

INDEXCARD, 11/12
 
Gottfried Wilhelm von Leibniz

b. July 1, 1646, Leipzig
d. November 14, 1716, Hannover, Hanover

German philosopher, mathematician, and political adviser, important both as a metaphysician and as a logician, and distinguished also for his independent invention of the differential and integral calculus. In 1661 he entered the University of Leipzig as a law student; there he came into contact with the thought of men who had revolutionized science and philosophy - men such as Galileo, Francis Bacon, Thomas Hobbes, and René Descartes. In 1666 he wrote De Arte Combinatoria ("On the Art of Combination"), in which he formulated a model that is the theoretical ancestor of some modern computers.

INDEXCARD, 12/12