The Role of the Media


"Although this is a free society, the U.S. mainstream media often serve as virtual propaganda agents of the state, peddling viewpoints the state wishes to inculcate and marginalizing any alternative perspectives. This is especially true in times of war, when the wave of patriotic frenzy encouraged by the war-makers quickly engulfs the media. Under these conditions the media's capacity for dispassionate reporting and critical analysis is suspended, and they quickly become cheer-leaders and apologists for war." ("Words as Propaganda" by Edward Herman; source: http://www.foreignpolicy-infocus.org/progresp/vol3/prog3n22.html)

The mass media could break out of this circle of being disinformed and disinforming others: by admitting that they themselves are not always correctly informed, and by acknowledging that the pictures shown do not always match the text, as some of them are older or even come from a different battle.
It would be easy for the media to discuss their own disinformation in public. Doing so would pressure the government, or in the case of NATO an international organization, to unveil secrets, and the governments' strategy of holding back information would then look doubly untenable.

TEXTBLOCK 1/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658661
 
The Kosovo-Crisis

During the Kosovo crisis, during the war that followed and probably also after it, all sides of the conflict manipulated their own people, and others as well, whenever they could. Some of the propaganda shown on TV was as primitive as in World War II; some was subtler. This propaganda began with each side telling the history of the disputed territory from its own point of view, went on to interpretations of the enemy's motives, and finally came to censorship, manipulation of the numbers of victims (for more information see: http://www.oneworld.org/index_oc/kosovo/kadare.html), the spreading of atrocity stories, and so on.
Many journalists and scholars are still working to uncover further propaganda and disinformation stories.

An interesting detail about this war is that more people than ever before got their information about it from the Internet, partly because of the biased TV reports on all sides. All parties put their ideas and perspectives on the net, so one could get an overview of the different positions and types of disinformation.
One of NATO's big lies concerned the number of destroyed military facilities in Serbia. After the war the figures had to be corrected down to a ridiculous total of about 13 destroyed tanks, while the number of civilian victims turned out to be much higher than NATO had at first admitted. European and American publics had been persuaded to support the NATO bombings by the promise that only military or military-related facilities would be targeted. Nearly every day NATO had to stretch this interpretation as civilian houses were destroyed, and a cynical term was coined for this kind of excuse: collateral damage.

The Serbs were no better than the Western governments and media, which worked closely together. Serb TV showed the bombed targets, compared figures like Bill Clinton to Adolf Hitler and called NATO fascist, while pictures of the situation in Kosov@ were left out of its reports.

More:
http://www.voa.gov/editorials/08261.htm (91)
http://www.foreignpolicy-infocus.org/progresp/vol3/prog3n22.html (92)
http://www.serbia-info.com/news (93)
http://www.nyu.edu/globalbeat/syndicate/Belgrade041399.html (94)
http://www.monde-diplomatique.fr/1999/08/SAID/12320.html (95)

TEXTBLOCK 2/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658714
 
Racism on the Internet

The Internet can be regarded as a mirror of the variety of interests, attitudes and needs of humankind. Propaganda and disinformation are inevitably part of it, whether deployed for good or for evil, but such classifications no longer function there.
In recent years the Internet has opened up a new outlet for racism, as it can be difficult to identify the person who put a certain message on the net. The anarchy of the Internet offers racists possibilities for reaching people that, for legal and other reasons, they do not have in other media.

In the 1980s racist groups used bulletin-board systems to communicate at an international level; the first to do so were supposedly the Ku Klux Klan and boards like the Aryan Nations Liberty Net. In the meantime those boards have moved to the Internet. In 1997 about 600 extreme-right websites were on the net, most of them from the USA, and the number is growing. The shocking element is not the number of racist pages, which is still tiny compared with the millions of pages one can find in this medium; it is the evidence of intentional disinformation, the language and the hatred that make them dangerous.
A whole network of anti-racist organizations, including a large number of websites, is fighting against racism. For example:

http://motlc.wiesenthal.com/text/x32/xr3257.html

http://www.aranet.org/

http://www.freespeech.org/waronracism/files/allies.htm
http://www.nsdapmuseum.com
http://www.globalissues.org/HumanRights/Racism.asp

TEXTBLOCK 3/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658620
 
The Cassini Case

In 1997 NASA's Cassini mission to Saturn and its moons led to heated controversies, because it was fueled by plutonium, a substance that could cause serious environmental and health problems if it were released into the atmosphere.

Still, no major U.S. news outlet in broadcasting or print reported in depth on the risks of the Cassini mission. Westinghouse-owned media like CBS, and NBC (partly owned by General Electric), for example, reported only that children were invited to sign a plaque placed inside Cassini. Not surprisingly, Westinghouse and General Electric are two of the largest corporations with defense contracts and nuclear interests.

TEXTBLOCK 4/40 // URL: http://world-information.org/wio/infostructure/100437611734/100438658562
 
Disinformation and Science

The tools of disinformation emerged from science and art.
Furthermore, disinformation can of course happen in politics, but also in science itself:
for example, by launching ideas that have not been rigorously proven by the moment of publication, e.g. the thought that time runs backwards in parts of the universe:
http://www.newscientist.com/ns/19991127/newsstory3.html

TEXTBLOCK 5/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658699
 
1940s - Early 1950s: First Generation Computers

Probably the most important contributor to the theoretical basis of the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he conceived the Turing machine, originally as a mathematical tool that could infallibly recognize undecidable propositions. Although he proved instead that no such universal method of determination can exist, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.

The onset of the Second World War led to an increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.

By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced a large electromechanical calculator whose purpose was to create ballistic charts for the U.S. Navy.

Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania, begun in 1943. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: 1,000 square feet) that it consumed 160 kilowatts of electrical power, enough to dim the lights in an entire section of a large town.

Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. By 1945 he had designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of the development of the CPU was the UNIVAC I (1951). Both the U.S. Census Bureau and General Electric owned UNIVACs (Universal Automatic Computer).
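The stored-program principle described above can be shown in a toy sketch. The instruction set below is invented for this illustration and models no historical machine; the point is only that one memory holds both instructions and data, and a single fetch-decode-execute loop coordinates everything.

```python
# Toy illustration of the stored-program principle: one memory list
# holds both instructions (tuples) and data (plain integers), and a
# single CPU loop fetches, decodes and executes, as in a von Neumann
# machine. The opcodes are invented for this sketch.

def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        op, *args = memory[pc]   # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":         # load a memory cell into the accumulator
            acc = memory[args[0]]
        elif op == "ADD":        # add a memory cell to the accumulator
            acc += memory[args[0]]
        elif op == "STORE":      # write the accumulator back to memory
            memory[args[0]] = acc
        elif op == "HALT":
            return memory

# Program and data side by side in one memory: add the values in
# cells 4 and 5, store the sum in cell 6.
memory = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT",), 2, 3, 0]
run(memory)
print(memory[6])  # 5
```

Because the program itself sits in ordinary memory cells, it could in principle be read or modified like any other data, which is exactly what distinguishes this design from earlier machines wired for a single task.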

Characteristic of first-generation computers was that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. Computers were therefore difficult to program and limited in versatility and speed. Another feature of early computers was their use of vacuum tubes and magnetic drums for storage.

TEXTBLOCK 6/40 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
 
Double Bind Messages

Double bind messages are extremely effective.
In Nicaragua, for example, the Sandinistas were portrayed as the personification of evil; demonization was the tool used to make the U.S. population believe this. The propaganda campaign, called "Operation Truth", succeeded, and remains successful to this day: the Sandinistas are still considered an enemy in people's minds. The media played their part in spreading this propaganda nearly without any criticism.
By the end of the 1980s the USA even paid Nicaraguans to vote for parties other than the Sandinistas.

El Salvador was a similar case: again the guerrillas were demonized. The difference was the involvement of the Catholic Church, which was fiercely opposed by El Salvador's ruling parties, themselves financially and organizationally supported by the USA. The elections in the 1980s were more or less paid for by the USA.
U.S. politicians were afraid that El Salvador could end up a second Cuba or Nicaragua, and any means of fighting this tendency seemed justified, no matter what it cost.
On 21 September 1996 the Washington Post published several documents proving an old rumor: not only had Central American soldiers been trained at a U.S. Army school (the SOA), they had also been taught to use torture against revolutionaries. Some of the Salvadoran "students" of that school became notorious for extreme cruelty, among them General Roberto d'Aubuisson (35), who ordered the killing of Archbishop Oscar Romero in 1980.

TEXTBLOCK 7/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658752
 
Pressures and Attacks against Independent Content Providers: Serbia

The independent Belgrade-based FM radio station B2-92, which since December 1996 has also broadcast over the Internet, has repeatedly been the target of suppression and attacks by the Serbian government.

B2-92's offices have been raided on numerous occasions, and members of staff have repeatedly been harassed or arrested. In March 1999 the transmitter of radio B2-92 was confiscated yet again by the Serbian authorities, and editor-in-chief Veran Matic was taken and held in custody at a police station. Ten days after the confiscation of B2-92's transmitter, Serbian police entered and sealed its offices; all members of staff were sent home and a new general manager was appointed by Serbian officials. Although by closing B2-92 the Serbian regime may have succeeded in softening the voice of this independent content provider, given the distributive nature of the Internet and the international help of media activists it has little chance of silencing the flood of independent content coming out of former Yugoslavia.

TEXTBLOCK 8/40 // URL: http://world-information.org/wio/infostructure/100437611734/100438659225
 
Some Essential Definitions

Some essential definitions in the field of cryptography are:
- cryptanalysis
- cryptology
- ciphers

"Few false ideas have more firmly gripped the minds of so many intelligent men than the one that, if they just tried, they could invent a cipher that no one could break." (David Kahn)

codes
plaintext
ciphertext
to encipher/encode
to decipher/decode

The variants of encryption systems are endless.
Deciphering is always the same game of trial and error: first guessing the encryption method, then the key. Pruning the space of candidates helps. Sooner or later, every code or cipher breaks. Monoalphabetic ciphers can be broken easily and are of course no longer used today, except in games.
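How easily a monoalphabetic cipher falls can be sketched in a few lines of Python (the key and sample text below are invented for this illustration): since every plaintext letter is always replaced by the same ciphertext letter, letter frequencies survive encryption, and in English text the most frequent letter is almost always 'e'.

```python
import collections
import string

def encipher(plaintext, key):
    """Monoalphabetic substitution: `key` is a 26-letter permutation
    of the alphabet that replaces a-z position by position."""
    table = str.maketrans(string.ascii_lowercase, key)
    return plaintext.lower().translate(table)

def most_common_letter(ciphertext):
    """Frequency analysis: the most common ciphertext letter is a
    strong guess for the image of plaintext 'e'."""
    counts = collections.Counter(c for c in ciphertext if c.isalpha())
    return counts.most_common(1)[0][0]

key = "qwertyuiopasdfghjklzxcvbnm"   # an arbitrary example key
ciphertext = encipher("we meet where the three trees meet at the eastern gate", key)
guess = most_common_letter(ciphertext)
print(guess == encipher("e", key))   # the guess recovers e's substitute
```

Polyalphabetic and modern ciphers defeat exactly this attack by making the substitution vary from letter to letter, which is why breaking them takes far more than counting.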

for further information on codes and ciphers etc. see:
http://www.optonline.com/comptons/ceo/01004A.html
http://www.ridex.co.uk/cryptology/#_Toc439908851

TEXTBLOCK 9/40 // URL: http://world-information.org/wio/infostructure/100437611776/100438659070
 
Commercial Content

Commercial media aim towards economies of scale and scope and the satisfaction of their shareholders. As most of the private media companies' revenues come from advertising, much of their content is designed to allure audiences whose size and composition are decisive for advertisers and marketers. Those revenues being of vital importance for commercial media firms, their programming is in many cases tailored to the needs of the advertising industry. Out of self-interest, commercial media also often accept pressure from marketers and advertisers. "... for example, Procter & Gamble, the world's number one corporate advertiser, explicitly prohibits programming "which could in any way further the concept of business as cold, ruthless, and lacking all sentiment or spiritual motivation." (Edward S. Herman and Robert W. McChesney)

Hence, so as not to interfere with the commercial message, most media conglomerates concentrate on easy-to-consume programming with entertainment, music and sports forming most of their content. Although they also offer news and documentaries, programs focusing on topics of public interest or minority issues hardly play more than a supporting role as they do not comply with the demands of a profit oriented system. One of the most serious effects of this development is that citizens are substantially deprived of an essential element for their participation in the public sphere: objective, exhaustive and diverse information on issues of public concern.

TEXTBLOCK 10/40 // URL: http://world-information.org/wio/infostructure/100437611795/100438659178
 
Infowar

Through the Internet a new form of governmental vulnerability is emerging. Hackers drive national and international governmental organizations crazy by altering their websites and offering disinformation. Attacks of this kind happen several times a day, and technicians say there is nothing to stop them. The Pentagon is one of the most popular victims: in 1999 the number of hacker intrusions at the Pentagon rose to some 20,000 (one series of them, called Moonlight Maze, came from a server of the Russian Academy of Sciences, a fact that does not prove much). It normally takes several hours to repair the pages, counting from the moment someone realizes that hackers have entered the zone.
The issue is treated as a new form of terrorism. Laws are very strict and punishments high, which shows the authorities' fear: what frightens them, even more than the disinformation campaigns, is that international relations could be influenced.
See more about this on:
http://www.best.com/~hansen/DrPseudocryptonym/infowar.html

TEXTBLOCK 11/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658605
 
1500 - 1700 A.D.

1588
Agostino Ramelli's reading wheel

Agostino Ramelli designed a "reading wheel", which allowed browsing through a large number of documents without moving from one spot to another.

The device presented a large number of books, a small library, laid open on lecterns on a kind of Ferris wheel. It allowed skipping chapters and browsing through pages by turning the wheel to bring lectern after lectern before the eyes. Ramelli's reading wheel thus linked ideas and texts and is reminiscent of today's browsing software used to navigate the World Wide Web.

1597
The first newspaper is printed in Europe.

TEXTBLOCK 12/40 // URL: http://world-information.org/wio/infostructure/100437611796/100438659704
 
1980s: Artificial Intelligence (AI) - From Lab to Life

Following the commercial success of expert systems, which began in the 1970s, other AI technologies also started to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to US$ 425 million. Expert systems in particular, because of their efficiency, were still in demand, yet other fields of AI also turned out to be successful in the corporate world.

Machine vision systems for example were used for the cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled U.S.$ 80 million. Although there was a breakdown in the market for AI-systems in 1986 - 1987, which led to a cut back in funding, the industry slowly recovered.

New technologies were being invented in Japan, fuzzy logic pioneered in the U.S. was being applied, and neural networks were being reconsidered as a route to artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications like voice and character recognition systems, or camcorders steadied using fuzzy logic, were made available not only to business and industry but also to the average customer.

TEXTBLOCK 13/40 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together, this is something the managers of networks have to agree about. So there is no way to ever gain ultimate control of the Matrix/Internet.
The partly decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities providing oversight and common standards for sustaining basic operations: there are authorities for IP number and domain name registration, for example.
Ever since, the organizational structures of Internet administration have changed according to the needs to be addressed. Up to now, administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies decide on common guidelines, such as communication protocols, cooperatively, so that compatibility of software is guaranteed. But they have no binding legal authority, they cannot enforce the standards they have agreed upon, and they are not wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consent about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just make the prevailing consent explicit. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulations, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.
For a detailed report on Internet governance, click here.

TEXTBLOCK 14/40 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
Governmental Influence

Agencies like the NSA are currently able to eavesdrop on anyone with only few restrictions, whatever other messages the NSA itself may spread.
In theory, cryptography can make that difficult. Hence those agencies argue for measures such as the introduction of trapdoors that make it possible to get access to everybody's data.

See the U.S. discussion about the Clipper Chip some years ago:
http://www.epic.org/crypto/clipper/
http://www.cdt.org/crypto/admin/041693whpress.txt

While encryption offers us privacy for the transmission of data, we do not only wish to have it; we need it if we want to transmit data that should be seen by no one but the recipient of our message. Given this, governments and governmental institutions fear losing control, and strict laws are the consequence. The often repeated claim that the Internet is a sphere of illegality has been proven wrong: some parts are quite clearly controlled by law, and one of them is cryptography. The prohibition of cryptography, or at least its restriction, was long considered an appropriate tool against criminality. In the meantime even governmental institutions have had to admit that those restrictions work mostly against the population rather than against illegal actors. Laws have therefore been changed in many states during the last five years; even the USA, the master of cryptography restriction, liberalized its regulations in December 1999.

for an insight into the discussion having gone on for years see:
http://www.cdt.org/crypto/new2crypto/3.shtml

the final text of the new U.S. Encryption Regulations you will find under:
http://www.cdt.org/crypto/admin/000110cryptoregs.shtml
http://www.cdt.org/crypto/admin/000114cryptoregs.txt

an explanation of the regulations can be found under:
http://www.cdt.org/crypto/admin/000112commercefactsheet.shtml

TEXTBLOCK 15/40 // URL: http://world-information.org/wio/infostructure/100437611776/100438659102
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations, with growth rates expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in the paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is no average researcher, having founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So Claffy's statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site, established at the University of California, Los Angeles, was intended to be the measurement site; there Leonard Kleinrock worked further on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, Vinton Cerf, in the name of the Internet Activities Board, proposed guidelines for researchers considering measurement experiments on the Internet, for two reasons: first, measurement would be critical for future development, evolution and deployment planning; second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends and identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide breakthrough of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet to a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing customer base. Second, we are only beginning to develop reliable tools for quality measurement and analysis of bandwidth and performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs-Research, in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too, the corporate networks and Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, while the private line networks are considerably larger in aggregate capacity than the Internet, about as large as the U.S. voice network (an effective bandwidth of about 330 Gbps as of December 1997), though they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. Future growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, both of which consume unanticipated amounts of bandwidth.
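The overtaking claim above is simple compound-growth arithmetic, which the following sketch makes explicit. It uses rough effective-bandwidth figures of 75 Gbps for data and 330 Gbps for voice, in line with the December 1997 numbers cited above; the 10%-per-year voice growth rate is an assumption made here for illustration only, so the crossover year is indicative of the mechanism, not a forecast.

```python
# Back-of-the-envelope check of the overtaking argument: data traffic
# doubling every year (~100% growth, per the article) versus voice
# traffic growing at an assumed, illustrative 10% per year.
data = 75.0     # Gbps, public Internet, end of 1997
voice = 330.0   # Gbps, U.S. voice network, end of 1997
year = 1997
while data < voice:
    data *= 2.0      # ~100% growth per year
    voice *= 1.1     # assumed 10% growth per year
    year += 1
print(year)  # first year in which data traffic exceeds voice traffic
```

With these toy numbers the crossover falls around the turn of the millennium; the article's more careful estimate of 2002 rests on better data, but the mechanism is the same: steady doubling overtakes modest growth within a few years regardless of the starting gap.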

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply, which is what the PING utility does. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But doing this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, which proved not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample, this method has at least one flaw: the ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
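The sample-and-project approach described above can be sketched as follows. The `ping` callback is a stand-in stub (a real survey sends ICMP echo requests over the network), and the population size and 60% response rate are invented toy values for the illustration.

```python
import random

# Sketch of an ISC-style host count: ping a small random sample of
# assigned addresses and scale the response count up to the whole
# population. The sampling error shrinks as the sample grows.

def estimate_hosts(addresses, ping, sample_fraction=0.01, seed=42):
    """Ping a random sample of assigned addresses and project the
    number of responding hosts onto the full address list."""
    rng = random.Random(seed)
    k = max(1, int(len(addresses) * sample_fraction))
    sample = rng.sample(addresses, k)
    responding = sum(1 for addr in sample if ping(addr))
    return round(len(addresses) * responding / k)

# Toy population: 10,000 assigned addresses, of which 6,000 answer.
addresses = list(range(10_000))
alive = set(random.Random(0).sample(addresses, 6_000))
estimate = estimate_hosts(addresses, ping=lambda addr: addr in alive)
print(estimate)  # close to the true 6,000, but only a projection
```

The sketch also makes the survey's limits visible: a host that is assigned an address but does not answer pings (a firewall, a machine that is switched off) simply vanishes from the estimate, which is one reason such counts should be read as lower bounds.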

Internet Weather

Like the daily weather, traffic on the Internet, i.e. the conditions for data flows, is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another is the Internet Traffic Report, which displays traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used: the method is to "ping" servers (as for host counts, e.g.) and to compare response times to past ones and to the response times of servers in the same reach.
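A rating of this kind can be sketched in a few lines. The 0-100 scale mirrors the Internet Traffic Report's presentation, but the scoring formula below is an invented example, not the service's actual method.

```python
# Sketch of a traffic-report-style rating on a 0-100 scale (higher =
# faster and more reliable): compare a server's current ping time to
# its own historical average. The 50x scaling is an invented choice
# so that "as fast as usual" lands in the middle of the scale.

def weather_score(current_ms, history_ms):
    """Score a server: 50 means 'as fast as usual', 100 is the cap,
    0 means effectively unreachable."""
    baseline = sum(history_ms) / len(history_ms)
    ratio = baseline / current_ms   # > 1 means faster than usual
    return max(0, min(100, round(50 * ratio)))

history = [100, 100, 100]            # past round-trip times in ms
print(weather_score(100, history))   # typical day: 50
print(weather_score(50, history))    # twice as fast as usual: 100
print(weather_score(200, history))   # congested: 25
```

Comparing each server against its own past, rather than against an absolute threshold, is what lets a worldwide report mix nearby and distant servers on one scale.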

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. Users might share computers, and thus IP addresses and host names, with others, and a user might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
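The gap between hits and page views can be made concrete with a toy server log (the file names are invented):

```python
def count_hits_and_page_views(log):
    """Count raw hits (every file request) vs page views (HTML documents only)."""
    hits = len(log)
    page_views = sum(1 for path in log if path.endswith((".html", ".htm")))
    return hits, page_views

# One visitor loading a single page that embeds twenty graphics files:
log = ["/article.html"] + ["/img/photo%d.gif" % i for i in range(20)]
hits, page_views = count_hits_and_page_views(log)
assert hits == 21        # the "twenty-one hits per visit" effect
assert page_views == 1   # one document was actually read
```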

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, as Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may visit a journal just for one particular column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
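Counting visits amounts to grouping requests into sessions. A common but arbitrary convention, assumed here, is to start a new visit when the same host has been silent for more than 30 minutes:

```python
from datetime import datetime, timedelta

def count_visits(requests, timeout=timedelta(minutes=30)):
    """Group (host, timestamp) page requests into visits: consecutive requests
    from one host belong to the same visit unless separated by > timeout."""
    last_seen = {}
    visits = 0
    for host, ts in sorted(requests, key=lambda r: r[1]):
        if host not in last_seen or ts - last_seen[host] > timeout:
            visits += 1
        last_seen[host] = ts
    return visits

t0 = datetime(2000, 1, 1, 12, 0)
requests = [
    ("10.0.0.1", t0),
    ("10.0.0.1", t0 + timedelta(minutes=5)),   # same session
    ("10.0.0.1", t0 + timedelta(hours=2)),     # new visit after a long gap
    ("10.0.0.2", t0 + timedelta(minutes=1)),   # a different host
]
assert count_visits(requests) == 3
```

Note that the flaws mentioned above carry over: two people behind one shared host are counted as one visit, and one person behind changing addresses as several.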
But there is another reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms and WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.
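How cookies let a server tell apart users behind a shared IP address can be sketched with a deliberately minimal model (not any real server's API): the server hands each new browser a random identifier and recognizes it on later requests.

```python
import uuid

class CookieCounter:
    """Minimal sketch of cookie-based visitor counting: each new browser gets
    a random ID and is recognized on return visits, regardless of which
    (possibly shared) IP address the request comes from."""

    def __init__(self):
        self.known_ids = set()

    def handle_request(self, cookie=None):
        """Return the visitor's cookie, issuing a fresh one if none is sent."""
        if cookie is None:
            cookie = str(uuid.uuid4())
        self.known_ids.add(cookie)
        return cookie

    def unique_visitors(self):
        return len(self.known_ids)

server = CookieCounter()
alice = server.handle_request()          # first visit: new cookie issued
bob = server.handle_request()            # second browser on the same shared IP
server.handle_request(cookie=alice)      # Alice returns and is recognized
assert server.unique_visitors() == 2
```

Even this only identifies browsers, not persons: two people sharing one browser still look like a single "user".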


If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 16/40 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Challenges for Copyright by ICT: Internet Service Providers

ISPs (Internet Service Providers) (and to a certain extent also telecom operators) are involved in the copyright debate primarily because of their role in the transmission and storage of digital information. Problems arise particularly concerning caching, information residing on systems or networks of ISPs at the direction of users, and transitory communication.

Caching

Caching, it is argued, could cause damage because the copies in the cache are not necessarily the most current ones, and the delivery of outdated information to users could deprive website operators of accurate "hit" information (information about the number of requests for particular material on a website), from which advertising revenue is frequently calculated. Similarly, harms such as defamation or infringement that existed on the original page may propagate for years until flushed from each cache where they have been replicated.
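Both complaints, stale copies and lost "hits", follow from the same mechanism, which a minimal model of a cache that never expires its entries makes visible:

```python
class NaiveCache:
    """A cache with no expiry: once a page is stored, the origin server is
    never asked again, so its hit counter stalls and updates never reach
    users. (A deliberately simplified model; real proxy caches use expiry
    headers and revalidation.)"""

    def __init__(self, origin):
        self.origin = origin        # maps URL -> current page content
        self.store = {}
        self.origin_hits = 0        # requests the origin server actually sees

    def get(self, url):
        if url not in self.store:
            self.origin_hits += 1
            self.store[url] = self.origin[url]
        return self.store[url]

origin = {"/news.html": "version 1"}
cache = NaiveCache(origin)
assert cache.get("/news.html") == "version 1"
origin["/news.html"] = "version 2"             # publisher updates the page
assert cache.get("/news.html") == "version 1"  # users still get the stale copy
assert cache.origin_hits == 1                  # the site counted only one "hit"
```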

Although they are different concepts, issues similar to those raised by caching arise with mirroring (establishing an identical copy of a website on a different server), archiving (providing a historical repository for information, such as with newsgroups and mailing lists), and full-text indexing (the copying of a document for loading into a full-text or nearly full-text database that is searchable for keywords or concepts).

Under a literal reading of some copyright laws, caching constitutes an infringement of copyright. Yet recent legislation like the DMCA or the proposed EU Directive on copyright and related rights in the information society (amended version) has provided exceptions for ISPs concerning particular acts of reproduction that are considered technical copies (caching). Nevertheless, the exemption from liability for ISPs only applies if they meet a variety of specific conditions. In the course of the debate about caching, suggestions have also been made to subject it to an implied license or fair use defense, or to make it (at least theoretically) actionable.

Information Residing on Systems or Networks at the Direction of Users

ISPs may be confronted with problems if infringing material on websites (of users) is hosted on their systems. Although some copyright laws like the DMCA provide for limitations on the liability of ISPs if certain conditions are met, it is still unclear whether ISPs should generally be accountable for the storage of infringing material (even if they do not have actual knowledge of it) or whether exceptions should be established under specific circumstances.

Transitory Communication

In the course of transmitting digital information from one point on a network to another, ISPs act as a data conduit. If a user requests information, ISPs engage in its transmission, the provision of a connection, or its routing. In the case of a person sending infringing material over a network, with the ISP merely providing the facilities for transmission, it is widely held that the ISP should not be liable for infringement. Yet some copyright laws like the DMCA provide for a limitation of liability (which also covers the intermediate and transient copies that are made automatically in the operation of a network) only if the ISP's activities meet certain conditions.

For more information on copyright (intellectual property) related problems of ISPs (BBSs (Bulletin Board Service Operators), systems operators and other service providers) see:

Harrington, Mark E.: On-line Copyright Infringement Liability for Internet Service Providers: Context, Cases & Recently Enacted Legislation. In: Intellectual Property and Technology Forum. June 4, 1999.

Teran, G.: Who is Vulnerable to Suit? ISP Liability for Copyright Infringement. November 2, 1999.

TEXTBLOCK 17/40 // URL: http://world-information.org/wio/infostructure/100437611725/100438659550
 
Problems of Copyright Management and Control Technologies

Profiling and Data Mining

At their most basic, copyright management and control technologies might simply be used to provide pricing information, negotiate the purchase transaction, and release a copy of a work for downloading to the customer's computer. Still, from a technological point of view, such systems also have the capacity to be employed for digital monitoring. Copyright owners could, for example, use the transaction records generated by their copyright management systems to learn more about their customers. Profiles of the purchasers of copyrighted material might be created, in their crudest form consisting of basic demographic information. Moreover, copyright owners could use search agents or complex data mining techniques to gather further information about their customers that could either be used to market other works or be sold to third parties.

Fair Use

Through the widespread use of copyright management and control systems, the balance of control could be shifted excessively in favor of the owners of intellectual property. The practice of fair use, currently supported by copyright law, might potentially be restricted or even eliminated. While information in analogue form can easily be reproduced, the protection of digital works through copyright management systems might complicate or render impossible the copying of material for purposes that are explicitly exempt under the doctrine of fair use.

Provisions concerning technological protection measures and fair use are stated in the DMCA, which provides that "Since copying of a work may be a fair use under appropriate circumstances, section 1201 does not prohibit the act of circumventing a technological measure that prevents copying. By contrast, since the fair use doctrine is not a defense to the act of gaining unauthorized access to a work, the act of circumventing a technological measure in order to gain access is prohibited." The proposed EU Directive on copyright and related rights in the information society also contains similar clauses. It distinguishes between the circumvention of technical protection systems for lawful purposes (fair use) and circumvention to infringe copyright. Yet besides a still existing lack of legal clarity, very practical problems also arise. Even if the circumvention of technological protection measures under fair use is allowed, how will an average user without specialized technological know-how be able to gain access to or make a copy of a work? Will the producers of copyright management and control systems provide fair use versions that permit the reproduction of copyrighted material? Or will users only be able to access and copy works if they hold a digital "fair use license"? ("Fair use licenses" have been proposed by Mark Stefik, whereby holders of such licenses could exercise some limited "permissions" to use a digital work without a fee.)

TEXTBLOCK 18/40 // URL: http://world-information.org/wio/infostructure/100437611725/100438659629
 
Radio

Between the two World Wars the radio became more and more important, both in education (e.g. Walter Benjamin and Bertolt Brecht) and for propaganda.
Because radio is heard unconsciously, without active listening, while one concentrates on something else, it is easy to spread ideas and emotions through it. This fact was taken advantage of.
The German Minister for Propaganda, Joseph Goebbels, considered the radio the most effective tool for propaganda. In fact the radio turned out to be a means of reaching all generations at the same time, even illiterates. By broadcasting propaganda music and interrupting programs for the latest news, mostly good news, the radio became popular.
Radio Moscow, which started broadcasting in 1922, tried to intervene in the internal affairs of Britain as well as of other countries. The radio was supposed to push forward the idea of communism.

TEXTBLOCK 19/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658560
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille, a card with holes: messages are written on paper through the holes of the card. Afterwards he removes the card and fills in the blanks, so the message looks like an ordinary letter. The recipient needs to own the same card.

- Bishop John Wilkins invents a cryptologic system that looks like music notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions so-called Pig Latin, a spoken form of encryption that was already used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one suspects that the text could be enciphered. For this the steganogram was the best method, very often used in poems. Attempts to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary system, the same principle that, in far more advanced form, underlies today's computers and character encodings such as the ASCII code

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Samuel Morse develops the Morse code, which actually is not a code but an enciphered alphabet of short and long signals. The first Morse code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes are absolutely necessary by then

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski carries out a cryptanalysis of the Vigenère cipher, which had been supposed to be unbreakable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes cryptography-tasks again and makes them even more important
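The Vigenère cipher from the 1861 entry, long thought unbreakable until Kasiski's analysis, fits in a few lines: each plaintext letter is shifted by the corresponding key letter, the key repeating as needed. (Kasiski's attack exploits exactly this repetition of the key period.)

```python
def vigenere(text, key, decrypt=False):
    """Encrypt or decrypt A-Z text with the Vigenère cipher: letter i is
    shifted by the i-th key letter (mod 26), the key repeating cyclically."""
    result = []
    for i, ch in enumerate(text):
        shift = ord(key[i % len(key)]) - ord("A")
        if decrypt:
            shift = -shift
        result.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
    return "".join(result)

# The classic textbook example:
cipher = vigenere("ATTACKATDAWN", "LEMON")
assert cipher == "LXFOPVEFRNHR"
assert vigenere(cipher, "LEMON", decrypt=True) == "ATTACKATDAWN"
```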

TEXTBLOCK 20/40 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Znet

ZNet provides forum facilities for online discussion and chatting on various topics ranging from culture and ecology to international relations and economics. ZNet also publishes daily commentaries and maintains a Web-zine, which addresses current news and events as well as many other topics, trying to be provocative, informative and inspiring to its readers.

Strategies and Policies

Daily Commentaries: ZNet's commentaries address current news and events, cultural happenings, and organizing efforts, providing context, critique, vision, and analysis, but also references to or reviews of broader ideas, new books, activism, the Internet, and other topics that strike the diverse participating authors as worthy of attention.

Forum System: ZNet provides a private (and soon also a public) forum system. The fora are concerned with topics such as activism, culture, community/race/religion/ethnicity, ecology, economics/class, gender/kinship/sexuality, government/polity, international relations, ParEcon, vision/strategy and popular culture, among others. Each forum has a set of threaded discussions, as do the fora hosted by commentary writers like Chomsky, Ehrenreich, Cagan, Peters and Wise.

ZNet Daily WebZine: ZNet Daily WebZine offers commentaries in web format.

Z Education Online (planned): The Z Education Online site will provide instructionals and courses of diverse types as well as other university-like, education-aimed features.

TEXTBLOCK 21/40 // URL: http://world-information.org/wio/infostructure/100437611734/100438659288
 
Economic structure; digital euphoria

The dream of a conflict-free capitalism appeals to a diverse audience. No politician can win elections without eulogising the benefits of the information society and promising universal wealth through informatisation. "Europe must not lose track and should be able to make the step into the new knowledge and information society in the 21st century", said Tony Blair.

The US government has declared the construction of a fast information infrastructure network the centerpiece of its economic policies.

In Lisbon the EU heads of state agreed to accelerate the informatisation of the European economies.

The German Chancellor Schröder has requested the industry to create 20,000 new informatics jobs.

The World Bank understands information as the principal tool for third world development.

Electronic classrooms and on-line learning schemes are seen as the ultimate advance in education by politicians and industry leaders alike.

But in the informatised economies, traditional exploitative practices are obscured by the glamour of new technologies. And the nearly universal acceptance of the ICT message has prepared the ground for a revival of 19th century "adapt-or-perish" ideology.

"There is nothing more relentlessly ideological than the apparently anti-ideological rhetoric of information technology"

(Arthur and Marilouise Kroker, media theorists)

TEXTBLOCK 22/40 // URL: http://world-information.org/wio/infostructure/100437611726/100438658999
 
ZaMir.net

ZaMir.net started in 1992 with the aim of enabling anti-war and human rights groups of former Yugoslavia to communicate with each other and co-ordinate their activities. Today there are an estimated 1,700 users in 5 different Bulletin Board Systems (Zagreb, Belgrade, Ljubljana, Sarajevo and Pristina). The Za-mir Transnational Network (ZTN) offers e-mail and conferences/newsgroups. The ZTN has its own conferences, which are exchanged between the 5 BBSs, and additionally offers more than 150 international conferences. ZTN's aim is to help set up systems in other cities in the post-Yugoslav countries that have difficulty connecting to the rest of the world.

History

With the war in Yugoslavia, anti-war and human rights groups of former Yugoslavia found it very difficult to organize and faced huge problems co-ordinating their activities due to immense communication difficulties. So in 1992 foreign peace groups, together with institutions in Ljubljana, Zagreb and Belgrade, launched the Communications Aid project. Modems were distributed to peace and anti-war groups in Ljubljana, Zagreb, Belgrade and Sarajevo, and a BBS (Bulletin Board System) was installed.

As after spring 1992 no direct connections could be made, they were routed indirectly through Austria, Germany or Britain, which also enabled a connection with the worldwide networks of BBSs. Nationalist dictators thereby lost their power to prevent the communication of their people. BBSs were installed in Zagreb and Belgrade and connected to the APC Network and associated networks. The Za-mir Transnational Network (ZTN) was born.

Strategies and Policies

With the help of ZaMir's e-mail network it has been possible to find and coordinate humanitarian aid for some of the many refugees of the war. It has become an important means of communication for humanitarian organizations working in the war region and for sister organizations from other countries. It helps coordinate the work of activists from different countries of former Yugoslavia, and it also helps to coordinate the search for volunteers to aid in war reconstruction. ZTN also helped facilitate the exchange of information undistorted by government propaganda between Croatia, Serbia and Bosnia. Independent magazines like Arkzin (Croatia) and Vreme (Serbia) now publish electronic editions on ZTN.

TEXTBLOCK 23/40 // URL: http://world-information.org/wio/infostructure/100437611734/100438659208
 
Abstract

What we seem to fear most is falling into a state of insecurity, given that definitions of the word security vary extremely. Thus methods of securing ideas, people, things or data gain tremendously in popularity and necessity. One of them is cryptography, as is the prohibition/restriction of cryptography.
Questions are arising as to whether cryptography is absolutely indispensable or, on the contrary, supports certain criminals more than the ordinary Internet user. And as recent developments in international and national law have shown, Northern governments are changing their opinion on this, due to economic considerations.
Business needs cryptography.
Still, the use of cryptography is no recent invention. Already the first steps in writing, or even in human communication itself, meant developing codes for keeping secrets at the same time as providing information.

This site gives a timeline of the history of cryptography, provides an introduction to the most important terms for tools and devices connected to the topic, and finally tries to weigh the necessities for and arguments against cryptography; in other words, it leads through the current discussions concerning democracy and governmental fears and doubts regarding the security of data transmission.

TEXTBLOCK 24/40 // URL: http://world-information.org/wio/infostructure/100437611776/100438658887
 
Cultural Opposition

Corporate public relations and advertising are regularly criticized not only by intellectuals, scientists and writers, but also by cultural and artistic institutions and practitioners. Themselves using advertising and public relations approaches, the latter create products that are artistic pieces and at the same time caricature the advertising and public relations industry.

TEXTBLOCK 25/40 // URL: http://world-information.org/wio/infostructure/100437611652/100438658025
 
Association for Progressive Communication (APC)

The APC is a global federation of 24 non-profit Internet providers serving over 50,000 NGOs in 133 countries. Since 1990, APC has been supporting people and organizations worldwide, working together online for social, environmental and economic justice. The APC's network of members and partners spans the globe, with significant presence in Central and Eastern Europe, Africa, Asia and Latin America.

History

Between 1982 and 1987 several independent, national, non-profit computer networks emerged as viable information and communication resources for activists and NGOs. The networks were founded to make new communication techniques available to movements working for social change.

In 1987, people at GreenNet in England began collaborating with their counterparts at the Institute for Global Communications (IGC) in the United States. These two networks started sharing electronic conference material and demonstrated that transnational electronic communications could serve international as well as domestic communities working for peace, human rights and the environment.

This innovation proved so successful that by late 1989, networks in Sweden, Canada, Brazil, Nicaragua and Australia were exchanging information with each other and with IGC and GreenNet. In the spring of 1990, these seven organizations founded the Association for Progressive Communications to co-ordinate the operation and development of this emerging global network of networks.

Strategies and Policies

The APC defends and promotes non-commercial, productive online space for NGOs and collaborates with like-minded organizations to ensure that the information and communication needs of civil society are considered in telecommunications, donor and investment policy. The APC is committed to freedom of expression and exchange of information on the Internet.

The APC helps to build capacity between existing and emerging communication service providers.

The APC Women's Networking Support Program promotes gender-aware Internet design, implementation and use.

Through its African members, the APC is trying to strengthen indigenous information sharing and independent networking capacity on the continent.

Members of APC develop Internet products, resources and tools to meet the advocacy, collaboration and information publishing and management needs of civil society. Recent APC initiatives have included the APC Toolkit Project: Online Publishing and Collaboration for Activists and the Mission-Driven Business Planning Toolkit.

The APC also runs special projects like the Beijing+5, which shall enable non-governmental organizations to actively participate in the review of the Beijing Platform for Action.

TEXTBLOCK 26/40 // URL: http://world-information.org/wio/infostructure/100437611734/100438659269
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services, computers have become communication devices instead of solely computational or typewriter-like devices. Corporate networks have become increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests draw some further distinctions by splitting up the easily understandable term "proprietary networks" into terms to be explained, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online sells its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising, because "organizing" is increasingly equated with "filtering" of seemingly objectionable messages and "rating" of content. For more information on these issues, click here. If you want to know more about the technical nature of computer networks, here is a link to the corresponding article in the Encyclopaedia Britannica.

Especially for financial transactions, secure proprietary networks are becoming increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated at above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in the trading of equities. They offer faster trading at reduced costs and better prices on trades for brokers and for institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 27/40 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
Bandwagon

Bandwagon is a form of persuasion that tells people they should do something because everyone else is doing it.

for more information see:
http://persweb.direct.ca/ikhan/elementary/wsoccult6.html

TEXTBLOCK 28/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658522
 
Timeline 1970-2000 AD

1971 IBM's work on the Lucifer cipher and the work of the NSA lead to the U.S. Data Encryption Standard (= DES)

1976 Whitfield Diffie and Martin Hellman publish their paper New Directions in Cryptography, introducing the idea of public key cryptography

1977/78 the RSA algorithm is developed by Ron Rivest, Adi Shamir and Leonard M. Adleman and is published

1984 Congress passes Comprehensive Crime Control Act

- The Hacker Quarterly is founded

1986 Computer Fraud and Abuse Act is passed in the USA

- Electronic Communications Privacy Act

1987 Chicago prosecutors found Computer Fraud and Abuse Task Force

1988 U.S. Secret Service covertly videotapes a hacker convention

1989 NuPrometheus League distributes Apple Computer software

1990 - IDEA, using a 128-bit key, is supposed to replace DES

- Charles H. Bennett and Gilles Brassard publish their work on Quantum Cryptography

- Martin Luther King Day Crash strikes AT&T long-distance network nationwide


1991 PGP (= Pretty Good Privacy) is released as freeware on the Internet, soon becoming worldwide state of the art; its creator is Phil Zimmermann

- one of the first conferences for Computers, Freedom and Privacy takes place in San Francisco

- AT&T phone crash; New York City and various airports get affected

1993 the U.S. government announces the introduction of the Clipper Chip, an idea that provokes many political discussions during the following years

1994 Ron Rivest releases another algorithm, the RC5, on the Internet

- the blowfish encryption algorithm, a 64-bit block cipher with a key-length up to 448 bits, is designed by Bruce Schneier

1990s work on quantum computer and quantum cryptography

- work on biometrics for authentication (finger prints, the iris, smells, etc.)

1996 France liberalizes its cryptography law: one can now use cryptography if registered

- OECD issues Cryptography Policy Guidelines, a paper calling for encryption export standards and unrestricted access to encryption products

1997 April European Commission issues Electronic Commerce Initiative, in favor of strong encryption

1997 June PGP 5.0 Freeware widely available for non-commercial use

1997 June 56-bit DES code cracked by a network of 14,000 computers

1997 August U.S. judge assesses encryption export regulations as violation of the First Amendment

1998 February foundation of Americans for Computer Privacy, a broad coalition in opposition to the U.S. cryptography policy

1998 March PGP announces plans to sell encryption products outside the USA

1998 April NSA issues a report about the risks of key recovery systems

1998 July DES code cracked in 56 hours by researchers in Silicon Valley

1998 October Finnish government agrees to unrestricted export of strong encryption

1999 January RSA Data Security establishes worldwide distribution of encryption products outside the USA

- National Institute of Standards and Technologies announces that 56-bit DES is not safe compared to Triple DES

- 56-bit DES code is cracked in 22 hours and 15 minutes

1999 May 27 United Kingdom speaks out against key recovery

1999 September the USA announces it will stop restricting cryptography exports

2000 as the German government plans to elaborate a cryptography law, different organizations start a campaign against that law

- computer hackers no longer only visit websites and change little details there, but cause breakdowns of entire systems, producing big economic losses
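The RSA algorithm from the 1977/78 entry can be demonstrated with textbook-sized numbers (real keys use primes hundreds of digits long; this sketch is in no way secure):

```python
def rsa_toy():
    """Textbook RSA with tiny primes: n = p*q, e*d = 1 (mod (p-1)(q-1)),
    encrypt m -> m^e mod n, decrypt c -> c^d mod n. Insecure toy sizes."""
    p, q = 61, 53
    n = p * q                 # 3233, the public modulus
    phi = (p - 1) * (q - 1)   # 3120
    e = 17                    # public exponent, coprime to phi
    d = pow(e, -1, phi)       # private exponent (modular inverse): 2753
    return n, e, d

n, e, d = rsa_toy()
message = 65
cipher = pow(message, e, n)           # encrypt with the public key (e, n)
assert cipher == 2790
assert pow(cipher, d, n) == message   # decrypt with the private key (d, n)
```

The point of the public key idea from the 1976 entry is visible here: (e, n) can be published for anyone to encrypt, while only the holder of d can decrypt. (The three-argument `pow(e, -1, phi)` for the modular inverse requires Python 3.8 or later.)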

for further information about the history of cryptography see:
http://www.clark.net/pub/cme/html/timeline.html
http://www.math.nmsu.edu/~crypto/Timeline.html
http://fly.hiwaay.net/~paul/cryptology/history.html
http://www.achiever.com/freehmpg/cryptology/hocryp.html
http://all.net/books/ip/Chap2-1.html
http://cryptome.org/ukpk-alt.htm
http://www.iwm.org.uk/online/enigma/eni-intro.htm
http://www.achiever.com/freehmpg/cryptology/cryptofr.html
http://www.cdt.org/crypto/milestones.shtml

for information about hacker's history see:
http://www.farcaster.com/sterling/chronology.htm

TEXTBLOCK 29/40 // URL: http://world-information.org/wio/infostructure/100437611776/100438658960
 
1900 - 2000 A.D.

1904
First broadcast talk

1918
Invention of the short-wave radio

1929
Invention of television in Germany and Russia

1941
Invention of microwave transmission

1946
Long-distance coaxial cable systems and mobile telephone services are introduced in the USA.

1957
Sputnik, the first satellite, is launched by the USSR
First data transmissions over regular phone circuits.

At the beginning of the story of today's global data networks is the story of the development of satellite communication.

In 1955 President Eisenhower announced the USA's intention to launch a satellite. But in the end it was the Soviet Union that launched the first satellite, Sputnik I, in 1957. After Sputnik's launch it became evident that the Cold War was also a race for leadership in the application of state-of-the-art technology to defense. By encouraging the formation of high-tech companies, the US Department of Defense laid the groundwork for Silicon Valley, the hot spot of the world's computer industry.

In the same year as the USA launched its first satellite, Explorer I, data was transmitted over regular phone circuits for the first time, laying the groundwork for today's global data networks.

Today's satellites may record weather data, scan the planet with powerful cameras, offer global positioning and monitoring services, and relay high-speed data transmissions. Yet up to now, most satellites are designed for military purposes such as reconnaissance.

1969
ARPAnet online

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network, it mainly served the purpose of testing the feasibility of wide area networks and the possibility of remote computing. It was created for resource sharing between research institutions, not for messaging services like e-mail. Although the US military sponsored its research, ARPAnet was not designed for directly military use but to support military-related research.

In 1969 ARPANET went online and linked the first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute.

Yet ARPAnet did not become widely accepted before it was demonstrated in action to a public of computer experts at the First International Conference on Computers and Communication in Washington, D. C. in 1972.

Before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, and a separate new military network had gone online in 1986. In 1988 the first private Internet service providers started offering access to NSFnet to the general public. After having become the backbone of the Internet in the USA, NSFnet was turned into a consortium of commercial backbone providers in 1995. This, together with the launch of the World Wide Web, added to the success of the global data network we call the Net.

In the USA, commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

1971
Invention of E-Mail

1979
Introduction of fiber-optic cable systems

1992
Launch of the World Wide Web

TEXTBLOCK 30/40 // URL: http://world-information.org/wio/infostructure/100437611796/100438659828
 
ECHELON and COMSAT

COMSAT - Communications Satellite Corporation

http://www.comsat.com/

Until this decade the U.S.-based COMSAT, Intelsat and Inmarsat organizations in fact shared nearly all international satellite traffic, so it was easy for the NSA to eavesdrop on all communications to and from the United States. Less than 60 miles from Sugar Grove, COMSAT runs a station in Etam, West Virginia, through which more than half of the commercial international satellite communications entering and leaving the US pass each day. COMSAT provides international communications solutions via the global 19-satellite INTELSAT system and the 4-satellite Inmarsat system.

Through the INTELSAT system, COMSAT provides telecommunications, broadcast and digital networking services between the U.S. and the rest of the world. These services are used by Internet service providers, multinational corporations, telecommunications carriers and U.S. and foreign governments to extend their networks globally.

Inmarsat satellites lie in geostationary orbit 22,223 miles (35,786 km) out in space. Each satellite covers up to one third of the Earth's surface and is strategically positioned above one of the four ocean regions.
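The geostationary altitude quoted above (35,786 km) is not arbitrary: it follows from Kepler's third law for an orbit whose period equals one sidereal day. A short Python sketch using standard physical constants:

```python
import math

# Deriving the geostationary altitude from Kepler's third law:
# r^3 = G * M * T^2 / (4 * pi^2), with T one sidereal day.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # mass of the Earth, kg
T_SIDEREAL = 86164.1   # one sidereal day, s
R_EARTH = 6_378_000    # equatorial radius of the Earth, m

r = (G * M_EARTH * T_SIDEREAL ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1000
print(round(altitude_km))   # ≈ 35786, the figure quoted in the text
```

A satellite at this altitude circles the Earth exactly once per rotation of the planet, so it appears fixed above one point on the equator - which is why one satellite can permanently cover a third of the Earth's surface.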

Calls are beamed up to the satellite and back down to Earth, where special gateway land earth stations re-route them through the appropriate local or international telephone network. COMSAT operates Earth Stations in each part of the world to route calls efficiently within each ocean region. Earth Stations are located in Santa Paula, California; Southbury, Connecticut; Ankara, Turkey; and Kuantan, Malaysia.

Sugar Grove Naval Communications Facility, near Sugar Grove, WV, may intercept Pacific INTELSAT/COMSAT satellite communications traffic routed through the COMSAT ground station at Etam, WV.

The NSG station in Winter Harbor, Maine, serves as an excellent platform from which to intercept signals to and from COMSAT's Andover station, 125 miles to the west.

On the West Coast, COMSAT's northern ground station is situated in Brewster, near Yakima, so the Pacific INTELSAT communications traffic can be intercepted from the Yakima Research Station.

The other West Coast station is in Jamesburg, California, not far from the Army Security Agency intercept station at Two Rock Ranch.

The international communications network, with its limited number of gateways, will probably always be easier to monitor than large domestic networks like those in the US. But the use of microwave links and domestic satellites is increasing, while the construction of land lines is decreasing.

Sources:

STOA Report by Duncan Campbell: Interception Capabilities 2000 http://www.gn.apc.org/duncan/stoa.htm

James Bamford, The Puzzle Palace, Boston: Houghton Mifflin, 1982, pp. 222-228

TEXTBLOCK 31/40 // URL: http://world-information.org/wio/infostructure/100437611746/100438659471
 
The North against the South?

"Faced with this process of globalization, most governments appear to lack the tools required for facing up to the pressure from important media changes. The new global order is viewed as a daunting challenge, and it most often results in reactions of introversion, withdrawal and narrow assertions of national identity. At the same time, many developing countries seize the opportunity represented by globalization to assert themselves as serious players in the global communications market."
(UNESCO, World Communication Report)

The big hope of the South is that the Internet will close the education and economic gaps by making education easier to obtain. But in reality the gap is impossible to close, because the North is not standing still; it keeps developing further all the time, inventing new technologies each of which opens yet another gap. The farmer's boy sitting in the desert, using a cellular telephone and a computer at the same time, is a sarcastic picture - nothing else.

Still, the so-called developing countries regard modern communication technologies as a tremendous chance - and indeed: what other choice is left?

TEXTBLOCK 32/40 // URL: http://world-information.org/wio/infostructure/100437611730/100438659376
 
Commercial vs. Independent Content: Power and Scope

Regarding the dimensions of their financial and human resources, commercial media companies are at any rate much more powerful players than their independent counterparts. Still, the latter respond with extreme multiplicity and diversity. Today thousands of newsgroups, mailing lists and e-zines, covering a wide range of issues from the environment to politics, social and human rights, culture, art and democracy, are run by alternative groups.

Moreover, independent content providers had started to use digital media for communication, information and co-ordination long before these media were discovered by corporate interests. They regularly use the Internet and other networks to further public discourse and put up civic resistance, and in many cases are very successful with their work, as initiatives like widerst@ndMUND's (AT) co-ordination of the critics of the Freedom Party's participation in the Austrian government - via mailing lists, an online magazine and discussion forums - show.

TEXTBLOCK 33/40 // URL: http://world-information.org/wio/infostructure/100437611734/100438659198
 
The Post-World-War II-period

After World War II the importance of propaganda increased still further, on the commercial as well as on the political level, in the era of the Cold War. The propaganda institutions of the different countries wanted their people to think the right way, which meant the national way. In the USA the McCarthy era began, a totalitarian campaign against communism. McCarthy even managed to have critical books written about him publicly burned; and every unloved artist was declared a communist, an outlaw.
The Cold War brought with it the era of spies, the perfect tool of disinformation. The topic still seems popular as a movie genre today, as the unbroken success of the James Bond movies shows.
A huge net of propaganda was built up around the threat of the nuclear bomb, pretending that the enemy was even more dangerous than the effect of such a bomb.
And later, after the fall of the Iron Curtain, disinformation found other fields of work, as the wars of the 1990s showed us.

TEXTBLOCK 34/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658581
 
Transparent customers. Direct marketing online



This process works even better on the Internet because of the latter's interactive nature. "The Internet is a dream to direct marketers", said Wil Lansing, CEO of the American retailer Fingerhut Companies. Many services require users to register online, providing as much information about themselves as possible. In addition, the Internet is fast, cheap and used by people who tend to be young and on the search for something interesting.

Many websites are also equipped with user tracking technology that registers a user's behaviour and preferences during a visit. For example, user tracking technology is capable of identifying the equipment and software employed by a user, as well as movements on the website, links visited etc. Normally such information is anonymous, but it can be personalised when it is coupled with online registration, or when personal identification has been obtained from other sources. Registration is often a prerequisite not just for obtaining a free web mail account, but also for other services, such as personalised start pages. Based on the information provided by the user, the start page will then include advertisements and commercial offers that correspond to the user's profile, or to the user's activity on the website.
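The profile-to-advertisement matching described above can be reduced to a very simple sketch: tracked behaviour becomes interest weights, and the offer with the heaviest matching category wins. All categories, weights and ad names below are invented for illustration:

```python
from collections import Counter

# Illustrative click-stream: page categories captured by tracking code
# during a user's visits (hypothetical data).
clickstream = ["travel", "travel", "finance", "sports", "travel"]

# The profile is simply a weight per interest category.
profile = Counter(clickstream)

# Hypothetical advertisements, each tagged with a marketing category.
ads = {
    "cheap-flights": "travel",
    "broker-account": "finance",
    "running-shoes": "sports",
}

# Serve ads in order of the user's interest in their category.
ranked = sorted(ads, key=lambda ad: profile[ads[ad]], reverse=True)
print(ranked[0])   # → cheap-flights
```

Real systems add many more signals (equipment, software, registration data), but the underlying logic - counting behaviour into categories and ranking offers by the counts - is essentially this.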

One frequent way of obtaining such personal information from a user is by offering free web mail accounts, as a great many companies, internet providers and web portals do (e.g. Microsoft, Yahoo, Netscape and many others). In most cases, users get "free" accounts in return for submitting personal information and agreeing to receive marketing mails. Free web mail accounts are a simple and effective direct marketing and data capturing strategy which is, however, rarely understood as such. The alliances formed between direct advertising and marketing agencies on the one hand, and web mail providers on the other, such as the one between DoubleClick and Yahoo, show the common logic of data capturing and direct marketing. The alliance between DoubleClick and Yahoo eventually attracted the largest US direct marketing agency, Abacus Direct, which ended up merging with DoubleClick.

However, the intention of collecting users' personal data and creating consumer profiles based on online behaviour can also take on more creative and playful forms. One such example is sixdegrees.com. This is a networking site based on the assumption that everybody on the planet is connected to everybody else by a chain of at most six people. The site offers users the chance to get to know a lot of new people - the friends of their friends of their friends, for example - and, if they try hard enough, eventually Warren Beatty or Claudia Schiffer. But of course, in order to make the whole game more useful for marketing purposes, users are encouraged to join groups which share common interests, and which are identical with marketing categories ranging from arts and entertainment to travel and holiday. Evidently, the game becomes more interesting the more new people a user brings into the network. What seems to be fun for the 18 to 24 year old college student segment targeted by sixdegrees is, of course, real business. While users entertain themselves they are being carefully profiled. After all, data on young people who can be expected to be relatively affluent one day are worth more than money.
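The "six degrees" premise is, in computing terms, a shortest-path question on a friendship graph. A sketch on a made-up five-person network, using breadth-first search:

```python
from collections import deque

# A tiny, invented friendship graph (adjacency lists).
friends = {
    "ann": ["bob", "carla"],
    "bob": ["ann", "dan"],
    "carla": ["ann", "dan"],
    "dan": ["bob", "carla", "eve"],
    "eve": ["dan"],
}

def degrees(graph, start, goal):
    """Length of the shortest friendship chain, or None if unconnected."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == goal:
            return dist
        for friend in graph[person]:
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None

print(degrees(friends, "ann", "eve"))   # → 3
```

On a site like the one described, every new member and every declared friendship shortens such chains, which is exactly why users are encouraged to bring more people into the network.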

The particular way in which sites such as sixdegrees.com are structured means not only that users provide initial information about themselves, but also that this information is constantly updated and therefore becomes even more valuable. Consequently, many free online services or web mail providers cancel a user's account if it has not been used for some time.

There are also other online services which offer free services in return for personal information which is then used for marketing purposes, e.g. Yahoo's Geocities, where users may maintain their own free websites, or Bigfoot, where people are offered a free e-mail address for life that acts as a relay whenever a customer's residence or e-mail address changes. In this way, of course, the marketers can identify friendship and other social networks, and turn this knowledge into a marketing advantage. People finders such as WhoWhere? operate along similar lines.

A further way of collecting consumer data that has recently become popular is by offering free PCs. Users are provided with a PC for free or for very little money, and in return commit themselves to using certain services rather than others (e.g. a particular internet provider), to providing information about themselves, and to having their online behaviour monitored by the company providing the PC, so that accurate user profiles can be compiled. For example, the Free PC Network offers advertisers user profiles containing "over 60 individual demographics". There are literally thousands of variations of how a user's data are extracted and commercialised when online. Usually this happens quietly in the background.

A good inside view of the world of direct marketing can be gained at the website of the American Direct Marketing Association and the Federation of European Direct Marketing.

TEXTBLOCK 35/40 // URL: http://world-information.org/wio/infostructure/100437611761/100438659667
 
The "Corpse-Conversion Factory"-rumor

Supposedly the most famous British atrocity story concerning the Germans during World War I was the "Corpse-Conversion Factory" rumor: it was said that the Germans produced soap out of corpses. The story became so widely believed that it was repeated for years - without any clear evidence at the time. (Taylor, Munitions of the Mind, p. 180)

TEXTBLOCK 36/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658427
 
Changes

Still, disinformation and propaganda are nothing magic. They can change things, but supposedly only if those things/meanings/opinions are not completely fixed. Galileo Galilei was not capable of changing people's minds about the Earth being the fixed centre of the universe until some began to suspect that this long-believed truth was a mistake. The fame his experiments brought him was not propaganda enough. On the other hand, later all the propaganda of his enemies could not turn back the wheel of enlightenment, as people came to find it more logical to believe in a moving Earth than in a fixed one.
It is never just a single idea that changes; society follows the change.

Thinking about disinformation brings us, of course, to the word truth, and to the suspicion that there is no definite truth - and that one truth can easily be manipulated into another. Just present some facts that seem logical, and there you have a new truth. And if those facts can supposedly be proved by empirical studies, then the standing of that truth definitely rises.
That is what ideologies do all the time. And the media like to do the same thing - as a game with power, or as a mere presentation of power?

But of course there also exist bits of disinformation which are more amusing than evil or dangerous:
- the theory of the celestro-centric world/"Hohlwelttheorie"
- the story of the German philosopher who invented an Italian philosopher, wrote books about him, and even reprinted "his" texts, which had supposedly been lost 100 years earlier - and finally lost his job and his entire career when other scholars found out that everything had been made up.

TEXTBLOCK 37/40 // URL: http://world-information.org/wio/infostructure/100437611661/100438658633
 
Identity vs. Identification

It has become a commonplace observation that the history of modernity has been accompanied by what one might call a general weakening of identity, both as a theoretical concept and as a social and cultural reality. This blurring of identity has come to full fruition in the 20th century. As a theoretical concept, identity has lost its metaphysical foundation of "full correspondence" following the destruction of metaphysics by thinkers such as Nietzsche, Heidegger, Wittgenstein or Davidson. Nietzsche's "dead god", his often-quoted metaphor for the demise of metaphysics, has left western cultures not only with the problem of having to learn how to think without permanent foundations; it has left them with both the liberty of constructing identities, and the structural obligation to do so. The dilemmas arising out of this ambivalent situation have given rise to the comment that "god is dead, and man is not doing so well himself". The new promise of freedom is accompanied by the threat of enslavement. Modern, technologically saturated cultures survive and propagate and emancipate themselves by acting as the gatekeepers of their own technological prisons.

On the social and cultural levels, traditional clear-cut identities have become weakened as traditional cultural belonging has been undermined or supplanted by modern socio-technological structures. The question as to "who one is" has become increasingly difficult to answer: hybrid identities are spreading, identities are multiple, temporary, fleeting rather than reflecting an inherited sense of belonging. The war cry of modern culture industry "be yourself" demands the impossible and offers a myriad of tools all outcompeting each other in their promise to fulfil the impossible.

For many, identity has become a matter of choice rather than of cultural or biological heritage, although being able to chose may not have been the result of a choice. A large superstructure of purchasable identification objects caters for an audience finding itself propelled into an ever accelerating and vertiginous spiral of identification and estrangement. In the supermarket of identities, what is useful and cool today is the waste of tomorrow. What is offered as the latest advance in helping you to "be yourself" is as ephemeral as your identification with it; it is trash in embryonic form.

Identity has become both problematic and trivial, causing modern subjects a sense of thrownness and uprootedness as well as granting them the opportunity of overcoming established authoritarian structures. In modern, technologically saturated societies, the general weakening of identities is a prerequisite for emancipation. The return to "strong" clear-cut "real" identities is the way of new fundamentalism demanding a rehabilitation of "traditional values" and protected zones for metaphysical thought, both of which are to be had only at the price of suppression and violence.

It has become difficult to know "who one is", but this difficulty is not merely a private problem. It is also a problem for the exercise of power, for the state and other power institutions also need to know "who you are". With the spread of weak identities, power is exercised in a different manner. Power cannot be exercised without being clear about whom it addresses; note the dual significance of "subject". A weakened, hybrid, undefined subject (in the philosophical sense) cannot be a "good" subject (in the political sense); it is not easy to sub-ject. Without identification, power cannot be exercised. And while identification is itself not a sufficient precondition for authoritarianism, it is certainly a necessary one.

Identities are therefore reconstructed using technologies of identification in order to keep the weakened and hence evasive subjects "sub-jected". States have traditionally employed bureaucratic identification techniques and sanctioned those who try to evade the grip of administration. Carrying several passports has been the privilege of spies and of dubious outlaws, and not possessing an "ID" at all is the fate of millions of refugees fleeing violence or economic destitution. Lack of identification is structurally sanctioned by placelessness.

The technisised acceleration of societies and the weakening of identities make identification a complicated matter. On the one hand, bureaucratic identification techniques can be technologically bypassed. Passports and signatures can be forged; data can be manipulated and played with. On the other hand, traditional bureaucratic methods are slow. The requirements resulting from these constraints are met by biometric technology.

TEXTBLOCK 38/40 // URL: http://world-information.org/wio/infostructure/100437611729/100438658075
 
"Attention Brokerage"

"Attention Brokerage" is one of the latest developments in the field of online advertising. The first Web-site applying the concept of selling and buying attention is Cybergold. Users, who want to earn money have to register and then look at ads, which, of course, they have to prove by e.g. downloading software. Attention, according to this idea, represents a good, which is worth being paid for.

TEXTBLOCK 39/40 // URL: http://world-information.org/wio/infostructure/100437611652/100438658064
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the cold war the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancements. Therefore in 1963 the Defense Advanced Research Projects Agency (DARPA) granted the Massachusetts Institute of Technology (MIT) U.S.$ 2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the project was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, the MIT research team showed that, when confined to a small subject matter, computer programs could solve spatial and logic problems. Other advances in the field of AI at the time were: the proposal of new theories about machine vision by David Marr, Marvin Minsky's frame theory, the PROLOG language (1972) and the development of expert systems.

TEXTBLOCK 40/40 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
NATO

The North Atlantic Treaty was signed in Washington on 4 April 1949, creating NATO (= North Atlantic Treaty Organization). It was an alliance of 12 independent nations, originally committed to each other's defense. Between 1952 and 1982 four more members were welcomed, and in 1999 the first former COMECON countries became members of NATO (the Czech Republic, Hungary and Poland), bringing the total to 19. Around its 50th anniversary NATO changed its goals and tasks by intervening in the Kosovo Crisis.

INDEXCARD, 1/51
 
Kosov@

The "word" Kosov@ is a compromise between the Serb name KosovO and the Albanian KosovA. It is mostly used by international people who want to demonstrate a certain consciousness about the conflict including some sort of neutrality, believing that neither the one side nor the other (and maybe not even NATO) is totally right. Using the word Kosov@ is seen as a symbol of peace.

For more explanations (in German) see: http://www.zivildienst.at/kosov@.htm

INDEXCARD, 2/51
 
Slobodan Milosevic

Slobodan Milosevic (* 1941) is a Serbian political leader.
As a young man S. Milosevic joined the Communist Party; in 1984 the former banker became head of the Belgrade Communist Party, and in 1987 head of the Serbian CP. Since 1989 he has been president of Serbia (and since 1997 president of the new Federal Republic of Yugoslavia). During his reign the Yugoslav Republic broke up, bringing about the independence of Slovenia and Croatia and the war in Bosnia. In 1998 the Kosovo Crisis started.

INDEXCARD, 3/51
 
Saddam Hussein

Saddam Hussein joined the revolutionary Baath party as a university student. In 1959 he took part in a failed attempt to assassinate the head of Iraq, Abdul-Karim Qassim. Since 1979 he has been President of Iraq. Under his reign Iraq fought a decade-long war with Iran. Because of his steady enmity with radical Islamic leaders, the West at first supported him - until his army invaded Kuwait in August 1990, an incident that led the USA to the Gulf War. Since then many rumors about a coup d'état have been launched, but Saddam Hussein remains in unrestricted power.

INDEXCARD, 4/51
 
Mark

A mark (trademark or service mark) is "... a sign, or a combination of signs, capable of distinguishing the goods or services of one undertaking from those of other undertakings. The sign may particularly consist of one or more distinctive words, letters, numbers, drawings or pictures, emblems, colors or combinations of colors, or may be three-dimensional..." (WIPO) To be protected a mark must be registered in a government office whereby generally the duration is limited in time, but can be periodically (usually every 10 years) renewed.

INDEXCARD, 5/51
 
Adolf Hitler

Adolf Hitler (1889-1945) was the head of the NSDAP, the National Socialist German Workers' Party. Originally from Austria, he started his political career in Germany. As Reichskanzler of Germany he provoked World War II. His hatred of all non-Aryans and of people who thought differently killed millions of human beings. Disinformation about his personality and an unbelievable propaganda machinery made an entire people close its eyes to the most cruel crimes against humankind.

INDEXCARD, 6/51
 
Industrial design

Industrial design refers to the ornamental aspect of a useful article, which may consist of two- or three-dimensional elements. To qualify for intellectual property protection the design must be novel or original. Protection can be obtained through registration in a government office and usually is granted for 10 to 15 years.

INDEXCARD, 7/51
 
Invention

According to the WIPO an invention is a "... novel idea which permits in practice the solution of a specific problem in the field of technology." Concerning its protection by law the idea "... must be new in the sense that it has not already been published or publicly used; it must be non-obvious in the sense that it would not have occurred to any specialist in the particular industrial field, had such a specialist been asked to find a solution to the particular problem; and it must be capable of industrial application in the sense that it can be industrially manufactured or used." Protection can be obtained through a patent (granted by a government office) and typically is limited to 20 years.

INDEXCARD, 8/51
 
General Electric

GE is a major American corporation and one of the largest and most diversified corporations in the world. Its products include electrical and electronic equipment, plastics, aircraft engines, medical imaging equipment, and financial services. The company was incorporated in 1892, and in 1986 GE purchased the RCA Corporation including the RCA-owned television network, the National Broadcasting Company, Inc. In 1987, however, GE sold RCA's consumer electronics division to Thomson SA, a state-owned French firm, and purchased Thomson's medical technology division. In 1989 GE agreed to combine its European business interests in appliances, medical systems, electrical distribution, and power systems with the unrelated British corporation General Electric Company. Headquarters are in Fairfield, Conn., U.S.

INDEXCARD, 9/51
 
Sandinistas

In 1979 the Sandinistas overthrew the corrupt right-wing Somoza regime, which had had the support of the U.S. government. The followers of Somoza, who was killed in 1980, formed the Contras and began a guerrilla war against the government. Many of them were trained in the School of the Americas (= SOA). The Sandinista government implemented social reforms, but these did not convince the USA - and so the war went on for many years, costing between 30,000 and 50,000 lives. When the war finally ended, the Sandinistas were defeated in (partly flawed) elections.

INDEXCARD, 10/51
 
Economic rights

The economic rights (besides moral rights and in some cases also neighboring rights) granted to the owners of copyright usually include 1) copying or reproducing a work, 2) performing a work in public, 3) making a sound recording of a work, 4) making a motion picture of a work, 5) broadcasting a work, 6) translating a work and 7) adapting a work. Under certain national laws some of these rights are not exclusive rights of authorization but in specific cases, merely rights to remuneration.

INDEXCARD, 11/51
 
Archbishop Oscar Arnulfo Romero

Archbishop Oscar Arnulfo Romero († 1980) was elected archbishop because he was very conservative. But when he saw how more and more priests and clearly innocent people were murdered, he changed his attitude and became one of the sharpest critics of the government. He gave shelter to those in danger and never stopped speaking out against violence; his Sunday sermons on the radio were moments of truth for the Salvadorians, in which he also mentioned the names of the disappeared or killed. As Romero became extremely popular - and dangerous in the eyes of those in power - he was killed by death squads while reading a sermon.

INDEXCARD, 12/51
 
Vacuum tube

The first half of the 20th century was the era of the vacuum tube in electronics. This variety of electron tube permitted the development of radio broadcasting, long-distance telephony, television, and the first electronic digital computers. These early electronic computers were, in fact, the largest vacuum-tube systems ever built. Perhaps the best-known representative is the ENIAC (Electronic Numerical Integrator and Computer, completed in 1946).

INDEXCARD, 13/51
 
cryptology

also called "the study of code". It includes both, cryptography and cryptoanalysis

INDEXCARD, 14/51
 
UNIVAC

Built by Remington Rand in 1951 the UNIVAC I (Universal Automatic Computer) was one of the first commercially available computers to take advantage of the development of the central processing unit (CPU). Both the U.S. Census bureau and General Electric owned UNIVACs. Speed: 1,905 operations per second; input/output: magnetic tape, unityper, printer; memory size: 1,000 12-digit words in delay line; technology: serial vacuum tubes, delay lines, magnetic tape; floor space: 943 cubic feet; cost: F.O.B. factory U.S.$ 750,000 plus U.S.$ 185,000 for a high speed printer.

INDEXCARD, 15/51
 
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet but a subset of it. It is constituted by documents that are linked together in such a way that you can switch from one document to another simply by clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used for creating World Wide Web documents. These so-called hypertexts can combine text documents, graphics, videos, sounds and Java applets, thus making multimedia content possible.

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index the stored information. (Regularly updated lists of the 100 most popular search terms can be found at searchwords.com.) No search engine can retrieve all the information on the whole World Wide Web; every search engine covers just a small part of it.

This, among other things, is why the World Wide Web is not simply a very large database, as is sometimes said: it lacks consistency. It is true that the Internet offers virtually infinite storage capacity, a capacity that might also prove almost everlasting, a prospect that is consoling and threatening at once.

According to the Internet domain survey of the Internet Software Consortium, the number of Internet host computers is growing rapidly. In October 1969 the first two computers were connected; this number grew to 376,000 by January 1991 and to 72,398,092 by January 2000.

World Wide Web History Project, http://www.webhistory.org/home.html

http://www.searchwords.com/
http://www.islandnet.com/deathnet/
http://www.salonmagazine.com/21st/feature/199...
INDEXCARD, 16/51
 
Binary number system

In mathematics, the term binary number system refers to a positional numeral system employing 2 as the base and requiring only two different symbols, 0 and 1. The importance of the binary system to information theory and computer technology derives mainly from the compact and reliable manner in which data can be represented in electromechanical devices with two states--such as "on-off," "open-closed," or "go-no go."
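The positional principle can be made concrete in a few lines of Python: repeated division by 2 yields the binary digits. A minimal sketch:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its binary representation
    by repeated division by 2; the remainders are the digits."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))   # remainder is the next (low-order) digit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(13))  # → 1101  (8 + 4 + 1)
```

Each 0 or 1 maps directly onto one of the two device states mentioned above, e.g. "off" and "on".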

INDEXCARD, 17/51
 
Neural network

A bottom-up artificial intelligence approach, a neural network is a network of many very simple processors ("units" or "neurons"), each possibly having a (small amount of) local memory. The units are connected by unidirectional communication channels ("connections"), which carry numeric data. The units operate only on their local data and on the inputs they receive via the connections. A neural network is a processing device, either an algorithm, or actual hardware, whose design was inspired by the design and functioning of animal brains and components thereof. Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples and exhibit some structural capability for generalization.
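The training rule described above can be illustrated with the simplest possible case: a single neuron (a perceptron) learning the logical AND function. The learning rate and epoch count below are arbitrary illustrative choices:

```python
# A single artificial neuron trained on the logical AND function.
# Weights are adjusted whenever the output disagrees with the target.
inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 0, 0, 1]
w = [0.0, 0.0]
bias = 0.0
rate = 0.1

def predict(x):
    s = w[0] * x[0] + w[1] * x[1] + bias
    return 1 if s > 0 else 0

for _ in range(20):                   # training epochs
    for x, t in zip(inputs, targets):
        error = t - predict(x)        # 0 if correct, +1/-1 if wrong
        w[0] += rate * error * x[0]
        w[1] += rate * error * x[1]
        bias += rate * error

print([predict(x) for x in inputs])   # → [0, 0, 0, 1]
```

After a handful of epochs the connection weights settle and the neuron reproduces the AND table — "learning from examples" in its most reduced form.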

INDEXCARD, 18/51
 
Artificial Intelligence

Artificial Intelligence is concerned with the simulation of human thinking and emotions in information technology. AI develops "intelligent systems" capable, for example, of learning and logical deduction. AI systems are used for creatively handling large amounts of data (as in data mining), as well as in natural speech processing and image recognition. AI is also used to support decision-making in highly complex environments.
Yahoo AI sites: http://dir.yahoo.com/Science/Computer_Science/Artificial_Intelligence/
MIT AI lab: http://www.ai.mit.edu/


http://dir.yahoo.com/Science/Computer_Science...
http://www.ai.mit.edu/
INDEXCARD, 19/51
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs as they combine facts with rules that state relations between the facts to achieve a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but still have the advantage of being much lower in costs compared with paying an expert or a team of specialists.
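The interplay of knowledge base and inference engine can be sketched as a forward-chaining loop: rules fire whenever their conditions are satisfied, until no new facts can be derived. The medical rules below are invented for illustration only:

```python
# Knowledge base: each rule maps a set of required facts to a conclusion.
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu", "short_of_breath"}, "see_doctor"),
]

def infer(facts):
    """Minimal forward-chaining inference engine."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)   # rule fires, new fact derived
                changed = True
    return facts

result = infer({"has_fever", "has_cough", "short_of_breath"})
print("see_doctor" in result)  # → True
```

The limitation mentioned above is visible here: a fact not anticipated by any rule simply never triggers anything.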

INDEXCARD, 20/51
 
Machine vision

A branch of artificial intelligence and image processing concerned with the identification of graphic patterns or images that involves both cognition and abstraction. In such a system, a device linked to a computer scans, senses, and transforms images into digital patterns, which in turn are compared with patterns stored in the computer's memory. The computer processes the incoming patterns in rapid succession, isolating relevant features, filtering out unwanted signals, and adding to its memory new patterns that deviate beyond a specified threshold from the old and are thus perceived as new entities.
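The threshold-based comparison described above can be reduced to a short sketch; the patterns and threshold are invented examples:

```python
# An incoming digital pattern is compared against stored patterns; if it
# deviates beyond a threshold from all of them, it is kept as a new entity.
def distance(a, b):
    """Number of positions in which two equal-length patterns differ."""
    return sum(x != y for x, y in zip(a, b))

memory = [[1, 1, 0, 0], [0, 0, 1, 1]]   # previously stored patterns
THRESHOLD = 1

def classify(pattern):
    best = min(memory, key=lambda m: distance(pattern, m))
    if distance(pattern, best) > THRESHOLD:
        memory.append(pattern)           # perceived as a new entity
        return "new"
    return "match"

print(classify([1, 1, 0, 1]))  # distance 1 from a stored pattern → "match"
print(classify([1, 0, 1, 0]))  # distance 2 from everything → "new"
```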

INDEXCARD, 21/51
 
Java Applets

Java applets are small programs that can be sent along with a Web page to a user. Java applets can perform interactive animations, immediate calculations, or other simple tasks without having to send a user request back to the server. They are written in Java, a platform-independent computer language, which was invented by Sun Microsystems, Inc.

Source: Whatis.com

INDEXCARD, 22/51
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. The result is expected to be a vast collection of historic video, audio, documents, and software. The project's digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 23/51
 
Assembly line

An assembly line is an industrial arrangement of machines, equipment, and workers for continuous flow of workpieces in mass production operations. An assembly line is designed by determining the sequences of operations for manufacture of each product component as well as the final product. Each movement of material is made as simple and short as possible with no cross flow or backtracking. Work assignments, numbers of machines, and production rates are programmed so that all operations performed along the line are compatible.

INDEXCARD, 24/51
 
Fuzzy logic

A superset of Boolean logic (after George Boole) introduced by Lotfi Zadeh in the 1960s as a means to model the uncertainty of natural language. Fuzzy logic is a type of logic that recognizes more than simple true and false values. It represents a departure from classical two-valued sets and logic, using "soft" linguistic system variables (e.g. large, small, hot, cold, warm) and a continuous range of truth values in the interval [0,1], rather than strict binary (true or false) decisions and assignments.
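A membership function makes this concrete: instead of a binary "warm or not", a temperature belongs to "warm" to some degree in [0,1]. The breakpoints below are arbitrary illustrative values:

```python
def warm(temp):
    """Degree to which a temperature counts as 'warm' (linear ramp)."""
    if temp <= 15:
        return 0.0
    if temp >= 25:
        return 1.0
    return (temp - 15) / 10   # partial membership between 15 and 25 degrees

# Fuzzy connectives generalize Boolean AND/OR as min/max over degrees.
def fuzzy_and(a, b): return min(a, b)
def fuzzy_or(a, b): return max(a, b)

print(warm(18))                       # → 0.3
print(fuzzy_and(warm(18), warm(22)))  # → 0.3
```

At the endpoints the system behaves exactly like two-valued logic; fuzziness lives in the ramp between them.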

INDEXCARD, 25/51
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.
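The proportionality can be made concrete with a rough transfer-time estimate; the file size and line speeds below are illustrative, and protocol overhead and latency are ignored:

```python
def transfer_seconds(size_bytes, bandwidth_bps):
    """Idealized transfer time: file size in bits over bandwidth in bits/s."""
    return size_bytes * 8 / bandwidth_bps

# A 200 KB photograph over a 56 kbps modem vs. a 10 Mbps line:
print(round(transfer_seconds(200_000, 56_000), 1))      # → 28.6
print(round(transfer_seconds(200_000, 10_000_000), 2))  # → 0.16
```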

In digital systems, bandwidth is data speed in bits per second (bps).

Source: Whatis.com

INDEXCARD, 26/51
 
Intellectual property

Intellectual property, very generally, relates to the output that results from intellectual activity in the industrial, scientific, literary and artistic fields. Traditionally intellectual property is divided into two branches: 1) industrial property (inventions, marks, industrial designs, unfair competition and geographical indications), and 2) copyright. The protection of intellectual property is guaranteed through a variety of laws, which grant the creators of intellectual goods and services certain time-limited rights to control the use made of their products.

INDEXCARD, 27/51
 
Braille

Universally accepted system of writing used by and for blind persons and consisting of a code of 63 characters, each made up of one to six raised dots arranged in a six-position matrix or cell. These Braille characters are embossed in lines on paper and read by passing the fingers lightly over the manuscript. Louis Braille, who was blinded at the age of three, invented the system in 1824 while a student at the Institution Nationale des Jeunes Aveugles (National Institute for Blind Children), Paris.

INDEXCARD, 28/51
 
blowfish encryption algorithm

Blowfish is a symmetric-key block cipher whose key length can vary (up to 448 bits).
The idea behind it is a simple design that makes the cipher faster than others.
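Blowfish itself, with its key-dependent S-boxes, is beyond a short sketch, but the symmetric-key principle it rests on can be shown with a toy XOR cipher: the same secret key both encrypts and decrypts. (This toy is for illustration only and offers no real security.)

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key; applying it twice restores
    the original, which is the defining property of a symmetric cipher."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret"
ciphertext = xor_cipher(b"attack at dawn", key)
assert xor_cipher(ciphertext, key) == b"attack at dawn"  # same key reverses it
```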

http://www.counterpane.com/blowfish.html
http://www.counterpane.com/bfsverlag.html

http://www.counterpane.com/blowfish.html
INDEXCARD, 29/51
 
Microsoft Network

Microsoft Network is the online service of Microsoft Corporation. Although it offers direct access to the Internet, it mainly provides proprietary content for entertainment purposes. Best viewed with Microsoft's Internet Explorer.

http://www.msn.com

INDEXCARD, 30/51
 
William Frederick Friedman

Friedman is considered the father of American cryptanalysis; he was also the first to use this term.

INDEXCARD, 31/51
 
IDEA

IDEA is another symmetric-key system. It is a block cipher operating on 64-bit plaintext blocks with a key length of 128 bits.

INDEXCARD, 32/51
 
NSFNet

Developed under the auspices of the National Science Foundation (NSF), NSFnet served as the successor of the ARPAnet as the main network linking universities and research facilities until 1995, when it was replaced by a commercial backbone network. Being research networks, ARPAnet and NSFnet served as testing grounds for future networks.

INDEXCARD, 33/51
 
National Science Foundation (NSF)

Established in 1950, the National Science Foundation is an independent agency of the U.S. government dedicated to funding basic research and education in a wide range of sciences and in mathematics and engineering. Today, the NSF supplies about one quarter of total federal support of basic scientific research at academic institutions.

http://www.nsf.gov

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/0/0,5716,2450+1+2440,00.html

http://www.nsf.gov/
INDEXCARD, 34/51
 
Satellites

Communications satellites are relay stations for radio signals and provide reliable and distance-independent high-speed connections even at remote locations without high-bandwidth infrastructure.

In point-to-point transmission, the transmission method originally employed, satellites face increasing competition from fiber-optic cables, so point-to-multipoint transmission is increasingly becoming the dominant satellite technology. Point-to-multipoint transmission enables the quick implementation of private networks consisting of very small aperture terminals (VSAT). Such networks are independent and make mobile access possible.

In the future, satellites will become stronger, cheaper and their orbits will be lower; their services might become as common as satellite TV is today.

For more information about satellites, see How Satellites Work (http://octopus.gma.org/surfing/satellites) and the Tech Museum's satellite site (http://www.thetech.org/hyper/satellite).

http://www.whatis.com/vsat.htm
http://octopus.gma.org/surfing/satellites
INDEXCARD, 35/51
 
water-clocks

Water-clocks formed an early long-distance communication system. Each communicating party had exactly the same jar with an identically sized hole, initially closed, and the same amount of water in it. In the jar stood a stick with different messages written on it. When one party wanted to tell the other something, it made a fire signal. When the other answered, both opened the hole at the same time and, on another fire signal, closed it again at the same time. By then the water level had sunk to the point on the stick bearing the intended message.

INDEXCARD, 36/51
 
COMECON

The Council for Mutual Economic Aid (COMECON) was set up in 1949, consisting of six East European countries: Bulgaria, Czechoslovakia, Hungary, Poland, Romania, and the USSR, joined later by the German Democratic Republic (1950), Mongolia (1962), Cuba (1972), and Vietnam (1978). Its aim was to develop the member countries' economies on a complementary basis for the purpose of achieving self-sufficiency. In 1991, COMECON was replaced by the Organization for International Economic Cooperation.

INDEXCARD, 37/51
 
Virtual Private Networks

Virtual Private Networks provide secured connections to a corporate site over a public network such as the Internet. Data transmitted through these secure connections are encrypted and therefore have to be decrypted before they can be read.
These networks are called virtual because connections are provided only when you connect to a corporate site; they do not rely on dedicated lines and support mobile use.

INDEXCARD, 38/51
 
Enigma

Device used by the German military command to encode strategic messages before and during World War II. The Enigma code was broken by a British intelligence system known as Ultra.

INDEXCARD, 39/51
 
Philip M. Taylor

Munitions of the Mind. A history of propaganda from the ancient world to the present era. Manchester 1995 (2nd ed.)
This book gives a quite detailed insight into the tools and tasks of propaganda in European and/or Western history. Starting with ancient times, the author proceeds up to the Gulf War and the meaning of propaganda today. In all those different eras propaganda transported similar messages, even when the technical possibilities were not nearly as widespread as today. Taylor's book leads the reader through those different periods, trying to show the typical elements of each one.

INDEXCARD, 40/51
 
AT&T

AT&T Corporation provides voice, data and video communications services to large and small businesses, consumers and government entities. AT&T and its subsidiaries furnish domestic and international long distance, regional, local and wireless communications services, cable television and Internet communications services. AT&T also provides billing, directory and calling card services to support its communications business. AT&T's primary lines of business are business services, consumer services, broadband services and wireless services. In addition, AT&T's other lines of business include network management and professional services through AT&T Solutions and international operations and ventures. In June 2000, AT&T completed the acquisition of MediaOne Group. With the addition of MediaOne's 5 million cable subscribers, AT&T became the country's largest cable operator, with about 16 million customers on the systems it owns and operates, which pass nearly 28 million American homes. (source: Yahoo)

Slogan: "It's all within your reach"

Business indicators:

Sales 1999: $ 62.391 bn (+ 17.2 % from 1998)

Market capitalization: $ 104 bn

Employees: 107,800

Corporate website: http://www.att.com
INDEXCARD, 41/51
 
Royalties

Royalties refer to the payment made to the owners of certain types of rights by those who are permitted by the owners to exercise the rights. The rights concerned are literary, musical, and artistic copyright and patent rights in inventions and designs (as well as rights in mineral deposits, including oil and natural gas). The term originated from the fact that in Great Britain for centuries gold and silver mines were the property of the crown and such "royal" metals could be mined only if a payment ("royalty") were made to the crown.

INDEXCARD, 42/51
 
Richard Barbrook and Andy Cameron, The Californian Ideology

According to Barbrook and Cameron there is an emerging global orthodoxy concerning the relation between society, technology and politics. In this paper they are calling this orthodoxy the Californian Ideology in honor of the state where it originated. By naturalizing and giving a technological proof to a political philosophy, and therefore foreclosing on alternative futures, the Californian ideologues are able to assert that social and political debates about the future have now become meaningless and - horror of horrors - unfashionable. - This paper argues for an interactive future.

http://www.wmin.ac.uk/media/HRC/ci/calif.html

INDEXCARD, 43/51
 
User tracking

User tracking is a generic term that covers all the techniques of monitoring the movements of a user on a web site. User tracking has become an essential component in online commerce, where no personal contact with customers is established, leaving companies with the predicament of not knowing who they are talking to. Some companies, such as Red Eye, Cyber Dialogue, and SAS, offer complete technology packages for user tracking and data analysis to online businesses. Technologies include software solutions such as e-mine, e-discovery, or WebHound.

Whenever user tracking is performed without the explicit agreement of the user, or without laying open which data are collected and what is done with them, it raises considerable privacy concerns.

http://www.redeye.co.uk/
http://www.cyberdialogue.com/
http://www.sas.com/
http://www.spss.com/emine/
http://www.sas.com/solutions/e-discovery/inde...
http://www.sas.com/products/webhound/index.ht...
http://www.linuxcare.com.au/mbp/meantime/
INDEXCARD, 44/51
 
1996 WIPO Copyright Treaty (WCT)

The 1996 WIPO Copyright Treaty focused on steps to protect copyright "in the digital age". Among other provisions, it 1) makes clear that computer programs are protected as literary works, 2) requires the contracting parties to protect databases that constitute intellectual creations, 3) affords authors the new right of making their works "available to the public", 4) gives authors the exclusive right to authorize "any communication to the public of their works, by wire or wireless means ... in such a way that members of the public may access these works from a place and at a time individually chosen by them", and 5) requires the contracting states to protect anti-copying technology and copyright management information that is embedded in any work covered by the treaty. The WCT is available at: http://www.wipo.int/documents/en/diplconf/distrib/94dc.htm



http://www.wipo.int/documents/en/diplconf/dis...
INDEXCARD, 45/51
 
WIPO

The World Intellectual Property Organization is one of the specialized agencies of the United Nations (UN), designed to promote the worldwide protection of both industrial property (inventions, trademarks, and designs) and copyrighted materials (literary, musical, photographic, and other artistic works). It was established by a convention signed in Stockholm in 1967 and came into force in 1970. The aims of WIPO are threefold. First, through international cooperation, WIPO promotes the protection of intellectual property. Second, the organization supervises administrative cooperation between the Paris, Berne, and other intellectual unions regarding agreements on trademarks, patents, and the protection of artistic and literary work. Third, through its registration activities, WIPO provides direct services to applicants for, or owners of, industrial property rights.

INDEXCARD, 46/51
 
Censorship of Online Content in China

During the Tian-an-men massacre, reports and photos transmitted by fax machines gave notice of what was happening with only a short delay. The Chinese government has learned its lesson well and "regulated" Internet access from the beginning. All Internet traffic into and out of China passes through a few gateways, thus making censorship a relatively easy task. Screened out are the web sites of organizations and media which express dissident viewpoints: Taiwan's Democratic Progress Party and Independence Party, The New York Times, CNN, and sites dealing with Tibetan independence and human rights issues.

Users are expected not to "harm" China's national interests and therefore have to apply for permission for Internet access; Web pages have to be approved before being published on the Net. For the development of measures to monitor and control Chinese content providers, China's state police has joined forces with MIT.

For further information on Internet censorship, see Human Rights Watch, World Report 1999.

http://www.dpp.org/
http://www.nytimes.com/
http://www.hrw.org/worldreport99/special/inte...
INDEXCARD, 47/51
 
Whitfield Diffie

Whitfield Diffie is an engineer at Sun Microsystems and co-author, with Susan Landau, of Privacy on the Line (MIT Press, 1998). In 1976 Diffie and Martin Hellman developed public-key cryptography, a system for sending information without leaving it open to be read by everyone.
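The Diffie-Hellman idea can be sketched in a few lines: each party combines its private value with the other's public value and both arrive at the same shared secret, which an eavesdropper seeing only the public values cannot easily compute. The prime below is for illustration; real systems use primes of 2048 bits or more:

```python
import secrets

p = 2**127 - 1   # a Mersenne prime; far too small for real-world use
g = 3            # public generator

a = secrets.randbelow(p - 2) + 2   # Alice's private value
b = secrets.randbelow(p - 2) + 2   # Bob's private value

A = pow(g, a, p)   # Alice sends A over the open channel
B = pow(g, b, p)   # Bob sends B over the open channel

shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob = pow(A, b, p)     # (g^a)^b mod p
assert shared_alice == shared_bob   # both hold the same secret key
```

Only A, B, p and g ever travel over the network; the private values a and b never leave their owners.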

INDEXCARD, 48/51
 
Electronic Messaging (E-Mail)

Electronic messages are transmitted and received by computers through a network. By e-mail, texts, images, sounds and videos can be sent to a single user or simultaneously to a group of users. Texts can now be sent and read without ever being printed.

E-Mail is one of the most popular and important services on the Internet.

INDEXCARD, 49/51
 
General Schwarzkopf

General H. Norman Schwarzkopf (* 1934) followed in his father's footsteps at the United States Military Academy at West Point.
In 1965 he applied to join the troops in Vietnam. For the next 20 years Schwarzkopf worked on his career. As Commander in Chief of the U.S. Central Command, he led U.S. and allied forces in the Gulf War (Operations Desert Shield and Desert Storm). He retired from the Army in 1992 and wrote his autobiography.

For a picture see: http://www.jesterbook.com/sections/5a_mom/gallery/schwarzkopf.htm

http://www.jesterbook.com/sections/5a_mom/gal...
INDEXCARD, 50/51
 
McCarthy

Born in Grand Chute, Wisconsin, Joe McCarthy graduated from Marquette in 1935. In 1939, he won election as a circuit court judge. During World War II, he enlisted in the Marines and served in the Pacific. In 1944, he campaigned for senator but lost in the Republican primary. In 1946, he ran for Wisconsin's other senate seat.

In a 1950 speech, McCarthy entered the public spotlight by claiming that communists had "infested" the State Department, dramatically waving a sheet of paper which purportedly contained the traitors' names. A special Senate committee investigated the charges and found them groundless. Unfazed, McCarthy used his position to wage a relentless anti-communist crusade, denouncing numerous public figures and holding a series of highly confrontational hearings, ruining the careers of many people.

He died at the age of 49 of complications related to alcoholism.

http://us.history.wisc.edu/hist102/bios/31.html

http://us.history.wisc.edu/hist102/bios/31.ht...
INDEXCARD, 51/51