Legal Protection: European Union

Within the EU's goal of establishing a European single market, intellectual property rights are also of significance. The European Commission therefore aims at the harmonization of the respective national laws of the EU member states and at a generally more effective protection of intellectual property on an international level. Over the years it has adopted a variety of conventions and directives concerned with different aspects of the protection of industrial property as well as copyright and neighboring rights.

An overview of EU activities relating to intellectual property protection is available on the website of the European Commission (DG Internal Market): http://www.europa.eu.int/comm/internal_market/en/intprop/intprop/index.htm

TEXTBLOCK 1/21 // URL: http://world-information.org/wio/infostructure/100437611725/100438659574
 
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed from one generation to another. As in the Southern tradition, intellectual property rights are rooted in a concept of 'collective' or 'communal' intellectual property existing in perpetuity and not limited to the life of an individual creator plus some number of years after his or her death. Often rights are exercised by only one individual in each generation, frequently through matrilineal descent.


TEXTBLOCK 2/21 // URL: http://world-information.org/wio/infostructure/100437611725/100438659557
 
Databody convergence

In the phrase "the rise of the citizen as a consumer", to be found on the EDS website, the cardinal political problem posed by the databody industry is summarised: the convergence of commercial and political interests in the data body business, the convergence of bureaucratic and commercial data bodies, the erosion of privacy, and the consequent undermining of democratic politics by private business interests.

When the citizen becomes a consumer, the state must become a business. In the data body business, the key word behind this new identity of government is "outsourcing". Functions that are not considered core functions of government activity are put into the hands of private contractors.

There have long been instances where privately owned data companies, e.g. credit card companies, are allowed access to public records such as public registries or electoral rolls. In a normal credit card transaction, for example, credit card companies have had access to public records in order to verify the identity of a customer. In the UK, citizens' personal data stored on the Electoral Roll have been used for commercial purposes for a long time. The new British Data Protection Act now allows people to "opt out" of this kind of commercialisation - legislation that has prompted protests on the part of the data industry: Experian has claimed to lose GBP 500 million as a consequence of this restriction - a figure that, even if exaggerated, may help to understand what the value of personal data actually is.

While this may serve as an example of an increased public awareness of privacy issues, the "outsourcing" of government functions seems to lead to a complete breakdown of the barriers between commercial and public use of personal data.

Governments increasingly outsource work that is not considered a core function of government, e.g. cooking meals in hospitals or mowing lawns in public parks. Such peripheral activities marked a first step of outsourcing. In a further step, governmental functions were divided into judgemental and executive functions, and executive functions were increasingly entrusted to private agencies. For these agencies to be able to carry out the work assigned to them, they need data - data that were once stored in public places, and whose handling was therefore subject to democratic accountability. Outsourcing has produced gains in efficiency, and a decrease in accountability: outsourced data are less secure, and what use they are put to is difficult to control.

The world's largest data corporation, EDS, is also among the foremost outsourcing companies. In an article about EDS' involvement in government outsourcing in Britain, Simon Davies shows how the general trend towards outsourcing, combined with advances in computer technology, allows companies like EDS, outside of any public accountability, to create something like blueprints for the societies of the 21st century. But the problem of accountability is not the only one to be considered in this context. As Davies argues, the data business is taking on a momentum of its own: "a ruthless company could easily hold a government to ransom". As the links between government agencies and citizens thin out, however, the links among the various agencies might increase. Linking the various government information systems would amount to a further increase in efficiency, and a further undermining of democracy. The latter, after all, relies upon the separation of powers - matching government information systems would therefore pave the way to a kind of electronic totalitarianism that has little to do with the ideological bent of George Orwell's 1984 vision, but operates on purely technocratic principles.

Technically, the linking of different systems is already possible. It would also create more efficiency, which means generate more income. The question, then, is whether concerns for democracy will prevent it from happening.

But what the EDS example shows is something that applies everywhere: the data industry is, whether by intention or by default, a project with profound political implications. The current that drives the global economy deeper and deeper into becoming a global data body economy may be too strong to be stopped by conventional means.

However, the convergence of political and economic data bodies also has technological roots. The problem is that politically motivated surveillance and economically motivated data collection are located in the same area of information and communication technologies. For example, monitoring internet use requires more or less the same technical equipment whether done for political or economic purposes. Data mining and data warehousing techniques are almost the same. Creating transparency of citizens and customers is therefore a common objective of intelligence services and the data body industry. Given that data are exchanged in electronic networks, compatibility among the various systems is essential. This is another factor that encourages "leaks" between state-run intelligence networks and the private data body business. And finally, given the secretive nature of state intelligence and commercial data capturing, there is little transparency. Both structures occupy an opaque zone.

TEXTBLOCK 3/21 // URL: http://world-information.org/wio/infostructure/100437611761/100438659769
 
Timeline Cryptography - Introduction

Besides oral conversation and written language, many other ways of transporting information are known: the bush telegraph, drums, smoke signals, etc. Those methods are not cryptography, yet they still need encoding and decoding, which means that the history of language, the history of communication, and the history of cryptography are closely connected to each other.
The timeline gives an insight into the endless fight between enciphering and deciphering. The reasons for both can be found in public and private issues at the same time, though they are mostly connected to military maneuvers and/or political tasks.

One of the most important researchers on cryptography through the centuries is David Kahn; many parts of the following timeline originate from his work.

TEXTBLOCK 4/21 // URL: http://world-information.org/wio/infostructure/100437611776/100438658824
 
Biometrics applications: physical access

This is the largest area of application of biometric technologies, and the most direct lineage to the feudal gate keeping system. Initially used mainly in military and other "high security" territories, physical access control by biometric technology is spreading into a much wider field of application. Biometric access control technologies are already being used in schools, supermarkets, hospitals and commercial centres, where they are used to manage the flow of personnel.

Biometric technologies are also used to control access to political territory, as in immigration (airports, Mexico-USA border crossing). In this case, they can be coupled with camera surveillance systems and artificial intelligence in order to identify potential suspects at unmanned border crossings. Examples of such uses in remote video inspection systems can be found at http://www.eds-ms.com/acsd/RVIS.htm

A gate keeping system for airports relying on digital fingerprint and hand geometry is described at http://www.eds-ms.com/acsd/INSPASS.htm. This is another technology which allows separating "low risk" travellers from "other" travellers.

An electronic reconstruction of feudal gate keeping capable of singling out high-risk travellers from the rest is already applied at various border crossing points in the USA. "All enrolees are compared against national lookout databases on a daily basis to ensure that individuals remain low risk". As a side benefit, the economy of time generated by the inspection system has meant that "drug seizures ... have increased since Inspectors are able to spend more time evaluating higher risk vehicles".

However, biometric access control cannot only prevent people from gaining access to a territory or building; it can also prevent them from getting out of buildings, as in the case of prisons.

TEXTBLOCK 5/21 // URL: http://world-information.org/wio/infostructure/100437611729/100438658838
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs that are nevertheless jumped upon by many media and decision makers, because technical difficulties make reliable measurement of Internet growth or usage close to impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because network measurement has been an important task ever since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stressing the importance of measurement for two reasons: first, measurement would be critical for future development, evolution and deployment planning; second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing numbers of customers. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity for each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network; the private line networks are considerably larger in aggregate capacity than the Internet - about as large as the voice network in the U.S., with an effective bandwidth of about 330 Gbps in December 1997 - but they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications; both consume more bandwidth and are the reason for unanticipated amounts of data traffic.
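The arithmetic behind that projection can be sketched in a few lines of Python. Only the growth rates follow the text; the starting traffic volumes below are illustrative assumptions, not figures from the paper:

```python
# Back-of-envelope projection: data traffic growing ~100% per year versus
# modest growth of the voice network; starting volumes are made up.
data_traffic, voice_traffic = 20.0, 300.0  # arbitrary units, end of 1997
year = 1997
while data_traffic < voice_traffic:
    data_traffic *= 2.0     # ~100% growth per year
    voice_traffic *= 1.1    # assumed single-digit growth for voice
    year += 1
print(year)  # with these assumptions: 2002
```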

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not to be very useful because a significant number of hosts restrict download access to their domain data.
Quite apart from the small sample, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
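The sampling logic described above can be sketched as follows. This is a simplified illustration: the address list is a reserved test range, and a TCP connection attempt stands in for a real ICMP ping:

```python
# A simplified sketch of ISC-style host estimation: probe a random sample
# of assigned addresses for reachability and project to the full list.
import random
import socket

def reachable(host: str, port: int = 80, timeout: float = 1.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

assigned = [f"192.0.2.{i}" for i in range(1, 255)]  # illustrative addresses
sample = random.sample(assigned, max(1, len(assigned) // 100))  # 1% sample
alive = sum(reachable(h) for h in sample)
estimate = alive / len(sample) * len(assigned)  # projection to all addresses
print(f"{alive}/{len(sample)} sample hosts reachable -> ~{estimate:.0f} hosts")
```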

Internet Weather

Like daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence called Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e.g.) and to compare response times with past ones and with response times of servers in the same area.
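A minimal sketch of how such a 0-100 score could be derived from response times follows; the scoring formula is an assumption for illustration, not the method actually used by these services:

```python
# Illustrative "Internet weather" scoring: compare a server's current
# round-trip time with its past average and map the ratio to 0-100.
def weather_score(current_rtt_ms: float, past_avg_rtt_ms: float) -> int:
    ratio = past_avg_rtt_ms / max(current_rtt_ms, 1e-6)
    return max(0, min(100, round(100 * min(ratio, 1.0))))

print(weather_score(80.0, 60.0))   # slower than usual -> 75
print(weather_score(50.0, 60.0))   # faster than usual -> 100
```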

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of most visited Web sites may be compiled. I say "may be", because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". A method not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphical files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable. Users might share computers and corresponding IP addresses and host names with others, or a user might access not the site itself, but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
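The difference is easy to see in a toy log analysis (the log lines are invented for illustration; real server logs are more complex):

```python
# Counting "hits" (every requested file) versus "page views" (HTML pages
# only) from a simplified access log: one page with graphics inflates hits.
log = [
    "10.0.0.1 GET /article.html",
    "10.0.0.1 GET /img/logo.gif",
    "10.0.0.1 GET /img/photo1.gif",
    "10.0.0.1 GET /img/photo2.gif",
]
hits = len(log)
page_views = sum(1 for line in log if line.endswith(".html"))
print(hits, page_views)  # 4 hits, but only 1 page view
```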

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may read a journal just for a special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but just slightly better at best, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, e.g., establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system: in a certain sense, it is in fact computers that communicate with each other. Therefore many content providers are eager to get to know more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that information given by users is reliable; you can just rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.

For Fun

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at the University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.

TEXTBLOCK 6/21 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Timeline BC

~ 1900 BC: Egyptian scribes use non-standard hieroglyphs in inscriptions of a royal tomb; this is supposedly not the first, but the first documented example of written cryptography

~ 1500 BC: an enciphered formula for the production of pottery glaze is written down in Mesopotamia

parts of the Hebrew writing of Jeremiah's words are written down in "atbash", which is nothing else than a reversed alphabet and one of the first famous methods of enciphering

4th century BC: Aeneas Tacticus invents a form of beacon signalling by introducing a sort of water-clock

~ 487 BC: the Spartans introduce the so-called "skytale" for sending short secret messages to and from the battlefield

~ 170 BC: Polybius develops a system to convert letters into numerical characters, an invention called the Polybius Chequerboard.

50-60 BC: Julius Caesar develops an enciphering method, later called the Caesar cipher, shifting each letter of the alphabet by a fixed amount. Like atbash, this is a monoalphabetic substitution (both are illustrated in the sketch below).
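As an illustration of how simple these early monoalphabetic substitutions are, here is a minimal sketch of atbash and the Caesar cipher in Python; the Latin alphabet handling and the shift of 3 are illustrative assumptions, as historical details varied:

```python
import string

ALPHABET = string.ascii_uppercase

def atbash(text: str) -> str:
    # Reversed alphabet: A<->Z, B<->Y, ... as in the Hebrew atbash.
    table = str.maketrans(ALPHABET, ALPHABET[::-1])
    return text.upper().translate(table)

def caesar(text: str, shift: int = 3) -> str:
    # Shift each letter by a fixed, pre-agreed amount.
    shifted = ALPHABET[shift:] + ALPHABET[:shift]
    table = str.maketrans(ALPHABET, shifted)
    return text.upper().translate(table)

print(atbash("ATTACK"))  # ZGGZXP
print(caesar("ATTACK"))  # DWWDFN
```

Deciphering is just the inverse mapping, which is why such ciphers fall to simple frequency analysis.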

TEXTBLOCK 7/21 // URL: http://world-information.org/wio/infostructure/100437611776/100438659084
 
Online data capturing

Hardly a firm today can afford not to engage in electronic commerce if it does not want to be swept out of business by competitors. "Information is everything" has become something like the Lord's prayer of the New Economy. But how do you get information about your customer online? Who are the people who visit a website, where do they come from, what are they looking for? How much money do they have, what might they want to buy? These are key questions for a company doing electronic business. Obviously not all of this information can be obtained by monitoring the online behaviour of web users, but there are always little gimmicks that, when combined with common tracking technologies, can help to get more detailed information about a potential customer. These are usually online registration forms, either required for entry to a site, or competitions, sometimes a combination of the two. Obviously, if you want to win that weekend trip to New York, you want to provide your contact details.

The most common way of obtaining information about a user online is a cookie. However, a cookie by itself is not sufficient to identify a user personally. It merely allows a server to recognize a returning browser by an identification number stored on the user's computer. Only combined with other data extraction techniques, such as online registration, can a user be identified personally ("Register now to get the full benefit of xy.com. It's free!")
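How a cookie lets a server recognize a returning browser can be sketched with Python's standard library; the server and the cookie name visitor_id are purely illustrative assumptions:

```python
# A minimal sketch of cookie-based visitor recognition.
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookie = SimpleCookie(self.headers.get("Cookie", ""))
        if "visitor_id" in cookie:
            visitor_id = cookie["visitor_id"].value   # returning browser
            status = "returning visitor"
        else:
            visitor_id = uuid.uuid4().hex             # first visit: new ID
            status = "new visitor"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        # The browser stores this and sends it back with every request.
        self.send_header("Set-Cookie", f"visitor_id={visitor_id}")
        self.end_headers()
        self.wfile.write(f"{status}: {visitor_id}\n".encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), TrackingHandler).serve_forever()
```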

But cookies record enough information to fine-tune advertising strategies according to a user's preferences and interests, e.g. by displaying certain commercial banners rather than others. For example, if a user is found to respond to a banner of a particular kind, he / she may find two of them at the next visit. Customizing the offers on a website to the particular user is part of one-to-one marketing, a type of direct marketing. But one-to-one marketing can go further than this. It can also offer different prices to different users. This was done by Amazon.com in September 2000, when first-time visitors were offered cheaper prices than regular customers.

One-to-one marketing can create very different realities that undermine traditional concepts of demand and supply. The ideal is a "frictionless market", where the differential between demand and supply is progressively eliminated. If a market is considered a structure within which demand / supply differentials are negotiated, this amounts to the abolition of the established notion of the nature of a market. Demand and supply converge, desire and its fulfilment coincide. In the end, there is profit without labour. However, such a structure is a hermetic structure of unfreedom.

It can only function when payment is substituted by credit, and the exploitation of work power by the exploitation of data. In fact, in modern economies there is great pressure to increase spending on credit. Using credit cards and taking up loans generates a lot of data around a person's economic behaviour, while at the same time restricting the scope of social activity and increasing dependence. On the global level, the consequences of credit spirals can be observed in many of the developing countries that have had to abandon most of their political autonomy. As the data body economy advances, this is also the fate of people in western societies when they are structurally driven into credit spending. It shows that data bodies are not politically neutral.

The interrelation between data, profit and unfreedom is frequently overlooked by citizens and customers. Any company in a modern economy will apply data collecting strategies for profit, with dependence and unfreedom as a "secondary effect". The hunger for data has made IT companies, eager to profit from e-business, rather resourceful. "Getting to know the customer" - this is a catchphrase that is heard frequently, and which suggests that there are no limits to what a company may want to know about a customer. In large online shops, such as amazon.com, where the customer's identity is accurately established by the practice of paying with credit cards, all business happens online, making it easy for the company to accurately profile the customers.

But there are more advanced and effective ways of identification. The German company Sevenval has developed a way of customer tracking which works with "virtual domains". Every visitor to a website is assigned a 33-digit identification number which the browser understands as part of the www address, which will then read something like http://XCF49BEB7E97C00A328BF562BAAC75FB2.sevenval.com. Therefore, this tracking method, which is advertised by Sevenval as a revolutionary method capable of tracking the exact and complete path of a user on a website, cannot simply be switched off. In addition, the method makes it possible for the identity of a user to travel with him / her when visiting one of the other companies linked to the site in question. As in the case of cookies, this tracking method by itself is not sufficient to identify a user personally. Such an identification only occurs once a customer pays with a credit card, or decides to participate in a draw, or voluntarily completes a registration form.
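The principle can be sketched in a few lines: because the identifier is part of the hostname, every single request carries it, whether or not cookies are enabled. The following is an illustrative reconstruction, not Sevenval's actual scheme; the ID format and domain are assumptions:

```python
import uuid

def tracking_url(base_domain: str, path: str) -> str:
    # Embed a per-visitor ID in the hostname itself; the server can then
    # log the complete click path from the Host header of each request.
    visitor_id = uuid.uuid4().hex.upper()  # 32 hex digits, illustrative
    return f"http://{visitor_id}.{base_domain}{path}"

print(tracking_url("sevenval.com", "/index.html"))
# e.g. http://3F2B...C9.sevenval.com/index.html
```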

But there are much less friendly ways of extracting data from a user and feeding the data body. Less friendly means: these methods monitor users in situations where the latter are likely not to want to be monitored. Monitoring therefore takes place in a concealed manner. One of these monitoring methods is the so-called web bug. These are tiny graphics, no more than 1 x 1 pixel in size, and therefore invisible on a screen, capable of monitoring an unsuspecting user's e-mails or movements on a website. Leading corporations such as Barnes and Noble, eToys, Cooking.com, and Microsoft have all used web bugs in advertising campaigns. Richard Smith has compiled a web bugs FAQ site that contains detailed information and examples of web bugs in use.
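A simple heuristic for spotting web bugs follows directly from this description: look for 1 x 1 pixel images, typically loaded from a third-party server. A minimal sketch using only the standard library (the HTML snippet is invented):

```python
# Scan HTML for likely web bugs: 1 x 1 pixel images are the classic
# signature; this is a heuristic, not a definitive detector.
from html.parser import HTMLParser

class WebBugFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        if a.get("width") == "1" and a.get("height") == "1":
            self.suspects.append(a.get("src"))

finder = WebBugFinder()
finder.feed('<p>Hi<img src="http://tracker.example/bug.gif" '
            'width="1" height="1"></p>')
print(finder.suspects)  # ['http://tracker.example/bug.gif']
```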

Bugs monitoring users have also been packaged in seemingly harmless toys made available on the Internet. For example, Comet Systems offers cursor images which have been shown to collect user data and send them back to the company's server. These little images replace the customary white arrow of a mouse with a little image of a baseball, a cat, a UFO, etc., large enough to carry a bug collecting user information. The technology is offered as a marketing tool to companies looking for a "fun, new way to interact with their audience".

The cursor image technology relies on what is called a GUID (globally unique identifier). This is an identification number which is assigned to a customer at the time of registration, or when downloading a product. Many among the online community were alarmed when in 1999 it was discovered that Microsoft assigned GUIDs without its customers' knowledge. Following protests, the company was forced to change the registration procedure, assuring that under no circumstances would these identification numbers be used for tracking or marketing.
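What such an identifier looks like can be shown in two lines; Python's uuid module generates the same kind of 128-bit value (the exact format Microsoft used is not specified here):

```python
import uuid

# A GUID/UUID is a 128-bit number, unique for all practical purposes;
# once tied to a person, it can link all of his / her digital traces.
print(uuid.uuid4())  # e.g. 1b4e28ba-2fa1-11d2-883f-0016d3cca427
```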

However, in the meantime, another possible infringement on user anonymity by Microsoft was discovered, when it was found out that MS Office documents, such as Word, Excel or Powerpoint files, contain a bug that is capable of tracking the documents as they are sent through the net. The bug sends information about the user who opens the document back to the originating server. A document that contains the bug can be tracked across the globe, through thousands of stopovers. A detailed description of the bug and how it works can be found at the Privacy Foundation's website. Also, there is an example of such a bug at the Privacy Center of the University of Denver.

Of course there are many other ways of collecting users' data and creating and appropriating data bodies which can then be used for economic purposes. Indeed, as Bill Gates commented, "information is the lifeblood of business". The electronic information networks are becoming the new frontier of capitalism.

TEXTBLOCK 8/21 // URL: http://world-information.org/wio/infostructure/100437611761/100438659686
 
Challenges for Copyright by ICT: Digital Content Providers

Providers of digital information might be confronted with copyright-related problems when using some of the special features of hypertext media like frames and hyperlinks (which both use third-party content available on the Internet to enhance a webpage or CD-ROM), or when operating a search engine or online directory on their website.

Framing

Frames are often used to help define, and navigate within, a content provider's website. Still, when they are used to present (copyrighted) third-party material from other sites, issues of passing off and misleading or deceptive conduct, as well as copyright infringement, immediately arise.

Hyperlinking

It is generally held that the mere creation of a hyperlink does not, of itself, infringe copyright, as usually the words indicating a link or the displayed URL are unlikely to be considered a "work". Nevertheless, if a link is clicked on, the user's browser will download a full copy of the material at the linked address, creating a copy in the RAM of his or her computer courtesy of the address supplied by the party that published the link. Although it is widely agreed that the permission to download material over the link must be part of an implied license granted by the person who has made the material available on the web in the first place, the scope of this implied license is still the subject of debate. Another option that has been discussed is to consider linking fair use.

Furthermore, hyperlinks and other "information location tools", like online directories or search engines, could cause their operators trouble if they refer or link users to a site that contains infringing material. In this case it is as yet unclear whether providers can be held liable for infringement.

TEXTBLOCK 9/21 // URL: http://world-information.org/wio/infostructure/100437611725/100438659590
 
Virtual cartels, oligopolistic structures

Global networks require global technical standards ensuring the compatibility of systems. Being able to define such standards makes a corporation extremely powerful. And it requires the suspension of competitive practices. Competition is relegated to the symbolic realm. Diversity and pluralism become the victims of the globalisation of baroque sameness.

The ICT market is dominated by incomplete competition aimed at short-term market domination. In a very short time, new ideas can turn into best-selling technologies. Innovation cycles are extremely short. But today's state-of-the-art products are embryonic trash.

    According to the Computer and Communications Industry Association, Microsoft is trying to aggressively take over the network market. This would mean that AT&T would control 70 % of all long distance phone calls and 60 % of cable connections.



    AOL and Yahoo are lone leaders in the provider market. AOL has 21 million subscribers in 100 countries. In a single month, AOL registers 94 million visits. Two thirds of all US internet users visited Yahoo in December 1999.



    The world's 13 biggest internet providers are all American.



    AOL and Microsoft have concluded a strategic cross-promotion deal. In the US, the AOL icon is installed on every Windows desktop. AOL has also concluded a strategic alliance with Coca Cola.


TEXTBLOCK 10/21 // URL: http://world-information.org/wio/infostructure/100437611709/100438658963
 
Individualized Audience Targeting

New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements" - probably the ultimate tool for advertisers. Although not yet widely used, companies like Amazon.Com have already started to exploit individualized audience targeting for their purposes.

TEXTBLOCK 11/21 // URL: http://world-information.org/wio/infostructure/100437611652/100438658450
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe, i.e. unbreakable. To decipher a text means to go through many - sometimes nearly, but never really, endless - attempts. For the computers of today it might take hundreds of years or even longer to go through all possible codes, but in the end the code remains breakable. The much faster quantum computers will prove that one day.
Therefore the decision to select a certain method of enciphering is finally a matter of trust.
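A little arithmetic makes these time scales concrete; the assumed rate of 10**12 keys per second below is an illustrative figure, not a claim about any real machine:

```python
# Exhaustive key search time, assuming a hypothetical attacker who can
# test 10**12 keys per second; real attack speeds vary enormously.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits: int, keys_per_second: float = 1e12) -> float:
    return 2 ** key_bits / keys_per_second / SECONDS_PER_YEAR

for bits in (40, 56, 128):
    print(f"{bits}-bit key: about {years_to_search(bits):.1e} years")
# 40-bit keys fall in about a second, 56-bit keys within a day,
# while 128-bit keys would take on the order of 10**19 years.
```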

For the average user of computers it is rather difficult to understand or even realize the dangers and/or the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others, the specialists, and relying on the information they provide.
The websites explaining the problems behind it (and also the articles and books concerning the topic) are, of course, written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background appears as an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures appears as something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. Those media reports often specialize in talking about the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing control over the money in one's bank account if someone steals the credit card number or other important financial data. The term "security", surely connected to those issues, is a completely different one from the security that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

for the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 12/21 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Product Placement

With television still being very popular, commercial entertainment has transferred the concept of soap operas onto the Web. The first of this new species of "cybersoaps" was "The Spot", a story about the ups and downs of an American commune. The Spot not only attracted a large audience within a short time, but also pioneered the field of online product placement. Besides Sony banners, the company's logo is also placed on nearly every electronic product appearing in the story. Though appearing as a site for light entertainment, The Spot's main goal is to make the name Sony and its product range well known within the target audience.

TEXTBLOCK 13/21 // URL: http://world-information.org/wio/infostructure/100437611652/100438658026
 
What is the Internet?

Each definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, universities, etc. Generally speaking, every time a computer network connects to it, a new Internet is created. Technically speaking, the Internet is a wide area network (WAN) that may be connected to local area networks (LANs).

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different to the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not solely constituted by computers connected to other computers, because there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components that are connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than to say what it is. On 24 October 1995 the U.S. Federal Networking Council made the following resolution concerning the definition of the term "Internet": "Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally and in a simplifying manner called the Internet may be better referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: of the USENET, corporate networks and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means for making information easily accessible worldwide, for sending and receiving messages, videos, texts and audio files, for transferring funds and trading securities, for sharing resources, for collecting weather condition data, for trailing the movements of elephants, for playing games online, for video conferencing, for distance learning, for virtual exhibitions, for jamming with other musicians, for long-distance ordering, for auctions, for tracking packaged goods, for doing business, for chatting, and for the remote access of computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving, and storing data and for sharing equipment over long distances, possibly in real time, if the telecommunication infrastructure is reliable and of high quality.

For a comprehensive view of the uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace".

TEXTBLOCK 14/21 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
Introduction: The Substitution of Human Faculties with Technology: Computers and Robots

With the development of modern computing, starting in the 1940s, the substitution of human abilities with technology obtained a new dimension. The focus shifted from the replacement of pure physical power to the substitution of mental faculties. From the early 1980s on, personal computers attained widespread use in offices and quickly became indispensable tools for office workers. The development of powerful computers combined with progress in artificial intelligence research also led to the construction of sophisticated robots, which enabled a further rationalization of manufacturing processes.

TEXTBLOCK 15/21 // URL: http://world-information.org/wio/infostructure/100437611663/100438659302
 
Economic structure; digital euphoria

The dream of a conflict-free capitalism appeals to a diverse audience. No politician can win elections without eulogising the benefits of the information society and promising universal wealth through informatisation. "Europe must not lose track and should be able to make the step into the new knowledge and information society in the 21st century", said Tony Blair.

The US government has declared the construction of a fast information infrastructure network the centerpiece of its economic policies.

In Lisbon the EU heads of state agreed to accelerate the informatisation of the European economies.

The German Chancellor Schröder has requested the industry to create 20,000 new informatics jobs.

The World Bank understands information as the principal tool for third world development.

Electronic classrooms and on-line learning schemes are seen as the ultimate advance in education by politicians and industry leaders alike.

But in the informatised economies, traditional exploitative practices are obscured by the glamour of new technologies. And the nearly universal acceptance of the ICT message has prepared the ground for a revival of 19th century "adapt-or-perish" ideology.

"There is nothing more relentlessly ideological than the apparently anti-ideological rhetoric of information technology"

(Arthur and Marilouise Kroker, media theorists)

TEXTBLOCK 16/21 // URL: http://world-information.org/wio/infostructure/100437611726/100438658999
 
Virtual body and data body



The result of this informatisation is the creation of a virtual body which is the exterior of a man or woman's social existence. It plays the same role as the physical body, except that it is located in virtual space (it has no real location). The virtual body holds a certain emancipatory potential. It allows us to go to places and to do things which would be impossible in the physical world. It does not have the weight of the physical body, and it is less conditioned by physical laws. It therefore allows one to create an identity of one's own, with far fewer restrictions than would apply in the physical world.

But this new freedom has a price. In the shadow of virtualisation, the data body has emerged. The data body is a virtual body which is composed of the files connected to an individual. As the Critical Art Ensemble observe in their book Flesh Machine, the data body is the "fascist sibling" of the virtual body; it is "a much more highly developed virtual form, and one that exists in complete service to the corporate and police state."

The virtual character of the data body means that social regulation that applies to the real body is absent. While there are limits to the manipulation and exploitation of the real body (even if these limits are not respected everywhere), there is little regulation concerning the manipulation and exploitation of the data body, although the manipulation of the data body is much easier to perform than that of the real body. The seizure of the data body from outside often goes undetected by the individual concerned, as it has become part of the basic structure of an informatised society. But data bodies serve as raw material for the "New Economy". Both business and governments claim access to data bodies. Power can be exercised, and democratic decision-taking procedures bypassed, by seizing data bodies. This totalitarian potential makes the data body a deeply problematic phenomenon that calls for an understanding of data as a social construction rather than as something representative of an objective reality. How data bodies are generated, what happens to them, and who has control over them is therefore a highly relevant political question.

TEXTBLOCK 17/21 // URL: http://world-information.org/wio/infostructure/100437611761/100438659695
 
World War II ...

Never before had propaganda been as important as in the Second World War. From then on, education was one more field of propaganda: its purpose was to teach how to think, while pure propaganda was supposed to show what to think.
Every nation founded at least one ministry of propaganda - without calling it that, of course. The British called it the Ministry of Information (= MOI), the U.S. distinguished between the Office of Strategic Services (= OSS) and the Office of War Information (= OWI), the Germans created a Ministry of Public Enlightenment and Propaganda (= RMVP), and the Japanese called their disinformation and propaganda campaign the "Thought War".
British censorship was so strict that the text of an ordinary propaganda leaflet, which had been dropped from planes several million times, was not given to a journalist who asked for it.

Atrocity stories were no longer used the same way as in the First World War. Instead, black propaganda was preferred, especially to separate the Germans from their leaders.
German war propaganda had started long before the war. In the middle of the 1930s Leni Riefenstahl filmed Hitler's best propaganda movies. For the most famous one, "Triumph of the Will" (1935), she was the only professional filmmaker who was allowed to take close-up pictures of her admirer.

Some of the pictures of fear, hatred and intolerance still exist in people's heads. Considering this, that propaganda did a good job; unfortunately, it was the anti-National Socialist propaganda that failed at the time.

TEXTBLOCK 18/21 // URL: http://world-information.org/wio/infostructure/100437611661/100438658610
 
Who owns the Internet and who is in charge?

The Internet/Matrix still depends heavily on public infrastructure and there is no dedicated owner of the whole Internet/Matrix, but the networks it consists of are run and owned by corporations and institutions. Access to the Internet is usually provided by Internet Service Providers (ISPs) for a monthly fee. Each network is owned by someone and has a network operation center from where it is centrally controlled, but the Internet/Matrix is not owned by any single authority and has no network operation center of its own. No legal authority determines how and where networks can be connected together; this is something the managers of networks have to agree on. So there is no way to ever gain ultimate control of the Matrix/Internet.
The in some respects decentralized architecture and administration of the Matrix/Internet do not imply that there are no authorities for oversight and no common standards for sustaining basic operations: there are authorities for IP number and domain name registrations, for example.
Ever since, the organizational structures for Internet administration have changed according to the needs to be addressed. Up to now, the administration of the Internet has been a collaborative undertaking of several loose cooperative bodies with no strict hierarchy of authority. These bodies make decisions on common guidelines, such as communication protocols, cooperatively, so that the compatibility of software is guaranteed. But they have no binding legal authority, nor can they enforce the standards they have agreed upon, nor are they wholly representative of the community of Internet users. The Internet has no official governing body or organization; most parts are still administered by volunteers.
Amazingly, there seems to be an unspoken and uncodified consent about what is allowed and what is forbidden on the Internet that is widely accepted. Codifications, such as the so-called Netiquette, are due to individual efforts and mostly just expressly state the prevailing consent. Violations of accepted standards are fiercely rejected, as reactions to misbehavior in mailing lists and newsgroups prove daily.
Sometimes violations not already subject to law become part of governmental regulations, as was the case with spamming, the unsolicited sending of advertising mail messages. But engineers proved to be quicker and developed software against spamming. So, in some respects, the Internet is indeed self-regulating.
For a detailed report on Internet governance, click here.

TEXTBLOCK 19/21 // URL: http://world-information.org/wio/infostructure/100437611791/100438658447
 
The Privatization of Censorship

According to a still widely held conviction, the global data networks constitute the long-desired arena for uncensorable expression. This much is true: because of the Net it has become increasingly difficult to sustain cultural and legal standards. Geographical proximity and territorial boundaries prove to be less relevant when a document's availability is unaffected by whether it is stored on your desktop or on a host some thousand kilometers away. There is no international agreement on non-prohibited contents, so human rights organizations and nazi groups alike can bypass restrictions. No single authority or organization can impose its rules and standards on all others. This is why the Net is public space, a political arena where free expression is possible.

This freedom is conditioned by the design of the Net. But the Net's design is not a given, as Lawrence Lessig reminds us. Originally the design of the Net allowed a relatively high degree of privacy and communication was not controlled directly. But now this design is changing and this invisible agora in electronic space is endangered. Governments - even elected ones - and corporations introduce new technologies that allow us to be identified, monitored and tracked, that identify and block content, and that can allow our behaviour to be efficiently controlled.

When the World Wide Web was introduced, small independent media and human rights organizations soon began to use this platform to draw worldwide attention to their publications and causes. It seemed to be the dawning of a new era, with authoritarian regimes and multinational media corporations on the losing side. But now the Net's design is changing according to their needs.

"In every context that it can, the entertaining industry is trying to force the Internet into its own business model: the perfect control of content. From music (fighting MP3) and film (fighting the portability of DVD) to television, the industry is resisting the Net's original design. It was about the free flow of content; Hollywood wants perfect control instead" (Lawrence Lessig, Cyberspace Prosecutor, in: The Industry Standard, February 2000).

In the United States, Hollywood and AT&T - which, after its merger with MediaOne, became the biggest US cable service provider - are returning to their positions of the Seventies: the control of content and infrastructure. If most people access the Net via set-top boxes connected to a TV set, it will become a kind of television, at least in the USA.

For small independent media it will become very hard to be heard, especially for those offering streaming video and music. Increasingly faster data transmissions apply just to download capacities; upload capacities are much lower - on average about eight times lower - than download capacities. As an AT&T executive said in response to criticism: "We haven't built a 56 billion dollar cable network to have the blood sucked from our veins" (Lawrence Lessig, The Law in the Code: How the Net is Regulated, Lecture at the Institute for Human Sciences, Vienna, May 29th, 2000).

Consumers, not producers, are preferred.

For corporations, what remains to be done to control the Net is mainly to cope with the fact that because of the Net it has become increasingly difficult to sustain cultural and legal standards. On Nov 11, 1995 the German prosecuting attorney's office searched Compuserve Germany, the branch of an international Internet service provider, because the company was suspected of having offered access to child pornography. Consequently, Compuserve blocked access to more than 200 newsgroups, all containing "sex" or "gay" in their names, for all its customers. But a few days later, instructions for accessing these blocked newsgroups via Compuserve came into circulation. On February 26, 1997, Felix Somm, the Chief Executive Officer of Compuserve Germany, was accused of complicity in the distribution of child and animal pornography in newsgroups. In May 1998 he received a two-year prison sentence, suspended against a payment of about 51,000 euros. The sentence was justified by pointing to the fact that Compuserve Germany offered access to its US parent company's servers hosting child pornography. Felix Somm was held responsible for access to forbidden content he could not have known about. (For further information (in German) click here.)

Also in 1995, as an attack on US Vice-President Al Gore's intention to supply all public schools with Internet access, Republican Senator Charles Grassley warned of the lurking dangers for children on the Net. Referring to a Time magazine cover story by Philip Elmer-Dewitt from July 3 on pornography on the Net, he pointed out that 83.5% of all images online are pornographic. But Elmer-Dewitt was wrong. Obviously unaware of the difference between Bulletin Board Systems and the Net, he referred misleadingly to Marty Rimm's article Marketing Pornography on the Information Superhighway, published in the prestigious Georgetown Law Journal (vol. 83, June 1995, pp. 1849-1935). Rimm knew of this difference, of course, and stated it clearly. (For further information see Hoffman & Novak, The Cyberporn Debate, http://ecommerce.vanderbilt.edu/cyberporn.debate.html and Franz Wegener, Cyberpornographie: Chronologie einer Hexenjagd, http://www.intro-online.de/c6.html)

Almost inevitably, anxieties accompany the introduction of new technologies: in the 19th century it was said that traveling by train was bad for one's health. The debate triggered by Time magazine's cover story and Senator Grassley's attack created the impression that the Net had multiplied the possible dangers for children, as if the global communication networks were an inexhaustible source of mushrooming child pornography. Later, would-be bomb recipes found on the Net added to the already prevailing anxieties. As even in industrialized countries most people still have little or no first-hand experience with the Net, anxieties about child pornography or terrorist attacks can easily be stirred up and exploited.

A similar and related debate is going on about the glorification of violence and erotic depictions in the media. Pointing to a "toxic popular culture" shaped by media that "distort children's view of reality and even undermine their character growth", US right-wing social welfare organizations and think tanks call for strong media censorship (see An Appeal to Hollywood, http://www.media-appeal.org/appeal.htm). Media, especially films and videos, are already censored and rated; what is wanted is still more censorship.

The intentions behind stimulating a debate on child pornography on the Net were manifold: inter alia, it served the Republican Party to attack Democrat Al Gore's initiative to supply all public schools with Internet access; additionally, the big media corporations realized that because of the Net they might have to face new competitors, and rushed to press for content regulation. Taken together, these intentions make this still ongoing debate the first and best-known attempt to impose content regulation on the Net. Consequently, at least in Western countries, governments and media corporations refer to child pornography to justify legal requirements for, and the implementation of, technologies for the surveillance and monitoring of individuals, for the filtering, rating and blocking of content, and for the prohibition of anonymous publishing on the Net.

In the name of "cleaning" the Net of child pornography, our basic rights are restricted. It is the insistence on unrestricted basic rights that needs to be justified, as it may seem.

Underlying the campaign to control the Net are several assumptions, inter alia: that the Net lacks control and needs to be made safe and secure; that we may be exposed inadvertently to pornographic content; and that this content is harmful to children. Remarkably, racism seems not to be an issue.

The Net, especially the World Wide Web, is not like television (although it is to be feared that this is what it might become within the next few years). Say little Mary types "Barbie" into a search engine. Click here to see what happens. True, pornography sometimes is just a few mouse clicks away, but you are not likely to be exposed to pornographic content unless you click deliberately.

In reaction to these anxieties, but in the absence of data on how children actually use the Internet, the US Congress passed the Communications Decency Act (CDA) in 1996. In consequence, the Electronic Frontier Foundation (EFF) launched the famous Blue Ribbon Campaign, and, among others, America Online and Microsoft Corporation supported the American Civil Liberties Union (ACLU) in its lawsuit against the Act. On June 26, 1997, the US Supreme Court ruled the CDA unconstitutional under the provisions of the First Amendment to the Constitution: the Communications Decency Act violated the basic right to free expression. After a summit with the US government, industry leaders announced that they would use existing rating and blocking systems and develop new ones for "inappropriate" online resources.

So, after the failure of the CDA, the US government shifted its responsibility to the industry by inviting corporations to take on governmental tasks. Bearing in mind the CompuServe case and its possible consequences, the industry welcomed this decision and was quick to call its newly assumed responsibility "self-regulation". Strictly speaking, "self-regulation" as meant by the industry does not amount to corporations regulating their own behaviour. On the contrary, "self-regulation" is to be understood as the regulation of users' behaviour through the rating, filtering and blocking of Internet content considered inappropriate. The Internet industry tries to show that technical solutions are preferable to legislation, and wants to make sure that it is not held responsible and liable for illegal, offensive or harmful content. A new CompuServe case and a new Communications Decency Act are to be averted.

In the Memorandum Self-regulation of Internet Content, released in late 1999 by the Bertelsmann Foundation, it is recommended that the Internet industry join forces with governmental institutions to enforce codes of conduct and to encourage the implementation of filtering and rating systems. For further details on the Memorandum, see the study by the Center for Democracy and Technology, An Analysis of the Bertelsmann Foundation Memorandum on Self-Regulation of Internet Content: Concerns from a User Empowerment Perspective.

In fact, the "self-regulation" of the Internet industry is privatized censorship performed by corporations and right-wing NGOs. Censorship has become a business. "Crucially, the lifting of restrictions on market competition hasn't advanced the cause of freedom of expression at all. On the contrary, the privatisation of cyberspace seems to be taking place alongside the introduction of heavy censorship." (Richard Barbrook and Andy Cameron, The Californian Ideology)

While trying to convince us that its technical solutions are appropriate alternatives to government regulation, the Internet industry cannot dispense with governmental backing to enforce the proposed measures. This adds to and reinforces the censorship measures already undertaken by governments. We are encouraged to use today's information and communication technologies while the flow of information is restricted.

According to a report by Reporters Sans Frontières, quoted by Leonard R. Sussman in his essay Censor Dot Gov. The Internet and Press Freedom 2000, the following countries totally or largely control Internet access: Azerbaijan, Belarus, Burma, China, Cuba, Iran, Iraq, Kazakhstan, Kirghizstan, Libya, North Korea, Saudi Arabia, Sierra Leone, Sudan, Syria, Tajikistan, Tunisia, Turkmenistan, Uzbekistan, and Vietnam.

TEXTBLOCK 20/21 // URL: http://world-information.org/wio/infostructure/100437611742/100438658968
 
Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers that makes a mockery of dreams of increased competition.

Recent mega-mergers and acquisitions include:

SBC Communications - Ameritech: $ 72.3 bn
Bell Atlantic - GTE: $ 71.3 bn
AT&T - MediaOne: $ 63.1 bn
AOL - Time Warner: $ 165 bn
MCI WorldCom - Sprint: $ 129 bn

The total value of all major mergers since the beginning of the 1990s amounts to $ 20 trillion, 2.5 times the USA's GDP.

The AOL-Time Warner merger reflects a trend that can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance towards complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 21/21 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
Hieroglyphs

Hieroglyphs are pictures used for writing in ancient Egypt. At first these pictures were used for the names of kings; later more and more signs were added, until they numbered about 750.

INDEXCARD, 1/25
 
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet, but a subset of it. It consists of documents linked together in such a way that you can switch from one document to another simply by clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used to create World Wide Web documents. These so-called hypertexts can combine text, graphics, videos, sounds, and Java applets, thus making multimedia content possible.
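As a minimal sketch of what this linking consists of, the following Python snippet extracts the link targets (the href attributes of HTML anchor tags) from a small document; the sample page itself is hypothetical:

    from html.parser import HTMLParser

    # Collect the targets of all <a href="..."> anchor tags in a document.
    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href":
                        self.links.append(value)

    # Hypothetical sample page: one paragraph containing one hyperlink.
    page = '<p>See the <a href="http://www.webhistory.org/home.html">WWW History Project</a>.</p>'

    parser = LinkExtractor()
    parser.feed(page)
    print(parser.links)  # ['http://www.webhistory.org/home.html']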

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index the stored information. (For regularly updated lists of the 100 most popular words that people are entering into search engines, click here). No search engine can retrieve all information on the whole World Wide Web; every search engine covers just a small part of it.
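How such indexing might work can be suggested with a toy example. The sketch below builds a minimal inverted index, the data structure at the core of every search engine, over three hypothetical "fetched" documents; real engines add crawling, ranking and much more:

    # Build a minimal inverted index: for every word, the set of documents
    # (identified here by invented names) in which it occurs.
    docs = {
        "doc1": "barbie dolls and accessories",
        "doc2": "history of the barbie doll",
        "doc3": "polybius the greek historian",
    }

    index = {}
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)

    # A query is then a simple lookup:
    print(sorted(index["barbie"]))  # ['doc1', 'doc2']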

This, among other things, is the reason why the World Wide Web is not simply one huge database, as is sometimes claimed: it lacks consistency. It is true that the Internet offers virtually infinite storage capacity, and one that might prove almost everlasting, a prospect that is consoling and threatening at once.

According to the Internet Domain Survey of the Internet Software Consortium, the number of Internet host computers is growing rapidly: in October 1969 the first two computers were connected; the number grew to 376,000 in January 1991 and to 72,398,092 in January 2000.

World Wide Web History Project, http://www.webhistory.org/home.html

http://www.searchwords.com/
http://www.islandnet.com/deathnet/
http://www.salonmagazine.com/21st/feature/199...
INDEXCARD, 2/25
 
Polybius

Polybius was one of the greatest historians of ancient Greece. He lived from c. 200 to 118 BC. See: Polybius Checkerboard.

INDEXCARD, 3/25
 
Internet Research Task Force

Operating under the umbrella of the Internet Society, the Internet Research Task Force is itself an umbrella organization of small research groups working on topics related to Internet protocols, applications, architecture and technology. It is governed by the Internet Research Steering Group.

http://www.irtf.org

INDEXCARD, 4/25
 
Immanuel Wallerstein

Immanuel Wallerstein (* 1930) is director of the Fernand Braudel Center for the Study of Economies, Historical Systems, and Civilizations, and one of the most famous sociologists in the Western world. With his book The Modern World-System: Capitalist Agriculture and the Origins of the European World-Economy in the Sixteenth Century (1976), which gave rise to the term World-System Theory, describing centers, peripheries and semi-peripheries in the capitalist world system, he influenced a whole generation of scholars. Due to globalization, the theory now seems to be gaining popularity again.

INDEXCARD, 5/25
 
Polybius Checkerboard


 

        1   2   3   4   5

    1   A   B   C   D   E
    2   F   G   H   I   K
    3   L   M   N   O   P
    4   Q   R   S   T   U
    5   V   W   X   Y   Z



It is a system in which letters are converted into numeric characters: each letter is replaced by the numbers of its row and column in the checkerboard. The numbers were not written down and sent, but signaled with torches.

for example:
A=1-1
B=1-2
C=1-3
W=5-2
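A minimal sketch of this encoding step in Python, assuming the classical 25-letter square shown above (in which I and J share a cell):

    # The 25-letter Polybius square as one string; J is folded into I.
    SQUARE = "ABCDEFGHIKLMNOPQRSTUVWXYZ"

    def polybius_encode(text):
        pairs = []
        for ch in text.upper().replace("J", "I"):
            if ch in SQUARE:
                i = SQUARE.index(ch)
                # row = i // 5 + 1, column = i % 5 + 1
                pairs.append("%d-%d" % (i // 5 + 1, i % 5 + 1))
        return " ".join(pairs)

    print(polybius_encode("ABCW"))  # 1-1 1-2 1-3 5-2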

for more information see:
http://www.ftech.net/~monark/crypto/crypt/polybius.htm

INDEXCARD, 6/25
 
Bruce Schneier

Bruce Schneier is president of Counterpane Systems in Minneapolis, a consulting enterprise specializing in cryptography and computer security. He is the author of the book Applied Cryptography and the inventor of the Blowfish and Twofish encryption algorithms.

INDEXCARD, 7/25
 
NATO

The North Atlantic Treaty was signed in Washington on 4 April 1949, creating NATO, the North Atlantic Treaty Organization. It was an alliance of 12 independent nations committed to each other's defense. Between 1952 and 1982 four more members were admitted, and in 1999 the first former COMECON countries, the Czech Republic, Hungary and Poland, joined, bringing membership to 19. Around its 50th anniversary, NATO changed its goals and tasks by intervening in the Kosovo crisis.

INDEXCARD, 8/25
 
Harold D. Lasswell

Harold D. Lasswell (* 1902) studied at the London School of Economics and later became a professor of social sciences at several universities, including the University of Chicago, Columbia University, and Yale University. He also served as a consultant for several governments. One of Lasswell's many famous works is Propaganda Technique in the World War, in which he defines propaganda and discusses its major objectives: to mobilize hatred against the enemy, to preserve the friendship of allies, to procure the cooperation of neutrals, and to demoralize the enemy.

INDEXCARD, 9/25
 
Theodore Roosevelt

With the assassination of President McKinley, Theodore Roosevelt (1858-1919), not quite 43, became the youngest President in the Nation's history. Roosevelt's youth differed sharply from that of the log cabin Presidents. He was born in New York City in 1858 into a wealthy family. Roosevelt steered the United States more actively into world politics. He liked to quote a favorite proverb, "Speak softly and carry a big stick..."

He won the Nobel Peace Prize for mediating the Russo-Japanese War.

for more information see the official website:

http://www.whitehouse.gov/WH/glimpse/presidents/html/tr26.html

INDEXCARD, 10/25
 
William Gibson

American science fiction author. Most famous novel: Neuromancer.

For resources such as writings and interviews available on the Internet, see http://www.lib.loyno.edu/bibl/wgibson.htm

INDEXCARD, 11/25
 
DES

The U.S. Data Encryption Standard (DES) is the most widely used encryption algorithm, especially for the protection of financial transactions. It was developed by IBM in 1971. It is a symmetric-key cryptosystem. The DES algorithm uses a 56-bit encryption key, meaning that there are 72,057,594,037,927,936 possible keys.
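The key-space figure is easy to verify, and a rough estimate shows what it means in practice; the keys-per-second rate below is a purely hypothetical assumption, chosen only to illustrate the scale of a brute-force search:

    keyspace = 2 ** 56
    print(keyspace)  # 72057594037927936

    # Hypothetical assumption: a machine testing one billion keys per second.
    rate = 10 ** 9
    years = keyspace / rate / (3600 * 24 * 365)
    print(round(years, 1))  # about 2.3 years for an exhaustive search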

for more information see:
http://www.britannica.com/bcom/eb/article/3/0,5716,117763+5,00.html
http://www.cryptography.com/des/

INDEXCARD, 12/25
 
CIA

CIA's mission is to support the President, the National Security Council, and all officials who make and execute U.S. national security policy by: providing accurate, comprehensive, and timely foreign intelligence on national security topics; conducting counterintelligence activities, special activities, and other functions related to foreign intelligence and national security, as directed by the President. To accomplish its mission, the CIA engages in research, development, and deployment of high-leverage technology for intelligence purposes. As a separate agency, CIA serves as an independent source of analysis on topics of concern and works closely with the other organizations in the Intelligence Community to ensure that the intelligence consumer, whether Washington policymaker or battlefield commander, receives adequate intelligence information.

http://www.cia.gov

INDEXCARD, 13/25
 
Central processing unit

A CPU is the principal part of any digital computer system, generally composed of the main memory, control unit, and arithmetic-logic unit. It constitutes the physical heart of the entire computer system; to it are linked various peripheral devices, including input/output devices and auxiliary storage units...

INDEXCARD, 14/25
 
Kessler Marketing Intelligence (KMI)

KMI is the leading source of information on fiber-optics markets. It offers market research, strategic analysis and product planning services to the opto-electronics and communications industries. KMI tracks the worldwide fiber-optic cable system market and sells its findings to the industry. KMI says that every fiber-optics corporation with a need for strategic market planning subscribes to its services.

http://www.kmicorp.com/

INDEXCARD, 15/25
 
Vladimir Putin

Vladimir Putin is Russian President Boris Yeltsin's successor. Until his appointment as Prime Minister in August 1999, he was nearly unknown. He had worked for the Soviet security service, the KGB; in July 1998 he took charge of its successor, the Federal Security Service (FSB), and in March 1999 he became secretary of the Security Council. He has hardly any political experience at all. Where he has demonstrated power so far is the Chechnya war: soon after the beginning of this second war in the region, his popularity rose.

INDEXCARD, 16/25
 
Amazon.Com

Amazon.Com was one of the first online bookstores. With thousands of books, CDs and videos ordered via the Internet every year, Amazon.Com is probably the most successful Internet bookstore.

INDEXCARD, 17/25
 
blowfish encryption algorithm

Blowfish is a symmetric-key block cipher with a variable key length (up to 448 bits).
The idea behind it is a simple design that makes the cipher fast compared with others.

http://www.counterpane.com/blowfish.html
http://www.counterpane.com/bfsverlag.html

INDEXCARD, 18/25
 
Electronic Messaging (E-Mail)

Electronic messages are transmitted and received by computers through a network. By e-mail, texts, images, sounds and videos can be sent to a single user or simultaneously to a group of users. Texts can now be sent and read without having to be printed.

E-Mail is one of the most popular and important services on the Internet.

INDEXCARD, 19/25
 
Federal Networking Council

Established in the name of the US government, the Federal Networking Council (FNC) acts as a forum for networking collaboration among Federal agencies, helping them meet their research, education, and operational mission goals and bridging the gap between the advanced networking technologies being developed by FNC research agencies and the ultimate acquisition of mature versions of these technologies from the commercial sector.

Its members are representatives of agencies such as the National Security Agency, the Department of Energy, and the National Science Foundation.

http://www.fnc.gov

INDEXCARD, 20/25
 
Adolf Hitler

Adolf Hitler (1889-1945) was the head of the NSDAP, the National Socialist German Workers' Party. Though originally from Austria, he began his political career in Germany. As Reichskanzler of Germany he provoked World War II. His hatred of all non-Aryans and of people who thought differently killed millions of human beings. Disinformation about his personality and an unbelievable machinery of propaganda made an entire people close its eyes to the cruelest crimes against humankind.

INDEXCARD, 21/25
 
Internet Architecture Board

On behalf of the Internet Society, the Internet Architecture Board oversees the evolution of the architecture, the standards and the protocols of the Net.

Internet Society: http://www.isoc.org/iab

INDEXCARD, 22/25
 
atbash

Atbash is regarded as the simplest method of encryption. It is nothing other than a reversed alphabet: a=z, b=y, c=x and so on. Many different nations used it in the early times of writing.
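In code, the substitution simply mirrors each letter within the alphabet. A minimal sketch in Python, assuming the 26-letter Latin alphabet (historically Atbash was applied to the Hebrew alphabet):

    def atbash(text):
        result = []
        for ch in text.upper():
            if "A" <= ch <= "Z":
                # Mirror the letter: A->Z, B->Y, C->X, ...
                result.append(chr(ord("Z") - (ord(ch) - ord("A"))))
            else:
                result.append(ch)
        return "".join(result)

    print(atbash("ABC"))  # ZYX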

for further explanations see:
http://www.ftech.net/~monark/crypto/crypt/atbash.htm

INDEXCARD, 23/25
 
Reuters Group plc

Founded in 1851 in London, Reuters is the world's largest news and television agency, with 1,946 journalists, photographers and camera operators in 183 bureaus serving newspapers, other news agencies, and radio and television broadcasters in 157 countries.
In addition to its traditional news-agency business, Reuters provides over its network financial information and a wide array of electronic trading and brokerage services to banks, brokerage houses, companies, governments, and individuals worldwide.

http://www.reuters.com

INDEXCARD, 24/25
 
Enigma Machine

The Enigma encryption machine was famous as much for its insecurities as for the security it gave to German ciphers. It was broken first by the Poles in the 1930s, then by the British in World War II.

INDEXCARD, 25/25