Virtual cartels; mergers

In parallel to the deregulation of markets, there has been a trend towards large-scale mergers which makes a mockery of dreams of increased competition.

Recent mega-mergers and acquisitions include:

SBC Communications - Ameritech, $72.3 bn

Bell Atlantic - GTE, $71.3 bn

AT&T - MediaOne, $63.1 bn

AOL - Time Warner, $165 bn

MCI WorldCom - Sprint, $129 bn

The total value of all major mergers since the beginning of the 1990s amounts to 20 trillion dollars, 2.5 times the size of the USA's GDP.

The AOL-Time Warner deal reflects a trend which can be observed everywhere: the convergence of the ICT and content industries. This represents the ultimate advance towards complete market domination, and an alarming threat to independent content.

"Is TIME going to write something negative about AOL? Will AOL be able to offer anything other than CNN sources? Is the Net becoming as silly and unbearable as television?"

(Detlev Borchers, journalist)

TEXTBLOCK 1/15 // URL: http://world-information.org/wio/infostructure/100437611709/100438658959
 
Product Placement

With television still being very popular, commercial entertainment has transferred the concept of the soap opera onto the Web. The first of this new species of "cybersoaps" was "The Spot", a story about the ups and downs of an American commune. Within a short time The Spot not only attracted a large audience, but also pioneered the field of online product placement. Besides Sony banners, the company's logo is placed on nearly every electronic product appearing in the story. Though appearing as a site for light entertainment, The Spot's main goal is to make the name Sony and its product range well known within the target audience.

TEXTBLOCK 2/15 // URL: http://world-information.org/wio/infostructure/100437611652/100438658026
 
Private data bunkers

On the other hand there are the data bunkers of the private sector, whose position is different. Although these are fast-growing engines of data collection with a much greater degree of dynamism, they may not enjoy the same privileged position - although one has to take into account the general historical and social conditions into which a data bunker is embedded. For example, it can safely be assumed that the databases of a large credit card company or bank are better protected than those of the bureaucracies of small developing countries.

Private data bunkers include

    Banks

    Building societies

    Credit bureaus

    Credit card companies

    Direct marketing companies

    Insurance companies

    Telecom service providers

    Mail order stores

    Online stores


TEXTBLOCK 3/15 // URL: http://world-information.org/wio/infostructure/100437611761/100438659735
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there have to be server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operation has long depended on global networking.

At the top are credit card companies such as Visa, American Express, MasterCard and others. It would be erroneous to believe that the primary business of these companies is the provision of credit and the facilitation of credit information for sales transactions. In fact, information means much more than just credit information. In an advertisement of 1982, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the life blood of society, as Bill Gates expresses it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within a data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimisation - for almost every decision taken. Data bunkers therefore have an essentially conservative political function, serving to maintain and strengthen the given social structures.

TEXTBLOCK 4/15 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
What is the Internet?

Each definition of the Internet is a simplified statement and runs the risk of being outdated within a short time. What is usually referred to as the Internet is a network of thousands of computer networks (so-called autonomous systems) run by governmental authorities, companies, universities, etc. Generally speaking, every time a user connects to a computer network, a new Internet is created. Technically speaking, the Internet is a wide area network (WAN) that may be connected to local area networks (LANs).

What constitutes the Internet is constantly changing. Certainly the state of the future Net will be different from the present one. Some years ago the Internet could still be described as a network of computer networks using a common communication protocol, the so-called IP protocol. Today, however, networks using other communication protocols are also connected to other networks via gateways.

Also, the Internet is not solely constituted by computers connected to other computers, because there are also point-of-sale terminals, cameras, robots, telescopes, cellular phones, TV sets and an assortment of other hardware components that are connected to the Internet.

At the core of the Internet are so-called Internet exchanges, national backbone networks, regional networks, and local networks.

Since these networks are often privately owned, any description of the Internet as a public network is not accurate. It is easier to say what the Internet is not than to say what it is. On 24 October 1995 the U.S. Federal Networking Council passed the following resolution concerning the definition of the term "Internet": "Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein." (http://www.fnc.gov/Internet_res.html)

What is generally and in a simplifying manner called the Internet may be better referred to as the Matrix, a term introduced by science fiction writer William Gibson, as John S. Quarterman and Smoot Carl-Mitchell have proposed. The Matrix consists of all computer systems worldwide capable of exchanging e-mail: of the USENET, corporate networks and proprietary networks owned by telecommunication and cable TV companies.

Strictly speaking, the Matrix is not a medium; it is a platform for resources: for media and services. The Matrix is mainly a very powerful means for making information easily accessible worldwide, for sending and receiving messages, videos, texts and audio files, for transferring funds and trading securities, for sharing resources, for collecting weather condition data, for trailing the movements of elephants, for playing games online, for video conferencing, for distance learning, for virtual exhibitions, for jamming with other musicians, for long-distance ordering, for auctions, for tracking packaged goods, for doing business, for chatting, and for remote access to computers and devices such as telescopes and robots. The Internet is a wonderful tool for exchanging, retrieving and storing data and for sharing equipment over long distances, eventually in real time, if the telecommunication infrastructure is reliable and of high quality.

For a comprehensive view of uses of the Matrix, especially the World Wide Web, see "24 Hours in Cyberspace".

TEXTBLOCK 5/15 // URL: http://world-information.org/wio/infostructure/100437611791/100438659889
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe and unbreakable. To decipher a text means to go through many, sometimes nearly - but never really - endless attempts. For the computers of today it might take hundreds of years or even more to go through all possible keys, but in the end the code remains breakable. The much faster quantum computers will prove that one day.
Therefore the decision to choose a certain method of enciphering is ultimately a matter of trust.
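The scale of such an exhaustive search can be illustrated with a small, hypothetical calculation (not part of the original text): divide the number of possible keys by an assumed testing rate.

```python
# Back-of-envelope estimate of an exhaustive key search, assuming a
# hypothetical attacker who can test a fixed number of keys per second.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_years(key_bits: int, keys_per_second: float) -> float:
    """Average years needed to find a key by trying half of the keyspace."""
    keyspace = 2 ** key_bits
    return (keyspace / 2) / keys_per_second / SECONDS_PER_YEAR

if __name__ == "__main__":
    for bits in (56, 128, 256):
        # one billion keys per second is an assumed, generous testing rate
        print(f"{bits}-bit key: about {brute_force_years(bits, 1e9):.2e} years")
```

The point of the arithmetic is not the exact figures but the gap between a 56-bit key, which is within reach, and a 128-bit key, which is not; still, "not within reach today" is a statement about current machines, not a proof of unbreakability.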

For the average user of computers it is rather difficult to understand or even realize the dangers and the technological background of the electronic transmission of data. For most people, thinking about their own need for encryption first of all means trusting others, the specialists, and relying on the information they provide.
The websites explaining the problems behind it (and also the articles and books concerning the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background remains an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible for many people. The media reports mentioned often specialize in the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and in the danger of losing control over the money in one's bank account if someone steals the credit card number or other important financial data. The term "security" that is connected to those issues is a completely different one from the security that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

For the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 6/15 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Identification in history

In biometric technology, the subject is reduced to its physical and therefore inseparable properties. The subject is a subject in so far as it is objectified; that is, in so far as it is identified with its own res extensa, Descartes' "extended thing". The subject exists in so far as it can be objectified; if it resists the objectification that comes with measurement, it is rejected or punished. Biometrics therefore provides the ultimate tool for control; in it, the dream of hermetic identity control seems to become reality, a modern technological reconstruction of traditional identification techniques such as the handshake or the look into somebody's eyes.

The use of identification by states and other institutions of authority is evidently not simply a modern phenomenon. The ancient Babylonians and Chinese already made use of finger printing on clay to identify authors of documents, while the Romans already systematically compared handwritings.

Body measurement has long been used by the military. One of the first measures after entering the military is the identification and appropriation of the body measurements of a soldier. These measurements are filed and combined with other data and make up what today we would call the soldier's data body. With his data body in the possession of the authority, a soldier is no longer able to socialise freely and is instead dependent on the disciplinary structure of the military institution. The soldier's social being in the world is defined by the military institution.

However, the military and civilian spheres of modern societies are no longer distinct entities. The very ambivalence of advanced technology (dual-use technologies) has meant that "good" and "bad" uses of technology can no longer be clearly distinguished. The measurement of physical properties and the creation of data bodies is therefore no longer a military prerogative; it has become diffused into all areas of modern societies.

If the emancipatory potential of weak identities is to be of use, it is therefore necessary to know how biometric technologies work and what uses they are put to.

TEXTBLOCK 7/15 // URL: http://world-information.org/wio/infostructure/100437611729/100438658096
 
Linking and Framing: Cases

Mormon Church v. Sandra and Jerald Tanner

In a ruling of December 1999, a federal judge in Utah temporarily barred two critics of the Mormon Church from posting on their website the Internet addresses of other sites featuring pirated copies of a Mormon text. The Judge said that it was likely that Sandra and Jerald Tanner had engaged in contributory copyright infringement when they posted the addresses of three Web sites that they knew, or should have known, contained the copies.

Kaplan, Carl S.: Copyright Decision Threatens Freedom to Link. In: New York Times. December 10, 1999.

Universal Studios v. Movie-List

The website Movie-List, which features links to online, externally hosted movie trailers, has been asked to refrain completely from linking to any of Universal Studios' servers containing the trailers, as this would infringe copyright.

Cisneros, Oscar S.: Universal: Don't Link to Us. In: Wired. July 27, 1999.

More cases concerned with the issue of linking, framing and the infringement of intellectual property are published in:

Ross, Alexandra: Copyright Law and the Internet: Selected Statutes and Cases.

TEXTBLOCK 8/15 // URL: http://world-information.org/wio/infostructure/100437611725/100438659639
 
Definition

During the last 20 years the old Immanuel Wallerstein paradigm of center, semi-periphery and periphery has found a new costume: ICTs. After colonialism, neo-colonialism and neoliberalism, a new method of marginalization is emerging: the digital divide.

"Digital divide" describes the fact that the world can be divided into people who
do and people who do not have access to (or the education to handle with) modern information technologies, e.g. cellular telephone, television, Internet. This digital divide is concerning people all over the world, but as usually most of all people in the formerly so called third world countries and in rural areas suffer; the poor and less-educated suffer from that divide.
More than 80% of all computers with access to the Internet are situated in larger cities.

"The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these `extra' services ....Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century of millennium."
(Izumi Aizu)

For more information see:
http://www.whatis.com/digital_divide.htm

TEXTBLOCK 9/15 // URL: http://world-information.org/wio/infostructure/100437611730/100438659300
 
Content as Transport Medium for Values and Ideologies

With the dissemination of their content, commercial media are, among other things, also able to transport values and ideologies. Usually their programming reflects society's dominant social, political, ethical, cultural and economic values. A critical view of the prevalent ideologies is often sacrificed so as not to offend the existing political elites and corporate powers, but rather to satisfy shareholders and advertisers.

With most of the world's content produced by a few commercial media conglomerates, and with the overwhelming majority of these companies (in terms of revenue generation) concentrated in Europe, the U.S., Japan and Australia, there is also a strong flow of content from the 'North-West' to the 'South-East'. Popular culture developed in the world's dominant commercial centers and Western values and ideologies are thus disseminated into the most distant corners of the earth, with far less coming back.

TEXTBLOCK 10/15 // URL: http://world-information.org/wio/infostructure/100437611795/100438659066
 
Legal Protection: National Legislation

Intellectual property - comprising industrial property and copyright - in general is protected by national legislation. Therefore those rights are limited territorially and can be exercised only within the jurisdiction of the country or countries under whose laws they are granted.

TEXTBLOCK 11/15 // URL: http://world-information.org/wio/infostructure/100437611725/100438659540
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services, computers have become communication devices instead of being solely computational or typewriter-like devices. Corporate networks are becoming increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests make further distinctions by splitting up the easily understandable term "proprietary networks" into terms that need explaining, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and the Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online sells its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising, because "organizing" is increasingly equated with the "filtering" of seemingly objectionable messages and the "rating" of content. For more on the technical nature of computer networks, see the corresponding article in the Encyclopaedia Britannica.

Especially for financial transactions, secure proprietary networks are becoming increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated to be above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in the trading of equities. They offer faster trading at reduced costs and better prices on trades for brokers and institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 12/15 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identity performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in his paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; he founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because ever since the inception of the ARPANet, the forerunner of the Internet, network measurement has been an important task. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. Measurement was deemed important for two reasons: first, it would be critical for future development, evolution and deployment planning; second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the stewardship role of the National Science Foundation over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly increasing numbers of customers. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."

Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity of each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of roughly 75 Gbps as of December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, but the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U.S. (with an effective bandwidth of about 330 Gbps as of December 1997), although they carry less traffic.

On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, both of which consume more bandwidth and generate unanticipated amounts of data traffic.
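The order of magnitude of this crossover claim can be illustrated with a small compound-growth sketch. It is a hypothetical back-of-envelope calculation, not taken from Coffman and Odlyzko: it uses the December 1997 effective-bandwidth figures quoted above, assumes 100% annual growth on the Internet side, and treats the growth rate of the voice network as an assumed parameter. Since the estimate of around 2002 refers to traffic rather than capacity, a capacity-based sketch like this one crosses over somewhat earlier.

```python
# Back-of-envelope projection of when Internet capacity overtakes the
# voice network, starting from the December 1997 figures quoted above.
# Hypothetical illustration; the voice growth rate is an assumption.

def crossover_year(internet_gbps: float = 75.0, voice_gbps: float = 330.0,
                   internet_growth: float = 1.0, voice_growth: float = 0.1,
                   start_year: int = 1997) -> int:
    """Return the first year in which the projected Internet figure
    exceeds the projected voice figure (growth rates are per year,
    1.0 meaning 100%)."""
    year = start_year
    while internet_gbps <= voice_gbps:
        internet_gbps *= 1 + internet_growth
        voice_gbps *= 1 + voice_growth
        year += 1
    return year

if __name__ == "__main__":
    print(crossover_year())   # lands around the turn of the millennium
```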

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection to all pingable hosts. That is ISC's new method; its old method, still used by RIPE, has been to count the number of domain names that had IP addresses assigned to them, a method that proved to be not very useful because a significant number of hosts restrict download access to their domain data.
Quite apart from the small sample, this method has at least one flaw: ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used by Telcordia's Netsizer.
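The sampling idea described above can be sketched in a few lines of code. The following is a hypothetical illustration of the principle (ping a random sample of assigned addresses and scale the result up), not ISC's actual survey software; it assumes a Unix-like system where the ping command is available.

```python
# Sketch of the sampling idea behind host surveys: ping a small random
# sample of addresses that have names assigned and project the result
# onto the whole list. Hypothetical illustration, not ISC's survey code.
import random
import subprocess

def is_reachable(ip: str, timeout_s: int = 1) -> bool:
    """Send a single ICMP echo request (assumes a Unix-like 'ping')."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def estimate_hosts(assigned_ips: list[str], sample_fraction: float = 0.01) -> int:
    """Ping a random sample and scale the number of responding hosts up."""
    sample_size = max(1, int(len(assigned_ips) * sample_fraction))
    sample = random.sample(assigned_ips, sample_size)
    responding = sum(is_reachable(ip) for ip in sample)
    return round(responding / sample_size * len(assigned_ips))

if __name__ == "__main__":
    # A toy address list stands in for the millions in a real survey.
    toy_list = ["192.0.2.1", "198.51.100.1", "203.0.113.1"]
    print(estimate_hosts(toy_list, sample_fraction=1.0))
```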

Internet Weather

Like daily weather, traffic on the Internet, that is the conditions for data flows, is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, which displays traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts) and to compare the response times to past ones and to the response times of servers in the same region.
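A crude version of such a weather index can be derived from response times alone. The sketch below is a hypothetical illustration, not the method of any particular service: it compares current round-trip times against stored baselines and maps the average onto the 0-100 scale mentioned above.

```python
# Hypothetical "Internet weather" index: compare current round-trip times
# for a set of servers against historical baselines and map the result to
# a 0-100 score (100 = as fast as, or faster than, usual).

def weather_index(current_ms: dict[str, float], baseline_ms: dict[str, float]) -> int:
    """Average the per-server ratio of baseline to current response time,
    capped at 1.0, and scale it to 0-100."""
    ratios = []
    for server, now in current_ms.items():
        base = baseline_ms.get(server)
        if base is None or now <= 0:
            continue  # no history or no response: skip this server
        ratios.append(min(base / now, 1.0))
    return round(100 * sum(ratios) / len(ratios)) if ratios else 0

if __name__ == "__main__":
    baseline = {"eu-server": 120.0, "us-server": 90.0}
    current = {"eu-server": 150.0, "us-server": 90.0}  # EU link slower than usual
    print(weather_index(current, baseline))            # prints 90
```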

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and the corresponding IP addresses and host names, with others; they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
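The difference between hits and page views is easy to demonstrate on a server log. The following sketch is a hypothetical illustration assuming the common log format: every request counts as a hit, but only requests for HTML documents count as page views.

```python
# Count "hits" (every requested file) versus "page views" (only HTML
# documents) in a web server log, assuming the common log format where
# the request line is enclosed in double quotes.
import re

LOG_PATTERN = re.compile(r'"(?:GET|POST) (\S+)')
PAGE_SUFFIXES = (".html", ".htm", "/")   # assumption about what counts as a page

def count_hits_and_page_views(log_lines):
    hits = page_views = 0
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if not match:
            continue
        hits += 1
        path = match.group(1).split("?")[0]
        if path.endswith(PAGE_SUFFIXES):
            page_views += 1
    return hits, page_views

if __name__ == "__main__":
    sample_log = [
        '127.0.0.1 - - [10/Oct/1999:13:55:36] "GET /index.html HTTP/1.0" 200 2326',
        '127.0.0.1 - - [10/Oct/1999:13:55:37] "GET /logo.gif HTTP/1.0" 200 512',
        '127.0.0.1 - - [10/Oct/1999:13:55:37] "GET /photo.jpg HTTP/1.0" 200 1024',
    ]
    print(count_hits_and_page_views(sample_log))   # (3, 1): three hits, one page view
```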

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates, a quantitative measure, are used as a substitute for something of an intrinsically qualitative nature: for example, the importance of a column to its readers. Readers may visit a journal just for one particular column and not care about the journal's other contents; deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.


If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.





TEXTBLOCK 13/15 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Copyright Management and Control Systems: Post-Infringement

Post-infringement technologies allow the owners of copyrighted works to identify infringements and thus enhance the enforcement of intellectual property rights. They encompass systems such as:

Steganography

Applied to electronic files, steganography refers to the process of hiding information in files in such a way that it cannot be easily detected by users. Steganography can be used by intellectual property owners in a variety of ways. One is to insert into the file a "digital watermark" which can be used to prove that an infringing file was the creation of the copyright holder and not the pirate. Other possibilities are to encode a unique serial number into each authorized copy or file, enabling the owner to trace infringing copies to a particular source, or to store copyright management information.
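As a minimal illustration of the principle, the following sketch hides a short identifier in the least significant bits of a byte sequence (standing in for raw image samples). It is a toy example of this kind of watermarking, not a description of any actual commercial system, and it offers none of the robustness real watermarks require.

```python
# Toy least-significant-bit watermark: hide a short identifier in the
# lowest bit of each byte of some carrier data (e.g. raw image samples).
# Illustrative only; real watermarking schemes are far more robust.

def embed(carrier: bytearray, mark: bytes) -> bytearray:
    bits = [(byte >> i) & 1 for byte in mark for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for watermark")
    out = bytearray(carrier)
    for pos, bit in enumerate(bits):
        out[pos] = (out[pos] & 0xFE) | bit   # overwrite the lowest bit
    return out

def extract(carrier: bytes, length: int) -> bytes:
    bits = [carrier[i] & 1 for i in range(length * 8)]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(length)
    )

if __name__ == "__main__":
    data = bytearray(range(64))         # stands in for image samples
    marked = embed(data, b"WIO42")      # hypothetical serial number
    print(extract(marked, 5))           # b'WIO42'
```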

Agents

Agents are programs that can carry out specified commands automatically. Copyright owners can use agents to search the public spaces of the Internet for infringing copies. Although the technology is not yet very well developed, full-text search engines allow similar uses.
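A very simple agent of this kind could work by fingerprinting known works and comparing downloaded files against those fingerprints. The sketch below is a hypothetical illustration using exact hash matching; the URL and titles are made up, and real systems would need large-scale crawling and fuzzier matching.

```python
# Sketch of a copyright "agent": hash known works, fetch candidate URLs,
# and report exact matches. Hypothetical illustration; exact hashes only
# catch byte-identical copies.
import hashlib
import urllib.request

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def find_infringing_copies(known_works: dict[str, bytes], candidate_urls: list[str]):
    known = {fingerprint(data): title for title, data in known_works.items()}
    matches = []
    for url in candidate_urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                digest = fingerprint(response.read())
        except OSError:
            continue                     # unreachable candidates are skipped
        if digest in known:
            matches.append((url, known[digest]))
    return matches

if __name__ == "__main__":
    works = {"My Article": b"full text of the protected work"}
    print(find_infringing_copies(works, ["http://www.example.com/copy.txt"]))
```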

Copyright Litigation

While not every infringement will be the subject of litigation, the threat of litigation helps keep large pirate operations in check. It helps copyright owners obtain relief for specific acts of infringement and publicly warns others of the dangers of infringement.

TEXTBLOCK 14/15 // URL: http://world-information.org/wio/infostructure/100437611725/100438659699
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille, a card with holes: the message is written on paper through the holes of the card. Afterwards the card is removed and the blanks are filled in, so the message looks like an ordinary letter. The recipient needs to own the same card.

- Bishop John Wilkins invents a cryptologic system that looks like musical notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken way of encryption that was already used by the ancient Indians.

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no one becomes suspicious that the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt to decipher Shakespeare's sonnets (in the 20th century) led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine that uses the binary scale which, in a far more advanced form, still underlies today's computers and character encodings such as the ASCII code

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790's Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which actually is not a code but an enciphered alphabet of short and long signals. The first Morse code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the invention of the telegraph changes cryptography very much, as codes are absolutely necessary by then

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 for the first time a tomographic cipher gets described

1861 Friedrich W. Kasiski carries out a cryptanalysis of the Vigenère cipher, which had been thought to be uncrackable for ages

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes cryptography-tasks again and makes them even more important

TEXTBLOCK 15/15 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
François Duvalier

b. April 14, 1907, Port-au-Prince, Haiti
d. April 21, 1971, Port-au-Prince

Byname PAPA DOC, president of Haiti whose 14-year regime was of unprecedented duration in that country. A supporter of President Dumarsais Estimé, Duvalier was appointed director general of the National Public Health Service in 1946. He was appointed underminister of labour in 1948 and the following year became minister of public health and labour, a post that he retained until May 10, 1950, when President Estimé was overthrown by a military junta under Paul E. Magloire, who was subsequently elected president. By 1954 Duvalier had become the central opposition figure and went underground. He was elected president in September 1957. Setting about to consolidate his power, he reduced the size of the army and organized the Tontons Macoutes ("Bogeymen"), a private force responsible for terrorizing and assassinating alleged foes of the regime. Late in 1963 Duvalier moved further toward an absolutist regime, promoting a cult of his person as the semi-divine embodiment of the Haitian nation. In April 1964 he was declared president for life. Although diplomatically almost completely isolated, excommunicated by the Vatican until 1966 for harassing the clergy, and threatened by conspiracies against him, Duvalier was able to stay in power longer than any of his predecessors.

INDEXCARD, 1/16
 
Galileo Galilei

Galileo Galilei (1564-1642), the Italian mathematician and physicist, is called the father of the Enlightenment. He proved the laws of free fall, improved the telescope, and so on. Galilei is still famous for his fights against the Catholic Church. He published his writings in Italian instead of Latin, so that everybody could understand him, which made him popular. As he did not stop defending the heliocentric world system, the Inquisition put him on trial twice and forbade him to continue his experiments.

INDEXCARD, 2/16
 
Gateway

A gateway is a computer supplying point-to-multipoint connections between computer networks.

INDEXCARD, 3/16
 
Oscar Wilde

Oscar Fingal O'Flahertie Wills Wilde (1854-1900) is one of the best and most famous poets and novelists in England in his time. His satirical and amusing texts publicly exposed the false morals of the bourgeoisie. Besides, his life as a dandy made him the leader of aestheticism in England, until he was sent to prison because of his homosexuality. Afterwards he lived in Paris, where he died lonely and nearly forgotten in a hotel in 1900. His poems, fairy tales, novels and dramas survived.

INDEXCARD, 4/16
 
Internet Engineering Steering Group

On behalf of the Internet Society, the Internet Engineering Steering Group is responsible for the technical management of the evolution of the architecture, the standards and the protocols of the Net.

http://www.ietf.org/iesg.html

INDEXCARD, 5/16
 
Gerard J. Holzmann and Bjoern Pehrson, The Early History of Data Networks

This book gives a fascinating glimpse of the many documented attempts throughout history to develop effective means for long distance communications. Large-scale communication networks are not a twentieth-century phenomenon. The oldest attempts date back to millennia before Christ and include ingenious uses of homing pigeons, mirrors, flags, torches, and beacons. The first true nationwide data networks, however, were being built almost two hundred years ago. At the turn of the 18th century, well before the electromagnetic telegraph was invented, many countries in Europe already had fully operational data communications systems with altogether close to one thousand network stations. The book shows how the so-called information revolution started in 1794, with the design and construction of the first true telegraph network in France, Chappe's fixed optical network.

http://www.it.kth.se/docs/early_net/

INDEXCARD, 6/16
 
CIM

CAD/CAM technology (computer-aided design and computer-aided manufacturing) was developed to support a manufacturing firm's functions related to design and production. Today it is widely recognized that the scope of computer applications must extend beyond design and production to include the business functions of the firm. The name given to this more comprehensive use of computers is computer-integrated manufacturing (CIM).

INDEXCARD, 7/16
 
Censorship of Online Content in China

During the Tiananmen Square massacre, reports and photos transmitted by fax machines gave notice of what was happening with only a short delay. The Chinese government has learned its lesson well and has "regulated" Internet access from the beginning. All Internet traffic into and out of China passes through a few gateways, a few entry points, thus making censorship a relatively easy task. Screened out are the web sites of organizations and media which express dissident viewpoints: Taiwan's Democratic Progress Party and Independence Party, The New York Times, CNN, and sites dealing with Tibetan independence and human rights issues.

Users are expected not to "harm" China's national interests and therefore have to apply for permission for Internet access; Web pages have to be approved before being published on the Net. For the development of measures to monitor and control Chinese content providers, China's state police have joined forces with the MIT.

For further information on Internet censorship, see Human Rights Watch, World Report 1999.

http://www.dpp.org/
http://www.nytimes.com/
http://www.hrw.org/worldreport99/special/inte...
INDEXCARD, 8/16
 
Joseph Stalin

Joseph Stalin (1879-1953):
After Lenin's death he took over and became a dictator with unlimited power. Everyone who dared to talk or act against him, or was suspected of doing so, was killed; millions were murdered. His empire was one built on propaganda and fear. As long as he was in power his picture had to hang in every flat and office. Soon after his death the cult was stopped, and in 1956 de-Stalinization began, though he was partly rehabilitated in 1970.

INDEXCARD, 9/16
 
Reuters Group plc

Founded in 1851 in London, Reuters is the world's largest news and television agency with 1,946 journalists, photographers and camera operators in 183 bureaus serving newspapers, other news agencies, and radio and television broadcasters in 157 countries.
In addition to its traditional news-agency business, over its network Reuters provides financial information and a wide array of electronic trading and brokering services to banks, brokering houses, companies, governments, and individuals worldwide.

http://www.reuters.com

INDEXCARD, 10/16
 
Disney

American corporation that became the best-known purveyor of child and adult entertainment in the 20th century. Its headquarters are in Burbank, Calif. The company was founded in 1929 and produced animated motion-picture cartoons.
In 1955 the company opened the Disneyland amusement park, one of the world's most famous. Under a new management, in the 1980s, Disney's motion-picture and animated-film production units became among the most successful in the United States. In 1996 the Disney corporation acquired Capital Cities/ABC Inc., which owned the ABC television network. The Disney Company also operates the Disney Channel, a pay television programming service.

INDEXCARD, 11/16
 
Augustus

Gaius Julius Caesar Octavian Augustus (63 BC - 14 AD) was adopted by Julius Caesar and became the first Roman Emperor. While he was very successful in military affairs abroad, he tried to bring back law and order to the Roman population. He was most interested in arts and philosophy.

INDEXCARD, 12/16
 
The Spot

http://www.thespot.com/

INDEXCARD, 13/16
 
Josef Goebbels

Josef Goebbels (1897-1945) was Hitler's Minister for Propaganda and Public Enlightenment. He had unlimited influence on the press, the radio, movies and all kinds of literary work in the whole Reich. In 1944 he was given full authority over the Total War effort. At the same time he was one of the most faithful followers of Hitler, and he followed him into death in 1945.

INDEXCARD, 14/16
 
blowfish encryption algorithm

Blowfish is a symmetric-key block cipher with a variable key length.
The idea behind it is a simple design that makes the system faster than others.
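A minimal usage sketch, assuming the third-party PyCryptodome package (an assumption, not mentioned above), which includes a Blowfish implementation:

```python
# Encrypt and decrypt a short message with Blowfish in CBC mode,
# assuming the PyCryptodome package (pip install pycryptodome).
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

key = get_random_bytes(16)            # Blowfish keys may be 4 to 56 bytes long
cipher = Blowfish.new(key, Blowfish.MODE_CBC)
ciphertext = cipher.encrypt(pad(b"a short secret message", Blowfish.block_size))

decrypter = Blowfish.new(key, Blowfish.MODE_CBC, iv=cipher.iv)
plaintext = unpad(decrypter.decrypt(ciphertext), Blowfish.block_size)
print(plaintext)                      # b'a short secret message'
```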

http://www.counterpane.com/blowfish.html
http://www.counterpane.com/bfsverlag.html

INDEXCARD, 15/16
 
Viacom

One of the largest and foremost communications and media conglomerates in the world. Founded in 1971, the present form of the corporation dates from 1994, when Viacom Inc., which owned radio and television stations and cable television programming services and systems, acquired the entertainment and publishing giant Paramount Communications Inc. and then merged with the video and music retailer Blockbuster Entertainment Corp. Headquarters are in New York City.

INDEXCARD, 16/16