Biometrics applications: physical access

This is the largest area of application of biometric technologies, and the one with the most direct lineage to the feudal gate-keeping system. Initially used mainly in military and other "high security" territories, physical access control by biometric technology is spreading into a much wider field of application. Biometric access control technologies are already being used in schools, supermarkets, hospitals and commercial centres, where they are used to manage the flow of personnel.

Biometric technologies are also used to control access to political territory, as in immigration (airports, Mexico-USA border crossing). In this case, they can be coupled with camera surveillance systems and artificial intelligence in order to identify potential suspects at unmanned border crossings. Examples of such uses in remote video inspection systems can be found at http://www.eds-ms.com/acsd/RVIS.htm

A gate keeping system for airports relying on digital fingerprint and hand geometry is described at http://www.eds-ms.com/acsd/INSPASS.htm. This is another technology which allows "low risk" travellers to be separated from "other" travellers.

An electronic reconstruction of feudal gate keeping capable of singling out high-risk travellers from the rest is already applied at various border crossing points in the USA. "All enrolees are compared against national lookout databases on a daily basis to ensure that individuals remain low risk". As a side benefit, the economy of time generated by the inspection system has meant that "drug seizures ... have increased since Inspectors are able to spend more time evaluating higher risk vehicles".

However, biometric access control systems can not only prevent people from gaining access to a territory or building, they can also prevent them from getting out of buildings, as in the case of prisons.

TEXTBLOCK 1/26 // URL: http://world-information.org/wio/infostructure/100437611729/100438658838
 
Biometrics applications: gate keeping

Identity has to do with "place". In less mobile societies, the place where a person finds him/herself tells us something about his/her identity. In pre-industrial times, gatekeepers had the function of controlling people's access to particular places, i.e. the gatekeeper's function was to identify people and then decide whether somebody's identity would allow that person to physically occupy another place - a town, a building, a vehicle, etc.

In modern societies, the unambiguous nature of place has been weakened. There is a great amount of physical mobility, and ever since the emergence and spread of electronic communication technologies there has been a "virtualisation" of places in what today we call "virtual space" (unlike place, space has been a virtual reality from the beginning, a mathematical formula). The question as to who one is is no longer coupled to a physical abode. Highly mobile and virtualised social contexts require a new generation of gatekeepers, which biometric technology aims to provide.

TEXTBLOCK 2/26 // URL: http://world-information.org/wio/infostructure/100437611729/100438658757
 
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques nearly impossible.

Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).

So her statement is a slap in the face of all market researchers stating otherwise.
In a certain sense this is ridiculous, because ever since the inception of the ARPANet, the precursor of the Internet, network measurement has been an important task. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, in the name of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet. Measurement, he stated, mattered for two reasons. First, it would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet into a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
"There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", K. G. Coffman and Andrew, both members of different departments of AT & T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday. There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". They take a well-informed and refreshing look at efforts undertaken for measuring the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimations, to say the least. This is partly due to the fact that data are not disclosed by every carrier and only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and the growth of the Internet.
You get the clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the Intranets, because they are convinced (that means their assertion is put forward, but not accompanied by empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave."

Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity of each of the major US providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network; the private line networks are considerably larger in aggregate capacity than the Internet - about as large as the voice network in the U. S. (with an effective bandwidth of about 330 Gbps in December 1997) - but they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U. S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive mainly from people staying online longer and from multimedia applications; both consume more bandwidth and are the reason for unanticipated amounts of data traffic.
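
To see how such a crossover estimate works, here is a back-of-envelope sketch in Python. The 75 and 330 Gbps capacity figures are taken from the paragraph above; the roughly 10% annual growth assumed for the voice network is purely an illustrative assumption of mine - Coffman and Odlyzko argue from traffic figures, which is how they arrive at a crossover around 2002.

```python
# Compound-growth sketch: when does a small, fast-growing network
# overtake a large, slow-growing one? Capacity figures from the text;
# the 10% voice growth rate is an assumption for illustration only.
from math import log

internet, voice = 75.0, 330.0    # effective bandwidth, Gbps, December 1997
g_internet, g_voice = 2.0, 1.1   # ~100% per year vs an assumed ~10%

# Solve internet * g_internet**t == voice * g_voice**t for t.
t = log(voice / internet) / log(g_internet / g_voice)
print(f"crossover after about {t:.1f} years (counting from the end of 1997)")
# -> about 2.5 years; with actual traffic figures instead of capacity
#    figures the authors arrive at a crossover around 2002.
```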

Hosts

The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name were counted. At first sight it looks simple to get the accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. To find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia.) But doing this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and makes a projection onto all pingable hosts. That is ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data.
Apart from the small sample size, this method has at least one flaw: ISC's researchers only take network numbers into account that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
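
A minimal sketch of the sampling logic described above - not ISC's actual survey code - could look like this; ping_host() simply shells out to the ordinary Unix ping command:

```python
# Estimate the number of reachable hosts by pinging a 1% random sample
# of all IP addresses that have been assigned a name, then projecting
# the sample's response rate onto the whole list.
import random
import subprocess

def ping_host(ip: str) -> bool:
    """Send one ICMP echo request with a 1-second timeout (Linux ping syntax)."""
    result = subprocess.run(["ping", "-c", "1", "-W", "1", ip],
                            capture_output=True)
    return result.returncode == 0

def estimate_hosts(named_addresses: list[str], sample_rate: float = 0.01) -> int:
    sample_size = max(1, int(len(named_addresses) * sample_rate))
    sample = random.sample(named_addresses, sample_size)
    responding = sum(ping_host(ip) for ip in sample)
    # Projection: assume the sample is representative of all addresses.
    return int(len(named_addresses) * responding / len(sample))
```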

Internet Weather

Like daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence called Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, displaying traffic in values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e.g.) and to compare the response times with past ones and with the response times of servers in the same area.
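
As an illustration, a 0-100 index of this kind could be condensed from ping response times as in the following sketch; the scaling rule is invented here and is not the formula actually used by the Internet Traffic Report:

```python
# Compare current round-trip times against a historical baseline and
# condense them into a 0-100 "weather" index (high = fast and reliable).
def weather_index(current_ms: dict[str, float],
                  baseline_ms: dict[str, float]) -> int:
    scores = []
    for server, rtt in current_ms.items():
        base = baseline_ms.get(server)
        if base is None:
            continue  # no history for this server yet
        # 100 when as fast as the baseline, 0 when twice as slow or worse.
        scores.append(max(0.0, min(100.0, 100.0 * (2.0 - rtt / base))))
    return round(sum(scores) / len(scores)) if scores else 0

# One server as fast as usual, one twice as slow -> index of 50.
print(weather_index({"a": 80.0, "b": 240.0}, {"a": 80.0, "b": 120.0}))
```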

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed.
For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate.
In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers - and the corresponding IP addresses and host names - with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
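
The difference between hits and page views is easy to see in code. The following sketch counts both from a web server access log in the standard Common Log Format (the file name access.log is made up):

```python
# Count raw hits (every requested file) versus page views (HTML
# documents only) in a Common Log Format access log.
import re

REQUEST = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

hits, page_views = 0, 0
with open("access.log") as log:
    for line in log:
        match = REQUEST.search(line)
        if not match:
            continue
        hits += 1  # every file counts: images, stylesheets, text, ...
        path = match.group("path").split("?")[0]
        if path.endswith((".html", ".htm", "/")):
            page_views += 1  # only documents count

print(f"{hits} hits, but only {page_views} page views")
```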

Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: for example, the importance of a column to its readers. Readers may read a journal just for a special column and not care about the journal's other contents. Deleting this column because it does not receive enough visits may cause these readers to turn their backs on the journal.
More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, e.g., establish registration services, which require password authentication and therefore prove to be a kind of access obstacle.
But there is a different reason for these services. For content providers users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to get to know more about the users accessing their sites. On-line registration forms or WWW user surveys are obvious methods of collecting additional data, sure. But you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly use data capturing. As with registration services, this is where cookies come into play.

If you like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.

Measuring the Density of IP Addresses

Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html).
Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.

TEXTBLOCK 3/26 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
 
Global data bodies - intro

- Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity ...

Critical Art Ensemble

Global data bodies

1. Introduction

Informatisation has meant that things that once were "real", i.e. whose existence could be experienced sensually, are becoming virtual. Instead of the real existence of a thing, the virtual refers to its possibility of existence. As this process advances, an increasing identification of the possible with the real occurs. Reality migrates into a dim and dematerialised grey area. In the end, the possible counts as the real; virtualisation creates an "as-if" experience.

The experience of the body is also affected by this process. For example, in bio-technology, the human body and its functions are digitised, which prepares an understanding of the body exclusively in terms of its potential manipulation: the body becomes whatever it could be. But digitisation has not only affected the understanding and the social significance of the body, it has also altered the meaning of presence, traditionally identified with the body. The advance of information and communication technologies (ICTs) has meant that for an increasing number of activities we no longer need to be physically present; our "virtual" presence, achieved by logging onto an electronic information network, is sufficient.

This development, trumpeted as the pinnacle of convenience by the ICT industries and governments interested in attracting investment, has deeply problematic aspects as well. For example, when it is no longer "necessary" to be physically present, it may soon no longer be possible or allowed. Online banking, offered to customers as a convenience, also serves as a justification for charging higher fees to those unwilling or unable to add banking to their household chores. Online public administration may be expected to lead to similar effects. The reason for this is that the digitalisation of the economy relies on the production of surplus data. Data have become the most important raw material of modern economies.

In modern economies, informatisation and virtualisation mean that people are structurally forced to carry out their business and live their lives in such a way as to generate data.

Data are the most important resource for the New Economy. By contrast, activities which do not leave behind a trace of data, as for example growing your own carrots or paying cash rather than by plastic card, are discouraged and structurally suppressed.

TEXTBLOCK 4/26 // URL: http://world-information.org/wio/infostructure/100437611761/100438659649
 
World War II ...

Never before had propaganda been as important as in the Second World War. From then on education was one more field of propaganda: its purpose was to teach how to think, while pure propaganda was supposed to show what to think.
Every nation founded at least one ministry of propaganda - without, of course, calling it that. The British called it the Ministry of Information (= MOI), the U.S. distinguished between the Office of Strategic Services (= OSS) and the Office of War Information (= OWI), the Germans created a Ministry of Public Enlightenment and Propaganda (= RMVP) and the Japanese called their disinformation and propaganda campaign the "Thought War".
British censorship was so strict that the text of an ordinary propaganda leaflet, of which several million copies had been dropped from planes, was not given to a journalist who asked for it.

Atrocity stories were no longer used the same way as in the First World War. Instead, black propaganda was preferred, especially to separate the Germans from their leaders.
German war propaganda had started long before the war. In the middle of the 1930s Leni Riefenstahl filmed Hitler's best propaganda movies. For the most famous one, "Triumph of the Will" (1935), she was the only professional filmmaker allowed to take close-up pictures of her admirer.

Some of those pictures of fear, hatred and intolerance still exist in people's heads. Considering this, propaganda did a good job; unfortunately, it was the anti-National Socialist propaganda that failed at the time.

TEXTBLOCK 5/26 // URL: http://world-information.org/wio/infostructure/100437611661/100438658610
 
Hill & Knowlton

Although it is generally hard to distinguish between public relations and propaganda, Hill & Knowlton, the world's leading PR agency, represents an extraordinary example of the manipulation of public opinion through public relations activities. Hill & Knowlton not only lobbied for countries accused of human rights abuses, like China, Peru, Israel, Egypt and Indonesia, but also represented the repressive Duvalier regime in Haiti.

It furthermore played a central role in the Gulf War. On behalf of the Kuwaiti government it presented a 15-year-old girl to testify before Congress about human rights violations in a Kuwaiti hospital. The girl, later found out to be the daughter of Kuwait's ambassador to the U.S., and her testimony then became the centerpiece of a finely tuned PR campaign orchestrated by Hill & Knowlton and co-ordinated with the White House on behalf of the government of Kuwait and the Citizens for a Free Kuwait group. Inflaming public opinion against Iraq and bringing the U.S. Congress in favor of war in the Gulf, this was probably one of the largest and most effective public relations campaigns in history.

Running campaigns against abortion for the Catholic Church and representing the Church of Scientology, large PR firms like Hill & Knowlton scarcely hesitate to manipulate public and congressional opinion and government policy through media campaigns, congressional hearings, and lobbying, when necessary. Co-operation with intelligence agencies also seems not to be unknown to Hill & Knowlton.

Accused of pursuing a potentially illegal proxy spying operation for intelligence agencies, Richard Cheney, head of Hill & Knowlton's New York office, denied these allegations, but said that "... in such a large organization you never know if there's not some sneak operation going on." On the other hand, former CIA official Robert T. Crowley acknowledged that "Hill & Knowlton's overseas offices were perfect 'cover` for the ever-expanding CIA. Unlike other cover jobs, being a public relations specialist did not require technical training for CIA officers." Furthermore, the CIA, Crowley admitted, used its Hill & Knowlton connections to "... put out press releases and make media contacts to further its positions. ... Hill & Knowlton employees at the small Washington office and elsewhere distributed this material through CIA assets working in the United States news media."

(Source: Carlisle, Johan: Public Relationships: Hill & Knowlton, Robert Gray, and the CIA. http://mediafilter.org/caq/)

TEXTBLOCK 6/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438658088
 
Basics: Introduction

Copyright law is a branch of intellectual property law and deals with the rights of intellectual creators in their works. The scope of copyright protection as laid down in Article 2 of the 1996 WIPO Copyright Treaty "... extends to expressions and not to ideas, procedures, methods of operation or mathematical concepts as such." Copyright law protects the creativity concerning the choice and arrangement of words, colors, musical notes etc. It grants the creators of certain specified works exclusive rights relating to the "copying" and use of their original creation.


TEXTBLOCK 7/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659520
 
Feeding the data body

TEXTBLOCK 8/26 // URL: http://world-information.org/wio/infostructure/100437611761/100438659644
 
Intellectual Property: A Definition

Intellectual property, very generally, relates to the output that results from intellectual activity in the industrial, scientific, literary and artistic fields. Traditionally intellectual property is divided into two branches:

1) Industrial Property

a) Inventions
b) Marks (trademarks and service marks)
c) Industrial designs
d) Unfair competition (trade secrets)
e) Geographical indications (indications of source and appellations of origin)

2) Copyright

The protection of intellectual property is guaranteed through a variety of laws, which grant the creators of intellectual goods and services certain time-limited rights to control the use made of their products. Those rights apply to the intellectual creation as such, and not to the physical object in which the work may be embodied.

TEXTBLOCK 9/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659434
 
Internet, Intranets, Extranets, and Virtual Private Networks

With the rise of networks and the corresponding decline of mainframe services, computers have become communication devices instead of solely computational or typewriter-like devices. Corporate networks have become increasingly important and often use the Internet as a public service network to interconnect. Sometimes they are proprietary networks.

Software companies, consulting agencies, and journalists serving their interests draw further distinctions by splitting up the easily understandable term "proprietary networks" into terms that need explaining, and speak of Intranets, Extranets, and Virtual Private Networks.

Cable TV networks and online services such as Europe Online, America Online, and Microsoft Network are also proprietary networks. Although their services resemble Internet services, they offer an alternative telecommunication infrastructure with access to Internet services for their subscribers.
America Online is selling its service under the slogan "We organize the Web for you!" Such promises are more frightening than promising, because "organizing" is increasingly equated with the "filtering" of seemingly objectionable messages and the "rating" of content. For more information on these issues, click here. If you want to know more about the technical nature of computer networks, here is a link to the corresponding article in the Encyclopaedia Britannica.

Especially for financial transactions, secure proprietary networks are becoming increasingly important. When you transfer funds from your bank account to an account in another country, it is done through the SWIFT network, the network of the Society for Worldwide Interbank Financial Telecommunication (SWIFT). According to SWIFT, in 1998 the average daily value of payment messages was estimated to be above US$ 2 trillion.

Electronic Communications Networks such as Instinet force stock exchanges to redefine their positions in the trading of equities. They offer faster trading at reduced costs and better prices on trades for brokers and institutional investors such as mutual funds and pension funds. Last but not least, clients are not restricted to trading hours and can trade anonymously and directly, thereby bypassing stock exchanges.

TEXTBLOCK 10/26 // URL: http://world-information.org/wio/infostructure/100437611791/100438658384
 
Timeline Cryptography - Introduction

Besides oral conversation and written language many other ways of transporting information are known: the bush telegraph, drums, smoke signals etc. Those methods are not cryptography, yet they still need encoding and decoding, which means that the history of language, the history of communication and the history of cryptography are closely connected to each other.
The timeline gives an insight into the endless fight between enciphering and deciphering. The reasons for both can be found in public and private issues at the same time, though they are mostly connected to military maneuvers and/or political tasks.

One of the most important researchers on cryptography through the centuries is David Kahn; many parts of the following timeline originate from his work.

TEXTBLOCK 11/26 // URL: http://world-information.org/wio/infostructure/100437611776/100438658824
 
Challenges for Copyright by ICT: Internet Service Providers

ISPs (Internet Service Providers) (and to a certain extent also telecom operators) are involved in the copyright debate primarily because of their role in the transmission and storage of digital information. Problems arise particularly concerning caching, information residing on systems or networks of ISPs at the direction of users, and transitory communication.

Caching

Caching, it is argued, could cause damage because the copies in the cache are not necessarily the most current ones, and the delivery of outdated information to users could deprive website operators of accurate "hit" information (information about the number of requests for a particular material on a website), from which advertising revenue is frequently calculated. Similarly, harms such as defamation or infringement that existed on the original page may propagate for years until flushed from each cache in which they have been replicated.

Although they are different concepts, issues similar to those of caching arise with mirroring (establishing an identical copy of a website on a different server), archiving (providing a historical repository for information, such as with newsgroups and mailing lists), and full-text indexing (the copying of a document for loading into a full-text or nearly full-text database which is searchable for keywords or concepts).

Under a literal reading of some copyright laws caching constitutes an infringement of copyright. Yet recent legislation like the DMCA or the proposed EU Directive on copyright and related rights in the information society (amended version) has provided exceptions for ISPs concerning particular acts of reproduction that are considered technical copies (caching). Nevertheless, the exemption of liability for ISPs only applies if they meet a variety of specific conditions. In the course of the debate about caching, suggestions have also been made to subject it to an implied license or fair use defense, or to make it (at least theoretically) actionable.

Information Residing on Systems or Networks at the Direction of Users

ISPs may be confronted with problems if infringing material on websites (of users) is hosted on their systems. Although some copyright laws like the DMCA provide for limitations on the liability of ISPs if certain conditions are met, it is still unclear whether ISPs should generally be accountable for the storage of infringing material (even if they do not have actual knowledge of it) or whether exceptions should be established under specific circumstances.

Transitory Communication

In the course of transmitting digital information from one point on a network to another, ISPs act as a data conduit. If a user requests information, ISPs engage in its transmission, the provision of a connection, or its routing. In the case of a person sending infringing material over a network with the ISP merely providing the facilities for transmission, it is widely held that the ISP should not be liable for infringement. Yet some copyright laws like the DMCA provide for a limitation of liability (which also covers the intermediate and transient copies that are made automatically in the operation of a network) only if the ISP's activities meet certain conditions.

For more information on copyright (intellectual property) related problems of ISPs (BBSs (Bulletin Board Service Operators), systems operators and other service providers) see:

Harrington, Mark E.: On-line Copyright Infringement Liability for Internet Service Providers: Context, Cases & Recently Enacted Legislation. In: Intellectual Property and Technology Forum. June 4, 1999.

Teran, G.: Who is Vulnerable to Suit? ISP Liability for Copyright Infringement. November 2, 1999.

TEXTBLOCK 12/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659550
 
The 19th Century: First Programmable Computing Devices

Until the 19th century "early computers", probably better described as calculating machines, were basically mechanical devices operated by hand. Early calculators like the abacus worked with a system of sliding beads arranged on a rack, and the centerpiece of Leibniz's multiplier was a stepped-drum gear design.

Therefore Charles Babbage's proposal of the Difference Engine (1822), which would have performed calculations and printed the results automatically (it was never completed), was a major breakthrough, as it for the first time suggested the automation of computing. The construction of the Difference Engine, which was to compute mathematical tables by the method of finite differences, was inspired by Babbage's idea of applying the ability of machines to the needs of mathematics. Machines, he noted, were best at performing tasks repeatedly without mistakes, while mathematics often required the simple repetition of steps.

After working on the Difference Engine for ten years Babbage was inspired to build another machine, which he called the Analytical Engine. Its invention was a major step towards the design of modern computers, as it is considered the first conception of a general-purpose computer. Instrumental to the machine's design was his assistant, Augusta Ada King, Countess of Lovelace, the first female computer programmer.

The second major breakthrough in the design of computing machines in the 19th century may be attributed to the American inventor Herman Hollerith. He was concerned with finding a faster way to compute the U.S. census, which in 1880 had taken nearly seven years. Hollerith invented a method that used punched cards to store data, which he fed into a machine that compiled the results automatically. The punched cards not only served as a storage method and helped reduce computational errors, but also significantly increased speed.

Of extraordinary importance for the evolution of digital computers and artificial intelligence have furthermore been the contributions of the English mathematician and logician George Boole. In his postulates concerning the Laws of Thought (1854) he started to theorize about the true/false nature of binary numbers. His principles make up what today is known as Boolean algebra, the collection of logic concerning the AND, OR and NOT operators, on which computer switching theory and procedures are grounded. Boole also assumed that the human mind works according to these laws: it performs logical operations that can be reasoned about. Ninety years later Boole's principles were applied to circuits, the blueprint for electronic computers, by Claude Shannon.
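
As a small illustration of the algebra that Shannon later mapped onto circuits, the truth tables of the three basic operators can be enumerated in a few lines of Python (the language is of course mine, not Boole's):

```python
# Truth tables of the Boolean operators on which switching theory rests.
from itertools import product

print("a b | a AND b | a OR b | NOT a")
for a, b in product((0, 1), repeat=2):
    print(f"{a} {b} |    {a & b}    |   {a | b}   |   {1 - a}")
```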

TEXTBLOCK 13/26 // URL: http://world-information.org/wio/infostructure/100437611663/100438659426
 
Private data bunkers

On the other side are the data bunkers of the private sector, whose position is different. Although these are fast-growing engines of data collection with a much greater degree of dynamism, they may not have the same privileged position - although one has to differentiate among the general historical and social conditions into which a data bunker is embedded. For example, it can safely be assumed that the databases of a large credit card company or bank are better protected than those of the bureaucracies of small developing countries.

Private data bunkers include

    Banks

    Building societies

    Credit bureaus

    Credit card companies

    Direct marketing companies

    Insurance companies

    Telecom service providers

    Mail order stores

    Online stores


TEXTBLOCK 14/26 // URL: http://world-information.org/wio/infostructure/100437611761/100438659735
 
Data bunkers

Personal data are collected, appropriated, processed and used for commercial purposes on a global scale. In order for such a global system to operate smoothly, there are server nodes at which the data streams converge. Among the foremost of these are the databases of credit card companies, whose operation has long depended on global networking.

At the top are credit card companies such as Visa, American Express, MasterCard, and others. It would be erroneous to believe that the primary business of these companies is the provision of credit and the facilitation of credit information for sale transactions. In fact, information means much more than just credit information. In an advertisement of 1982, American Express described itself in these terms: "Our product is information ... Information that charges airline tickets, hotel rooms, dining out, the newest fashions ... information that grows money funds, buys and sells equities ... information that pays life insurance annuities ... information that schedules entertainment on cable television and electronically guards houses ... information that changes kroners into guilders and figures tax rates in Bermuda ..."

Information has become something like the gospel of the New Economy, a doctrine of salvation - the lifeblood of society, as Bill Gates expresses it. But behind information there are always data that need to be generated and collected. Because of the critical importance of data to the economy, their possession amounts to power and their loss can cause tremendous damage. The data industry therefore locates its data warehouses behind fortifications that bar physical or electronic access. Such structures are somewhat like a digital reconstruction of the medieval fortress.

Large amounts of data are concentrated in fortress-like structures, in data bunkers. As the Critical Art Ensemble argue in Electronic Civil Disobedience: "The bunker is the foundation of homogeneity, and allows only a singular action within a given situation." All activities within a data bunker revolve around the same principle of calculation. Calculation is the predominant mode of thinking in data-driven societies, and it reaches its greatest density inside data bunkers. However, calculation is not a politically neutral activity, as it provides the rational basis - and therefore the formal legitimisation - of almost every decision taken. Data bunkers therefore have an essentially conservative political function, serving to maintain and strengthen the given social structures.

TEXTBLOCK 15/26 // URL: http://world-information.org/wio/infostructure/100437611761/100438659754
 
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe/unbreakable. To decipher a text means to go through many, sometimes nearly - but never really - endless attempts. For the computers of today it might take hundreds of years or even more to go through all possible codes, but still, in the end the code remains breakable. The much faster quantum computers will prove that one day.
Therefore the decision to choose a certain method of enciphering is finally a matter of trust.
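
The "hundreds of years" are simple arithmetic over the size of the key space. A sketch, assuming an illustrative rate of one billion keys per second (the rate is an assumption, not a benchmark):

```python
# Years needed to try every key of a given length at a fixed rate.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(key_bits: int, keys_per_second: float = 1e9) -> float:
    return 2 ** key_bits / keys_per_second / SECONDS_PER_YEAR

for bits in (40, 56, 128):
    print(f"{bits}-bit key: {years_to_exhaust(bits):.3g} years")
# 40-bit: minutes; 56-bit: a few years; 128-bit: ~1e22 years --
# breakable in principle, as the text says, but not in practice.
```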

For the average user of computers it is rather difficult to understand or even realize the dangers and/or the technological background of the electronic transmission of data. For the majority, thinking about one's own needs for encryption first of all means trusting others, the specialists, and relying on the information they provide.
The websites explaining the problems behind it (and also the articles and books concerning the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial elements of those articles can be understood, whereas the real background appears as an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, and that the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. The media reports mentioned often specialize in talking about the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing control over the money in one's bank account if someone steals the credit card number or other important financial data. The term "security", surely connected to those issues, is a completely different one from the one that is connected to privacy.
It is especially the latter that touches the main elements of democracy.

For the question of security see:
http://www-db.stanford.edu/pub/gio/CS99I/security.html

TEXTBLOCK 16/26 // URL: http://world-information.org/wio/infostructure/100437611776/100438658850
 
Late 1950s - Early 1960s: Second Generation Computers

An important change in the development of computers occurred in 1948 with the invention of the transistor. It replaced the large, unwieldy vacuum tube and as a result led to a reduction in the size of electronic machinery. The transistor was first applied to a computer in 1956. Combined with advances in magnetic-core memory, the use of transistors resulted in computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

Stretch by IBM and LARC by Sperry-Rand (1959) were the first large-scale machines to take advantage of transistor technology (and also used assembly language instead of the difficult machine language). Both, developed for atomic energy laboratories, could handle enormous amounts of data, but were still costly and too powerful for the business sector's needs. Therefore only two LARCs were ever installed.

Throughout the early 1960s there were a number of commercially successful computers (for example the IBM 1401) used in business, universities, and government, and by 1965 most large firms routinely processed financial information using computers. Decisive for the success of computers in business were the stored-program concept and the development of sophisticated high-level programming languages like FORTRAN (Formula Translator), 1956, and COBOL (Common Business-Oriented Language), 1960, that gave them the flexibility to be cost-effective and productive. The invention of second generation computers also marked the beginning of an entire branch, the software industry, and the birth of a wide range of new types of careers.

TEXTBLOCK 17/26 // URL: http://world-information.org/wio/infostructure/100437611663/100438659439
 
Online data capturing

Hardly a firm today can afford not to engage in electronic commerce if it does not want to be swept out of business by competitors. "Information is everything" has become something like the Lord's Prayer of the New Economy. But how do you get information about your customers online? Who are the people who visit a website, where do they come from, what are they looking for? How much money do they have, what might they want to buy? These are key questions for a company doing electronic business. Obviously not all of this information can be obtained by monitoring the online behaviour of web users, but there are always little gimmicks that, when combined with common tracking technologies, can help to get more detailed information about a potential customer. These are usually online registration forms, either required for entry to a site, or competitions, sometimes a combination of the two. Obviously, if you want to win that weekend trip to New York, you will provide your contact details.

The most common way of obtaining information about a user online is a cookie, a small identification record that the server stores in the user's browser and reads back on each visit. However, a cookie by itself is not sufficient to identify a user personally; it merely allows the server to recognise the same browser again. Only combined with other data extraction techniques, such as online registration, can a user be identified personally ("Register now to get the full benefit of xy.com. It's free!")
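
A minimal sketch of the mechanism (the cookie name visitor_id is hypothetical, and this is not any particular vendor's code): the server hands a new browser an opaque identification number and recognises it on every later request.

```python
# A tiny WSGI application that sets and reads a tracking cookie.
import uuid
from http.cookies import SimpleCookie
from wsgiref.simple_server import make_server

def app(environ, start_response):
    cookie = SimpleCookie(environ.get("HTTP_COOKIE", ""))
    headers = [("Content-Type", "text/plain")]
    if "visitor_id" in cookie:
        visitor_id = cookie["visitor_id"].value   # returning browser
    else:
        visitor_id = uuid.uuid4().hex             # first visit: assign an ID
        headers.append(("Set-Cookie", f"visitor_id={visitor_id}"))
    start_response("200 OK", headers)
    # The ID identifies the browser, not the person behind it.
    return [f"your anonymous id: {visitor_id}\n".encode()]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```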

But cookies record enough information to fine-tune advertising strategies according to a user's preferences and interests, e.g. by displaying certain commercial banners rather than others. For example, if a user is found to respond to a banner of a particular kind, he/she may find two of them at the next visit. Customizing the offers on a website to the particular user is part of one-to-one marketing, a type of direct marketing. But one-to-one marketing can go further than this. It can also offer different prices to different users. This was done by Amazon.com in September 2000, when first-time visitors were offered cheaper prices than regular customers.

One-to-one marketing can create very different realities that undermine traditional concepts of demand and supply. The ideal is a "frictionless market", where the differential between demand and supply is progressively eliminated. If a market is considered a structure within which demand / supply differentials are negotiated, this amounts to the abolition of the established notion of what a market is. Demand and supply converge, desire and its fulfilment coincide. In the end, there is profit without labour. However, such a structure is a hermetic structure of unfreedom.

It can only function when payment is substituted by credit, and the exploitation of labour power by the exploitation of data. In fact, in modern economies there is great pressure to increase spending on credit. Using credit cards and taking up loans generates a lot of data around a person's economic behaviour, while at the same time restricting the scope of social activity and increasing dependence. On the global level, the consequences of credit spirals can be observed in many of the developing countries that have had to abandon most of their political autonomy. As the data body economy advances, this is also the fate of people in western societies when they are structurally driven into credit spending. It shows that data bodies are not politically neutral.

The interrelation between data, profit and unfreedom is frequently overlooked by citizens and customers. Any company in a modern economy will apply data collecting strategies for profit, with dependence and unfreedom as a "secondary effect". The hunger for data has made IT companies eager to profit from e-business rather resourceful. "Getting to know the customer" - this is a catchphrase heard frequently, and it suggests that there are no limits to what a company may want to know about a customer. In large online shops such as amazon.com, where the customer's identity is accurately established by the practice of paying with credit cards, and all business happens online, it is easy for the company to accurately profile its customers.

But there are more advanced and effective ways of identification. The German company Sevenval has developed a way of customer tracking which works with "virtual domains". Every visitor to a website is assigned a 33-digit identification number which the browser understands as part of the www address, which will then read something like http://XCF49BEB7E97C00A328BF562BAAC75FB2.sevenval.com. Therefore this tracking method, which is advertised by Sevenval as a revolutionary method capable of tracking the exact and complete path of a user on a website, cannot simply be switched off. In addition, the method makes it possible for the identity of a user to travel with him/her when visiting one of the other companies linked to the site in question. As in the case of cookies, this tracking method by itself is not sufficient to identify a user personally. Such an identification only occurs once a customer pays with a credit card, or decides to participate in a draw, or voluntarily completes a registration form.

But there are much less friendly ways of extracting data from a user and feeding the data body. Less friendly means: these methods monitor users in situations where the latter are likely not to want to be monitored. Monitoring therefore takes place in a concealed manner. One of these monitoring methods are so-called web bugs. These are tiny graphics, no more than 1 x 1 pixel in size, and therefore invisible on a screen, capable of monitoring an unsuspecting user's e-mails or movements on a website. Leading corporations such as Barnes and Noble, eToys, Cooking.com, and Microsoft have all used web bugs in advertising campaigns. Richard Smith has compiled a web bugs FAQ site that contains detailed information and examples of web bugs in use.
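
The mechanism is simple enough to sketch: a server returns a 1 x 1 transparent GIF and logs who fetched it, from which page, and with which browser. This is an illustrative reconstruction, not code from any of the companies named above.

```python
# Serve a 1 x 1 transparent GIF and log each request -- the essence of
# a web bug. Real campaigns also attach tracking IDs in the image URL.
from http.server import BaseHTTPRequestHandler, HTTPServer

# The smallest valid transparent GIF (43 bytes).
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

class BugHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The log line is the whole point: requesting address, path
        # (which may carry an ID), referring page and browser type.
        print(self.client_address[0], self.path,
              self.headers.get("Referer"), self.headers.get("User-Agent"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("", 8000), BugHandler).serve_forever()
```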

Bugs monitoring users have also been packaged in seemingly harmless toys made available on the Internet. For example, Comet Systems offers cursor images which have been shown to collect user data and send them back to the company's server. These little images replace the customary white arrow of a mouse with a little image of a baseball, a cat, a UFO, etc., large enough to carry a bug collecting user information. The technology is offered as a marketing tool to companies looking for a "fun, new way to interact with their audience".

The cursor image technology relies on what is called a GUID (globally unique identifier). This is an identification number which is assigned to a customer at the time of registration, or when downloading a product. Many among the online community were alarmed when in 1999 it was discovered that Microsoft assigned GUIDs without their customers' knowledge. Following protests, the company was forced to change the registration procedure, assuring that under no circumstances would these identification numbers be used for tracking or marketing.

However, in the meantime, another possible infringement of user anonymity by Microsoft was discovered, when it was found out that MS Office documents, such as Word, Excel or Powerpoint files, contain a bug that is capable of tracking the documents as they are sent through the net. The bug sends information about the user who opens the document back to the originating server. A document that contains the bug can be tracked across the globe, through thousands of stopovers. A detailed description of the bug and how it works can be found at the Privacy Foundation's website. Also, there is an example of such a bug at the Privacy Center of the University of Denver.

Of course there are many other ways of collecting users' data and of creating and appropriating data bodies which can then be used for economic purposes. Indeed, as Bill Gates commented, "information is the lifeblood of business". The electronic information networks are becoming the new frontier of capitalism.

TEXTBLOCK 18/26 // URL: http://world-information.org/wio/infostructure/100437611761/100438659686
 
Enforcement: Copyright Management and Control Technologies

With the increased ease of reproducing and transmitting unauthorized copies of digital works over electronic networks, concerns have arisen among the copyright holder community. They fear a further growth of copyright piracy and demand adequate protection for their works. A development which started in the mid 1990s and addresses the copyright owners' apprehensions is the creation of copyright management systems. Technological protection for their works, the copyright industry argues, is necessary to prevent widespread infringement, thus giving them the incentive to make their works available online. In their view the ideal technology should be "capable of detecting, preventing, and counting a wide range of operations, including open, print, export, copying, modifying, excerpting, and so on." Additionally such systems could be used to maintain "records indicating which permissions have actually been granted and to whom".
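
What such a system amounts to can be sketched as a simple data structure. The following is a toy illustration of the two quoted functions - gating a list of operations per user and recording which permissions were granted and exercised - not a description of any real copyright management product:

```python
# A toy rights ledger: grant permissions, then detect, prevent and
# count operations on a protected work.
from collections import defaultdict

OPERATIONS = {"open", "print", "export", "copy", "modify", "excerpt"}

class RightsLedger:
    def __init__(self):
        self.grants = defaultdict(set)   # user -> permitted operations
        self.usage = defaultdict(int)    # (user, operation) -> count

    def grant(self, user: str, operation: str) -> None:
        assert operation in OPERATIONS
        self.grants[user].add(operation)

    def request(self, user: str, operation: str) -> bool:
        """Allow and count the operation if granted; refuse it otherwise."""
        allowed = operation in self.grants[user]
        if allowed:
            self.usage[(user, operation)] += 1
        return allowed

ledger = RightsLedger()
ledger.grant("alice", "open")
print(ledger.request("alice", "open"))    # True, and recorded
print(ledger.request("alice", "print"))   # False: never granted
```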

TEXTBLOCK 19/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659674
 
Legal Protection: WIPO (World Intellectual Property Organization)

Presumably the major player in the field of international intellectual property protection, and administrator of various multilateral treaties dealing with the legal and administrative aspects of intellectual property, is the WIPO.

Information on WIPO administered agreements in the field of industrial property (Paris Convention for the Protection of Industrial Property (1883), Madrid Agreement Concerning the International Registration of Marks (1891) etc.) can be found on: http://www.wipo.org/eng/general/index3.htm

Information on treaties concerning copyright and neighboring rights (Berne Convention for the Protection of Literary and Artistic Works (1886) etc.) is published on: http://www.wipo.org/eng/general/index5.htm

The most recent multilateral agreement on copyright is the 1996 WIPO Copyright Treaty. Among other things it provides that computer programs are protected as literary works and also introduces the protection of databases, which "... by reason of the selection or arrangement of their content constitute intellectual creations." Furthermore the 1996 WIPO Copyright Treaty contains provisions concerning technological measures, rights management information and establishes a new "right of communication to the public". It is available on: http://www.wipo.org/eng/diplconf/distrib/treaty01.htm

TEXTBLOCK 20/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659588
 
1500 - 1700 A.D.

1588
Agostino Ramelli's reading wheel

Agostino Ramelli designed a "reading wheel", which allowed browsing through a large number of documents without moving from one spot to another.

The device presented a large number of books - a small library - laid open on lecterns on a kind of Ferris wheel. It allowed skipping chapters and browsing through pages by turning the wheel to bring lectern after lectern before the eyes. Ramelli's reading wheel thus linked ideas and texts and is reminiscent of today's browsing software used to navigate the World Wide Web.

1597
The first newspaper is printed in Europe.

TEXTBLOCK 21/26 // URL: http://world-information.org/wio/infostructure/100437611796/100438659704
 
0 - 1400 A.D.

150
A smoke signals network covers the Roman Empire

The Roman smoke signal network consisted of towers within visible range of each other and had a total length of about 4500 kilometers. It was used for military signaling.
For a similar telegraph network in ancient Greece see Aeneas Tacticus' optical communication system.

About 750
In Japan block printing is used for the first time.

868
In China the world's first dated book, the Diamond Sutra, is printed.

1041-1048
In China moveable types made from clay are invented.

1088
First European medieval university is established in Bologna.

The first of the great medieval universities was established in Bologna. At the beginning universities predominantly offered a kind of do-it-yourself publishing service.

Books still had to be copied by hand and were so rare that possessing a copy of a widely desired book could qualify one for an invitation to a university. Holding a lecture amounted to reading a book aloud, like a priest reading from the Bible during services. Attending a lecture amounted to copying it word by word, so that you had your own copy of the book, thus enabling you to hold a lecture too.

For further details see History of the Idea of a University, http://quarles.unbc.edu/ideas/net/history/history.html

TEXTBLOCK 22/26 // URL: http://world-information.org/wio/infostructure/100437611796/100438659702
 
Legal Protection: TRIPS (Trade-Related Aspects of Intellectual Property Rights)

Another important multilateral treaty concerned with intellectual property rights is the TRIPS agreement, which was negotiated during the Uruguay Round and took effect with the establishment of the WTO in January 1995. It sets minimum standards for the national protection of intellectual property rights and procedures as well as remedies for their enforcement (enforcement measures include the potential for trade sanctions against non-complying WTO members). The TRIPS agreement has been widely criticized for its stipulation that biological organisms be subject to intellectual property protection. In 1999, 44 nations considered it appropriate to treat plant varieties as intellectual property.

The complete TRIPS agreement can be found on: http://www.wto.org/english/tratop_e/trips_e/t_agm1_e.htm

TEXTBLOCK 23/26 // URL: http://world-information.org/wio/infostructure/100437611725/100438659758
 
Advertising and the Media System

Media systems (especially broadcasting) can be classified in two different types:

Public Media Systems: Government control over broadcasting through ownership, regulation, and partial funding of public broadcasting services.

Private Media Systems: Ownership and control lie in the hands of private companies and shareholders.

Both systems can exist in various forms, according to the degree of control by governments and private companies, with mixed systems (public and private) as the third main kind.

Whereas public media systems are usually at least partially funded by governments, private broadcasting relies solely on advertising revenue. Still, even public media systems cannot exclude advertising as a source of revenue. Therefore both types are to a certain degree dependent on money coming in from advertisers.

This has consequences for the content provided by the media. As attracting advertisers becomes critically important, the interests of the advertising industry frequently play a dominant role in the structuring of content, and the creation of media environments favorable for advertising goods and services becomes more and more common.

TEXTBLOCK 24/26 // URL: http://world-information.org/wio/infostructure/100437611652/100438657942
 
Transparent customers. Direct marketing online

This process works even better on the Internet because of the latter's interactive nature. "The Internet is a dream to direct marketers", said Wil Lansing, CEO of the American retailer Fingerhut Companies. Many services require online registration, asking users to provide as much information about themselves as possible. In addition, the Internet is fast, cheap, and used by people who tend to be young and in search of something interesting.

Many websites are also equipped with user tracking technology that registers a user's behaviour and preferences during a visit. For example, user tracking technology is capable of identifying the equipment and software employed by a user, as well as movements on the website, links visited, etc. Normally such information is anonymous, but it can be personalised when it is coupled with online registration, or when personal identification has been obtained from other sources. Registration is often a prerequisite not just for obtaining a free web mail account, but also for other services, such as personalised start pages. Based on the information provided by the user, the start page will then include advertisements and commercial offers that correspond to the user's profile, or to the user's activity on the website.
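
A minimal Python sketch of the tracking logic just described - the cookie ID, field names and sample values are invented for illustration and do not come from any actual tracking product:

```python
# Hypothetical sketch: tie together equipment data (User-Agent) and the pages
# visited under one per-visitor cookie ID; the profile stays anonymous until
# it is coupled with registration data.
from collections import defaultdict

profiles = defaultdict(lambda: {"user_agent": None, "pages": [], "identity": None})

def track_request(cookie_id, user_agent, path):
    """Record one page view for the visitor identified by cookie_id."""
    profile = profiles[cookie_id]
    profile["user_agent"] = user_agent    # equipment and software employed
    profile["pages"].append(path)         # movements on the website

def personalise(cookie_id, registration_data):
    """Couple the anonymous clickstream with online registration data."""
    profiles[cookie_id]["identity"] = registration_data

track_request("c42", "Mozilla/4.0 (Windows 98)", "/sports/results")
track_request("c42", "Mozilla/4.0 (Windows 98)", "/travel/offers")
personalise("c42", {"name": "J. Doe", "email": "jdoe@example.com"})
print(profiles["c42"])
```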

One frequent way of obtaining such personal information is the free web mail account offered by a great many companies, internet providers and web portals (e.g. Microsoft, Yahoo, Netscape and many others). In most cases, users get "free" accounts in return for submitting personal information and agreeing to receive marketing mails. Free web mail accounts are a simple and effective direct marketing and data capturing strategy which is, however, rarely understood as such. The alliances formed between direct advertising and marketing agencies on the one hand, and web mail providers on the other, such as the one between DoubleClick and Yahoo, show the common logic of data capturing and direct marketing. DoubleClick eventually attracted the largest US direct marketing agency, Abacus Direct, which it ended up buying.

However, the project of collecting users' personal data and creating consumer profiles based on online behaviour can also take on more creative and playful forms. One such example is sixdegrees.com, a networking site based on the assumption that everybody on the planet is connected to everybody else by a chain of at most six people. The site offers users the chance to get to know a lot of new people - the friends of their friends of their friends, for example, and if they try hard enough, eventually Warren Beatty or Claudia Schiffer. But of course, to make the whole game more useful for marketing purposes, users are encouraged to join groups which share common interests - groups which are identical with marketing categories ranging from arts and entertainment to travel and holiday. Evidently, the game becomes more interesting the more new people a user brings into the network. What seems to be fun for the 18 to 24 year old college student segment targeted by sixdegrees is, of course, real business: while users entertain themselves, they are being carefully profiled. After all, data on young people who can be expected to be relatively affluent one day are worth more than money.

The particular way in which sites such as sixdegrees.com are structured means not only that users provide initial information about themselves, but also that this information is constantly updated and therefore becomes even more valuable. Consequently, many free online services or web mail providers cancel a user's account if it has not been used for some time.

There are also other online services which offer free services in return for personal information that is then used for marketing purposes, e.g. Yahoo's Geocities, where users may maintain their own free websites, or Bigfoot, where people are offered a free e-mail address for life that acts as a relay whenever a customer's residence or e-mail address changes. In this way, of course, marketers can identify friendship and other social networks, and turn this knowledge into a marketing advantage. People finders such as WhoWhere? operate along similar lines.

A further way of collecting consumer data that has recently become popular is offering free PCs. Users are provided with a PC for free or for very little money, and in return commit themselves to using certain services rather than others (e.g. a particular internet provider), to providing information about themselves, and to having their online behaviour monitored by the company providing the PC, so that accurate user profiles can be compiled. For example, the Free PC Network offers advertisers user profiles containing "over 60 individual demographics". There are literally thousands of variations on how a user's data are extracted and commercialised when online. Usually this happens quietly in the background.

A good inside view of the world of direct marketing can be gained at the websites of the American Direct Marketing Association and the Federation of European Direct Marketing.

TEXTBLOCK 25/26 // URL: http://world-information.org/wio/infostructure/100437611761/100438659667
 
Timeline 1600 - 1900 AD

17th century Cardinal Richelieu invents an encryption tool called the grille: a card with holes through which a message is written onto a sheet of paper. Afterwards the card is removed and the blanks are filled in, so the message looks like an ordinary letter. The recipient needs to own an identical card.

- Bishop John Wilkins invents a cryptologic system that looks like musical notes. In a book he describes several forms of steganographic systems, such as secret inks, but also the string cipher. He mentions the so-called Pig Latin, a spoken form of encryption that had already been used by the ancient Indians

- the English scientist, magician and astrologer John Dee works on the ancient Enochian alphabet; he also possesses an encrypted writing that has not been broken to this day

1605/1623 Sir Francis Bacon (= Francis Tudor = William Shakespeare?) writes several works containing ideas about cryptography. One of his most important pieces of advice is to use ciphers in such a way that no-one suspects the text could be enciphered. For this the steganogram was the best method, very often used in poems. The attempt (in the 20th century) to decipher Shakespeare's sonnets led to the idea that his works had originally been written by Francis Bacon.

1671 Leibniz invents a calculating machine; he also develops the binary number system which, in far more advanced forms - for instance in the binary encoding underlying the ASCII character code - is still in use today

18th century this is the time of the Black Chambers of espionage in Europe, Vienna having one of the most effective ones, called the "Geheime Kabinettskanzlei", headed by Baron Ignaz von Koch. Its task is to read through international diplomatic mail, copy letters and return them to the post-office the same morning. Supposedly about 100 letters are dealt with each day.

1790s Thomas Jefferson and Robert Patterson invent a wheel cipher

1799 the Rosetta Stone is found and makes it possible to decipher the Egyptian Hieroglyphs

1832 or 1838 Sam Morse develops the Morse Code, which is actually not a code but an enciphered alphabet of short and long signals. The first Morse-code message is sent by telegraph in 1844.

1834 the Braille Code for blind people is developed in today's form by Louis Braille

1844 the spread of the telegraph changes cryptography profoundly, as codes now become absolutely necessary

1854 the Playfair cipher is invented by Sir Charles Wheatstone

1859 a tomographic cipher is described for the first time

1861 Friedrich W. Kasiski publishes a cryptanalysis of the Vigenère cipher, which had been considered unbreakable for centuries

1891 Major Etienne Bazeries creates a new version of the wheel cipher, which is rejected by the French Army

1895 the invention of the radio changes the tasks of cryptography once again and makes them even more important

TEXTBLOCK 26/26 // URL: http://world-information.org/wio/infostructure/100437611776/100438658974
 
Caching

Caching generally refers to the process of making an extra copy of a file or a set of files for more convenient retrieval. On the Internet caching of third party files can occur either locally on the user's client computer (in the RAM or on the hard drive) or at the server level ("proxy caching"). A requested file that has been cached will then be delivered from the cache rather than a fresh copy being retrieved over the Internet.
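
A minimal sketch of this logic in Python - the cache directory and the hashed file names are illustrative assumptions, not how any particular browser or proxy actually organises its cache:

```python
# Serve a file from the local cache when a copy exists; otherwise retrieve a
# fresh copy over the Internet and store it for later requests.
import urllib.request
from hashlib import sha256
from pathlib import Path

CACHE_DIR = Path("cache")
CACHE_DIR.mkdir(exist_ok=True)

def fetch(url):
    """Return the body of url, from the cache if we already hold a copy."""
    cached = CACHE_DIR / sha256(url.encode()).hexdigest()
    if cached.exists():                         # cache hit: no network traffic
        return cached.read_bytes()
    data = urllib.request.urlopen(url).read()   # cache miss: fetch over the Internet
    cached.write_bytes(data)                    # keep the extra copy
    return data
```

Real caches additionally honour expiry headers and size limits, so a cached copy is not kept forever.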

INDEXCARD, 1/30
 
IBM

IBM (International Business Machines Corporation) manufactures and develops computer hardware equipment, application and system software, and related equipment.

IBM produced the first PC (Personal Computer), and its decision to make Microsoft DOS the standard operating system initiated Microsoft's rise to global dominance in PC software.

Business indicators:

1999 Sales: $ 86,548 million (+ 7.2 % from 1998)

Market capitalization: $ 181 bn

Employees: approx. 291,000

Corporate website: www.ibm.com

INDEXCARD, 2/30
 
Computer programming language

A computer programming language is any of various languages for expressing a set of detailed instructions for a digital computer. Such a language consists of characters and rules for combining them into symbols and words.

INDEXCARD, 3/30
 
R.J. Reynolds

American manufacturer of tobacco products. The origins of the R.J. Reynolds Tobacco Company date to the post-Civil War era, when Richard Joshua Reynolds (1850-1918) began trading in tobacco, first in his native Virginia and then in Winston, N.C., where in 1875 he established his first plug factory. The company began to diversify in the 1960s, acquiring chiefly food and oil concerns, and the tobacco concern became a subsidiary of R.J. Reynolds Industries, Inc., in 1970.

INDEXCARD, 4/30
 
DES

The U.S. Data Encryption Standard (= DES) is the most widely used encryption algorithm, especially used for protection of financial transactions. It was developed by IBM in 1971. It is a symmetric-key cryptosystem. The DES algorithm uses a 56-bit encryption key, meaning that there are 72,057,594,037,927,936 possible keys.
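
The quoted figure follows directly from the key length, since each additional bit doubles the number of possible keys. A quick check in Python (the trial rate assumed below is purely hypothetical):

```python
# A 56-bit key gives 2**56 possible keys.
keys = 2 ** 56
print(keys)                                  # 72057594037927936, as stated above

# Assuming, hypothetically, one billion key trials per second, an exhaustive
# search needs on average half the key space:
seconds = keys / 2 / 1e9
print(seconds / (3600 * 24 * 365), "years")  # roughly 1.14 years
```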

for more information see:
http://www.britannica.com/bcom/eb/article/3/0,5716,117763+5,00.html
http://www.cryptography.com/des/

INDEXCARD, 5/30
 
Scientology

Official name Church Of Scientology, religio-scientific movement developed in the United States in the 1950s by the author L. Ron Hubbard (1911-86). The Church of Scientology was formally established in the United States in 1954 and was later incorporated in Great Britain and other countries. The scientific basis claimed by the church for its diagnostic and therapeutic practice is disputed, and the church has been criticized for the financial demands that it makes on its followers. From the 1960s the church and various of its officials or former officials faced government prosecutions as well as private lawsuits on charges of fraud, tax evasion, financial mismanagement, and conspiring to steal government documents, while the church on the other hand claimed it was being persecuted by government agencies and by established medical organizations. Some former Scientology officials have charged that Hubbard used the tax-exempt status of the church to build a profitable business empire.

INDEXCARD, 6/30
 
Digital Subscriber Line (DSL)

DSL connections are high-speed data connections over copper wire telephone lines. As with cable connections, with DSL you can look up information on the Internet and make a phone call at the same time but you do not need to have a new or additional cable or line installed. One of the most prominent DSL services is ISDN (integrated services digital network, for more information click here ( http://www.britannica.com/bcom/eb/article/4/0,5716,129614+15,00.html )).

INDEXCARD, 7/30
 
David Kahn

David Kahn can be considered one of the most important historians of cryptography. His book The Codebreakers: The Comprehensive History of Secret Communication from Ancient Times to the Internet, published in 1996, is regarded as the most important work on the history of cryptography.

INDEXCARD, 8/30
 
Eli Lilly & Company

Eli Lilly & Company discovers, develops, manufactures, and sells pharmaceutical and animal health care products. Research efforts are directed primarily towards discovering and developing products to diagnose and treat diseases in human beings and animals and to increase the efficiency of animal food production. Pharmaceutical products comprise neuroscience products, endocrine products and anti-infectives. Products are manufactured and distributed through owned or leased facilities in the United States, Puerto Rico, and 27 other countries and sold in approximately 160 countries.

INDEXCARD, 9/30
 
Newsgroups

Newsgroups are on-line discussion groups on the Usenet. Over 20,000 newsgroups exist, organized by subject into hierarchies. Each subject hierarchy is further broken down into subcategories. Covering an incredibly wide range of interests and used intensively every day, they are an important part of the Internet.

For more information, click here ( http://www.terena.nl/libr/gnrt/group/usenet.html ).

INDEXCARD, 10/30
 
Division of labor

The term refers to the separation of a work process into a number of tasks, with each task performed by a separate person or group of persons. It is most often applied to mass production systems, where it is one of the basic organizing principles of the assembly line. Breaking down work into simple, repetitive tasks eliminates unnecessary motion and limits the handling of tools and parts. The consequent reduction in production time and the ability to replace craftsmen with lower-paid, unskilled workers result in lower production costs and a less expensive final product. The Scottish economist Adam Smith saw in this splitting of tasks a key to economic progress by providing a cheaper and more efficient means of producing economic goods.

INDEXCARD, 11/30
 
Above.net

Headquartered in San Jose, USA, AboveNet Communications is a backbone service provider. Through its extensive peering relationships, the company has built a network with the largest aggregated bandwidth in the world.

http://www.above.net

INDEXCARD, 12/30
 
Assembly line

An assembly line is an industrial arrangement of machines, equipment, and workers for continuous flow of workpieces in mass production operations. An assembly line is designed by determining the sequences of operations for manufacture of each product component as well as the final product. Each movement of material is made as simple and short as possible with no cross flow or backtracking. Work assignments, numbers of machines, and production rates are programmed so that all operations performed along the line are compatible.

INDEXCARD, 13/30
 
Operating system

An operating system is software that controls the many different operations of a computer and directs and coordinates its processing of programs. It is a remarkably complex set of instructions that schedules the series of jobs (user applications) to be performed by the computer and allocates them to the computer's various hardware systems, such as the central processing unit, main memory, and peripheral systems. The operating system directs the central processor in the loading, storage, and execution of programs and in such particular tasks as accessing files, operating software applications, controlling monitors and memory storage devices, and interpreting keyboard commands. When a computer is executing several jobs simultaneously, the operating system acts to allocate the computer's time and resources in the most efficient manner, prioritizing some jobs over others in a process called time-sharing. An operating system also governs a computer's interactions with other computers in a network.
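
A toy Python sketch of the time-sharing idea mentioned above - the jobs and the time quantum are invented for illustration, and real schedulers are of course far more sophisticated:

```python
# Round-robin time-sharing: every job gets a fixed slice of processor time
# in turn; unfinished jobs rejoin the end of the queue.
from collections import deque

def round_robin(jobs, quantum):
    """jobs: list of (name, ticks_needed) pairs; quantum: ticks per turn."""
    queue = deque(jobs)
    while queue:
        name, remaining = queue.popleft()
        print(f"{name} runs for {min(quantum, remaining)} ticks")
        if remaining > quantum:              # not finished: back in line
            queue.append((name, remaining - quantum))

round_robin([("editor", 3), ("print spooler", 5), ("compiler", 2)], quantum=2)
```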

INDEXCARD, 14/30
 
Citicorp/Citibank

American holding company (formerly (1967-74) First National City Corporation), incorporated in 1967, with the City Bank of New York, National Association (a bank tracing to 1812), as its principal subsidiary. The latter's name changed successively to First National City Bank in 1968 and to Citibank, N.A. (i.e., National Association), in 1976. Citicorp was the holding company's popular and trade name from its inception but became the legal name only in 1974. Headquarters are in New York City.

INDEXCARD, 15/30
 
atbash

Atbash is regarded as the simplest form of encryption: nothing more than a reversed alphabet, with a=z, b=y, c=x and so on. Many different nations used it in the early times of writing.
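
A few lines of Python make the scheme concrete (the sample plaintext is arbitrary):

```python
# Atbash: map each letter to its mirror image in the alphabet.
import string

ABC = string.ascii_lowercase
ATBASH = str.maketrans(ABC, ABC[::-1])       # a->z, b->y, c->x, ...

print("attack at dawn".translate(ATBASH))    # zggzxp zg wzdm
```

Since the substitution is its own inverse, applying it a second time restores the original text.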

for further explanations see:
http://www.ftech.net/~monark/crypto/crypt/atbash.htm

INDEXCARD, 16/30
 
NATO

The North Atlantic Treaty was signed in Washington on 4 April 1949, creating NATO (= North Atlantic Treaty Organization), an alliance of 12 independent nations originally committed to each other's defense. Between 1952 and 1982 four more members were welcomed, and in 1999 the first ex-members of COMECON joined NATO (the Czech Republic, Hungary and Poland), bringing the total to 19 members. Around its 50th anniversary NATO changed its goals and tasks by intervening in the Kosovo crisis.

INDEXCARD, 17/30
 
Punch card, 1801

Invented by Joseph-Marie Jacquard, an engineer and architect in Lyon, France, punch cards laid the groundwork for automatic information processing. For the first time information was stored in binary format on perforated cardboard cards. In 1890 Hermann Hollerith used Jacquard's punch card technology to process statistical data collected during the US census, thus speeding up data analysis from eight years to three. Jacquard's invention was also used for programming computers and data processing until electronic data processing was introduced in the 1960s. - As with writing and calculating, administrative purposes account for the beginning of modern automatic data processing.

Paper tapes are a medium similar to Jacquard's punch cards. In 1857 Sir Charles Wheatstone applied them as a medium for the preparation, storage, and transmission of data for the first time. By their means, telegraph messages could be prepared off-line, sent ten times quicker (up to 400 words per minute), and stored. Later similar paper tapes were used for programming computers.

INDEXCARD, 18/30
 
CIM

CAD/CAM technology (computer-aided design and computer-aided manufacturing) was developed to support a manufacturing firm's functions related to design and production. Today it is widely recognized that the scope of computer applications must extend beyond design and production to include the business functions of the firm. The name given to this more comprehensive use of computers is computer-integrated manufacturing (CIM).

INDEXCARD, 19/30
 
Blue Box

The blue-box system works with a special blue-colored background. The person in front of it can appear to have been filmed anywhere - even in the middle of a war.
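
A toy sketch of the underlying compositing rule - the pixel values and the "blueness" test are invented for illustration, and real systems operate on full video frames:

```python
# Wherever the foreground pixel is predominantly blue (the backdrop),
# substitute the corresponding pixel of the background scene.
def is_blue(pixel, threshold=100):
    r, g, b = pixel
    return b > threshold and b > r and b > g

def composite(foreground, background):
    return [bg if is_blue(fg) else fg for fg, bg in zip(foreground, background)]

fg = [(200, 180, 90), (10, 20, 250), (10, 30, 240)]   # person plus blue backdrop
bg = [(0, 128, 0), (0, 128, 0), (0, 128, 0)]          # any scene whatsoever
print(composite(fg, bg))                              # backdrop pixels replaced
```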

INDEXCARD, 20/30
 
Memex Animation by Ian Adelman and Paul Kahn


INDEXCARD, 21/30
 
Blaise Pascal

b. June 19, 1623, Clermont-Ferrand, France
d. August 19, 1662, Paris, France

French mathematician, physicist, religious philosopher, and master of prose. He laid the foundation for the modern theory of probabilities, formulated what came to be known as Pascal's law of pressure, and propagated a religious doctrine that taught the experience of God through the heart rather than through reason. The establishment of his principle of intuitionism had an impact on such later philosophers as Jean-Jacques Rousseau and Henri Bergson and also on the Existentialists.

INDEXCARD, 22/30
 
Charles Babbage

b. December 26, 1791, London, England
d. October 18, 1871, London, England

English mathematician and inventor who is credited with having conceived the first automatic digital computer. The idea of mechanically calculating mathematical tables first came to Babbage in 1812 or 1813. Later he made a small calculator that could perform certain mathematical computations to eight decimals. During the mid-1830s Babbage developed plans for the so-called analytical engine, the forerunner of the modern digital computer. In this device he envisioned the capability of performing any arithmetical operation on the basis of instructions from punched cards, a memory unit in which to store numbers, sequential control, and most of the other basic elements of the present-day computer.

INDEXCARD, 23/30
 
Saddam Hussein

Saddam Hussein joined the revolutionary Baath party as a university student. In 1959 he took part in a failed attempt to assassinate Iraq's ruler, Abdul-Karim Qassim. Since 1979 he has been President of Iraq. Under his reign Iraq fought a decade-long war with Iran. Because of his steady enmity with extreme Islamic leaders the West at first supported him, until his army invaded Kuwait in August 1990, an incident that led the USA into the Gulf War. Since then many rumors about a coup d'état have circulated, but Saddam Hussein remains in unrestricted power.

INDEXCARD, 24/30
 
Invention

According to the WIPO an invention is a "... novel idea which permits in practice the solution of a specific problem in the field of technology." Concerning its protection by law the idea "... must be new in the sense that it has not already been published or publicly used; it must be non-obvious in the sense that it would not have occurred to any specialist in the particular industrial field, had such a specialist been asked to find a solution to the particular problem; and it must be capable of industrial application in the sense that it can be industrially manufactured or used." Protection can be obtained through a patent (granted by a government office) and is typically limited to 20 years.

INDEXCARD, 25/30
 
Montage

Certain elements of two or more photographs can be put together and mixed, and the outcome is a new picture. In this way, people can appear in the same picture, even "sit at the same table", though they have never met in reality.

INDEXCARD, 26/30
 
Machine vision

A branch of artificial intelligence and image processing concerned with the identification of graphic patterns or images that involves both cognition and abstraction. In such a system, a device linked to a computer scans, senses, and transforms images into digital patterns, which in turn are compared with patterns stored in the computer's memory. The computer processes the incoming patterns in rapid succession, isolating relevant features, filtering out unwanted signals, and adding to its memory new patterns that deviate beyond a specified threshold from the old and are thus perceived as new entities.
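
The matching-and-learning loop described above, caricatured in a few lines of Python - the feature vectors, labels and threshold are invented for illustration:

```python
# Compare an incoming pattern with stored ones; memorise it as a new entity
# if it deviates beyond the threshold from everything already known.
def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(pattern, memory, threshold=1.0):
    """Return the label of the closest stored pattern, or learn a new one."""
    best = min(memory, key=lambda label: distance(pattern, memory[label]), default=None)
    if best is not None and distance(pattern, memory[best]) <= threshold:
        return best
    label = f"entity-{len(memory)}"          # perceived as a new entity
    memory[label] = pattern                  # added to the computer's memory
    return label

memory = {"bolt": [1.0, 0.2], "nut": [0.1, 0.9]}
print(classify([0.95, 0.25], memory))        # close to "bolt"
print(classify([3.0, 3.0], memory))          # novel: stored as "entity-2"
```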

INDEXCARD, 27/30
 
Backbone Networks

Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmitting capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.

INDEXCARD, 28/30
 
John Dee

b. July 13, 1527, London, England
d. December 1608, Mortlake, Surrey

English alchemist, astrologer, and mathematician who contributed greatly to the revival of interest in mathematics in England. After lecturing and studying on the European continent between 1547 and 1550, Dee returned to England in 1551 and was granted a pension by the government. He became astrologer to the queen, Mary Tudor, and shortly thereafter was imprisoned for being a magician but was released in 1555. Dee later toured Poland and Bohemia (1583-89), giving exhibitions of magic at the courts of various princes. He became warden of Manchester College in 1595.

INDEXCARD, 29/30
 
Artificial intelligence approaches

Looking for ways to create intelligent machines, the field of artificial intelligence (AI) has split into several approaches based on differing opinions about the most promising methods and theories. The two basic AI approaches are bottom-up and top-down. The bottom-up approach suggests that the best way to achieve artificial intelligence is to build electronic replicas of the human brain's complex network of neurons (through neural networks and parallel computing), while the top-down approach attempts to mimic the brain's behavior with computer programs (for example, expert systems).
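
The contrast between the two approaches can be caricatured in a few lines of Python - the weights, threshold and rule below are invented for illustration:

```python
# Bottom-up: a single artificial neuron; its behaviour emerges from weights.
def neuron(inputs, weights, threshold=1.0):
    return sum(i * w for i, w in zip(inputs, weights)) >= threshold

# Top-down: an expert-system rule; the behaviour is written down explicitly.
def expert_rule(facts):
    if facts.get("has_fever") and facts.get("has_cough"):
        return "suspect flu"
    return "no conclusion"

print(neuron([1, 1], [0.6, 0.6]))                           # True: knowledge in the weights
print(expert_rule({"has_fever": True, "has_cough": True}))  # knowledge in the rule
```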

INDEXCARD, 30/30