2. Like that car? The tricks of the data body industry

In the New Economy, data have become a primary resource. Businesses unable to respond to the pressure of informatisation are quickly left behind. "Information is everything" has become the war-cry of the New Economy. More than ever, businesses now collect data about their customers, their competitors, economic indicators, etc., and compile them in data warehouses. The large amounts of data acquired are turned into systematic collections called data warehouses, from which data mining techniques extract useful patterns. These data can be used for marketing, stock exchange transactions, risk assessment, and many other purposes.

However, there are also many companies whose main line of business is data body economics itself. They collect huge amounts of data, process and enhance them (thereby increasing their value), and sell them on to other companies. Direct marketing companies belong to this category. They carry out targeted marketing, also called strategic marketing, aimed at individual customers or groups of customers. This process is based on a consumer profile: a collection of data containing personal information such as age, sex, marital status, employment and address, together with information about consumer and payment behaviour. Based upon this profile, conclusions about possible future consumption are drawn and offers are made.
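A consumer profile of the kind described can be pictured as a simple record. The following sketch is purely illustrative: the field names and the toy "future consumption" inference are assumptions, not a description of any actual direct marketing system.

```python
from dataclasses import dataclass, field

@dataclass
class ConsumerProfile:
    # Personal information of the kind the text lists
    name: str
    age: int
    sex: str
    marital_status: str
    employment: str
    address: str
    # Consumer and payment behaviour (hypothetical fields)
    purchases: list = field(default_factory=list)
    payment_defaults: int = 0

    def likely_car_buyer(self) -> bool:
        # Toy inference of the kind the text describes: past behaviour
        # is used to draw conclusions about possible future consumption.
        return any("car" in item for item in self.purchases)
```

Real profiles are of course far richer, but the principle is the same: recorded behaviour feeds a prediction, and the prediction drives the offer.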

For example, somebody who is attracted by a car on display in an airport terminal and completes a card with name and address to participate in a draw reveals a lot of economically valuable information about him- or herself. Apart from the name, address and other data entered on the card, this person can also be assumed to be a potential car buyer (evidently he or she wants a car) and to be relatively affluent (the poor do not normally travel by plane). The time when the card is completed provides information too: in July and August, the entrant is more likely to be a holiday maker than in November. Possibly, in small print somewhere on the card, he or she agrees to receive more information about this and other products, and also agrees that the data are "electronically processed". The data acquired this way can normally be expected to be far more valuable than the car that is offered in the draw. Most people who complete the cards will not win the draw, but will instead end up in direct marketing data warehouses and one day receive offers of products and services they never knew they wanted.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611761/100438659665
 
Timeline 1970-2000 AD

1971 IBM's work on the Lucifer cipher and the work of the NSA lead to the U.S. Data Encryption Standard (= DES)

1976 Whitfield Diffie and Martin Hellman publish their paper New Directions in Cryptography, introducing the idea of public key cryptography

1977/78 the RSA algorithm is developed by Ron Rivest, Adi Shamir and Leonard M. Adleman and is published

1984 Congress passes Comprehensive Crime Control Act

- The Hacker Quarterly is founded

1986 Computer Fraud and Abuse Act is passed in the USA

- Electronic Communications Privacy Act

1987 Chicago prosecutors found Computer Fraud and Abuse Task Force

1988 U.S. Secret Service covertly videotapes a hacker convention

1989 NuPrometheus League distributes Apple Computer software

1990 IDEA, using a 128-bit key, is supposed to replace DES

- Charles H. Bennett and Gilles Brassard publish their work on Quantum Cryptography

- Martin Luther King Day Crash strikes AT&T long-distance network nationwide


1991 PGP (= Pretty Good Privacy) is released as freeware on the Internet, soon becoming the worldwide state of the art; its creator is Phil Zimmermann

- one of the first conferences for Computers, Freedom and Privacy takes place in San Francisco

- AT&T phone crash; New York City and various airports are affected

1993 the U.S. government announces the introduction of the Clipper Chip, an idea that provokes many political discussions during the following years

1994 Ron Rivest releases another algorithm, the RC5, on the Internet

- the blowfish encryption algorithm, a 64-bit block cipher with a key-length up to 448 bits, is designed by Bruce Schneier

1990s work on quantum computers and quantum cryptography

- work on biometrics for authentication (fingerprints, the iris, smells, etc.)

1996 France liberalizes its cryptography law: cryptography may now be used, provided the user registers

- OECD issues Cryptography Policy Guidelines, a paper calling for encryption export standards and unrestricted access to encryption products

1997 April European Commission issues Electronic Commerce Initiative, in favor of strong encryption

1997 June PGP 5.0 Freeware widely available for non-commercial use

1997 June 56-bit DES code cracked by a network of 14,000 computers

1997 August U.S. judge rules that encryption export regulations violate the First Amendment

1998 February foundation of Americans for Computer Privacy, a broad coalition in opposition to the U.S. cryptography policy

1998 March PGP announces plans to sell encryption products outside the USA

1998 April NSA issues a report about the risks of key recovery systems

1998 July DES code cracked in 56 hours by researchers in Silicon Valley

1998 October Finnish government agrees to unrestricted export of strong encryption

1999 January RSA Data Security establishes worldwide distribution of encryption products outside the USA

- National Institute of Standards and Technology announces that 56-bit DES is not safe compared to Triple DES

- 56-bit DES code is cracked in 22 hours and 15 minutes

1999 May 27 United Kingdom speaks out against key recovery

1999 September the USA announces it will stop restricting cryptography exports

2000 as the German government plans to draft a cryptography law, various organizations start a campaign against it

- computer hackers no longer merely visit websites and change small details there, but cause breakdowns of entire systems, producing large economic losses

for further information about the history of cryptography see:
http://www.clark.net/pub/cme/html/timeline.html
http://www.math.nmsu.edu/~crypto/Timeline.html
http://fly.hiwaay.net/~paul/cryptology/history.html
http://www.achiever.com/freehmpg/cryptology/hocryp.html
http://all.net/books/ip/Chap2-1.html
http://cryptome.org/ukpk-alt.htm
http://www.iwm.org.uk/online/enigma/eni-intro.htm
http://www.achiever.com/freehmpg/cryptology/cryptofr.html
http://www.cdt.org/crypto/milestones.shtml

for information about hackers' history see:
http://www.farcaster.com/sterling/chronology.htm

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611776/100438658960
 
1996 WIPO Copyright Treaty (WCT)

The 1996 WIPO Copyright Treaty focuses on taking steps to protect copyright "in the digital age". Among other provisions, it 1) makes clear that computer programs are protected as literary works, 2) requires the contracting parties to protect databases that constitute intellectual creations, 3) affords authors the new right of making their works "available to the public", 4) gives authors the exclusive right to authorize "any communication to the public of their works, by wire or wireless means ... in such a way that members of the public may access these works from a place and at a time individually chosen by them", and 5) requires the contracting states to protect anti-copying technology and copyright management information embedded in any work covered by the treaty. The WCT is available at: http://www.wipo.int/documents/en/diplconf/distrib/94dc.htm



INDEXCARD, 1/6
 
Proxy Servers

A proxy server is a server that acts as an intermediary between a workstation user and the Internet, so that security, administrative control, and caching services can be provided.

A proxy server receives a request for an Internet service (such as a Web page request) from a user. If it passes filtering requirements, the proxy server, assuming it is also a cache server, looks in its local cache of previously downloaded Web pages. If it finds the page, it returns it to the user without needing to forward the request to the Internet. If the page is not in the cache, the proxy server, acting as a client on behalf of the user, uses one of its own IP addresses to request the page from the server out on the Internet. When the page is returned, the proxy server relates it to the original request and forwards it on to the user.
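The request flow just described — filter, check the cache, and only then fetch from the origin server — can be sketched in a few lines. This is a minimal illustration, not a real proxy: `fetch_from_origin` stands in for an actual HTTP request, and the filtering rule is a simple placeholder.

```python
def fetch_from_origin(url):
    # Placeholder for requesting the page from the server on the Internet.
    return f"<html>content of {url}</html>"

class CachingProxy:
    def __init__(self, blocked=()):
        self.cache = {}              # previously downloaded pages
        self.blocked = set(blocked)  # stand-in for filtering requirements

    def request(self, url):
        # 1. Apply filtering requirements.
        if url in self.blocked:
            return None              # request rejected
        # 2. Look in the local cache of previously downloaded pages.
        if url in self.cache:
            return self.cache[url]
        # 3. Cache miss: act as a client on behalf of the user,
        #    fetch the page, store it, and forward it on.
        page = fetch_from_origin(url)
        self.cache[url] = page
        return page
```

The second request for the same page is answered from the cache without ever reaching the origin server, which is exactly the saving a cache server provides.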

Source: Whatis.com

INDEXCARD, 2/6
 
Bandwidth

The bandwidth of a transmitted communications signal is a measure of the range of frequencies the signal occupies. The term is also used in reference to the frequency-response characteristics of a communications receiving system. All transmitted signals, whether analog or digital, have a certain bandwidth. The same is true of receiving systems.

Generally speaking, bandwidth is directly proportional to the amount of data transmitted or received per unit time. In a qualitative sense, bandwidth is proportional to the complexity of the data for a given level of system performance. For example, it takes more bandwidth to download a photograph in one second than it takes to download a page of text in one second. Large sound files, computer programs, and animated videos require still more bandwidth for acceptable system performance. Virtual reality (VR) and full-length three-dimensional audio/visual presentations require the most bandwidth of all.

In digital systems, bandwidth is data speed in bits per second (bps).
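The proportionality between data volume and bandwidth can be made concrete with a little arithmetic. The file sizes and the 56 kbit/s modem speed below are illustrative assumptions, not figures from the text.

```python
def download_time_seconds(size_bytes, bandwidth_bps):
    # 8 bits per byte; bandwidth is given in bits per second (bps).
    return size_bytes * 8 / bandwidth_bps

# A 2 KB page of text vs. a 200 KB photograph over a 56 kbit/s modem:
text_time = download_time_seconds(2_000, 56_000)     # roughly 0.3 s
photo_time = download_time_seconds(200_000, 56_000)  # roughly 29 s
```

A file 100 times larger takes 100 times longer at the same bandwidth — which is why the richer media listed above demand proportionally more bandwidth for the same perceived performance.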

Source: Whatis.com

INDEXCARD, 3/6
 
National Laboratory for Applied Network Research

NLANR was created in 1995, initially as a collaboration among five supercomputer centers supported by the National Science Foundation, to provide technical and engineering support and overall coordination of the high-speed connections between these centers.

Today NLANR offers support and services to institutions that are qualified to use high performance network service providers - such as Internet 2 and Next Generation Internet.

http://www.nlanr.net

INDEXCARD, 4/6
 
AT&T Labs-Research

The research and development division of AT&T. Inventions made at AT&T Labs-Research include such important ones as stereo recording, the transistor, and the communications satellite.

http://www.research.att.com/

INDEXCARD, 5/6
 
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. An experimental network mainly intended to test the feasibility of wide area networking and remote computing, it was created for resource sharing between research institutions, not for messaging services like e-mail. Although the research was sponsored by the US military, ARPAnet was not designed for direct military use but to support military-related research.

In 1969 ARPAnet went online, linking the first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute.

ARPAnet did not become widely accepted until it was demonstrated in action to an audience of computer experts at the First International Conference on Computers and Communication in Washington, D.C. in 1972.

In 1986, before ARPAnet was decommissioned in 1990, NSFnet, a network of scientific and academic computers funded by the National Science Foundation, went online alongside a separate new military network. In 1988 the first private Internet service providers offered the general public access to NSFnet. In 1995, after having become the backbone of the Internet in the USA, NSFnet was turned over to a consortium of commercial backbone providers. This, together with the launch of the World Wide Web, contributed to the success of the global data network we call the Net.

In the USA commercial users already outnumbered military and academic users in 1994.

Despite the rapid growth of the Net, most computers linked to it are still located in the United States.

INDEXCARD, 6/6