Timeline 1970-2000 AD

1971 IBM's work on the Lucifer cipher and the involvement of the NSA lead to the U.S. Data Encryption Standard (= DES)

1976 Whitfield Diffie and Martin Hellman publish their paper New Directions in Cryptography, introducing the idea of public key cryptography

1977/78 the RSA algorithm is developed by Ron Rivest, Adi Shamir and Leonard M. Adleman and published

1984 Congress passes the Comprehensive Crime Control Act

- 2600: The Hacker Quarterly is founded

1986 the Computer Fraud and Abuse Act is passed in the USA

- the Electronic Communications Privacy Act is passed

1987 Chicago prosecutors found the Computer Fraud and Abuse Task Force

1988 the U.S. Secret Service covertly videotapes a hacker convention

1989 the NuPrometheus League distributes stolen Apple Computer source code

1990 IDEA, using a 128-bit key, is intended to replace DES

- Charles H. Bennett and Gilles Brassard publish their work on Quantum Cryptography

- the Martin Luther King Day Crash strikes AT&T's long-distance network nationwide

1991 PGP (= Pretty Good Privacy), created by Phil Zimmermann, is released as freeware on the Internet and soon becomes the worldwide state of the art

- the first Computers, Freedom and Privacy conference takes place in San Francisco

- another AT&T phone crash affects New York City and various airports

1993 the U.S. government announces the introduction of the Clipper Chip, an idea that provokes heated political discussions during the following years

1994 Ron Rivest publishes another algorithm, RC5, on the Internet

- the Blowfish encryption algorithm, a 64-bit block cipher with a key length of up to 448 bits, is designed by Bruce Schneier

1990s work on quantum computers and quantum cryptography continues

- work on biometrics for authentication (fingerprints, the iris, smells, etc.)

1996 France liberalizes its cryptography law: cryptography may now be used if registered

- the OECD issues its Cryptography Policy Guidelines, a paper calling for encryption export standards and unrestricted access to encryption products

1997 April the European Commission issues its Electronic Commerce Initiative, in favor of strong encryption

1997 June PGP 5.0 freeware becomes widely available for non-commercial use

1997 June a 56-bit DES key is cracked by a network of 14,000 computers

1997 August a U.S. judge rules that the encryption export regulations violate the First Amendment

1998 February foundation of Americans for Computer Privacy, a broad coalition opposing U.S. cryptography policy

1998 March PGP announces plans to sell encryption products outside the USA

1998 April the NSA issues a report on the risks of key recovery systems

1998 July a DES key is cracked in 56 hours by researchers in Silicon Valley

1998 October the Finnish government agrees to unrestricted export of strong encryption

1999 January RSA Data Security establishes worldwide distribution of its encryption products outside the USA

- the National Institute of Standards and Technology announces that 56-bit DES is no longer safe and recommends Triple DES instead

- a 56-bit DES key is cracked in 22 hours and 15 minutes

1999 May 27 the United Kingdom speaks out against key recovery

1999 September the USA announces the lifting of its restrictions on cryptography exports

2000 as the German government considers drafting a cryptography law, various organizations start a campaign against it

- computer hackers no longer merely visit websites and change small details there, but cause breakdowns of entire systems, producing large economic losses

for further information about the history of cryptography see:
http://www.clark.net/pub/cme/html/timeline.html
http://www.math.nmsu.edu/~crypto/Timeline.html
http://fly.hiwaay.net/~paul/cryptology/history.html
http://www.achiever.com/freehmpg/cryptology/hocryp.html
http://all.net/books/ip/Chap2-1.html
http://cryptome.org/ukpk-alt.htm
http://www.iwm.org.uk/online/enigma/eni-intro.htm
http://www.achiever.com/freehmpg/cryptology/cryptofr.html
http://www.cdt.org/crypto/milestones.shtml

for information about hacker history see:
http://www.farcaster.com/sterling/chronology.htm

TEXTBLOCK 1/4 // URL: http://world-information.org/wio/infostructure/100437611776/100438658960
 
Digital Signatures, Timestamps etc.

Most computer systems are far from secure.
A lack of security, it is said, might hinder the development of new information technologies. Everybody knows that electronic transactions involve a more or less calculated risk, and rumors about insecurity leave consumers doubting whether the benefits of e-commerce outweigh its risks. Above all, the market depends on consumer confidence. To provide it, another application of public key cryptography becomes essential: the digital signature, which is used to verify the authenticity of the sender of data.
A message is signed with the sender's private key, and the corresponding public key verifies the signature. This is especially important if the involved parties do not know one another. The DSA (= Digital Signature Algorithm) is a public-key system that can only sign digitally, not encrypt messages. In fact, the digital signature is the main tool of cryptography in the private sector.
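
To make this concrete, here is a minimal sketch of signing and verifying with DSA in Python, using the third-party cryptography package (the library choice and the sample message are assumptions for illustration; the original text names no particular software):

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import dsa

# The signer generates a DSA key pair; the private key stays secret,
# while the public key is handed to anyone who wants to verify.
private_key = dsa.generate_private_key(key_size=2048)
public_key = private_key.public_key()

message = b"an electronic transaction"
signature = private_key.sign(message, hashes.SHA256())  # sign with the private key

try:
    public_key.verify(signature, message, hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid: forged or message altered")

If a single byte of the message is changed after signing, verification fails, which is exactly what makes the scheme useful between parties who do not know one another.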

Digital signatures are needed for safe electronic payment. They protect the authenticity and integrity of the sent data; its confidentiality can of course be provided by other cryptographic means. Other security methods in this respect are still in development, like digital money (similar to credit cards or checks) or digital cash, a system intended to be as anonymous as cash, an idea not favored by governments as it provides many opportunities for money laundering and illegal transactions.

If intellectual property needs to be protected, a digital signature combined with a digital timestamp is regarded as an efficient tool.

In this context the difference between identification and authentication is essential; smartcards and firewalls are relevant here, too.

Many digital transactions demand passwords. More reliable for authentication are biometric identifiers: individual, unrepeatable characteristics that can hardly be forged.

For more cryptography terms and further information see:
http://poseidon.csd.auth.gr/signatures
http://www.dlib.org/dlib/december97/ibm/12lotspiech.html
http://www.cryptography.com/technology/technology.html
http://www.cdt.org/crypto/glossary.shtml
http://www.oecd.org//dsti/sti/it/secur/prod/GD97-204.htm

TEXTBLOCK 2/4 // URL: http://world-information.org/wio/infostructure/100437611776/100438659015
 
Governmental Influence

Agencies like the NSA are currently able to eavesdrop on anyone with only a few restrictions - whatever other messages the NSA itself may spread.
Theoretically, cryptography can make that difficult. Hence those agencies argue for measures like the introduction of trapdoors that would give them access to everybody's data.

See the U.S. discussion about the Clipper Chip from some years ago:
http://www.epic.org/crypto/clipper/
http://www.cdt.org/crypto/admin/041693whpress.txt

While encryption offers us privacy for the transmission of data, we do not only wish for it but actually need it if we want to transmit data that shall be seen by no one but the recipient of our message. Given this, governments and governmental institutions fear losing control, and strict laws are the consequence. The often repeated rumor that the Internet is a sphere of illegality has been proven wrong: some parts are very clearly controlled by law, and cryptography is one of them. The prohibition of cryptography, or at least its restriction, was long considered an appropriate tool against criminality. In the meantime, even governmental institutions have to admit that those restrictions work above all against the population instead of against illegal actors. Laws have therefore been changed in many states during the last five years. Even the USA, the master of cryptography restriction, liberalized its laws in December 1999.

for an insight into the discussion that has been going on for years see:
http://www.cdt.org/crypto/new2crypto/3.shtml

the final text of the new U.S. encryption regulations can be found at:
http://www.cdt.org/crypto/admin/000110cryptoregs.shtml
http://www.cdt.org/crypto/admin/000114cryptoregs.txt

an explanation of the regulations can be found at:
http://www.cdt.org/crypto/admin/000112commercefactsheet.shtml

TEXTBLOCK 3/4 // URL: http://world-information.org/wio/infostructure/100437611776/100438659102
 
Commercial vs. Independent Content: Human and Financial Resources

Concerning their human and financial resources, commercial media and independent content providers are an extremely unequal pair. While the 1998 revenues of the world's leading media conglomerates (AOL Time Warner, Disney, Bertelsmann, Viacom and the News Corporation) amounted to US$ 91,144,000,000, providers of independent content usually act on a non-profit basis and depend to a considerable extent on donations and contributions.

The human resources at their disposal differ just as much. Viacom, for example, employs 112,000 people, whereas alternative media are mostly run by a small group of activists, most of them volunteers. Moreover, the majority of the commercial media giants have a multitude of subsidiaries (Bertelsmann, for instance, has operations in 53 countries), while independent content providers in some cases do not even have proper office space. Asked about the number of square meters of their offices, Frank Guerrero from RTMark comments: "We have no square meters at all, because we are only on the web. I guess if you add up all of our servers and computers we would take up about one or two square meters."

TEXTBLOCK 4/4 // URL: http://world-information.org/wio/infostructure/100437611795/100438659146
 
Gerard J. Holzmann and Bjoern Pehrson, The Early History of Data Networks

This book gives a fascinating glimpse of the many documented attempts throughout history to develop effective means for long distance communications. Large-scale communication networks are not a twentieth-century phenomenon. The oldest attempts date back to millennia before Christ and include ingenious uses of homing pigeons, mirrors, flags, torches, and beacons. The first true nationwide data networks, however, were being built almost two hundred years ago. At the turn of the 18th century, well before the electromagnetic telegraph was invented, many countries in Europe already had fully operational data communications systems with altogether close to one thousand network stations. The book shows how the so-called information revolution started in 1794, with the design and construction of the first true telegraph network in France, Chappe's fixed optical network.

http://www.it.kth.se/docs/early_net/

INDEXCARD, 1/5
 
blowfish encryption algorithm

Blowfish is a symmetric-key block cipher with a variable key length of up to 448 bits.
The idea behind it is a simple design that makes the cipher fast.
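
A minimal sketch of Blowfish encryption with a variable-length key, assuming the third-party Python package pycryptodome (the package choice and all sample values are illustrative assumptions, not part of the original card):

from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes

key = get_random_bytes(56)                   # any key from 4 to 56 bytes (32-448 bits)
cipher = Blowfish.new(key, Blowfish.MODE_CBC)

plaintext = b"Blowfish works on 8-byte blocks"
pad = 8 - len(plaintext) % 8                 # pad up to the 64-bit block size
ciphertext = cipher.iv + cipher.encrypt(plaintext + bytes([pad]) * pad)

# Decryption reuses the key and the IV sent along with the ciphertext.
decipher = Blowfish.new(key, Blowfish.MODE_CBC, iv=ciphertext[:8])
padded = decipher.decrypt(ciphertext[8:])
print(padded[:-padded[-1]])                  # strip the padding again

Once the relatively expensive key schedule is computed, the simple design makes encryption of bulk data fast.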

http://www.counterpane.com/blowfish.html
http://www.counterpane.com/bfsverlag.html

INDEXCARD, 2/5
 
Bruce Schneier

Bruce Schneier is president of Counterpane Systems in Minneapolis, a consulting enterprise specializing in cryptography and computer security. He is the author of the book Applied Cryptography and the inventor of the Blowfish and Twofish encryption algorithms.

INDEXCARD, 3/5
 
Sun Microsystems

Founded in 1982 and headquartered in Palo Alto, USA, Sun Microsystems manufactures computer workstations, servers, and software.

http://www.sun.com

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/9/0,5716,108249+1+105909,00.html

INDEXCARD, 4/5
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules stating relations between the facts, achieving a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but they have the advantage of being far cheaper than paying an expert or a team of specialists.
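
A toy Python sketch of the rule-based approach described above (the facts and rules are invented for illustration): the knowledge base holds facts and IF-THEN rules, and a simple forward-chaining loop plays the part of the inference engine.

# Knowledge base: observed facts plus IF-THEN rules (hypothetical examples).
facts = {"device does not power on", "power cable is plugged in"}
rules = [
    ({"device does not power on", "power cable is plugged in"},
     "suspect faulty power supply"),
    ({"suspect faulty power supply"},
     "recommend replacing the power supply"),
]

# Inference engine: keep applying rules until no new conclusion is derived.
derived = True
while derived:
    derived = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            derived = True

print(facts)  # now also contains the two derived conclusions

As the text notes, such a system fails silently on anything its rules do not anticipate: remove one of the facts above and no conclusion is drawn at all.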

INDEXCARD, 5/5