Governmental Influence

Agencies like the NSA are currently able to eavesdrop on almost anyone, with only few restrictions - even if the NSA itself spreads a different message.
In theory, cryptography can make such eavesdropping difficult. Hence those agencies advocate measures such as the introduction of trapdoors that would keep everybody's data accessible to them.

See the U.S. discussion about the Clipper Chip some years ago:
http://www.epic.org/crypto/clipper/
http://www.cdt.org/crypto/admin/041693whpress.txt

Encryption not only offers privacy for the transmission of data; we also need it whenever we want to transport data that shall be seen by no one but the recipient of our message. Precisely because of this, governments and governmental institutions fear losing control, and strict laws are the consequence. The often repeated claim that the Internet is a sphere of lawlessness has been proven wrong: some parts are very clearly regulated by law, and cryptography is one of them. Prohibiting cryptography, or at least restricting it, was long considered an appropriate tool against criminality - or rather, it had been considered that. In the meantime even governmental institutions have to admit that such restrictions work above all against the general population rather than against illegal actors. Therefore laws have been changed in many states during the last five years. Even the USA, the master of cryptography restriction, liberalized its laws in December 1999 and has become more permissive.

For an insight into the discussion that has gone on for years, see:
http://www.cdt.org/crypto/new2crypto/3.shtml

The final text of the new U.S. Encryption Regulations can be found at:
http://www.cdt.org/crypto/admin/000110cryptoregs.shtml
http://www.cdt.org/crypto/admin/000114cryptoregs.txt

An explanation of the regulations can be found at:
http://www.cdt.org/crypto/admin/000112commercefactsheet.shtml

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611776/100438659102
 
1400 - 1500 A.D.

1455
Johannes Gutenberg publishes the Bible, the first book in Europe printed with movable metal type.

Gutenberg's printing press was an innovative aggregation of inventions known for centuries before Gutenberg: the olive oil press, oil-based ink, block-print technology, and movable type. Together they allowed the mass production of the movable type used to reproduce a page of text and enormously increased the production rate. During the Middle Ages it took monks at least a year to make a handwritten copy of a book; Gutenberg could print about 300 sheets per day. Because parchment was too costly for mass production - a single copy of a medieval book often consumed the hides of a whole flock of sheep - it was replaced by cheap paper made from clothing recycled from the massive number of dead caused by the Great Plague.

Within forty-five years, by 1500, ten million copies were available for a few hundred thousand literate people. Because individuals could now examine a range of opinions, the printed Bible - especially after Martin Luther translated it into German - and increasing literacy contributed to the subversion of clerical authorities. Interest in books grew with the rise of vernacular, non-Latin literary texts, beginning with Dante's Divine Comedy, the first literary text written in Italian.

Among other factors, improvements in the production and distribution of books, together with increased literacy, made the development of print mass media possible.

Michael Giesecke (Sinnenwandel Sprachwandel Kulturwandel. Studien zur Vorgeschichte der Informationsgesellschaft, Frankfurt am Main: Suhrkamp, 1992) has shown that, due to a division of labor among authors, printers and typesetters, Gutenberg's invention increasingly led to a standardization of - written and unwritten - language in the form of orthography, grammar and signs. Communicating one's ideas became linked to the use of a code, and reading became a kind of rite of passage, an important step towards independence in a person's life.

With the growing linkage of knowledge to reading and learning, the history of knowledge becomes the history of reading, of reading dependent on chance and circumstance.

For further details see:
Martin Warnke, Text und Technik, http://www.uni-lueneburg.de/
Bruce Jones, Manuscripts, Books, and Maps: The Printing Press and a Changing World, http://communication.ucsd.edu/bjones/Books/booktext.html

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611796/100438659777
 
Internet Software Consortium

The Internet Software Consortium (ISC) is a nonprofit corporation dedicated to producing high-quality, production-ready reference implementations of Internet standards. Its goal is to ensure that these reference implementations are properly supported and made freely available to the Internet community.

http://www.isc.org

INDEXCARD, 1/4
 
The World Wide Web History Project

The ongoing World Wide Web History Project was established to record and publish the history of the World Wide Web and its roots in hypermedia and networking. Its primary research methods are archival research and the analysis of interviews and talks with pioneers of the World Wide Web. The project is expected to result in a vast collection of historic video, audio, documents, and software. Its digital archive is currently under development.

http://www.webhistory.org/home.html

INDEXCARD, 2/4
 
Enochian alphabet

Also "Angelic" language. Archaic language alphabet composed of 21 letters, discovered by John Dee and his partner Edward Kelley. It has its own grammar and syntax, but only a small sample of it has ever been translated to English.

INDEXCARD, 3/4
 
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs as they combine facts with rules that state relations between the facts to achieve a crude form of reasoning analogous to artificial intelligence. The three main elements of expert systems are: (1) an interface which allows interaction between the system and the user, (2) a database (also called the knowledge base) which consists of axioms and rules, and (3) the inference engine, a computer program that executes the inference-making process. The disadvantage of rule-based expert systems is that they cannot handle unanticipated events, as every condition that may be encountered must be described by a rule. They also remain limited to narrow problem domains such as troubleshooting malfunctioning equipment or medical image interpretation, but still have the advantage of being much lower in costs compared with paying an expert or a team of specialists.
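A minimal sketch may help illustrate this structure. The following Python fragment is a hypothetical toy example, not drawn from any actual expert system: the rules, facts, and function names are invented for illustration. It combines a small knowledge base of facts and rules with a forward-chaining inference engine, corresponding to the knowledge base and inference engine described above, while the "interface" is reduced to supplying observed symptoms and printing derived conclusions.

# Toy rule-based expert system (illustrative sketch only).
# Each rule maps a set of required facts (conditions) to a new fact (conclusion).
RULES = [
    ({"engine does not start", "battery voltage low"}, "battery is flat"),
    ({"battery is flat"}, "recommend: recharge or replace battery"),
    ({"engine does not start", "fuel gauge empty"}, "tank is empty"),
    ({"tank is empty"}, "recommend: refuel"),
]

def infer(facts, rules):
    """Forward chaining: keep firing rules whose conditions are all
    satisfied until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

if __name__ == "__main__":
    # The "interface": the user reports observed symptoms as facts.
    observed = {"engine does not start", "battery voltage low"}
    derived = infer(observed, RULES)
    for fact in sorted(derived - observed):
        print(fact)  # prints the derived diagnosis and recommendation

Running this with the observed symptoms derives "battery is flat" and the corresponding recommendation; it also shows the limitation noted above, since any situation not covered by an explicit rule produces no conclusion at all.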

INDEXCARD, 4/4