1400 - 1500 A.D.

1455
Johannes Gutenberg prints the Bible, the first book produced in Europe by means of movable metal type.

Gutenberg's printing press was an innovative combination of techniques known for centuries before him: the olive-oil press, oil-based ink, block printing, and movable type. Cast metal letters could be composed into a page of text, printed, and reused, which enormously increased the rate of production. During the Middle Ages it took monks at least a year to make a handwritten copy of a book; Gutenberg could print about 300 sheets per day. Because parchment was too costly for mass production - a single medieval book often consumed the skins of a whole flock of sheep - it was replaced by cheap paper made from recycled clothing, available in quantity after the massive death toll of the Great Plague.

Within forty-five years, by 1500, ten million copies were available to a few hundred thousand literate people. Because individuals could now examine a range of opinions, the printed Bible - especially after Martin Luther translated it into German - together with increasing literacy, helped to subvert clerical authority. Interest in books grew with the rise of vernacular, non-Latin literary texts, beginning with Dante's Divine Comedy, the first literary text written in Italian.

Among other factors, the improved production and distribution of books, together with increased literacy, made the development of print mass media possible.

Michael Giesecke (Sinnenwandel Sprachwandel Kulturwandel. Studien zur Vorgeschichte der Informationsgesellschaft, Frankfurt am Main: Suhrkamp, 1992) has shown that, owing to a division of labor among authors, printers and typesetters, Gutenberg's invention increasingly led to a standardization of language - written and unwritten - in the form of orthography, grammar and signs. Communicating one's ideas became tied to the use of a code, and reading became a kind of rite of passage, an important step toward independence in a person's life.

With the growing linkage of knowledge to reading and learning, the history of knowledge becomes the history of reading, of reading dependent on chance and circumstance.

For further details see:
Martin Warnke, Text und Technik, http://www.uni-lueneburg.de/
Bruce Jones, Manuscripts, Books, and Maps: The Printing Press and a Changing World, http://communication.ucsd.edu/bjones/Books/booktext.html

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611796/100438659777
 
1980s: Artificial Intelligence (AI) - From Lab to Life

Following the commercial success of expert systems, which had begun in the 1970s, other AI technologies also began to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to US$ 425 million. Expert systems, because of their efficiency, remained especially in demand, but other fields of AI also proved successful in the corporate world.

Machine vision systems, for example, were used with cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled US$ 80 million. Although the market for AI systems collapsed in 1986 - 1987, leading to cuts in funding, the industry slowly recovered.

New technologies were being invented in Japan. Fuzzy logic, pioneered in the U.S., and neural networks were being reconsidered as paths toward artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications such as voice and character recognition systems, or camcorders steadied by fuzzy logic, became available not only to business and industry but also to the average consumer.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
 
Artificial intelligence approaches

Looking for ways to create intelligent machines, the field of artificial intelligence (AI) has split into several approaches based on differing opinions about the most promising methods and theories. The two basic approaches are bottom-up and top-down. The bottom-up school holds that the best way to achieve artificial intelligence is to build electronic replicas of the human brain's complex network of neurons (through neural networks and parallel computing), while the top-down approach attempts to mimic the brain's behavior with computer programs (for example, expert systems).
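The contrast can be illustrated with a toy sketch (ours, not part of the original indexcard): a top-down approach encodes knowledge directly as a hand-written rule, while a bottom-up approach lets a single artificial neuron (a perceptron) learn the same behavior - here, the logical AND function - from examples:

```python
# Top-down: the knowledge is stated explicitly as a rule.
def rule_based_and(x1: int, x2: int) -> int:
    return 1 if x1 == 1 and x2 == 1 else 0

# Bottom-up: a single artificial neuron learns the rule from examples
# using the classic perceptron update (error times input times learning rate).
def train_perceptron(samples, epochs=20, lr=0.1):
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for x1, x2, target in samples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = target - pred
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

samples = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]  # truth table of AND
w1, w2, b = train_perceptron(samples)
print(rule_based_and(1, 1))              # 1
print(1 if w1 + w2 + b > 0 else 0)       # 1: the neuron learned the same rule
```

Both routes end at the same behavior; they differ in whether the knowledge is programmed in or acquired from data.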

INDEXCARD, 1/3
 
1996 WIPO Copyright Treaty (WCT)

The 1996 WIPO Copyright Treaty, which focused on protecting copyright "in the digital age", among other provisions 1) makes clear that computer programs are protected as literary works, 2) requires the contracting parties to protect databases that constitute intellectual creations, 3) affords authors the new right of making their works "available to the public", 4) gives authors the exclusive right to authorize "any communication to the public of their works, by wire or wireless means ... in such a way that members of the public may access these works from a place and at a time individually chosen by them", and 5) requires the contracting states to protect anti-copying technology and copyright management information embedded in any work covered by the treaty. The WCT is available at: http://www.wipo.int/documents/en/diplconf/distrib/94dc.htm



INDEXCARD, 2/3
 
Binary number system

In mathematics, the term binary number system refers to a positional numeral system employing 2 as the base and requiring only two different symbols, 0 and 1. The importance of the binary system to information theory and computer technology derives mainly from the compact and reliable manner in which data can be represented in electromechanical devices with two states - such as "on-off," "open-closed," or "go-no go."
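The positional principle can be made concrete with a short Python sketch (not part of the original indexcard): repeated division by 2 yields the binary digits, and each digit at position i (counted from the right) carries the weight 2**i:

```python
def to_binary(n: int) -> str:
    """Return the base-2 digit string of a non-negative integer."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder modulo 2 is the next binary digit
        n //= 2
    return "".join(reversed(digits))

def from_binary(bits: str) -> int:
    """Reconstruct the integer: the digit at position i weighs 2**i."""
    return sum(int(b) << i for i, b in enumerate(reversed(bits)))

print(to_binary(13))         # "1101", i.e. 8 + 4 + 0 + 1
print(from_binary("1101"))   # 13
```

Python's built-ins bin() and int(bits, 2) perform the same conversions; the explicit loop above merely exposes the positional arithmetic.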

INDEXCARD, 3/3