Eliminating online censorship: Freenet, Free Haven and Publius

Protecting speech on the global data networks attracts increasing attention. The efforts and growing capabilities of governmental authorities, corporations and copyright enforcement agencies are countered by the efforts of researchers and engineers to provide means for anonymous and uncensored communication, such as Freenet, Free Haven and Publius. All three share a similar design: content is split into pieces and spread across several servers, and when a file is requested the pieces are reassembled. This design makes it difficult to censor content, because no single server holds or controls a complete document. None of these systems is a commercial product.
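
To illustrate the shared design, here is a minimal sketch in Python of splitting content into pieces and reassembling it, assuming a simple XOR-based scheme in which all pieces are needed. The actual systems use more elaborate encryption and share-encoding, so this stands in only for the general idea of spreading a document across servers so that no single piece reveals or controls it.

import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(content: bytes, n: int) -> list:
    # Create n - 1 random pieces; the last piece is the XOR of the content
    # with all of them, so every piece alone looks like random noise.
    pieces = [os.urandom(len(content)) for _ in range(n - 1)]
    pieces.append(reduce(xor_bytes, pieces, content))
    return pieces

def reassemble(pieces: list) -> bytes:
    # XOR-ing all pieces together cancels out the random pieces and
    # recovers the original content.
    return reduce(xor_bytes, pieces)

original = b"A document no single server can suppress on its own."
stored = split(original, 4)            # e.g. spread across four servers
assert reassemble(stored) == original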

The most advanced system appears to be Publius. Because it was designed by researchers and engineers at the prestigious AT&T Labs, Publius is also a strong statement against online censorship: it can no longer be said that taking a firm stand against technologies that limit the freedom of individuals is a position held only by radical leftists.

For more information on Publius, see John Schwartz, "Online and Unidentifiable?", The Washington Post, June 30, 2000, http://www.washingtonpost.com/wp-dyn/articles/A21689-2000Jun29.html

Freenet web site: http://freenet.sourceforge.net

Free Haven web site: http://www.freehaven.net

Publius web site: http://www.cs.nyu.edu/waldman/publius

TEXTBLOCK 1/1 // URL: http://world-information.org/wio/infostructure/100437611742/100438658749
 
Neural network

A bottom-up approach to artificial intelligence, a neural network is a network of many very simple processors ("units" or "neurons"), each possibly having a small amount of local memory. The units are connected by unidirectional communication channels ("connections") that carry numeric data. Each unit operates only on its local data and on the inputs it receives via its connections. A neural network is a processing device, implemented either as an algorithm or as actual hardware, whose design is inspired by the structure and functioning of animal brains. Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples and exhibit some capability for generalization.
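
As an illustration of such a training rule, the following sketch in Python trains a single artificial unit with the classic perceptron learning rule on a toy set of patterns (the logical AND function). The example problem, the learning rate and the number of passes are illustrative assumptions, not drawn from the text above.

import random

def step(x):
    # Threshold activation: the unit "fires" (outputs 1) if its
    # weighted input sum exceeds zero.
    return 1 if x > 0 else 0

# Presented patterns: two inputs plus a constant bias input of 1,
# paired with the desired output (here: logical AND).
patterns = [((0, 0, 1), 0), ((0, 1, 1), 0), ((1, 0, 1), 0), ((1, 1, 1), 1)]

weights = [random.uniform(-0.5, 0.5) for _ in range(3)]
learning_rate = 0.1

for epoch in range(50):
    for inputs, target in patterns:
        output = step(sum(w * x for w, x in zip(weights, inputs)))
        error = target - output
        # Training rule: adjust each connection weight in proportion
        # to the error and to the input carried by that connection.
        weights = [w + learning_rate * error * x
                   for w, x in zip(weights, inputs)]

# After training, the unit has "learned" the presented patterns.
for inputs, target in patterns:
    output = step(sum(w * x for w, x in zip(weights, inputs)))
    print(inputs[:2], "->", output, "(expected", target, ")")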

INDEXCARD, 1/1