Atrocity Stories

Atrocity stories are nothing other than lies; the term "atrocity story" merely sounds more diplomatic.
Their purpose is to destroy one image of the enemy and create a new, mostly bad one. The story creating that image is not necessarily invented outright; an existing story can also be bent in a desired direction.
The most important rule for atrocity stories is to stay within the bounds of plausibility. Even if the whole story is fabricated, it must be probable, or at least possible, in line with circulating rumors. It may be most successful when a rumor is deliberately spread some time before the atrocity story is launched, because as soon as something seems familiar, it is easier to believe.

TEXTBLOCK 1/4 // URL: http://world-information.org/wio/infostructure/100437611661/100438658524
 
Definition

During the last 20 years Immanuel Wallerstein's old paradigm of center, semi-periphery and periphery has found a new costume: ICTs. After colonialism, neo-colonialism and neoliberalism, a new method of marginalization is emerging: the digital divide.

"Digital divide" describes the fact that the world can be divided into people who do and people who do not have access to (or the education to use) modern information technologies such as the cellular telephone, television and the Internet. The digital divide concerns people all over the world, but, as usual, it is above all people in the countries formerly called the Third World and in rural areas who suffer; the poor and the less educated bear the brunt of it.
More than 80% of all computers with access to the Internet are situated in larger cities.

"The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these `extra' services ... Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century and millennium."
(Izumi Aizu)

for more information see:
http://www.whatis.com/digital_divide.htm

TEXTBLOCK 2/4 // URL: http://world-information.org/wio/infostructure/100437611730/100438659300
 
Biometrics applications: privacy issues

All biometric technologies capture biometric data from individuals. Once these data have been captured by a system, they can, in principle, be forwarded to other locations and put to many different uses, which are capable of compromising an individual's privacy.

Technically it is easy to match biometric data with other personal data stored in government or corporate files, and to come a step closer to the counter-utopia of the transparent citizen and customer whose data body is under outside control.

While biometric technologies are often portrayed as protectors of personal data and safeguards against identity theft, they can thus contribute to an advance in "Big Brother" technology.

The combination of personalised data files with biometric data would amount to an enormous potential for control. While nobody in government or industry would admit to such intentions, leading data systems companies such as EDS (Electronic Data Systems; http://www.eds.com) are also suppliers of biometric systems to government intelligence agencies and to industry.

Biometric technologies have the function of identification. Historically, identification has been a prerequisite for the exercise of power, and it protects only those who are in no conflict with that power. If the digitalisation of the body by biometric technologies becomes as widespread as its proponents hope, a new electronic feudal system could emerge, in which people are reduced to subjects dispossessed of their bodies, even if these, unlike in the previous feudal system, are data bodies. Unlike the gatekeepers of medieval towns, the new gatekeepers wear no uniforms by which they might be identified; biometric technologies are pure masks.

TEXTBLOCK 3/4 // URL: http://world-information.org/wio/infostructure/100437611729/100438658826
 
The 19th Century: First Programmable Computing Devices

Until the 19th century, "early computers", probably better described as calculating machines, were basically mechanical devices operated by hand. Early calculators like the abacus worked with a system of sliding beads arranged on a rack, and the centerpiece of Leibniz's multiplier was a stepped-drum gear design.

Charles Babbage's proposal of the Difference Engine (1822), which would have performed calculations and printed the results automatically (it was never completed), was therefore a major breakthrough, as it for the first time suggested the automation of computation. The Difference Engine, which was to tabulate polynomial functions by the method of finite differences, was inspired by Babbage's idea of applying the ability of machines to the needs of mathematics. Machines, he noted, were best at performing tasks repeatedly without mistakes, while mathematics often required the simple repetition of steps.
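The method of finite differences behind the Difference Engine can be sketched in a few lines of modern code. The idea: once the initial differences of a polynomial are known, every further value is produced by addition alone, exactly the kind of mistake-free repetition Babbage wanted to mechanize. The function names and the sample polynomial below are illustrative, not Babbage's.

```python
def difference_table(values, order):
    """Return the first column [f(x0), delta f(x0), delta^2 f(x0), ...]
    of the difference table, built from the first order+1 tabulated values."""
    row = list(values[: order + 1])
    diffs = [row[0]]
    while len(row) > 1:
        # each pass replaces the row with its successive differences
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def tabulate(diffs, count):
    """Produce `count` successive polynomial values using additions only,
    the way the Difference Engine's columns of wheels would."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # propagate: each difference absorbs the one to its right
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

# Tabulate f(x) = x^2 + x + 41, a polynomial Babbage used in demonstrations
f = lambda x: x * x + x + 41
seed = [f(x) for x in range(3)]              # a degree-2 polynomial needs 3 seed values
diffs = difference_table(seed, order=2)      # [41, 2, 2]
table = tabulate(diffs, 6)                   # [41, 43, 47, 53, 61, 71] == f(0..5)
```

Note that after the seed values are computed, `tabulate` never multiplies: each new value falls out of repeated additions, which is what made a purely mechanical implementation feasible.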

After working on the Difference Engine for ten years, Babbage was inspired to build another machine, which he called the Analytical Engine. Its design was a major step towards the modern computer, as it is considered the first conception of a general-purpose computer. Instrumental to the machine's development was his assistant Augusta Ada King, Countess of Lovelace, who is regarded as the first computer programmer.

The second major breakthrough in the design of computing machines in the 19th century may be attributed to the American inventor Herman Hollerith. He was concerned with finding a faster way to compute the U.S. census, which in 1880 had taken nearly seven years. Hollerith therefore invented a method that used punched cards to store data, which he fed into a machine that compiled the results automatically. The punch cards not only served as a storage medium and helped reduce computational errors, but also significantly increased speed.

Of extraordinary importance for the evolution of digital computers and artificial intelligence have furthermore been the contributions of the English mathematician and logician George Boole. In his postulates concerning the Laws of Thought (1854) he began to theorize about the true/false nature of binary numbers. His principles make up what is today known as Boolean algebra, the body of logic concerning the AND, OR and NOT operations, on which computer switching theory and procedures are grounded. Boole also assumed that the human mind works according to these laws: it performs logical operations that can be expressed symbolically. Ninety years later Boole's principles were applied to circuits, the blueprint for electronic computers, by Claude Shannon.

TEXTBLOCK 4/4 // URL: http://world-information.org/wio/infostructure/100437611663/100438659426
 
Royalties

Royalties refer to the payment made to the owners of certain types of rights by those who are permitted by the owners to exercise the rights. The rights concerned are literary, musical, and artistic copyright and patent rights in inventions and designs (as well as rights in mineral deposits, including oil and natural gas). The term originated from the fact that in Great Britain for centuries gold and silver mines were the property of the crown and such "royal" metals could be mined only if a payment ("royalty") were made to the crown.

INDEXCARD, 1/3
 
Above.net

Headquartered in San Jose, USA, AboveNet Communications is a backbone service provider. Through its extensive peering relationships, the company has built a network with the largest aggregated bandwidth in the world.

http://www.above.net

INDEXCARD, 2/3
 
Enochian alphabet

Also called the "Angelic" language. An archaic alphabet of 21 letters, recorded by John Dee and his partner Edward Kelley. It has its own grammar and syntax, but only a small sample of it has ever been translated into English.

INDEXCARD, 3/3