It is always the others

Disinformation is supposed to be something evil, something ethically wrong. We therefore prefer to connect it to the past, or to political systems other than those of the Western hemisphere. It is always the others who work with disinformation. The same is true of propaganda.
Even better if we can refer it to the past: Adolf Hitler, supposedly one of the world's greatest and most horrible propagandists (together with his Reichsminister für Propaganda, Joseph Goebbels), did not invent modern propaganda either. He took his knowledge from the British example of World War I, the invention of modern propaganda. And it was in Hitler's Reich that (racist) propaganda and disinformation were developed into a perfect tool of manipulation, with consequences that are still at work today.
A war loses the support of the people once it is being lost. It is therefore extremely important to create a feeling that the war is being won, and never to give up the emotion of victory. Governments know this and work hard at keeping the mood up. The Germans worked very hard at exactly that in the last months of World War II.
But in the 1990s the disinformation and propaganda business came back to life (if it had ever gone out of sight) with Iraq's invasion of Kuwait and the reactions of the democratic states. After the war, reports made visible that little had happened the way we had been told it happened. Seen in this light, the Gulf War was the end of the New World Order, a better and geographically broader democratic order that had only pretended to begin.

TEXTBLOCK 1/3 // URL: http://world-information.org/wio/infostructure/100437611661/100438658640
 
Late 1970s - Present: Fourth Generation Computers

Following the invention of the first integrated circuits, ever more components could be fitted onto a single chip. LSI (Large Scale Integration) was followed by VLSI (Very Large Scale Integration) and ULSI (Ultra-Large Scale Integration), which increased the number of components squeezed onto one chip into the millions and helped reduce both the size and the price of computers. The new chips took the idea of the integrated circuit one step further, as they made it possible to manufacture a single microprocessor that could then be programmed to meet any number of demands.

Also, following the introduction of the minicomputer in the mid-1970s, a market for personal computers (PCs) was established by the early 1980s. As computers had become easier to use and cheaper, they were no longer utilized mainly in offices and manufacturing, but also by the average consumer. The number of personal computers in use therefore more than doubled, from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use.

Further developments included the creation of mobile computers (laptops and palmtops) and, especially, networking technology. While mainframes shared time among many terminals for many applications, networking allowed individual computers to form electronic co-operations. LANs (Local Area Networks) permitted computers to share memory space, information and software, and to communicate with each other. Although LANs could already reach enormous proportions, it was only with the invention of the Internet that an information and communication network on a global basis was established for the first time.

TEXTBLOCK 2/3 // URL: http://world-information.org/wio/infostructure/100437611663/100438659451
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the Cold War the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancement. Therefore, in 1963 the Defense Advanced Research Projects Agency (DARPA) granted the Massachusetts Institute of Technology (MIT) US$ 2.2 million for research in machine-aided cognition (artificial intelligence). The major effect of the grant was an increase in the pace of AI research and a continuation of funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, MIT's research team showed that, when confined to a small subject matter, computer programs could solve spatial and logical problems. Other advances in the field of AI at the time were the proposal of new theories of machine vision by David Marr, Marvin Minsky's frame theory, the PROLOG language (1972) and the development of expert systems.

TEXTBLOCK 3/3 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
Avatar

Traditionally, an avatar is a mythical figure, half man and half god. In Hindu mythology, avatars are the forms that deities assume when they descend to earth. Greek and Roman mythologies also contain avatars in animal form or half animal, half man. In virtual space, the word avatar refers to a "virtual identity" that a user can construct for him- or herself, e.g. in a chat room. Avatars have also been a preferred object of media art.

INDEXCARD, 1/3
 
Nadia Thalmann

Nadia Thalmann is director of MIRALab at the University of Geneva, Switzerland. Thalmann has become known as the creator of "virtual Marilyn", an installation which allowed visitors literally to slip into Marilyn's shoes. Thalmann's work is located at the interface between science and art. It is about modelling human bodies for scientific and creative purposes, e.g. as virtual actors in movies. Thalmann insists that artificial beings must be beautiful, in addition to being useful, as we will be living with them at close quarters.

INDEXCARD, 2/3
 
Charles Babbage

b. December 26, 1791, London, England
d. October 18, 1871, London, England

English mathematician and inventor who is credited with having conceived the first automatic digital computer. The idea of mechanically calculating mathematical tables first came to Babbage in 1812 or 1813. Later he made a small calculator that could perform certain mathematical computations to eight decimals. During the mid-1830s Babbage developed plans for the so-called Analytical Engine, the forerunner of the modern digital computer. In this device he envisioned the capability of performing any arithmetical operation on the basis of instructions from punched cards, a memory unit in which to store numbers, sequential control, and most of the other basic elements of the present-day computer.

INDEXCARD, 3/3