The North against the South?

"Faced with this process of globalization, most governments appear to lack the tools required for facing up to the pressure from important media changes. The new global order is viewed as a daunting challenge, and it most often results in reactions of introversion, withdrawal and narrow assertions of national identity. At the same time, many developing countries seize the opportunity represented by globalization to assert themselves as serious players in the global communications market."
(UNESCO, World Communication Report)

The great hope of the South is that the Internet will close the education and economic gaps by making education easier to obtain. In reality, however, the gap is impossible to close, because the North is not standing still: it keeps developing further and further, inventing new technologies, each of which opens yet another gap. The image of a farmer's boy sitting in the desert, using a cellular telephone and a computer at the same time, is a sarcastic picture - nothing else.

Still, the so-called developing countries regard modern communication technologies as a tremendous opportunity - and indeed: what other choice is left?

TEXTBLOCK 1/5 // URL: http://world-information.org/wio/infostructure/100437611730/100438659376
 
Late 1970s - Present: Fourth Generation Computers

Following the invention of the first integrated circuits, more and more components could be fitted onto a single chip. LSI (Large Scale Integration) was followed by VLSI (Very Large Scale Integration) and ULSI (Ultra-Large Scale Integration), which increased the number of components squeezed onto one chip into the millions and helped diminish both the size and the price of computers. The new chips took the idea of the integrated circuit one step further, as they made it possible to manufacture a single microprocessor which could then be programmed to meet any number of demands.

Following the introduction of the minicomputer in the mid 1970s, a market for personal computers (PCs) was established by the early 1980s. As computers had become easier to use and cheaper, they were no longer utilized mainly in offices and manufacturing, but also by the average consumer. Consequently, the number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were in use.

Further developments included the creation of mobile computers (laptops and palmtops) and especially networking technology. While mainframes shared time with many terminals for many applications, networking allowed individual computers to form electronic co-operations. LANs (Local Area Networks) permitted computers to share memory space, information and software, and to communicate with each other. Although LANs could already reach enormous proportions, it was the invention of the Internet that established an information and communication network on a global basis for the first time.

TEXTBLOCK 2/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659451
 
1980s: Artificial Intelligence (AI) - From Lab to Life

Following the commercial success of expert systems, which started in the 1970s, other AI technologies also began to make their way into the marketplace. In 1986, U.S. sales of AI-related hardware and software rose to U.S.$ 425 million. Expert systems in particular, because of their efficiency, were still in demand. Yet other fields of AI also turned out to be successful in the corporate world.

Machine vision systems, for example, were used for the cameras and computers on assembly lines to perform quality control. By 1985 over a hundred companies offered machine vision systems in the U.S., and sales totaled U.S.$ 80 million. Although there was a breakdown in the market for AI systems in 1986-1987, which led to a cutback in funding, the industry slowly recovered.

New technologies were being invented in Japan. Fuzzy logic, pioneered in the U.S., as well as neural networks were being reconsidered as means of achieving artificial intelligence. Probably the most important development of the 1980s was the demonstration that AI technology had real-life uses. AI applications like voice and character recognition systems, or camcorders steadied using fuzzy logic, were made available not only to business and industry, but also to the average customer.

TEXTBLOCK 3/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659445
 
fingerprint identification

Although fingerprinting smacks of police techniques used long before the dawn of the information age, its digital successor, finger scanning, is the most widely used biometric technology. It relies on the fact that a fingerprint's uniqueness can be defined by analysing the so-called "minutiae" in somebody's fingerprint. Minutiae include sweat pores, distances between ridges, bifurcations, etc. It is estimated that the likelihood of two individuals having the same fingerprint is less than one in a billion.
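The matching idea described above can be illustrated with a deliberately minimal sketch: minutiae are reduced to a coordinate pair plus a type label, and two prints are compared by counting minutiae that nearly coincide. This is a hypothetical toy illustration, not any vendor's actual algorithm; real systems also compare ridge angles and compensate for rotation and distortion.

```python
from math import hypot

def match_minutiae(probe, candidate, tolerance=5.0):
    """Count probe minutiae that have a same-type candidate minutia
    within `tolerance` pixels (a toy nearest-neighbour comparison)."""
    matches = 0
    for (x1, y1, t1) in probe:
        if any(t1 == t2 and hypot(x1 - x2, y1 - y2) <= tolerance
               for (x2, y2, t2) in candidate):
            matches += 1
    return matches

# Two hypothetical prints: one minutia aligns, one does not.
probe = [(10, 12, "bifurcation"), (40, 55, "ridge_ending")]
candidate = [(11, 13, "bifurcation"), (90, 90, "ridge_ending")]
print(match_minutiae(probe, candidate))  # 1
```

A real matcher would turn such a count into a similarity score and accept or reject against a threshold chosen to balance false accepts against false rejects.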

As an access control device, fingerprint scanning is particularly popular with military institutions, including the Pentagon, and military research facilities. Banks are also among the principal users of this technology, and major credit card companies such as Visa and MasterCard are making efforts to incorporate fingerprint recognition into the bank card environment.

Problems of inaccuracy resulting from oily, soiled or cracked skin, a major impediment in fingerprint technology, have recently been tackled by the development of a contactless capturing device (http://www.ddsi-cpc.com) which translates the characteristics of a fingerprint into a digitised image.

As in other biometric technologies, fingerprint recognition is an area where the "criminal justice" market meets the "security" market, yet another indication that civilian spheres are becoming indistinguishable from the military. The utopia of a prisonless society seems to come within the reach of a technology capable of undermining freedom through an upward spiral driven by identification needs and identification technologies.

TEXTBLOCK 4/5 // URL: http://world-information.org/wio/infostructure/100437611729/100438658358
 
Further Tools: Photography

Art has always contributed a great deal to disinformation, and many modern tools of disinformation are used in art and photography.
Harold D. Lasswell once stated that propaganda was cheaper than violence. Today this is no longer true. Technology has created new tools for propaganda and disinformation - and they are expensive. But by now the possibilities for manipulating pictures and stories have gone so far that it can be difficult to tell the difference between an original and a manipulation.

Trillions of photographs were taken in the 20th century - too many to look at, too many to control, along with their use. A paradise for manipulation.
We have to keep in mind: there is the world, and there are pictures of the world, and the two are not the same thing. Photographs are not objective, because the photographer selects the part of the world that becomes the picture. The rest is left out.

Some of the tools for manipulating photographs are:



Some of these are digital means of manipulation, which make it possible to change pictures in many ways without the manipulation showing.

Pictures taken from the Internet could be anything and come from anywhere; proving their source is nearly impossible. Therefore scientists have developed watermarks for pictures, which are intended to make it impossible to "steal" or manipulate a picture taken from the net.
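The basic principle behind digital watermarking can be sketched with a toy example that hides a bit pattern in the least significant bit of each pixel value. This is a hypothetical minimal illustration only: such a naive scheme is fragile and easily destroyed, whereas the robust watermarks described above are designed to survive manipulation.

```python
def embed_watermark(pixels, bits):
    """Hide watermark bits in the least significant bit of each pixel.
    `pixels` is a list of 8-bit grayscale values; `bits` is a list of 0/1."""
    assert len(bits) <= len(pixels)
    marked = list(pixels)
    for i, b in enumerate(bits):
        marked[i] = (marked[i] & ~1) | b  # clear LSB, then set it to the bit
    return marked

def extract_watermark(pixels, n):
    """Read back the first n hidden bits."""
    return [p & 1 for p in pixels[:n]]

pixels = [200, 201, 130, 97]            # hypothetical grayscale values
marked = embed_watermark(pixels, [1, 0, 1, 1])
print(extract_watermark(marked, 4))      # [1, 0, 1, 1]
```

Because only the lowest bit changes, each pixel value shifts by at most 1 and the mark is invisible to the eye; the price is that any recompression or edit wipes it out, which is why practical watermarks embed redundantly across the whole image instead.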

TEXTBLOCK 5/5 // URL: http://world-information.org/wio/infostructure/100437611661/100438658730
 
Neural network

A bottom-up artificial intelligence approach, a neural network is a network of many very simple processors ("units" or "neurons"), each possibly having a (small amount of) local memory. The units are connected by unidirectional communication channels ("connections"), which carry numeric data. The units operate only on their local data and on the inputs they receive via the connections. A neural network is a processing device, either an algorithm or actual hardware, whose design was inspired by the structure and functioning of animal brains and components thereof. Most neural networks have some sort of "training" rule whereby the weights of connections are adjusted on the basis of presented patterns. In other words, neural networks "learn" from examples and exhibit some structural capability for generalization.
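The "training rule" mentioned above can be made concrete with the simplest possible case: a single unit (a perceptron) whose connection weights are nudged after every presented pattern until its outputs match the targets. This is a hedged, minimal sketch of the general idea, not a modern multi-layer network; the sample task (learning logical AND) is chosen purely for illustration.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Adjust two connection weights and a bias from presented patterns:
    weight += learning_rate * error * input (the classic perceptron rule)."""
    w = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
            err = target - out                 # 0 when the unit is right
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            bias += lr * err
    return w, bias

# Learn logical AND from four example patterns.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in samples])  # [0, 0, 0, 1]
```

After a few passes over the examples the weights settle and the unit generalizes the rule it was shown - a miniature of the "learning from examples" the card describes.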

INDEXCARD, 1/8
 
Fuzzy logic

A superset of Boolean logic (George Boole), introduced by Lotfi Zadeh in the 1960s as a means to model the uncertainty of natural language. Fuzzy logic is a type of logic that recognizes more than simple true and false values. It represents a departure from classical two-valued sets and logic, using "soft" linguistic system variables (e.g. large, small, hot, cold, warm) and a continuous range of truth values in the interval [0,1] rather than strict binary (true or false) decisions and assignments.
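The continuous truth values in [0,1] can be combined with Zadeh's standard operators, where AND becomes minimum, OR becomes maximum and NOT becomes complement. A short sketch, using hypothetical degrees for "warm" and "large":

```python
def fuzzy_and(a, b):
    """Zadeh's AND: the minimum of the two truth degrees."""
    return min(a, b)

def fuzzy_or(a, b):
    """Zadeh's OR: the maximum of the two truth degrees."""
    return max(a, b)

def fuzzy_not(a):
    """Complement: a statement true to degree a is false to degree 1 - a."""
    return 1.0 - a

# "The room is warm" to degree 0.7, "the room is large" to degree 0.4.
warm, large = 0.7, 0.4
print(fuzzy_and(warm, large))          # 0.4  ("warm and large")
print(fuzzy_or(warm, large))           # 0.7  ("warm or large")
print(round(fuzzy_not(warm), 2))       # 0.3  ("not warm")
```

Note that restricting the inputs to 0 and 1 recovers ordinary Boolean logic exactly, which is why fuzzy logic is described as a superset of it.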

INDEXCARD, 2/8
 
First Amendment Handbook

The First Amendment to the US Constitution, though short, lists a number of rights. Only a handful of words refer to freedoms of speech and the press, but those words are of incalculable significance. To understand the current subtleties and controversies surrounding this right, check out this First Amendment site. This detailed handbook of legal information, mostly intended for journalists, should be of interest to anyone who reads or writes. For example, the chapter Invasion of Privacy shows the limits of First Amendment rights, and the balance between the rights of the individual and the rights of the public - or, more crudely, the balance of Tabloid vs. Celebrity. Each section is carefully annotated with relevant legal decisions.

http://www.rcfp.org/handbook/viewpage.cgi

INDEXCARD, 3/8
 
Assembly line

An assembly line is an industrial arrangement of machines, equipment, and workers for continuous flow of workpieces in mass production operations. An assembly line is designed by determining the sequences of operations for manufacture of each product component as well as the final product. Each movement of material is made as simple and short as possible with no cross flow or backtracking. Work assignments, numbers of machines, and production rates are programmed so that all operations performed along the line are compatible.

INDEXCARD, 4/8
 
NATO

The North Atlantic Treaty was signed in Washington on 4 April 1949, creating NATO (the North Atlantic Treaty Organization). It was an alliance of 12 independent nations, originally committed to each other's defense. Between 1952 and 1982 four more members were welcomed, and in 1999 the first former members of COMECON (the Czech Republic, Hungary and Poland) joined NATO, bringing the total to 19 members. Around its 50th anniversary NATO changed its goals and tasks by intervening in the Kosovo crisis.

INDEXCARD, 5/8
 
Joseph Stalin

Joseph Stalin (1879-1953):
After Lenin's death he took over and became a dictator with unlimited power. Everyone who dared to speak or act against him, or was suspected of doing so, was killed. Millions were murdered. His empire was one built on propaganda and fear. As long as he was in power his picture had to hang in every flat and office. Soon after his death the cult was stopped, and in 1956 de-Stalinization began, though he was partly rehabilitated in 1970.

INDEXCARD, 6/8
 
News Corporation

The News Corporation Ltd., a global media holding company governing News Limited (Australia), News International (U.K.), and News America Holdings Inc. (U.S.), was founded by the Australian-born newspaper publisher and media entrepreneur Rupert Murdoch. Murdoch's corporate interests center on newspaper, magazine, book, and electronic publishing; television broadcasting; and film and video production, principally in the United States, the United Kingdom, and Australia.

INDEXCARD, 7/8
 
Gateway

A gateway is a computer supplying point-to-multipoint connections between computer networks.

INDEXCARD, 8/8