Fingerprint Identification

Although fingerprinting smacks of police techniques used long before the dawn of the information age, its digital successor, finger scanning, is the most widely used biometric technology. It relies on the fact that a fingerprint's uniqueness can be established by analysing its so-called "minutiae": sweat pores, the distances between ridges, bifurcations, and the like. The likelihood of two individuals having the same fingerprint is estimated at less than one in a billion.
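The matching step can be illustrated with a toy sketch: each minutia is modelled as a position, a ridge orientation and a type, and two prints are compared by counting minutiae that agree within a tolerance. This is a simplified illustration under assumed tolerances; real matchers also perform alignment, ridge counting and quality weighting.

```python
import math

# Toy model: a minutia is (x, y, angle_in_degrees, type), where type is
# e.g. a ridge "ending" or a "bifurcation". Tolerances are illustrative.

def minutiae_match(m1, m2, dist_tol=10.0, angle_tol=15.0):
    """True if two minutiae agree in type, position and orientation."""
    (x1, y1, a1, t1), (x2, y2, a2, t2) = m1, m2
    if t1 != t2:
        return False
    if math.hypot(x1 - x2, y1 - y2) > dist_tol:
        return False
    diff = abs(a1 - a2) % 360          # handle angle wrap-around (0 == 360)
    return min(diff, 360 - diff) <= angle_tol

def match_score(template, candidate):
    """Fraction of template minutiae matched by some candidate minutia."""
    if not template:
        return 0.0
    used, hits = set(), 0
    for m in template:
        for i, c in enumerate(candidate):
            if i not in used and minutiae_match(m, c):
                used.add(i)            # each candidate minutia used once
                hits += 1
                break
    return hits / len(template)

# A stored template and a freshly scanned print (hypothetical data):
template = [(10, 12, 90, "ending"), (40, 55, 180, "bifurcation"),
            (70, 20, 45, "ending")]
candidate = [(12, 11, 95, "ending"), (41, 53, 175, "bifurcation"),
             (200, 200, 0, "ending")]

print(match_score(template, candidate))  # 2 of 3 minutiae match -> ~0.67
```

A real system would threshold this score (e.g. accept above 0.8) and tune the tolerances against the false-accept and false-reject rates.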

As an access control device, fingerprint scanning is particularly popular with military institutions, including the Pentagon, and with military research facilities. Banks are also among the principal users of the technology, and major credit card companies such as Visa and MasterCard are working to incorporate fingerprint recognition into the bank card environment.

Problems of inaccuracy resulting from oily, soiled or cracked skin, a major impediment in fingerprint technology, have recently been tackled by the development of a contactless capturing device (http://www.ddsi-cpc.com) which translates the characteristics of a fingerprint into a digitised image.

As in other biometric technologies, fingerprint recognition is an area where the "criminal justice" market meets the "security" market, yet another indication that civilian spheres are becoming indistinguishable from military ones. The utopia of a prisonless society seems to come within reach of a technology capable of undermining freedom through an upward spiral of identification needs and identification technologies.

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611729/100438658358
 
1960s - 1970s: Increased Research in Artificial Intelligence (AI)

During the Cold War the U.S. tried to ensure that it would stay ahead of the Soviet Union in technological advancement. Therefore in 1963 the Defense Advanced Research Projects Agency (DARPA) granted the Massachusetts Institute of Technology (MIT) US$ 2.2 million for research in machine-aided cognition (artificial intelligence). The grant's major effect was to quicken the pace of AI research and secure its continued funding.

In the 1960s and 1970s a multitude of AI programs were developed, most notably SHRDLU. Headed by Marvin Minsky, MIT's research team showed that, when confined to a small subject matter, computer programs could solve spatial and logical problems. Other advances in the field at the time included David Marr's new theories of machine vision, Marvin Minsky's frame theory, the PROLOG language (1972) and the development of expert systems.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659474
 
Vinton Cerf

Regarded as one of the fathers of the Internet, Vinton Cerf, together with Robert Kahn, developed the TCP/IP protocol suite, to this day the de facto communication standard of the Internet, and also contributed to the development of other important communication standards. The early work on the protocols broke new ground with the realization of a multi-network open architecture.

In 1992 he co-founded the Internet Society, serving as its first President and later as Chairman.

Today, Vinton Cerf is Senior Vice President for Internet Architecture and Technology at WorldCom, one of the world's most important ICT companies.

Vinton Cerf's web site: http://www.wcom.com/about_the_company/cerfs_up/

http://www.isoc.org/
http://www.wcom.com/
INDEXCARD, 1/2
 
Apple

Founded by Steve Jobs and Steve Wozniak and headquartered in Cupertino, California, USA, Apple Computer was the first commercially successful personal computer company.

In 1977 Wozniak designed the Apple II, one of the first commercially successful personal computers. IBM countered its successful market introduction with a personal computer running MS-DOS, the operating system supplied by Microsoft Corporation, and thereby regained market leadership. Although Apple later introduced the first graphical user interface affordable to consumers and so started the desktop publishing revolution, it could not recapture the lead.

For more detailed information see the Encyclopaedia Britannica: http://www.britannica.com/bcom/eb/article/6/0,5716,115726+1+108787,00.html

http://www.apple.com/
INDEXCARD, 2/2