Iris recognition

Iris recognition relies upon the fact that every individual's iris has a unique structure. The iris landscape is composed of a corona, crypts, filaments, freckles, pits, radial furrows and striations. Iris scanning is considered a particularly accurate identification technology because the characteristics of the iris do not change during a person's lifetime, and because there are several hundred variables in an iris which can be measured. In addition, iris scanning is fast: it does not take longer than one or two seconds.
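A common approach, sketched below in Python, is to encode the measured iris texture as a binary "iris code" and to compare two codes by their normalized Hamming distance, i.e. the fraction of bits in which they differ. The encoding step is assumed to have happened already, and the code length and decision threshold used here are illustrative assumptions rather than the parameters of any particular system.

# Minimal sketch: comparing two binary "iris codes" by normalized Hamming
# distance. The encoding step (turning the iris image into a bit pattern)
# is assumed to have happened already; code size and threshold are toy values.

def hamming_distance(code_a, code_b):
    """Fraction of bits that differ between two equal-length bit lists."""
    assert len(code_a) == len(code_b)
    differing = sum(1 for a, b in zip(code_a, code_b) if a != b)
    return differing / len(code_a)

def same_person(code_a, code_b, threshold=0.32):
    # Codes from the same iris tend to differ in far fewer bits than codes
    # from different irises; 0.32 is an illustrative threshold, not a standard.
    return hamming_distance(code_a, code_b) < threshold

if __name__ == "__main__":
    enrolled = [1, 0, 1, 1, 0, 0, 1, 0] * 32   # stored template (toy size)
    probe    = [1, 0, 1, 1, 0, 1, 1, 0] * 32   # new scan, slightly noisy
    print(hamming_distance(enrolled, probe))   # -> 0.125
    print(same_person(enrolled, probe))        # -> True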

These characteristics have made iris scanning an attractive technology for high-security applications such as prison surveillance. Iris technology is also used for online identification, where it can substitute for identification by password. As with other biometric technologies, the use of iris scanning for the protection of privacy is a two-edged sword: it may help prevent identity theft, but this protection applies horizontally rather than vertically, since the data retrieval that accompanies identification, and the data body which is created in the process, have nothing to do with identity theft.

TEXTBLOCK 1/3 // URL: http://world-information.org/wio/infostructure/100437611729/100438658334
 
1940s - Early 1950s: First Generation Computers

Probably the most important contributor to the theoretical basis for the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he created the Turing machine, which was originally conceived as a mathematical tool that could infallibly recognize undecidable propositions. Although he instead proved that there cannot exist any universal method of determination, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
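The Python sketch below is a minimal simulation of such an idealized machine: a tape, a read/write head, a finite set of states and a transition table. The particular machine (one that inverts a string of 0s and 1s and then halts) is an invented toy example, not anything Turing himself described.

# Minimal Turing machine simulator: a tape, a read/write head, a finite set of
# states, and a transition table. The example machine simply inverts a string
# of 0s and 1s and then halts; the table itself is an illustrative toy.

def run_turing_machine(tape, transitions, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "halt":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        new_symbol, move, state = transitions[(state, symbol)]
        if 0 <= head < len(tape):
            tape[head] = new_symbol
        else:
            tape.append(new_symbol)   # head never moves left of the tape here
        head += 1 if move == "R" else -1
    return "".join(tape)

# (state, symbol read) -> (symbol to write, head move, next state)
INVERT = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("100110", INVERT))  # -> 011001_ (trailing blank marks the tape end)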

The onset of the Second World War led to increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.

By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced a large electromechanical calculator whose purpose was to create ballistic charts for the U.S. Navy.

Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania beginning in 1943. Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was such a massive piece of machinery (floor space: 1,000 square feet) that it consumed 160 kilowatts of electrical power, enough energy to dim the lights in an entire section of a large city.

Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. In 1945 he described the design of the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of this development was the UNIVAC I (Universal Automatic Computer, 1951); both the U.S. Census Bureau and General Electric owned UNIVACs.
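As a rough illustration of the stored-program idea, the following Python sketch keeps instructions and data in one shared memory and runs them through a single fetch-decode-execute loop. The tiny instruction set is invented for illustration and does not correspond to EDVAC's or UNIVAC's actual machine language.

# Toy stored-program machine: instructions and data share one memory list,
# and the "CPU" is a single fetch-decode-execute loop. The four-instruction
# set is an invented illustration, not any historical machine's.

def run(memory):
    acc = 0          # accumulator register
    pc = 0           # program counter
    while True:
        op, arg = memory[pc]        # fetch
        pc += 1
        if op == "LOAD":            # decode + execute
            acc = memory[arg][1]    # read a data cell
        elif op == "ADD":
            acc += memory[arg][1]
        elif op == "STORE":
            memory[arg] = ("DATA", acc)
        elif op == "HALT":
            return acc

# Program and data in the same memory: cells 0-3 hold code, 4-6 hold data.
memory = [
    ("LOAD", 4),     # acc <- memory[4]
    ("ADD", 5),      # acc <- acc + memory[5]
    ("STORE", 6),    # memory[6] <- acc
    ("HALT", 0),
    ("DATA", 2),
    ("DATA", 3),
    ("DATA", 0),
]
print(run(memory))   # -> 5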

Characteristic of first-generation computers was the fact that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. This made computers difficult to program and limited their versatility and speed. Another feature of early computers was that they used vacuum tubes for circuitry and magnetic drums for storage.

TEXTBLOCK 2/3 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
 
Gait recognition

The fact that an individual's identity is expressed not only by the way he/she looks or sounds, but also by his/her manner of walking, is a relatively recent discovery in biometrics.

Unlike the more fully developed biometric technologies, whose scrutiny is directed at stationary parts of the body, gait recognition has the added difficulty of having to sample and identify movement. Scientists at the University of Southampton, UK (http://www.isis.ecs.soton.ac.uk/research/gait/) have developed a model which likens the movement of the legs to that of a pendulum and uses hip inclination as a variable.

Another model considers the shape and length of legs as well as the velocity of joint movements. The objective is to combine both models into one, which would make gait recognition a fully applicable biometric technology.
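The following Python sketch is only a toy illustration of this kind of approach, not the Southampton model itself: it treats a sequence of hip-inclination angles as a roughly periodic signal, extracts a dominant frequency and swing amplitude, combines them with a static leg-length value, and compares two walkers by the distance between their feature vectors. All numbers and feature choices are invented for illustration.

# Toy gait-feature illustration (not the Southampton model): treat a sequence
# of hip-inclination angles as a roughly periodic signal, extract its dominant
# frequency and swing amplitude, combine them with a static leg-length value,
# and compare walkers by the distance between their feature vectors.

import math

def gait_features(hip_angles_deg, frames_per_second, leg_length_m):
    n = len(hip_angles_deg)
    mean = sum(hip_angles_deg) / n
    swing = max(hip_angles_deg) - min(hip_angles_deg)      # swing amplitude
    # crude dominant-frequency estimate: count zero crossings of the centred signal
    centred = [a - mean for a in hip_angles_deg]
    crossings = sum(1 for i in range(1, n) if centred[i - 1] * centred[i] < 0)
    frequency_hz = crossings * frames_per_second / (2.0 * n)
    return (frequency_hz, swing, leg_length_m)

def distance(f1, f2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(f1, f2)))

# Simulated hip-angle sequences (degrees) for two walkers, 30 frames per second.
walker_a = [10 * math.sin(2 * math.pi * 0.9 * t / 30) for t in range(90)]
walker_b = [14 * math.sin(2 * math.pi * 1.2 * t / 30) for t in range(90)]

fa = gait_features(walker_a, 30, 0.92)
fb = gait_features(walker_b, 30, 0.81)
print(distance(fa, fb))   # smaller distances suggest a more similar gait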

Given that gait recognition is applied to "moving perambulatory subjects", it is a particularly interesting technology for surveillance. People can no longer hide their identity by covering themselves or by staying in motion. Female shoplifters who feign pregnancy can be detected because they walk differently from women who really are pregnant. Potential wrongdoers might resort to walking techniques such as those developed in Monty Python's legendary "Ministry of Silly Walks" (http://www.stone-dead.asn.au/sketches/sillwalk.htm).

TEXTBLOCK 3/3 // URL: http://world-information.org/wio/infostructure/100437611729/100438658388
 
MIRALab

MIRALab is a research laboratory attached to the University of Geneva. Its motto is "where research meets creativity". MIRALab's objective is to model human functionalities, such as movement or facial expression, in a realistic way.

INDEXCARD, 1/3
 
Nadia Thalmann

Nadia Thalmann is the director of MIRALab at the University of Geneva, Switzerland. Thalmann has become known as the creator of "virtual Marilyn", an installation which allowed visitors literally to slip into Marilyn's shoes. Thalmann's work is located at the interface between science and art. It is about modelling human bodies for scientific and creative purposes, e.g. as virtual actors in movies. Thalmann insists that artificial beings must be beautiful, in addition to being useful, as we will be living with them at close quarters.

INDEXCARD, 2/3
 
Charles Babbage

b. December 26, 1791, London, England
d. October 18, 1871, London, England

English mathematician and inventor who is credited with having conceived the first automatic digital computer. The idea of mechanically calculating mathematical tables first came to Babbage in 1812 or 1813. Later he made a small calculator that could perform certain mathematical computations to eight decimals. During the mid-1830s Babbage developed plans for the so-called analytical engine, the forerunner of the modern digital computer. In this device he envisioned the capability of performing any arithmetical operation on the basis of instructions from punched cards, a memory unit in which to store numbers, sequential control, and most of the other basic elements of the present-day computer.

INDEXCARD, 3/3