Gait recognition

The fact that an individual's identity is expressed not only by the way he/she looks or sounds, but also by the manner of walking, is a relatively new discovery in biometrics.

Unlike the more fully developed biometric technologies whose scrutiny is directed at stationary parts of the body, gait recognition has the added difficulty of having to sample and identify movement. Scientists at the University of Southampton, UK (http://www.isis.ecs.soton.ac.uk/research/gait/) have developed a model which likens the movement of the legs to that of a pendulum and uses hip inclination as a variable.

Another model considers the shape and length of legs as well as the velocity of joint movements. The objective is to combine both models into one, which would make gait recognition a fully applicable biometric technology.
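As a rough illustration of how such models reduce walking to a comparable signature, the following Python sketch (not the Southampton implementation; the function names, parameters and the distance measure are assumptions made for this example) treats the swing of a leg as a pendulum and summarizes a series of hip-inclination angles by its dominant swing frequency and amplitude:

```python
# Hypothetical pendulum-style gait signature: cadence and swing range
# estimated from sampled hip-inclination angles. Purely illustrative.
import numpy as np

def pendulum_signature(hip_angles_deg, fps=25.0):
    """Estimate swing frequency (Hz) and swing range (degrees) from hip-inclination samples."""
    angles = np.asarray(hip_angles_deg, dtype=float)
    angles = angles - angles.mean()                  # remove the standing-posture offset
    spectrum = np.abs(np.fft.rfft(angles))
    freqs = np.fft.rfftfreq(angles.size, d=1.0 / fps)
    peak = int(np.argmax(spectrum[1:])) + 1          # skip the DC component
    return freqs[peak], float(angles.max() - angles.min())

def gait_distance(sig_a, sig_b):
    """Compare two signatures; a smaller distance suggests the same walker."""
    return float(np.linalg.norm(np.asarray(sig_a) - np.asarray(sig_b)))

# Synthetic walker: thigh swinging at about 0.9 Hz over a ~30-degree range, plus noise.
t = np.arange(0, 10, 1 / 25.0)
observed = 15 * np.cos(2 * np.pi * 0.9 * t) + np.random.normal(0, 1, t.size)
print(pendulum_signature(observed))
```

Features from the second model, such as leg length and joint velocities, could in principle be appended to the same signature vector before distances are compared, which is the spirit of combining both models into one.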

Given that gait recognition is applied to "moving preambulatory subjects", it is a particularly interesting technology for surveillance. People can no longer hide their identity by covering themselves or by staying in motion. Female shoplifters who feign pregnancy will be detected because they walk differently from those who are really pregnant. Potential wrongdoers might resort to walking techniques such as those developed in Monty Python's legendary "Ministry of Silly Walks" (http://www.stone-dead.asn.au/sketches/sillwalk.htm).

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611729/100438658388
 
1950: The Turing Test

Alan Turing, an English mathematician and logician, advocated the theory that computers would eventually be capable of human thought. To cut through the long philosophical debate about how exactly to define thinking, he proposed the "imitation game" (1950), now known as the Turing test. The test consisted of a person posing questions via keyboard to both another person and an intelligent machine within a fixed time frame. After a series of trials, the computer's success at "thinking" could be measured by its probability of being misidentified as the human subject. Turing's papers on the subject are still widely acknowledged as the foundation of research in artificial intelligence.
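A minimal sketch of that scoring idea, under the assumption that each trial simply records whether the judge mistook the machine for the human (the trial outcomes below are invented):

```python
def misidentification_rate(judgements):
    """Fraction of trials in which the judge picked the machine as the human subject."""
    return sum(judgements) / len(judgements)

# Hypothetical outcomes of five imitation-game trials.
trials = [True, False, False, True, False]
print(f"The machine passed for human in {misidentification_rate(trials):.0%} of trials")
```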

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659354
 
Fuzzy logic

A superset of Boolean logic (George Boole) introduced by Lotfi Zadeh in the 1960s as a means to model the uncertainty of natural language. Fuzzy logic is a type of logic that recognizes more than simple true and false values. It represents a departure from classical two-valued sets and logic, using "soft" linguistic system variables (e.g. large, small, hot, cold, warm) and a continuous range of truth values in the interval [0,1] rather than strict binary (true or false) decisions and assignments.
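To make the index card concrete, here is a small Python sketch of the idea: linguistic terms such as "hot" become membership functions returning values in [0,1], and Zadeh's min/max/complement operators generalize Boolean AND/OR/NOT (the temperature and humidity ranges below are arbitrary examples, not part of the original text):

```python
def hot(temp_c):
    """Degree to which a temperature counts as 'hot' (ramps from 0 at 20°C to 1 at 35°C)."""
    return min(1.0, max(0.0, (temp_c - 20.0) / 15.0))

def humid(rel_humidity):
    """Degree to which the air counts as 'humid' (ramps from 0 at 40% to 1 at 80%)."""
    return min(1.0, max(0.0, (rel_humidity - 40.0) / 40.0))

def fuzzy_and(a, b):   # Zadeh conjunction
    return min(a, b)

def fuzzy_or(a, b):    # Zadeh disjunction
    return max(a, b)

def fuzzy_not(a):      # complement
    return 1.0 - a

# "Uncomfortable weather" as a fuzzy combination of hot AND humid.
temp, rh = 29.0, 70.0
print(f"hot: {hot(temp):.2f}, humid: {humid(rh):.2f}, "
      f"uncomfortable: {fuzzy_and(hot(temp), humid(rh)):.2f}")
```

Restricting the truth values to 0 and 1 makes these operators collapse back into ordinary Boolean AND, OR and NOT, which is the sense in which fuzzy logic is a superset of Boolean logic.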

INDEXCARD, 1/2
 
Writing

Writing and calculating came into being at about the same time. The first pictographs carved into clay tablets were used for administrative purposes. As an instrument for the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to a few. Having long been more or less separate tasks, writing and calculating converge in today's computers.

"Letters are invented so that we might be able to converse even with the absent," says Saint Augustine. The invention of writing made it possible to transmit and store information. The ear no longer predominates; face-to-face communication becomes more and more obsolete for administration and bureaucracy. Standardization and centralization become the constituents of high culture and of vast empires such as Sumer and China.

INDEXCARD, 2/2