The 19th Century: First Programmable Computing Devices

Until the 19th century, "early computers", probably better described as calculating machines, were basically mechanical devices operated by hand. Early calculators like the abacus worked with a system of sliding beads arranged on a rack, and the centerpiece of Leibniz's multiplier was a stepped-drum gear design.

Charles Babbage's proposal of the Difference Engine (1822), which was never completed but was to perform calculations and print the results automatically, was therefore a major breakthrough, as it suggested for the first time the automation of computation. The Difference Engine was designed to tabulate polynomial functions using the method of finite differences, and its construction was inspired by Babbage's idea of applying the abilities of machines to the needs of mathematics. Machines, he noted, were best at performing tasks repeatedly without mistakes, while mathematics often required the simple repetition of steps.
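
The method of finite differences lends itself to a brief illustration. The following Python sketch (a modern rendering, not Babbage's mechanism) shows the key insight: once the first value of a polynomial and its differences are set up, every further value can be produced by additions alone, exactly the kind of mindless repetition a machine excels at.

```python
def difference_engine(diffs, steps):
    """Tabulate a polynomial by repeated addition.

    diffs -- initial column values [f(0), Δf(0), Δ²f(0), ...];
    for a polynomial of degree n the n-th difference is constant,
    so no multiplication is ever needed.
    """
    diffs = list(diffs)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # each column absorbs the column to its right -- additions only
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: f(x) = x**2 + x + 41, so f(0) = 41, Δf(0) = 2, Δ²f = 2 (constant)
print(difference_engine([41, 2, 2], 6))  # [41, 43, 47, 53, 61, 71]
```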

After working on the Difference Engine for ten years, Babbage was inspired to build another machine, which he called the Analytical Engine. Its design was a major step towards modern computers, as it is regarded as the first conception of a general-purpose computer. Instrumental to the machine's design was his assistant, Augusta Ada King, Countess of Lovelace, who is often considered the first computer programmer.

The second major breakthrough in the design of computing machines in the 19th century may be attributed to the American inventor Herman Hollerith. He was concerned with finding a faster way to tabulate the U.S. census, which in 1880 had taken nearly seven years. Hollerith therefore invented a method that used punched cards to store data, which he fed into a machine that compiled the results automatically. The punched cards not only served as a storage medium and helped reduce computational errors, but also significantly increased processing speed.
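
To make the principle concrete, here is a toy Python sketch of card-based tabulation; the field names are invented for illustration and do not reflect Hollerith's actual card layout. Each record is a set of punched positions, and tabulation reduces to counting cards that share a punch.

```python
from collections import Counter

# Each "card" encodes one census record as a set of punched positions.
cards = [
    {"male", "age_20_29", "single"},
    {"female", "age_30_39", "married"},
    {"male", "age_30_39", "married"},
]

# The tabulator's job: count how many cards carry a given punch.
tally = Counter(punch for card in cards for punch in card)
print(tally["married"])  # -> 2, obtained by counting rather than hand calculation
```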

Of extraordinary importance for the evolution of digital computers and artificial intelligence have furthermore been the contributions of the English mathematician and logician George Boole. In his An Investigation of the Laws of Thought (1854) he began to theorize about the true/false nature of binary logic. His principles make up what is today known as Boolean algebra, the system of logic built on the AND, OR and NOT operators, on which computer switching theory and procedures are grounded. Boole also assumed that the human mind works according to these laws: it performs logical operations that can be expressed algebraically. More than eighty years later, Claude Shannon applied Boole's principles to electrical circuits, providing the blueprint for electronic computers.
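
A minimal Python sketch (modern notation, not Boole's own) can make these operations concrete: it enumerates the truth tables of AND, OR and NOT and checks one of the algebra's laws, De Morgan's law, over every truth assignment.

```python
# Boolean algebra's three basic operators over {False, True}.
AND = lambda a, b: a and b
OR  = lambda a, b: a or b
NOT = lambda a: not a

for a in (False, True):
    for b in (False, True):
        # De Morgan's law: NOT(a AND b) == (NOT a) OR (NOT b)
        assert NOT(AND(a, b)) == OR(NOT(a), NOT(b))
        print(f"a={a!s:5} b={b!s:5}  AND={AND(a, b)!s:5}  OR={OR(a, b)!s:5}")
```

It is exactly this reduction of logic to two-valued operations that Shannon later mapped onto relay and switching circuits.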

TEXTBLOCK 1/2 // URL: http://world-information.org/wio/infostructure/100437611663/100438659426
 
Centralization of the Content Industry

Since the 1980s a sweeping restructuring of commercial media power has taken place. While some firms have grown through internal expansion, others have extended their reach through mergers and acquisitions. Examples include Time & Warner & Turner & AOL; Viacom & Paramount & Blockbuster; and News Corp. & Triangle & 20th Century Fox & Metromedia TV.

In recent years these developments have led to the rise of transnational media giants, resulting in the domination of the global media system by about ten huge conglomerates. These have interests in numerous media industries, ranging from film production, magazines, newspapers, book publishing and recorded music to TV and radio channels and networks, and also extend to retail stores, amusement parks and digital media products.

Behind these firms stand about three or four dozen smaller media companies, which primarily operate in local, national or niche markets. In short, the overwhelming majority of the world's content production facilities and distribution channels lies in the hands of approximately fifty enterprises.

TEXTBLOCK 2/2 // URL: http://world-information.org/wio/infostructure/100437611795/100438659096