Timeline 1900-1970 AD

1913 the wheel cipher is re-invented in the form of a strip
1917 Gilbert S. Vernam, an AT&T employee, invents a polyalphabetic cipher machine that works with random keys
1918 the Germans start using the ADFGVX system, which is later broken by the French cryptanalyst Georges Painvin - Arthur Scherbius patents a ciphering machine and tries to sell it to the German military, but is rejected
1919 Hugo Alexander Koch invents a rotor cipher machine
1921 the Hebern Electric Code, a company producing electro-mechanical cipher machines, is founded
1923 Arthur Scherbius founds an enterprise to construct and finally sell his Enigma machine
late 1920's/30's more and more it is criminals who use cryptology for their purposes (e.g. for smuggling); Elizabeth Smith Friedman regularly deciphers the codes of rum smugglers during prohibition
1929 Lester S. Hill publishes Cryptography in an Algebraic Alphabet, which contains enciphered parts
1933-1945 the Germans make the Enigma machine their main cryptographic tool; it is broken by the Polish cryptanalyst Marian Rejewski and, from 1939 on, by Gordon Welchman, Alan Turing and their team at Bletchley Park in England
1937 the Japanese build their so-called Purple machine with the help of Herbert O. Yardley; the machine works with telephone stepping relays and is broken by a team of US cryptanalysts led by William F. Friedman
1930's the Sigaba machine is invented in the USA, either by W.F. Friedman or his colleague Frank Rowlett - at the same time the British develop the Typex machine, similar to the German Enigma machine
1943 Colossus, a code-breaking computer, is put into action at Bletchley Park
1943-1980 the Venona Project, run by the NSA, goes on longer than any other cryptographic program of its type
1949 Claude Shannon, one of the first modern cryptographers to bring mathematics into cryptography, publishes his paper Communication Theory of Secrecy Systems
1960's the Communications-Electronics Security Group (CESG) is founded as a section of the Government Communications Headquarters (GCHQ)
late 1960's the IBM Watson Research Lab develops the Lucifer cipher
1969 James Ellis develops a system of separate public keys and private keys
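The Vernam scheme of 1917 survives today as the one-time pad. As a minimal sketch of its XOR principle (the function name and the sample message are invented for illustration): combining each plaintext byte with a truly random, single-use key byte of equal length yields a ciphertext that reveals nothing about the plaintext, and applying the same key again recovers the message.

```python
import os

def vernam_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Vernam's principle: XOR each plaintext byte with a random key byte.
    # The key must be as long as the message and used only once.
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"ATTACK AT DAWN"
key = os.urandom(len(message))               # random key, as long as the message
ciphertext = vernam_encrypt(message, key)
recovered = vernam_encrypt(ciphertext, key)  # XOR is its own inverse
```

Because XOR is its own inverse, the same function serves for enciphering and deciphering, which is also what made the scheme attractive for a machine implementation.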
Another Question of Security

Even with the best techniques it is impossible to invent a cryptographic system that is absolutely safe or unbreakable. To decipher a text means to go through many, sometimes nearly - but never truly - endless attempts. For today's computers it might take hundreds of years or even more to go through all possible keys, but in the end the code remains breakable. Much faster quantum computers may prove that one day. Therefore the decision to select a certain method of enciphering is finally a matter of trust.

For the average computer user it is rather difficult to understand or even recognize the dangers and the technological background of the electronic transmission of data. For the majority, thinking about one's own need for encryption first of all means trusting others, the specialists, and relying on the information they provide. The websites explaining the problems behind it (and also the articles and books on the topic) are of course written by experts as well, very often in their typical scientific language, barely understandable for laymen. The introductions and other superficial parts of those articles can be understood, whereas the real background remains an untouchable sphere of knowledge.

The fact that the dangers are hard to see through, while the need for security measures is something most people know only from media reports, leads directly to the problem of an underdeveloped democracy in the field of cryptography. Obviously the connection between cryptography and democracy is rather invisible to many people. Those media reports often specialize in the work computer hackers do (sometimes presented as criminals, sometimes as heroes) and the danger of losing control over the money in one's bank account if someone steals a credit card number or other important financial data. The term "security" connected to those issues is a completely different one from the security that is connected to privacy. It is especially the latter that touches the main elements of democracy. for the question of security see:
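The claim above, that going through all possible keys might take hundreds of years yet the code stays breakable in principle, can be made concrete with a rough back-of-the-envelope calculation. The attacker speed below (one billion keys per second) is an assumed figure for illustration, not a measurement:

```python
def brute_force_years(key_bits: int, keys_per_second: float) -> float:
    """Average time, in years, to find a key by exhaustive search."""
    attempts = 2 ** (key_bits - 1)       # on average, half the key space is tried
    seconds = attempts / keys_per_second
    return seconds / (60 * 60 * 24 * 365)

# A hypothetical machine testing one billion keys per second:
for bits in (40, 56, 128):
    print(f"{bits}-bit key: {brute_force_years(bits, 1e9):.2e} years")
```

Every extra key bit doubles the work, so the search time grows exponentially with key length; the code is always breakable in theory, merely made impractically expensive.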
Legal Protection: European Union

Within the EU's goal of establishing a European single market, the protection of intellectual property also plays an important role. An overview of EU activities relating to intellectual property protection is available on the website of the European Commission (DG Internal Market):
History: "Indigenous Tradition"

In preliterate societies the association of rhythmic or repetitively patterned utterances with supernatural knowledge endures well into historic times. Knowledge is passed on from one generation to another. Similarly to the Southern tradition
Challenges for Copyright by ICT: Internet Service Providers

ISPs (Internet Service Providers) (and to a certain extent also telecom operators) are involved in the copyright debate primarily because of their role in the transmission and storage of digital information. Problems arise particularly concerning caching.

Caching

Caching, it is argued, could cause damage because the copies in the cache are not necessarily the most current ones, and the delivery of outdated information to users could deprive website operators of accurate "hit" information (information about the number of requests for a particular item on a website), from which advertising revenue is frequently calculated. Similarly, harms such as defamation or infringement that existed on the original page may propagate for years until flushed from each cache where they have been replicated. Although they are different concepts, issues similar to caching arise with mirroring (establishing an identical copy of a website on a different server), archiving (providing a historical repository for information, such as with newsgroups and mailing lists), and full-text indexing (the copying of a document for loading into a full-text or nearly full-text database which is searchable for keywords or concepts). Under a literal reading of some copyright laws caching constitutes an infringement of copyright, yet recent legislation such as the DMCA exempts it under certain conditions.

Information Residing on Systems or Networks at the Direction of Users

ISPs may be confronted with problems if infringing material on websites (of users) is hosted on their systems. Although some copyright laws like the DMCA provide for limitations on the liability of ISPs if certain conditions are met, it is as yet unclear whether ISPs should generally be accountable for the storage of infringing material (even if they do not have actual knowledge) or whether exceptions should be established under specific circumstances.

Transitory Communication

In the course of transmitting digital information from one point on a network to another, ISPs act as a data conduit. If a user requests information, ISPs engage in its transmission, the provision of a connection, or its routing. In the case of a person sending infringing material over a network with the ISP merely providing the facilities for the transmission, it is widely held that the ISP should not be liable for infringement. Yet some copyright laws like the DMCA provide for a limitation of liability (which also covers the intermediate and transient copies that are made automatically in the operation of a network) only if the ISP's activities meet certain conditions. For more information on copyright see Harrington, Mark E.: On-line Copyright Infringement Liability for Internet Service Providers: Context, Cases & Recently Enacted Legislation. In: Teran, G.:
Databody convergence

The phrase "the rise of the citizen as a consumer" points to this development: when the citizen becomes a consumer, the state must become a business. In the data body business, the key word behind this new identity of government is "outsourcing". Functions that are not considered core functions of government activity are put into the hands of private contractors.

There have long been instances where privately owned data companies, e.g. credit card companies, are allowed access to public records, e.g. public registries or electoral rolls. In a normal credit card transaction, for example, credit card companies have had access to public records in order to verify the identity of a customer. In the UK, citizens' personal data stored on the Electoral Roll have been used for commercial purposes for a long time. The new British Data Protection Act now allows people to "opt out" of this kind of commercialisation - a piece of legislation that has prompted protests on the part of the data industry. While this may serve as an example of an increased public awareness of privacy issues, the trend towards outsourcing seems to lead to a complete breakdown of the barriers between commercial and public use of personal data.

This trend can be summarised by the term "outsourcing" of government functions. Governments increasingly outsource work that is not considered a core function of government, e.g. cooking meals in hospitals or mowing lawns in public parks. Such peripheral activities marked a first step of outsourcing. In a further step, governmental functions were divided between executive and judgemental functions, and executive functions were increasingly entrusted to private agencies. For these agencies to be able to carry out the work assigned to them, they need data - data that once were stored in public places, and whose handling was therefore subject to democratic accountability. Outsourcing has produced gains in efficiency, but also a decrease in accountability.

Outsourced data are less secure, and what use they are put to is difficult to control. Technically the linking of different systems is already possible. It would also create more efficiency, which means generate more income. The question, then, is whether concerns about democracy will prevent this from happening. What the EDS example shows is something that applies everywhere: the data industry is, whether by intention or by default, a project with profound political implications. The current that drives the global economy deeper and deeper into becoming a global data body economy may be too strong to be stopped by conventional means.

However, the convergence of political and economic data bodies also has technological roots. The problem is that politically motivated surveillance and economically motivated data collection are located in the same area of information and communication technologies. For example, monitoring internet use requires more or less the same technical equipment whether it is done for political or economic purposes. Data mining and data warehousing techniques are almost the same. Creating transparency of citizens and customers is therefore a common objective of intelligence services and the data body industry. Given that data are exchanged in electronic networks, compatibility among the various systems is essential. This is another factor that encourages "leaks" between state-run intelligence networks and the private data body business. And finally, given the secretive nature of state intelligence and of commercial data capturing, there is little transparency: both structures occupy an opaque zone.
Global hubs of the data body industry

While most data bunkers are restricted to particular areas or contexts, there are others which act as global data nodes. The size of these data repositories is constantly growing, so it is only a matter of time until everybody living in the technologically saturated part of the world is registered in one of these data bunkers. Among these companies, EDS stands out. For many years EDS has been surrounded by rumours concerning sinister involvement with intelligence agencies. Beyond the rumours, though, there are also facts: EDS has a special division for
Timeline Cryptography - Introduction

Besides oral conversation and written language, many other ways of transporting information are known: the bush telegraph, drums, smoke signals etc. Those methods are not cryptography, but they still need encoding and decoding, which means that the history of language, the history of communication and the history of cryptography are closely connected to each other. The timeline gives an insight into the endless fight between enciphering and deciphering. The reasons for it can be found in public and private issues at the same time, though they are mostly connected to military manoeuvres and/or political tasks. One of the most important researchers on cryptography through the centuries is
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage and numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques impossible. Equally, predictions that the Internet is about to collapse lack any foundation whatsoever.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in the paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher, having founded the well-known CAIDA (Cooperative Association for Internet Data Analysis), so this statement is a slap in the face of all market researchers stating otherwise. In a certain sense this is ridiculous, because such data have been collected since the inception of the Internet. So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the centrally administered backbone infrastructure was privatised, and with it the main public source of traffic statistics disappeared. "There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs - Research.

What is measured and what methods are used?

Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count domain names. You get a clue to their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks, the

Hosts

Despite the small sample, this method has at least one flaw:

Internet Weather

Like daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence called Internet weather. One of the most famous Internet weather reports is

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of most visited Web sites may be compiled. I say "may be", because the methods used for data retrieval are mostly not fully disclosed. For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphical files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate. In the meantime page views, also called page impressions, are preferred, and are said to avoid these flaws. But even page views are not reliable, because users might share computers and corresponding IP addresses. Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the

More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle. But there is a different reason for these services: for content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses are shared. If you like to play around with Internet statistics instead, you can use Robert Orenstein's

Measuring the Density of IP Addresses

Dodge and Shiode used data on the ownership of IP addresses from
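The hits-versus-page-views distinction described above can be sketched with a toy example. The access log below is invented for illustration: one HTML document that embeds two image files yields three "hits" per visit but only one page view.

```python
# Hypothetical access log: (client address, requested file) pairs.
log = [
    ("203.0.113.7", "/article.html"),
    ("203.0.113.7", "/logo.gif"),
    ("203.0.113.7", "/photo.jpg"),
    ("198.51.100.2", "/article.html"),
    ("198.51.100.2", "/logo.gif"),
]

# Every requested file counts as a hit; only document requests count as page views.
hits = len(log)
page_views = sum(1 for _, path in log if path.endswith(".html"))
print(hits, page_views)
```

Adding more graphics to the document would inflate the hit count without a single additional reader, which is exactly why hits flatter a site's apparent popularity.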
Local Area Network (LAN)

A Local Area Network is an office network, i.e. a network restricted to the area of a building.
World Wide Web (WWW)

Probably the most significant Internet service, the World Wide Web is not the essence of the Internet but a subset of it. It is constituted by documents that are linked together in such a way that you can switch from one document to another by simply clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used for creating World Wide Web documents. These so-called hypertexts can combine text documents, graphics, videos, sounds, and

Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many

Among other things, that is the reason why the World Wide Web is not simply a very huge database, as is sometimes said: it lacks consistency. There is virtually infinite storage capacity on the Internet, that is true - a capacity which might become almost everlasting too, a prospect which is sometimes

According to the Internet domain survey of the
Leni Riefenstahl

Leni Riefenstahl (* 1902) began her career as a dancer and actress. In parallel she learned how to work with a camera, turning out to be one of the most talented directors and film editors of her time - and one of the few female ones.
MIT

The MIT (Massachusetts Institute of Technology) is a privately controlled coeducational institution of higher learning famous for its scientific and technological training and research. It was chartered by the state of Massachusetts in 1861 and became a land-grant college in 1863. During the 1930s and 1940s the institute evolved from a well-regarded technical school into an internationally known center for scientific and technical research. In the days of the Great Depression, its faculty established prominent research centers in a number of fields, most notably analog computing (led by Vannevar Bush)
Transmission Control Protocol/Internet Protocol (TCP/IP)

TCP and IP are the Internet's two most important protocols and communication standards. TCP provides a reliable message-transmission service; IP is the key protocol specifying how packets are routed around the Internet. More detailed information can be found
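As a minimal sketch of what this division of labour means in practice, the following snippet (using Python's standard socket library, with an invented echo server on the loopback interface) opens a TCP connection and echoes a few bytes back. The application only writes to and reads from a reliable byte stream; packet routing and retransmission happen underneath, in IP and TCP respectively.

```python
import socket
import threading

def echo_server(sock: socket.socket) -> None:
    # Accept one connection and send back whatever arrives.
    conn, _ = sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Listen on an OS-assigned port on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# TCP hides packets entirely: the client just sees an ordered byte stream.
client = socket.create_connection(server.getsockname())
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
server.close()
```

Note that neither side ever handles an IP packet directly; that opacity is precisely the service the protocol pair provides.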
Framing

Framing is the practice of creating a frame or window within a web page in which the content of a different web page can be displayed. Usually, when a link is clicked on, the new web page is presented surrounded by the remnants of the originating page.
Expert system

Expert systems are advanced computer programs that mimic the knowledge and reasoning capabilities of an expert in a particular discipline. Their creators strive to clone the expertise of one or several human specialists to develop a tool that can be used by the layman to solve difficult or ambiguous problems. Expert systems differ from conventional computer programs in that they combine facts with rules that state relations between the facts, to achieve a crude form of reasoning analogous to human reasoning.
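The combination of facts with rules can be illustrated with a tiny forward-chaining sketch. The medical facts and rules below are invented placeholders, not a real knowledge base: the engine simply keeps applying rules whose conditions are all satisfied until no new facts can be derived.

```python
# Known facts and if-then rules: (set of conditions, conclusion).
facts = {"has_fever", "has_cough"}
rules = [
    ({"has_fever", "has_cough"}, "suspect_flu"),
    ({"suspect_flu"}, "recommend_rest"),
]

def forward_chain(facts: set, rules: list) -> set:
    """Derive new facts by repeatedly firing rules until a fixed point."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

result = forward_chain(facts, rules)
```

Note how the second rule fires only because the first one derived "suspect_flu": chaining intermediate conclusions is what separates this crude reasoning from a mere lookup table.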
Bruce Schneier

Bruce Schneier is president of Counterpane Systems in Minneapolis, a consulting enterprise specializing in cryptography and computer security. He is the author of the book Applied Cryptography and the inventor of the Blowfish and Twofish encryption algorithms.
Amazon.com

Among privacy campaigners, the company's name has become almost synonymous with aggressive online direct marketing practices as well as user profiling and tracking. Amazon has been involved in
WIPO

The World Intellectual Property Organization is one of the specialized agencies of the United Nations (UN). It was designed to promote the worldwide protection of both industrial property (inventions, trademarks, and designs) and copyrighted materials (literary, musical, photographic, and other artistic works). It was established by a convention signed in Stockholm in 1967, which came into force in 1970. The aims of WIPO are threefold: first, through international cooperation, WIPO promotes the protection of intellectual property; secondly, the organization supervises administrative cooperation between the Paris, Berne, and other intellectual unions regarding agreements on trademarks, patents, and the protection of artistic and literary work; and thirdly, through its registration activities WIPO provides direct services to applicants for, or owners of, industrial property rights.