Linking and Framing: Cases
Mormon Church v. Sandra and Jerald Tanner
In a December 1999 ruling, a federal judge in Utah temporarily barred two critics of the Mormon Church from posting on their website the Internet addresses of other sites featuring pirated copies of a Mormon text. The judge said that Sandra and Jerald Tanner had likely engaged in contributory copyright infringement when they posted the addresses of three Web sites that they knew, or should have known, contained the copies.
Kaplan, Carl S.: Copyright Decision Threatens Freedom to Link. In: New York Times. December 10, 1999.
Universal Studios v. Movie-List
The website Movie-List, which features links to externally hosted movie trailers, was asked to refrain entirely from linking to any of Universal Studios' servers containing the trailers, as such links would allegedly infringe copyright.
Cisneros, Oscar S.: Universal: Don't Link to Us. In: Wired. July 27, 1999.
More cases concerning linking, framing and the infringement of intellectual property are published in:
Ross, Alexandra: Copyright Law and the Internet: Selected Statutes and Cases.
|
TEXTBLOCK 1/5 // URL: http://world-information.org/wio/infostructure/100437611725/100438659639
|
|
The 18th Century: Powered Machines and the Industrial Revolution
The invention of the steam engine by James Watt in 1776 represented a major advance in the development of powered machines. It was first applied to an industrial operation - the spinning of cotton - in 1785. As a new kind of work-slave, it not only marked the beginning of the Industrial Revolution but also heralded the coming age of mass production.
In the England of the 18th century, five important inventions in the textile industry advanced the automation of work processes: 1) John Kay's flying shuttle in 1733, which permitted the weaving of larger widths of cloth and significantly increased weaving speed; 2) Edmund Cartwright's power loom in 1785, which increased weaving speed still further; 3) James Hargreaves' spinning jenny in 1764; 4) Richard Arkwright's water frame; and 5) Samuel Crompton's spinning mule in 1779; the last three of these improved the speed and quality of thread-spinning operations. These developments, combined with the invention of the steam engine, within a short time led to the creation of new machine-slaves and the mechanization of the production of most major goods, such as iron, paper, leather, glass and bricks.
Large-scale machine production was soon applied in many manufacturing sectors and resulted in a reduction of production costs. Yet the widespread use of the novel work-slaves also led to new demands concerning the workforce's qualifications. The utilization of machines enabled a more differentiated division of labor and resulted in a (further) specialization of skills. While many goods had previously been produced by skilled craftsmen, the use of modern machinery increased the demand for semiskilled and unskilled workers. The nature of the work process also changed from one mainly dependent on physical power to one primarily dominated by technology, with an increasing proportion of the labor force employed to operate machines.
|
TEXTBLOCK 2/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659368
|
|
1940s - Early 1950s: First Generation Computers
Probably the most important contributor to the theoretical basis of the digital computers developed in the 1940s was Alan Turing, an English mathematician and logician. In 1936 he conceived the Turing machine, originally intended as a mathematical tool that could infallibly recognize undecidable propositions. Although he proved instead that no such universal method of decision can exist, Turing's machine represented an idealized mathematical model that reduced the logical structure of any computing device to its essentials. His basic scheme of an input/output device, memory, and central processing unit became the basis for all subsequent digital computers.
The onset of the Second World War led to increased funding for computer projects, which hastened technical progress, as governments sought to develop computers to exploit their potential strategic importance.
By 1941 the German engineer Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles. Two years later the British completed a secret code-breaking computer called Colossus to decode German messages, and by 1944 the Harvard engineer Howard H. Aiken had produced a large-scale electromechanical calculator, the Harvard Mark I, which was used to create ballistic charts for the U.S. Navy.
Also spurred by the war, the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer, was produced by a partnership between the U.S. government and the University of Pennsylvania (begun in 1943). Consisting of 18,000 vacuum tubes, 70,000 resistors and 5 million soldered joints, the computer was a massive piece of machinery (floor space: 1,000 square feet) that consumed 160 kilowatts of electrical power, enough to dim the lights in an entire section of a large town.
Concepts in computer design that remained central to computer engineering for the next 40 years were developed by the Hungarian-American mathematician John von Neumann in the mid-1940s. In 1945 he described the design of the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both a stored program and data. The key element of the von Neumann architecture was the central processing unit (CPU), which allowed all computer functions to be coordinated through a single source. One of the first commercially available computers to take advantage of the development of the CPU was the UNIVAC I (1951). Both the U.S. Census Bureau and General Electric owned UNIVACs (Universal Automatic Computers).
Characteristic of first-generation computers was that instructions were made to order for the specific task for which the computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate. Computers were therefore difficult to program and limited in versatility and speed. Another feature of early computers was their use of vacuum tubes and magnetic drums for storage.
|
TEXTBLOCK 3/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659338
|
|
Introduction: The Substitution of Human Faculties with Technology: Computers and Robots
With the development of modern computing, starting in the 1940s, the substitution of human abilities with technology acquired a new dimension. The focus shifted from the replacement of pure physical power to the substitution of mental faculties. From the early 1980s on, personal computers attained widespread use in offices and quickly became indispensable tools for office workers. The development of powerful computers, combined with progress in artificial intelligence research, also led to the construction of sophisticated robots, which enabled a further rationalization of manufacturing processes.
|
TEXTBLOCK 4/5 // URL: http://world-information.org/wio/infostructure/100437611663/100438659302
|
|
In Search of Reliable Internet Measurement Data
Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless seized upon by many media and decision makers, even though the technical difficulties of measuring Internet growth or usage make reliable measurement all but impossible.
Equally, predictions that the Internet is about to collapse lack any foundation whatsoever. The researchers at the Internet Performance Measurement and Analysis Project (IPMA) compiled a list of news items about Internet performance and statistics and a few responses to them by engineers.
Size and Growth
In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA).
So her statement is a slap in the face of all market researchers stating otherwise. In a certain sense this is ridiculous, because network measurement has been an important task since the inception of the ARPANet, the predecessor of the Internet. The very first ARPANet site was established at the University of California, Los Angeles, and was intended to be the measurement site. There, Leonard Kleinrock worked on the development of measurement techniques used to monitor the performance of the ARPANet (cf. Michael and Ronda Hauben, Netizens: On the History and Impact of the Net). And in October 1991, on behalf of the Internet Activities Board, Vinton Cerf proposed guidelines for researchers considering measurement experiments on the Internet, stating that the measurement of the Internet deserved attention for two reasons. First, measurement would be critical for future development, evolution and deployment planning. Second, Internet-wide activities have the potential to interfere with normal operation and must be planned with care and made widely known beforehand.
So what are the reasons for this inability to evaluate trends and identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide introduction of the World Wide Web, the transition of the National Science Foundation's stewardship role over the Internet to a competitive industry (bluntly put: its privatization) left no framework for adequate tracking and monitoring of the Internet. The early ISPs were not very interested in gathering and analyzing network performance data; they were struggling to meet the demands of their rapidly growing customer base. Secondly, we are just beginning to develop reliable tools for quality measurement and analysis of bandwidth or performance. CAIDA aims at developing such tools.
K. G. Coffman and Andrew Odlyzko, members of different departments of AT&T Labs-Research, state something similar in their paper The Size and Growth Rate of the Internet, published in First Monday: "There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong". There are some sources containing seemingly contradictory information on the size and growth rate of the Internet, but "there is no comprehensive source for information". Coffman and Odlyzko take a well-informed and refreshing look at efforts undertaken to measure the Internet and dismantle several misunderstandings leading to incorrect measurements and estimations. Some measurements have such large error margins that you might better call them estimates, to say the least. This is partly due to the fact that data are not disclosed by every carrier and are only fragmentarily available.
What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count IP addresses. Coffman and Odlyzko focus on the sizes of networks and the traffic they carry to answer questions about the size and growth of the Internet. Their focus becomes clear when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas.
Most studies consider only the public Internet; Coffman and Odlyzko also consider the long-distance private line networks - the corporate networks, the Intranets - because they are convinced (that is, their assertion is put forward without accompanying empirical data) that "the evolution of the Internet in the next few years is likely to be determined by those private networks, especially by the rate at which they are replaced by VPNs (Virtual Private Networks) running over the public Internet. Thus it is important to understand how large they are and how they behave." Coffman and Odlyzko check other estimates by considering the traffic generated by residential users accessing the Internet with a modem, traffic through public peering points (statistics for them are available through CAIDA and the National Laboratory for Applied Network Research), and by calculating the bandwidth capacity of each of the major U.S. providers of backbone services. They compare the public Internet to private line networks and offer interesting findings. The public Internet (with an effective bandwidth of about 75 Gbps in December 1997) is currently far smaller, in both capacity and traffic, than the switched voice network, while the private line networks are considerably larger in aggregate capacity than the Internet: about as large as the voice network in the U.S. (with an effective bandwidth of about 330 Gbps in December 1997), although they carry less traffic. On the other hand, the growth rate of traffic on the public Internet, while lower than is often cited, is still about 100% per year, much higher than for traffic on other networks. Hence, if present growth trends continue, data traffic in the U.S. will overtake voice traffic around the year 2002 and will be dominated by the Internet. In the future, growth in Internet traffic will derive predominantly from people staying online longer and from multimedia applications, both of which consume more bandwidth and may produce unanticipated amounts of data traffic.
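Coffman and Odlyzko's crossover claim follows from simple compound-growth arithmetic. The sketch below uses illustrative starting values that are assumptions of this example (data traffic at one tenth of voice traffic at the end of 1997, voice traffic growing at roughly 10% per year) together with the roughly 100% annual growth cited above for Internet traffic, and shows how a crossover around the turn of the millennium falls out:
```python
import math

# Illustrative starting values for year-end 1997 (assumptions of this sketch,
# not figures taken from Coffman and Odlyzko): voice traffic normalized to 1.0,
# data traffic on the public Internet at one tenth of that.
voice_1997 = 1.0
data_1997 = 0.1

# Annual growth factors: about 100% per year for Internet traffic (as cited
# above), roughly 10% per year for voice (an assumption of this sketch).
data_growth = 2.0
voice_growth = 1.1

# Solve data_1997 * data_growth**t == voice_1997 * voice_growth**t for t.
t = math.log(voice_1997 / data_1997) / (math.log(data_growth) - math.log(voice_growth))
print(f"Crossover after about {t:.1f} years")
# -> about 3.9 years after the end of 1997, i.e. around 2001/2002
#    under these assumed starting values.
```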
Hosts
The Internet Software Consortium's Internet Domain Survey is one of the best-known efforts to count the number of hosts on the Internet. Happily, the ISC informs us extensively about the methods used for its measurements, a policy quite rare on the Web. For the most recent survey the number of IP addresses that have been assigned a name was counted. At first sight it looks simple to get an accurate number of hosts, but in practice an assigned IP address does not automatically correspond to an existing host. In order to find out, you have to send a kind of message to the host in question and wait for a reply. You do this with the PING utility. (For further explanations look here: Art. PING, in: Connected: An Internet Encyclopaedia) But to do this for every registered IP address is an arduous task, so the ISC just pings a 1% sample of all hosts found and projects the result onto all pingable hosts. That is the ISC's new method; its old method, still used by RIPE, was to count the number of domain names that had IP addresses assigned to them, a method that proved not very useful because a significant number of hosts restrict download access to their domain data. Apart from the small sample, the new method has at least one flaw: the ISC's researchers only take into account network numbers that have been entered into the tables of the IN-ADDR.ARPA domain, and it is possible that not all providers know of these tables. A similar method is used for Telcordia's Netsizer.
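The sample-and-project step can be pictured with a small sketch. The code below illustrates the general idea only and is not the ISC's actual survey software; the `is_reachable` helper simply shells out to the system ping command (Unix-style flags), and the 1% sample fraction mirrors the figure mentioned above:
```python
import random
import subprocess

def is_reachable(ip: str, timeout_s: int = 1) -> bool:
    """Send a single ICMP echo request using the system ping command
    (Unix-style flags; other platforms may need different options)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), ip],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def estimate_host_count(named_addresses: list[str], sample_fraction: float = 0.01) -> int:
    """Ping a random sample of the IP addresses that have a name assigned
    and project the share of responding hosts onto the whole list."""
    sample_size = max(1, int(len(named_addresses) * sample_fraction))
    sample = random.sample(named_addresses, sample_size)
    responding = sum(1 for ip in sample if is_reachable(ip))
    return round(len(named_addresses) * responding / sample_size)
```
The projection inherits the uncertainty of pinging itself: a host that does not answer is counted as nonexistent even if it merely ignores such messages.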
Internet Weather
Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence the term Internet weather. One of the most famous Internet weather reports comes from The Matrix, Inc. Another one is the Internet Traffic Report, which displays traffic as values between 0 and 100 (high values indicate fast and reliable connections). For weather monitoring, response ratings from servers all over the world are used. The method is to "ping" servers (as for host counts, e.g.) and to compare their response times with past ones and with the response times of servers in the same region.
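As an illustration of such weather monitoring, the sketch below rates current conditions against historical baselines on a 0-100 scale. The scale matches the one used by the Internet Traffic Report, but the scoring formula itself is an assumption made for this example, not the report's published method:
```python
def weather_rating(current_rtt_ms: dict[str, float],
                   baseline_rtt_ms: dict[str, float]) -> float:
    """Rate current conditions on a 0-100 scale (100 = fast and reliable)
    by comparing each server's current round-trip time to its historical
    baseline. The scoring formula is an illustrative assumption."""
    scores = []
    for server, baseline in baseline_rtt_ms.items():
        current = current_rtt_ms.get(server)
        if current is None:          # no reply at all: treat as worst case
            scores.append(0.0)
            continue
        # 100 if as fast as the baseline or faster, sliding towards 0
        # as the current round-trip time approaches three times the baseline.
        ratio = current / baseline
        scores.append(max(0.0, min(100.0, 100.0 * (3.0 - ratio) / 2.0)))
    return sum(scores) / len(scores)

# Server "b" currently responds twice as slowly as usual: the overall rating drops.
print(weather_rating({"a": 80.0, "b": 240.0}, {"a": 80.0, "b": 120.0}))  # -> 75.0
```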
Hits, Page Views, Visits, and Users
Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed. For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not necessarily more visitors to your Web site) you generate. In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable. Users might share computers, and thus IP addresses and host names, with others, or they might access not the site itself but a cached copy from the Web browser or from the ISP's proxy server. So the server might receive just one page request although several users viewed a document.
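The difference between hits, page views, and visitors can be made concrete with a small log-counting sketch. The request list and the file-extension heuristic (counting only .html requests as page views) are assumptions of this illustration, not the method of any particular measurement service:
```python
# Each tuple is (client IP address, requested path), as it might appear in a
# server access log. One visitor loads a document made of one HTML file and
# twenty graphics; a second visitor requests only the HTML file.
requests = [
    ("10.0.0.1", "/article.html"),
    *[("10.0.0.1", f"/img/figure{i}.gif") for i in range(20)],
    ("10.0.0.2", "/article.html"),
]

hits = len(requests)                                         # every requested file counts
page_views = sum(1 for _, path in requests if path.endswith(".html"))
addresses = len({ip for ip, _ in requests})                  # distinct IP addresses, not persons

print(hits, page_views, addresses)
# -> 22 2 2: twenty-two hits, but only two page views from two addresses.
```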
Especially the editors of some electronic journals (e-journals) rely on page views as a kind of rating or circulation measure, Rick Marin reports in the New York Times. Click-through rates - a quantitative measure - are used as a substitute for something of an intrinsically qualitative nature: the importance of a column to its readers, for example. Readers may open a journal just for a particular column and not care about the journal's other contents. Deleting that column because it does not receive enough visits may cause these readers to turn their backs on the journal. More advanced, but only slightly better at best, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore act as a kind of access obstacle. But there is another reason for these services. For content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses can be shared, and the Internet is a client-server system; in a certain sense it is in fact computers that communicate with each other. Therefore many content providers are eager to learn more about the users accessing their sites. Online registration forms or WWW user surveys are obvious methods of collecting additional data, but you cannot be sure that the information given by users is reliable; you can only rely on the fact that somebody visited your Web site. Despite these obstacles, companies increasingly capture user data. As with registration services, cookies come into play here.
If you would like to play around with Internet statistics instead, you can use Robert Orenstein's Web Statistics Generator to make irresponsible predictions, or visit the Internet Index, an occasional collection of seemingly statistical facts about the Internet.
Measuring the Density of IP Addresses
Measuring the density of IP addresses or domain names makes the geography of the Internet visible. So where on earth is the density of IP addresses or domain names highest? There is no global study of the Internet's geographical patterns available yet, but some regional studies can be found. The Urban Research Initiative and Martin Dodge and Narushige Shiode from the Centre for Advanced Spatial Analysis at University College London have mapped the Internet address space of New York, Los Angeles and the United Kingdom (http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/telecom.html and http://www.geog.ucl.ac.uk/casa/martin/internetspace/paper/gisruk98.html). Dodge and Shiode used data on the ownership of IP addresses from RIPE, Europe's most important registry for Internet numbers.
|
TEXTBLOCK 5/5 // URL: http://world-information.org/wio/infostructure/100437611791/100438658352
|
|
Electronic Messaging (E-Mail)
Electronic messages are transmitted and received by computers through a network. By E-Mail, texts, images, sounds and videos can be sent to single users or simultaneously to a group of users. Texts can now be sent and read without having to be printed.
E-Mail is one of the most popular and important services on the Internet.
|
INDEXCARD, 1/12
|
|
Binary number system
In mathematics, the term binary number system refers to a positional numeral system employing 2 as the base and requiring only two different symbols, 0 and 1. The importance of the binary system to information theory and computer technology derives mainly from the compact and reliable manner in which data can be represented in electromechanical devices with two states--such as "on-off," "open-closed," or "go-no go."
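A short worked example may help; the following lines show how an ordinary decimal number is written with the two symbols 0 and 1, using Python simply as a convenient calculator:
```python
# The decimal number 13 written in base 2: 13 = 1*8 + 1*4 + 0*2 + 1*1 = 1101.
n = 13
print(bin(n))           # 0b1101
print(int("1101", 2))   # 13

# The two symbols map directly onto a device with two states ("on-off",
# "open-closed"): here, the four bits of 13 from most to least significant.
bits = [(n >> i) & 1 for i in range(3, -1, -1)]
print(bits)             # [1, 1, 0, 1]
```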
|
INDEXCARD, 2/12
|
|
Intelsat
Intelsat, the world's biggest communication satellite services provider, is still mainly owned by governments, but will be privatised during 2001, like Eutelsat - a measure already discussed at an OECD competition policy roundtable in 1996. The signatory of the Intelsat treaty for the United States of America is Comsat, a private company listed on the New York Stock Exchange. Additionally, Comsat is one of the United Kingdom's signatories. In aggregate, Comsat already owns about 20.5% of Intelsat and is Intelsat's biggest shareholder. In September 1998 Comsat agreed to merge with Lockheed Martin. After the merger, Lockheed Martin will hold at least 49% of Comsat's share capital.
http://www.intelsat.int/index.htm
http://www.eutelsat.org/
http://www.oecd.org//daf/clp/roundtables/SATS...
http://www.comsat.com/
http://www.nyse.com/
|
INDEXCARD, 3/12
|
|
James Watt
b. January 19, 1736, Greenock, Renfrewshire, Scotland; d. August 25, 1819, Heathfield Hall, Warwick, England
Scottish instrument maker and inventor whose steam engine contributed substantially to the Industrial Revolution. He was elected fellow of the Royal Society of London in 1785.
|
INDEXCARD, 4/12
|
|
Backbone Networks
Backbone networks are central networks, usually of very high bandwidth, that is, of very high transmission capacity, connecting regional networks. The first backbone network was the NSFNet, run by the National Science Foundation of the United States.
|
INDEXCARD, 5/12
|
|
Mass production
The term mass production refers to the application of the principles of specialization, division of labor, and standardization of parts to the manufacture of goods. The use of modern methods of mass production has brought such improvements in the cost, quality, quantity, and variety of goods available that the largest global population in history is now sustained at the highest general standard of living. A moving conveyor belt installed in a Dearborn, Michigan, automobile plant in 1913 cut the time required to produce flywheel magnetos from 18 minutes to 5 and was the first instance of the use of modern integrated mass production techniques.
|
INDEXCARD, 6/12
|
|
Machine language
Initially computer programmers had to write instructions in machine language. This coded language, which can be understood and executed directly by the computer without conversion or translation, consists of binary digits representing operation codes and memory addresses. Because it is made up of strings of 1s and 0s, machine language is difficult for humans to use.
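To give an impression of what such strings of 1s and 0s look like, the sketch below encodes a single instruction in a hypothetical 16-bit format (a 4-bit operation code followed by a 12-bit memory address). This format is invented purely for illustration; as the text notes, every early computer had its own machine language:
```python
# A hypothetical 16-bit instruction format, invented for illustration only:
# a 4-bit operation code followed by a 12-bit memory address.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def encode(mnemonic: str, address: int) -> str:
    """Return the instruction as the string of binary digits the machine would execute."""
    word = (OPCODES[mnemonic] << 12) | (address & 0xFFF)
    return format(word, "016b")

print(encode("LOAD", 42))   # -> 0001000000101010
```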
|
INDEXCARD, 7/12
|
|
Internet Exchanges
Internet exchanges are interconnection points between major networks.
List of the World's Public Internet Exchanges (http://www.ep.net)
http://www.ep.net/
|
INDEXCARD, 8/12
|
|
George Boole
b. Nov. 2, 1815, Lincoln, Lincolnshire, England; d. Dec. 8, 1864, Ballintemple, County Cork, Ireland
English mathematician who helped establish modern symbolic logic and whose algebra of logic, now called Boolean algebra, is basic to the design of digital computer circuits. One of the first Englishmen to write on logic, Boole pointed out the analogy between the algebraic symbols and those that can represent logical forms and syllogisms, showing how the symbols of quantity can be separated from those of operation. With Boole in 1847 and 1854 began the algebra of logic, or what is now called Boolean algebra. It is basically two-valued in that it involves a subdivision of objects into separate classes, each with a given property. Different classes can then be treated as to the presence or absence of the same property.
|
INDEXCARD, 9/12
|
|
General Electric
GE is a major American corporation and one of the largest and most diversified corporations in the world. Its products include electrical and electronic equipment, plastics, aircraft engines, medical imaging equipment, and financial services. The company was incorporated in 1892, and in 1986 GE purchased the RCA Corporation including the RCA-owned television network, the National Broadcasting Company, Inc. In 1987, however, GE sold RCA's consumer electronics division to Thomson SA, a state-owned French firm, and purchased Thomson's medical technology division. In 1989 GE agreed to combine its European business interests in appliances, medical systems, electrical distribution, and power systems with the unrelated British corporation General Electric Company. Headquarters are in Fairfield, Conn., U.S.
|
INDEXCARD, 10/12
|
|
Robot
The term robot refers to any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner. The term is derived from the Czech word robota, meaning "forced labor." Modern use of the term stems from the play R.U.R., written in 1920 by the Czech author Karel Capek, which depicts society as having become dependent on mechanical workers called robots that are capable of doing any kind of mental or physical work. Modern robot devices descend from two distinct lines of development: the early automatons, essentially mechanical toys, and the successive innovations and refinements introduced in the development of industrial machinery.
|
INDEXCARD, 11/12
|
|
Charles Babbage
b. December 26, 1791, London, England; d. October 18, 1871, London, England
English mathematician and inventor who is credited with having conceived the first automatic digital computer. The idea of mechanically calculating mathematical tables first came to Babbage in 1812 or 1813. Later he made a small calculator that could perform certain mathematical computations to eight decimals. During the mid-1830s Babbage developed plans for the so-called analytical engine, the forerunner of the modern digital computer. In this device he envisioned the capability of performing any arithmetical operation on the basis of instructions from punched cards, a memory unit in which to store numbers, sequential control, and most of the other basic elements of the present-day computer.
|
INDEXCARD, 12/12
|
|