1800 - 1900 A.D.

1801
Invented by Joseph Marie Jacquard, an engineer and architect in Lyon, France, punch cards laid the ground for automatic information processing. For the first time, information was stored in binary format on perforated cardboard cards. In 1890 Hermann Hollerith used Jacquard's punch card technology to process the statistical data collected during the US census, thereby reducing the time needed to analyse the census data from eight years to three. Hollerith's application of Jacquard's invention was used for programming computers and data processing until electronic data processing was introduced in the 1960s.

Paper tapes are a medium similar to Jacquard's punch cards. In 1857 Sir Charles Wheatstone used them for the first time for the preparation, storage, and transmission of data. With paper tape, telegraph messages could be stored, prepared off-line, and sent ten times faster (up to 400 words per minute). Later, similar paper tapes were used for programming computers.

1809
With Samuel Thomas Soemmering's invention of the electrical telegraph, the telegraphic transmission of messages was no longer tied to visibility, as had been the case with networks of smoke and light signals. Economical and reliable, the electric telegraph became the state-of-the-art communication system for fast data transmission, even over long distances.

1861
The telephone was not invented by Alexander Graham Bell, as is widely held, but by Philipp Reis, a German teacher. When he demonstrated his invention to important German professors in 1861, it was not enthusiastically received, and because of this dismissal he was not given any financial support for further development. This is where Bell comes in: in 1876 he successfully filed a patent for the telephone, and soon afterwards he established the first telephone company.

1866
The first functional underwater telegraph cable is laid across the Atlantic.

1895
Invention of the wireless telegraph.
In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations, and growth is expected to keep accelerating exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth and usage make truly reliable measurement techniques all but impossible. Equally, predictions that the Internet is about to collapse lack any foundation whatsoever.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA), so her statement is a slap in the face of all market researchers stating otherwise. In a certain sense this is ridiculous, because the Internet and its predecessors have been measured and monitored since their inception.

So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, when the publicly funded NSFNET backbone was retired in early 1995 and traffic shifted to commercial providers, the most comprehensive public source of Internet traffic statistics disappeared with it. "There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", write K. G. Coffman and Andrew Odlyzko, at the time members of different departments of AT&T Labs.

What is measured and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or at the number of registered domains. You get a clue to their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko also consider the long-distance private line networks, that is, the corporate networks and Intranets.

Hosts

The number of hosts, of computers connected to the Internet, is a frequently used indicator; it is typically estimated by counting host names registered in the domain name system and probing a sample of them. Despite the small sample, this method has at least one flaw: hosts concealed behind firewalls or without registered names remain invisible, and a host count says nothing about how many people actually use those machines.

Internet Weather

Like daily weather, traffic on the Internet, that is, the conditions for data flows, is monitored too, hence called Internet weather. One of the most famous of these reports is the Internet Weather Report, which regularly measures round-trip times to thousands of hosts all over the world.

Hits, Page Views, Visits, and Users

Let us take a look at how the hot lists of the most visited Web sites may be compiled. I say "may be" because the methods used for data retrieval are mostly not fully disclosed. For some years it was seemingly common sense to report the number of files requested from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic you generate (though not automatically more visitors to your Web site). In the meantime page views, also called page impressions, are preferred, as they are said to avoid these flaws. But even page views are not reliable.
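To make the difference concrete, here is a minimal sketch in Python of how hits and page views might be counted from a simplified Web server log; the log format, the choice of page suffixes, and the sample entries are assumptions for illustration, not the method of any particular measurement service.

# A minimal sketch (not the method of any real analytics service): counting
# "hits" versus "page views" from a simplified access log. The log format,
# the page suffixes, and the sample entries are assumptions.

PAGE_SUFFIXES = (".html", ".htm", "/")   # assumed: only these count as pages

def count_hits_and_page_views(log_lines):
    hits = 0
    page_views = 0
    for line in log_lines:
        parts = line.split()             # assumed format: ip timestamp path
        if len(parts) < 3:
            continue
        path = parts[2].split("?")[0]    # drop any query string
        hits += 1                        # every requested file is a hit
        if path.endswith(PAGE_SUFFIXES):
            page_views += 1              # only whole documents are page views
    return hits, page_views

sample = [
    "10.0.0.1 2000-01-01T12:00:00 /index.html",
    "10.0.0.1 2000-01-01T12:00:01 /logo1.gif",
    "10.0.0.1 2000-01-01T12:00:02 /logo2.gif",
]
print(count_hits_and_page_views(sample))  # (3, 1): three hits, one page view

A document made of one text file and twenty graphics files thus yields twenty-one hits but only a single page view, which is exactly the distortion described above.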
Users might share computers and the corresponding IP addresses, so page views cannot be attributed to individual persons. Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. More advanced, but at best only slightly better, is counting visits: the access of several pages of a Web site during one session (a simple way of grouping page views into visits is sketched below). The problems already mentioned apply here too. To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle. But there is a different reason for these services: for content providers, users are otherwise only virtual users, not unique persons, because, as already mentioned, computers and IP addresses are often shared, and registration is a way of turning them into identifiable individuals. If you prefer to play around with Internet statistics yourself, you can use interactive tools such as Robert Orenstein's.

Measuring the Density of IP Addresses

Dodge and Shiode used data on the ownership of IP addresses, taken from registry records, to map the geographical density of Internet addresses.
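The sketch referred to above groups page views into visits using a commonly assumed session rule: page views from the same client address belong to one visit unless more than thirty minutes pass between them. The thirty-minute timeout, the input format, and the sample data are assumptions for illustration.

# A sketch of counting visits, assuming the heuristic that page views from
# the same client address belong to one session unless more than 30 minutes
# pass between them. Timeout and input format are assumptions.
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def count_visits(page_views):
    """page_views: (client_ip, datetime) pairs, assumed sorted by time."""
    last_seen = {}       # client address -> time of its most recent page view
    visits = 0
    for ip, ts in page_views:
        previous = last_seen.get(ip)
        if previous is None or ts - previous > SESSION_TIMEOUT:
            visits += 1  # a new session begins for this address
        last_seen[ip] = ts
    return visits

views = [
    ("192.0.2.7", datetime(2000, 1, 1, 9, 0)),   # possibly several people
    ("192.0.2.7", datetime(2000, 1, 1, 9, 10)),  # behind one shared address
    ("192.0.2.7", datetime(2000, 1, 1, 11, 0)),  # later: counted as a new visit
]
print(count_visits(views))  # 2 visits; how many persons that is stays unknown

The result also shows the limitation discussed above: two visits are counted, but whether one person or several people sit behind the shared address remains unknown.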
Integrated circuit
Also called microcircuit, the integrated circuit is an assembly of electronic components, fabricated as a single unit, in which active semiconductor devices (transistors and diodes) and passive devices (resistors and capacitors), together with their interconnections, are built up on a thin substrate of semiconductor material.
World Wide Web (WWW)
Probably the most significant Internet service, the World Wide Web is not the essence of the Internet but a subset of it. It is constituted by documents that are linked together in such a way that you can switch from one document to another simply by clicking on the link connecting them. This is made possible by the Hypertext Markup Language (HTML), the authoring language used for creating World Wide Web documents. These so-called hypertexts can combine text documents, graphics, videos, sounds, and other media. Especially on the World Wide Web, documents are often retrieved by entering keywords into so-called search engines, sets of programs that fetch documents from as many servers as possible and index them. Among other things, that is why the World Wide Web is not simply a very huge database, as is sometimes said: it lacks consistency. There is virtually almost infinite storage capacity on the Internet, that is true, a capacity which might turn the Web into an almost everlasting archive as well, a prospect that is sometimes welcomed and sometimes viewed with concern. According to the Internet domain survey of the Internet Software Consortium, the number of hosts connected to the Internet continues to grow rapidly.
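As a rough illustration of the fetch-and-index principle of the search engines mentioned above, the following Python sketch downloads a page and builds an inverted index from words to the pages containing them; the starting URL is a placeholder, and real search engines crawl outgoing links, rank results, and parse HTML far more carefully.

# Toy illustration of the fetch-and-index principle behind Web search engines:
# download pages and build an inverted index mapping each word to the pages
# that contain it. The starting URL is a placeholder for illustration only.
import re
import urllib.request
from collections import defaultdict

def fetch(url):
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8", errors="replace")

def build_index(urls):
    index = defaultdict(set)                        # word -> URLs containing it
    for url in urls:
        text = re.sub(r"<[^>]+>", " ", fetch(url))  # crude removal of HTML tags
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(url)
    return index

seed_urls = ["https://example.com/"]                # placeholder starting point
index = build_index(seed_urls)
print(sorted(index.get("example", set())))          # pages containing "example"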
First Monday
An English-language, peer-reviewed media studies journal based in Denmark. http://firstmonday.dk