In Search of Reliable Internet Measurement Data

Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations, with growth expected to accelerate exponentially. However, Internet measurement data are anything but reliable and often quite fantastic constructs, which are nevertheless seized upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques nearly impossible. Equally, predictions that the Internet is about to collapse lack any foundation whatsoever.

Size and Growth

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in her paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA), so her statement is a slap in the face of all market researchers stating otherwise. In a certain sense this is ridiculous, because measurement has accompanied the Internet since its inception.

So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide breakthrough of the World Wide Web, the NSFNET backbone was privatized, and with it the collection of comprehensive, publicly available traffic statistics came to an end. "There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs.

What is measured and what methods are used?
Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count registered domains. You get a clue to their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks and Intranets.

Hosts

Despite the small sample, this method has at least one flaw: ...

Internet Weather

Like the daily weather, traffic on the Internet - the conditions for data flows - is monitored too, hence called Internet weather. One of the most famous Internet weather reports ...

Hits, Page Views, Visits, and Users

Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say may be, because the methods used for data retrieval are mostly not fully disclosed. For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (not automatically to your Web site) you generate. In the meantime page views, also called page impressions, are preferred, which are said to avoid these flaws. But even page views are not reliable: users might share computers and the corresponding IP addresses, so one page view need not equal one person. Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. More advanced, but at best just slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too.
To avoid these problems, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle. But there is a different reason for these services: for content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses are often shared, whereas registration ties requests to an individual account.

If you like to play around with Internet statistics instead, you can use Robert Orenstein's ...

Measuring the Density of IP Addresses

Dodge and Shiode used data on the ownership of IP addresses from ...
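The difference between hits, page views, and visits can be made concrete with a small log-analysis sketch. The log records, the 30-minute session timeout, and the rule that only HTML documents count as page views are illustrative assumptions of mine, not part of any study cited above; real hot lists are compiled from far messier data.

```python
from datetime import datetime, timedelta

# Each record: (client_ip, timestamp, requested_path) - a simplified
# stand-in for a web server access log (invented sample data).
LOG = [
    ("10.0.0.1", datetime(1999, 12, 6, 9, 0), "/index.html"),
    ("10.0.0.1", datetime(1999, 12, 6, 9, 0), "/logo.gif"),
    ("10.0.0.1", datetime(1999, 12, 6, 9, 0), "/photo.gif"),
    ("10.0.0.1", datetime(1999, 12, 6, 9, 5), "/story.html"),
    ("10.0.0.1", datetime(1999, 12, 6, 14, 0), "/index.html"),  # hours later
    ("10.0.0.2", datetime(1999, 12, 6, 9, 1), "/index.html"),
]

SESSION_TIMEOUT = timedelta(minutes=30)  # assumed session boundary

def hits(log):
    # Every requested file counts: text, graphics, everything.
    return len(log)

def page_views(log):
    # Only HTML documents count; embedded graphics are ignored.
    return sum(1 for _, _, path in log if path.endswith(".html"))

def visits(log):
    # Group requests by client address; a gap longer than the timeout
    # starts a new visit. Shared IPs still conflate different users.
    last_seen, count = {}, 0
    for ip, ts, _ in sorted(log, key=lambda r: (r[0], r[1])):
        if ip not in last_seen or ts - last_seen[ip] > SESSION_TIMEOUT:
            count += 1
        last_seen[ip] = ts
    return count

print(hits(LOG), page_views(LOG), visits(LOG))  # -> 6 4 3
```

The same morning's browsing yields six hits but only four page views and three visits, which is why the choice of metric alone can inflate or deflate a site's apparent popularity.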
Credibility

The magic word is credibility. Telling lies is not the only method of disinformation; disinformation can also mean leaving out important information. Not telling something likewise creates thoughts and steers them in certain directions, whereas other ways of thinking are shut out. In this way, the deaths on one's own side are adjusted downwards, whereas the victims of the enemy are counted proudly - as long as they are not civilians. The post-Gulf War period demonstrated how a population reacts when the number of innocent victims is much higher than expected: it was those numbers that provoked the greatest part of the post-war critique. The media in democratic states tend to criticize this, which does not mean that they always want to be free of governmental influence. They can choose to help the government in a single case by not writing anything against it or by writing pro-government stories. At the same time, every democracy has undemocratic parts in it - which is itself part of democracy. There are situations in which a democratic government may find it essential to put pressure on the media to inform the population in a certain way; and censorship, too, is not something that can only be connected to dictatorship - just think of the Falklands War, the Gulf War, or the Kosovo War.
Next Generation Internet Program

A research and development program funded by the US government. Its goal is the development of advanced networking technologies and of applications requiring advanced networking, with capabilities that are 100 to 1,000 times faster end-to-end than today's Internet.

http://www.ngi.gov
ARPAnet

ARPAnet was the small network of individual computers connected by leased lines that marked the beginning of today's global data networks. It was an experimental network mainly serving to test the feasibility of wide-area packet-switched networking. In 1969 ARPAnet went online, linking the first two computers, one located at the University of California, Los Angeles, the other at the Stanford Research Institute. But ARPAnet did not become widely accepted before it was demonstrated in action to a public of computer experts at the First International Conference on Computers and Communication in Washington, D.C., in 1972. It was decommissioned in 1990. In the USA, commercial users already outnumbered military and academic users by 1994. Despite the rapid growth of the Net, most computers linked to it are still located in the United States.