Virtual cartels, introduction
"Among the most striking developments of the 1990s has been the emergence of a global commercial media market utilizing new technologies and the global trend toward deregulation. This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market. A global oligopolistic market that covers the spectrum of media is now crystallizing with very high barriers to entry." (Robert McChesney, author of "Rich Media, Poor Democracy")
The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks tolerates only a small number of large players. Mergers, strategic alliances, partnerships and cooperations are therefore daily routine in the ICT business. They bypass competition and create "virtual cartels". |
|
In Search of Reliable Internet Measurement Data
Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable; they are often quite fantastic constructs that are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth and usage make reliable measurement nearly impossible. Equally, predictions that the Internet is about to collapse lack any foundation whatsoever.
Size and Growth
In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in the paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher; she founded the well-known Cooperative Association for Internet Data Analysis (CAIDA). So her statement is a slap in the face of all market researchers stating otherwise. In a certain sense this is ridiculous, because measurement has accompanied the Internet since its inception.
So what are the reasons for this inability to evaluate trends or to identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide breakthrough of the World Wide Web, the NSFNET backbone was decommissioned and the Internet infrastructure was privatized; with the central backbone went the centralized collection of traffic statistics. "There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs-Research.
What is measured and what methods are used?
Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count registered domains. You get a clue of their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too, that is, the corporate networks.
Hosts
The number of hosts, i.e., of computers connected to the Internet, is usually estimated by surveys of the domain name system. Despite the small sample, this method has at least one flaw: ...
Internet Weather
Like daily weather, traffic on the Internet and the conditions for data flows are monitored too, hence called Internet weather. One of the most famous Internet weather services is ...
Hits, Page Views, Visits, and Users
Let us take a look at how these hot lists of the most visited Web sites may be compiled. I say "may be", because the methods used for data retrieval are mostly not fully disclosed. For some years it was seemingly common sense to report the number of requested files from a Web site, so-called "hits". A method not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and the more traffic (though not automatically more visitors to your Web site) you generate. In the meantime page views, also called page impressions, are preferred and are said to avoid these flaws. But even page views are not reliable: users might share computers and the corresponding IP addresses, so a page view cannot be attributed to a single person. Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, Rick Marin reports in the New York Times. More advanced, but at best just slightly better, is counting visits: the access of several pages of a Web site during one session. The problems already mentioned apply here too.
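The difference between hits, page views, and visits can be sketched in a few lines of code. The log entries, file names, and the 30-minute session timeout below are invented for illustration; a 30-minute inactivity gap is a common, but arbitrary, convention for delimiting visits:

```python
# Toy access log: (timestamp, requested file) for a single client address.
# Both the requests and the session rule are illustrative assumptions.
from datetime import datetime, timedelta

log = [
    (datetime(1999, 12, 6, 9, 0),  "/article.html"),
    (datetime(1999, 12, 6, 9, 0),  "/img/logo.gif"),    # graphic on the page
    (datetime(1999, 12, 6, 9, 0),  "/img/photo.gif"),   # another graphic
    (datetime(1999, 12, 6, 14, 0), "/article.html"),    # same client, hours later
]

# "Hits": every requested file counts, so graphics inflate the number.
hits = len(log)

# "Page views": only documents count, not the files they consist of.
page_views = sum(1 for _, path in log if path.endswith(".html"))

# "Visits": requests from one client, grouped by a 30-minute inactivity gap.
gap = timedelta(minutes=30)
times = [t for t, _ in log]
visits = 1 + sum(1 for a, b in zip(times, times[1:]) if b - a > gap)

print(hits, page_views, visits)  # 4 2 2
```

Note that none of these counts identifies a unique person: several users behind one proxy appear as a single client address, and one person using two computers appears as two.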
To avoid them, newspapers, for example, establish registration services, which require password authentication and therefore prove to be a kind of access obstacle. But there is a different reason for these services: for content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses may be shared, and registration allows page views to be attributed to individuals. If you like to play around with Internet statistics instead, you can use Robert Orenstein's ...
Measuring the Density of IP Addresses
Dodge and Shiode used data on the ownership of IP addresses from ... |
|
Vinton Cerf
Addressed as one of the fathers of the Internet, Vinton Cerf, together with Robert Kahn, developed the TCP/IP protocol suite. In 1992, he co-founded the Internet Society and served as its first president. Today, Vinton Cerf is Senior Vice President for Internet Architecture and Technology at WorldCom.
Vinton Cerf's web site: http://www.wcom.com/about_the_company/cerfs_up/ |
|
Robot
Robot relates to any automatically operated machine that replaces human effort, though it may not resemble human beings in appearance or perform functions in a humanlike manner. The term is derived from the Czech word robota, meaning "forced labor." Modern use of the term stems from the play R.U.R., written in 1920 by the Czech author Karel Čapek, which depicts society as having become dependent on mechanical workers called robots that are capable of doing any kind of mental or physical work. Modern robot devices descend through two distinct lines of development: the early ... |
|
Cookie
A cookie is a small package of information assigned to a client program (mostly a Web browser) by a server. The cookie is saved on the client's hard disk and is sent back each time this server is accessed. A cookie can contain various information: preferences for site access, the identification of authorized users, or tracking data about visits. In online advertising, cookies serve the purpose of changing advertising banners between visits, or of identifying a particular user. Advertising banners can be permanently eliminated from the screen by filtering software as offered by ... Cookies are usually stored in a separate file of the browser and can be erased or permanently deactivated, although many Web sites require cookies to be active. |
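The round trip described above can be sketched with Python's standard http.cookies module; the cookie name "visitor_id" and its value are invented for illustration:

```python
# Sketch of the cookie round trip: the server assigns a cookie via a
# response header, the browser stores it and returns it on later requests.
from http.cookies import SimpleCookie

# Server side: attach the cookie to the HTTP response.
server_cookie = SimpleCookie()
server_cookie["visitor_id"] = "abc123"
header = server_cookie.output()     # "Set-Cookie: visitor_id=abc123"

# Client side: on the next request to the same server, the browser
# sends the stored value back, and the server parses it again.
returned = SimpleCookie()
returned.load("visitor_id=abc123")
print(returned["visitor_id"].value)  # abc123
```

This is what makes the tracking mentioned above possible: the value survives between visits, so the server can recognize the same browser (though not necessarily the same person) each time.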
|