"Stealth Sites" "Stealth sites" account for a particular form of hidden advertisement. Stealth sites look like magazines, nicely designed and featuring articles on different topics, but in reality are set up for the sole purpose of featuring a certain companies products and services. |
|
4000 - 1000 B.C.
4th millennium B.C. In Sumer, writing and calculating came into being at about the same time. The first pictographs carved into clay tablets were used for administrative purposes. As an instrument of the administrative bodies of early empires, which began to rely on the collection, storage, processing and transmission of data, the skill of writing was restricted to very few. Once more or less separate tasks, writing and calculating converge in today's computers. "Letters are invented so that we might be able to converse even with the absent," says Saint Augustine. The invention of writing made it possible to transmit and store information. The ear no longer predominates; face-to-face communication becomes more and more obsolete for administration and bureaucracy. Standardization and centralization become the constituents of high culture and of vast empires such as Sumer and China.
3200 B.C. In Sumer the seal is invented.
About 3000 B.C. In Egypt papyrus scrolls come into use.
About 1350 B.C. In Assyria the cuneiform script is invented.
1200 B.C. According to Aeschylus, news of the conquest of the town of Troy was transmitted via torch signals.
About 1100 B.C. Egyptians use homing pigeons to deliver military information.
|
An Economic and therefore Governmental Issue While the digital divide might suggest that enterprises will be able to sell more and more computers over the next years, the other side of the truth is that a certain percentage of the population has little hope of escaping its marginalization, its position among the "have-nots". Studies show that skin color plays a role in this, but more than "racial" issues it is income, age and education that decide who are the haves and who are the have-nots. There are about 103 million households in the USA; about 6 million of them do not even have telephone access. Why should they care about computers? The digital divide cuts the world into centers and peripheries, not into nations: it runs along the border between the North and the South as well as through nations. A wide variety of institutions with various interests in the background work in this field, not rarely paid by governments, which have an interest in inhabitants being connected to the net and to the economy. Searching for information about the digital divide, one will find studies saying that it is growing all the time, whereas other studies suggest the contrary.
|
Individualized Audience Targeting New opportunities for online advertisers arise with the possibility of one-to-one Web applications. Software agents, for example, promise to "register, recognize and manage end-user profiles; create personalized communities on-line; deliver personalized content to end-users and serve highly targeted advertisements" - probably the ultimate tool for advertisers. Although agents of this kind are not yet widely used, some companies already offer them.
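To make the idea concrete, here is a minimal sketch of how such an agent might match advertisements to an end-user profile. The profile fields, the ad inventory and the overlap-scoring rule are all invented for illustration; they are not taken from any actual product.

    # Minimal sketch of profile-based ad targeting (all names and data
    # are invented for illustration; real agents are far more elaborate).
    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        user_id: str
        interests: set = field(default_factory=set)  # e.g. {"travel", "golf"}

    @dataclass
    class Ad:
        ad_id: str
        keywords: set

    def select_ad(profile, inventory):
        """Return the ad whose keywords overlap most with the user's interests."""
        def score(ad):
            return len(ad.keywords & profile.interests)
        best = max(inventory, key=score)
        return best if score(best) > 0 else None  # no targeted ad if nothing matches

    profile = UserProfile("u42", {"travel", "photography"})
    inventory = [Ad("a1", {"golf", "cars"}), Ad("a2", {"travel", "hotels"})]
    print(select_ad(profile, inventory).ad_id)  # -> a2

The same scoring idea extends naturally to personalized content: whatever item overlaps most with the stored profile is what the agent delivers.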
|
Definition The term "digital divide", coined during the last 20 years, describes the fact that the world can be divided into people who do and people who do not have access to (or the education to handle) modern information technologies such as the cellular telephone, television and the Internet. The digital divide concerns people all over the world, but as usual it is above all people in the formerly so-called third-world countries and in rural areas, the poor and the less educated, who suffer from it. More than 80% of all computers with access to the Internet are situated in larger cities. "The cost of the information today consists not so much of the creation of content, which should be the real value, but of the storage and efficient delivery of information, that is in essence the cost of paper, printing, transporting, warehousing and other physical distribution means, plus the cost of the personnel manpower needed to run these `extra' services .... Realizing an autonomous distributed networked society, which is the real essence of the Internet, will be the most critical issue for the success of the information and communication revolution of the coming century and millennium." (Izumi Aizu)
|
Virtual cartels; introduction "Among the most striking developments of the 1990s has been the emergence of a global commercial media market utilizing new technologies and the global trend toward deregulation. This global commercial media market is a result of aggressive maneuvering by the dominant firms, new technologies that make global systems cost-efficient, and neoliberal economic policies encouraged by the World Bank, IMF, WTO, and the US government to break down regulatory barriers to a global commercial media and telecommunication market. A global oligopolistic market that covers the spectrum of media is now crystallizing with very high barriers to entry." (Robert McChesney, author of "Rich Media, Poor Democracy") The network structure of information and communication technologies means that even deregulated markets are not "free". The functional logic of global networks tolerates only a small number of large players. Mergers, strategic alliances, partnerships and cooperations are therefore the daily routine of the ICT business. They bypass competition and create "virtual cartels".
|
Economic structure; introduction "Globalization is to no small extent based upon the rise of rapid global communication networks. Some even go so far as to argue that 'information has replaced manufacturing as the foundation of the economy'. Indeed, global media and communication are in some respects the advancing armies of global capitalism." (Robert McChesney, author of "Rich Media, Poor Democracy") "Information flow is your lifeblood." (Bill Gates, founder of Microsoft) The usefulness of information and communication technologies increases with the number of people who use them. The more people form part of communication networks, the greater the amount of information that is produced. Microsoft founder Bill Gates dreams of "friction-free capitalism", a new stage of capitalism in which perfect information becomes the basis for the perfection of the markets. But exploitative practices have not disappeared. Instead, they have colonised the digital arena, where effective protective regulation is still largely absent. Following the dynamics of informatised economies, the consumption habits and lifestyles of customers are of great interest. New technologies make it possible to store and combine data collected from an enormous number of people. User profiling helps companies understand what potential customers might want. Often enough, such data collecting takes place without the customers' knowledge and amounts to spying. "Much of the information collection that occurs on the Internet is invisible to the consumer, which raises serious questions of fairness and informed consent." (David Sobel, Electronic Privacy Information Center)
|
In Search of Reliable Internet Measurement Data Newspapers and magazines frequently report growth rates of Internet usage, numbers of users, hosts, and domains that seem to be beyond all expectations. Growth rates are expected to accelerate exponentially. However, Internet measurement data are anything but reliable, and often quite fantastic constructs that are nevertheless jumped upon by many media and decision makers, because the technical difficulties of measuring Internet growth or usage make reliable measurement techniques impossible. Equally, predictions that the Internet is about to collapse lack any foundation whatsoever.

In fact, "today's Internet industry lacks any ability to evaluate trends, identify performance problems beyond the boundary of a single ISP (Internet service provider, M. S.), or prepare systematically for the growing expectations of its users. Historic or current data about traffic on the Internet infrastructure, maps depicting ... there is plenty of measurement occurring, albeit of questionable quality", says K. C. Claffy in the paper Internet measurement and data analysis: topology, workload, performance and routing statistics (http://www.caida.org/Papers/Nae/, Dec 6, 1999). Claffy is not an average researcher, but the founder of the well-known Cooperative Association for Internet Data Analysis (CAIDA), so this statement is a slap in the face of all market researchers stating otherwise.

So what are the reasons for this inability to evaluate trends or identify performance problems beyond the boundary of a single ISP? First, in early 1995, almost simultaneously with the worldwide breakthrough of the World Wide Web, the NSFNET backbone was retired and the Internet's infrastructure was privatized; with it went the last comprehensive, centrally collected traffic statistics. "There are many estimates of the size and growth rate of the Internet that are either implausible, or inconsistent, or even clearly wrong", state K. G. Coffman and Andrew Odlyzko, both members of different departments of AT&T Labs - Research.

What is measured, and what methods are used? Many studies are devoted to the number of users; others look at the number of computers connected to the Internet or count hosts and domains. You get a clue to their focus when you bear in mind that the Internet is just one of many networks of networks; it is only a part of the universe of computer networks. Additionally, the Internet has public (unrestricted) and private (restricted) areas. Most studies consider only the public Internet; Coffman and Odlyzko consider the long-distance private line networks too: the corporate networks and intranets.

Hosts Surveys such as the Internet Software Consortium's Internet Domain Survey count the hosts connected to the Internet by querying the Domain Name System and pinging a sample of the addresses found. Despite the small sample, this method has at least one flaw.

Internet Weather Like daily weather, traffic on the Internet and the conditions for data flows are monitored too, hence called Internet weather. One of the most famous services of this kind is the Internet Weather Report of Matrix Information and Directory Services (MIDS), which charts how long pings to hosts around the world take.

Hits, Page Views, Visits, and Users Let us take a look at how these hot lists of most visited Web sites may be compiled. I say "may be", because the methods used for data retrieval are mostly not fully disclosed. For some years it was seemingly common sense to report requested files from a Web site, so-called "hits". This method is not very useful, because a document can consist of several files: graphics, text, etc. Just compile a document from some text and some twenty flashy graphics files, put it on the Web, and you get twenty-one hits per visit; the more graphics you add, the more hits and traffic (though not automatically more visitors to your Web site) you generate. In the meantime, page views, also called page impressions, are preferred and are said to avoid these flaws. But even page views are not reliable.
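The difference between hits and page views is easy to see in code. Below is a minimal sketch that counts both from a Web server access log; the log lines and the rule that only HTML documents count as page views are simplifying assumptions for illustration.

    # Counting "hits" vs. "page views" from a simplified access log.
    # Assumption (for illustration): every requested file is a hit, but
    # only HTML documents count as page views.
    log_lines = [
        '10.0.0.1 - - [06/Dec/1999:10:00:00] "GET /article.html HTTP/1.0" 200 4096',
        '10.0.0.1 - - [06/Dec/1999:10:00:01] "GET /logo.gif HTTP/1.0" 200 512',
        '10.0.0.1 - - [06/Dec/1999:10:00:01] "GET /photo1.jpg HTTP/1.0" 200 2048',
    ]

    hits = 0
    page_views = 0
    for line in log_lines:
        path = line.split('"')[1].split()[1]    # extract the requested path
        hits += 1                               # every file request is a hit
        if path.endswith((".html", ".htm", "/")):
            page_views += 1                     # only documents count as page views

    print(hits, page_views)  # -> 3 1: one page view inflated to three hits

One document with two embedded images thus yields three hits; adding graphics inflates the hit count without a single extra reader.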
Users might share computers and the corresponding IP addresses. Especially the editors of some electronic journals (e-journals) rely on page views as a kind of ratings or circulation measure, as Rick Marin reports in the New York Times. More advanced, but just slightly better at best, is counting visits, the access of several pages of a Web site during one session. The problems already mentioned apply here too. To avoid them, newspapers, e.g., establish registration services, which require password authentication and therefore prove to be a kind of access obstacle. But there is a different reason for these services as well: for content providers, users are virtual users, not unique persons, because, as already mentioned, computers and IP addresses may be shared. If you like to play around with Internet statistics instead, you can use the tool provided by Robert Orenstein.

Measuring the Density of IP Addresses Dodge and Shiode used data on the ownership of IP addresses from the RIPE registry to map how densely Internet addresses are spread across geographic space.
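How fragile such visit and user counts are can be shown in a few lines of code. The sketch below groups log records into visits by IP address under the assumption, criticized above, that one IP address equals one user; the records and the 30-minute session timeout are invented for illustration.

    # Grouping requests into "visits": requests from the same IP address
    # with gaps of no more than 30 minutes count as one session.
    # Caveat from the text: shared or reassigned IP addresses make
    # "one IP = one user" a false assumption.
    SESSION_GAP = 30 * 60  # seconds

    # (ip, unix_timestamp) pairs, invented for illustration
    requests = [
        ("10.0.0.1", 0), ("10.0.0.1", 600), ("10.0.0.1", 4000),  # 2 visits
        ("10.0.0.2", 100),                                        # 1 visit
    ]

    last_seen = {}   # ip -> timestamp of that ip's previous request
    visits = 0
    for ip, ts in sorted(requests, key=lambda r: (r[0], r[1])):
        if ip not in last_seen or ts - last_seen[ip] > SESSION_GAP:
            visits += 1  # a new session begins
        last_seen[ip] = ts

    print("visits:", visits)                   # -> 3
    print("unique 'users':", len(last_seen))   # -> 2, at best a rough guess

If two readers share the first computer, the "unique users" figure is simply wrong, which is exactly why registration services exist.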
|
Newsgroups Newsgroups are on-line discussion groups on the Usenet. Over 20,000 newsgroups exist, organized by subject into hierarchies; each subject hierarchy is further broken down into subcategories. Covering an incredibly wide area of interests and used intensively every day, they are an important part of the Internet.
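Newsgroup names encode their place in the subject hierarchy as dot-separated components. A minimal sketch, using a handful of example group names, of how names map to hierarchies:

    # Newsgroup names are dot-separated paths through the subject
    # hierarchy: the first component is the top-level hierarchy,
    # the rest are ever more specific subcategories.
    from collections import defaultdict

    groups = [  # example newsgroup names
        "comp.lang.c", "comp.lang.lisp", "sci.physics",
        "rec.arts.movies", "rec.music.classical",
    ]

    by_hierarchy = defaultdict(list)
    for name in groups:
        top_level = name.split(".")[0]   # e.g. "comp" for computing topics
        by_hierarchy[top_level].append(name)

    for hierarchy, members in sorted(by_hierarchy.items()):
        print(hierarchy, "->", members)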
|
Proxy Servers A proxy server is a server that acts as an intermediary between a workstation user and the Internet, so that security, administrative control, and caching services can be ensured. A proxy server receives a request for an Internet service (such as a Web page request) from a user. If the request passes filtering requirements, the proxy server, assuming it is also a cache server, looks in its local cache of previously downloaded Web pages. If it finds the page, it returns it to the user without needing to forward the request to the Internet. If the page is not in the cache, the proxy server, acting as a client on behalf of the user, uses one of its own IP addresses to request the page from the server out on the Internet; when the page is returned, it forwards it on to the user. Source: Whatis.com
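The request flow described above can be condensed into a few lines. This is a minimal sketch under stated assumptions, not a real proxy: the cache dictionary, the BLOCKED_HOSTS filter rule and fetch_from_internet are illustrative stand-ins.

    # Minimal sketch of a caching proxy's decision logic.
    import urllib.request
    from urllib.parse import urlparse

    BLOCKED_HOSTS = {"ads.example.com"}   # assumed filtering requirement
    cache = {}                            # url -> previously downloaded body

    def fetch_from_internet(url):
        # Acting as a client on behalf of the user, the proxy uses its
        # own address to request the page from the origin server.
        with urllib.request.urlopen(url) as response:
            return response.read()

    def proxy_request(url):
        if urlparse(url).hostname in BLOCKED_HOSTS:
            return None                   # request fails the filter
        if url in cache:
            return cache[url]             # cache hit: no Internet round trip
        body = fetch_from_internet(url)   # cache miss: fetch on user's behalf
        cache[url] = body                 # store a copy for the next requester
        return body

A second request for the same URL is then answered from the local cache without another network round trip.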
|
FEED
|
Internet Architecture Board On behalf of the Internet Society, the Internet Architecture Board oversees the technical and engineering development of the Internet, including the architecture of its protocols, the standards process, and the publication of the RFC series. http://www.iab.org
|
National Laboratory for Applied Network Research NLANR, initially a collaboration among supercomputer sites supported by the National Science Foundation (NSF), today offers support and services to institutions that are qualified to use high-performance network service providers such as Internet2 and the NSF's very high speed Backbone Network Service (vBNS). http://www.nlanr.net
|
First Monday An English-language, peer-reviewed media studies journal based in Denmark. http://firstmonday.dk
|
Intranet As a private network based on Internet protocols and technologies (TCP/IP, Web servers and browsers), an intranet is restricted to the members of a particular organization, for example a company's staff.
|
Caching Caching is a mechanism that attempts to decrease the time it takes to retrieve data by storing a copy at a closer location.
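A cache in miniature: in the sketch below, the dictionary stands in for the "closer location" and slow_fetch for the distant original; both are invented for illustration.

    import time

    cache = {}  # the "closer location": copies of previously retrieved data

    def slow_fetch(key):
        # Stand-in for retrieval from the distant original source.
        time.sleep(1.0)
        return f"data for {key}"

    def get(key):
        if key in cache:
            return cache[key]      # hit: served from the nearby copy
        value = slow_fetch(key)    # miss: pay the full retrieval cost once
        cache[key] = value
        return value

    get("page.html")   # slow: ~1 second, fills the cache
    get("page.html")   # fast: answered from the cached copy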
|
Defense Advanced Research Project Agency (DARPA) DARPA (Defense Advanced Research Projects Agency) is the independent research branch of the U.S. Department of Defense that, among its other accomplishments, funded a project that in time was to lead to the creation of the Internet. Originally called ARPA (the "D" was added to its name later), DARPA came into being in 1958 as a reaction to the success of Sputnik, the Soviet Union's first satellite. DARPA's explicit mission was (and still is) to think independently of the rest of the military and to respond quickly and innovatively to national defense challenges. In the late 1960s, DARPA provided funds and oversight for a project aimed at interconnecting computers at four university research sites. By 1972, this initial network, now called the ARPANET, connected computers at dozens of research sites. http://www.darpa.mil
|
Internet Software Consortium The Internet Software Consortium (ISC) is a nonprofit corporation dedicated to the production of high-quality reference implementations of Internet standards that meet production standards. Its goal is to ensure that those reference implementations are properly supported and made freely available to the Internet community. http://www.isc.org
|