Internet Audience Statistics Are Inexact

Direct marketers that have embraced the Internet in anticipation of its measurement capabilities are among the most frustrated over the growing disharmony between the producers and consumers of Internet audience statistics.

The exasperation of Internet measurement firms and online marketers alike echoes across the pages of publications from The Wall Street Journal to the Sacramento Bee. Yet we still lack solutions to a problem that grows steadily more important as marketers, Internet companies and the new economy itself find their fortunes tethered to the unfulfilled promise of Internet measurement.

The crux of the problem is this: Internet measurement firms and Internet companies have dramatically different views of the size and growth of Web site traffic. A data reconciliation study completed in the fall of 1999 involving several sites, advertising consortiums and every major audience measurement firm found huge discrepancies between third-party and client-side data for all but the largest sites. However, an examination of those large sites illustrates that they, too, are confused by the ambiguities of today’s Internet measurement products.

For example, The Wall Street Journal reported last September on Yahoo’s surprise at Media Metrix’s claim that the portal had experienced a percentage drop in unique users when Yahoo’s data showed the opposite. AltaVista was similarly frustrated earlier this year when its solid January/February traffic growth was contradicted by a reported drop in unique users. Smaller sites are even more consistently dumbfounded by inexplicable traffic fluctuations reported by audience measurement firms.

Given the increasing reliance on Internet data and Web site rankings by advertisers, journalists and analysts, the importance of Internet statistics cannot be overstated. Yet the validity of today’s widely published metrics is questioned by many, casting a long shadow over an issue that Internet technology was originally poised to illuminate.

Why are the various sources of Internet data so disconnected? Part of the problem stems from the definition of data points. Conventionally, Web sites count unique browsers or Internet protocol addresses while measurement firms count unique individuals – two unequal currencies of Internet traffic.

While Web sites might emphasize how often multiple individuals use a single computer (and thus count as only one unique IP), measurement firms point to duplicate and deleted cookies (which inflate site-side counts of unique browsers above and beyond the number of unique individuals). Consequently, the exact relationship between users and IPs remains unknown.
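The two distortions described above pull in opposite directions, which a small simulation makes concrete. This is an illustrative sketch only: the user names, IP addresses and cookie IDs are invented, not real measurement data.

```python
# Hypothetical example: shared computers deflate IP-based counts,
# while cookie deletion inflates browser/cookie-based counts,
# relative to the true number of unique individuals.

def audience_counts(visits):
    """visits: list of (user_id, ip_address, cookie_id) tuples."""
    users = {u for u, _, _ in visits}
    ips = {ip for _, ip, _ in visits}
    cookies = {c for _, _, c in visits}
    return len(users), len(ips), len(cookies)

visits = [
    # Three people share one home PC: one IP, one cookie.
    ("alice", "10.0.0.1", "c1"),
    ("bob",   "10.0.0.1", "c1"),
    ("carol", "10.0.0.1", "c1"),
    # One person deletes cookies between sessions and is issued
    # a fresh cookie ID each time.
    ("dave",  "10.0.0.2", "c2"),
    ("dave",  "10.0.0.2", "c3"),
    ("dave",  "10.0.0.2", "c4"),
    ("dave",  "10.0.0.2", "c5"),
]

people, ips, cookies = audience_counts(visits)
print(people, ips, cookies)  # 4 individuals, 2 unique IPs, 5 unique cookies
```

Here four actual individuals appear as two unique IPs but five unique cookies, so which figure a firm reports depends entirely on which currency it counts.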

Another part of the problem has to do with the limited coverage measurement firms provide for massive vertical markets and usage segments. Multiuser access points like schools, libraries and Internet cafes have been completely (and intentionally) overlooked by measurement services that, in addition, have yet to figure out how to account for the growing wave of non-PC Internet access platforms.

Moreover, measurement firms themselves will admit to having severe problems in getting an accurate view of at-work Internet usage. Built-in obstacles (firewalls and suspicious information systems managers, variations in company size and sector) are partly to blame, but it is no coincidence that the oldest and largest measurement firms have virtually no background in researching the corporate environment to complement their extensive consumer research expertise.

Perhaps the greatest overall disappointment in Internet measurement firms stems from their weak international efforts. Until recently, the U.S.-based measurement firms had made no effort to account for non-U.S. traffic, and today they offer products covering only at-home usage in a handful of the largest international markets. AltaVista, for one, has been greatly affected by this since the majority of its traffic – nearly 60 percent – comes from outside of the United States, and much of this traffic occurs during local work hours. For all of these reasons, international audience measurement statistics remain questionable at best.

Because the Internet as a commercial medium is less than 5 years old, conflicting standards, inadequate coverage and unsatisfied consumers should come as no surprise. The problems associated with Internet statistics have been aired by all, but the time has come to move beyond the mere identification of problems and move on to finding the answers.

Addressing the Problem With Internet Statistics

Some things are clear. First, there is no acceptable reason for the existence of multiple measurement firms that obtain different results from the same basic methods and data. Ideally, the industry should have a single provider that can apply the best technologies and methods to offer a universally accepted third-party standard.

Second, log file data alone from companies like AltaVista and Yahoo offer no panacea, particularly when there are few rules for how such data should be captured or reported. Therefore, measurement and auditing firms, standards consortiums and online players must cooperate to create, adopt and promote entirely new hybrid reporting standards to coexist with the third-party measurement data.
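To see why uniform capture rules matter, consider how site-side metrics are typically derived from raw server logs. The sketch below is a simplified illustration, not any company's actual reporting pipeline; the log lines are invented, and the filtering choices in the comments are exactly the kind of rule that varies from site to site.

```python
# Minimal site-side ("log file") measurement sketch, assuming a
# simplified Common Log Format:
#   ip - - [timestamp] "METHOD /path PROTO" status bytes
import re

LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)[^"]*" (\d{3}) \S+$')

def site_side_metrics(log_lines):
    page_views = 0
    unique_ips = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # malformed line; some sites would count these too
        ip, method, path, status = m.groups()
        # Count only successful GET requests. Another site might also
        # count redirects, or filter out robots and image requests --
        # each choice yields different "official" numbers.
        if method == "GET" and status == "200":
            page_views += 1
            unique_ips.add(ip)
    return page_views, len(unique_ips)

sample = [
    '10.0.0.1 - - [01/Mar/2000:10:00:00 -0800] "GET /index.html HTTP/1.0" 200 1043',
    '10.0.0.1 - - [01/Mar/2000:10:00:05 -0800] "GET /search HTTP/1.0" 200 2210',
    '10.0.0.2 - - [01/Mar/2000:10:01:00 -0800] "GET /missing HTTP/1.0" 404 512',
]
print(site_side_metrics(sample))  # (2, 1): two page views, one unique IP
```

Because each site makes its own filtering decisions, two sites with identical traffic can legitimately report different totals, which is precisely why shared reporting standards are needed alongside third-party measurement.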

Third, strategies for building at-work panels must be drastically improved. Today, the most common method is to have at-home panelists place software on their work computers, an extremely questionable sampling method. A “front-door” approach, in which vendors gather data in partnership with companies, must supplant the tactics currently used.

Finally, audience measurement firms and data consumers alike simply must be more cautious in their acceptance and interpretation of international traffic measurement. We are all understandably hungry for global Internet data. But as marketers, we have surely learned that questionable data is far worse than no data at all. We must have the discipline to be patient and recognize that the globalization of the Internet will outpace our ability to obtain solid data for some time to come.

Ultimately, the solution to the problems with Internet measurement may lie in a wholesale rethinking of the metrics themselves. Why unique visitors instead of unique IP addresses? Why months or weeks instead of days or day parts? Why page views instead of search referrals or search queries?

In any case, the metrics standards on which we hang our hats should be whatever gives us the most relevant, most accurate information. On the Internet, we didn’t land on a set of measurement standards; they arbitrarily landed on us. Now is the time for Internet industry players to unite and make good on the promise of accurate audience measurement that the Internet offered so long ago.

Ken Neibaur is vice president of marketing at AltaVista Co., Palo Alto, CA.