“Which search engine is best?” Nobody knows

Dogpile.com, a meta-search engine with perhaps the most poorly chosen brand name in the history of the Internet, recently published a research report that found big differences in the search results produced by the four major search engines (Google, Yahoo, Windows Live and Ask, which account for about 92 percent of US query volume). Using 19,332 random queries gathered over several days, the researchers demonstrated that each engine provides a unique “editorial view” of the Web, with minimal “overlap” (common results) shared among the engines.
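The "overlap" metric described above can be made concrete with a minimal sketch. This is not Dogpile's actual methodology, which the report does not fully specify; the function name, the engine names, and the choice to define overlap as the share of distinct URLs returned by more than one engine are all illustrative assumptions.

```python
from collections import Counter

def overlap_share(results_by_engine):
    """Illustrative overlap metric (an assumption, not Dogpile's exact method).

    results_by_engine: dict mapping an engine name to its list of result
    URLs for the same query. Returns the fraction of distinct URLs that
    appear in two or more engines' lists.
    """
    # Count, for each distinct URL, how many engines returned it.
    counts = Counter(url for urls in results_by_engine.values() for url in set(urls))
    if not counts:
        return 0.0
    shared = sum(1 for c in counts.values() if c > 1)
    return shared / len(counts)

# Hypothetical first-page results for one query across three engines.
page_one = {
    "EngineA": ["a.com", "b.com", "c.com"],
    "EngineB": ["b.com", "d.com", "e.com"],
    "EngineC": ["f.com", "g.com", "h.com"],
}
print(overlap_share(page_one))  # only b.com is shared: 1 of 8 distinct URLs
```

Averaged over thousands of queries, a number like the study's 0.6 percent would indicate that the engines almost never agree on what belongs on the first page.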

The report’s conclusions (which argue that meta search engines provide a better view into the Web’s 40 billion documents) are predictably self-serving. But Dogpile has done us all a favor by demonstrating what many have long suspected: today’s search engines aren’t objective, aren’t comprehensive, rarely agree on what’s relevant, and provide a far too frustrating user experience.

Am I the only person on the planet who’s shocked by the idea that search – the most important information utility in the world – is so subjective? Would any of us tolerate this kind of subjectivity if phone books, card catalogs, or TV guides were organized so that they shared results just 0.6 percent of the time?

Dogpile’s study didn’t break out commercial from non-commercial queries, so it’s impossible to quantify precisely whether searches for, say, “digital camera” or “Caribbean cruise” diverge more than non-commercial queries do. Anecdotal evidence suggests, however, that organic results for commercial queries diverge radically among the engines, and that “overlap” (agreement) is minimal.

Given the importance of search today, much more research needs to be done to establish the suitability-to-task of each engine. If I’m looking for a new printer, am I going to find the best results in Google? If I’m going on vacation and want to book a flight, should I go to Ask, or will I be wasting my time there? If it turns out that Live.com is the best search engine to find information about building supplies or musical artists, why doesn’t Microsoft seize this opportunity and broadcast it to the world?

“Which search engine is best?” is a very basic question without any satisfying answers. What’s needed is an objective body to appraise their suitability-to-task, but no one has stepped forward to provide this kind of guidance. One might think that an organization such as Consumer Reports, which maintains a site called WebWatch.org whose purpose is to “investigate, inform, and improve the quality of information published on the World Wide Web,” would have already done so. Alas, WebWatch.org’s reports on search engines have been limited to discussing how well the engines distinguish paid from organic results in their search engine results pages. Worse still, Consumer Reports issues its reports sporadically (the last one appeared more than two years ago). Nor does it appear that the engines are being systematically evaluated at universities: the most recent study I was able to find was from 2000, and search engines have changed a lot since then.

Search engine task suitability is a crucial question for consumers, marketers and the industry itself. There’s a critical need for a neutral third party organization to come forward, rate the engines, and publish objective, unbiased results. If Consumer Reports and academia aren’t up to the job, perhaps we in the industry need to start thinking about forming such an organization ourselves, investigating these issues, and issuing timely and authoritative reports.

Is there anybody else out there who’s unhappy with the performance of search engines? Do you want to see them more systematically evaluated? Send me an e-mail and let’s discuss.
