Courting the Fickle Search Engine

The Internet is a vast repository of information without a uniform system for filing, access or storage. Of the 2.7 billion existing Web pages, the largest search engine can locate 60 percent; the smallest finds a mere 2 percent, according to NEC Research of Princeton, NJ.

Why do you care? Search engines drive traffic to your site, and quality traffic at that: people actively seeking something relevant to what you have published. Being prominently listed in a search engine’s results presents your site to prospects and customers immediately after they have expressed a direct interest in your topic.

Direct marketers live for this moment. You deliver the right message at the right time to the right audience. It is the chemical formula for conversion. There is no greater moment of kismet in cyberspace.

Yet these magical moments come neither early nor often. Achieving consistently favorable search engine presentation requires the instincts of a cryptographer, the focus and intensity of a brain surgeon and the insights of a psychic. For the past few years search engines have played a complex game of cat-and-mouse with site owners seeking “search optimization.” Making the continual moves and countermoves necessary to rank among the top 20 results is a full-time job.

To assist you through this changing array of algorithms designed to tag, sort, file and retrieve your site and its content, here’s a guide to search optimization.

Search engines are digital reference librarians. There are three garden varieties.

Spiders are computerized robots that automatically pull up pages and tag, sort and file by machine, based on preprogrammed assumptions. They are necessary because new Web pages are being created much faster than anybody can count or catalog them. Indexes depend on humans to do the tagging, sorting and filing. “Pay-for-play” engines allow you to jump the line for money. There’s not much more to it than deciding which searches should trigger your listing and how much that placement is worth to you. The best ones are Ask Jeeves, GoTo, RealNames and Google.

The pure engines are WebCrawler, AltaVista, HotBot, Inktomi and Excite. These engines rank sites by sending a robotic spider to catalog the pages on your site. Different spiders take different things into account. And, just to make it interesting, they change their matrices all the time. That’s why a site can rank in the top 20 one day and in the top 50 the next. Search engines are moving targets. You have to monitor them constantly to be successful.

Pure search engines automatically retrieve documents from the Web and follow all built-in links. The “bot” leaves a catalog trail of both. They can start anywhere and, like natural predators, they move according to their own rhythms. Spiders feed pages to the search engines which, in turn, feed them to searching surfers.

The best way to influence spidered results is to focus on four critical variables:

• Domain. A strong component of the search engine algorithm is whether the keyword is in the domain name. For example, if you search for books, a domain containing the word “books” is a sure bet. The closer your domain name mirrors what you are or what you do, the better.

• Title. Does the search term appear in the page title? Every page on the Web has a title. If the search term is in the title, the bots pick it up more easily. Placement within the title also matters: large title words repeated often at the top score best. If you understand a bot’s behavior and play to it, you can earn higher rankings.

• Links. Search engines look at the number of links as well as the relevance of the sites to which your site is linked. Some, like Google, count and rank links to assign your position when search results are presented to users. More links imply more relevant, more heavily used content, which can translate into a better ranking for your site.

• Meta tags. Meta tags are keywords or phrases listed in the HTML heading text that users do not see. They are critical tools for optimizing search results, even though not all engines use them: Excite and Lycos have programmed their bots specifically to ignore meta tags and any attempt to influence rankings through them.

Though meta tags are invisible to the viewer, the spider reads them and often ranks sites based on the number of times the keyword appears there. Site designers once loaded meta tags with the word “sex” until search engines got wise to the practice. Today you must be careful that a word is not repeated too many times, or the engine’s spam detectors might automatically exclude the page from its results.

But nobody knows the magic number. And the engines change it often. Most experts agree that if a keyword is important, it should be repeated a minimum of seven times, preferably in large fonts at the top and in the title of each page.
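To make the title and placement advice above concrete, here is a minimal sketch of a page head and opening body. The site and the keyword “garden tools” are hypothetical, chosen purely for illustration:

```html
<!-- Hypothetical example: the target keyword appears in the title
     and is repeated in large headings at the top of the page -->
<html>
<head>
  <title>Garden Tools - Discount Garden Tools and Supplies</title>
</head>
<body>
  <h1>Garden Tools</h1>
  <h2>Great Prices on Garden Tools for Home Gardeners</h2>
  <!-- page content follows -->
</body>
</html>
```

The keyword appears in the title and twice in prominent headings near the top, without the excessive repetition that might trip a spam detector.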

The two most important types of meta tags and phrases for search engine indexing are:

• Descriptions. Each Web page has a brief, unseen summary description. If you do not create it, the bot will. By writing the description yourself, in place of the summary the search engine would otherwise generate, you describe and categorize your own page, which substantially increases the chance of ranking high in searches for your topic. Keep descriptions under 25 words: search engines return a set number of characters, and if yours is too long it will be cut off, along with your chances of optimizing results.

• Keywords. Each page can be coded with keywords that direct how your site is indexed and sorted. If you provide keywords for the search engine to associate with your page, you define when, and by whom, you will be found. Search engines efficiently find pages bearing relevant keywords, but they make no evaluative judgments among pages with the same keywords. Only 21 percent of Web pages use keyword and description meta tags, so using both works in your favor.
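Putting the two tag types together, a page head carrying both a description and a keywords meta tag might look like the sketch below. The site, the description text and the keyword list are all hypothetical:

```html
<head>
  <title>Garden Tools - Discount Garden Tools and Supplies</title>
  <!-- Description: kept under 25 words so engines do not truncate it -->
  <meta name="description"
        content="Discount garden tools, supplies and advice for home gardeners.">
  <!-- Keywords: the terms you want the engine to associate with this page -->
  <meta name="keywords"
        content="garden tools, gardening supplies, pruners, trowels">
</head>
```

Both tags sit inside the HTML head, so visitors never see them, but a spider that honors meta tags will read them when it catalogs the page.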

Directories or indexes use humans to edit, sort, tag and file Web pages. Yahoo is the largest one. It has more than 80 editors reviewing content and determining whether to add sites to its database. Other major indexes are LookSmart, Lycos, Infoseek and Netscape. Engines that make at least partial use of indexes include America Online and Microsoft Network.

When indexes search for keywords, they take four metrics into account:

• URL: Is the keyword in the URL?

• Title: Is the keyword in the title?

• Description: Is the keyword in the actual description of the site that the viewer sees? The description is usually written by an editor, but the editor often uses part or all of a description provided by the site. Because the index is scouring these descriptions for keywords, controlling the description is crucial to search engine success.

• Traffic: Some indexes, and some pure engines, take traffic into account when determining rank. They rank your site based on the number of times people click through to it from their results. If your site is getting more clicks than the site above you, you move up in the order of presentation.

With an index, conforming to emerging norms is the best bet. Remember that human editors are looking for the easiest, fastest way to get the job done. Piggybacking on trends simplifies their job. Look carefully at descriptions on other sites and conform as much as possible to the style and length that prevails within the index. Each index has its own editorial norms.

Many of the larger indexes, including Lycos, Netscape and AOL, reference the Open Directory Project, a directory whose database is maintained by volunteer editors who approve or reject submissions. Align your site with them.

If you are the kind of person who imagines yourself in a musty, airless room cracking the Japanese “Purple” code, then search optimization work is for you. The process of ensuring top rankings is a continuous one, requiring constant refinements based on Web site modifications and changes to search engine criteria.

It is almost impossible to keep abreast of the mercurial and unannounced changes that engines and indexes make to determine rankings. You need expertise and time to even have a shot at doing this right. Now that you know the basics, consider outsourcing.

• Danny Flamberg is a senior vice president and managing director at Digitas Inc., New York. His opinions are his alone. Reach him at [email protected]. Palmer Jones, a strategic analyst at Digitas, contributed to this article.
