Use Multiple Techniques to Harness Searches

In the past decade, we have spent a tremendous amount of time teaching companies how to harness their marketing resources to reach the most qualified respondents possible.

In traditional direct marketing, a central focus has always been list selection. If you are mailing to or phoning prospects who have a better-than-average likelihood of needing your company’s products or services, the chances of a successful campaign are greatly increased.

The same philosophy holds true on the Internet. There are many ways to steer traffic to a company’s Web site, but none is more direct, more relevant or more cost-effective than search engines. Developing and maintaining a presence in the major engines, such as Google, AltaVista and NBCi, should be a high priority for all organizations. That’s because most individuals seeking a supplier of a particular product or service, and who initiate their search on the Web, are apt to use a search engine as their primary discovery vehicle. Furthermore, a person who uses a search engine in this manner is likely to be a serious buyer, not just a tire kicker.

When charting a plan to increase visibility within the search engines, two strategies should be employed. The first is to ensure that the home page of the company’s Web site is optimized for review by the “spiders” of the individual search engines. The second is to use an entirely external approach to drive qualified traffic to the site.

First, is your Web site search engine friendly? Search engines send software programs called spiders to a site to gather data about it. The first thing a spider looks for is a file called “robots.txt.” The file lays out the law for each spider: it controls which parts of your site should be visited and which should not. If the spider does not find this file, then depending on the error code the server returns, it may skip the site entirely.
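As an illustration, a minimal robots.txt file might look like the following (the directory names here are hypothetical):

    # These rules apply to every spider
    User-agent: *
    # Keep spiders out of these directories; everything else may be visited
    Disallow: /cgi-bin/
    Disallow: /private/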

The next hurdle for a spider is the actual HTML code. Plain code without Java, Perl, Flash or other Web-enhancing technologies is the best food for a spider. The data it collects come from HTML tags and the text on the page; data embedded in pictures or in dynamically created pages can be missed.
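For example, a spider can index the text in the first line below but sees nothing in the second, because the words live inside the graphic; the alt attribute in the third line restores them (the company and file names are hypothetical):

    <p>Acme Widgets: industrial widgets since 1952.</p>
    <img src="banner.gif">
    <img src="banner.gif" alt="Acme Widgets: industrial widgets since 1952">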

Is the site being submitted correctly and at appropriate intervals? This question has two parts; both are significant. Web sites that are properly submitted and listed in the top 10 engines can produce results in more than a thousand engines. This happens because the smaller and often more specialized engines are meta search engines, meaning they draw on other search engines’ results. When submitting to an engine that “deep crawls,” submit only one page, and the spider will do the rest of the work. When submitting to an engine that visits only the page sent, the other pages also must be submitted for review.

Another challenge of the search engines is the frequency of indexing. Some engines may lie dormant, come alive, spider, index and go dormant again; in that case, unless you submitted during the brief acceptance period, it is not possible to get into the index. Some engines randomly trim their databases. The best way to stay on top of this problem is to submit at regular intervals. Webmasters must keep track of their positions in the search engines and give attention wherever positions are lacking.

Are the correct pages being submitted? It is not enough to submit once and forget about it. It is a good idea to keep a submission list for your records, including which engines you submitted to, how often, spider visits, positions and which pages went to each engine. Google requires only one page to be submitted, whereas AltaVista and Inktomi require each page to be submitted separately.
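Such a record does not need to be elaborate. A simple log along these lines is enough (the entries shown are hypothetical):

    Engine      Pages submitted      Last submission   Spider visit   Position
    Google      home page only       March 1           March 9        No. 6 for "blue widgets"
    AltaVista   home + 14 subpages   February 20       March 2        No. 12 for "blue widgets"
    Inktomi     home + 14 subpages   February 20       pending        not yet listed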

Are your keywords embedded in the pages where they need to be? If the only place a company’s name appears on a Web page is within a graphic, then a search engine will not associate the company name with that page. The best way to feed a spider is with text. Text can go almost anywhere on the page, especially in the title, alt, meta keywords and meta description tags.
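A bare-bones illustration of those tags, again with hypothetical company and file names, might be:

    <html>
    <head>
    <title>Acme Widgets - Industrial Widgets and Fasteners</title>
    <meta name="keywords" content="widgets, industrial widgets, fasteners">
    <meta name="description" content="Acme Widgets manufactures industrial widgets and fasteners.">
    </head>
    <body>
    <img src="logo.gif" alt="Acme Widgets logo">
    <p>Acme Widgets manufactures industrial widgets and fasteners.</p>
    </body>
    </html>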

In addition to optimizing the site, it can be extremely beneficial to establish positions in the engines through a second, altogether different approach: creating positions through domains other than the primary one the company uses as the platform for its Web site.

When assisting companies with this initiative, we assign a unique domain for every keyword that a company provides us to market on its behalf. The domain name itself is not significant; what matters is that we use the domains to make submissions to each of the engines. Unlike the home page of a company’s Web site, which has to be designed as a one-size-fits-all entity to appeal to each of the engines, the external domain approach provides the opportunity to send different information to each of the engines that is in sync with what that engine is looking for.

When someone clicks on a link that emanates from one of these other domains, the person is seamlessly transferred to the company’s home page, or to a particular page within the site. That’s important, because if the home page deals with multiple product lines or services that the company offers, then it is vital to make sure that the prospect is promptly directed to the data he is seeking (before he gets frustrated and decides to leave the site altogether).
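One simple way to accomplish such a transfer, assuming the external domain hosts a small bridge page, is a meta refresh that forwards the visitor immediately to the relevant page of the main site; a server-side redirect accomplishes the same thing. The URLs below are hypothetical:

    <html>
    <head>
    <title>Blue Widgets</title>
    <!-- "0" means forward immediately to the URL that follows -->
    <meta http-equiv="refresh" content="0; url=http://www.example.com/blue-widgets.html">
    </head>
    <body>
    <p>Taking you to our blue widgets page ...</p>
    </body>
    </html>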

These techniques are key ingredients in leveraging the power of the search engines as a client acquisition tool. They are designed to take advantage of the openings the engines present, gain entry into their listings and guide interested prospects to sites relevant to their searches.

• Andrew Wetzler is president and Joe Laratro is chief technology manager at MoreVisibility.com, Boca Raton, FL, which provides search engine optimization consulting services. Reach Wetzler at [email protected], and Laratro at [email protected].
