
Stop Relying on Search Engines, Diversify Your Keyword Strategy

Whether prospective customers are Googling, Binging, or Yahooing, publishers, marketers, and advertising networks use keyword tracking to serve users more relevant information and advertisements. However, a fundamental change in how search engines track and share information is eroding access to that data.

Since the launch of AdWords, Google’s search data has been widely available, giving publishers and advertisers access to keyword analytics and rankings that enable customized content and targeted ads – fueling the evolution of real-time intent and real-time bidding. According to a recent comScore report, Google processes 66.7 percent of the web’s searches, sending billions of consumers to websites every day. In the past couple of years, however, Google has erected walled gardens in the name of stronger privacy policies.

While seemingly benign at first glance, this change, among others, negatively affects thousands of sites and companies that rely on this data. Changes like the one Google has made leave a gaping hole in publishers’ ability to identify the source of traffic or the search terms that led visitors to their sites. As a result, some websites and publishers are unable to personalize the user’s experience; they lose the user’s attention more quickly and may serve an unrelated advertisement, because they no longer have the user’s real-time intent on which to base the decision.

For marketers and publishers, these changes mean looking for additional ways to decode users’ real-time intent. One way to do this for a specific website is to implement tools that track users’ behavior and activity on the site itself. Publishers can then analyze how a user navigates through the website, opening up the opportunity to capture the user’s intent data directly. Real-time intent is, and will continue to be, the most important data for marketers and publishers.
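The idea of inferring intent from on-site navigation can be sketched in a few lines. This is a minimal illustration, not any particular vendor's tracker: the event data and page paths below are entirely hypothetical, and a real implementation would collect events client-side and aggregate far more signals.

```python
from collections import Counter, defaultdict

# Hypothetical on-site page-view events: (user_id, page), in order
events = [
    ("u1", "/home"), ("u1", "/shoes"), ("u1", "/shoes/trail"),
    ("u2", "/home"), ("u2", "/shoes"),
    ("u3", "/home"), ("u3", "/shoes"), ("u3", "/shoes/trail"),
]

def navigation_paths(events):
    """Group page views by user into the ordered path each one took."""
    paths = defaultdict(list)
    for user, page in events:
        paths[user].append(page)
    return paths

def most_common_path(events):
    """The full navigation path seen most often -- a crude real-time intent signal."""
    counts = Counter(tuple(path) for path in navigation_paths(events).values())
    return counts.most_common(1)[0]

print(most_common_path(events))
# → (('/home', '/shoes', '/shoes/trail'), 2)
```

Knowing that most visitors walk from the homepage toward a specific product category is exactly the kind of signal a publisher can use to personalize content or pick a relevant ad without any search-referral keywords.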

With privacy concerns ever changing, publishers and marketers will also need to look to other approaches for compiling user search data. To do this, marketers will want to diversify where their website traffic data comes from, which is good practice in general. With multiple data sources, marketers can see the bigger picture of which top keywords deserve their attention. Indeed, a recent Forrester report, Why Amazon Matters Now More than Ever, found that 30 percent of online consumers already use other approaches, including Amazon, to research their online purchases, while only 13 percent research products through search engines.
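Merging keyword data from several traffic sources to surface the overall top keywords can be sketched as follows. The source names and counts here are hypothetical placeholders; the point is simply that combining per-source tallies gives a fuller picture than any single source.

```python
from collections import Counter

def top_keywords(sources, n=5):
    """Merge per-source keyword counts and return the n most common overall."""
    combined = Counter()
    for counts in sources.values():
        combined.update(counts)
    return combined.most_common(n)

# Hypothetical keyword counts from three different traffic sources
sources = {
    "search":   {"running shoes": 120, "trail shoes": 40},
    "social":   {"running shoes": 30, "marathon training": 55},
    "referral": {"trail shoes": 25, "marathon training": 20},
}

print(top_keywords(sources, 3))
# → [('running shoes', 150), ('marathon training', 75), ('trail shoes', 65)]
```

A keyword that looks minor in search-referral data alone may rank near the top once social and referral traffic are counted, which is the argument for diversifying data sources in the first place.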

To choose the best keywords for their websites, publishers and advertisers should follow these four steps:

  • Become the user/reader – Think ahead. What additional information might the reader be looking for? What assumptions are made within the content on a publisher’s site that need additional information to support the idea?
  • Know the site’s mission – If the publisher’s mission is to be readers’ primary source of information about a particular topic, then the publisher should create links for most or all relevant topics. If the site’s mission is solely to make money, then focus on links to destinations that generate more revenue, such as high-RPM pages or affiliate destinations.
  • Be specific – Choose keywords that are explicit to the reader, so that readers don’t become skeptical of links and buttons that fail to communicate their destination clearly.
  • Know your links – Take a look at high-performing pages. Review the links, the ratio of links to words in the article, how each link fits the publisher’s mission, and whether the keywords make sense for where the link goes.
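The “know your links” audit above can be reduced to a simple link-to-word ratio check across high-performing pages. This is a rough sketch with hypothetical page data; what counts as over- or under-linked will vary by site and mission.

```python
def link_density(word_count, link_count):
    """Links per 100 words -- a rough gauge of over- or under-linking."""
    if word_count == 0:
        return 0.0
    return round(link_count / word_count * 100, 2)

# Hypothetical high-performing pages pulled from an analytics export
pages = [
    {"title": "Trail Shoe Buying Guide", "words": 1200, "links": 18},
    {"title": "Marathon Training Plan",  "words": 800,  "links": 4},
]

for page in pages:
    print(page["title"], link_density(page["words"], page["links"]))
# → Trail Shoe Buying Guide 1.5
# → Marathon Training Plan 0.5
```

A page well below its peers on this ratio is a candidate for more links to relevant topics; one far above may be eroding reader trust.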

Marketers will also want to invest in building their website’s relationships with other sites, social networks, and syndication partners. The data compiled from these sources will help maintain the ability to personalize the user experience and will provide visibility into where readers are coming from, based on clicks from partner sites. Finally, websites will want to watch for changes in traffic patterns in addition to referring terms. By analyzing the traffic patterns around new content, archived content, and overall interest in specific items on the site, publishers will still be able to learn deep insights about users’ intent, regardless of the actions taken by Google.

Pete Sheinbaum is the founder and CEO of LinkSmart, Inc. Follow him @sheinbaum and @linksmart on Twitter.
