From search intent to sentiment analysis

In show business, it is said that there is no such thing as bad publicity – as long as they spell your name right. While the merits of such a statement are debatable, the rise of the Social Web requires that online marketers everywhere start learning a thing or two about measuring buzz.

To level the playing field, I define the Social Web as a mass of interlinked, user-generated content born out of pervasive broadband access and facilitated publishing platforms, such as blogs and YouTube. What does this look like in real life? Your television commercial is all over the Web, but it just so happens that the work of a 13-year-old with decent video-editing skills has outranked you on the search engines. And what he and his audience are saying about your brand does not always fit neatly within the agency’s creative brief.

For the old regime, the Social Web is a PR nightmare. Time-tested techniques go out the door when customers’ voices carry as much – if not more – weight than your own. However, what one gains is much more valuable than iron-clad control. The conversation between searchers and content generators is a cup of insight that runneth over.

Or it would be, if we could just crack the sentiment analysis nut. While multiple firms measure online buzz and are making early attempts at gauging tone, I have yet to be satisfied that we understand the online conversation, let alone how it affects brand perception and sales. Let’s break this down. Assuming your firm has invested in search engine marketing, it is probably someone’s full-time job to understand search intent. When searchers use the word “bad,” does it mean that it is, in fact, “good”? Or is it just plain “bad”?

Understanding an individual’s blog is a bit easier. The context allows one to assess the tone of an individual post, or even of the collective comments on that post. What is hard is assessing the cumulative impact of hundreds of blog posts and thousands of comments. While I would love to believe that the answer lies in technology, I’m afraid that we aren’t quite there yet.

I’m not alone.

One firm I recently spoke with finally conceded that its automated sentiment accuracy sits just under 80 per cent, primarily because sarcasm is so widely used by bloggers. BuzzLogic, a firm that plays in the buzz-monitoring field, also notes on its FAQ page that “During our beta period, our customers shared that the automated nature … rarely was able to detect nuance or sarcasm in [blog] posts – so the results were often inaccurate. BuzzLogic enables customers to rate the tonality of content themselves and run reports to see how their actions impact the sentiment of influential bloggers over time.”
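
To see why sarcasm trips the machines up, consider a deliberately simplified sketch of the word-counting approach many automated tools start from. The lexicon and example below are my own hypothetical illustration, not any vendor’s actual method.

# A minimal, hypothetical lexicon-based sentiment scorer (an illustration only,
# not how any particular buzz-monitoring vendor actually works).
POSITIVE = {"great", "amazing", "love", "good"}
NEGATIVE = {"bad", "awful", "hate", "terrible"}

def naive_sentiment(post: str) -> str:
    """Count lexicon hits in the post; ties come back 'neutral'."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# A sarcastic post reads as glowing praise to the word counter.
print(naive_sentiment("oh great another amazing update and now my phone is a brick"))
# -> "positive", even though any human reader hears the complaint

A word counter has no ear for irony, which is exactly the gap that pushes firms such as BuzzLogic to hand tonality ratings back to their customers.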

While I hold out hope that we will continue to inch forward, it appears that sentiment analysis, like all technology, is only as good as the humans who develop it.
