“If you’re not currently doing it, you’re not actually behind. But you need to be aware of it.”
That’s Seth Grimes, founder of IT strategy consultancy Altaplana, on the topic of applying sentiment analysis to video and image data. And if Grimes says you need to be aware of it, you need to be aware of it. He must be the digital world’s most tireless, resilient, and voluble advocate for digging business value out of online attitudes, opinions, and emotions–it all comes under sentiment analysis for him.
The issue is incapable vendors misdefining sentiment analysis
His Twitter feed is a constant source of trenchant opinion–recent example: “The issue is incapable vendors misdefining sentiment analysis as +/- scoring, excluding emotion & intent analytics”–and around this time of year, it also serves to fire up the crowd for the Sentiment Analysis Symposium he curates annually in New York. With the event coming up next month, it seemed a good time to get Seth on the phone and find out what’s currently exciting him.
He cites intent data–something we’ve seen marketers struggling to leverage. Sentiment analysis, by definition, goes deeper than tracking topic mentions, but when it evolves from scoring emotions to evaluating signals of intent (“how people’s language indicates what they’re going to do”), he sees “a lot of overlap with the customer experience use case.” In other words, sentiment analytics have a predictive role to play when it comes to plotting the customer journey.
Biometrics, interpreted correctly, don’t lie
Grimes is also paying attention to wearables, which have the potential to generate a rich, real-time stream of data relevant to determining mood. “Biometrics, interpreted correctly, don’t lie,” said Grimes, offering the example of eye-tracking, which can determine what people are actually looking at, rather than what they believe or report they’re looking at. Given the track record of simple biofeedback machines, where blood pressure or skin temperature signals furnish actionable data on response to stimuli, we’ve hardly begun to think about what wearables could tell us.
When I profiled Lexalytics recently, I was struck that a major player in the text analytics field was not seeking to apply sentiment analysis to video or images. Acknowledging Lexalytics’ point that text will remain a major challenge, Grimes nevertheless talked about the growing interest in “omni-channel” analytics, covering not just text, but personal interactions. He mentioned Affectiva’s project of coding facial expressions, as well as Facegroup’s work to develop Pulsar as a visual insight tool, analyzing images on Instagram, Pinterest, Tumblr and Twitter.
And then there are the hieroglyphics. That’s how Grimes refers to emoticons and emojis–visual emotive signals which also constitute a data stream to be analyzed.
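To make the idea concrete, here is a minimal sketch of treating those hieroglyphics as an analyzable data stream. The emoji-to-polarity mapping and the `emoji_sentiment` function are illustrative assumptions, not any vendor’s actual lexicon or API:

```python
# Illustrative emoji polarity lexicon -- values are assumptions for the sketch.
EMOJI_POLARITY = {
    "😀": 1.0, "😍": 1.0, "👍": 0.5,
    "😐": 0.0,
    "👎": -0.5, "😠": -1.0, "😢": -1.0,
}

def emoji_sentiment(text: str) -> float:
    """Average polarity of any known emoji in the text; 0.0 if none appear."""
    scores = [EMOJI_POLARITY[ch] for ch in text if ch in EMOJI_POLARITY]
    return sum(scores) / len(scores) if scores else 0.0

print(emoji_sentiment("Loved the new release 😍👍"))  # 0.75
```

Even this toy version shows why emojis matter: a post with no evaluative words at all can still carry a clear emotional signal.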
It was at this point that he agreed that not everyone needs to be on top of visual sentiment analytics just yet. As for what brands should be doing, they should be looking at whether their social media management solutions–most of which incorporate at least basic sentiment analysis–go deeper than just binary scoring of positive and negative mentions, and also look at context.
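The difference between binary scoring and context-aware analysis can be sketched in a few lines. The tiny keyword lexicons and the `score_mention` function below are placeholders I am assuming for illustration; real tools use far larger vocabularies and models:

```python
# Sketch: beyond binary +/- scoring -- mixed/neutral labels and negation context.
# All word lists are illustrative assumptions, not a real sentiment lexicon.
POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"terrible", "hate", "broken"}
NEGATORS = {"not", "never", "no"}

def score_mention(text: str) -> str:
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    # Context matters: a preceding negator flips the local polarity.
    for i in range(1, len(words)):
        if words[i - 1] in NEGATORS:
            if words[i] in POSITIVE:
                pos -= 1
                neg += 1
            elif words[i] in NEGATIVE:
                neg -= 1
                pos += 1
    if pos and neg:
        return "mixed"
    if pos:
        return "positive"
    if neg:
        return "negative"
    return "neutral"

print(score_mention("not great"))  # negative -- a binary word count would say positive
```

A purely binary scorer would count “not great” as a positive mention; accounting for context, however crudely, is what Grimes means by going deeper.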
Some verticals should be looking for solutions with domain adaptation capabilities; in other words, the capacity to learn the vocabularies specific to, for example, the hospitality sector, or electronic goods, or healthcare. Most brands, of course, won’t want to add a sentiment analytics product to their own stack–not unless, Grimes said, it’s the kind of brand which already has “an analytically-minded team.” But they should know what their social media partners, or the agencies looking after their social listening, are doing to monitor feelings and intent.
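Domain adaptation can likewise be sketched as a base lexicon with per-vertical overrides. Every lexicon entry here is an assumption chosen to illustrate the idea that the same word carries different weight in different verticals:

```python
# Sketch of domain adaptation: a base lexicon overridden per vertical.
# All entries are illustrative assumptions, not real product vocabularies.
BASE = {"good": 1, "bad": -1, "cold": -1}

DOMAIN_OVERRIDES = {
    "hospitality": {"cozy": 1, "noisy": -1},
    "electronics": {"cold": 0, "lag": -1},   # a "cold boot" is not a complaint
    "healthcare": {"positive": -1},          # a "positive" test result is bad news
}

def domain_score(text: str, domain: str = "") -> int:
    lexicon = dict(BASE)
    lexicon.update(DOMAIN_OVERRIDES.get(domain, {}))
    return sum(lexicon.get(w, 0) for w in text.lower().split())

print(domain_score("cold boot", "electronics"))  # 0 -- neutral in this vertical
```

The design point is that the vocabulary, not the scoring machinery, is what gets swapped per sector; that is the capability brands in specialized verticals should be asking their vendors about.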
One more highlight from the upcoming symposium? MotiveQuest and “online anthropology”–the application of higher level motivational concepts to provide a context for the analytics. Always at the edge.