
Sentiment: beyond the text

“All the world’s an Internet,” I recently wrote, and at this year’s Alta Plana-organized Sentiment Symposium in New York, the cutting-edge presentations focused repeatedly on a vision of individuals as packets of information, distributing actionable data along their vectors through the physical world, just as they leave traces of their browsing activity online.

Not Just the Text

In the future, sentiment analysis won’t be only textual. That was the clear message. Of course, there’s going to be no short-term shortage of text-based data, generated especially by the global social conversation. But where once there seemed to be insurmountable problems differentiating sincere assertions from sarcasm, irony, or humor, vendors now seem to have text analytics well in hand. 

We heard from Lexalytics about leveraging machine learning to improve the accuracy of parsing natural language syntax. We heard from Kanjoya about using machine learning to develop probabilistic language models that track sentiment as a more accurate and flexible alternative to keywords. To put it simply, a machine that recognizes phrases associated with positive or negative attitudes, and that continuously expands and corrects its own semantic understanding, is more useful for large data sets than a machine that can search for “hurray” or “boo” if instructed to do so.
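By way of illustration only, and not any vendor’s actual system, here is a minimal sketch of that contrast in code: a fixed keyword lexicon versus a simple learned model that generalizes from labeled examples. The tiny training set, the lexicon, and the scikit-learn pipeline are all assumptions made purely for demonstration.

# Illustrative only: a fixed keyword lexicon versus a learned text classifier.
# The tiny labeled corpus below is invented purely for demonstration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Approach 1: static keyword matching -- it only finds words it was told about.
POSITIVE = {"hurray", "love", "great"}
NEGATIVE = {"boo", "hate", "awful"}

def keyword_sentiment(text: str) -> str:
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "unknown"

# Approach 2: a learned model -- it can generalize to phrasing it was never
# explicitly given, provided similar wording appears in its training data.
train_texts = [
    "hurray, the service was great",
    "I love this brand",
    "this exceeded my expectations",
    "boo, what an awful experience",
    "I hate waiting on hold",
    "never buying from them again",
]
train_labels = ["positive", "positive", "positive", "negative", "negative", "negative"]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

new_text = "exceeded every expectation I had"
print(keyword_sentiment(new_text))   # "unknown" -- no lexicon word present
print(model.predict([new_text])[0])  # likely "positive", learned from examples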

“Positive and negative polarities are not sufficient and not really useful,” said Jarred McGinnis of Meaning Cloud. It’s important to know who is talking–to turn “strangers into acquaintances”–which implies close links between sentiment analysis and customer profiling. But as McGinnis says, “There’s too much content. We can’t do this any more at a human scale.”

Not Just Analytics

Sentiment analytics may be well in hand, but the data–not so much. Another emphasis repeated by symposium participants over the two days was the importance of data quality. I kept hearing the same message I’d heard at the Integrated Marketing Week conference last month: big data isn’t enough–data needs to be aggregated and used in a controlled, business-directed fashion. Rob Key of New York-based agency Converseon emphasized, both on stage and in conversation with me, that “data quality is the issue.” Accuracy in gauging customer sentiment had been an “uphill battle over the last few years,” but Converseon maintains a commitment to human as well as automated analysis, and claims to be generating cleaner data sets directed to specific business outcomes–precisely by not trying to handle all the data out there.

There’s now conclusive evidence, according to Converseon, that sentiment is causal when it comes to sales; it’s no longer merely an interesting curiosity. The data is especially valuable when it’s predictive, and Converseon claims an 80 percent correlation between sentiment analysis based on clean data sets and brand tracking results–and brand tracking can be expensive when based on traditional survey methods.

Anjali Lai of Forrester’s Data Insights Innovation team echoed the need for human input. It’s one thing to gather vast quantities of social listening data, but quantitative metrics alone won’t explain it to you. Lai proposed a portfolio of methodologies: to quantitative analytics, she adds survey, behavioral, and MROC (market research online communities) data. “It’s not about the accuracy of the analytics tool,” she said: a multiple-source approach is necessary.

The “Real” World

But why linger on these enduring topics of natural language processing and quantitative analytics when people are expressing sentiments in their real lives–to the extent people still exist outside Twitter, Facebook, and Instagram? Faces and bodies dominated a number of sessions at the Symposium. Gwen Littlewort showed how Emotient brings deep learning in the cognitive sciences to the practical business of mining sentiment from facial expressions. She demonstrated the identification of individual emotions through a live video stream of her own face; more ambitious was pre-captured video showing how Emotient can detect sentiment and attention (where people are looking) in real time, and on the scale of large crowds. The obvious application is evaluating the effectiveness of advertising at something like a sports event or a concert.
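As a rough illustration of the front end of such a pipeline, and emphatically not Emotient’s actual technology, the sketch below uses OpenCV’s stock Haar-cascade detector to find faces in a live webcam stream and hands each face to classify_emotion, a hypothetical placeholder standing in for a trained deep-learning model.

# Illustrative front end of a facial-sentiment pipeline: real-time face
# detection with OpenCV. classify_emotion() is a hypothetical stand-in for a
# trained emotion-recognition model.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_pixels) -> str:
    # Placeholder: a real system would run a trained model on the face crop.
    return "neutral"

capture = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label = classify_emotion(frame[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
capture.release()
cv2.destroyAllWindows()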

Cultural differences in response? Affectiva has that covered. Principal Scientist Daniel McDuff explained that the MIT spin-off is creating a global emotional data repository, currently at 11 billion data points and counting, from 75 countries. The more data gathered, the more accurate its emotion-detection algorithm becomes, across cultures.

Scott Amyx, however, isn’t stopping with your face. Amyx is founder and CEO of Amyx McKinsey, an agency strategizing for smart wearables and the Internet of Things. How does that connect with sentiment? The Internet of Things, said Amyx, is generating “hard, quantifiable” brand engagement metrics by tracking individuals’ offline journeys. Survey data has a lag effect, but electro-dermal activity, which some researchers have used as a surrogate for emotional states, can be tracked in real time; for example, by smart fabrics. And then there are blink patterns to be evaluated by headsets, gait patterns to be derived from pressure-sensing soles, as well as facial analytics and the detection of emotion in speech.

Is this just futurist talk? According to Amyx, unnamed major corporations are reviewing these options right now. “Online metrics can be recreated in an offline context. This is the future of marketing.”

Don’t Say Orwellian

Of course, the prospect of corporations, to say nothing of governments, being able to track our emotions through our waking, and perhaps even our sleeping, lives does not afford unalloyed delight to everyone. Consider Arria, a natural language generation platform: it automatically creates text content in response to data. The potential to produce texts tailored to individuals, in real time and at scale, is a game-changer for commercial messaging, especially when linked to agile, precise customer profiling tools. For example, it’s possible to deliver product descriptions customized by an individual’s personal product research (plus demographics, etc.).
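To show the shape of data-to-text generation in miniature, here is a generic sketch, not Arria’s platform or API: a template filled from a hypothetical customer profile, with field names and wording invented for illustration.

# Minimal, generic sketch of data-to-text generation: fill a template from a
# hypothetical customer profile. Real NLG platforms do far more (content
# selection, aggregation, grammatical realization), but the data-in, text-out
# shape is the same.
from string import Template

COPY_TEMPLATE = Template(
    "Since you've been comparing $category options under $budget, "
    "the $product may be a good fit: it is $highlight and ships to $city."
)

def generate_copy(profile: dict) -> str:
    return COPY_TEMPLATE.substitute(profile)

profile = {
    "category": "trail-running shoe",
    "budget": "$120",
    "product": "Ridgeline 3",
    "highlight": "lightweight and waterproof",
    "city": "Denver",
}
print(generate_copy(profile))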

But Arria CTO Robert Dale posed some questions about how these tools should be used. Are individuals comfortable with such swift messaging, in effect “targeted at you”?

The potential to capture sentiment and intent offline as well as online, and use the data to inform laser-focused marketing strategies, is scientifically fascinating (there were plenty of recent and current academics in this audience) and commercially exciting. It also has the potential, in turn, to generate further sentiments–and sentiments brands won’t want to hear.

But good luck in getting this genie back in the bottle.
