Who’s ready for voice search? Is voice search up and ready for marketers?
These questions have picked up over the last year, and one way to answer them is to look at how businesses and consumers are responding to this emerging tech.
Location marketing solution company Uberall, with experience in mobile search, released a report this month on Voice Search Readiness (VSR). It zeroes in on five of the most common questions users ask about voice search:
1. What does voice search readiness (VSR) mean?
2. How can I measure voice search optimization?
3. How can a business become voice search ready?
4. What is the future of voice search?
5. Why should I care about voice search?
As the study discovered, most businesses are in the very early stages of their voice search journeys: fact-finding and soul searching. Uberall found that only four percent of business locations are voice search ready. Of the 73,000 business listings analyzed, nearly half of the locations had errors in their “Opening Hours.” The study looked closely at Google, Bing and Yelp to determine a business’s VSR. Uberall also found that, unlike Google, Apple Maps “has very little impact on optimizing for voice search.”
Google has been stingy with releasing data on its voice search. The Uberall study points to the last time we saw numbers from Google, in 2016. Even way back then, one in five mobile searches was conducted by voice. Uberall found that 21 percent of respondents were using voice search on a weekly basis. Ten percent use it daily. Meanwhile, 57.2 percent never use voice search.
At the same time, these searches aren’t just happening now on mobile. From 2017 to 2018, Uberall found, the number of voice-operated smart speakers in U.S. homes climbed from 66.7M to 118.5M.
How many of the non-voice-using majority are waiting for voice search’s next killer app? Will their voice ever be heard?
As the research points out, voice is now a standard feature in new cars. The stakes are high when consumers are already out on the road to visit a location and ask for directions through a navigation app, using voice. The results come back one at a time, so there are stiff penalties for advertisers who come in second.
Will star power increase adoption rates? As reported this week, Google is launching a voice assistant option that sounds like recording artist John Legend. The voice was engineered with WaveNet, a speech-synthesis technology from DeepMind, Alphabet’s London-based AI unit, which also powers the company’s Duplex phone bots.
AI-powered conversational advertising and commerce, on the other hand, is moving forward aggressively. It’s just not tied directly to voice search.
Conversational AI platform Amplify.ai is supported on Facebook, Facebook Messenger, WhatsApp, and Instagram, as well as other desktop, mobile web and in-app integrations. As of last week, the company also linked up with AdLingo, a conversational advertising platform developed by Google’s experimental unit, Area 120.
Amplify.ai’s chief marketing officer, John McCrea, told me that, currently, beauty brands have been early adopters of these ads, “with use cases such as having a conversational interaction to help a consumer find the right product for their skin or hair type.”
McCrea explained, “We’re in the early phases of a massive shift toward messaging-based Conversational Commerce, something that started in China and is already at multi-hundred billion dollar scale there, and which is now spreading around the world, driven by the dominance of messaging. Key to unlocking the potential of Conversational Commerce is AI, which enables B2C conversational engagement at superhuman scale — and our industry-first enterprise-class Conversational AI platform, which begins to make possible engaging with prospects and consumers via messaging all along the customer journey.”
McCrea added, “Our integration with AdLingo is focused on the beginning of that journey, when a consumer connects with a brand via a display ad. Through our integration, the experience within the ad becomes truly conversational, engaging the prospect and bringing them down the funnel.”
Voice search would only enter into this engagement if it were the source of search results within a web page. “Thanks to very advanced speech-to-text and text-to-speech, we can think of voice and messaging as two modalities for Conversational AI,” McCrea elaborated. “The major differences that arise are ones of user experience (UX) design, with one being visual (and therefore capable of combining rich media with text) and one being auditory. Of course, there is the ability to combine these, such as in the case of voice search generating a visual result.”
Augmented call centers
In terms of further down-funnel CX-related interactions, AI is still primarily used to competently handle repetitive tasks to free up human reps so they can deal with more complicated customer issues. AI-driven chatbots, however, are rapidly improving, according to Jeff Epstein, VP of product at omnichannel CX solutions provider Comm100.
“AI has completely elevated what chatbots are able to handle,” Epstein told me. “It’s the ability to understand not just the words in a question, but the customer’s intent behind them that means the difference between a good chatbot and a great one, and with Natural Language Processing and machine learning, chatbots can reach beyond the simplest inquiries to handle increasingly complicated customer requests.”
AI will also use its ability to interpret intent to directly assist call agents. “In this use case,” Epstein said, “the AI will ‘listen’ in on chats, deduce the visitor’s intent, and surface appropriate responses from knowledge-base articles, live chat canned messages, previous chat transcripts and chatbot programming for the agent to then choose to forward to the visitor. In 2019, it’s this latter application of AI that will usher in the ‘Rise of the Super Agent,’ and make human agents faster and more resourceful than ever.”
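To make the agent-assist flow Epstein describes concrete, here is a minimal sketch in Python. It is not Comm100’s actual system; the intent names, keyword lists, and canned responses are all hypothetical, and real products use trained NLP models rather than the simple keyword overlap shown here. The idea is the same, though: infer the visitor’s intent from a chat message, then surface a candidate reply for the human agent to review and forward.

```python
# Hypothetical agent-assist sketch: deduce a chat visitor's intent,
# then surface a canned response for the agent to review.
# Intents, keywords, and replies below are illustrative only.
from collections import Counter

INTENT_KEYWORDS = {
    "billing": {"invoice", "charge", "refund", "payment", "billed"},
    "shipping": {"delivery", "shipped", "tracking", "arrive", "package"},
    "returns": {"return", "exchange", "broken", "defective"},
}

CANNED_RESPONSES = {
    "billing": "I can help with that charge. Could you share the invoice number?",
    "shipping": "Let me check your tracking details. What's your order number?",
    "returns": "Sorry to hear that. I can start a return for you right away.",
}

def detect_intent(message):
    """Return the intent whose keyword set best overlaps the message, or None."""
    words = set(message.lower().split())
    scores = Counter({intent: len(words & kw) for intent, kw in INTENT_KEYWORDS.items()})
    intent, score = scores.most_common(1)[0]
    return intent if score > 0 else None

def suggest_response(message):
    """Surface a canned reply for the agent to forward (or not), or None."""
    intent = detect_intent(message)
    return CANNED_RESPONSES.get(intent) if intent else None
```

In a production system the keyword lookup would be replaced by the natural language processing and machine learning models Epstein mentions, and the suggestion would be ranked alongside knowledge-base articles and past transcripts, but the surface-then-let-the-agent-decide loop is the “Super Agent” pattern in miniature.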