(In)direct Response

Marketing results are rarely as straightforward as they might appear at first glance.

During a recent review of fresh marketing-performance data, Regus VP of Marketing Rebecca Tann and CEO Mark Dixon saw that since the company had kicked off its latest campaign, marketing leads had increased by 98%. Regus, which provides office suites, virtual offices, and conference spaces for remote workers, had invested in direct response TV (DRTV), radio, and online marketing efforts for the new campaign. The catch was that most of the prospects had come to Regus through search and online ads, not through the phone number or the specialized URL in its radio and TV spots.

“So, [Mark] looked at the pay-per-click and website returns and said, ‘Those are through the roof—you don’t need media,’” Tann recounts. But digging deeper into the data revealed that analyzing results isn’t that simple.

Although many customers took their final step toward signing up with Regus through the website and searches for terms like “flexible workspace,” the leads seemed to spike during periods when the company had been running DRTV and radio spots, which suggested that these channels were also driving online traffic.

Suspicions were confirmed through data pulled from the dropdown menu the marketing team had placed on Regus’s landing page. The menu asked visitors to specify where they had heard about the company, and it turned out that DRTV and radio had been major drivers of Regus’s online traffic during the campaign.

Many marketing executives are having discussions with their CEOs similar to Tann’s with Dixon. Direct response attribution used to be almost as easy as including a phone number or coded Web address on an ad. A consumer would click or call, and the path to purchase was clear. But consumer behavior has become much more complex in recent years thanks to review sites, social media, and the mobile Web. Most notably, consumers not only hop from channel to channel, but also research products while channel shifting, which blurs what used to be a more linear purchase process.

In this altered purchasing landscape, marketers have to rely on faster data gathering and more actionable metrics to get a clearer picture of what makes an ideal marketing mix. As a result, marketers are increasingly viewing direct response ads not only as sales drivers, but also as branding and awareness-building tools.

Untangling the knots

This blur between branding and direct response means that marketers need ways to measure indirect lift. As Regus found, just because a piece of advertising with a call-to-action doesn’t immediately lead to a conversion doesn’t mean its design failed.

The trick is to attribute value to something when the consumer doesn’t follow the prescribed path-to-purchase. It’s an issue that AT&T Mobility is working out. The telecommunications giant’s direct response creative, designed by the Hacker Group, includes interactive ads for mobile devices like the Samsung Galaxy S Skyrocket or the HTC One X, inviting users to click for more information.

But these clicks aren’t always immediate—consumers may be too wrapped up in the piece of content the advertisement is funding to interact with the ad. Dana Cogswell, executive director of marketing for AT&T Mobility, says that the company has restructured its attribution efforts to recognize direct sales, as well as indirect lift, to other channels, including email and postal mail.

“There isn’t one marketing asset that’s having the most impact,” Cogswell says. “It’s optimizing the marketing asset mix that’s critical to ensuring our sales success.”

Currently, AT&T relies on control groups and statistical analysis to determine how best to attribute sales and lift. But Cogswell cautions that brands trying to work through attribution issues need to maintain vigilance. “What we’ve found is that this space requires constant testing and optimization,” he says. “What you thought worked great one quarter ago, or sometimes even last month, might be overtaken by something new.”
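The control-group comparison AT&T describes can be sketched in a few lines: hold out a random slice of the audience from a campaign, then compare conversion rates between exposed and held-out groups. The function and all figures below are illustrative assumptions, not AT&T data.

```python
# Hypothetical sketch of control-group lift measurement. A campaign is
# shown to a "treated" group while a randomly selected "control" group
# is held out; the difference in conversion rates is the campaign's lift.

def incremental_lift(treated_conversions, treated_size,
                     control_conversions, control_size):
    """Return (lift in conversion rate, estimated incremental conversions)."""
    treated_rate = treated_conversions / treated_size
    control_rate = control_conversions / control_size
    lift = treated_rate - control_rate
    # Conversions the campaign itself drove among the treated group
    incremental = lift * treated_size
    return lift, incremental

lift, extra = incremental_lift(1_200, 50_000, 800, 50_000)
print(f"lift: {lift:.2%}, incremental conversions: {extra:.0f}")
```

The same comparison can be rerun per channel or per creative, which is one way to keep "testing and optimizing" as Cogswell advises.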

And the learning process is ongoing. Cogswell is aware that he doesn’t know as much as he wants to know about how certain creative elements affect customer actions or buying propensities. “In the future we think we’ll be able to map more clearly which customers have interacted with these social tactics and then subsequently purchased,” he says.

In a way, the multiplicity of digital channels has created a contradiction for direct response marketers. By dint of their work, they want results quickly: How many individuals responded to a certain direct response ad? But the indirect paths by which consumers move from consideration to purchase mean those results are delayed, forcing marketers to adopt the dual goals of short-term sales and long-term brand building. And when marketers build attribution models and strategies, they must take this duality into account.

Slow and steady wins the race

This is precisely what Regus did when it entered the market. The company has been active in Europe for 25 years, but is a relatively new brand in the U.S. Its main goals throughout 2011 and 2012 included attracting customers and raising its profile throughout the country.

Additionally, in recent months the company has focused on promoting its 700 brick-and-mortar locations. At the start of 2013, with brand awareness higher and 90% of its spaces occupied, Regus shifted its marketing spend to more local direct response advertising aimed at filling those specific locations.

“Having a national campaign has given us a good foundation for what works, and we can parlay that recipe into local markets,” says Regus’s Tann. When the company ran national TV and radio campaigns, it saw that its branded Google searches went up 30%. Even when ads direct customers to a specific URL, prospects often use search to find the company instead. Regus installed a query on its website asking how visitors had heard of the company; it also added a similar question to its contact center interactions.

So how does Regus determine which elements of its marketing are actually driving value? Largely, by being patient. Consider Regus’s efforts in Manhattan: In the second quarter of 2013 it ran a campaign that combined direct response radio and TV spots with more localized creative and calls-to-action; these included targeted online ads, as well as promotions on outdoor assets.

“Since our properties aren’t branded on the outside, we try to advertise on the actual building,” Tann explains. Regus also ran promotional print ads in local publications, and mailed postcards with the offer of a free day pass to the company’s Midtown Lounge, which opened in May, for those who scanned a QR code or visited the special URL regus.com/747 (the office is located at 747 Third Avenue). The ads also included a contact phone number and email address.

Although marketers typically want data quickly, in this instance Regus found value in stepping back and looking at the performance of the quarter’s overall campaign activity. By looking at the total marketing performance of its New York City campaign over a period of about two-and-a-half months, the marketing team determined that the top three sources of conversions were radio, print, and outdoor signage. This information will be applied to Regus’s Q3 campaign when it launches.

Marc Solomon, director of analytics for marketing firm Catalyst, applauds this long-view approach to data gathering, and cautions against getting caught up in the minute-to-minute feedback that vendors’ dashboards and ROI trackers can provide.

“When you’re trying to assess the effectiveness of a particular marketing program or channel, you need to look over a significant period of time,” Solomon says. “It may not need to be quarterly, but the program should be measured in weeks, not hours.”

When direct response isn’t direct

Regus was savvy in its approach of gradually aggregating user feedback to determine the efficacy of its direct response assets. But for many marketers it’s still quite difficult to know what exactly a customer is responding to.

Men’s clothing retailer Bonobos, for example, needed to determine where its customer journey began to properly allocate marketing dollars and resources across direct response and branding-oriented channels. “If we were solely going on a last-click basis…when [customers] typed ‘Bonobos’ into the URL bar, then that would receive credit,” says Craig Elbert, VP of marketing at the retailer. “What we want to know is, how did they first hear about the brand, where do we build that brand awareness?”

Bonobos works with marketing attribution firm Convertro, which encourages the company to think beyond crediting the last click and instead to understand the data as customer narratives. This is because the path-to-purchase for a first-time buyer is going to be very different from that of a repeat customer.

“For the new customer, we want to know how they were first exposed to the brand [and] how did we create the demand?” Elbert says. “[Whereas] with a repeat customer, we know that they heard about us because they purchased before, so we want to know what nudged them across to make their most recent purchase.”

Working with Convertro, Bonobos devises various rule sets based on what types of stories it’s trying to tell. For a new customer whose first touch is branded search, that touch alone doesn’t tell marketers how he first heard about the company; looking at a second or third touch is necessary to see what other information the company can derive.
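A rule set in the spirit of what Bonobos describes might credit a new customer’s first meaningful touch (skipping branded search, which only confirms the customer already knew the brand) while crediting a repeat customer’s last touch. The function, channel names, and journeys below are invented for illustration, not Convertro’s actual rules.

```python
# Hypothetical attribution rule set: different crediting logic for new
# vs. repeat customers, as described in the Bonobos example.

def credit_touch(journey, is_new_customer):
    """journey: ordered list of channel names for one customer's path."""
    if is_new_customer:
        # First meaningful touch: skip branded search at the front,
        # since it doesn't explain how the customer first heard of us.
        for channel in journey:
            if channel != "branded_search":
                return channel
        return journey[0]  # journey was nothing but branded search
    # Repeat customer: what nudged the most recent purchase?
    return journey[-1]

print(credit_touch(["branded_search", "display_ad", "email"], True))   # display_ad
print(credit_touch(["email", "branded_search", "direct"], False))      # direct
```

In practice such rules are one layer of a model; the point is that the crediting logic is explicit and can be changed as the story the marketer wants to tell changes.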

Convertro recommends looking at advertising more deeply than on a channel-by-channel basis; in fact, the company breaks each marketing channel down into three levels. For instance, a direct response spot on CNN would be categorized first by marketing channel (TV), then by the particular network on which it ran (CNN), and finally by the specific spot. From this granularity, an algorithm can make precise suggestions about actions to take going forward. For example, analysis of one campaign’s results might prompt a recommendation to decrease all budgets except social and DRTV by approximately 37%, while raising TV by $82,000 and social by $62,000 to grow profit by about 7%.
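The three-level breakdown can be pictured as tagging every conversion with a (channel, property, placement) tuple, so results roll up at whichever level is useful. The data below is invented for illustration.

```python
# Minimal sketch of a three-level channel taxonomy: each conversion is
# tagged (channel, property, placement) and can be aggregated at any level.
from collections import Counter

conversions = [
    ("TV", "CNN", "spot_A"),
    ("TV", "CNN", "spot_B"),
    ("TV", "ESPN", "spot_A"),
    ("social", "Facebook", "promo_1"),
]

by_channel = Counter(t[0] for t in conversions)    # channel-level roll-up
by_property = Counter(t[:2] for t in conversions)  # network-level roll-up

print(by_channel)
print(by_property)
```

Budget recommendations like the 37%/$82,000 example in the text would come from an optimization model sitting on top of aggregates like these.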

It’s a different approach from the marketing mix modeling (MMM) that companies like Regus use. MMM is popular because it looks at how marketing activities and external factors affect sales, thereby determining the value of various marketing endeavors. How much lift, for instance, will a certain DRTV campaign provide to sales? How has the emergence of a new competitor affected conversions?

Constant vigilance

Whatever approach marketers take to determine the impact of direct response mechanisms on sales, the data deluge means that understanding where customers are coming from and why they’re buying requires constant vigilance.

“We pride ourselves on being a data-driven organization,” says Andrew Mok, senior manager of business intelligence at RelayRides, a car-sharing marketplace in which individuals can rent their cars to others in the network by the hour or day. Because the service RelayRides sells is a local, community-oriented one, much of the brand’s marketing is geo- and demographically targeted. Its direct response channels include email and search engine marketing, and it invests heavily in search engine optimization (SEO). Yet the data show that more than 50% of customers’ final clicks before purchase come from owned media.

Working with marketing attribution firm Domo, Mok and his team have been able to get all that data into one dashboard, drawing on five to eight sources at once, which the team tracks “religiously” according to Mok: “Every hour, on the hour.” The software also allows RelayRides to connect its marketing campaign data to revenue, assessing the return from various campaigns and the value of the customers those campaigns attracted.

“We try to get down to the dollar portion of it, looking at conversions [and] ROI,” Mok says. “If we spent $10,000 investing in an SEO campaign for owner acquisitions in New York City, then how much did we really get in dollars?”
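The dollars-in, dollars-out check Mok describes reduces to simple ROI arithmetic once revenue has been attributed back to the campaign. The function and numbers below are an illustrative sketch, not RelayRides figures.

```python
# Sketch of the "how much did we really get in dollars" check: ROI as
# net return per dollar spent, using invented numbers.

def campaign_roi(spend, attributed_revenue):
    """Return ROI as a ratio: (revenue - spend) per dollar spent."""
    return (attributed_revenue - spend) / spend

# e.g. a $10,000 SEO campaign later attributed $34,000 in revenue
print(f"{campaign_roi(10_000, 34_000):.0%}")  # 240%
```

The hard part, as the article stresses, is not the arithmetic but the attribution step that produces `attributed_revenue` in the first place.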

“You have to track the source of the lead all the way through to the revenue system,” adds Heather Zynczak, CMO of Domo. “I can click and say, ‘LinkedIn is giving me a strong return, while search is poor.’ I can holistically look at money I spent and attribute it.”
