
“Reasonable Use” Is the Data Litmus Test

Where does the responsibility for customers’ privacy lie when it comes to data used for marketing?

Imagine you’re running the biggest search engine in the world. Aha! Someone is asking about Lady Gaga’s next concert. Perfect. Not only do you know the answer, but there’s a broker in Cleveland that has tickets and an advertisement just waiting to be served up. The user clicks and everybody wins. Wait, another search is coming in. This time someone is looking for information about depression. Thankfully, you know about that too and content is on the way; but what ad should you run? It is at this point that we see the jagged edges of predatory marketing. If you had no conscience, you would listen to a statistician and an algorithm. They might tell you the ad that your melancholy customer is most likely to click on is a two-night getaway to Las Vegas with a $100 match play coupon for the blackjack table. Or maybe suggest the vodka ad with the title “Escape” on it. Statistically, these would be better choices than the ad for a book about depression, or an ad for a spa treatment; but at what point do ethics trump mathematics?

The world is in a great transition. Every day we leave a bigger trail of data online. People are becoming more “knowable” at a distance to friends, colleagues, strangers, and, yes, marketers. The challenge for marketers is shifting from discovering information about someone else to marshaling the restraint not to be manipulative. The Web knows more about us than our closest friends do because people “confess” to the Web. They ask search engines questions they would be reluctant to utter to their best friend. They also chronicle their lives 140 characters at a time. In addition to the voluntary disclosures we make online to companies, search engines, and social networking websites, there has been a massive digitization of public records, which are now easily searchable online. In less than 10 minutes you can find out where someone lives (DMV), how much they paid for their home (Zillow), what their grandfather’s occupation was (Ancestry.com), and where they had breakfast last week (Foursquare). Combining this type of information gives us fine-grained insights into how to market to someone. Should we run the ad that highlights the car’s safety features, or the one that shows the elegant couple getting out of the car to walk the red carpet with the sound of camera bulbs flashing in the background? Statistically, there’s a right answer to that question if you mine enough data about your target, but should we act on it?

The politics of privacy

The central issue is a user’s expectation of privacy. The most basic question around privacy and marketing is this: Can you use a specific piece of data that a customer has given you to market to them? Legally, the answer varies from country to country and depends on what the user agreed to when the information was first collected. The question of how you can use information collected from a user for marketing is complex, but things become even murkier when you consider data about a person that they never explicitly volunteered, such as information collected from other websites. Some uses of this data to personalize the user’s experience have become tolerated, even commonplace. Bringing up products from your search history or customizing a landing page is now par for the course for many online retailers. There’s a fine line, however, between “customized” and “predatory.”

As we enter the era of Big Data analytics, privacy has become a flashpoint issue. Many believe that we’re on the verge of stricter legislation within the U.S. to rein in how companies and the government use customer and employee data to make decisions. In areas like healthcare and personally identifiable information, the government has stepped in with legislation on both the handling of this data and the notification of customers if the data is lost or stolen. But protecting customers in the face of ubiquitous analytics cannot be the government’s responsibility alone.

The data analytics field is moving quickly, and the legal process moves slowly. The holders of data need to move toward self-regulation or risk the passage of heavy-handed, innovation-stifling legislation. The litmus test should be: How would a sensible user reasonably expect you to use this data? Barring direct legislation, that reasonable-expectation test is a good guide. It can be difficult to repair a trust relationship with users if they think you’ve gone too far.

********************

Herbert (Hugh) Thompson, Ph.D., RSA Conference US

Dr. Herbert Thompson is a world-renowned expert on IT security. He was named one of the “Top 5 Most Influential Thinkers in IT Security” by SC Magazine in 2006 and has been interviewed by the BBC, CNN, Financial Times, MSNBC, and The Washington Post. Thompson has been an adjunct professor at Columbia University; an advisory board member for the Anti-Malware Testing Standards Organization; an editorial board member of IEEE Security and Privacy magazine; and SVP and chief security strategist at Blue Coat Systems. He is the author (with Bob Sullivan) of the just-published book The Plateau Effect, which looks at why people (and businesses) get stuck and how to break through. “It was a great journey and gave me the chance to talk to some of the most respected psychologists, business leaders, economists, and scientists from around the world,” he says.
