Redefining Data Privacy


“I think that right now is a dangerous time to be a direct marketer,” says Jay Edelson. He should know. The Chicago-based attorney makes his living filing class-action suits against companies that skirt privacy laws. Well, it’s not always privacy laws, specifically. No comprehensive national privacy law exists, only sector-specific ones like COPPA for children’s privacy rights, HIPAA for health information, and the FCRA for consumer credit data. Edelson has called on pre-Internet-era legislation like the Video Privacy Protection Act of 1988 and the Telephone Consumer Protection Act of 1991 to score a reputed billion dollars’ worth of judgments from federal civil courts. He’s currently applying the latter law’s provision against calling and recording people without permission in a suit against Caribbean Cruise Lines. Ever pick up your phone during dinnertime and hear a foghorn go off in your ear? So have millions of others. The TCPA carries a penalty of $500 per violation, and cell phone owners can sue for up to $1,500 per unwanted contact. Edelson figures that the ultimate judgment in this one case could top $1 billion; at $1,500 per call, that would take only about 670,000 illegal calls.

Stu Ingis of the Venable law firm in Washington, DC—ranked by several legal guides as one of the top privacy attorneys in the United States—brands lawyers like Edelson as bottom-feeders who convince courts to see things in laws that are not there. “Look at the Video Privacy Act,” he says, naming the law that forbids video rental operations from disclosing the titles rented by customers. “It was passed before the Internet existed. They take laws and statutes and apply them to Web behavior that they never had anything to do with.”

Ingis was heartened by the Consumer Privacy Bill of Rights, which the Obama administration released in March. He sees it as a conversation-starter for a national privacy bill that could clarify business liabilities concerning personal data and perhaps apply the brakes to class actions brought by lawyers like Edelson. The President introduced his bill without a sponsor in the House or Senate, though in April the House Energy and Commerce Committee approved a data breach notification bill, and Senators Tom Carper (D-DE) and Roy Blunt (R-MO) introduced the Data Security Act of 2015.

Self-governance, guided by data sellers such as Experian and by organizations like the Direct Marketing Association (DMA), remains the data-driven marketer’s best hedge against privacy liability, according to Ingis. The DMA’s “Guidelines for Ethical Business Practice” contains seven pages of prescriptions for the proper collection, use, and transfer of personally identifiable information (PII), and following these golden rules has proven a shield from outlandish fines in the halls of jurisprudence. “In court,” Ingis says, “a stated privacy policy will limit the ability of a frivolous class action to succeed.”

Todd Ruback, a longtime private-practice privacy lawyer who recently joined Ghostery as chief privacy officer, concedes that in the U.S. privacy compliance is largely left to industry and the individual. “I have long advocated for a national privacy law,” he says. “We have sectoral laws like HIPAA and COPPA and we have state laws, but that causes confusion and makes compliance difficult and expensive. We’re one of the few Western countries without a national law. Uganda has a privacy law; we don’t.”

The question, then, hangs in the digitally charged air: With mobile devices disseminating PII to all points of the globe, with billions of connected devices collecting data in every crevice of every home, and with class-action attorneys sifting through that burgeoning data flow like 49ers during the Gold Rush, will self-governance be enough to protect digital marketers from being blindsided by lawsuits or—perhaps worse—skewered in social media in the years ahead?

Keep your word

The majority of judgments levied against marketers through Section 5 of the Federal Trade Commission Act—the workhorse of privacy enforcement—involve a company’s failure to live up to its own privacy policies. Section 5 is known as the “unfair or deceptive practices” clause, and a business’s failure to keep its word is considered a deceptive practice. PLS Group, a chain of loan stores, for instance, did not adhere to its policy of shredding customer financial records before depositing them in dumpsters, and was forced to agree to a $101,000 civil penalty and 20 years of FTC audits. Like Edelson, the FTC takes a slightly circuitous path to policing privacy. Unlike Edelson’s, the FTC’s judgments are not so punitive.

Ruback says that the FTC has a powerful privacy hammer that can levy penalties of up to $11,000 per violation, and that the agency wields it aggressively. It has brought more than 50 Section 5 cases, but it is usually open to settling with penitent violators on much more agreeable terms. One of those terms is the 20-year oversight period, which, in effect, forces the offending company to install all the privacy checks and balances it probably lacked in the first place.

Privacy experts in the business community generally approve of the FTC’s enforcement philosophy, which includes a meeting of the minds with marketers on an opt-out strategy for clearing permission to collect data. One of those experts is Tony Hadley, SVP of government and regulatory affairs at Experian, who notes that so-called “data brokers” such as his company emerged clean from the FTC’s 2012 investigation into their data collection, handling, and selling practices. “The FTC looked at the data broker industry and realized that the practice was much broader than it had thought,” Hadley says. “They came back with recommendations that said, ‘If you’re a data broker, make sure you’re telling people who you are and what data you’re collecting, that you’re using good sources, you’re transparent about what you’re doing with it, and you’re keeping it secure.’ So, that’s what we’re trying to do—implement those very good recommendations the FTC put out there.”

Experian’s extensive privacy policy also makes several promises to customers, among them that it audits its databases to keep them current, reviews clients’ marketing materials to ensure the data is being used for the stated purpose, and allows consumers to dispute information included in their credit reports. It also vows to terminate relationships with offending clients.

“I look at the Consumer Privacy Bill of Rights and say it’s a misnomer. It should be the Business Data Regulatory Act of 2015. All it would do is regulate marketing data,” says Hadley, who’s of the opinion that it has no chance of becoming law in its current state. “The concept that consumers are fearful is something that is, in a practical way, not true. People live their lives on their iPhones and use ATMs and what consumers know is that the world’s really convenient for them and they don’t see any harm coming from it.”

True or false?

Consumer privacy activists don’t buy either the convenience-makes-it-right or the no-harm-no-foul postulates of data-driven marketers. In fact, one of the more vocal detractors of self-regulation and the Consumer Privacy Bill of Rights (too lenient) applies an automotive metaphor to illustrate where the marketing industry’s wrong on this score. “People drive cars because they make their lives so much more convenient. But they also know that cars get into accidents and injure and kill people. They continue to use them because there are laws and rules built around auto use in the physical world,” says Lee Tien, senior staff attorney for the Electronic Frontier Foundation (EFF). “In the digital world it’s different. If my car breaks down, I take it back to my dealer. If, on the other hand, my dealer’s sold my sensitive financial information, I wouldn’t have the slightest idea that that had happened.”

To marketers, big data means more good information. To Tien and the EFF, big data means more false information. Tien holds that sampling biases are bound to result from “found” data—third-party data that was collected for some purpose other than the one to which the current user is applying it. Storing data indefinitely compounds such biases, he says, delivering millions of unwanted and irrelevant messages to people and, in point of fact, making their lives more cluttered and inconvenient. “There are very large problems,” Tien says. “It’s difficult for any person, even a privacy expert, to know what happens with your data.”

The EFF and marketers are in agreement on one thing: They’re both wary of a national privacy law, albeit for different reasons. “We take a jaundiced view of legislation,” Tien says. “There were bills a couple of years ago that had promise, but the federal bills we’ve seen lately are trying to slow the passage of privacy bills in states, many of which have been very effective.” Self-governance, he sniffs, is nothing short of letting the fox rule the henhouse, and other activists are solidly in agreement on that front. Justin Brookman, director of consumer privacy at the Center for Democracy & Technology, applies his own automotive metaphor to the issue: “Right now, private industry is in the driver’s seat.”

Big(ger) data

It’s hard to refute Brookman’s comment—at least here in the U.S.—after hearing from Kitty Kolding, CEO of Infocore. Her data acquisition company collects and sells data in nearly 90 countries worldwide on behalf of such clients as American Express, Cigna, Disney, IBM, Procter & Gamble, and Toyota. To listen to Kolding tell it, data-driven marketers in the States are enjoying a joyride on a high-speed locomotive.

“In the U.S. there are about 50,000 sources of data for reaching individuals, a total of about 70 billion records. That’s an enormous amount of data, about 200 records per individual,” Kolding says. “The next highest you’ll find worldwide is only about six or seven per person, and that’s the U.K. and Japan.”

Not only is data on Americans more bountiful, she says, it’s more robust and cleaner, too. The number of variables that can be searched and found for individuals in this country has no parallel anywhere else in the world. Here in the Land of the Free, even barriers like those erected by emotionally charged sectoral laws like HIPAA and COPPA are surmountable. “We had a project not long ago for an adult incontinence product, and you could find dozens of sources for individuals using that product,” Kolding says. “When we went looking for data sources in other countries such as Brazil, we were told, ‘Are you out of your mind?’”

But there’s a privacy practice lesson here, as well. Infocore’s blue-chip clientele maintains a high level of restraint and decorum when shopping and consuming in the candy store of American data. “Clients like American Express and P&G are very serious about making sure they are inside the lines on privacy,” says Kolding, and it’s not just because they don’t want to mess up a good thing and see U.S. lawmakers adopt European standards. “The biggest reason that you should be on the ethical side of using data is public opinion of your brand. We’ve been fascinated by the damage social media can do to brands that are blatantly outside the lines.”

The list of respectable companies likely to cross that line—intentionally or not—might be bigger than imagined. “I’ve worked with some pretty big online retailers that have inadvertently opened a door and had a breach,” says James Koons, a former information security officer for Army Special Forces who now serves as CPO of Listrak, an omnichannel platform for retailers. “I remember the sentiment at most of these companies. It was like you walked into a quiet office and dropped a flash-bang grenade. There was confusion. They felt violated. When it came down to it, they learned that they had very poor privacy and security practices in place.”

Todd Cullen, former VP of global data for Acxiom, got a more intimate glimpse of the need for strict privacy standards when he joined Ogilvy & Mather as chief data officer in 2012. “For an aggregator, there are always questions on RFPs about physical security and privacy policies. I’m starting to see questions like that here, but just on the interactive side of the business. On the ad side, it never comes up,” says Cullen, who fears that marketers don’t take necessary precautions before launching their data into the cloud.

“It becomes a lot harder to control first-party data if it’s hosted in the cloud. For companies that use cloud-based automation or CRM tools, it’s a constant worry,” Cullen says. “What I’m seeing is clients who are really confused. If you took a snapshot of the data contained within their enterprises, it’s rarely clear where the data originated and under which specific policy it was obtained.”

Take charge

For all the above reasons, the DMA, whose future rides on a vibrant and politically acceptable data-driven marketing industry, urges members to aggressively self-regulate. “If you are without a chief privacy officer at this point, you are far behind,” says Rachel Nyswander Thomas, DMA’s VP of government affairs. “Whether or not legislation moves forward on privacy—and I don’t think it will—businesses have to be on top of their privacy policies every day.”

When speaking to marketing groups, Thomas likes to tell them that their brand reputations are now tied up in data flows, and that if they’ve done their due diligence and can prove it, they should have little to fear in the courts or in social media. She admits, however, that traditional enterprises where data is housed in silos may think they have a handle on data, when in fact they may not. “A wonderful place to look for best practices in privacy is the small startup companies,” she says. “They came up knowing how the technology should and shouldn’t be used. They’ve employed privacy by design and governance from the start.”

But areas of potential data abuse often extend outside of the corporate silo farms and into the organizations that buy data from the silos. The flow of PII will only become more automated and harder to manage with the explosion of the Internet of Things. A report issued by the FTC in January sees the number of connected devices doubling to 50 billion in just the next five years. People on both sides of the privacy issue wonder if legislation could ever keep up with the pace of raging technology. “It’s at a point where tech companies have to be careful trying to innovate because they’ll set up a scenario in which they’ll hold companies hostage to potential liability,” says Ingis, the privacy lawyer. “The class-action bar will jump on any innovation they can apply a statute to.”

But one of the highest-profile members of that bar, Chicago-based lawyer Edelson, argues that his firm is simply protecting the privacy rights of people who are powerless to take on big companies by themselves. A Fair Credit Reporting Act suit that Edelson filed against the search engine Spokeo in 2010 provoked an $800,000 settlement with the FTC. Edelson feels justified in viewing his firm as a de facto consumer watchdog, and he envisions more law firms joining his cause.

“We don’t have the same bureaucracy [as the FTC], so we can make a big impact on privacy violations,” Edelson says. “There are a lot of firms out there beginning to look at privacy issues. The cases can be pretty big.”
