CCPA: Industry Concerns, Consumer Fears

In the second part of a two-part feature, we hear from some industry voices about CCPA. The interviews were conducted before the bill was passed unanimously late last week.

Neil Sweeney, founder and CEO of Freckle IoT, told DMN, “The government now has to put forward more aggressive measures and fines to get the attention of the companies who are using this data. Today’s platforms have an open-ended license on identity and they are finding increasingly new ways to invade privacy.” Google Home, a recent rival to Amazon Alexa, has already run into controversy regarding its data security.

Speaking to the likely effects of the initiative on marketing, Sweeney argued, “The CMO needs to understand that the future data management platform (DMP) will not be Oracle, Salesforce, Facebook or Google. Instead, it will be a micro-DMP, owned and controlled by the consumer. People speak about monopolies, but in reality, the monopolies of today will collapse due to the greed around data and the ushering in of new legislation and new models that will upend the companies who could not lay off their current way of operating.”

Victor Wong, CEO and Founder of Thunder Experience Cloud, told DMN that the initiative could negatively affect consumers by trickling over into entertainment and diminishing ad-supported content. “Heavy consumers of ad-supported content are of course most affected and this is the majority of people,” Wong explained. “Few people, if any, exclusively pay and consume completely ad-free content. Of course, just like how losing some digital privacy is a silent cost you don’t directly see the impact of, losing content due to worse advertising economics from poorer targeting and measurement can also be hard to directly feel by any one consumer.”

Wong also pointed out that state-by-state regulations are a potential headache for CMOs. He said, “CMOs operate nationally or globally so having to abide by state-level privacy laws will be a nightmare. Every marketer wants to have ethically sourced and responsibly used data but they also want clear standards across as many markets as possible so they don’t accidentally violate regulations or have to excessively invest in compliance across too many standards.”

Rich Kahn, CEO and co-founder of eZanga, suggested another solution. “If you’re going to use my personal data to make money off me, at least cut me a check for it,” he told DMN. “I think if that process was monetized for the consumer whose data was being bought and sold, magically you’d have a lot more people okay with the level of data being passed around on their behalf.”

The Power of Data

Senator Richard Blumenthal, a Democrat from Connecticut, referenced GDPR when he stated, “Americans deserve no less privacy than Europeans.” However, to ensure that new regulations are proportionate to the actual threats posed, the public needs to fully understand the severity of data breaches, the context surrounding them, and the power of data mining.

Data is powerful, but have some of its uses been overhyped?

Cambridge Analytica demonstrated that people’s personalities and political persuasions can be algorithmically deduced from seemingly minor data, such as Facebook likes. However, psychometrics, in its current form, may have been oversold by those with vested commercial interests in its many applications. Aleksandr Kogan, a researcher at the heart of the Cambridge Analytica scandal who also received Russian government grants, has rejected the notion that harvested Facebook data successfully swayed the election, dismissing the idea that the firm orchestrated “some kind of mind-control project” as science fiction. Essentially, corporate puffery could have led to disproportionate fears, and those fears have now informed lawmaking.

People are now struggling to get a sense of what is and is not possible through data mining. Senator Mazie Hirono of Hawaii observed that the predictive value of technological vetting programs appears questionable and problematic. Cambridge Analytica whistleblower Christopher Wylie echoed this, casting doubt on the idea that algorithms can easily weed out criminals or terrorists. In his testimony, Wylie told Sen. Hirono that “there is no mathematical way to determine whether someone is a bad person.” He then pointed out that even the most advanced neural network would fail at this task if its underlying training set used systematically biased information. Skewed statistics and incomplete information can undermine the accuracy of predictive technologies. Conclusions can appear reputable and mathematical when they are merely a reflection of social and moral biases and human error.

Politicians and citizens have to bear this in mind when they evaluate new technology regulations, such as this statewide initiative. Otherwise, they’ll be responding to unsubstantiated fears. By crafting legislation to address types of data abuse that are currently technologically implausible, they might overlook more immediate dangers. For the time being, however, all eyes will be on the new legislation out of California. The consumer rights recently won in that state could soon extend to the rest of the nation.
