
How GDPR Could Have Prevented the Facebook-Cambridge Analytica Breach

March was a wild month for Facebook Inc. (NYSE: FB) and Mark Zuckerberg. In the wake of revelations that data from 50 million users was acquired by political consulting firm Cambridge Analytica without their permission, the company’s stock plummeted, shedding roughly $100 billion from its market cap by last week’s closing bell.

The 33-year-old CEO has also been called on and agreed to testify before Congress, putting pressure on Twitter CEO Jack Dorsey and Google CEO Sundar Pichai to join him in front of Senate Judiciary Chairman Chuck Grassley’s data privacy hearing on April 10 – a hearing that has the potential to radically change the way American citizens and lawmakers view big tech.

News also surfaced last week that Facebook would cut off third-party data access, meaning that businesses advertising on Facebook will lose the ability to use data from Acxiom, Experian, and other such services to narrow the group of people who might see their ads.

As the GDPR enactment date in the EU bloc draws near, raising the profile of data regulation worldwide, Silicon Valley giants are scrambling to notify users of impending changes to data protection policies. And as GDPR enters more mainstream discussion, people are wondering whether it would have stopped the Facebook data leak from happening.

FTC Agreement

In 2011, Facebook entered a settlement agreement with the FTC after charges that it “deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowed it to be shared and made public,” an FTC press release said at the time. The settlement required Facebook to take several steps to ensure it kept its promises in the future, including giving consumers clear and prominent notice and obtaining consumers’ express consent before their information is shared beyond the privacy settings they have established.

According to the consent order, Facebook was:

·       barred from making misrepresentations about the privacy or security of consumers’ personal information;

·       required to obtain consumers’ affirmative express consent before enacting changes that override their privacy preferences;

·       required to prevent anyone from accessing a user’s material more than 30 days after the user has deleted his or her account;

·       required to establish and maintain a comprehensive privacy program designed to address privacy risks associated with the development and management of new and existing products and services, and to protect the privacy and confidentiality of consumers’ information; and

·       required, within 180 days, and every two years after that for the next 20 years, to obtain independent, third-party audits certifying that it has a privacy program in place that meets or exceeds the requirements of the FTC order, and to ensure that the privacy of consumers’ information is protected.

To reiterate, this was in 2011 – seven years ago. Chalk it up to bureaucratic inefficiency or big corporate entitlement, but the “promise” to the FTC was a broken promise, which raises the question: if GDPR had been in place, would it have stopped the Cambridge Analytica breach from happening? The short answer is yes, but why?

GDPR and Facebook

As we’ve covered numerous times before at DMN, at a base level, GDPR gives consumers ownership of and control over their personal data, including the right to decide whether to share it and the right to delete it. While both the EU and the US have strands of data protection laws and programs in place, GDPR has a wider scope, more prescriptive standards, and substantial fines. For example, it requires a higher standard of consent for using some types of data, and broadens individuals’ rights with respect to accessing and porting their data. It also establishes significant enforcement powers, allowing a company’s supervisory authority to seek fines of up to 4% of global annual revenue for certain violations.

According to a corporate statement, “[Facebook] will comply with current EU data protection law, and will comply with the GDPR. Our GDPR preparations are well underway, supported by the largest cross-functional team in Facebook’s history. We’re also expanding our Dublin-led data protection team which is leading on these efforts.”

Key points of GDPR that would have prevented the Cambridge Analytica scandal include the following (a brief illustrative sketch of consent handling follows the list):

Contractual Necessity

·       Data processed must be necessary for the Service and defined in the contract with the individual

Consent

·       Requires a freely given, specific, informed and unambiguous consent by clear affirmative action

·       People have a right to withdraw consent, which must be brought to their attention

·       Must be from a person over the age of consent specified in that Member State, otherwise given by or authorized by a parent / guardian

·       Explicit consent is required for some processing (e.g., special categories of personal data)

Legitimate Interests

·       Processing is permitted if a business or a third party has legitimate interests that are not overridden by individuals’ rights or interests

·       Processing must be paused if an objection is raised by an individual
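To make the consent requirements above more concrete, here is a minimal, hypothetical sketch in Python of how a service might record per-purpose consent and check it before sharing data with a third party. The ConsentRecord structure, its field names, and its methods are illustrative assumptions for this article, not Facebook’s implementation and not a schema prescribed by GDPR itself.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical sketch of per-purpose consent tracking.
# Names and fields are illustrative assumptions, not an official GDPR schema.

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                      # specific, named purpose, e.g. "third_party_app_sharing"
    granted_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None
    affirmative_action: bool = False  # consent must come from a clear affirmative act

    def grant(self) -> None:
        # Record an affirmative, purpose-specific grant of consent
        self.granted_at = datetime.utcnow()
        self.withdrawn_at = None
        self.affirmative_action = True

    def withdraw(self) -> None:
        # Withdrawal must be available and honored going forward
        self.withdrawn_at = datetime.utcnow()

    def is_valid(self) -> bool:
        # Processing for this purpose is allowed only while consent is
        # affirmative, granted, and not withdrawn
        return (
            self.affirmative_action
            and self.granted_at is not None
            and self.withdrawn_at is None
        )


# Usage: check consent before sharing data with a third-party app
consent = ConsentRecord(user_id="12345", purpose="third_party_app_sharing")
consent.grant()
assert consent.is_valid()
consent.withdraw()
assert not consent.is_valid()  # further processing for this purpose must stop
```

The point of the sketch is simply that, under GDPR, consent is tied to a specific purpose, must come from an affirmative act, and can no longer be relied upon once it is withdrawn.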

It will be interesting to see how Facebook continues to evolve under GDPR and under whatever data protections the US puts in place following the April 10 hearing. Compliance, and strict compliance at that, will be the key ingredient in rebuilding lost consumer trust in the social media platform, even if it comes a bit too late.
