By now you’ve probably seen some of the hundreds of headlines about Cambridge Analytica, the shady data analytics firm which managed to get its paws on information about some 50 million Facebook users collected via a personality testing app.
Some have suggested that the information could have helped influence Facebook users into voting for Donald Trump in the US presidential election.
In its initial exposé, The Guardian referred to the incident as a “major data breach”, and described it as “one of the largest-ever breaches of Facebook data.”
The claim of a “data breach” understandably stung Facebook badly, as the implication for the average person in the street would be that hackers somehow managed to infiltrate Facebook’s servers and make off with a haul of personal information.
Facebook’s Chief Security Officer Alex Stamos said it was unfair to describe what happened as a breach, in a now-deleted tweet:
“The recent Cambridge Analytica stories by the NY Times and The Guardian are important and powerful, but it is incorrect to call this a ‘breach’ under any reasonable definition of the term. We can condemn this behaviour while being accurate in our description of it.”
And the social network updated its official statement on its suspension of Cambridge Analytica to reinforce that it had not suffered a breach.
Now, you might reasonably respond “Well they would say that, wouldn’t they?”
But let’s try to think this through. In my opinion, the question of whether it’s a data breach or not depends on where you stand.
From Facebook’s point of view, it’s not a traditional data breach. That’s because this isn’t a case of malicious hackers breaking into a server, exploiting a vulnerability, or grabbing passwords.
This is how Facebook was designed to work, and many apps over the years have scooped up users’ information and (privacy settings permitting) those of their friends as well.
Hundreds of millions of times every day, Facebook hones the content it displays to you based on what it has determined you are interested in, who you are, and what it thinks will be most effective. What Cambridge Analytica did, with the same access to the data, isn’t so very different.
What was against Facebook’s guidelines was for Aleksandr Kogan (the developer of the “thisisyourdigitallife” app) to share the data he captured with Cambridge Analytica. However, I don’t see how Facebook could have technically prevented that, other than doing what it already does – requesting third parties play by the rules.
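To make the mechanism concrete, here is a minimal sketch (in Python, using the requests library) of the kind of friends-level Graph API call an app could make in the pre-2015 era once a single user had installed it and granted it permissions. The API version, token and field names shown are illustrative assumptions, not a faithful reproduction of Facebook’s historical API surface.

import requests

# Hypothetical sketch: how a quiz app holding friend-level permissions could
# pull data on the consenting user *and* on their friends. The API version,
# token and field list below are assumptions for illustration only.
GRAPH = "https://graph.facebook.com/v1.0"
ACCESS_TOKEN = "EAAB...token-granted-by-one-consenting-user"

def fetch_user_and_friends(token):
    # Data about the one user who actually installed the app and consented...
    me = requests.get(f"{GRAPH}/me",
                      params={"access_token": token,
                              "fields": "id,name,likes"}).json()
    # ...and, crucially, about every friend whose privacy settings allowed it,
    # none of whom ever saw the app or its consent screen.
    friends = requests.get(f"{GRAPH}/me/friends",
                           params={"access_token": token,
                                    "fields": "id,name,likes"}).json()
    return me, friends.get("data", [])

if __name__ == "__main__":
    user, friends = fetch_user_and_friends(ACCESS_TOKEN)
    print(f"One consenting user yielded data on {len(friends)} friends")

The point of the sketch is simply that a single consent event fans out: one install could legitimately (by Facebook’s rules of the day) return records for hundreds of friends who never agreed to anything.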
In short, Facebook might try to argue this is not a data security breach, but rather a misappropriation of data, or (if you like) a data policy breach.
However, millions of Facebook users around the world might take a dimmer view of things.
Many of them may not have realised that, because one of their Facebook friends allowed a personality test to scoop up their friends’ details, their own information may also have been exposed to and shared with third parties against their wishes.
None of this is news. Facebook has been working this way for years.
And the fact that this is how Facebook is supposed to work actually makes it worse than any data breach.
The only way to reduce your exposure is to refuse to play Facebook’s game and not be a member of the site. If you can’t bring yourself to leave, at the very least lock down your privacy settings and reduce the level of information that you share.
Finally, it’s worth saying that Facebook isn’t the only website that collects vast amounts of information about its users and exploits it in this fashion.
“If you are not paying for it, you’re not the customer; you’re the product being sold.”
For more discussion on this topic, listen to this episode of the “Smashing Security” podcast.
Smashing Security #070: 'Facebook and Cambridge Diabolica'
Update: Facebook now admits that as many as 87 million people have had their details improperly shared with Cambridge Analytica.
I've a feeling that under GDPR it is definitely a breach.
For a few reasons: the GDPR redefines a ‘breach’ to include not merely the exfiltration of data but also processing, or a failure to process, in a way likely to have an impact on the data subject.
It also requires controllers (Facebook, in this case) to push down data protection clauses on processors, and requires processors to implement the same on sub-processors.
At the same time, GSR, as both a controller and a processor, would have needed to fully disclose the intended current and future processing uses at the time of collection.
Whilst Facebook enabled this, GSR definitively breached the terms and conditions, not only under GDPR but also under the old DPA, where a change of purpose requires new consent.
Now, back to reading my GDPR law book (it's as much fun as it sounds).
Yes, I absolutely agree. Under GDPR, I am sure this would be a breach, because all data subjects have to be informed what data is held, why, and who it is shared with. They also have to have given explicit consent. I don't see any way that Facebook could argue that people gave explicit consent just by virtue of being friends with the main data subject.
Under the new law, Cambridge Analytica would be a Data Processor and Facebook would be the Data Controller and would have the ultimate legal culpability for any data breach. If the ICO investigates this and finds that Cambridge Analytica did commit a breach, Facebook can count themselves lucky that this did not happen after 25th May this year because after that, they could be fined up to 15% of global turnover.
Oops, sorry. I got the percentage wrong. It's 4%, not 15%. Still potentially a huge amount if you are someone like Facebook though, and it is on turnover now, not profits as in the DPA.
Absolutely, this is how Facebook works. The data they collect is for them to target adverts at you, and that's how it's meant to be used by third parties too.
Where the problem lies is in the way a single person taking the 'personality test' was able to release the data of all their friends. That should not happen. Each person should have the say-so on whether their data is released. This is the crux of the matter.
In the end, everyone DOES have a say on whether their data is released. Each of these people signed up for Facebook, answered all its questions, let it suck up their address book, collect their pictures and comments, their likes and dislikes and everything else, all in exchange for a few shiny Internet baubles. When anyone balked at it, these deep thinkers took the "what do you have to hide?" attitude. Well now, maybe, finally, they know.
Oh my goodness! Millions of folks’ privacy (and therefore their security) is now at risk because of the way Facebook operates. I’m shocked! </sarcasm> Not.
Exactly as the article points out, the possibility…no, wait — the near certainty that something like this could happen has been part and parcel of the way Facebook operates from the get-go. To anyone who has been paying attention, there’s no actual news here.
So why is this hogging the headlines now? Didn’t something like this happen back when Mr. Obama was seeking his second stint on the throne? Oh, wait…I forgot; the news media weren’t desperately, disingenuously, relentlessly, and tiresomely looking for some way to invalidate his election, as they are with Mr. Trump.
Until Facebook makes a radical and uncharacteristic change in their policies and suddenly stops exploiting FB users’ personal information in ways FB knows they never suspect (and would never agree to if they thought it through to the potentially disastrous consequences), this will happen again.
And just as it has over the entirety of Facebook’s history, FB users won’t care, just as they have never cared that Mr. Zuckerberg once characterized them as “dumb f_cks”. He’s the poster boy for successful, legal (as distinct from moral) social engineering. He owes his billion$ to the accurate perception that vast numbers of people can’t resist the lure of telling the world about themselves in the vain hope that anyone cares.
It’s a tragicomic irony. The world is dominated by politicians who couldn’t care less about the sanctity of personal property and privacy, and who will say anything to buy the votes of their victims.
The only people who really care about what Facebook users post are the companies who buy that data. Zuckerberg makes his billions, and the people who provide the product (their information) don’t make a penny…and they still don’t care.
Graham’s advice is sound; bail out before you become the next victim. Alas, those who most need to follow that advice probably aren’t inclined to read a security newsletter in the first place.