
Cambridge Analytica and Facebook's Big Fail

By Chandler Givens
Jun 8, 2018


Chandler Givens is the CEO and co-founder of TrackOFF, a Baltimore-based startup that builds tools to help consumers protect their online privacy. To learn more about TrackOFF, visit their website at TrackOFF.com.

Privacy issues abound on the Internet, and at TrackOFF we make it our job to unpack each one for you. We're usually well-equipped to help you with security snafus, but some of what happens online is a brand of madness few could predict. The Cambridge Analytica and Facebook scandal falls into that latter category, though many would argue there was nothing unpredictable about it at all.

While Mark Zuckerberg traverses the globe, desperately trying to repair Facebook's damaged reputation with lawmakers and the media after the company's scandalous tryst with Cambridge Analytica, we try to make sense of how to avoid falling victim to such shenanigans again. Just what happened, and how did it go so horribly wrong?

It all came to pass when some dubious practices from "back in the day" were retroactively exploited for power and profit. Back in 2010, Facebook permitted some questionable behavior from apps. Its Open Graph platform, for instance, allowed external developers to ask Facebook users for permission to access a large chunk of their personal data and, crucially, the personal information of their friends as well.

Three years later, Cambridge University researcher Aleksandr Kogan and his company Global Science Research created an app called "thisisyourdigitallife." The app asked users to answer questions for a psychological profile. Almost 300,000 people are thought to have been paid to take the psychological test, and the app then harvested their private info. It also harvested data from their Facebook friends, giving Kogan access to personal information from millions of Facebook profiles.

By 2014, Facebook had changed its rules to limit developers' access to user data, so that a third party could no longer access a Facebook user's friends' personal info without getting permission first. However, the rule changes were not retroactively enforced, and Kogan didn't delete the data he had improperly acquired.

In late 2015, The Guardian reported that Cambridge Analytica was working with Ted Cruz’s presidential campaign. The report suggested the Republican presidential candidate was using psychological data based on research from tens of millions of Facebook users in an attempt to gain an advantage over his rivals — including Donald Trump.

Facebook responded to the story by banning Kogan's app and legally pressuring him and Cambridge Analytica to delete all the data they had improperly acquired. But by then, Cambridge Analytica was mixed up not just with Ted Cruz's campaign, but with Donald Trump's as well.

An explosive exposé in March of this year initially reported that 50 million Facebook profiles had been harvested for Cambridge Analytica; that figure was later revised to as many as 87 million victims. The FTC launched an investigation, and Mark Zuckerberg went into overtime spinning the response through strategic statements and advertisements before ultimately being required to stand and testify before U.S. and European lawmakers.

As this enormous scandal keeps unfolding, consumers are becoming more and more aware of how fragile their privacy can be. How do we keep our tech companies, social media platforms, apps and online resources from taking advantage of our trust? Thankfully, what’s happened this year with Facebook is leading to some changes.

The European Union rolled out extensive new privacy rules in late May 2018, and these public information safeguards may very well set a new standard that is adopted around the globe, including in the U.S.

Under the new General Data Protection Regulation, in force since late May, companies that collect or mine personal data must first request consent from users. No longer can firms bury disclosures about pervasive tracking in hard-to-read legal disclaimers.

Corporate entities misusing your data without your knowledge will not go away in the foreseeable future. Lawmakers and regulatory agencies are making some efforts to keep it from happening again, but the best thing you can do is educate yourself about best practices.

When's the last time you looked through the privacy section of your Facebook settings? Corporations can't be counted on to have your best interests in mind. It turns out we all have to take charge of our own privacy, and that probably shouldn't have come as a surprise.