Over the past couple of days, you’ve probably seen a few references in the news to a company called Cambridge Analytica, and something about Facebook being hacked. Or manipulated. Or something like that. Maybe something about your personal data being harvested and weaponised by vast, shadowy political consulting firms? Whatever, dude. I have memes to listlessly click on.


The story is still developing, but it’s actually a pretty juicy one – and has a lot of relevance to literally anyone who regularly sees Facebook ads, which is basically everyone at this point. Here’s a simple rundown of what’s gone down so far.

What is Cambridge Analytica?

Cambridge Analytica is a company that offers a number of data services to companies and political parties. They claim they can offer deep and complex insights for entities who want to “change consumer behaviour” – like, for example, influencing someone to vote for a particular candidate.

How do they get all this data they’re analysing? Well, a lot of places. They have a methodology which involves classic data gathering techniques – like good old-fashioned polling – alongside more advanced online and social media operations.

Cambridge Analytica was set up by a bloke named Alexander Nix, who specifically said that he wanted to fill the gap in the Republican Party's data operations. You may recall that after the 2012 election – when Barack Obama sailed to victory on the back of the most advanced political data operation ever assembled – it was pretty clear that the Republicans were in very sorry shape. Cambridge Analytica wanted to fix that.

Here’s Nix:

The Democrats had ostensibly been leading the tech revolution, and data analytics and digital engagement were areas where Republicans had failed to catch up. We saw this as an opportunity.

Why are they controversial?

Well, even before this current rigmarole, people were quite upset with Cambridge Analytica. Why? Well, they worked on Donald Trump's campaign, and have been accused by some MPs of being secretly involved in the Leave side of the Brexit debate too.

“We are thrilled that our revolutionary approach to data-driven communication has played such an integral part in President-elect Trump’s extraordinary win,” Nix said in a press release just after it became clear that Trump had won the election.

The firm is associated with Trump surrogates and donors like Steve Bannon and Robert Mercer, so there’s the general sense that they’re the data brains behind some very big right-wing players.

Okay, but what’s all this about a leak?

Well, we knew that Cambridge Analytica had been instrumental in electing Trump, and that they’d used some big data wizardry to facilitate that. The new controversy, courtesy of reports in The Guardian and The Observer, is that the data used by the company to ultra-specifically target political advertisements was obtained illicitly.

Basically, the Observer claims that in 2014 over 50 million Facebook profiles were harvested by an academic named Aleksandr Kogan, and his company Global Science Research. The information was scraped from Facebook profiles without people’s knowledge when they or their friends did a personality test through a Facebook app.

Kogan and GSR had a deal with Cambridge Analytica to share that data. So when you join the dots, it sure seems like the Trump campaign’s finely-targeted political communications might have been based on data which was obtained without authorisation. Which is obviously bad.

What do Facebook say?

When the story broke, Facebook made some quick moves – despite the fact that by all accounts they knew something was up for a very long time. They banned Kogan and Cambridge Analytica, as well as a few other people, from their platform.

They issued a detailed press release and explanation about this, claiming that the data wasn’t ‘breached’ and that Kogan had obtained it all legitimately:

Like all app developers, Kogan requested and gained access to information from people after they chose to download his app. His app, “thisisyourdigitallife,” offered a personality prediction, and billed itself on Facebook as “a research app used by psychologists.” Approximately 270,000 people downloaded the app. In so doing, they gave their consent for Kogan to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it.

Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules.

So the data collection was all above-board, according to Facebook, but it became problematic when it was shared with Cambridge Analytica for commercial purposes.

Oh, and they also suspended Christopher Wylie, the former Cambridge Analytica employee who blew the whistle on all of this.

Alright, why should I give a shit?

Well, this could absolutely be a turning point when it comes to the regulation of social media, and it exposes just how careless entities like Facebook are when it comes to your private data.

You may or may not give a toss if firms are targeting political messages at you based on very personal data you never willingly parted with, but it’s definitely a huge concern. It feeds into the big debate we’ve all been having for the past few years about ‘fake news’ and social media bubbles – if a firm can harvest insanely specific and personal insights about you, and tailor their political messages to that exactly, then is that legitimate? Or is it propaganda?

Either way, it should absolutely remind you that if you’re one of those people who still uses Facebook apps in the year of our lord 2018, you should make sure you read the T&Cs.

Image: The Matrix