"The Great Hack," a new Netflix documentary, painfully relives the 2016 presidential election by following the personal journeys of key players in the Cambridge Analytica/Facebook data scandal. David Carroll, a media professor who spends the film doggedly trying to retrieve his personal data from Cambridge Analytica. British journalist, Carole Cadwalladr, who exposes how Cambridge Analytica harvested data from more than 50 million Facebook users to create targeted ads for the Trump 2016 campaign. And Brittany Kaiser, the former director of business development at Cambridge Analytica, turned whistleblower. While Kaiser and former Cambridge Analytica employee Chris Wylie, surfaced as whistleblowers when the events unfolded in real time after the election, “The Great Hack” introduces us to a new character: Julian Wheatland, Cambridge Analytica’s acting CEO. Unlike the other former Cambridge Analytica employees we know, Wheatland has a different perspective on the demise of the company. If you still have questions after watching “The Great Hack,” you might find answers here. Salon interviewed Wheatland at the San Francisco screening of the “The Great Hack” on July 22.
This interview has been edited and condensed for clarity.
What made you decide to become part of this documentary when the company that you worked for is portrayed as the villain?
Good question. I spent about nine months not answering co-director Karim Amer's emails.
I mean, if you can imagine last year: I was in the middle of trying to clean it up, dealing with employee issues, shareholder issues, regulatory issues, and 35,000 news stories per day on Cambridge Analytica.
The idea of being in a film was not of any interest at all, but toward the back end of last year a mutual friend intervened and said, "I think you should really talk to these guys." I talked to Karim, and I became convinced, number one, that he wasn't out to do a hatchet job, and, as much as anything, that all the things Cambridge Analytica had been accused of were the reason for being in it.
Because I felt, particularly on behalf of all the employees in the company who weren't Alexander Nix, that they had been tarnished by their association with this company, Cambridge Analytica, which had been accused and found guilty without trial of pretty much anything anybody dreamed up. I felt for them more than anything, that someone should stand up for them, or at least give a more balanced point of view.
You’re saying you believe that the company was betrayed by its leader?
So, of the things that killed Cambridge Analytica, the first big one was licensing the Facebook data, which they did do, and they did it before I was there. So, I wasn't involved. I've never met Chris Wylie, by the way; I was Chief Operating Officer and Chief Financial Officer from the beginning of 2015, and I have never met Chris Wylie, but that's beside the point.
He and Alexander organized the licensing of this data, and I've looked back at the contracts, and as far as I can tell, they did everything right.
The supplier gave all warranties and assurances that the data had been legally licensed and was compliant with Facebook's regulations. So, they licensed this data, and when it came out that they had it, there was a media story, which Carole Cadwalladr I think broke, and it was a big media story. In truth, the data hadn't been very useful, and so they had stopped using it even before Facebook asked for it to be deleted.
But the data wasn't deleted?
Actually, listen to the words. Mark Zuckerberg says, "We heard news stories which suggested it might not have been deleted."
It's a little bit like Brittany [Kaiser] says, "I had an email from a senior data scientist that said, 'we're still using Facebook-like data.'"
So, as far as I'm aware and concerned, it was deleted. Alexander Taylor, Alex Taylor, was the chief data scientist, the chief data officer, and he told me it was deleted, and I don't mean in a passing way; we had a lot of conversations, and he said, "It's deleted, it's gone."
There was some work that took some time, because you can end up with shadows of data around a big database, so it's a lot of work to go and check all the corners and make sure it's not there, but to this day I believe it was deleted. But I recognize Chris Wylie put a question mark on that, particularly when he showed someone at The Guardian the data and said, "Look, it's not all deleted."
It was a long time after that that I realized, through the parliamentary inquiries, that he himself had licensed even more data than he had arranged for Cambridge Analytica to license. So I assume, and I don't know this, that what he was showing someone was a copy of the data on his own computer rather than the company's, but that's an aside.
So, the first problem was the Facebook data; the second was the Brexit Campaign.
Now the problem with the Brexit Campaign is we told everybody we worked on it.
That started because Arron Banks gave us what we call a verbal contract: he verbally agreed to our proposal, at which point we sent the contract and the first invoice. He never responded, never signed the contract, never paid any money, and was never delivered any work.
By my definition we never worked for them.
Brittany has a different interpretation.
Then the third thing was the Trump Campaign, which we did do. We actually did so much PR work to build the position that it wasn't our fault, because we thought Trump was going to lose. We were busy making our excuses to all the major newspapers, telling them what great work we did and how it would have been worse if he hadn't had us. Then he won, and we ended up with the credit for him winning. More than we deserved.
I think we made a difference on the Trump Campaign. We made that difference without using any of that psychographic stuff; we used none of it on the Trump Campaign. We did on the [Ted] Cruz campaign, but not on the Trump Campaign, because we really only worked for the Trump Campaign for about three months, from when he was nominated, so there wasn't time to do the psychographic work.
Where we did make a difference was in the combination of polling and data analytics. We were doing a lot of polling and a lot of data analytics, and in simplistic terms, if you want accurate polls, then you want to poll people who are actually going to vote, and that's slightly different from people who say they are going to vote.
So, the data analytics tell you what type of person, what demographic, how they're voting, how to weight it, and then the polling tells you what they're thinking. And we could see in certain states that the elderly white vote was up and the black vote was down, for instance. Trump wasn't ahead, but the direction of travel was such that these were the states worth visiting.
And this is from Facebook data?
No, nothing to do with Facebook. Nothing to do with Facebook at all; this was from polling data in the states and data analytics of offline data.
And so they directed the Trump Campaign to the states where they could see it shifting; famously, Hillary didn't visit those states, and the rest is history. But having said all of that, the difference we made was tiny compared to the one Comey made. When Comey reopened the email investigation, it was the first time in the whole campaign that there had been a negative story about Hillary that hadn't been, pardon the pun, trumped by a much bigger negative story about Trump.
There was no negative story about him; they had all been used up. So she had the news agenda, and that's what really tilted the balance.
Anyway, we got the Trump thing, and that upset a lot of the media, and then finally there was Alexander's video.
Did you ever feel ethically compromised for being part of this company and doing this work to help Trump win, and for using data its users hadn't authorized to create these personality profiles, target voters, and create these campaigns?
Not very much, and I think that's where, looking back, I can see that we made mistakes. We were so busy focused on the clever stuff we were doing, and, you know, in the middle of a presidential election, it's exciting. We definitely did not stop to think enough about the ethics of what we were doing. As for the ethics of the Facebook data, to be honest, by the time I knew the Facebook data was there, Facebook was asking us to delete it; that was how I discovered it, and I believe we deleted it.
So, I didn't really spend much time worrying about it. Well, I did think about the ethics of it, but it was being deleted and we weren't using it going forward.
I think there are some ethical questions around the way we used offline data, not Facebook data, to profile people: the surveys we conducted, which did accurately predict respondents' personality profiles, and how we used those as a learning set to then predict the personality profiles of other people in the general population, in order to send them adverts and marketing communications that we thought would have more impact.
I think there are definitely ethical questions there that need answering, and we never paused to consider them.
So what's your relationship like today with Alexander Nix?