Source: Democracy Now!
New details are emerging about how the shadowy data firm Cambridge Analytica worked to manipulate voters across the globe, from the 2016 election in the United States to the Brexit campaign in Britain and elections in over 60 other countries, including Malaysia, Kenya and Brazil. A new trove of internal Cambridge Analytica documents and emails is being posted on Twitter detailing the company’s operations, including its work with President Trump’s former national security adviser John Bolton. The documents come from Cambridge Analytica whistleblower Brittany Kaiser, who worked at the firm for three-and-a-half years before leaving in 2018. We speak with Jehane Noujaim and Karim Amer, co-directors of the Oscar-shortlisted documentary “The Great Hack”; Brittany Kaiser, the Cambridge Analytica whistleblower featured in “The Great Hack” and author of “Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again”; and Emma Briant, a visiting research associate in human rights at Bard College whose upcoming book is titled “Propaganda Machine: Inside Cambridge Analytica and the Digital Influence Industry.”
AMY GOODMAN: New details are emerging about how the shadowy data firm Cambridge Analytica worked to manipulate voters across the globe, from the 2016 election in the United States to the Brexit campaign in Britain to elections in over 60 countries, including Ukraine, Malaysia, Kenya and Brazil.
Cambridge Analytica was founded by the right-wing billionaire Robert Mercer. Trump’s former adviser Steve Bannon of Breitbart News was one of the company’s key strategists and claims to have named the company. The company collapsed in May 2018 after The Observer newspaper revealed the company had harvested some 87 million Facebook profiles without the users’ knowledge or consent. Cambridge Analytica used the data to sway voters during the 2016 campaign.
A new trove of internal Cambridge Analytica documents and emails is being posted on Twitter detailing the company’s operations across the globe, including its work with President Trump’s former national security adviser John Bolton. The documents come from Cambridge Analytica whistleblower Brittany Kaiser, who worked at the firm for three-and-a-half years before leaving in 2018. Kaiser is featured prominently in the Netflix documentary The Great Hack, which has been shortlisted for an Oscar. This is the trailer for the film.
DAVID CARROLL: Who has seen an advertisement that has convinced you that your microphone is listening to your conversations? All of your interactions, your credit card swipes, web searches, locations, likes, they’re all collected, in real time, into a trillion-dollar-a-year industry.
CAROLE CADWALLADR: The real game changer was Cambridge Analytica. They worked for the Trump campaign and for the Brexit campaign. They started using information warfare.
DAVID CARROLL: Cambridge Analytica claimed to have 5,000 data points on every American voter.
CAROLE CADWALLADR: I started tracking down all these Cambridge Analytica ex-employees.
CHRISTOPHER WYLIE: Someone else that you should be calling to the committee is Brittany Kaiser.
NEWSCASTER: Brittany Kaiser, once a key player inside Cambridge Analytica, casting herself as a whistleblower.
BRITTANY KAISER: The reason why Google and Facebook are the most powerful companies in the world is because last year data surpassed oil in value. Data is the most valuable asset on Earth. We targeted those whose minds we thought we could change, until they saw the world the way we wanted them to. I do know that their targeting tool was considered a weapon.
PAUL HILDER: There is a possibility that the American public had been experimented on.
DAVID CARROLL: This is becoming a criminal matter.
CHRISTOPHER WYLIE: When people see the extent of the surveillance, I think they’re going to be shocked.
BRITTANY KAISER’S MOTHER: And I still fear for your life.
BRITTANY KAISER: Yeah.
BRITTANY KAISER’S MOTHER: With the powerful people that are involved.
BRITTANY KAISER: But I can’t keep quiet just because it will make powerful people mad.
BRITTANY KAISER’S MOTHER: I know.
RAVI NAIK: Data rights should be considered just fundamental rights.
CAROLE CADWALLADR: This is about the integrity of our democracy. These platforms which were created to connect us have now been weaponized. It’s impossible to know what is what, because nothing is what it seems.
AMY GOODMAN: That’s the trailer to the Netflix documentary The Great Hack. Well, we’re joined right now by four guests, by the film’s directors, Jehane Noujaim and Karim Amer. They’re the co-directors of The Great Hack, which was just nominated for a BAFTA today. That’s the British equivalent of the Oscars. And it has been shortlisted for the Oscars. Jehane’s past films with Karim Amer include The Square. She also did Control Room. Brittany Kaiser is the Cambridge Analytica whistleblower featured in the film. She’s the author of Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again. And we’re joined in Washington, D.C., by Emma Briant, visiting research associate in human rights at Bard College who specializes in researching propaganda. Her forthcoming book is called Propaganda Machine: Inside Cambridge Analytica and the Digital Influence Industry.
We welcome you all to Democracy Now! Brittany, you have just begun to release a trove of documents from Cambridge Analytica, involving scores of countries, including the United States, including John Bolton, including Iran. Talk about how — why you decided to begin this release and what is in these documents.
BRITTANY KAISER: Absolutely. I decided to release the Hindsight Files because it’s now 2020. I’ve been waiting and working with investigators and journalists around the world for the past two years. And what I’ve seen is that we don’t have enough change in order for voters to be protected, ahead of not just November, but in 27 days the first votes that are cast for the 2020 election.
I really think that digital literacy is the most important point that I’m trying to make here. If you understand the tactics and the strategies that are being used to manipulate you, then you can protect yourself from that. And I want to be able to empower voters ahead of casting their first vote this year.
AMY GOODMAN: So, talk about these documents, where they came from and what’s in them.
BRITTANY KAISER: These are all documents from my time at Cambridge Analytica. I worked at the company for over three years. So, it’s internal communications and negotiations for data-driven communications projects all around the world. It’s proposals, contracts and case studies of what has been done to intervene in democracy.
And I think it’s so important for people to understand that while sometimes these tactics are benign, sometimes they are incredibly malignant. And there’s evidence of voter suppression, fake news and disinformation, using racism, sexism.
And I just want to make sure that there is real action that is going to be taken, not just ahead of this next election, but for countries all around the world. We need privacy legislation so badly. We need to regulate Big Tech and have an ability to enforce our voting laws online, because right now we can’t. And unfortunately, companies like Facebook are not doing enough to protect us.
AMY GOODMAN: So, for people who are new to what Cambridge Analytica is, why don’t you describe what it is and why you have these documents, what Cambridge Analytica’s role was in all of these countries, including the United States?
BRITTANY KAISER: Absolutely. So, Cambridge Analytica was one of the companies under the SCL Group, Strategic Communication Laboratories. This is a company that has been around for over 25 years, and they started by using data-driven strategies in order to understand people’s psyche, how they make decisions and how they can be persuaded to take certain actions or to prevent people from taking certain actions.
AMY GOODMAN: It was a defense contractor.
BRITTANY KAISER: Originally they started in defense, yes. And once they found out how successful that was — that was actually in the Nelson Mandela election in ’93, ’94 in South Africa, they were preventing election violence for a defense contract — they realized that that was very useful in elections. And those strategies developed over two-and-a-half decades in order to no longer just do good things and good impact work, but, unfortunately, to undermine our democracies.
AMY GOODMAN: I want to turn to a clip from the documentary The Great Hack. In this clip, the British journalist Carole Cadwalladr talks about Cambridge Analytica’s parent company SCL. We also hear the voice of former Cambridge Analytica CEO Alexander Nix, who was previously a director of SCL.
CAROLE CADWALLADR: SCL started out as a military contractor, SCL Defence.
ALEXANDER NIX: We have a fairly substantial defense business. We actually train the British Army, the British Navy, the U.S. Army, U.S. Special Forces. We train NATO, the CIA, State Department, Pentagon. It’s using research to influence behavior of hostile audiences. How do you persuade 14-to-30-year-old Muslim boys not to join al-Qaeda? Essentially communication warfare.
CAROLE CADWALLADR: They had worked in Afghanistan. They had worked in Iraq. They had worked in various places in Eastern Europe. But the real game changer was they started using information warfare in elections.
ALEXANDER NIX: There’s a lot of overlap, because it’s all the same methodology.
CAROLE CADWALLADR: All of the campaigns which Cambridge Analytica/SCL did for the developing world, it was all about practicing some new technology or trick, how to persuade people, how to suppress turnout or how to increase turnout. And then, it’s like, “OK, now I’ve got the hang of it. Let’s use it in Britain and America.”
AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman. We’re spending the hour looking at Cambridge Analytica, the British company that was co-founded by the well-known President Trump special adviser Steve Bannon. He was vice president of Cambridge Analytica, which came out of a military contracting group and now is accused of having been involved, essentially, with PSYOPs, using tens of millions of people’s information that it harvested from Facebook.
Our guests are the award-winning filmmakers Karim Amer and Jehane Noujaim, who are directors of The Great Hack, which has just been nominated for a BAFTA, the Oscar equivalent in Britain, and has been shortlisted for an Oscar, and Brittany Kaiser, a whistleblower who worked at Cambridge Analytica for more than three years. In Washington, we’re joined by Emma Briant, who has been looking at Cambridge Analytica for years.
Let me ask you, Jehane, why you decided to spend years of your life looking at Cambridge Analytica and making this film, The Great Hack, and why you called it that.
JEHANE NOUJAIM: Well, this has, for me, been a film, I feel, 20 years in the making. Twenty years ago, I made a film with Chris Hegedus and D.A. Pennebaker called Startup.com, which was about the beginning of the dotcom world starting and where people wanted to be God online and start internet companies and make millions and millions of dollars. And fast-forward 20 years, and they are God online.
But I’ve always been obsessed with how we get our information. And soon after Startup.com, I made Control Room, which was about looking at the Iraq War, and depending on whether you were looking at Al Jazeera or Fox News or CNN, you had a completely different understanding of reality on the ground. And I had this question of how — if you don’t have some kind of shared understanding of truth, how can there be nuanced conversation and discussion, which is what’s necessary for a democracy to function?
Flash-forward to us making The Square, and at that time social media was a tool for change, positive change. I mean —
AMY GOODMAN: You’re talking Tahrir Square in Egypt.
JEHANE NOUJAIM: Tahrir Square. This is where we met, and we made the film, The Square. And at that time, even when I was arrested, Twitter was used to find me. So, it was a very positive tool for change.
Then we see the pendulums kind of swing in the other direction, and we see that social media can be used in a very different way. And it was used by the Army, and then we started to see it being used in the Brexit campaign and the Trump campaign. And we started to hear this word “hack” and the hacking of these elections. But what we realized was that the real hack that we needed to look at was the hack of the mind and what is happening inside our newsfeeds and what happens when people are creating their own truths because they’re being microtargeted.
And we started to look at this company Cambridge Analytica. And what we found there was fascinating, because we realized that there was this invisible story that’s happening inside our computer screens, inside our heads, which is leading to everybody having a completely different understanding of reality based on their newsfeeds.
And that’s when we met Brittany Kaiser, who at the time was just about to become a whistleblower and come out about what she knew. She was basically saying, “I’m going to Thailand, and if you want to talk to me, then come meet me here. I can’t tell you exactly where it’s going to be, where I’m going, but land at this airport.” We met her there, and that’s where the film began.
AMY GOODMAN: Well, I want to go to a clip of The Great Hack, your film. In this, Brittany Kaiser explains the concept of the persuadables.
BRITTANY KAISER: Remember those Facebook quizzes that we used to form personality models for all voters in the U.S.? The truth is, we didn’t target every American voter equally. The bulk of our resources went into targeting those whose minds we thought we could change. We called them the persuadables.
They’re everywhere in the country, but the persuadables that mattered were the ones in swing states like Michigan, Wisconsin, Pennsylvania and Florida. Now, each of these states were broken down by precinct. So you can say there are 22,000 persuadable voters in this precinct, and if we targeted enough persuadable people in the right precincts, then those states would turn red instead of blue.
AMY GOODMAN: So, that was a clip from The Great Hack. Brittany Kaiser, explain further this idea of persuadables.
BRITTANY KAISER: So, you might have heard them referred to as “swing voters.” In brand advertising, they’re called “switchers,” because it’s easy to persuade someone to try something new or to change their mind. So, identifying persuadables is what everybody does in data science for political modeling. Every political consultant in the books is trying to do this, identify the people whose minds can be changed, because quite a lot of people have not made up their mind yet. And when you’re trying to introduce a character as controversial as Donald Trump, the idea was, find the people who could be convinced, even though they had probably never voted for anyone like him before.
AMY GOODMAN: So, talk about your trajectory. I mean, Karim and Jehane, you do this very well in the film, but it is a very unlikely path to a firm that may well have been illegal in what it did, in working with Facebook, harvesting all this information, that ultimately helped to get Trump elected. But that’s not really where you came from. In the film, I’m looking at pictures of you and Michelle Obama. You were a key figure in President Obama’s social media team in his election campaign.
BRITTANY KAISER: I have always been a political and human rights activist. That’s where I came from, so it was really easy to snap back into that kind of work. I actually was in the third year of my Ph.D., writing about prevention of genocide, war crimes and crimes against humanity, when I first met the former CEO of Cambridge Analytica, Alexander Nix. My Ph.D. ended up being about how you could get real-time information, so how you could use big data systems, in order to build early-warning systems to give people who make decisions, like the decision that was just made about Iran — give them real-time information so that they can prevent war before it happens. Unfortunately, no one at my law school could teach me anything about predictive algorithms, so I joined this company part-time in order to start to learn how these early-warning systems could possibly be built.
AMY GOODMAN: Well, explain. Explain your meeting with Alexander Nix, who is the head — came from the defense contractor — right? — SCL, and then was the head of Cambridge Analytica, who said, “Let me get you drunk and steal your secrets.”
BRITTANY KAISER: Yes, he did. Not that becoming, but he has always been an incredibly good salesman. In one of my first meetings with him, he showed me a contract that the company had with NATO in order to identify young people in the United Kingdom who were vulnerable to being recruited into ISIS, and running counterpropaganda communications to keep them at home safe with their families instead of sneaking themselves into Syria. So, obviously, that type of work was incredibly attractive to me. And I thought, “Hey, data can really be used for good and for human rights impact. This is something I really want to learn how to do.”
AMY GOODMAN: But soon you were on your way to the United States with Alexander Nix, meeting with Corey Lewandowski, who at the time was the campaign manager for Donald Trump. When did those red flags go up for you?
BRITTANY KAISER: There were red flags here and there, especially when I would call our lawyers, who were actually Giuliani’s firm at the time, in order to ask for advice on what I could and could not do with certain data projects. And I always got told, “Hey, you’re creating too many invoices.”
But what really landed the plane for me was, a month after Donald Trump’s election, everybody at Cambridge Analytica who had worked both on the Trump campaign and on the Trump super PAC, which ran the “Defeat Crooked Hillary” campaign — they gave us a two-day-long debrief, which I write about in detail in my book Targeted, about what they did. They showed us how much data they collected, how they modeled it, how they identified people as individuals that could be convinced not to vote, and the types of disinformation that they sent these people in order to change their minds. It was the most horrific two days of my life.
AMY GOODMAN: So what did you do after that?
BRITTANY KAISER: I spent a while trying to figure out if there was still anything I could salvage from what I learned there. Was it still possible to use these tools for good? And when I realized that the company had gone way too far in the wrong direction, I started working with journalists in order to go through and figure out what I had in my documents that could possibly assist in saving democracy in the future.
AMY GOODMAN: You testified before the British Parliament. You were subpoenaed by Robert Mueller. You’ve been involved in a lot of information giving during these investigations. In an odd way, would you describe yourself as a persuadable?
BRITTANY KAISER: Definitely. And that’s actually a story that is very prevalent in my book. Most people don’t like to think that they are persuadable. We all like to think that we can’t be manipulated. But, trust me, we’re not as digitally literate as we like to think that we are. That’s why I released the Hindsight Files, because I want everyone to realize how easy it is for us to be manipulated and that we need to be aware in order to protect ourselves.
AMY GOODMAN: Karim Amer, for people who are still sitting here and going, “Cambridge Analytica, Facebook, what does this have to do with each other, Zuckerberg testifying before Congress?” explain then what was the magic sauce. What happened here between these two companies? What did Cambridge Analytica do? And I’ll also ask Brittany this question. And what did Facebook understand was done? And what did they do about it?
KARIM AMER: Well, I think, you know, the situation that we find ourselves in is one in which all of our behavior, which is essentially what data is, recordable human behavior, is constantly being tracked and gathered. And that’s part of the deal or the devil’s bargain we’ve made in this new economy, where we give our data up, and in return we get services. Now, most of us go ahead and we sign these terms and conditions that we don’t really read or don’t really understand what they’re about. What we’ve realized is that we’re giving up a certain level of autonomy that we may not have understood the implications of.
Now, what is that autonomy? What that is, it is insight into everything you do all the time. It’s something that’s tracking you, from the most public of spaces, when you’re posting, to the most intimate of spaces, when you’re watching porn or when you’re messaging somebody or when you’re staring at photos of a loved one or someone else. All of that behavior is constantly being tracked.
And it is used to create essentially a voodoo doll of you that can predict your behavior with quite a lot of accuracy. The proof of that is that that is the business model of Facebook and Google. It is about predicting your behavior and selling access to that prediction. Think of it as being in a casino that is constantly running, trying to make bets on what you’re going to do next. Now, that casino access is being sold in real time to all kinds of brands around the world.
What Cambridge identified was that they could take voter data, and they could take personality data, and they could map them together and create the most accurate profiles. That’s why they bragged about having 5,000 data points about every U.S. voter, which was one of their unique offerings. And with that insight, they realized that if you knew which districts you had to target and how to target the key people in those districts with the perfect messaging, you had the greatest chance of success.
And where that leaves us is, we used to live in a world where a political leader had to write one big, one great story to inspire a great people to go on to a great cause. Now we live in a world where a politician can customize a story to every single individual voter and do it in a way which is operating in darkness without transparency.
What do we mean by darkness? We mean that ’til this day we still do not know what ads were placed on Facebook in 2016, who was targeted, who paid for those ads, how it was conducted, were these ads paid for by a foreign country or not, what happened, what didn’t happen. And I think we deserve to know. Why? Because we’ve seen that this has become a place of weaponized information, that can be used to not only promote amazing ideas, but to convince people not to vote, which is active voter suppression.
And what’s troubling is that this is an information crime. Whether it’s legal or illegal doesn’t matter, in my opinion, because many things in our country’s history were legal once before, including slavery, and yet we’ve realized that it was not OK for them to be legal. So, at the current moment, Facebook is a crime scene. Facebook has the answers. Facebook knows what happened to our democracy. And yet it is still unwilling to participate in giving us the evidence we need.
AMY GOODMAN: I want to turn to another one of your clips, Jehane and Karim, one of the clips of The Great Hack. This is one of the main subjects of the documentary, professor David Carroll.
DAVID CARROLL: I was teaching digital media and developing apps, so I knew that the data from our online activity wasn’t just evaporating. And as I dug deeper, I realized these digital traces of ourselves are being mined into a trillion-dollar-a-year industry. We are now the commodity. But we were so in love with the gift of this free connectivity that no one bothered to read the terms and conditions.
AMY GOODMAN: David Carroll is featured in this documentary, The Great Hack. In 2018, we spoke to David Carroll at Democracy Now!, associate professor of media design at Parsons School of Design. He’s filed a full disclosure claim against Cambridge Analytica in the U.K. I asked him what he was demanding. And a clip of this also appears in The Great Hack.
DAVID CARROLL: A full disclosure. So, where did they get our data? How did they process it? Who did they share it with? And do we have a right to opt out? So, the basic rights that I think a lot of people would like to have, and the basic questions that a lot of people are asking.
AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman. Our guests for this hour are Jehane Noujaim and Karim Amer. They are the award-winning filmmakers who made The Great Hack. It has just been shortlisted for an Oscar and nominated for the British Oscar, the BAFTA. Brittany Kaiser is with us. It is her first major interview since she’s begun a major document leak, troves of documents about Cambridge Analytica’s involvement in various elections around the world. In a moment, we’re going to ask her about John Bolton and Iran. John Bolton, by the way, has just said he will testify at an impeachment trial in the Senate, which has turned the Senate on its head for the moment. And we’re joined by Emma Briant, visiting research associate in human rights at Bard College, who specializes in researching propaganda. Her forthcoming book is Propaganda Machine: Inside Cambridge Analytica and the Digital Influence Industry.
Emma, as you listen to this conversation, your work on Cambridge Analytica has been going on for a long time. Can you relate what they were doing, also Facebook, in manipulating populations, as we move into this current election — we don’t even have to say “2020 election” anymore because we’re in 2020 — and how this work is basically PSYOPs, psychological operations?
EMMA BRIANT: Thank you, Amy. Yes, I actually first came across the company SCL, the parent company of Cambridge Analytica, in 2007 as part of my master’s and then doctoral research. And I’ve been following their techniques and how they evolved over the years. And what I’ve realized when I started to discover the political work that they were doing, beyond the counterterrorism campaigns that I was studying, was just horrifying to me. The potential for what, you know, escalated during the sort of 2010s is just phenomenal.
So, we started to see them doing these big data projects for the military and so on. And this was, you know, quite a change of direction for them from their earlier work, which was a lot more about qualitative data, doing interviews with people, and so forth, for their research. And it started to change the kind of target audience analysis, this kind of analytics of what — if you like, the persuadables, the people that you want to target your communications at, how those groups were being profiled by the military, then was being taken out and deployed in elections.
And this is deeply disturbing to me because of the fact that I think that these companies have been established with a particular motive in mind, with a particular way of doing things and methodology. And to be repurposing that, when you have been doing work with DARPA, the defense research agency in the U.S., as well as the British equivalent, the DSTL, to develop these kinds of techniques, and then you’re going off and taking them to clients that are working in shady political campaigns around the world and working for human rights abusers, it’s really very disturbing — and then, of course, moving back into our own elections. So, I —
AMY GOODMAN: Well, talk about that.
EMMA BRIANT: Yeah. So, in 2017, I started to realize that they were working in — they had worked on Brexit and that they worked on the Trump campaign. And I started to do a lot more interviews with the company, including meeting Brittany for the first time. And also I met the filmmakers from The Great Hack. And I was weighed down by the responsibility for what I was discovering from my interviews. I was interviewing people like Nigel Oakes, who, as you can see in the — from the evidence that I submitted to the British Parliament, was telling me about the unethical activities that they were doing around the world for — for instance, for Kenyatta —
AMY GOODMAN: In Kenya.
EMMA BRIANT: — and the role that they played — in Kenya — and the role that they played also in the election of Donald Trump. And he was telling me about how they had basically deployed the same kinds of techniques of the Nazis in the U.S. election. Now, this horrified me. And I had to go further and further and haven’t stopped researching this.
And I think the most important thing is also to put this in the context of their military work, because, actually, these firms are working in multiple domains. You have commercial data use, you have military data use, and you have political data use in the same company. And we have no regulation over what is happening in the United States with companies like this. And there is little transparency over these companies in the United Kingdom, too, which is how we’ve ended up with this real catastrophe for democracy.
So, the issue is that we don’t know how data was being abused by this company. We know some examples of it. We certainly know that they had a slack regard for consent, from the — you know, what’s been revealed in The Great Hack and by Carole Cadwalladr. And the data was being repurposed from research done by academics for their political campaigns. Where else might they have been repurposing data from? And this is the thing that really scares me the most, is that this is rampant, I think, across the industry. I see many, many more companies out there that are working in these multiple domains with little accountability.
AMY GOODMAN: I’d like to go back to October, when New York Congressmember Alexandria Ocasio-Cortez questioned Facebook CEO Mark Zuckerberg about Cambridge Analytica.
REP. ALEXANDRIA OCASIO-CORTEZ: Did anyone on your leadership team know about Cambridge Analytica prior to the initial report by The Guardian on December 11, 2015?
MARK ZUCKERBERG: Congresswoman, I believe so, in that some folks were tracking it internally.
REP. ALEXANDRIA OCASIO-CORTEZ: And —
MARK ZUCKERBERG: I’m actually — as you’re asking this, I do think I was aware of Cambridge Analytica as an entity earlier. I just — I don’t know if I was tracking how they were using Facebook specifically.
REP. ALEXANDRIA OCASIO-CORTEZ: When was the issue discussed with your board member Peter Thiel?
MARK ZUCKERBERG: Congresswoman, I don’t — I don’t know that offhand.
REP. ALEXANDRIA OCASIO-CORTEZ: You don’t know. This was the largest data scandal with respect to your company, that had catastrophic impacts on the 2016 election. You don’t — you don’t know?
AMY GOODMAN: So, that was AOC, Congressmember Alexandria Ocasio-Cortez, questioning Jeffrey [sic] Zuckerberg. Brittany Kaiser, was he telling the truth?
BRITTANY KAISER: I have found that during multiple rounds of questioning, that he —
AMY GOODMAN: Mark Zuckerberg, that is.
BRITTANY KAISER: — that Mark Zuckerberg continues to deny the amount of strategies that he is aware of, the amount of data abuses that he is aware of. And just saying that “My team will get back to you,” without being honest with the public, is a massive disaster, not only for his own PR, but for our democracies and for moving forward in a productive manner.
AMY GOODMAN: And explain what Ocasio-Cortez was getting at when she was talking about Peter Thiel.
nước Anh HOÀNG ĐẾ: So, Peter Thiel, as far as I am aware, was the head of Trump’s technology advisory council. There were multiple meetings where Alexander Nix, the former CEO of Cambridge Analytica, was either being invited or attempting to be invited to those meetings, through the Mercers, through Kellyanne Conway, through Steve Bannon.
AMY NGƯỜI ĐÀN ÔNG TỐT: And you were with Rebekah Mercer, right, on Trump inauguration night?
nước Anh HOÀNG ĐẾ: Absolutely. The Mercers and a lot of other people who had played a very large role in the funding and in the campaigning for Donald Trump were reaching as far as they possibly could to technology tools in order to achieve their goals.
AMY GOODMAN: And are they doing it now?
BRITTANY KAISER: Absolutely. I think if anybody thinks that this is different than 2016, they are sorely mistaken. In 2016, everybody saw how successful Cambridge Analytica’s tactics are, so now there’s hundreds of Cambridge Analyticas around the world, especially acting in the U.S. elections right now.
AMY GOODMAN: So, why don’t we talk about John Bolton and Iran, the files that you’re releasing, that you had during your Cambridge Analytica days? Can you set up this video that we want to play?
BRITTANY KAISER: Absolutely. So, the files on Ambassador John Bolton show the work that Cambridge Analytica was paid to undertake for the John Bolton super PAC. That was work that started in 2013. It was actually one of the first big projects that Cambridge Analytica undertook in the United States. And that was to find five different psychographic groupings of voters and target them with psychographic messaging meant to resonate with your psyche and engage you, depending on whether you’re open, conscientious, extroverted, agreeable or neurotic.
And those videos were targeted over television and YouTube pre-roll in order to convince people, one, that national security was the most important political issue; two, that Ambassador John Bolton was the biggest authority on these topics; and, three, that whoever John Bolton was endorsing — for example, Thom Tillis — would be a better candidate than Kay Hagan, for example, in that specific race. So, these ads were paid in order to manipulate people into being more interested in his hawkish foreign policies than in their own best interest.
AMY GOODMAN: So, can you talk about the ad, which is not radio-friendly because it’s mainly music with script over it? But this is an ad for Thom Tillis, right?
BRITTANY KAISER: Yes. And it was an ad targeted at a group of people identified as being highly neurotic, and, therefore, it is black and white. It’s eerie. You get very emotional music, and it shows surrender flags on all of America’s most prominent landmarks.
AMY GOODMAN: Let me play it, and I’ll read what it says on the screen. So we’ll listen to that music and play the ad, and you can then comment further.
There’s white flags going up, white flag on the Brooklyn Bridge, over the White House — over Congress. And it says, “America’s never surrendered before. What happens if we start now? On November 4, vote Thom Tillis.”
An ad paid for, 2014, by the John Bolton super PAC. The ad was titled “White Flags.” Tillis went on to win his Senate race.
BRITTANY KAISER: Yes, he did, indeed. And they used these tactics in a very successful manner. In fact, the John Bolton super PAC paid for a third party to rate how successful Cambridge Analytica had been. And they saw, on this ad specifically, that there was over a 36% uplift in engagement on these ads versus the communications that they had already been running.
And so, what scares me so much is that I know that these tactics are being used right now. We are being manipulated in order to support going to war with Iran. We are being manipulated in order to believe that this type of violence is acceptable and that we should support candidates that support this violence.
AMY GOODMAN: There are some who have suggested that President Trump did this, possibly in part because he’s so deeply concerned about his impeachment trial, and this specifically could be targeted for John Bolton, the former national security adviser, who almost immediately tweeted, “We have been working on this for a long time,” talking about the assassination, and that Trump is afraid of what he might say, and so this would appease him.
BRITTANY KAISER: Another important tranche of documents that I released are in the Iran folder, which actually shows that Cambridge Analytica and other right-wing organizations, like America Rising, that does all of the opposition research for the GOP, were polling to see how many people in the United States were interested in the Iran deal, to drop sanctions, or if they were against the Iran nuclear deal and more interested in war. So you can see the types of questions that were asked, and that they were using that to model and identify people who could be persuaded to go against Iran, and whether or not that would be favorable for electoral fodder, I should say. And right now that same polling is happening. So, if you are identified as being a persuadable, you are going to see more of this propaganda in order to convince you that war is in our best interest, when it obviously is not.
AMY GOODMAN: Emma Briant, what makes this different from regular polling? Why do you see this as evil?
EMMA BRIANT: It’s not polling I think is evil; it’s what it’s being used for. I’d like to put this in a little bit of a wider context, if I may, and talk a little bit about what I know about SCL and —
AMY GOODMAN: You have 30 seconds, and then we will do Part 2.
EMMA BRIANT: — Iran. Oh, OK, sure. So, basically, they were working also in the Gulf from 2013. And it’s really important to note that the Saudis and the Emirates were also very keen to oppose the Iran deal, and Trump’s election was followed by a huge spike in arms sales to the Saudis. And I think that this being a military contractor is extremely important to remember in the light of these recent developments on Iran. Thank you.
AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I’m Amy Goodman.
DAVID CARROLL: All of your interactions, your credit card swipes, web searches, locations, likes, they’re all collected, in real time, into a trillion-dollar-a-year industry.
CAROLE CADWALLADR: The real game changer was Cambridge Analytica. They worked for the Trump campaign and for the Brexit campaign. They started using information warfare.
AMY GOODMAN: New details are emerging about how the shadowy data firm Cambridge Analytica worked to manipulate voters across the globe, from the 2016 election in the United States to the Brexit campaign in Britain.
We are continuing our look at the Oscar-shortlisted documentary The Great Hack, which chronicles the rise and fall of Cambridge Analytica. And we’re continuing with our four guests.
Jehane Noujaim and Karim Amer are the co-directors of The Great Hack, which was just nominated for a BAFTA — that’s the British equivalent of the Oscars — as well as made it to the Academy Award shortlist for documentaries. Jehane and Karim’s past films include The Square. Jehane was the director of Control Room.
Brittany Kaiser is also with us. She’s the Cambridge Analytica whistleblower who’s featured in the film. She has written the book Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again. She’s joining us in her first interview since releasing a trove of documents on Cambridge Analytica’s involvement in elections around the world and other issues.
And we’re joined by Emma Briant, a visiting research associate in human rights at Bard College who specializes in researching propaganda. Her forthcoming book is titled Propaganda Machine: Inside Cambridge Analytica and the Digital Influence Industry.
We ended Part 1 of our discussion talking about the difference between polling and PSYOPs, or psychographics. So, Emma Briant, let’s begin with you in this segment of our broadcast. For people who aren’t aware of Cambridge Analytica, explain why you essentially became obsessed with it. And talk about the new level Cambridge Analytica and Facebook have brought manipulation to.
EMMA BRIANT: Absolutely. The scale of what they were doing around the world, just, you know, the incredible potential of these technologies, hit me like a ton of bricks in 2016, when we realized that they were messing around in our own elections. Now, of course, this had been a human rights issue around the world for years before that. And, you know, I think democracies had been complacent about the ways in which things that are being ignored in other places would also ripple into our own societies. We’ve been engaged in counterterrorism wars around the world and developing these technologies for deployment against terrorists in Middle Eastern contexts, for stabilization projects, for counterterrorism at home, as well. And these technologies were developed in ways that were not enclosed from further development and commercialization and adaptation into politics.
And one of the problems that I realized as I was digging deeper and deeper into the more subterranean depths of the influence industry is how underregulated it is, how even our defense departments are not aware of what these companies are doing, beyond the purposes that they are hiring them for a lot of the time. There are conflicts of interest that are just not being declared. And this is deeply dangerous, because we can’t have bad actors weaponizing these technologies against us.
And we’re also seeing the extension of inequality and modern imperialism in action, of course, where these big firms are being created out of, you know, our national security infrastructure, are then able to go in between those campaigns and work for other states, which, whether they’re allies or enemies, may not necessarily have our democracies’ interests at heart.
So, the power and the importance of this is deeply, deeply important. These technologies that they were developing are aggressive technologies. This isn’t just advertising as normal. This is a surveillant infrastructure of technology that is, you know, weaving its way through our everyday lives. Our entire social world is now data-driven.
And what we’re now seeing is these — we’ve seen, in parallel with that, the rise of these influence companies out of the “war on terror,” both, for instance, in Israel, as well as in the West, these companies being developed to try to combat terrorism, but also then able to — you know — reuse these technologies for commercial purposes. And a multibillion-dollar industry grew up with the availability of technology and data. And we have been sleepwalking through it, and suddenly woke up in the last few years.
The sad thing is that we haven’t had enough attention to what we do about this. We can’t just scare people. We can’t just advertise the tools of Cambridge Analytica so that the executives can form new companies and go off and do the same thing again.
What we need is action. And we need to be calling on everybody to be getting in touch with their legislators, as Brittany Kaiser has been arguing, as well. And we need to be shouting loudly to make sure that military contractors are being properly governed. Oversight, reporting of conflicts of interest must go beyond the individuals who are engaged in an operation. It has to go to networks of companies. They need to be declaring who else they’re working for and with.
And, you know, we also need to actually regulate the whole influence industry, so these companies, like PR companies, like — lobbying is already somewhat regulated. But we still have an awful lot of ability to use shell companies to cover up what’s actually happening. If we can’t see what’s happening, we’re doomed. We have to make sure that these firms are able to be transparent, that we know what’s going on and that we can respond and show that we’re protecting our democratic elections.
AMY GOODMAN: Let’s go to a clip from The Great Hack about how the story of Cambridge Analytica and what it did with Facebook first came to light. Carole Cadwalladr is a reporter who broke the Cambridge Analytica story.
CAROLE CADWALLADR: I started tracking down all these Cambridge Analytica ex-employees. And eventually, I got one guy who was prepared to talk to me: Chris Wylie. We had this first telephone call, which was insane. It was about eight hours long. And pooff!
CHRISTOPHER WYLIE: My name is Christopher Wylie. I’m a data scientist, and I helped set up Cambridge Analytica. It’s incorrect to call Cambridge Analytica a purely sort of data science company or an algorithm company. You know, it is a full-service propaganda machine.
AMY GOODMAN: “A full-service propaganda machine,” Chris Wylie says. Before that, Carole Cadwalladr, which brings us to Brittany Kaiser, because when Chris Wylie testified in the British Parliament, and they were saying, “How do we understand exactly what Cambridge Analytica has done?” he said, “Talk to Brittany Kaiser,” who ultimately did also testify before the British Parliament. But talk about what Chris Wylie’s role was, Cambridge Analytica, how you got involved. I want to step back, and you take us step by step, especially for young people to understand how they can get enmeshed in something like this, and then how you end up at Trump’s inauguration party with the Mercers.
BRITTANY KAISER: I think it’s very important to note this, because there are people all around the world that are working for tech companies, that I’m sure joined that company in order to do something good. They want the world to be more connected. They want to use technology in order to communicate with everybody, to get people more engaged in important issues. And they don’t realize that while you’re moving fast and breaking things, some things get so broken that you cannot actually contemplate or predict what those repercussions are going to look like.
Chris Wylie and I both really idealistically joined Cambridge Analytica because we were excited about the potential of using data for exciting and good impact projects. Chris joined in 2013 on the data side in order to start developing different types of psychographic models. So he worked with Dr. Aleksandr Kogan and the Cambridge Psychometrics Centre at Cambridge University in order to start doing experiments with Facebook data, to be able to gather that data, which we now know was taken under the wrong auspices of academic research, and was then used in order to identify people’s psychographic groupings.
AMY GOODMAN: Now, explain that, psychographic groupings, and especially for people who are not on Facebook, who don’t understand its enormous power and the intimate knowledge it has of people. Think of someone you’re talking to who’s never experienced Facebook. Explain what is there.
BRITTANY KAISER: Absolutely. So, the amount of data that is collected about you on Facebook and on any of your devices is much more than you’re really made aware of. You probably haven’t ever read the terms and conditions of any of these apps on your phone. But if you actually took the time to do it and you could understand it, because most of them are written for you not to understand — it’s written in legalese — you would realize that you are giving away a lot more than you would have ever agreed to if there was transparency. This is your every move, everywhere you’re going, who you’re talking to, who your contacts are, what information you’re actually giving in other apps on your phone, your location data, all of your lifestyle, where you’re going, what you’re doing, what you’re reading, how long you spend looking at different images and websites.
This amount of behavioral data gives such a good picture of you that your behavior can be predicted, as Karim was talking about earlier, to a very high degree of accuracy. And this allows companies like Cambridge Analytica to understand how you see the world and what will motivate you to go and take an action — or, unfortunately, what will demotivate you. So, that amount of data, available on Facebook ever since you joined, allows a very easy platform for you to be targeted and manipulated.
And when I say “psychographic targeting,” I’m sure you probably are a little bit more familiar with the Myers-Briggs test, the Myers-Briggs that asks you a set of questions in order to understand your personality and how you see the world. The system that Cambridge Analytica used is actually a lot more scientific. It’s called the OCEAN five-factor model. And OCEAN stands for: O for openness; C for conscientiousness, whether you prefer plans and order or you’re a little bit more fly by the seat of your pants; E for extraversion, whether you gather your energy from being out surrounded by people, or you’re introverted and you prefer to gather your energy from being alone; A for agreeableness — if you are agreeable, you care about your family, your community, society, your country more than you care about yourself, and if you are disagreeable, then you are a little bit more egotistical, and you need messages that are about benefits to you; and then, the worst, N for neuroticism. You know, it’s not bad to be neurotic. It means that you are a little bit more emotional. It means, unfortunately, as well, that you are motivated by fear-based messaging, so people can use tactics in order to scare you into doing what they want you to do.
And this is what was targeted when they were gathering that data out of Facebook to figure out which group you belonged into. They found about 32 different groups of people, different personality types. And there were groups of psychologists that were looking into how they could understand that data and convert that into messaging that was just for you.
I need to remind everybody that the Trump campaign put together over a million different advertisements that were put out, a million different advertisements with tens of thousands of different campaigns. Some of these messages were for just you, were for 50 people, a hundred people. Obviously, certain groups are thousands, tens of thousands or millions. But some of them were targeted very much directly at the individual, to know exactly what you’re going to click on and exactly what you care about.
AMY GOODMAN: So they were doing this before Cambridge Analytica. But describe — I want to actually go to a Bannon clip, Steve Bannon, who takes credit for naming Cambridge Analytica, right? Because you had SCL before — SCL Defence.
BRITTANY KAISER: Yes.
AMY GOODMAN: And then it becomes Cambridge Analytica, for Cambridge University, right? Where Kogan got this information that he culled from Facebook.
BRITTANY KAISER: Yes.
AMY GOODMAN: This is former White House chief strategist Steve Bannon in an interview at a Financial Times conference in March 2018. Bannon said that reports that Cambridge Analytica improperly accessed data to build profiles on American voters and influence the 2016 presidential election were politically motivated. Months later, evidence emerged linking Bannon to Cambridge Analytica — the scandal which resulted in a $5 billion fine for Facebook. Bannon was a founder and former board member of the political consulting firm; he was vice president of Cambridge Analytica.
STEPHEN BANNON: All Cambridge Analytica is, is the data scientists and the applied applications here in the United States. It has nothing to do with the international stuff. The Guardian actually tells you that, and the Observer tells you that, when you get down to the 10th paragraph, OK? When you get down to the 10th paragraph. And what Nix does overseas is what Nix does overseas. Right? It was a data — it was a data company.
And by the way, Cruz’s campaign and the Trump campaign say, “Hey, they were a pretty good data company.” But this whole thing on psychographics was optionality in the deal. If it ever worked, it worked. But it hasn’t worked, and it doesn’t look like it’s going to work. So, it was never even applied.
AMY GOODMAN: So, that’s Steve Bannon in 2018, key to President Trump’s victory and to his years so far in office, before he was forced to — before he was forced out. What was your relationship with Steve Bannon? You worked at Cambridge Analytica for over three years. You had the keys to the castle, is that right, in Washington?
BRITTANY KAISER: Yes, for a while I actually split the keys to what is Steve’s house with Alexander Nix, because we used his house as our office. His house was also used as a Breitbart office in the basement. It’s called the “Breitbart Embassy” on Capitol Hill. And that’s where I would go for meetings.
AMY GOODMAN: Who funded that?
BRITTANY KAISER: I believe it was owned by the Mercer family, that building. And we would come into the basement and use that boardroom for our meetings. And we would use that for planning who we were going to go pitch to, what campaigns we were going to work for, what advocacy groups, what conservative 501(c)(3)s and (c)(4)s he wanted us to go see.
And I didn’t spend a lot of time with Steve, but the time I did was incredibly insightful. Almost every time I saw him, he’d be showing me some new Hillary Clinton hit video that he had come out with, or announcing that he was about to throw a book launch party for Ann Coulter for ¡Adios, America!, which was something that he invited both me and Alexander to, and we promptly decided to leave the house before she arrived.
But Steve was very influential in the development of Cambridge Analytica and who we were going to go see, who we were going to support with our technology. And he made a lot of the introductions, which in the beginning seemed a little less nefarious than they did later on, when he got very confident and started introducing us to white right-wing political parties across Europe and in other countries and tried to get meetings with the main political parties, or leftist or green parties instead, to make sure that those far-right-wing parties that do not have the world’s best interests at heart could not get access to these technologies.
AMY GOODMAN: You said in The Great Hack, in the film, that you have evidence of illegality of the Trump and Brexit campaigns, that they were conducted illegally. I was wondering if you can go into that. I mean, it was controversial even, and Carole Cadwalladr, the great reporter at the Observer and The Guardian, was blasted and was personally targeted, very well demonstrated in The Great Hack, for saying that Cambridge Analytica was involved in Brexit. They kept saying they had nothing to do with it, until she shows a video of you, who worked for Cambridge Analytica, at one of the founding events of Leave.EU, or Brexit.
BRITTANY KAISER: Yeah, Leave.EU, that panel that I was on, which has now become quite an infamous video, was their launch event to launch the campaign. And Cambridge Analytica was in deep negotiations, through introduction of Steve Bannon, with both of the Brexit campaigns. I was told, actually, originally we pitched remain, and the remain side said that they did not need to spend money on expensive political consultants, because they were going to win anyway. And that’s actually what I also truly believed, and so did they.
So, Steve made the introductions to make sure that we would still get a commercial contract out of this political campaign, both to Vote Leave and Leave.EU. Cambridge Analytica took Leave.EU, and AIQ — which was essentially Cambridge Analytica’s digital partner before Cambridge Analytica could run our own digital campaigns — was running the Vote Leave side, both funded by the Mercers, both with the same access to this giant database on American voters.
AMY GOODMAN: The Mercers funded Brexit?
BRITTANY KAISER: There was Cambridge Analytica work, as well as AIQ work, in both of the leave campaigns. So, a lot of that money, in order to collect that data and in order to build the infrastructure of both of those companies, came from Mercer-funded campaigns, yes.
AMY GOODMAN: And again, explain what AIQ is.
BRITTANY KAISER: AIQ was a company that actually ran all of Cambridge Analytica’s digital campaigns, until January 2016, when Molly Schweickert, our head of digital, was hired in order to build ad tech internally within the company. AIQ was based in Canada and was a partner that had access to Cambridge Analytica data the entire time that they were running the Vote Leave campaign, which was the designated and main campaign in Brexit.
AMY GOODMAN: So, when did you see the connection between Brexit and the Trump campaign?
BRITTANY KAISER: Actually, a lot of it started to come when I saw some of Carole’s reporting, because there were a lot of conspiracy theories over what was going on, and I didn’t know what to believe. All I knew was that we definitely did work in the Brexit campaign, “we” as in when I was at Cambridge Analytica, because I was one of the people working on the campaign. And we obviously played a large role in not just the Trump campaign itself, but Trump super PACs and a lot of other conservative advocacy groups, 501(c)(3)s, (4)s, that were the infrastructure that allowed for the building of the movement that pushed Donald Trump into the White House.
AMY GOODMAN: I mean, it looks like Cambridge Analytica was heading toward becoming a billion-dollar corporation.
BRITTANY KAISER: That’s what Alexander used to tell us all the time. That was the carrot that he waved in front of our eyes in order to have us keep going. “We’re building a billion-dollar company. Aren’t you excited?” And I think that that’s what so many people get caught up in, people that are currently working at Facebook, people that are working at Google, people that are working at companies where they are motivated to build exciting technology, that obviously can also be very dangerous, but they think they’re going to financially benefit and be able to take care of themselves and their families because of it.
AMY GOODMAN: So what was illegal?
BRITTANY KAISER: The massive problems that came from the data collection, specifically, are where my original accusations come from, because data was collected under the auspices of being for academic research and was used for political and commercial purposes. There are also different data sets that are not supposed to be matched and used without explicit transparency and consent in the United Kingdom, because they actually have good national data protection laws and international data protection laws through the European Union to protect voters. Unfortunately, in the United States, we only have seen the state of California coming out and doing it.
Now, on the other side, we have voter suppression laws that prevent our vote from being suppressed. We have laws against discrimination in advertising, racism, sexism, incitement of violence. All of those things are illegal, yet somehow a platform like Facebook has decided that if politicians want to use any of those tactics, that they will not be held to the same community standards as you or me, or the basic laws and social contracts that we have in this country.
AMY GOODMAN: Karim Amer and Jehane Noujaim, I was wondering if you can talk about — Brittany sparked this when she talked about voter suppression — Trinidad and Tobago, which you go into in your film, because ultimately the elections there were about voter suppression and trying to get whole populations not to vote.
KARIM AMER: Yeah, I think it was important for us to show in the film the expansiveness of Cambridge’s work. This went beyond the borders of the United States and even beyond the borders of the EU and the U.K. Because what we find is that, in pursuing this global influence industry that they were very much a part of, Cambridge used different countries as Petri dishes to learn and get the know-how about different tactics. And from improving those tactics, they could then sell them for a higher cost — higher margin in Western democracies, where the election budgets are, you know — we have to remember, I think it’s important to predicate that the election business has become a multibillion-dollar global business, right? So, we have to remember that while we are upset with companies like Cambridge, we allowed for the commoditization of our democratic process, right? So, people are exploiting this now because it’s become a business. And we, as purveyors of this, can’t really be as upset as we want to be, when we’ve justified that. So I want to preface it with that.
Now, that being said, what’s happened as a result is a company like Cambridge can practice tactics in a place like Trinidad, that’s very unregulated in terms of what they can and can’t do, learn from that know-how and then, you know, use it — parlay it into activities in the United States. What they did in Trinidad, and why it was important for us to show it in the film, is they led something called the “Do So” campaign, where they admit to making it cool and popular among youth to get out and not vote. And they knew —
AMY GOODMAN: So, you had the Indian population and the black population.
KARIM AMER: And the black population. And there is a lot of historic tension between those two, and a lot of generational differences, as well, between those two. And the “Do So” campaign targeted — was done in a way to, you know, by looking at the data and looking at the predictive analysis of which group would vote or not vote, get enough people to dissuade them from voting, so that they could flip the election.
AMY GOODMAN: Targeted at?
KARIM AMER: Targeted at the youth. And so, this is really — when you watch —
AMY GOODMAN: “Do So” actually meant “don’t vote.”
KARIM AMER: “Do So,” don’t vote.
JEHANE NOUJAIM: Don’t vote.
KARIM AMER: Yes, exactly. And when —
AMY GOODMAN: With their fists crossed.
KARIM AMER: With their fists.
AMY GOODMAN: And that it became cool not to vote.
KARIM AMER: Exactly. And you look at the level of calculation behind this, and it’s quite frightening. Now, as Emma was saying, a lot of these tactics were born out of our own fears in the United States and the U.K. post-9/11, when we allowed for this massive weaponization of influence campaigns to begin. You know, if you remember President Bush talking about, you know, the battle for the hearts and minds of the Iraqi people, all of these kinds of industries were born out of this.
And now I believe what we’re seeing is the hens have come home to roost, right? All of these tactics that we developed in the name of, quote-unquote, “fighting the war on terror,” in the name of doing these things, have now been commercialized and used to come back to the biggest election market in the world, the United States. And how do we blame people for doing that, when we’ve allowed for our democracy to be for sale?
And that’s what Brittany’s files today, that she’s releasing and has released over the last couple days, really give us insight to. The Hindsight Files that Brittany has released show us how there is an auction happening for influence campaigns in every democracy around the world. There is no vote that is unprotected in the current way that we — in the current space that we’re living.
And the thing that’s allowing this to happen is these information platforms like Facebook. And that is what’s so upsetting, because we can actually do something about that. We are the only country in the world that can hold Facebook accountable, yet we still have not done so. And we still keep going to their leadership hoping they do the right thing, but they have not. And why is that? Because no industry has ever shown in American history that it can regulate itself. There is a reason why antitrust laws exist in this country. There’s a tradition of holding companies accountable, and we need to re-embrace that tradition, especially as we enter into 2020, where the stakes could not be higher.
AMY GOODMAN: Jehane, you want to talk about the issue of truth?
JEHANE NOUJAIM: Yes. I think that what’s under attack is the open society and truth within it. And that underlies every single problem that exists in the world, because if we don’t have an understanding of basic facts and can have nuanced debate, our democracies are destroyed.
AMY GOODMAN: So, let’s go to this issue of truth. In October, as Facebook said it will not fact-check political ads or hold politicians to its usual content standards, the social media giant’s CEO Mark Zuckerberg was grilled for more than five hours by lawmakers on Capitol Hill on Facebook’s policy of allowing politicians to lie in political advertisements, as well as its role in facilitating election interference and housing discrimination. Again, this is New York Congressmember Alexandria Ocasio-Cortez grilling Mark Zuckerberg at that hearing.
REP. ALEXANDRIA OCASIO-CORTEZ: Could I run ads targeting Republicans in primaries, saying that they voted for the Green New Deal?
MARK ZUCKERBERG: Sorry, I — can you repeat that?
REP. ALEXANDRIA OCASIO-CORTEZ: Would I be able to run advertisements on Facebook targeting Republicans in primaries, saying that they voted for the Green New Deal? I mean, if you’re not fact-checking political advertisements, I’m just trying to understand the bounds here, what’s fair game.
MARK ZUCKERBERG: Congresswoman, I don’t know the answer to that off the top of my head. I think probably?
REP. ALEXANDRIA OCASIO-CORTEZ: So you don’t know if I’ll be able to do that.
MARK ZUCKERBERG: I think probably.
REP. ALEXANDRIA OCASIO-CORTEZ: Do you see a potential problem here with a complete lack of fact-checking on political advertisements?
MARK ZUCKERBERG: Well, Congresswoman, I think lying is bad. And I think if you were to run an ad that had a lie, that would be bad. That’s different from it being — from — in our position, the right thing to do to prevent your constituents or people in an election from seeing that you had lied.
REP. ALEXANDRIA OCASIO-CORTEZ: So, we can — so, you won’t take down lies, or you will take down lies? I think this is just a pretty simple yes or no.
AMY GOODMAN: That’s Congressmember Alexandria Ocasio-Cortez questioning Mark Zuckerberg at a House hearing. Brittany Kaiser, you also testified in Congress. You testified in the British Parliament. And as you watch Mark Zuckerberg, you also wrote a piece that asks, “How much of Facebook’s revenue comes from the monetization of users’ personal data?” Talk about what AOC was just asking Zuckerberg, that he wouldn’t answer.
BRITTANY KAISER: Absolutely. So, the idea that everything politicians say is newsworthy is something that Mark Zuckerberg is trying to defend as a defense of free speech. But you know what? My right to free speech ends where your human rights begin. I cannot use my right to free speech in order to discriminate against you, use racism, sexism, suppress your vote, incite violence upon you. There are limits to that. And there are limits to that that are very well and obviously enshrined in our laws.
Unfortunately, Mark Zuckerberg doesn’t seem to understand that. He is going on TV and talking about how the most important thing that he’s investing in is stopping foreign intervention in our elections. Fantastic. I definitely applaud that. But guess what. Russia only spent a couple hundred thousand dollars on intervening in the 2016 U.S. elections, whereas the Trump campaign spent billions amongst the PACs and the campaign and different conservative groups. So, you know what? The biggest threat to democracy is not Russia; the biggest threat is domestic.
AMY GOODMAN: So, keep on that front and how it’s operating and how we’re seeing it even continue today.
BRITTANY KAISER: Yes. So, I think it’s been very obvious that some of the campaign material that has come out is disinformation. It is fake news. And there are well-principled networks on television that refuse to air these ads. Yet somehow Facebook is saying yes.
And that is not only a disaster, but it’s actually completely shocking that we can’t agree to a social standard of what we are going to accept. Facebook has signed the contract for the internet. Facebook has said that they believe in protecting us in elections. But time and time again, we have seen that that’s not actually true in their actions.
They need to be investing in human capacity and in technology, AI, in order to identify disinformation, fake news, hatred, incitement of violence. These are things that can be prevented, but they’re not making the decision to protect us. They’re making the decision to line their pockets with as much of our value as possible.
AMY GOODMAN: Can you talk about the “Crooked Hillary” campaign and how it developed?
BRITTANY KAISER: Absolutely. So, this started as a super PAC that was built for Ted Cruz, Keep the Promise I, which was run by Kellyanne Conway and funded by the Mercers. That was then converted to becoming a super PAC for Donald Trump. They tried to register the name Defeat Crooked Hillary with the Federal Election Commission, and the FEC, luckily, did not allow them to do that. So it was called Make America Number 1.
This super PAC was headed by David Bossie, someone that you might remember from Citizens United, who basically brought dark money into our politics and allowed endless amounts of money to be funneled into these types of vehicles so that we don’t know where all of the money is coming from for these types of manipulative communications. And he was in charge of this campaign.
Now, on that two-day-long debrief that I talked about — and if you want to know more, you can read about it in my book — they told us —
AMY GOODMAN: Wait, and explain where you were and who was in the room.
BRITTANY KAISER: So, I was in New York in the boardroom of Cambridge Analytica’s office on Fifth Avenue. And all of our offices from around the world had called in to videocast. And everybody from the super PAC and the Trump campaign took us through all of their tactics and strategies and implementation and what they had done.
Now, when we got to this Defeat Crooked Hillary super PAC, they explained to us what they had done, which was to run experiments on psychographic groups to figure out what was working and what wasn’t. Unfortunately, what they found out was the only very successful tactic was sending fear-based, scaremongering messaging to people that were identified as being neurotic. And it was so successful in their first experiments that they spent the rest of the money from the super PAC over the rest of the campaign only on negative messaging and fearmongering.
AMY GOODMAN: And crooked, the O-O in “crooked” was handcuffs.
BRITTANY KAISER: Yes. That was designed by Cambridge Analytica’s team.
AMY GOODMAN: Karim?
KARIM AMER: And one thing that I think it’s important to remember here, because there’s been a lot of debate among some people about: Did this actually work? To what degree did it work? How do we know whether it worked or not? What Brittany is describing is a debrief meeting where Cambridge, as a company, is saying, “This is what we learned from our political experience. This is what actually worked.” OK? And they’re sharing it because they’re saying, “Now this is how we want to codify this and commoditize this to go into commercial business.” Right?
So this is the company admitting to their own know-how. There is no debate about whether it works or not. This is not them advertising it to the world. This is them saying, “This is what we’ve learned. Based off that, this is how we’re going to run our business. This is how we’re going to invest in the expansion of this to sell this outside of politics.” The game was, take the political experience, parlay it into the commercial sector. That was the strategy. So, there is no debate whether it worked or not. It was highly effective.
And the thing that’s terrifying is that while Cambridge has been disbanded, the same actors are out there. And there’s nothing has been — nothing has changed to allow us to start putting in place legislation to say there is something called information crimes. In this era of information warfare, in this era of information economies, what is an information crime? What does it look like? Who determines it? And yet, without that, we are still living in this unfiltered, unregulated space, where places like Facebook are continuing to choose profit over the protection of the republic. And I think that’s what’s so outrageous.
JEHANE NOUJAIM: And I think it’s pretty telling that only two people —
AMY GOODMAN: Jehane.
JEHANE NOUJAIM: Only two people have come forward from Cambridge Analytica. Why is that? Both of the people that have come forward, Brittany and Chris, and also with Carole’s writing, have been targeted personally. And it’s been a very, very difficult story to tell. Even with us, when we released the film in January, every single time we have entered into the country, we have been stopped for four to six hours of questioning at the border. That —
AMY GOODMAN: Stopped by?
JEHANE NOUJAIM: Stopped by — on the border of the U.S., in JFK Airport, where you’re taken into the back, asked for all of your social media handles, questioned for four to six hours, every single time we enter the country. So —
AMY GOODMAN: Since when?
JEHANE NOUJAIM: Since we released the film, so since Sundance, since January, every time we’ve come back into the U.S.
AMY GOODMAN: And on what grounds are they saying they’re stopping you?
JEHANE NOUJAIM: No explanation. No —
AMY GOODMAN: And what is your theory?
JEHANE NOUJAIM: My theory is that it’s got something to do with this film. Maybe we’re doing something right. We were at first — we’ve been stopped in Egypt, but we’ve never been stopped in the U.S. in this way. We’re American citizens. Right?
AMY GOODMAN: You talk about people coming forward and not coming forward. I wanted to turn to former Cambridge Analytica COO, the chief operating officer, Julian Wheatland, speaking on the podcast Recode Decode.
JULIAN WHEATLAND: The company made some significant mistakes when it came to its use of data. They were ethical mistakes. And I think that part of the reason that that happened was that we spent a lot of time concentrating on not making regulatory mistakes. And so, for the most part, we didn’t, as far as I can tell, make any regulatory mistakes, but we got almost distracted by ticking those boxes of fulfilling the regulatory requirements. And it felt like, well, once that was done, then we’d done what we needed to do. And we forgot to pause and think about ethically what was — what was going on.
AMY GOODMAN: So, if you could decode that, Brittany? Cambridge Analytica COO Julian Wheatland, who, interestingly, in The Great Hack, while he really condemned Chris Wylie, did not appreciate Chris Wylie stepping forward and putting Cambridge Analytica in the crosshairs in the British Parliament, was more equivocal about you. Talk about Wheatland and his role and what he’s saying about actually abiding by the regulations, which they clearly didn’t.
BRITTANY KAISER: Once upon a time, I used to have a lot of respect for Julian Wheatland. I even thought we were friends. I thought we were building a billion-dollar company together that was going to allow me to do great things in the world. But, unfortunately, that’s a story that I told myself and a story he wanted me to believe that isn’t true at all.
While he likes to say that they spent a lot of time abiding by regulations, I would beg to differ. Cambridge Analytica did not even have a data protection officer until 2018, right before they shut down. I begged for one for many years. I begged for more time with our lawyers and was told I was creating too many invoices. And for a long time, because I had multiple law degrees, I was asked to write contracts. And so were other —
AMY GOODMAN: Didn’t you write the Trump campaign contract?
BRITTANY KAISER: The original one, yes, I did. And there were many other people that were trained in human rights law in the company that were asked to draft contracts, even though contract law was not anybody’s specialty within the company. But they were trying to cut corners and save money, just like a lot of technology companies decide to do. They do not invest in making the ethical or legal decisions that will protect the people that are affected by these technologies.
AMY GOODMAN: I wanted to bring Emma Briant back into this conversation. When you hear Julian Wheatland saying, “We abided by all regulations,” talk about what they violated. Ultimately, Cambridge Analytica was forced to disband. Was Alexander Nix criminally indicted?
EMMA BRIANT: I think that the investigations into what Alexander Nix has done will continue and that the revelations that will come from the Hindsight Files will result, I think, in more criminal investigations into this network of people. I think that it’s deeply worrying, the impact that they’ve had on — well, in terms of security, but also in terms of, you know, the law breaking that has happened in respect to data privacy. What was done with Facebook, in particular, is deeply worrying. But you look at what has been revealed in these files, and you start to realize that, actually, there’s a hell of a lot more going on behind the scenes that we don’t know.
And what Julian Wheatland is covering for is his central role in that company, which was in charge of all of the finances. Now, they were covering up wrongdoing repeatedly throughout their entire history. And that goes back way before Chris Wylie or Brittany Kaiser had joined the companies. I’ve seen documents which are evidencing things like cash payments and the use of shell companies to cover things up. It’s deeply, deeply disturbing. And I think the fact that Julian Wheatland is trying to shift blame onto the academics or just onto Facebook — because, you know, obviously Facebook has a massive culpability and has received multiple fines for its own role. However, it’s not just about Facebook, and it’s not just about those academics who were doing things wrong. It’s actually about Wheatland himself and about these people in charge of the company who were making the massive decisions that have affected all of us.
AMY GOODMAN: What happened with Professor Kogan?
EMMA BRIANT: Well, I think this is a really important aspect of it. The academics —
AMY GOODMAN: He was the Cambridge University professor that Chris Wylie went to —
EMMA BRIANT: Yes, he was —
AMY GOODMAN: — to say, “Help us scrape the data of, what, 80 million Facebook users,” under the guise of academic research that Brittany was just describing.
EMMA BRIANT: Exactly. And there’s still a little bit of a lack of clarity over the role of Kogan with respect to the Ph.D. student at the time, Michal Kosinski, who was also working on this kind of analysis of the Facebook data and its application to the personality quiz, the OCEAN personality quiz, which Brittany Kaiser talked about.
But, basically, with Joseph Chancellor, Kogan set up a new company, which would commercialize the data that have been obtained for their research. But, of course, this was known to Facebook. We are starting to realize that they knew way earlier what was going on. And it was being sold to Cambridge Analytica for the purposes of the U.S. elections. And, of course, Cambridge Analytica were also using it for their military technologies, the development of their information warfare techniques, which, by the way, they were also hoping to get many more defense contracts out of the winning of the U.S. election, and then, of course, apply it into commercial practices. And, you know, the ways in which they —
AMY GOODMAN: I wanted to stop you there. I just wanted to go to this point —
EMMA BRIANT: Oh, yes.
AMY GOODMAN: — because you mentioned it in Part 1 of our discussion —
EMMA BRIANT: Of course.
AMY GOODMAN: — this issue —
EMMA BRIANT: Sure.
AMY GOODMAN: — of military contractors —
EMMA BRIANT: Yeah.
AMY GOODMAN: — and the nexus of military and government power, the fact that with Trump’s election —
EMMA BRIANT: Yeah.
AMY GOODMAN: — military contractors were one of the greatest financial beneficiaries of Trump’s election.
KARIM AMER: But I think it’s important to remember that —
EMMA BRIANT: Sure.
AMY GOODMAN: Karim Amer.
KARIM AMER: — the issue is that also these — when we think of military contractors, we think of people selling tanks and guns and bullets and these types of things. The problem that we don’t realize is that we’re in an era of information warfare. So the new military contractors aren’t selling — aren’t selling the traditional tanks. They’re selling the —
AMY GOODMAN: Although they’re doing that.
KARIM AMER: They’re doing that, as well, but they’re selling the equivalent of that in the information space. And that’s a new kind of weapon. That’s a new kind of battle that we’re not familiar with.
And the reason why it’s more challenging for us is because there’s a deficit of language and a deficit of visuals. We don’t know where the battlefield is. We don’t know where the borders are. We can’t pinpoint, be like, “This is where the trenches are.” Yet we’re starting to uncover that. And that was so much of the challenge in making this film, is trying to see where can we actually show you where these wreckage sites are, where the casualties of this new information warfare are, and who the actors are and where the fronts are.
And I think, in entering 2020, we have to keep a keen eye on where the new war fronts are and when they’re happening in our domestic frontiers and how they’re happening in these devices that we use every day. So this is where we have to have a new kind of reframing of what we’re looking at, because while we are at war, it is a very different kind of borderless war where asymmetric information activity can affect us in ways that we never imagined.
AMY GOODMAN: And, Emma Briant, you talked about when Facebook knew the level of data that Cambridge Analytica was taking from them.
EMMA BRIANT: Yeah.
AMY GOODMAN: I mean, Cambridge Analytica paid them, right?
EMMA BRIANT: Yes. I mean, they were providing the data to GSR, who then, you know, were paid by —
AMY GOODMAN: Explain what GSR is.
EMMA BRIANT: Sorry, the company by Kogan and Joseph Chancellor, their company that they were setting up to do both academic research but also to exploit the data for Cambridge Analytica’s purposes. So they were working with — on mapping that data onto the personality tests and giving that access to Cambridge Analytica, so that they could scale it up to profile people across the target states in America especially, but also all across America. They obtained way more than they ever expected, as Chris Wylie and Brittany have shown.
But I want to also ask: When did our governments know about what Cambridge Analytica and SCL were doing around the world and when they were starting to work in our elections? One of the issues is that these technologies have been partly developed by, you know, grants from our governments and that these were, you know, defense contractors, as we say. We have a responsibility for those companies and for ensuring that there’s reporting back on what they’re doing, and some kind of transparency.
As Karim was saying, that if you — you know, we’re in a state of global information warfare now. If you have a bomb that has been discovered that came from an American source and it’s in Yemen, then we can look at that bomb, and often there’s a label which declares that it’s an American bomb that has been bought, that has, you know, been used against civilians. But what about data? How do we know if our militaries develop technologies and the data that it has gathered on people, for instance, across the Middle East, the kind of data that Snowden revealed — how do we know when that is turning up in Yemen or when that is being utilized by an authoritarian regime against the human rights of its people or against us? How do we know that it’s not being manipulated by Russia, by Iran, by anybody who’s an enemy, by Saudi Arabia, for example, who SCL were also working with? We have no way of knowing, unless we open up this industry and hold these people properly accountable for what they’re doing.
AMY GOODMAN: Let me ask you about Cambridge University, Emma Briant.
EMMA BRIANT: Sure.
AMY GOODMAN: We’re talking about Cambridge professors, but how did Cambridge University — or, I should say, did it profit here?
EMMA BRIANT: I think the issue is that a lot of academics believe that — you know, I mean, their careers are founded on the ability to do their research. They are incentivized to try and make real-world impacts. And in the United Kingdom, we have something called the Research Evaluation Framework. That’s building in more and more the requirement to show that you have had some kind of impact in the real world. All academics are being — feeling very much like they want to be engaged in industry. They want to be engaged in — not just in lofty towers and considered an elite, but working in partnerships with people having an impact in real campaigns and so on. And I think there’s a lot of incentivization also to make a little money on the side with that.
And the issue is that it’s not just actually Kogan who was involved working with Cambridge Analytica. There were many academics. And, yes, universities make — you know, are able to profit off these kinds of relationships, because that is the global impact of their internationally renowned academics. Unfortunately, I think that there isn’t built into that enough accountability and consideration of ethics.
And there is too great an ability to say, “Oh, but that’s not really part of my academic work,” when it’s something that’s a negative outcome. So, for instance, I recently revealed in Oxford University a professor who had been working with SCL, too, and there are other academics who were working with SCL all around the world who benefit in their careers from doing this. But then, when it’s revealed that there’s some wrongdoing, you know, it’s taken off the CV.
Now, I think universities have a great responsibility to be not profiting from this, to be transparent about it. They’re presenting themselves as ethical institutions that are, you know, teaching students. And what if those staff are actually, you know, engaged in nefarious activities? The problem with Cambridge University is they’ve also covered up this. And, you know, Oxford University, when I tried to challenge them, as well, have also tried to cover this up. And we don’t know how many other universities are also doing this.
And the universities also want to expand the number of students they’ve got, and they will work with firms like Cambridge Analytica. Sheffield University were named in the documents that Brittany Kaiser has released. I’ve been chasing them with Freedom of Information requests to find out whether or not they actually were a client of Cambridge Analytica. And it turns out that they had deleted the presentation and the emails that they had an exchange with the company.
Now, the lack of openness about decision-making about student recruitment is disturbing, because it looks like Sheffield didn’t partner with Cambridge Analytica, but other universities did. And that’s enabling that company to then have access to students’ data, which, potentially, if they are in mind to abuse it, could be used for political targeting. So, if you imagine American universities are coming forward and working with a lot of, you know, data-driven companies to try to improve their outreach to gather more students to look for — like, what they were doing in the past is looking for look-alike audiences and so on — in order to, you know, increase their student numbers —
AMY GOODMAN: Emma Briant —
EMMA BRIANT: And all of this data is being given across to these companies that have no transparency at all.
AMY GOODMAN: Emma Briant, before I go back to Brittany on this issue of the new Hindsight Files, I wanted to ask you what you found most interesting about them, though she is just releasing them now, since the beginning of the year, so there is an enormous amount to go through.
EMMA BRIANT: Yes.
AMY GOODMAN: It involves scores of countries all over the world.
EMMA BRIANT: To me, I think the biggest reveal is going to be in the American campaigns around this, but I think you haven’t seen the half of it yet. This is the tip of the iceberg, as I’ve been saying.
The thing that I think is most interesting of what’s been revealed so far is actually the Iran campaign, because, you know, this is a very complex issue, and it really is an exemplar of the kinds of conflicts of interest that I’m talking about, at a company that is, you know, set out to profit from the arms trade and from the expansion of war in that region and from the favoring of one side in a regional conflict, essentially, backed by American power, by the escalation of the conflict with Iran and, you know, by getting more contracts, of course, with the Gulf states, the United Arab Emirates and the Saudis. You know, and, of course, they were trying to put Trump in power, as well, to do that, and advancing John Bolton and the other hawks who have been trying to demand that sanctions — to keep sanctions and to get out of the Iran deal, which they have been arguing is a flawed deal.
And, of course, SCL were involved in doing work in that region since 2013, and they were working before that on Iran for President Obama’s administration, which I’m going to be talking more about in the future. The issue is that there is a conflict of interest here. So you gain experience for one government, and then you’re going and working for others that maybe are not entirely aligned in their interests. Thank you.
AMY GOODMAN: Well, I mean, let’s be clear that all of this happened — although it was for the election of Donald Trump, of course, it happened during the Obama years.
EMMA BRIANT: Yes.
AMY GOODMAN: That’s when Cambridge Analytica really gained its strength in working with Facebook.
EMMA BRIANT: Yeah. And SCL’s major shareholder, Vincent Tchenguiz, of course, was involved in the early establishment of the company Black Cube and in some of its early funding, I believe. I don’t know how long they stayed in any kind of relationship with that firm. However, the firm Black Cube were also targeting Obama administration officials with a massive smear campaign, as has been revealed in the media. And, you know, this opposition to the Iran deal and the promotion of these kinds of, you know, really fearmongering advertising that Brittany is talking about is very disturbing, when this same company is also driving, you know, advertising for gun sales and things like that.
AMY GOODMAN: Wait. Explain what Black Cube is, which goes right to today’s headlines —
EMMA BRIANT: Exactly.
AMY GOODMAN: — because Harvey Weinstein, accused of raping I don’t know how many women at last count, also employed Black Cube, former Israeli intelligence folks, to go after —
EMMA BRIANT: Yes.
AMY GOODMAN: — the women who were accusing him, and even to try to deceive the reporters, like at The New York Times, to try to get them to write false stories.
EMMA BRIANT: Absolutely. I mean, this is an intelligence firm that was born, again, out of the “war on terror.” So, Israel’s war on terror, this time, produced an awful lot of people who had gone through conscription and developed really, you know, strong expertise in cyberoperations or on developing information warfare technologies, in general, intelligence gathering techniques. And Black Cube was formed by people who came out of the Israeli intelligence industries. And they all formed these companies, and this has become a huge industry, which is not really being properly regulated, as well, and properly governed, and seems to be rather out of control. And they have been also linked to Cambridge Analytica in the evidence to Parliament. So, I think the involvement of all of these companies is really disturbing, as well, in relation to the Iran deal.
We don’t know that Cambridge Analytica in any way were working with Black Cube in this, at this point in time. However, the fact is that all of this infrastructure has been created, which is not being properly tackled. And how they’re able to operate without anybody really understanding what’s going on is a major, major problem.
We have to enact policies that open up what Facebook and other platforms are doing with our data, and make data work for our public good, make it impossible for these companies to be able to operate in such an opaque manner, through shell companies and so forth and dark money being funneled through, you know, systems, political communication systems, and understand that it’s not just about us and our data and as individuals. We have a public responsibility to others. If my data is being used in order to create a model for voter suppression of someone else, I have a responsibility to that. And it’s not just about me and whether or not I consent to that. If I’m consenting to somebody else’s voter suppression, my data to be used for that purpose, that’s also unacceptable. So what we really need is an infrastructure that protects us and ensures that that can’t be possible.
AMY GOODMAN: Brittany, I wanted to go back to this data dump that you’re doing in this release of documents that you’ve been engaged in since the beginning of the year. Brittany Kaiser, you worked for Cambridge Analytica for more than three years, so these documents are extremely meaningful. Emma Briant just mentioned Black Cube. Who else did Black Cube work with?
BRITTANY KAISER: I don’t know about Black Cube, but I do know that there were Israeli intelligence firms that were cooperating with Cambridge Analytica and with some of Cambridge Analytica’s clients around the world, and not just former Israeli intelligence officers, but intelligence officers from many other countries. That’s concerning, but also Black Cube isn’t the only company you should be concerned about.
The founder of Blackwater, or their CEO, Erik Prince, was also an investor in Cambridge Analytica. So he profits from arms sales around the world, and military contracts, and has been accused of causing the unnecessary death of civilians in very many different wartime situations. He was one of the investors in Cambridge Analytica and their new company, Emerdata. And so, I should be very concerned, and everyone should be very concerned, about the weaponization of our data by people that are actually experts in selling weapons. So, that’s one thing that I think needs to be in the public discussion, the difference between what is military, what is civilian, and how those things can be used for different purposes or not.
AMY GOODMAN: Were you there with Erik Prince on inauguration night? He was there on election night with Donald Trump.
BRITTANY KAISER: I suppose he was there. That has been reported. But I’ve never met the man. I have never spoken to him, nor do I ever wish to, to be honest.
So, now that — I really want to address back your question about the files that I’m releasing. I started this on New Year’s because I have no higher purpose than to get the truth out there. People need to know what happened. And for the pieces of the puzzle that I do not have, I do hope that investigative journalists and citizen journalists and individuals around the world, as well as investigators that are still working with these documents, can actually help bring people to justice.
I’m not the only person who’s releasing files. I’d like to draw attention to the Hofeller Files, which have also been released recently by the daughter of Thomas Hofeller, who was a GOP strategist. And she just released a massive tranche of documents the other day that show the voter suppression tactics used by Republicans against minorities around the United States. So, that is extra evidence, on top of what I’ve already released, on voter suppression.
I know that this concerns quite a lot of people because of the amount of cyberattacks that have been happening on our files that are currently hosted on Twitter. People are trying desperately to take those down. They’re not going to, because I have some of the best cybersecurity experts in the world keeping them up there. But I want to say to anyone that’s been trying to take them down: You cannot take them down, and you won’t be able to stop the release of the rest of the documents, because the information is decentralized around the world in hundreds of locations, and these files are coming out whether you like it or not.
AMY GOODMAN: In talking about Thomas Hofeller, these documents reveal that this now-dead senior Republican strategist, who specialized in gerrymandering, was secretly behind the Trump administration’s efforts to add a citizenship question in the 2020 census. He had been called the “Michelangelo of gerrymandering.” When he died in 2018, he left behind this computer hard drive full of his notes and records, and his estranged daughter found among the documents a 2015 study that said adding the citizenship question to the census, quote, “would be advantageous to Republicans and non-Hispanic whites” and “would clearly be a disadvantage to the Democrats.” That’s what she is releasing in the Hofeller Files.
BRITTANY KAISER: Absolutely. And some of the evidence that I have in the Hindsight Files corroborates those types of tactics. Voter suppression, unfortunately, is a lot cheaper than getting people to register and turn out to the polls. And that is so incredibly sad. Roger McNamee actually talks about this quite often now, about the gamification of Facebook and how much hatred and negativity is cheaper and easier and more viral, and what that actually means for informing the strategy of campaigns. And we need to be very aware of that and know that what we’re seeing is made to manipulate and influence us, not for our own interests.
AMY GOODMAN: Talk more about these files and what you understand is in them. For example, we’ve been talking about Facebook, and something, of course, brought out in The Great Hack, in the Oscar-shortlisted film, Facebook owns WhatsApp. WhatsApp was key in elections like in Brazil for the far-right President Jair Bolsonaro.
BRITTANY KAISER: Absolutely. And so, we have to realize that now that Facebook owns Instagram and WhatsApp, that the amount of data that they own about individuals around the world, behavioral data and real-time data, is absolutely unmatched. They are the world’s largest communications platform. They are the world’s largest advertising platform. And if they do not have the best interests of their users at heart, our democracies will never be able to succeed.
So, regulation of Facebook, perhaps even the breaking up of it, is the only thing that we can do at this moment, when Mark Zuckerberg and Sheryl Sandberg have decided to not take the ethical decision themselves. At this time, we cannot allow companies to be responsible for their own ethics and moral compass, because they’ve proven they cannot do it. We need to force them.
And anybody that has listened to this, please call your legislator. Tell them you care about your privacy. Tell them you care about your data rights. Tell them you care about your electoral laws being enforced online. You know what? Your representatives work for you. If you have any employees, do you not tell them what you want them to do every single day? Call your legislator. It takes a few minutes. Or even write them an email if you don’t want to pick up the phone. I beg of you. This is so incredibly important that we get through national legislation on these issues — national regulation. Just having California and New York lead the way is not enough.
JEHANE NOUJAIM: These are files that show over 68 countries, election manipulation in over 68 countries around the world. And it’s eerie when you — some of these are audio files — sitting, listening to audio, where inside the Trump campaign they are talking about how fear and hate engage people longer and is the best way to engage voters. And so, when you listen to that and listen to the research that has gone into that, all of a sudden it becomes clear why there is such division, why you see such hate, why you see such anger on our platforms.
KARIM AMER: And I think what’s important to see —
AMY GOODMAN: Karim.
KARIM AMER: — is that, you know, in the clip you showed, Chris Wylie is talking about how Cambridge is a full-service propaganda machine. What does that really mean? You know, I would say that what’s happening is we’re getting insight into the network of the influence industry, the buying and selling of information and of people’s behavioral change. And it is a completely unregulated space.
And what’s very worrisome is that, as we’re seeing more and more, with what Emma’s talking about and what Brittany has shown, is this conveyor belt of military-grade information and research and expertise coming out of our defense work, that’s being paid for by our tax money, then going into the private sector and selling it to the highest bidder, with different special interests from around the world. So what you see in the files is, you know, an oil company buying an influence campaign in a country that it’s not from and having no — you know, no responsibility or anything to what it’s doing there. And what happens in that, the results of that research, where it gets handed over to, no one knows. Any contracts that then result in the change that happens on the political ground, no one tracks and sees.
So, this is what we’re very concerned about, is because you’re seeing that everything has become for sale. And if everything is for sale —
JEHANE NOUJAIM: Our elections are for sale.
KARIM AMER: Exactly. And so, how do we have any kind of integrity to the vote, when we’re living in such a condition?
JEHANE NOUJAIM: Democracy has been broken. And our first vote is happening in 28 days, and nothing has changed. No election laws have changed. Facebook’s a crime scene. No research, nothing has come out. We don’t understand it yet. This was why we felt so passionate about making this film, because it’s invisible. How do you make the invisible visible? And this is why Brittany is releasing these files, because unless we understand the tactics, which are currently being used again, right now, as we speak, same people involved, then we can’t change this.
AMY GOODMAN: Facebook’s a crime scene, Jehane. Elaborate on that.
JEHANE NOUJAIM: Absolutely. Facebook is where this has happened. Initially, we thought this was Cambridge Analytica — right? — and that Cambridge Analytica was the only bad player. But Facebook has allowed this to happen. And they have not revealed. They have the data. They understand what has happened, but they have not revealed that.
AMY GOODMAN: And they have profited off of it.
JEHANE NOUJAIM: And they have profited off of it.
KARIM AMER: Well, it’s not just that they’ve profited off it. I think what’s even more worrisome is that a lot of our technology companies, I would say, are incentivized now by the polarization of the American people. The more polarized, the more you spend time on the platform checking the endless feed, the more you’re hooked, the more you’re glued, the more their KPI at the end of the year, which says number of hours spent per user on platform, goes up. And as long as that’s the model, then everything is designed, from the way you interact with these devices to the way your news is sorted and fed to you, to keep you on, as hooked as possible, in this completely unregulated, unfiltered way — under the guise of freedom of speech when it’s selectively there for them to protect their interests further. And I think that’s very worrisome.
And we have to ask these technology companies: Would there be a Silicon Valley if the ideals of the open society were not in place? Would Silicon Valley be this refuge for the world’s engineers of the future to come reimagine what the future could look like, had there not been the foundations of an open society? There would not be. Yet the same people who are profiting off of these ideals protecting them feel no responsibility in their preservation. And that is what is so upsetting. That is what is so criminal. And that is why we cannot look to them for leadership on how to get out of this.
We have to look at the regulation. You know, if Facebook was fined $50 billion instead of five, I guarantee you we wouldn’t be having this conversation right now. It would have led to not only an incredible change within the company, but it would have been the signal to the entire industry. And there would have been innovation that would have been sponsored to come out of this problem. Like, we can use technology to fix this, as well. We just have to create the right incentivization plan. I believe that the engineers of the future that are around can help us get out of this. But currently they are not the — they are not the decision makers, because these companies are not democratic whatsoever.
AMY GOODMAN: Emma Briant?
EMMA BRIANT: Could I? Yes, please. I just wanted to make a point about how important this is for ordinary Americans to understand the significance for their own lives, as well, because I think some people hear this, and they think, “Oh, tech, this is maybe quite abstract,” or, you know, they may feel that other issues are more important when it comes to election time. But I want to make the point that, actually, you know what? This subject is about all of those other issues.
This is about inequality and it being enabled. If you care about, you know, having a proper debate about all of the issues that are relevant to America right now, so, you know, do you care about the — you know, the horrifying state of American prison system, what’s being done to migrants right now, if you care about a minimum wage, if you care about the healthcare system, you care about the poverty, the homelessness on the streets, you care about American prosperity, you care about the environment and making sure that your country doesn’t turn into the environmental disaster that Australia is experiencing right now, then you have to care about this topic, because we can’t have an adequate debate, we cannot, like, know that we have a fair election system, until we understand that we are actually having a discussion, from American to American, from, you know, country to country, that isn’t being dominated by rich oil industries or defense industries and brutal leaders and so on.
So, I think that the issue is that Americans need to understand that this is an underlying issue that is stopping them being able to have the kinds of policies that would create for them a better society. It’s stopping their own ability to make change happen in the ways that they want it to happen.
AMY GOODMAN: Emma Briant —
EMMA BRIANT: It’s not an abstract issue. Thank you.
AMY GOODMAN: What would be the most effective form of regulation? I mean, we saw old Standard Oil broken up, these monopolies broken up.
EMMA BRIANT: Yeah.
AMY GOODMAN: Do you think that’s the starting point for companies like Facebook, like Google and others?
EMMA BRIANT: I think that’s a big part of it. I do think that Elizabeth Warren’s recommendations when it comes to that and antitrust and so on are really important. And we have legal precedents to follow on that kind of thing.
But I also think that we need an independent regulator for the tech industry and also a separate one for the influence industry. So, America has some regulation when it comes to lobbying. In the U.K., we have none. And quite often, you know, American companies will partner with a British company in order to be able to get around doing things, for instance. We have to make sure that different countries’ jurisdictions cannot be, you know, abused in order to make something happen that would be forbidden in another country.
We need to make sure that we’re also tackling how money is being channeled into these campaigns, because, actually, there’s an awful lot we could do that isn’t just about censoring or taking down content, but that actually is, you know, about making sure that the money isn’t being funneled in to the — to fund these actual campaigns. If we knew who was behind them, if we were able to show which companies were working on them and what other interests they might have, then I think this would really open up the system to better journalism, to better — you know, more accountability.
And the issue isn’t just about what’s happening on the platforms, although that is a big part of it. We have to think about, you know, the whole infrastructure. We also need more accountability when it comes to our governments. So, this is the third part of what I would consider the required regulatory framework. You have to address the tech companies and platforms. You also have to address the influence industry companies.
And you also need to talk about defense contracting, which has insufficient accountability at the moment. There is not enough reporting. There is not enough — when I say “reporting,” I mean the companies are not required to give enough information about what their conflicts of interest might be, about where they’re getting their money and what else they’re doing. And we shouldn’t be able to have companies that are working in elections at the same time as doing this kind of national security work. It’s so risky. I personally would outlaw it. But at the very least, it needs to be controlled more effectively, you know, because I think a lot of the time that the people who are looking at those contracts, when they’re deciding whether to give, you know, Erik Prince or somebody else a defense contract, are not always knowledgeable about the individuals concerned or their wider networks. And that is really disturbing. If we’re going to be spending the amount of money that we do on defense contracting, we can at least make sure it’s accountable.
KARIM AMER: When you look at the film, there’s a clip that I think ties exactly to what Emma’s talking about, where in the secret camera footage of — Mark Turnbull is being filmed, and he’s bragging to a client and saying, you know, about the “Crooked Hillary” campaign, “We had the ability to release this information without any trackability, without anyone knowing where it came from, and we put it into the bloodstream of the internet.” Right? That is, I hope, something that we would say only happened in 2016 and will not be allowed to happen in 2020.
AMY GOODMAN: What would stop it? And let’s go —
KARIM AMER: Well, political action, because what — the problem with that is that what you’re saying there is, essentially, is you can go out there and say what you want, not as an individual — free speech is an individual who says no — as an organized, concerted, profiting entity with special interests, without anyone knowing what you’re saying. And I would say that the first step to accountability is to say, “If you want to say something in the political arena, you have to — we have to know who’s behind what’s being said.” And I guarantee you, if you just put at least that level of accountability, a lot of people will be — a lot less hate would be spewed that would be funded for, because it would connect back to who’s behind it.
AMY GOODMAN: Let’s go to that video clip. This was a turning point for Cambridge Analytica.
KARIM AMER: Yes.
AMY GOODMAN: It was not only Turnbull, it was Alexander Nix —
KARIM AMER: Yes.
AMY GOODMAN: — who was selling the company and talking about what they could do.
ALEXANDER NIX: Deep digging is interesting, but, you know, equally effective can be just to go and speak to the incumbent and to offer them a deal that’s too good to be true and make sure that that’s video recorded. You know, these sorts of tactics are very effective, instantly having video evidence of corruption —
FIXER: Right.
ALEXANDER NIX: — putting it on the internet, these sorts of things.
FIXER: And the operative you will use for this is who?
ALEXANDER NIX: Well, someone known to us.
FIXER: OK, so it is somebody. You won’t use a Sri Lankan person, no, because then this issue will —
ALEXANDER NIX: No, no. We’ll have a wealthy developer come in, somebody posing as a wealthy developer.
MARK TURNBULL: I’m a master of disguise.
ALEXANDER NIX: Yes. They will offer a large amount of money to the candidate, to finance his campaign in exchange for land, for instance. We’ll have the whole thing recorded on cameras. We’ll blank out the face of our guy and then post it on the internet.
FIXER: So, on Facebook or YouTube or something like this.
ALEXANDER NIX: We’ll send some girls around to the candidate’s house. We have lots of history of things.
AMY GOODMAN: So, that was Alexander Nix, the former head of the former company Cambridge Analytica. Brittany Kaiser, can you talk about what happened when that video was released? I mean, he was immediately suspended.
BRITTANY KAISER: He was immediately suspended. And Julian Wheatland, the former COO/CFO, because he actually straddled both jobs, was then appointed temporary CEO before the company closed. What —
AMY GOODMAN: Is it totally dead now?
BRITTANY KAISER: I wouldn’t say so. Just because it’s not the SCL Group or Cambridge Analytica or Emerdata, which was the holding company they set up at the end so that they could bring in large-scale investors from all around the world to scale up — just because it’s not under those names doesn’t mean that the same work isn’t being done by the same people. Former Cambridge Analytica employees are currently supporting Trump 2020. They are working in countries all around the world with individual political consultancies, marketing consultancies, strategic communications firms.
And it’s not just the people that worked for Cambridge Analytica now. As I mentioned earlier, because 2016 was so successful, there are now hundreds of these companies all around the world. There was an Oxford University report that came out a couple months ago that showed the proliferation of propaganda service companies, that are even worse than what Cambridge Analytica did, because they employ bot farms. And they use trolls on the internet in order to increase hatred and division.
And so, these technologies have become even more sophisticated. And if we don’t start investing in the legislation and the regulation to stop this, it’s going to get worse before it gets better. And I’m absolutely terrified to see what is going to happen on our newsfeeds between now and November 3rd.
AMY GOODMAN: What should people look for? And how do people right now, at this point — how do they get involved?
BRITTANY KAISER: There are a few different ways. And first, I would say, educate yourself. Become digitally literate. Look on the website of the Center for Humane Technology. Look on the website of the Electronic Frontier Foundation. Read the Contract for the Web. Look at the Beacon Trust website. And start to understand what your data rights are, how you can protect yourself online, how you can identify misinformation. The Center for Media Literacy is fantastic, as well. There are a lot of people who have put a lot of time into educating everybody.
There is a new concept called DQ. It means digital intelligence, like IQ or EQ. And it’s a new global standard that my new foundation, the Own Your Data Foundation, is helping roll out in schools in America, that actually teach kids these things, how to prevent cyberbullying and be ethical online and identify fake news and disinformation.
We have to empower ourselves to protect ourselves, because, unfortunately, legislation and regulation is not a fast process. But it could be a lot faster than it is right now, if everybody that listens to this actually calls your legislators. Just type in “contact your legislator.” You can find government websites. You can find advocacy groups that will help you do that. They’ll even give you example emails and phone calls that you can say, scripts. It takes five minutes of your time. Please do it.
AMY GOODMAN: As you point out in The Great Hack, data companies have surpassed oil companies as the most valuable companies in the world right now. You also — that piece that you wrote, “How much of Facebook’s revenue comes from the monetization of users’ personal data?” — what exactly do you mean? And what should people do in response?
BRITTANY KAISER: What I mean is that data is now the world’s most valuable asset. And somehow we, as the producers of that asset, do not have access to that value. We look back in history, and we see when populations have been exploited for their natural resources, and we now see that as wrong. Anybody that is providing value should have access to at least a dividend from that.
So what I’m asking is that not only should our data rights be our human rights, but we start to employ a property rights framework to think about the transparency that we should be given into how our assets are used, the opt-in infrastructure so that we actually can decide whether we want our data to be used for certain purposes or not. And if we decide to opt in, we should be given positive incentives to do so.
I would love to give my healthcare data if I knew that I could help solve cancer with research organizations and pharma companies. Very happy to do that. But I’m not going to do that until I have the transparency, the data security and the opt-in infrastructure to know that my data will not be abused if I decide to share it.
AMY GOODMAN: So, what do you do when it says “terms and conditions,” and in order to get to that website, which you’re racing to get to, you just click it because there’s no other way to get to it?
BRITTANY KAISER: That is starting to be considered manipulative by the Federal Trade Commission, and the FTC is starting to outlaw that type of behavior. Also, the implementation of GDPR and CCPA in California does not allow for that —
AMY GOODMAN: The new law in California.
BRITTANY KAISER: The new law in California that just went into effect on January 1st, which already has very exciting updates that are coming in, that were proposed in November.
So, there’s going to be a lot of help for companies to comply. There is going to be a lot of help for political organizations and nonprofits to comply. And we’re going to make it easy to be ethical and moral about data usage. But right now this is forcing a new type of transparency, where terms and conditions are made completely transparent, so it’s easy for anybody to understand, even at a low-grade reading level, and you know what you are agreeing to. And if you decide to opt out, you can still access that website or that service. Because right now we have given up our privacy and our freedom in exchange for convenience. And we can’t allow people to be manipulated into that any longer.
JEHANE NOUJAIM: We’re very much in the middle of a war on truth, where it’s impossible to have a nuanced debate with somebody that you disagree with. That’s what’s happening in this country and around the world.
And there are leaders that have — are recognizing this, I mean, in terms of the showings of The Great Hack. That’s why we made this film, was so that people could become aware of this. Hillary Clinton hosted a screening. President Macron talked about it. Schiff has hosted a screening and a debate after it. So there are leaders that are trying to educate publics about this.
But we have to continue to educate ourselves, share the film, go to the Hindsight Files, @HindsightIs2020 on Twitter, and really start holding our governments accountable and our tech platforms accountable for our current situation.
AMY GOODMAN: Let’s end with another clip from the documentary The Great Hack. This is the early Facebook investor Roger McNamee.
ROGER McNAMEE: Facebook is designed to monopolize attention, just taking all of the basic tricks of propaganda, marrying them to the tricks of casino gambling — you know, slot machines and the like — and basically playing on instincts. And fear and anger are the two most dependable ways of doing that. And so they created a set of tools to allow advertisers to exploit that emotional audience with individual-level targeting, right? There’s 2.1 billion people each with their own reality. And once everybody has their own reality, it’s relatively easy to manipulate them.
AMY GOODMAN: So that’s a clip of Roger McNamee, the early Facebook investor, in the film The Great Hack. And as we wrap up, you see him speaking with our guest, Brittany Kaiser, who was just involved with releasing more documents from the company that was Cambridge Analytica, where she worked for three years, and Paul Hilder, who was featured in The Great Hack. Talk about what Paul Hilder is doing right now with these files, and tell us where we can get the Hindsight Files.
nước Anh HOÀNG ĐẾ: Paul Hilder was one of my first inspirations to decide to become a whistleblower. He introduced me to Paul Lewis, who at the time was the bureau chief of The Guardian in San Francisco, and we went together to Silicon Valley to start looking through my files. He helped me, almost as a research assistant, to discover what were all of the most important points that I had in my evidence. And he continues to support me in that. He’s also started an amazing company called Datapraxis, which is an ethical data company that only does positive messaging, with explicit and transparent opt-ins, in order to support progressive parties and progressive movements around the world.
AMY GOODMAN: This in the midst of what’s happening around Brexit in Britain.
BRITTANY KAISER: Absolutely.
KARIM AMER: And Paul — you know, Paul is the person who introduced us to Brittany, actually, as well. And that’s why we chose to, you know, follow their journey in the film.
And I think one of the things that does keep me optimistic is — despite all of this, is a line from the film where Paul says that, you know, he believes, ultimately, in societal and personal redemption. I think that’s what we’re looking for in this story. We’re looking for: How can we come back together here? How can we realize that this is actually not a partisan issue? Because this right now, as it’s being used in the election, we’re seeing it through a purely partisan lens, but actually we’re talking about the basic information infrastructure of an entire country and of the world.
And if we want to live in a world where we have any shared values, if we want to live in a world where the truth is not so fragile, if we want to live in a world where human autonomy is still being protected, then we have to band together right now. And we have to realize that this is bigger than us. This is bigger than just one election. And we have to find the courage to come together. And that begins with transparency. That begins with actually having a conversation about what happened and trying to prevent it from happening again.
AMY GOODMAN: And finally, Brittany, where can people go for the Hindsight Files on scores of countries around the world and what Cambridge Analytica’s involvement with them was, and may, in other companies, continue to be?
BRITTANY KAISER: You can follow @HindsightFiles on Twitter. You can also follow one of my other accounts, Own Your Data, @OwnYourDataNow. You can also go to TheGreatHack.com/Hindsight-Files, because on that website they have chosen to support my document release and explain to you further in detail the context of each of the folders that I am dropping. So, thank you all for your support, and I do hope you decide to follow.
AMY GOODMAN: And how many more documents do you plan to drop? And how many have dropped so far since the beginning of the year?
BRITTANY KAISER: We’ve dropped thousands of documents already, and we have tens of thousands more to come.
AMY GOODMAN: Final words, Emma Briant?
EMMA BRIANT: Oh, thank you so much. As I said before, this is, of course, the tip of the iceberg. I have got an awful lot more that’s going to be coming out in due course.
The things that we need to watch for are the — you know, making sure we vote with our feet when it comes to our political representatives, the kinds of commercial companies that we’re going for. You know, we need to be demanding companies work with more ethical firms.
And it’s really important to also encourage people to come forward about wrongdoing. You know, Brittany Kaiser has been very brave in doing what she did, as has Chris Wylie. There are other brave whistleblowers who have not been so much in the public eye. But people can come forward anonymously. They can come forward visibly. It’s a very scary thing to do. But we need more of this kind of data release going on, in order to open things up and ensure we are able to expose the wrongdoing, at the very least until we get proper regulation.
So I want to encourage anybody to be brave and to work with whistleblower organizations who help to support people coming forward, and to also say thank you to everybody on The Great Hack for doing the wonderful job that you’re doing in bringing this to the international world stage, into everybody’s homes around the world. It’s so important that we get some amplification to these really just vital issues and create real change. Thank you, everybody. And thank you, Amy, for having us on.
AMY GOODMAN: Well, thank you all so much. I want to thank Emma Briant, a visiting research associate in human rights at Bard College; Jehane Noujaim and Karim Amer, the award-winning co-directors of The Great Hack, which has just been shortlisted for an Academy Award, and just today, it was announced, is a nominee for a BAFTA, the British equivalent of the Oscars. And, Brittany Kaiser, I want to thank you for spending this time with us in this first big interview you’ve done since this latest document release that you are in the midst of. Brittany Kaiser, Cambridge Analytica whistleblower, featured in the documentary The Great Hack, the director of business development when she left Cambridge Analytica in 2018, after working at the company for three-and-a-half years. Her book is titled Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again.
AMY GOODMAN: A longtime Facebook executive has admitted the company’s platform helped Donald Trump win the 2016 election, and predicted it may happen again this year. In an internal memo, Facebook Vice President Andrew Bosworth wrote, “So was Facebook responsible for Donald Trump getting elected? I think the answer is yes,” he said. Bosworth, who was a backer of Hillary Clinton in 2016, went on to write that the company should not change its policies in an effort to hurt Trump’s re-election chances. Bosworth credited Trump for running, quote, “the single best digital ad campaign I’ve ever seen from any advertiser.”
In his memo, Bosworth referenced the role of the shadowy data firm Cambridge Analytica but downplayed its significance. However, a new Netflix documentary called The Great Hack argues Cambridge Analytica has played a critical role in the U.S. election, as well as elections across the globe.
Cambridge Analytica was founded by the right-wing billionaire Robert Mercer. Trump’s former adviser Steve Bannon was the company’s vice president and claims to have named the company.
Cambridge Analytica harvested some 87 million Facebook profiles without the users’ knowledge or consent and used the data to sway voters during the 2016 campaign. The story of Cambridge Analytica is featured in the new documentary The Great Hack, which has been shortlisted for an Oscar.
DAVID CARROLL: All of your interactions, your credit card swipes, web searches, locations, likes, they’re all collected, in real time, into a trillion-dollar-a-year industry.
CAROLE CADWALLADR: The real game changer was Cambridge Analytica. They worked for the Trump campaign and for the Brexit campaign. They started using information warfare.
DAVID CARROLL: Cambridge Analytica claimed to have 5,000 data points on every American voter.
AMY GOODMAN: Well, earlier this week, I spoke to the directors of The Great Hack, Jehane Noujaim and Karim Amer, as well as propaganda researcher Emma Briant and a former employee at Cambridge Analytica, Brittany Kaiser, who has begun posting online a trove of documents detailing the company’s operations, including its work with President Trump’s former national security adviser John Bolton. Kaiser has also written about her experience at the company in the book Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again. Kaiser is one of two former Cambridge Analytica employees featured in The Great Hack. The other is Christopher Wylie.
CHRISTOPHER WYLIE: It’s incorrect to call Cambridge Analytica a purely sort of data science company or an algorithm company. You know, it is a full-service propaganda machine.
AMY GOODMAN: I asked Cambridge Analytica whistleblower Brittany Kaiser to talk about how she became involved with Cambridge Analytica.
BRITTANY KAISER: I think it’s very important to note this, because there are people all around the world that are working for tech companies, that I’m sure joined that company in order to do something good. They want the world to be more connected. They want to use technology in order to communicate with everybody, to get people more engaged in important issues. And they don’t realize that while you’re moving fast and breaking things, some things get so broken that you cannot actually contemplate or predict what those repercussions are going to look like.
Chris Wylie and I both really idealistically joined Cambridge Analytica because we were excited about the potential of using data for exciting and good impact projects. Chris joined in 2013 on the data side in order to start developing different types of psychographic models. So he worked with Dr. Aleksandr Kogan and the Cambridge Psychometrics Centre at Cambridge University in order to start doing experiments with Facebook data, to be able to gather that data, which we now know was taken under the wrong auspices of academic research, and was then used in order to identify people’s psychographic groupings.
AMY GOODMAN: Now, explain that, psychographic groupings, and especially for people who are not on Facebook, who don’t understand its enormous power and the intimate knowledge it has of people. Think of someone you’re talking to who’s never experienced Facebook. Explain what is there.
BRITTANY KAISER: Absolutely. So, the amount of data that is collected about you on Facebook and on any of your devices is much more than you’re really made aware of. You probably haven’t ever read the terms and conditions of any of these apps on your phone. But if you actually took the time to do it and you could understand it, because most of them are written for you not to understand — it’s written in legalese — you would realize that you are giving away a lot more than you would have ever agreed to if there was transparency. This is your every move, everywhere you’re going, who you’re talking to, who your contacts are, what information you’re actually giving in other apps on your phone, your location data, all of your lifestyle, where you’re going, what you’re doing, what you’re reading, how long you spend looking at different images and websites.
This amount of behavioral data gives such a good picture of you that your behavior can be predicted, as Karim was talking about earlier, to a very high degree of accuracy. And this allows companies like Cambridge Analytica to understand how you see the world and what will motivate you to go and take an action — or, unfortunately, what will demotivate you. So, that amount of data, available on Facebook ever since you joined, allows a very easy platform for you to be targeted and manipulated.
And when I say “psychographic targeting,” I’m sure you probably are a little bit more familiar with the Myers-Briggs test, so the Myers-Briggs that asks you a set of questions in order to understand your personality and how you see the world. The system that Cambridge Analytica used is actually a lot more scientific. It’s called the OCEAN five-factor model. And OCEAN stands for O for openness, C for conscientiousness, whether you prefer plans and order or you’re a little bit more fly by the seat of your pants. Extraversion, whether you gather your energy from being out surrounded by people, or you’re introverted and you prefer to gather your energy from being alone. If you are agreeable, you care about your family, your community, society, your country, more than you care about yourself. And if you are disagreeable, then you are a little bit more egotistical. You need messages that are about benefits to you. And then the worst is neurotic. You know, it’s not bad to be neurotic. It means that you are a little bit more emotional. It means, unfortunately, as well, that you are motivated by fear-based messaging, so people can use tactics in order to scare you into doing what they want you to do.
And this is what was targeted when they were gathering that data out of Facebook to figure out which group you belonged to. They found about 32 different groups of people, different personality types. And there were groups of psychologists that were looking into how they could understand that data and convert that into messaging that was just for you.
I need to remind everybody that the Trump campaign put together over a million different advertisements that were put out, a million different advertisements with tens of thousands of different campaigns. Some of these messages were for just you, were for 50 people, a hundred people. Obviously, certain groups are thousands, tens of thousands or millions. But some of them were targeted very much directly at the individual, to know exactly what you’re going to click on and exactly what you care about.
AMY GOODMAN: So they were doing this before Cambridge Analytica. But describe — I want to actually go to a Bannon clip, Steve Bannon, who takes credit for naming Cambridge Analytica, right? Because you had SCL before, Defence.
BRITTANY KAISER: Yes.
AMY GOODMAN: And then it becomes Cambridge Analytica, for Cambridge University, right? Where Kogan got this information that he culled from Facebook.
BRITTANY KAISER: Yes.
AMY GOODMAN: This is former White House chief strategist Steve Bannon in an interview at a Financial Times conference in March 2018. Bannon said that reports that Cambridge Analytica improperly accessed data to build profiles on American voters and influence the 2016 presidential election were politically motivated. Months later, evidence emerged linking Bannon to the Cambridge Analytica scandal, which resulted in a $5 billion fine for Facebook. Bannon was a founder and former board member of the political consulting firm, serving as Cambridge Analytica’s vice president.
STEPHEN BANNON: All Cambridge Analytica is the data scientists and the applied applications here in the United States. It has nothing to do with the international stuff. The Guardian actually tells you that, and the Observer tells you that, when you get down to the 10th paragraph, OK? When you get down to the 10th paragraph. And what Nix does overseas is what Nix does overseas. Right? It was a data — it was a data company.
And by the way, Cruz’s campaign and the Trump campaign say, “Hey, they were a pretty good data company.” But this whole thing on psychographics was optionality in the deal. If it ever worked, it worked. But it hasn’t worked, and it doesn’t look like it’s going to work. So, it was never even applied.
AMY GOODMAN: So, that’s Steve Bannon in 2018, key to President Trump’s victory and to his years so far in office, before he was forced to — before he was forced out. What was your relationship with Steve Bannon? You worked at Cambridge Analytica for over three years. You had the keys to the castle, is that right, in Washington?
BRITTANY KAISER: Yes, for a while I actually split the keys to what is Steve’s house with Alexander Nix, because we used his house as our office. His house is also used as a Breitbart office in the basement. It’s called the “Breitbart Embassy” on Capitol Hill. And that’s where I would go for meetings.
AMY GOODMAN: Who funded that?
BRITTANY KAISER: I believe it was owned by the Mercer family, that building. And we would come into the basement and use that boardroom for our meetings. And we would use that for planning who we were going to go pitch to, what campaigns we were going to work for, what advocacy groups, what conservative 501(c)(3)s and (c)(4)s he wanted us to go see.
And I didn’t spend a lot of time with Steve, but the time I did was incredibly insightful. Almost every time I saw him, he’d be showing me some new Hillary Clinton hit video that he had come out with, or announcing that he was about to throw a book launch party for Ann Coulter for ¡Adios, America!, which was something that he invited both me and Alexander to, and we promptly decided to leave the house before she arrived.
But Steve was very influential in the development of Cambridge Analytica and who we were going to go see, who we were going to support with our technology. And he made a lot of the introductions, which in the beginning seemed a little less nefarious than they did later on, when he got very confident and started introducing us to white right-wing political parties across Europe and in other countries. And I tried to get meetings with the main political parties, or leftist or green parties instead, to make sure that those far-right-wing parties that do not have the world’s best interests at heart could not get access to these technologies.
AMY GOODMAN: You said in The Great Hack, in the film, that you have evidence of illegality of the Trump and Brexit campaigns, that they were conducted illegally. I was wondering if you can go into that. I mean, it was controversial even, and Carole Cadwalladr, the great reporter at the Observer and the Guardian, was blasted and was personally targeted, very well demonstrated in The Great Hack, for saying that Cambridge Analytica was involved in Brexit. They kept saying they had nothing to do with it, until she shows a video of you, who worked for Cambridge Analytica, at one of the founding events of Leave.EU, the Brexit campaign.
BRITTANY KAISER: Yeah, Leave.EU, that panel that I was on, which has now become quite an infamous video, was their launch event to launch the campaign. And Cambridge Analytica was in deep negotiations, through an introduction from Steve Bannon, with both of the Brexit campaigns. I was told, actually, originally we pitched remain, and the remain side said that they did not need to spend money on expensive political consultants, because they were going to win anyway. And that’s actually what I also truly believed, and so did they.
So, Steve made the introductions to make sure that we would still get a commercial contract out of this political campaign, both to Vote Leave and to Leave.EU. Cambridge Analytica took Leave.EU; and AIQ, which was essentially Cambridge Analytica’s digital partner before Cambridge Analytica could run our own digital campaigns, was running the Vote Leave side — both funded by the Mercers, both with the same access to this giant database on American voters.
AMY GOODMAN: The Mercers funded Brexit?
BRITTANY KAISER: There was Cambridge Analytica work, as well as AIQ work, in both of the leave campaigns. So, a lot of that money, in order to collect that data and in order to build the infrastructure of both of those companies, came from Mercer-funded campaigns, yes.
AMY GOODMAN: And again, explain what AIQ is.
BRITTANY KAISER: AIQ was a company that actually ran all of Cambridge Analytica’s digital campaigns, until January 2016, when Molly Schweickert, our head of digital, was hired in order to build ad tech internally within the company. AIQ was based in Canada and was a partner that had access to Cambridge Analytica data the entire time that they were running the Vote Leave campaign, which was the designated and main campaign in Brexit.
AMY GOODMAN: So, when did you see the connection between Brexit and the Trump campaign?
BRITTANY KAISER: Actually, a lot of it started to come when I saw some of Carole’s reporting, because there were a lot of conspiracy theories over what was going on, and I didn’t know what to believe. All I knew was that we definitely did work in the Brexit campaign, “we” as in when I was at Cambridge Analytica, because I was one of the people working on the campaign. And we obviously played a large role in not just the Trump campaign itself, but Trump super PACs and a lot of other conservative advocacy groups, 501(c)(3)s, (4)s, that were the infrastructure that allowed for the building of the movement that pushed Donald Trump into the White House.
AMY GOODMAN: I mean, it looks like Cambridge Analytica was headed toward becoming a billion-dollar corporation.
BRITTANY KAISER: That’s what Alexander used to tell us all the time. That was the carrot that he waved in front of our eyes in order to have us keep going. “We’re building a billion-dollar company. Aren’t you excited?” And I think that that’s what so many people get caught up in, people that are currently working at Facebook, people that are working at Google, people that are working at companies where they are motivated to build exciting technology, that obviously can also be very dangerous, but they think they’re going to financially benefit and be able to take care of themselves and their families because of it.
AMY GOODMAN: So what was illegal?
BRITTANY KAISER: The massive problems that came from the data collection, specifically, are where my original accusations come from, because data was collected under the auspices of being for academic research and was used for political and commercial purposes. There are also different data sets that are not supposed to be matched and used without explicit transparency and consent in the United Kingdom, because they actually have good national data protection laws and international data protection laws through the European Union to protect voters. Unfortunately, in the United States, we only have seen the state of California coming out and doing it.
Now, on the other side, we have laws against voter suppression that prevent our vote from being suppressed. We have laws against discrimination in advertising, racism, sexism, incitement of violence. All of those things are illegal, yet somehow a platform like Facebook has decided that if politicians want to use any of those tactics, that they will not be held to the same community standards as you or me, or the basic laws and social contracts that we have in this country.
AMY GOODMAN: Cambridge Analytica whistleblower Brittany Kaiser. When we come back, we speak to the directors of The Great Hack, the documentary that’s just been shortlisted for an Academy Award.
[break]
AMY GOODMAN: “Let Me Steal Your Secrets,” soundtrack from the documentary The Great Hack. This is Democracy Now! I’m Amy Goodman, as we continue our look at Cambridge Analytica, Facebook, and their roles in the 2016 U.S. election and other elections. Earlier this week, I spoke to the directors of The Great Hack, Jehane Noujaim and Karim Amer, as well as propaganda researcher Emma Briant and Cambridge Analytica whistleblower Brittany Kaiser. I asked Karim Amer to talk about Cambridge Analytica’s effort to suppress the vote in Trinidad and Tobago.
KARIM AMER: It was important for us to show in the film the expansiveness of Cambridge’s work. This went beyond the borders of the United States and even beyond the borders of the EU and the U.K. Because what we find is that Cambridge used the — in pursuing this global influence industry that they were very much a part of, they used different countries as Petri dishes to learn and get the know-how about different tactics. And from improving those tactics, they could then sell them for a higher cost — higher margin in Western democracies, where the election budgets are, you know — we have to remember, I think it’s important to predicate that the election business has become a multibillion-dollar global business, right? So, we have to remember that while we are upset with companies like Cambridge, we allowed for the commoditization of our democratic process, right? So, people are exploiting this now because it’s become a business. And we, as purveyors of this, can’t really be as upset as we want to be, when we’ve justified that. So I want to preface it with that.
Now, that being said, what’s happened as a result is a company like Cambridge can practice tactics in a place like Trinidad, that’s very unregulated in terms of what they can and can’t do, learn from that know-how and then, you know, use it — parlay it into activities in the United States. What they did in Trinidad, and why it was important for us to show it in the film, is they led something called the “Do So” campaign, where they admit to making it cool and popular among youth to get out and not vote. And they knew —
AMY GOODMAN: So, you had the Indian population and the black population.
KARIM AMER: And the black population. And there is a lot of historic tension between those two, and a lot of generational differences, as well, between those two. And the “Do So” campaign targeted — was done in a way to, you know, by looking at the data and looking at the predictive analysis of which group would vote or not vote, get enough people to dissuade them from voting, so that they could flip the election.
AMY GOODMAN: Targeted at?
KARIM AMER: Targeted at the youth. And so, this is really — when you watch —
AMY GOODMAN: “Do So” actually meant “don’t vote.”
KARIM AMER: “Do So,” don’t vote.
JEHANE NOUJAIM: Don’t vote.
KARIM AMER: Yes, exactly. And when —
AMY GOODMAN: With their fists crossed.
KARIM AMER: With their fists.
AMY GOODMAN: And that it became cool not to vote.
KARIM AMER: Exactly. And you look at the level of calculation behind this, and it’s quite frightening. Now, as Emma was saying, a lot of these tactics were born out of our own fears in the United States and the U.K. post-9/11, when we allowed for this massive weaponization of influence campaigns to begin. You know, if you remember President Bush talking about, you know, the battle for the hearts and minds of the Iraqi people, all of these kinds of industries were born out of this.
And now I believe what we’re seeing is the hens have come home to roost, right? All of these tactics that we developed in the name of, quote-unquote, “fighting the war on terror,” in the name of doing these things, have now been commercialized and used to come back to the biggest election market in the world, the United States. And how do we blame people for doing that, when we’ve allowed for our democracy to be for sale?
And that’s what Brittany’s files today, that she’s releasing and has released over the last couple days, really give us insight to. The Hindsight Files that Brittany has released show us how there is an auction happening for influence campaigns in every democracy around the world. There is no vote that is protected in the current way that we — in the current space that we’re living.
And the thing that’s allowing this to happen is these information platforms like Facebook. And that is what’s so upsetting, because we can actually do something about that. We are the only country in the world that can hold Facebook accountable, yet we still have not done so. And we still keep going to their leadership hoping they do the right thing, but they have not. And why is that? Because no industry has ever shown in American history that it can regulate itself. There is a reason why antitrust laws exist in this country. There’s a tradition of holding companies accountable, and we need to re-embrace that tradition, especially as we enter into 2020, where the stakes could not be higher.
AMY GOODMAN: Brittany Kaiser, can you talk about the “Crooked Hillary” campaign and how it developed?
BRITTANY KAISER: Absolutely. So, this started as a super PAC that was built for Ted Cruz, Keep the Promise I, which was run by Kellyanne Conway and funded by the Mercers. That was then converted to becoming a super PAC for Donald Trump. They tried to register with the Federal Election Commission the name, Defeat Crooked Hillary, and the FEC, luckily, did not allow them to do that. So it was called Make America Number 1.
This super PAC was headed by David Bossie, someone that you might remember from Citizens United, who basically brought dark money into our politics and allowed endless amounts of money to be funneled into these types of vehicles so that we don’t know where all of the money is coming from for these types of manipulative communications. And he was in charge of this campaign.
Now, on that two-day-long debrief that I talked about — and if you want to know more, you can read about it in my book — they told us —
AMY GOODMAN: Wait, and explain where you were and who was in the room.
BRITTANY KAISER: So, I was in New York in our boardroom for Cambridge Analytica’s office on Fifth Avenue. And all of our offices from around the world had called in to videocast. And everybody from the super PAC and the Trump campaign took us through all of their tactics and strategies and implementation and what they had done.
Now, when we got to this Defeat Crooked Hillary super PAC, they explained to us what they had done, which was to run experiments on psychographic groups to figure out what was working and what wasn’t. Unfortunately, what they found out was the only very successful tactic was sending fear-based, scaremongering messaging to people that were identified as being neurotic. And it was so successful in their first experiments that they spent the rest of the money from the super PAC over the rest of the campaign only on negative messaging and fearmongering.
AMY GOODMAN: And crooked, the O-O in “crooked” was handcuffs.
BRITTANY KAISER: Yes. That was designed by Cambridge Analytica’s team.
AMY GOODMAN: Karim?
KARIM AMER: And one thing that I think it’s important to remember here, because there’s been a lot of debate among some people about: Did this actually work? To what degree did it work? How do we know whether it worked or not? What Brittany is describing is a debrief meeting where Cambridge, as a company, is saying, “This is what we learned from our political experience. This is what actually worked.” OK? And they’re sharing it because they’re saying, “Now this is how we want to codify this and commoditize this to go into commercial business.” Right?
So this is the company admitting to their own know-how. There is no debate about whether it works or not. This is not them advertising it to the world. This is them saying, “This is what we’ve learned. Based off that, this is how we’re going to run our business. This is how we’re going to invest in the expansion of this to sell this outside of politics.” The game was, take the political experience, parlay it into the commercial sector. That was the strategy. So, there is no debate whether it worked or not. It was highly effective.
And the thing that’s terrifying is that while Cambridge has been disbanded, the same actors are out there. And there’s nothing has been — nothing has changed to allow us to start putting in place legislation to say there is something called information crimes. In this era of information warfare, in this era of information economies, what is an information crime? What does it look like? Who determines it? And yet, without that, we are still living in this unfiltered, unregulated space, where places like Facebook are continuing to choose profit over the protection of the republic. And I think that’s what’s so outrageous.
JEHANE NOUJAIM: And I think it’s pretty telling that only two people —
AMY GOODMAN: Jehane.
JEHANE NOUJAIM: Only two people have come forward from Cambridge Analytica. Why is that? Both of the people that have come forward, Brittany and Chris, and also with Carole’s writing, have been targeted personally. And it’s been a very, very difficult story to tell. Even with us, when we released the film in January, every single time we have entered into the country, we have been stopped for four to six hours of questioning at the border. That —
AMY GOODMAN: Stopped by?
JEHANE NOUJAIM: Stopped by — on the border of the U.S., in JFK Airport, where you’re taken into the back, asked for all of your social media handles, questioned for four to six hours, every single time we enter the country. So —
AMY GOODMAN: Since when?
JEHANE NOUJAIM: Since we released the film, so since Sundance, since January, every time we’ve come back into the U.S.
AMY GOODMAN: And on what grounds are they saying they’re stopping you?
JEHANE NOUJAIM: No explanation. No —
AMY GOODMAN: And what is your theory?
JEHANE NOUJAIM: My theory is that it’s got something to do with this film. Maybe we’re doing something right. We were at first — we’ve been stopped in Egypt, but we’ve never been stopped in the U.S. in this way. We’re American citizens. Right?
AMY GOODMAN: You talk about people coming forward and not coming forward. I wanted to turn to former Cambridge Analytica COO, the chief operating officer, Julian Wheatland, speaking on the podcast Recode Decode.
JULIAN WHEATLAND: The company made some significant mistakes when it came to its use of data. They were ethical mistakes. And I think that part of the reason that that happened was that we spent a lot of time concentrating on not making regulatory mistakes. And so, for the most part, we didn’t, as far as I can tell, make any regulatory mistakes, but we got almost distracted by ticking those boxes of fulfilling the regulatory requirements. And it felt like, well, once that was done, then we’d done what we needed to do. And we forgot to pause and think about ethically what was — what was going on.
AMY GOODMAN: So, if you could decode that, Brittany? Cambridge Analytica COO Julian Wheatland, who, interestingly, in The Great Hack really condemned Chris Wylie — did not appreciate Chris Wylie stepping forward and putting Cambridge Analytica in the crosshairs in the British Parliament — but was more equivocal about you. Talk about Wheatland and his role and what he’s saying about actually abiding by the regulations, which they actually clearly didn’t.
BRITTANY KAISER: Once upon a time, I used to have a lot of respect for Julian Wheatland. I even thought we were friends. I thought we were building a billion-dollar company together that was going to allow me to do great things in the world. But, unfortunately, that’s a story that I told myself and a story he wanted me to believe that isn’t true at all.
While he likes to say that they spent a lot of time abiding by regulations, I would beg to differ. Cambridge Analytica did not even have a data protection officer until 2018, right before they shut down. I begged for one for many years. I begged for more time with our lawyers and was told I was creating too many invoices. And for a long time, because I had multiple law degrees, I was asked to write contracts. And so were other —
AMY GOODMAN: Didn’t you write the Trump campaign contract?
BRITTANY KAISER: The original one, yes, I did. And there were many other people that were trained in human rights law in the company that were asked to draft contracts, even though contract law was not anybody’s specialty within the company. But they were trying to cut corners and save money, just like a lot of technology companies decide to do. They do not invest in making the ethical or legal decisions that will protect the people that are affected by these technologies.
AMY GOODMAN: I wanted to bring Emma Briant back into this conversation. You mentioned it in Part 1 of our discussion —
EMMA BRIANT: Of course.
AMY GOODMAN: — this issue of military contractors —
EMMA BRIANT: Yeah.
AMY GOODMAN: — and the nexus of military and government power, the fact that with Trump’s election —
EMMA BRIANT: Yeah.
AMY GOODMAN: — military contractors were one of the greatest financial beneficiaries of Trump’s election.
KARIM AMER: But I think it’s important to remember that —
EMMA BRIANT: Sure.
AMY GOODMAN: Karim Amer.
KARIM AMER: — the issue is that also these — when we think of military contractors, we think of people selling tanks and guns and bullets and these types of things. The problem that we don’t realize is that we’re in an era of information warfare. So the new military contractors aren’t selling — aren’t selling the traditional tanks. They’re selling the —
AMY GOODMAN: Although they’re doing that.
KARIM AMER: They’re doing that, as well, but they’re selling the equivalent of that in the information space. And that’s a new kind of weapon. That’s a new kind of battle that we’re not familiar with.
And the reason why it’s more challenging for us is because there’s a deficit of language and a deficit of visuals. We don’t know where the battlefield is. We don’t know where the borders are. We can’t pinpoint, be like, “This is where the trenches are.” Yet we’re starting to uncover that. And that was so much of the challenge in making this film, is trying to see where can we actually show you where these wreckage sites are, where the casualties of this new information warfare are, and who the actors are and where the fronts are.
And I think, in entering 2020, we have to keep a keen eye on where the new war fronts are and when they’re happening in our domestic frontiers and how they’re happening in these devices that we use every day. So this is where we have to have a new kind of reframing of what we’re looking at, because while we are at war, it is a very different kind of borderless war where asymmetric information activity can affect us in ways that we never imagined.
AMY GOODMAN: And, Emma Briant, you talked about when Facebook knew the level of documentation that Cambridge Analytica was taking from them.
EMMA BRIANT: Yeah.
AMY GOODMAN: I mean, Cambridge Analytica paid them, right?
EMMA BRIANT: Yes. I mean, they were providing the data to GSR, who then, you know, were paid by —
AMY GOODMAN: Explain what GSR is.
EMMA BRIANT: Sorry — the company set up by Kogan and Joseph Chancellor, their company that they were setting up to do both academic research but also to exploit the data for Cambridge Analytica’s purposes. So they were working with — on mapping that data onto the personality tests and giving that access to Cambridge Analytica, so that they could scale it up to profile people across the target states in America especially, but also all across America. They obtained way more than they ever expected, as Chris Wylie and Brittany have shown.
But I want to also ask: When did our governments know about what Cambridge Analytica and SCL were doing around the world and when they were starting to work in our elections? One of the issues is that these technologies have been partly developed by, you know, grants from our governments and that these were, you know, defense contractors, as we say. We have a responsibility for those companies and for ensuring that there’s reporting back on what they’re doing, and some kind of transparency.
As Karim was saying, that if you — you know, we’re in a state of global information warfare now. If you have a bomb that has been discovered that came from an American source and it’s in Yemen, then we can look at that bomb, and often there’s a label which declares that it’s an American bomb that has been bought, that has, you know, been used against civilians. But what about data? How do we know if our militaries develop technologies and the data that it has gathered on people, for instance, across the Middle East, the kind of data that Snowden revealed — how do we know when that is turning up in Yemen or when that is being utilized by an authoritarian regime against the human rights of its people or against us? How do we know that it’s not being manipulated by Russia, by Iran, by anybody who’s an enemy, by Saudi Arabia, for example, who SCL were also working with? We have no way of knowing, unless we open up this industry and hold these people properly accountable for what they’re doing.
AMY GOODMAN: SCL Defence was the parent company of Cambridge Analytica. Emma Briant, human rights researcher at Bard College, her upcoming book, Propaganda Machine: Inside Cambridge Analytica and the Digital Influence Industry. We’ll be back in less than 30 seconds.
[break]
AMY GOODMAN: This is Democracy Now! I’m Amy Goodman, as we continue our look at Cambridge Analytica, Facebook, and their roles in U.S. and other elections. I speak to the directors of The Great Hack, Jehane Noujaim and Karim Amer, as well as former Cambridge Analytica whistleblower Brittany Kaiser, who’s begun posting online a trove of documents detailing the operations of the now-defunct company. I asked propaganda researcher Emma Briant to talk about the significance of the documents.
EMMA BRIANT: I think the biggest reveal is going to be in the American campaigns around this, but I think you haven’t seen the half of it yet. This is the tip of the iceberg, as I’ve been saying.
My thing that I think that is most interesting of what’s been revealed so far is actually the Iran campaign, because, you know, this is a very complex issue, and it really is an exemplar of the kinds of conflicts of interest that I’m talking about, at a company that is, you know, set out to profit from the arms trade and from the expansion of war in that region and from the favoring of one side in a regional conflict, essentially, backed by American power, by the escalation of the conflict with Iran and, you know, by getting more contracts, of course, with the Gulf states, the Các Tiểu vương quốc Ả Rập and the Saudis. You know, and, of course, they were trying to put Trump in power, as well, to do that, and advancing John Bolton and the other hawks who have been trying to demand that sanctions — to keep sanctions and to get out of the Iran deal, which they have been arguing is a flawed deal.
And of course, SCL were involved in doing work in that region since 2013, including they were working before that on Iran for President Obama’s administration, which I’m going to be talking more about in the future. The issue is that there is a conflict of interest here. So you gain experience for one government, and then you’re going and working for others that maybe are not entirely aligned in their interests.
AMY GOODMAN: Although it was for the election of Donald Trump, of course, it happened during the Obama years.
EMMA BRIANT: Yes.
AMY GOODMAN: That’s when Cambridge Analytica really gained its strength in working with Facebook.
EMMA BRIANT: Yeah. And SCL’s major shareholder, Vincent Tchenguiz, of course, was involved in the early establishment of the company Black Cube and in some of its early funding, I believe. I don’t know how long they stayed in any kind of relationship with that firm. However, the firm Black Cube were also targeting Obama administration officials with a massive smear campaign, as has been revealed in the media. And, you know, this opposition to the Iran deal and the promotion of these kinds of, you know, really fearmongering advertising that Brittany is talking about is very disturbing, when this same company is also driving, you know, advertising for gun sales and things like that.
AMY GOODMAN: Wait. Explain what Black Cube is, which goes right to today’s headlines —
EMMA BRIANT: Exactly.
AMY GOODMAN: — because Harvey Weinstein, accused of raping I don’t know how many women at last count, also employed Black Cube, former Israeli intelligence folks, to go after —
EMMA BRIANT: Yes.
AMY GOODMAN: — the women who were accusing him, and even to try to deceive the reporters, like at The New York Times, to try to get them to write false stories.
EMMA BRIANT: Absolutely. I mean, this is an intelligence firm that was born, again, out of the “war on terror.” So, Israel’s war on terror, this time, produced an awful lot of people who had gone through conscription and developed really, you know, strong expertise in cyberoperations or on developing information warfare technologies, in general, intelligence gathering techniques. And Black Cube was formed by people who came out of the Israeli intelligence industries. And they all formed these companies, and this has become a huge industry, which is not really being properly regulated, as well, and properly governed, and seems to be rather out of control. And they have been also linked to Cambridge Analytica in the evidence to Parliament. So, I think the involvement of all of these companies is really disturbing, as well, in relation to the Iran deal.
We don’t know that Cambridge Analytica in any way were working with Black Cube in this, at this point in time. However, the fact is that all of this infrastructure has been created, which is not being properly tackled. And how they’re able to operate without anybody really understanding what’s going on is a major, major problem.
AMY GOODMAN: Brittany?
BRITTANY KAISER: Black Cube isn’t the only company you should be concerned about. The founder of Blackwater, or their CEO, Erik Prince, was also an investor in Cambridge Analytica. So he profits from arms sales around the world, and military contracts, and has been accused of causing the unnecessary death of civilians in very many different wartime situations. He was one of the investors in Cambridge Analytica and their new company, Emerdata. And so, I should be very concerned, and everyone should be very concerned, about the weaponization of our data by people that are actually experts in selling weapons. So, that’s one thing that I think needs to be in the public discussion, the difference between what is military, what is civilian, and how those things can be used for different purposes or not.
KARIM AMER: And I think what’s important to see —
AMY GOODMAN: Karim.
KARIM AMER: — is that, you know, in the clip you showed, Chris Wylie is talking about how Cambridge is a full-service propaganda machine. What does that really mean? You know, I would say that what’s happening is we’re getting insight into the network of the influence industry, the buying and selling of information and of people’s behavioral change. And it is a completely unregulated space.
And what’s very worrisome is that, as we’re seeing more and more, with what Emma’s talking about and what Brittany has shown, is this conveyor belt of military-grade information and research and expertise coming out of our defense work, that’s being paid for by our tax money, then going into the private sector and selling it to the highest bidder, with different special interests from around the world. So what you see in the files is, you know, an oil company buying an influence campaign in a country that it’s not from and having no — you know, no responsibility or anything to what it’s doing there. And what happens in that, the results of that research, where it gets handed over to, no one knows. Any contracts that then result in the change that happens on the political ground, no one tracks and sees.
So, this is what we’re very concerned about, is because you’re seeing that everything has become for sale. And if everything is for sale —
JEHANE NOUJAIM: Our elections are for sale.
KARIM AMER: Exactly. And so, how do we have any kind of integrity to the vote, when we’re living in such a condition?
JEHANE NOUJAIM: Democracy has been broken. And our first vote is happening in 28 days, and nothing has changed. No election laws have changed. Facebook’s a crime scene. No research, nothing has come out. We don’t understand it yet. This was why we felt so passionate about making this film, because it’s invisible. How do you make the invisible visible? And this is why Brittany is releasing these files, because unless we understand the tactics, which are currently being used again, right now, as we speak, same people involved, then we can’t change this.
AMY GOODMAN: Facebook’s a crime scene, Jehane. Elaborate on that.
JEHANE NOUJAIM: Absolutely. Facebook is where this has happened. Initially, we thought this was Cambridge Analytica — right? — and that Cambridge Analytica was the only bad player. But Facebook has allowed this to happen. And they have not revealed. They have the data. They understand what has happened, but they have not revealed that.
AMY GOODMAN: And they have profited off of it.
JEHANE NOUJAIM: And they have profited off of it.
KARIM AMER: Well, it’s not just that they’ve profited off it. I think what’s even more worrisome is that a lot of our technology companies, I would say, are incentivized now by the polarization of the American people. The more polarized, the more you spend time on the platform checking the endless feed, the more you’re hooked, the more you’re glued, the more their KPI at the end of the year, which says number of hours spent per user on platform, goes up. And as long as that’s the model, then everything is designed, from the way you interact with these devices to the way your news is sorted and fed to you, to keep you on, as hooked as possible, in this completely unregulated, unfiltered way — under the guise of freedom of speech when it’s selectively there for them to protect their interests further. And I think that’s very worrisome.
And we have to ask these technology companies: Would there be a Silicon Valley if the ideals of the open society were not in place? Would Silicon Valley be this refuge for the world’s engineers of the future to come reimagine what the future could look like, had there not been the foundations of an open society? There would not be. Yet the same people who are profiting off of these ideals protecting them feel no responsibility in their preservation. And that is what is so upsetting. That is what is so criminal. And that is why we cannot look to them for leadership on how to get out of this.
We have to look at the regulation. You know, if Facebook was fined $50 billion instead of five, I guarantee you we wouldn’t be having this conversation right now. It would have led to not only an incredible change within the company, but it would have been the signal to the entire industry. And there would have been innovation that would have been sponsored to come out of this problem. Like, we can use technology to fix this, as well. We just have to create the right incentivization plan. I have belief that the engineers of the future that are around can help us get out of this. But currently they are not the — they are not the decision makers, because these companies are not democratic whatsoever.
AMY GOODMAN: Emma Briant?
EMMA BRIANT: Could I? Yes, please. I just wanted to make a point about how important this is for ordinary Americans to understand the significance for their own lives, as well, because I think some people hear this, and they think, “Oh, tech, this is maybe quite abstract,” or, you know, they may feel that other issues are more important when it comes to election time. But I want to make the point that, actually, you know what? This subject is about all of those other issues.
This is about inequality and it being enabled. If you care about, you know, having a proper debate about all of the issues that are relevant to America right now, so, you know, do you care about the — you know, the horrifying state of American prison system, what’s being done to migrants right now, if you care about a minimum wage, if you care about the healthcare system, you care about the poverty, the homelessness on the streets, you care about American prosperity, you care about the environment and making sure that your country doesn’t turn into the environmental disaster that Australia is experiencing right now, then you have to care about this topic, because we can’t have an adequate debate, we cannot, like, know that we have a fair election system, until we understand that we are actually having a discussion, from American to American, from, you know, country to country, that isn’t being dominated by rich oil industries or defense industries and brutal leaders and so on.
So, I think that the issue is that Americans need to understand that this is an underlying issue that is stopping them being able to have the kinds of policies that would create for them a better society. It’s stopping their own ability to make change happen in the ways that they want it to happen.
AMY GOODMAN: Emma Briant —
EMMA BRIANT: It’s not an abstract issue. Thank you.
AMY GOODMAN: What would be the most effective form of regulation? I mean, we saw old Standard Oil broken up, these monopolies broken up.
EMMA BRIANT: Yeah.
AMY GOODMAN: Do you think that’s the starting point for companies like Facebook, like Google and others?
EMMA BRIANT: I think that’s a big part of it. I do think that Elizabeth Warren’s recommendations when it comes to that and antitrust and so on are really important. And we have legal precedents to follow on that kind of thing.
But I also think that we need an independent regulator for the tech industry and also a separate one for the influence industry. So, America has some regulation when it comes to lobbying. In the U.K., we have none. And quite often, you know, American companies will partner with a British company in order to be able to get around doing things, for instance. We have to make sure that different countries’ jurisdictions cannot be, you know, abused in order to make something happen that would be forbidden in another country.
We need to make sure that we’re also tackling how money is being channeled into these campaigns, because, actually, there’s an awful lot we could do that isn’t just about censoring or taking down content, but that actually is, you know, about making sure that the money isn’t being funneled in to the — to fund these actual campaigns. If we knew who was behind them, if we were able to show which companies were working on them and what other interests they might have, then I think this would really open up the system to better journalism, to better — you know, more accountability.
And the issue isn’t just about what’s happening on the platforms, although that is a big part of it. We have to think about, you know, the whole infrastructure.
AMY GOODMAN: Propaganda researcher Emma Briant; Jehane Noujaim and Karim Amer, the directors of The Great Hack, the Netflix documentary just shortlisted for an Academy Award; and Cambridge Analytica whistleblower Brittany Kaiser, author of Targeted: The Cambridge Analytica Whistleblower’s Inside Story of How Big Data, Trump, and Facebook Broke Democracy and How It Can Happen Again.