[00:00:03]

Hello from the Lincoln Project and welcome back. I'm Ron Suslow. In 2016, Russia orchestrated a widespread campaign to undermine American democracy. In this episode, I'm going to talk with renowned documentarian and director Alex Gibney about his new film, Agents of Chaos, uncovering Russia's sophisticated plans to undermine and discredit American elections. Alex is an Oscar, Emmy, and Peabody Award-winning documentary filmmaker. His films include Taxi to the Dark Side, Enron: The Smartest Guys in the Room, and Going Clear: Scientology and the Prison of Belief.

[00:00:43]

Also joining us for this conversation is technologist and cyber conflict researcher Camille François, who features prominently in the film, to discuss how Russia used cyber troll farms to create a massive misinformation campaign. Camille is the chief innovation officer at Graphika, where she leads the company's work to detect and mitigate disinformation, media manipulation, and harassment. Previously, she was the principal researcher at Jigsaw, which is a unit at Google that builds technology to address global security challenges and protect vulnerable users.

[00:01:16]

So, Alex, I want to start with you to just set the table for our listeners. What made you decide to make this documentary, and why did you choose to release it now?

[00:01:25]

Well, the decision to make it is contained in the film itself. I got a mysterious call early in 2017 from somebody I didn't even know.

[00:01:34]

I mean, I didn't even know who it was. And I went out to California to do an interview with this mystery person, who happened to be Glenn Simpson, who ran a company called Fusion GPS. That was the oppo research firm for Hillary Clinton that hired Christopher Steele, the author of the dossier.

[00:01:53]

And at the time, he wove a very intriguing story about Russian interference in the 2016 election that kind of made my head spin. But that was just the beginning of the process. A week later, Lowell Bergman called me and said he had designs on doing something like this and had the ability to connect with a number of independent Russian journalists who could also help us dig into this story. So we went to HBO, and HBO said, we'll get behind you and follow where you lead us.

[00:02:31]

And that's what started it off. In terms of why now: part of the reason is it took us a long time to figure out what was going on, and also to get a certain number of people to talk. But I also think that we're running now into the 2020 election, and the 2016 election is a case of past is prologue. There are certainly some significant differences, but a lot of important parallels between what happened in 2016 and what is happening now again in 2020.

[00:03:05]

So for that reason, we think it's vital to release this election-based documentary right now, on the eve of another presidential election. Camille, your background is in human rights.

[00:03:17]

Can you talk a little bit about why Russia's interference in the 2016 election is a human rights issue? That's a great question.

[00:03:26]

You know, it's fun for me, because I don't think it's often framed as something that has to do with human rights. But I looked into the issues of trolling and coordinated disinformation a few years ago because I was working with human rights activists. At that time, human rights activists, who often get hacked and get phished and get targeted by other types of cyber attack, were starting to say: we understand the phishing, we understand the normal types of cyber attack, but we're increasingly concerned with the coordinated trolling and the fake profiles and the disinformation and harassment campaigns.

[00:04:07]

And I think in that regard, they were very seriously ahead of the curve. When you think about who the first researchers and people who documented these troll farms were, it's not the national security establishment; it's researchers who are focused on human rights, and investigative journalists who are covering that space, including in Ukraine, who were the first ones to say: hey, look at this large-scale disinformation campaign that uses all these fake blogs, and they're all connected to one another.

[00:04:37]

And so I think here the human rights lens sort of got me to look into this issue perhaps a little bit ahead of the curve.

[00:04:47]

I think probably one of the most useful things to begin with is the title of the documentary itself, Agents of Chaos.

[00:04:52]

What was Russia's goal when they interfered with the election? Because I think there's this popular perception that they were actively trying to help or hurt Hillary. Did they start by trying to help Trump or Republicans win the election? Either of you can take that one. I think it's fair to say that they did not start by saying, we're going to get Trump elected. They started out in 2014 with a project focused on the US, and the goal of the project was chaos at scale.

[00:05:26]

Push the buttons that divide American society, find the most tense topics that you can highlight and press on, find the polarizing, difficult issues, and pour gasoline on that. If you think about it, 2014 is really a while ago, and they had two years to prep for what they did in 2016. And then they continued this approach: how are we going to further divide? How are we going to further inflame? What are the new divisive topics, and how can we continue pouring gasoline on that?

[00:06:04]

It's also important to understand that part of the audience for Russia's interference in the United States, and not just the United States, there are other countries around the world, particularly Ukraine, was also a domestic audience. I mean that in a couple of ways. First of all, a lot of this disinformation campaign was practiced on Russian dissidents themselves, to expand on the point from earlier. But also, tweaking the United States, or undermining the essential elements of American democracy, was something the Russian government believed would enhance and ennoble them in the eyes of a domestic audience.

[00:06:51]

In other words: Russia, while its economy is dwarfed by the United States', can say, look how much trouble we can cause for the world's greatest superpower. So there was a domestic audience for this, even in advance of the attempts in 2016 to sow discord, to create chaos. I do agree that that was really the intent. And over time, I think they settled on Trump as the best chaos agent, which is why I think that looking to understand any commonality of policy between Trump and Russia kind of misses the point.

[00:07:34]

The Russians were looking to sow chaos and to exacerbate divisions in American society in order to make their own system look better by comparison. You know, I think one of the most helpful things that you do in the film, in the very beginning, is to show how these tactics are used against the Russian people to begin with. That was illuminating, because this stuff wasn't new. They had been doing this in Russia for a very long time.

[00:08:07]

And I think to see the US election interference from Russia not as a brand-new experiment, but as an extension of tactics that were tried and true against the Russian people, was really helpful for me in understanding what was going on. But before we dive into how the misinformation campaign really worked: we had Anne Applebaum on the show a few weeks ago. You're both probably familiar with her. And we talked about how American exceptionalism can make us feel immune to authoritarianism, but we're clearly not.

[00:08:40]

And so I'd love to hear from both of you on this: what do you think it is about our current culture in the United States that made us so susceptible to the Russian attack? And there's one point you make in the film, I think, about how the term "the Russian playbook" is bandied about quite a bit, and that makes it sound like there is a very specific and singular strategy that the Russians employed.

[00:09:08]

And so I know this is a multipart question, but I'd love to hear your thoughts on both of those things. Why don't we start with you, Camille, and then Alex?

[00:09:14]

I'll do my best to complement what Anne was saying. I think the point about American exceptionalism is also one that applies to Silicon Valley. I used to work at Google when all of this started unfolding. Something that was really interesting in Silicon Valley is that they had totally failed to take in the lessons of how much this had happened everywhere else in the world. By 2016, there had been many well-documented cases of troll farms targeting dissidents and trying to interfere in elections.

[00:09:49]

And yet everybody was caught absolutely flat-footed, to the point that most major platforms didn't even have rules against this type of activity. So there's this notion that we forgot to look everywhere else in the world, and therefore we were oblivious to threats that were, honestly, quite straightforward to predict, right? It didn't come as a surprise for anybody who was looking a little bit more broadly. One of the things that I think is interesting sort of goes beyond social media into more mainstream media.

[00:10:25]

And I think that one of the things that happened, and you see it particularly in 2016, is that with the increasing polarization of American society, you see an economic model begin to emerge in which news outlets in particular begin to focus not on trying to talk to a broad middle, but on totally capturing certain core groups, liberal or conservative. And so you see these outlets, whether it be MSNBC on one side or Fox News on the other, not caring that they are not reaching everybody.

[00:11:06]

They just want the rabid following of certain core groups. And that became an economic model that was really pushing an agenda, which over time would create a kind of social mechanism where people were consuming news not as rational fact seekers, but as members of tribes who were responding emotionally to messages that were being piped to them. And I think it really became a kind of firestorm in the 2016 election.

[00:11:43]

Yeah, news almost became an information commodity as opposed to a source of objective information. That's correct. Yeah. And actually, this dovetails with a question I wanted to ask Camille a little bit later, but let's do it now, because I think these are part of the same piece, which is filter bubbles, and how individual filter bubbles almost were a predicate for the success of the misinformation campaign. So can you talk about the filter bubble for those who aren't familiar with that term, which Eli Pariser might have coined back in the day, and how that is essentially a digital mirror of exactly what Alex is talking about with the cable news networks?

[00:12:22]

Generally, what people mean when they talk about the filter bubble online is this tendency people have to increasingly consume information that is aligned with their own worldview. Some of this is of their own volition: they select the groups they most agree with. And some of it is algorithmic: now that you only read content with a certain slant to it, you will get recommended more of this content, and so it can sort of enhance groupthink.
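The algorithmic half of that feedback loop, engagement begets more of the same, can be sketched as a toy simulation. This is purely illustrative: the recommender, field names, and topics below are invented for the sketch and do not describe any real platform's system.

```python
import random
from collections import defaultdict

def recommend(click_history, catalog, k=3):
    """Naive engagement-based recommender: the more a user has clicked
    a topic, the more items from that topic are recommended back."""
    affinity = defaultdict(float)
    for topic in click_history:
        affinity[topic] += 1.0
    # Rank every catalog item by the user's affinity for its topic,
    # with a tiny random jitter to break ties among unseen topics.
    ranked = sorted(
        catalog,
        key=lambda item: affinity[item["topic"]] + random.random() * 0.01,
        reverse=True,
    )
    return ranked[:k]

# Simulate the loop: each round the user clicks whatever was recommended,
# which narrows the next round's recommendations (the "bubble").
catalog = [{"id": i, "topic": t} for i, t in enumerate(
    ["politics-left", "politics-right", "sports", "cats"] * 5)]
clicks = ["cats"]  # the user starts with a single innocuous interest
for _ in range(10):
    for item in recommend(clicks, catalog):
        clicks.append(item["topic"])

# After a few rounds, the history is dominated by the starting interest.
print(clicks.count("cats") / len(clicks))
```

The point of the sketch is the dynamic described above: nothing in the loop is malicious, yet a single early preference ends up crowding out everything else, which is exactly the terrain an actor bent on polarizing can then target.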

[00:12:58]

That's the idea behind this term, the filter bubble. Now, why does this matter when you think about disinformation? Because it explains how ideological communities gather online. And this is something that, in the case we're talking about, the Russians really understood. And so when they created these fake personas, those fake accounts, they really managed to create influencers for these micro-communities. So you had, for instance, Jenna Abrams, who was the troll who was speaking to a specific community of Republican women.

[00:13:40]

Jenna was supposedly an American woman from Main Street, USA, in her mid-30s. And she was very carefully studying the type of content that her audience was sharing, and blending a few jokes about Kim Kardashian with a very inflammatory set of posts on American politics. And so the more we create our own little bubbles online, which is also something that's normal, and the more algorithmic recommendations continue to get people trapped in their bubbles, the easier it is for actors whose main intent is to polarize and divide to come and target audiences.

[00:14:25]

And I think we've seen, and other folks have talked about, that if you just create a blank profile on YouTube, for example, a brand-new one, and then start watching cat videos, there are very few degrees of separation between the very first cat video and some extremist propaganda that YouTube will recommend to you, just based on an increasing stickiness of the content. I think that's an important part of the conversation in understanding how these misinformation campaigns worked in the first place and can still work now.

[00:14:57]

Alex, I have another question for you about another misunderstood term that has been used, which is the deep state.

[00:15:05]

And, Alex, this one is definitely for you.

[00:15:12]

I'd just appreciate you talking for a few minutes about the merits, or lack thereof, of this argument that there is a group of people working inside the government to take down Donald Trump, versus what we know are the professional patriots who are trying to guard against foreign interference, regardless of who it might help or hurt in any given context.

[00:15:38]

Right. So, I mean, I wouldn't want listeners to think that I'm utterly uncritical of people who work in the intelligence agencies. People who have watched my films in the past

[00:15:51]

know that, particularly when it comes to the CIA, I've sometimes been witheringly critical.

[00:15:56]

But I think that when people refer to the deep state, they're trying to create a sense that there are people who are somehow set apart from the rest of us, who are mysterious puppeteers guiding the rest of us in a certain direction. And particularly in the case of Donald Trump, it became a kind of trope, particularly on right-wing media, that the deep state was engaged in a conspiracy from the very beginning to undermine his candidacy.

[00:16:32]

It's a weird concept, because actually what I learned when I was doing the story was, number one, that the Obama administration, and particularly a lot of intelligence professionals, ended up being so diffident about letting the American public know about Russian interference that it ended up giving a lot of cover to Donald Trump's campaign. Everyone assumed that Hillary was going to win, and so aspects of the "deep state" were being extraordinarily quiet about what they knew about what Russia was doing, when Americans deserved to know more about it.

[00:17:12]

But there's something else that's funny that comes up in the documentary.

[00:17:16]

We interviewed Andrew McCabe, who was number two at the FBI during this period, and he said nobody did more to enhance the campaign of Donald Trump than the FBI, which should put to rest any notions about the deep state.

[00:17:34]

And what he's talking about is James Comey's decision to publicly reopen the investigation into Hillary's emails very late in the fall, just before the election.

[00:17:45]

And so I think that's where you get into this discussion of agents of chaos and just where the chaos was coming from. And it was pretty multifaceted in this presidential campaign in 2016. So let's talk about partisanship a little bit more, because this clearly wasn't an issue that would impact voters of only one party. If you could speak directly to Republican voters right now, what would you say to them about the way this entire topic has been framed to date?

[00:18:20]

And obviously this episode will come out on the day the film is released, and everyone should go watch it, because I think it's the first real cohesive look at what happened in 2016 and what is happening now. But I'd love for you to spend a couple of minutes talking about the problem of partisanship in making the film, and in trying to communicate about what happened.

[00:18:41]

Sure. I mean, I think the Russian attack needs to be understood as an attack on the idea of democracy, and also an attack on American sovereignty: the idea that another country would interfere in our ability to determine our own future. And that, interestingly enough, was the pitch that the Obama administration made to Congress, to try to get Congress, in a very polarized time, to come out with a joint statement by Republicans and Democrats letting the American people know, based on the information the intelligence agencies were providing, that, as a bipartisan matter, Russia was trying to interfere in our election.

[00:19:28]

And that should be something that all Americans should know about.

[00:19:31]

But in this case, Mitch McConnell and Paul Ryan refused to do that. And why did they refuse? Because they felt that Russian interference was benefiting their candidate. So they reacted in an extremely partisan way, which ultimately, in my view, was unpatriotic. And it gets to this whole idea that we had to focus on throughout this documentary: trying to disentangle political motives from a kind of larger national sense of purpose. It was the intent of Russian interference to exacerbate partisanship and to increase the sense of tribalism, so that a nation divided against itself would not stand.

[00:20:25]

There's a line, I think from the very early parts of the film, that I thought was so well put, which is that this is really about making sure that the American people are the ones who get to decide who's going to run the country, no matter who that is.

[00:20:39]

That's Andrew Weissmann who was talking.

[00:20:41]

Yeah, just a brilliant way of putting it. In a way, that message really hasn't penetrated on this topic, at least not enough anyway. So I want to turn now to the IRA, the Internet Research Agency, the troll farm that produced and executed the misinformation campaign in 2016.

[00:21:04]

Can you lay out for our listeners how the IRA created their personas and how they executed this misinformation campaign?

[00:21:13]

It was simple. I think today we look back at the operation and think it's simple because now we've seen more sophisticated operations, but it's good to keep in mind that it was simple and it worked back then. Right? So a classic example: they paid for the ads on Facebook in Russian currency. Today it's easy to look back and say, well, now that's an interesting tell, right? Nobody would want to do that. But it worked.

[00:21:41]

Back then, Facebook accepted the currency and ran the Russian ads. Of course, it wouldn't fly in 2020, but it was simple back then because it was effective to be simple. To sort of recap the arc of the IRA targeting the US: it starts in 2013. They set up a project that is going to apply all the lessons they've learned targeting their own audiences and targeting Ukraine, and turn that against the US. It's not just a few social media posts.

[00:22:18]

It's a bit more complicated than that. They do send people on the ground. They send a few IRA employees to go and study the Americans: to see what's happening, what are their hot buttons, what do they react to. And they also start making ties with real people in the US on the ground. So they start making ties with activist groups, whom they continue talking to over Messenger, and then they help them coordinate rallies.

[00:22:51]

So throughout 2014 and 2015, they put all of this infrastructure in place and they start a series of experiments. One of them is called Columbian Chemicals: they try to see if you can create a panic on the ground about a chemical incident that never happened. And it kind of works. Eventually, the FBI looks into it and says, no, there was not a real incident; it must have been a hoax. But at no point back then does anyone say, well, this is the Russians, right?

[00:23:28]

It was a very odd idea back then. And so all of this series of experiments, trying to understand what you can do using social media to get people to act in reality, culminates in 2016 with the part of the campaign that we know: the fake personas, the ads, the divisive posts, the messages to bring out people on both sides of the street who disagree with one another. And then, of course, after 2016, they continue.

[00:24:01]

They don't just stop there. They continue iterating, right? More divisive posts, more polarizing issues. And then they change their strategy to respond to what the platforms are doing, to respond to how the media is reacting to Russian interference. And, you know, I won't bother you with the full details of how they evolved, but they evolved in 2018, changed again in 2019, and tried new techniques in 2020. I've seen them do things that I hadn't seen them do before.

[00:24:43]

So, for instance, using AI to create fake profile pictures of people who never existed. That is something we saw in their latest campaign in August. And so it is a simple operation, but a really dynamic one that also learns from the responses it generates.
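Camille's earlier example of ads paid for in Russian currency is a good illustration of a "tell" that a very simple consistency check can flag. A minimal sketch, assuming invented field names and a hand-written rule list; no real ad platform exposes exactly this interface:

```python
# Toy screening heuristic: flag political ads whose payment currency is
# mismatched with the country they target. All field names are invented.
SUSPICIOUS_PAIRS = [
    ("RUB", "US"),  # rubles paying for US-targeted political ads
    ("RUB", "UA"),  # rubles paying for Ukraine-targeted political ads
]

def needs_review(ad: dict) -> bool:
    """Return True if the ad should be escalated to a human reviewer."""
    if not ad.get("is_political"):
        return False
    pair = (ad.get("payment_currency"), ad.get("target_country"))
    return pair in SUSPICIOUS_PAIRS

ads = [
    {"is_political": True, "payment_currency": "RUB", "target_country": "US"},
    {"is_political": True, "payment_currency": "USD", "target_country": "US"},
    {"is_political": False, "payment_currency": "RUB", "target_country": "US"},
]
print([needs_review(ad) for ad in ads])  # [True, False, False]
```

As the conversation notes, a static rule like this only catches yesterday's mistake; by 2020 the operators had stopped leaving such obvious fingerprints, which is why detection has to keep iterating along with them.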

[00:25:02]

Oh, my goodness. I'm going to come back to you and ask about the AI-driven personas. But while we're here, Alex, this is a really important point that you make in the film, and I'd love for you to speak to it. Part of what's so striking about how these misinformation campaigns are built is that they're built on each other. Right? They leave these breadcrumbs. They make it look like multiple people were corroborating the narrative.

[00:25:24]

But how much of the IRA's work was about injecting new ideas and new content into our social media sites, our filter bubbles? And how much was it about highlighting what already existed? And I'd love to hear your thoughts on exactly how vulnerable we already were to this type of attack.

[00:25:46]

You know, the genius of the attack was that it wasn't pushing anything. It wasn't propaganda in the classic sense. I mean, I can remember riding the Trans-Siberian Railway in 1972. Oh, wow. And propaganda being piped into our berths in English: the capitalist system is dead, you are doomed, we will crush you.

[00:26:15]

That was propaganda as brute-force brainwashing. Right? You know, just listen to more of this. Of course, after the first day we couldn't get any food on the train because they ran out, and it was a ten-day trip, so clearly there was some cognitive dissonance there. But in any event, the campaign by the Russian trolls in 2016 was entirely to exacerbate divisions and push hot buttons that existed here.

[00:26:48]

So that was the genius of the campaign. It wasn't to inject something new, to try to convince Americans to stop sanctions, for example. No, it was all about: what are the hottest-button topics, like race or immigration, and how can we inflame both sides of those issues so that people ultimately become so angry and so full of vitriol that they become disgusted by the entire thing? And in some cases, the most effective campaigns...

[00:27:25]

We don't know for sure exactly what contributed to this, but I would suspect that if one were to guess at the most effective outcome, it was to encourage certain voters to stay home.

[00:27:40]

You know, for example, a lot of African-American voters were targeted. And we know that in Detroit, for example, 75,000 people who had turned out for Barack Obama in 2012 did not turn out in 2016, and Hillary Clinton lost Michigan, the entire state, by only 10,000 votes.

[00:28:05]

But that, in a sense, makes it too simple, because it's really more about creating a sense of dysfunction, a lack of trust in truth-based institutions and in facts themselves, and encouraging people to rely increasingly on a sense of tribal affinity rather than doing rigorous analysis or holding governments and politicians to account. Can you give us an example of... oh, go ahead.

[00:28:41]

Yeah, I was going to say, I think Alex is really right in pointing to trust as the main thing the Russians are trying to undermine: trust in the system. And you can see that because, when they talk about their own campaigns, they tend to also exaggerate their impact. They say: we're so good at interference that you can no longer trust your own elections. We've been so successful that now you have to doubt your own process and your own results. We're great at undermining the system.

[00:29:15]

Yeah. You talked about exacerbating existing divisions. Can you give our listeners an example of what that looks like? And I would note that they aren't really concerned with the consistency of the message. As a matter of fact, the inconsistent messages are why it's so effective. I'd love for you to offer folks an example of what this looks like. I can give a very specific example.

[00:29:38]

So there was one moment where the IRA targeted two opposite groups in Texas and encouraged them both to go to the same street on the same day, to do a protest and a counter-protest. There are also indications that when they communicated with both of these groups, they encouraged them to bring arms. So on the one hand, you had a group that was very anti-immigrant, and on the other hand, you had the group that they encouraged to go and confront that group.

[00:30:15]

I think sometimes it's easy to believe that we're talking in metaphors when we say they inflame people against one another. But sometimes we're not. We're literally saying they targeted one street in Houston, Texas, with two opposing groups, told them to come down to the street the same day, at the same time, and sort of watched to see how bad it could get.

[00:30:38]

And so these pieces of information, these memes that they would propagate, have real-world implications. They take the activity from the digital world to the real world. That was always the goal.

[00:30:54]

And it's interesting, because you can see them try for this goal in different ways. In 2014 and 2015, there's a series of very bizarre experiments where, for instance, in New York, they try to see if they can get people to come to specific places at specific times by telling them they're going to get free hot dogs.

[00:31:15]

They've always tried to see how you can use social media to get real things to happen in the street.

[00:31:22]

We can't leave this topic without offering voters some things that they can do to recognize these types of campaigns. So I'm sure you've answered this question quite a lot. But what can voters do to guard themselves against these types of campaigns or even to recognize them in the first place?

[00:31:38]

It's a difficult question, because a lot of this is actually not on individuals. I think what you can do is watch Alex's film and think about how much all of this feeds on the polarization, feeds on sharing things too quickly, feeds on inflamed passions on social media.

[00:32:01]

And so you can try to help bring some counter to the chaos. But I don't think we want to live in a world where everybody is looking at every message on social media trying to figure out whether it's a Russian troll.

[00:32:17]

At the end of the day, that isn't where we want to be. If everybody starts to doubt real activity, real activism online, real grassroots movements, accusing others of being Russian trolls, that is exactly what these campaigns were designed for. So acknowledging that this has happened, acknowledging how much it plays on our own divisions and our own polarization, while not switching to a mode where you suddenly distrust everything online, is really where we need to be.

[00:32:50]

Yeah. You know, I hate to just keep coming back to partisanship, but, Alex, it feels to me like the first thing that has to happen before we can recognize the depth of the wound and the severity of the vulnerability is to set aside partisanship, which seems completely intractable at this point, because we've now moved from a place of extreme polarization to increasing radicalization. And this is a big part of the conversation, I think.

[00:33:22]

But what do you see as constructive steps? For individuals, communities, or governmental bodies, what needs to happen to protect against this kind of interference?

[00:33:36]

I mean, it's a really difficult and complicated question because in some ways it requires all of us to understand that we are motivated by unconscious biases as part of our everyday behavior and that people are making money off of manipulating those biases.

[00:33:57]

That's kind of step one, which sounds pretty difficult. Yeah, but, you know, it's something I learned from doing.

[00:34:05]

You mentioned earlier, when you were introducing me, that I had done a film called Going Clear.

[00:34:11]

And the subtitle of Going Clear, which is about the Church of Scientology, a cult, is "the Prison of Belief." And the prison of belief is basically the idea that once you decide that a certain group is right, then you're in a prison that is open to the outside world, but you never leave it, because you're so comfortable there that you won't question anything that has to do with the beliefs that make you comfortable. And also, to the extent

[00:34:44]

that we're engaged by economic mechanisms that are really tickling our sense of anger and vitriol, and I would add that in this case Donald Trump is a master at doing that, but it goes way beyond Trump and goes to what the Russians were doing, you kind of need increasing levels of vitriol.

[00:35:10]

It's like a junkie who needs more and more potent fixes in order to get by during the day. And it prohibits you from reckoning in a smart and thoughtful way with both the people around you and, particularly, with your government. So in that sense, partisanship is a tricky thing, because of course there are elements of partisanship which are important and valuable: informed opinion.

[00:35:43]

But once that is exacerbated to the level where you're locked in your own prison of belief, then you're in a world that's beyond fact and beyond any kind of observable truth. And that's where you're allowing yourself to be manipulated by people who only care about one thing, and that's power. Which is exactly where Russia wants us to go, because that's making us more like them politically. Correct.

[00:36:15]

And I think, going back to something that you said earlier, and maybe referring also to Anne Applebaum in terms of American exceptionalism, I think we're used to thinking of ourselves in relation to Russia as: there's us, and we're the good guys, and there's Russia, and they're the bad guys.

[00:36:32]

It's probably more useful to think of this as a continuum of, you know, levels of authoritarian behavior.

[00:36:41]

And one of the things we do in the film is to show protests in Russia that are being brutally put down by federal troops who are unmarked. And suddenly you see images toward the end of the film where exactly the same thing is happening in this country: federal troops, unmarked, putting down protesters very violently.

[00:37:04]

And you have to ask yourself, well, is that so very different? And so it becomes a sense of understanding how the United States and all of us as citizens fit within a larger context of justice, a sense of democracy and the rule of law. And it seems to me also that we're not talking about what Celeste Wallander, I think she was the former National Security Council senior director. Yes, Russia expert. Exactly. Called treason, because they made this look partisan.

[00:37:43]

And it comes back to the same problem: when we aren't able to recognize treason, something as grave as cooperation with a foreign power to undermine the stability of our system, it feels like we're in a very dire place as a democracy. That's right. And that's not partisan. No, that's not partisan.

[00:38:08]

That means that you've moved into a different realm, and that's called soft authoritarianism, where there are no rules and the only thing that matters is power. That is the road to tyranny. Let's talk briefly about the Senate intelligence report.

[00:38:27]

There have been a series of reports out of the Senate Intelligence Committee, and one of the things we've learned recently is that the Trump campaign, through Paul Manafort, shared internal polling information with Russian intelligence assets. Camille,

[00:38:42]

How could that information have shaped the IRA's work? And also, I should ask you to talk a little bit about the work you did for the Senate Intelligence Committee, because I understand you were tapped by them to do the investigation into this interference.

[00:38:57]

That's right. In 2017, the Senate Select Intelligence Committee did something quite remarkable that hadn't been done before. They turned to the large social media platforms. They went to Twitter, to Facebook, and to Google, and they said: produce the data for what the Russians did on your products and on your services. We want a full, comprehensive look at everything they did, and we want to understand this campaign in full. And so they got all of this on a hard drive, and then they brought in a few researchers, including my team, and we worked with our colleagues at the Oxford Internet Institute.

[00:39:41]

They gave us the hard drive and said, tell us what's going on. Wow. It was quite something. Yeah, that was a really interesting moment of them really pushing a bipartisan effort to shed some light on what happened. What happened next was also really good, because not only did they accept our report, they said: great, if that is your independent, dispassionate assessment of what happened, now we're going to tell the public.

[00:40:15]

I do believe that they shaped the way Silicon Valley then responded to this information. They forced the issue into the public eye, and they set a standard: in order to protect people against manipulation, the first thing you need to do is to clearly tell and show what these types of campaigns look like. They enabled us to publish our long report, and they set a standard that when you find an information operation targeting people, targeting elections, the right thing to do is simply to tell the public and to show: this is what it looks like.

[00:40:57]

So that's what we did with the Senate. They continued investigating, they continued publishing about it. And I'm really happy to see that the transparency standard they're setting has continued to progress, and we see other institutions now being much more transparent about these issues than they've ever been in the past.

[00:41:20]

So I want to come back to that point in just a minute, about the role that big tech has to play here, not just going forward, both at an institutional level and in how users engage with these platforms. But first, Alex, I want to talk about this report as a whole, and I have two questions I'd like you to speak to. The first is: when they put out the report, Marco Rubio, who was then the chairman of the Intelligence Committee, said, we found no evidence of collusion.

[00:41:50]

And this word, collusion, is very problematic in this conversation. So can you talk about this concept of collusion and how we should think about the outcome of these investigations if there wasn't, quote unquote, collusion?

[00:42:07]

Yeah, I think that term is problematic. It's an imprecise term, collusion, because it has no legal force or definition, but it does suggest some kind of working together, in this case between the Trump campaign and Russia. Well, I think that at one point in the film we call it cartoon collusion. And Timothy Snyder says: don't use the term collusion, use the term seduction, by which he means a series of winks and nods, nudges, and in some ways a kind of call and response.

[00:42:49]

The case of Paul Manafort, which you mentioned just a bit ago, is instructive. I mean, we now know, and the Mueller report had a lot of this information already, but the Senate Intelligence Committee confirmed it, it's in our film also, but also took it one step further. We now know that Paul Manafort was literally working with an agent of Russian intelligence to provide polling data on swing states to Russian intelligence. And he was the head of the Trump campaign.

[00:43:24]

Now, if that's not collusion, I don't know what collusion is.

[00:43:31]

Now, he was not doing it because he was literally a Russian spy, but because he wanted to get out of a massive debt to an oligarch named Oleg Deripaska. And Paul Manafort, who is in many ways the inventor of the modern favor factory, knew that if he could put himself in a position of influence, he could use that influence for monetary gain. And in this case, it meant cancelling a debt, which was cancelled. So it's corrupt. It's not like a John le Carré novel, but it's clearly a kind of cross talk between a political campaign in this country and Russian intelligence.

[00:44:18]

So how is that not collusion?

[00:44:21]

Yeah, it's a great question. And the follow-up to that whole question is: these reports about Russian election interference never would have seen the light of day unless Mitch McConnell wanted them to. And so I'm left scratching my head, and I'm wondering if you have any idea why he might have done that and what it means for Trump.

[00:44:47]

I'm not sure I can answer that. It is interesting and intriguing that this report came out, because, you know, talking about partisanship, what's interesting about this report is that it comes out of a Senate that is majority Republican. And this was a report that was bipartisan in nature, and they had to agree on the results and the findings. And so that makes the report quite potent and powerful, which is important.

[00:45:18]

And it was a good moment, I think, in terms of fact-based investigation.

[00:45:28]

I'm not sure I'd ever want to try to understand Mitch McConnell in any deeper sense, except that, you know, he is one of those people, I think, who is.

[00:45:44]

Not interested in principle, right, only interested in the pursuit of power. Exactly, exactly. Which is why it feels like such a mystery, because this thing easily could have remained buried, and yet it's out. And Marco Rubio criticized it even after it passed his committee. And still, we're not having a big enough, inclusive enough conversation about the findings in the report. But before we go, let's look at the bigger Russia picture. And I do want to come back to the big tech role in what happens next.

[00:46:21]

But toward the end of the film, I was left deeply troubled, even more so than I already was, because we know that the misinformation campaign was not the only thing that Russia had cooking. Right. There were lots of other things. And I think you mentioned this earlier, the Russia playbook, as it's called. We know that the misinformation campaign and hacking the DNC emails were just a few of the tactics.

[00:46:50]

But what else were they prepared to do if the disinformation campaign didn't work?

[00:46:57]

I think it's a little less systematic than that. I think that what we've seen them do is take a throw-all-the-spaghetti-at-the-wall-and-see-what-sticks approach. You have multiple different types of institutions, all pursuing their own different tactics and their own objectives in sort of broad strategic alignment. You have a lot of people ultimately trying to impress the same boss, sort of outbidding one another in creative approaches. We talk about the IRA a lot because they've captured our imagination so much.

[00:47:37]

But there were other Russian units who were creating fake profiles to target Americans, and the Russian military intelligence, the GRU, also created fake journalists who pitched stories to news outlets. I think there's really a sort of competition between many different actors to outbid one another in creative strategies to ultimately impress their boss.

[00:48:06]

I think Camille is on to something important here, which is that we mustn't overemphasize the coordinated and world-beating power of this Russian interference. In some ways it was very inefficient, chaotic and decentralized. And in one case, we pretty strongly believe that the GRU, Russian military intelligence, and the SVR, essentially, you know, Russia's CIA, both penetrated the DNC through different doors and may not have known that the other was there, which doesn't sound like the most efficient way of going about it.

[00:48:47]

Unless, as Camille suggests, you're both trying to please the same boss and you're in competition to do so.

[00:48:56]

But getting back to the other point you made, one element of this, which we should bear in mind on the eve of our own election in 2020, is a key component of the cyber campaign. And that's how I see this sort of broader Russian interference: in cyber terms. You know, the social disinformation, the literal hacking and exfiltration of data and weaponizing it, but also intrusions.

[00:49:27]

Into American electoral systems. And they did that in 2016, we pretty strongly believe, not to literally swing certain states or districts one way or the other, because, you know, for one, the Russians were warned rather.

[00:49:47]

Intensely by Obama that if they were to do that, there would be hell to pay.

[00:49:53]

In fact, he allegedly said that we would destroy their economy. But we also think that they did it because it was a way of harmonizing. Again, this gets back to the kind of collusion where Trump says something and the Russians harmonize. Trump calls in public, he says: Russia, if you're out there, you know, find me those missing Hillary emails.

[00:50:17]

Lo and behold, a few hours later, the Russians are at work doing just that. And Trump also, during 2016, was trying to insinuate that our election systems were rigged: if he lost, it meant things were rigged. And so the incursions into all 50 states by the Russians in 2016 were not meant to swing the vote one way or another, but, if everyone expected Hillary to win, to cast doubt on the results, so that once again, you know, going back to what Camille and I were discussing earlier, they would undermine trust in the system.

[00:51:07]

So that's why all Americans, whether you're Republican or Democrat or independent, should be very much in favor of as much transparency as possible when it comes to being able to cast and count votes. Alex, I have to ask a follow-up question to that, because you set it up so perfectly. When you talk about the intent being to undermine trust in the system, it's just so obvious that that's exactly what Donald Trump is trying to do right now, in advance of the election.

[00:51:45]

And we know what a traditional attack feels like, right?

[00:51:49]

We have 9/11 as a reference, for example. We know the markers of those attacks, but we don't, I think, have the same kind of markers for what it looks like when our election system fails or when it's being actively undermined. So how are you thinking about how prepared we are to handle that in 2020? And how concerned do you think we should be?

[00:52:13]

We should be concerned, but what we should keep our eye on is the importance of having faith in our political and electoral institutions, and not permitting people like Donald Trump to undermine that trust, because that is in a way what makes him an agent of chaos. Which is why I believe that, you know, the Russians tend to favor Donald Trump, again, not because of any policy implications, but because he is an agent of chaos and he wants to erode trust in democracy.

[00:52:54]

And that, I think, is something we should resist at all costs. OK, now back to the tech giants question. We've talked about filter bubbles.

[00:53:05]

We've talked about how it can be difficult at an individual level to resist or to recognize these campaigns, and maybe there isn't a whole lot individual people can do. How much of our progress on cybersecurity and our defense against these kinds of intrusions and media manipulation rests on the shoulders of the tech giants? And what do we need them to do? It's a great question. Not everything rests on the shoulders of the tech giants, but a lot does.

[00:53:42]

And it's a fair assessment to say that in 2016 they really were caught flat-footed. They had no policies against this type of activity. They had no system to prioritize and take action. There weren't really people whose full-time job it was to go and find this type of activity and detect it before it could hurt. The good news is we've now come a long way. We've seen Facebook and Twitter and Google and Reddit and Pinterest and TikTok and a whole lot of platforms create rules, terms of service, that address this type of activity.

[00:54:22]

We've seen them come out and say: we will not allow this type of activity on our platforms. We've seen them create specific teams of people whose job it is to go and detect it. We've seen them communicate more transparently on what it looks like when they do find a new campaign, whether by Russia or by China or by Iran or whatnot. And so I think that in those three short years from 2017, which is really the first time that they were confronting foreign entities, to 2020, we've put a lot of stronger foundations in place. I think there's more to do.

[00:55:01]

I think that there are more resources that should be dedicated to this problem. But I think we're definitely in a better spot now than we were in 2016.

[00:55:13]

So you would characterize what we've done so far as significant progress? Yes. To be fair, we were so behind that it is significant progress. I will, however, note for the sake of intellectual honesty that we were nowhere to be found in 2016.

[00:55:38]

Well, that raises the secondary question, I think, which is: how concerned should we be that our primary defense against these intrusions, against this kind of manipulation, which hits at the core of our democracy, the core of our system, now seems to be private corporations?

[00:56:03]

Yeah, I don't think that's exactly right, because the Silicon Valley giants have leveled up, and now they're taking this issue seriously. But when you look at all the people who work in this field, you see that governments often say, you know, hey, we found this campaign. We think it's suspicious. We want you to investigate this. This is what happened in August, when US law enforcement referred a website that they thought was Russian to the social media giants for them to investigate.

[00:56:35]

We see researchers all across the board independently working on and covering these issues. We see investigative journalists finding these operations in all corners of the universe. So we needed the platforms to step up. We need them to continue investing. We need them to continue doubling down on transparency. But there are a lot of people who now participate in addressing this issue. Thank you to Alex and Camille for being on today. And thanks to all of you for listening.

[00:57:11]

Part one of Agents of Chaos premieres tonight, September twenty-third, on HBO at 9:00 p.m. Eastern Time. Part two debuts tomorrow night at the same time. You can find more information about our movement at lincolnproject.us. If you have advice or questions about the podcast, you can always reach us at podcast@lincolnproject.us. If you haven't yet, please make sure to subscribe, rate, and review the show wherever you get your podcasts. For the Lincoln Project,

[00:57:41]

I'm Ron Steslow. I'll see you in the next episode.