[00:00:06]

Welcome to the Making Sense podcast, this is Sam Harris. Just a note to say that if you're hearing this, you're not currently on our subscriber feed and will only be hearing partial episodes of the podcast. If you'd like access to full episodes, you'll need to subscribe at samharris.org. There you'll find our private RSS feed to add to your favorite podcast player, along with other subscriber-only content. And as always, I never want money to be the reason why someone can't listen to the podcast.

[00:00:34]

So if you can't afford a subscription, there's an option at samharris.org to request a free account, and we grant 100 percent of those requests. No questions asked. Welcome to the Making Sense podcast. This is Sam Harris. OK, no housekeeping today. Today, I'm speaking with Nina Schick. Nina is an author and broadcaster who specializes in how technology and artificial intelligence are reshaping society. She has advised global leaders in many countries, including Joe Biden, and she's a regular contributor to Bloomberg, Sky News, CNN, and the BBC.

[00:01:17]

Nina speaks seven languages and holds degrees from Cambridge University and University College London. And her new book is Deepfakes, which explores the terrain we're about to discuss. We talk about the epidemic of misinformation and disinformation in our society now and the coming problem of deepfakes, which is, when you imagine it in detail, fairly alarming. We get into the history of Russian active measures against the West, the weaponization of the migrant crisis in Europe, Russian targeting of the African-American community, Trump and the rise of political cynicism,

[00:01:57]

QAnon, and the prospect of violence surrounding the presidential election, and other topics. Anyway, this is all scary stuff, but Nina is a great guide through this wilderness. And now I bring you Nina Schick. I am here with Nina Schick. Nina, thank you for joining me. Thanks for having me, Sam. We have a lot to talk about. You have a very interesting background, which I think suggests many common interests and kind of overlapping life trajectories.

[00:02:36]

I don't think we're going to really get into that, because you have raised so many urgent matters in your recent book that we need to talk about. But to get started here, what is your background, personally, but also just what you're focusing on these days that gives you an expertise on the topics we're going to talk about?

[00:02:55]

Well, it's a really interesting and crazy story, one that could only happen in the 21st century. I'm half German and I'm half Nepalese.

[00:03:04]

My father was a German criminal defense lawyer who in the 70s decided, you know, he was going to seek spirituality and travel east, and took his car and a few books and did that big journey that a lot of young people did back in the 70s, through Afghanistan, India, and then ended up in Nepal, which at this time was still this hermetic kingdom, fell in love with it, and met my mother there briefly after a decade or so.

[00:03:31]

And basically my mother came from this totally different universe.

[00:03:35]

She grew up in Nepal as a member of this community in a Himalayan tribe, had no running water or electricity or shoes when she was growing up. And because she met my father, you know, they fell in love and they kind of decided to have us,

[00:03:49]

my brother and myself. And I grew up in Kathmandu in the 80s and the 90s.

[00:03:53]

And then eventually I came to the UK to go to university and I went to Cambridge and UCL.

[00:04:00]

And my kind of discipline is really in history and politics.

[00:04:03]

I've always been fascinated by history and politics, and especially at this time when the geopolitical sands seemed to be shifting in such a dramatic way.

[00:04:12]

So my career over the last 10 years has really been working at the heart of Westminster as a policy analyst, a journalist and an advisor on some of the key geopolitical shifts around the European Union.

[00:04:24]

So this includes, you know, what happened with Russia and the invasion of Ukraine in 2014, and subsequently the EU's migrant crisis in 2015.

[00:04:35]

Then obviously, I was very tied into the work here in the UK around Brexit. I was helping to advise the government on that in 2016.

[00:04:43]

Then, of course, the election of Trump in 2016.

[00:04:46]

Then I went on to advise Emmanuel Macron's campaign, which was also interestingly hacked by the Russians, and finally I got to a point in 2018 where I was working with the former NATO secretary general, and he convened a group of global leaders, which included Joe Biden, and he wanted to look at how the 2020 election might be impacted by what we had seen in 2016 and how new kinds of threats were emerging.

[00:05:16]

And this is really where I came to deepfakes, and that is really the starting point for my book.

[00:05:21]

So I have this background in geopolitics, politics, information warfare, and my area of interest is really how the exponential changes in technology, and particularly A.I., are rewriting not only politics, but society at large as well.

[00:05:37]

So, yeah, you are a citizen of the world. I mean, that's quite amazing. Did you grow up speaking Nepali and German? Yeah.

[00:05:44]

I mean, I grew up with four languages, so Nepali, German, Tamang, because my mother is from an ethnic minority group in Nepal, which actually is closely related to Tibetans.

[00:05:59]

So Tamang is a completely different language.

[00:06:01]

So Nepali, German, Tamang and Hindi because everybody in Nepal speaks Hindi. Yeah, India is the big brother on the border.

[00:06:08]

So that was you know, it's something I wish I could give my daughter as well.

[00:06:13]

I live in the UK now and most people in the UK, you know, we speak English.

[00:06:18]

That's it. Yeah. All too well, I can hear. So your English betrays none of that colourful backstory. It's quite amazing. So, yeah, I know we have common interests in the kinds of things that brought your father to Nepal in the first place, and the meditation and forming a philosophy of life that is aimed at deeper levels of wellbeing than is often attained by people. But with such a colossal mess to clean up in our society now, with how our information ecosystem has been polluted and deranged, I think we're just going to need to do another podcast on the happy talk of what we could share when we get past these increasingly terrifying dangers and self-inflicted wounds.

[00:07:07]

I mean, it's really amazing to see how much of this is our own doing. And we'll talk about bad actors and people who are consciously using our technology against us to really destroy the possibility of living in an open society. But so much of this is a matter of our entertaining ourselves into a kind of collective madness and what seems like it could be, you know, a coming social collapse. I realize that, for anyone in the audience who isn't in touch with these trends, this kind of language coming from me or anyone else can sound hyperbolic, but we're really going over some kind of precipice here with respect to our ability to understand what's going on in the world and to converge on a common picture of a shared reality, because we're in the midst of an information war, and it's being waged against democratic societies by adversaries like Russia and China.

[00:08:08]

But it's also a civil war that's being waged by factions within our society and their various political cults. And then there's the president of the United States himself. All of this is happening on the back of, and facilitating, an utter collapse of trust in institutions and a global decline in democracy. And again, we've built the very tools of our derangement ourselves. And in particular, I'm talking about social media here. Yes. Your book goes into this, and it's organized around this new piece of technology that we call deepfakes.

[00:08:47]

And the book is Deepfakes: The Coming Infocalypse. "Infocalypse" is not your coinage. On the page it's very easy to parse; when you say it, it's hard to understand what's being said there. But really, you're talking about an information apocalypse. Just remind people what deepfakes are and suggest what's at stake here in terms of how difficult it could be to make sense of our world in the presence of this technology. Yes, absolutely.

[00:09:14]

So a deepfake is a type of synthetic media. And what synthetic media essentially is, is any type of media. It can be an image, it can be a video, it can be text that is generated by A.I. And this ability of A.I. to generate fake or synthetic media is really, really nascent.

[00:09:36]

We're only at the very, very beginning of the synthetic media revolution.

[00:09:40]

It was only probably in about the last four or five years that this has been possible and for the last two years that we've been seeing how the real world applications of this have been leaching out from beyond the research community.

[00:09:55]

So the first thing to say about synthetic media is that it is completely going to transform how we perceive the world, because in future, all media is going to be synthetic because it means that anybody can create content to a degree of fidelity that is only possible for Hollywood studios right now.

[00:10:18]

Right. And they can do this for little to no cost, using apps or software, various interfaces, which will make it so accessible to anyone.

[00:10:29]

And the reason why this is so interesting, another reason why synthetic media is so interesting, is that until now, with the best kind of computer effects, CGI, you still can't quite get humans

[00:10:43]

right. So when you use CGI to do effects, when you're trying to create humans, they still look robotic. It's called, you know, the uncanny valley.

[00:10:51]

But it turns out that A.I., when you train your machine learning systems with enough data, they're really, really good at generating fake humans, or synthetic humans, both in images.

[00:11:02]

I mean, and when it comes to generating fake human faces.

[00:11:06]

So images, still images, it's already perfected that.

[00:11:10]

And if you want to kind of test that, you can go and look at thispersondoesnotexist.com. Every time you refresh the page, you'll see a new human face that, to the human eye, to you or me, Sam,

[00:11:20]

we'll look at that and we'll think that's an authentic human, whereas that is just something that's generated by A.I., a human that really doesn't exist.

[00:11:28]

And also now increasingly in other types of media like audio and film.

[00:11:35]

So I could take essentially a clip of a recording with you, Sam, and I could use that to train my machine learning system.

[00:11:43]

And then I can synthesize your voice, so I can literally hijack your biometrics. I can take your voice and get my machine learning system to recreate that.

[00:11:53]

I can do the same with your digital likeness. Obviously, this is going to have tremendous commercial applications; entire industries are going to be transformed. For example, corporate communications, advertising, the future of all movies, video games.
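To make the voice-cloning point concrete, here is a minimal sketch of how a few seconds of reference audio can be turned into synthetic speech, assuming the open-source Coqui TTS package and its multilingual XTTS model are installed; the model name, file names, and text are illustrative, not a reference to any specific tool discussed in this conversation.

```python
# Minimal voice-cloning sketch (assumes the open-source Coqui TTS package,
# installed via `pip install TTS`). A short reference clip is enough to
# condition the model on a target speaker's voice.
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "reference.wav" is a short clip of the target speaker; the output says
# whatever text we choose, in a synthetic approximation of that voice.
tts.tts_to_file(
    text="I never said this, but it certainly sounds as if I did.",
    speaker_wav="reference.wav",
    language="en",
    file_path="cloned_voice.wav",
)
```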

[00:12:12]

But this is also the most potent form of mis- and disinformation, which is being democratized for almost anyone in the world at a time when our information ecosystem has already become increasingly dangerous and corrupt.

[00:12:27]

So the first thing I'd say about synthetic media is it is actually just heralding this tremendous revolution in the way that we communicate.

[00:12:36]

The second thing I'd say is that it's coming at a time when we've had lots of changes in our information ecosystem over the past 30 years.

[00:12:44]

So, you know, changes that society hasn't been able to keep up with, from the Internet to social media to smartphones.

[00:12:49]

And this is just the next step in that.

[00:12:51]

And then the final thing, and this is where I come to deepfakes, is that this field is still so nascent and emerging that the taxonomy around it is completely undecided as yet.

[00:13:01]

And as I already kind of pointed out or touched upon, there will be legitimate use cases for synthetic media. And this is one of the reasons why this cat is out of the bag.

[00:13:11]

There's no way we're putting it back in, because there's so much investment in the kind of commercial use cases.

[00:13:18]

I think there's almost 200 companies now that are working exclusively on generating synthetic media.

[00:13:24]

So we have to distinguish between the legitimate use cases of synthetic media and how we draw the line.

[00:13:31]

So, very broad brush, in my book I say that the use and intent behind synthetic media really matters in how we define it. So I refer to a deepfake as when a piece of synthetic media is used as a piece of mis- or disinformation.

[00:13:46]

And, you know, there is so much more that you could delve into there with regards to the ethical implications and the taxonomy.

[00:13:52]

But broadly speaking, that's how I define it, and that's my distinction between synthetic media and deepfakes. Well, as you point out, all of this would be good, clean fun if it weren't for the fact that we know there are people intent upon spreading misinformation and disinformation and doing it with a truly sinister political purpose. I mean, not just for amusement, although that can be harmful enough. It's something that state actors and people internal to various states are going to leverage to further divide society from itself and increase political polarization.

[00:14:38]

But it's amazing that it is so promising in the fun department that we can't possibly even contemplate putting this cat back in the bag. I mean, that's just the problem we're seeing on all fronts. So it is with social media. So it is with the ad revenue model that is selecting for so many of its harmful effects. I mean, we just can't break the spell wherein people want the cheapest, most fun media and they want it endlessly.

[00:15:11]

And yet the harms that are accruing are so large that it's amazing just to see that there's no handhold here whereby we can resist our slide toward the precipice. Just to underscore how quickly this technology is developing: in your book, you point out what happened when Martin Scorsese released his film The Irishman, which involved this exceedingly expensive and laborious process of trying to de-age its principal actors, Robert De Niro and Joe Pesci.

[00:15:45]

And that was met with something like derision for the imperfection of what was achieved there, again, at great cost. And then very, very quickly, someone on YouTube using free software did a nearly perfect de-aging of the same film. It's just amazing what's happening here. And again, these tools are going to be free, right? I mean, they're already free, and ultimately the best tools will be free. Absolutely. So you already have various kinds of software platforms online, so the barriers to entry have come down tremendously right now.

[00:16:28]

If you wanted to make a convincing fake video, you would still need to have some knowledge of machine learning, but you wouldn't have to be an expert by any means.

[00:16:38]

But already now we have apps that allow people to do certain things like swap their faces into scenes. For example, Reface. I don't know if you've come across that app. I don't know how old your children are.

[00:16:50]

But if you have a teenager, you've probably come across it. You can basically put your own face into a popular scene from a film like Titanic or something.

[00:17:00]

This is using the power of synthetic media.

[00:17:04]

But experts who I speak to on the generation side, because it's so hugely exciting to people who are generating synthetic media, think that by the end of the decade any YouTuber, any teenager, will have the ability to create special effects in film that are better than anything a Hollywood studio can do now.

[00:17:25]

And that's really why I put that anecdote about the Irishman into the book, because it just demonstrates the power of synthetic media.

[00:17:32]

I mean, Scorsese was working on this project from 2015.

[00:17:35]

He filmed with a special three-camera rig.

[00:17:38]

He had the best special effects artists, post-production work, a multi-million-dollar budget.

[00:17:44]

And still, the effect at the end wasn't that convincing. It didn't look quite right. And now one YouTuber with free software takes a clip from Scorsese's film in 2020.

[00:17:55]

So Scorsese's film came out in 2019. This year, he can already create something that, when you look at it, looks far more realistic than what Scorsese did.

[00:18:05]

This is just in the realm of video. As I already mentioned, with images it can already do it perfectly. There is also the case of audio. There is another YouTuber, for example, who makes a lot of these; the kind of early pieces of synthetic media have sprung up on YouTube. There is a YouTuber called Vocal Synthesis, who uses an open-sourced model trained on celebrities' voices. Something that he's done that's gotten many, many views on YouTube is he's literally taken audio clips of dead presidents and then made them rap "Fuck the Police."

[00:18:47]

Ronald Reagan, FDR, he is very interesting.

[00:18:52]

This is an indicator of how complex these challenges are going to be to navigate in future, because another thing that he did was he took Jay-

[00:19:02]

Z's voice and made him recite Shakespeare, "to be or not to be." And interestingly, Jay-Z's record label filed a copyright infringement claim against him and made him kind of take it down.

[00:19:15]

But this is really just a forebear of the kind of battles we're going to see, when any anonymous user can take your likeness, can take your biometrics, and make you say or do things that you never did.

[00:19:30]

And of course, this is disastrous to any liberal democratic model, because in a world where anything can be faked, everyone becomes a target. But even more than that, if anything can be faked, including evidence that we today see as an extension of our own reality (and I say "evidence" in quotation marks: video, film, audio), then everything can also be denied.

[00:19:56]

So the very basis of what is reality starts to become corroded.

[00:20:01]

Of course, reality itself remains. It's just that our perception of reality starts to become increasingly clouded. So what are we going to do about this?

[00:20:11]

Again, we're going to get into all of the evidence of just how aggressively this will be used, given everything else that's been happening in our world. We'll talk about Russia and Trump and QAnon and other problems here. But many of us can dimly remember, what now feels like 20 years ago, before COVID, when the Billy Bush audiotape dropped and Trump sort of attempted to deny that the audio of him on the bus was real, but we were not yet in the presence of such widespread use of deepfake technology that anyone was even tempted to believe him.

[00:20:54]

We knew the audio was real. Now, apparently, it didn't matter, given how corrupted our sense of everything had become by that point politically. But we could see the resort to claiming fakery that will be relied upon by everyone and anyone who is committed to lying, because there will be so much of it around that, really, it will only be charitable to extend the benefit of the doubt to people who say, listen, that wasn't me.

[00:21:27]

That's just a perfect simulacrum of my voice and even my face, but you actually can't believe your eyes and ears at this point. I would never say such a thing. In any of your conversations with experts on this topic, are any of them hopeful that we will be able to figure out how to put a watermark on digital media in such a way that we will understand its provenance and be able to get to ground truth when it matters?

[00:21:55]

So I think the problem of what do we do about it is so huge that ultimately we can only fight the corroding information ecosystem by building society wide resilience.

[00:22:07]

But the solutions, if you want to term it that way, broadly fit into two categories. The first are the kind of technical solutions.

[00:22:16]

So because synthetic media is going to become ubiquitous and we as humans will not be able to discern because of the fidelity, the quality, whether it's real or fake.

[00:22:29]

So you can't rely on digital forensics in the sense that somebody goes through and clicks and looks at each media and decides, oh, are the eyes blinking correctly?

[00:22:38]

Do the ears look a little bit blurred? Because this is what we do now, right, because the generation side of synthetic media is still so nascent. But we're not going to be able to do that.

[00:22:49]

Second, the sheer volume, when you talk about at the scale at which you can generate synthetic media means that humans are never going to be able to go through it all, never going to be able to fact check each piece of media.

[00:23:02]

So we have to rely on building the A.I. software to detect, for example, deepfakes. And right now there is an interest, and increasingly there are certain experts and groups who are putting money into being able to detect fakes.

[00:23:19]

However, the problem is, because of the adversarial nature of the A.I. and the way that it's trained, every time you build a detector that's good enough to detect the fake, the generation model can also become stronger.

[00:23:36]

So you're in this never-ending game of cat and mouse where, you know, you keep on having to build better detectors.

[00:23:43]

And also, given the various different models and ways in which the fakes can be generated, there's never going to be a one-size-fits-all model. There's a hypothetical question, which is still open in the research community, about whether or not the fakes can become so sophisticated.

[00:24:03]

We already know they're going to be able to fool humans; they already basically do.

[00:24:07]

But is there a point where the fakes become so sophisticated that even an A.I. detector can never detect in the DNA of that fake that it's actually a piece of synthetic media?

[00:24:17]

We don't know yet is the answer to that. But I will say that there is far more research going into the generation side because like so much in terms of the information ecosystem, the architecture of the information ecosystem in the information age, it has been driven by this almost utopian, flawed vision of how these technologies will be serving an unmitigated good for humanity without thinking about how they might amplify the worst sides of human intention as well.
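To make the cat-and-mouse dynamic concrete, here is a minimal toy sketch of adversarial training in PyTorch, the basic mechanism behind GAN-generated media: a detector (discriminator) is trained to separate real from fake, and the generator is trained against that very detector, so every improvement in detection becomes a training signal for better fakes. This is an illustrative toy on random vectors, not any particular detection system mentioned here.

```python
# Toy sketch of the adversarial "cat and mouse": a generator and a detector
# (discriminator) trained against each other, as in a GAN. Illustrative only.
import torch
import torch.nn as nn

dim = 64
generator = nn.Sequential(nn.Linear(16, 128), nn.ReLU(), nn.Linear(128, dim))
detector = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(detector.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(32, dim)            # stand-in for features of real media
    fake = generator(torch.randn(32, 16))  # synthetic samples

    # The detector learns to label real as 1 and fake as 0 ...
    d_loss = (loss_fn(detector(real), torch.ones(32, 1)) +
              loss_fn(detector(fake.detach()), torch.zeros(32, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # ... and the generator is then updated to fool the improved detector,
    # so better detection directly drives better fakes.
    g_loss = loss_fn(detector(fake), torch.ones(32, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```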

[00:24:48]

The second side, and you touched upon that, is building provenance architecture into the information ecosystem.

[00:24:55]

So basically embedding, right into the hardware of devices, whether that's a camera or a mobile phone, an authenticity watermark to prove that that piece of media is authentic.

[00:25:09]

You can track it throughout its life to show that it hasn't been tampered with or edited. And this is something that, for example, Adobe is working on with its Content Authenticity Initiative.
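As a rough illustration of the provenance idea (a sketch of the general approach, not Adobe's actual Content Authenticity Initiative implementation), a capture device could hash the media bytes and sign that digest with a device key at the moment of capture; anyone downstream can then check that the file they received is byte-for-byte what the device produced. A minimal sketch using Python's cryptography package:

```python
# Minimal provenance sketch: sign media at capture, verify later.
# Illustrative only; real standards (e.g. C2PA) embed richer signed metadata.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Key pair that would live inside the capture device's secure hardware.
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key()

def sign_at_capture(media_bytes: bytes) -> bytes:
    """Hash the media and sign the digest at the moment of capture."""
    digest = hashlib.sha256(media_bytes).digest()
    return device_key.sign(digest)

def verify_downstream(media_bytes: bytes, signature: bytes) -> bool:
    """Anyone with the device's public key can check the file is untampered."""
    digest = hashlib.sha256(media_bytes).digest()
    try:
        device_pub.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

original = b"...raw image bytes straight off the sensor..."
sig = sign_at_capture(original)
print(verify_downstream(original, sig))              # True
print(verify_downstream(original + b"edited", sig))  # False
```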

[00:25:23]

So there are technical solutions underway, both in terms of the detection side and the provenance side of the problem. However, ultimately this is a human problem, to the extent that disinformation or bad information didn't just come about at the turn of the millennium. It's just that we have never seen it at this scale.

[00:25:49]

We have never seen it this potent, and we have never, ever had it as accessible as it is now. So ultimately, this is a human problem.

[00:25:59]

There's no way we can deal with the challenges of our corroding information ecosystem without talking about human, quote unquote, solutions.

[00:26:06]

How do we prepare society for this new reality?

[00:26:09]

And we are way behind. We're always reactive. Our reactions are always piecemeal.

[00:26:14]

And the biggest problem is the information ecosystem has become corrupt to the extent that we can't even identify what the real risks are.

[00:26:24]

Right. We're too busy fighting each other about other things without seeing what the real existential risk is here.

[00:26:30]

Yeah, yeah. I mean, that is the very symptom of the problem itself, the fact that we can't even agree on the nature of the problem. There's so much disinformation in the air. It makes me think that part of the problem, I don't think all of it, but certainly some of the most pressing parts of it, could be solved if we had lie detection technology that we could actually rely on. Just imagine we had real-time lie detection and you could go to the source.

[00:27:00]

You know, if some awful piece of audio emerged from me and it purported to be a part of my podcast where I said something, you know, reputation-cancelling, and I said, well, that's a fake, that wasn't me, the only way to resolve that would be to tell whether I'm lying or not. We're forcing ourselves into a position where it's going to be a kind of emergency not to be able to tell with real confidence whether or not somebody is lying.

[00:27:32]

So I think, in addition to the arms race between deepfakes and deepfake detection, this could inspire a lie detection arms race, because there are so many other reasons why we would want to be able to detect people who are lying. Having just watched the presidential and vice presidential debates in America, one could see the utility of having a red light go off over someone's head when he or she knows that he or she is lying.

[00:28:02]

But if we can't trust people and we can't trust the evidence of our senses when we have media of them saying and doing things convincingly delivered to us in torrents, it's hard to see how we don't drift off into some horrifically dystopian dream world of our own confection. Absolutely.

[00:28:24]

And this is really why, you know, I wrote the book.

[00:28:27]

I wrote it in a way that is very accessible, for anyone to pick up and zoom through in an afternoon, because I think we need this conceptual framework where we can connect everything from Russian disinformation to the increasingly partisan political divide in the United States, but also around the rest of the Western world, and understand how now, with the age of synthetic media upon us, our entire perception of the world is going to be changed in a way that is completely unprecedented, and how we can be manipulated in the age of information, where we had assumed that once we have access to this much information, surely progress is inevitable.

[00:29:19]

But to actually understand how the information ecosystem itself has become corrupt, I think is the first step. And to be honest with you, I do tend to think that things will probably get worse before they get better.

[00:29:34]

And I think the US election is a great case study of that, because it's almost no matter the outcome, right.

[00:29:41]

Let's say that Trump loses, and he loses by a large margin. You know, he could still refuse to go, even if the Secret Service will come and take his bags and ask him, please, Mr. Trump, there's the door.

[00:29:56]

He has this influence now where a lot of his followers genuinely believe that he is, you know, this kind of savior of America.

[00:30:08]

And if he asks them to take up arms and take to the streets... I mean, this is literally already happening right now. Right.

[00:30:13]

You have armed insurrectionist militias kind of patrolling the streets of the United States, on both the left and the right, for their political grievances.

[00:30:22]

So if Biden wins, let's say Trump goes quietly and Biden wins, well, then you still haven't addressed the bigger problem of the infocalypse, where the information ecosystem has become so corrupted, and the synthetic media revolution is still upon us.

[00:30:37]

So I'm hopeful that we still have time to address this, because, like I said, this technology is so nascent, we can still try to take some kind of action in terms of: what's the ethical framework? How are we going to adjudicate the use of synthetic media? How can we digitally educate the public about the risks of synthetic media?

[00:30:58]

But it is a ticking time bomb and the window is short.

[00:31:04]

As if to underscore your last point, at the time we're speaking here there's a headline now circulating that 13 men were just arrested, including seven members of a right-wing militia, plotting to kidnap the Democratic governor of Michigan, Gretchen Whitmer, for the purposes of inciting a civil war. One can only imagine the kind of information diet of these militia members. But this is the kind of thing that gets engineered by crazy information and pseudo-facts being spread on social media.

[00:31:36]

And this is the kind of thing that, even when delivered by a mainstream news channel, one now has to pause and wonder whether or not it's even true, because there's been such a breakdown of trust in journalism and there are so many cries of fake news, both cynical and increasingly real. That's just what we're dealing with: a circumstance of such informational pollution. Let's talk about Russia's role in all of this, because Russia has a history of prosecuting what they call active measures against us.

[00:32:13]

And we really have for a long time been in the midst of an information war, which is essentially a psychological war. And Russia is increasingly expert at exploiting the divisions in our society, especially racial division. So maybe you can summarize some of this history.

[00:32:33]

Yeah, I mean, I start my book with Russia because my career intersected a lot with what Russia was doing in Ukraine in 2014 and the kind of information war they fought around the annexation of Crimea and eastern Ukraine, where they basically denied that it was happening at all. And the same with the shooting down of MH17.

[00:32:57]

This was the Malaysian aircraft that was shot down over eastern Ukraine, which has now been proven to have been done by the Russian military.

[00:33:05]

But at the time, they were saying this had nothing to do with them and that this was pro-Russian Ukrainian separatists who had shot down the airliner.

[00:33:14]

So what Russia did with information warfare around Ukraine, Crimea, around Europe in 2015, when Putin and Assad stepped up their bombardment of civilians in Syria, unleashing this mass migration which basically led to the EU's migrant crisis five years ago.

[00:33:33]

I don't know if you remember those images of people just arriving at the shores, you know, and some of them were refugees.

[00:33:43]

But as we now know, you know, a lot of them were... there were also terrorists, economic migrants. And how that almost tore Europe apart, and the information war that Russia fought around those events, where they perpetrated these stories about, for example, girls in Germany who had supposedly been raped by arriving migrants.

[00:34:07]

And stories like this legitimately did happen. But this story was completely planted.

[00:34:12]

So it's blurring the line between what's real and what's fake. But what was also very interesting for me was that I studied and I worked on the Russian information operations around the US election in 2016.

[00:34:27]

And the first thing to say about that is, to me, we can see how corrupt the information ecosystem has become, to the extent that those information operations have become a completely partisan event in America.

[00:34:41]

Right. Some people say that Russia is behind everything, and others deny that Russia did anything at all. And this is just nonsense. You know, for sure the Russians intervened in the 2016 election, and they continue to intervene in US politics to this day.

[00:34:58]

And I suppose what was very interesting to me about what Russia was doing was how

[00:35:04]

this information warfare strategy, which is old and goes all the way back to the Cold War, was becoming increasingly potent with the weapons of this modern information ecosystem.

[00:35:17]

And one of those was social media.

[00:35:20]

What they did in Ukraine, and then in Europe around the migrant crisis, and then around the U.S. election, was influence operations on social media, where they actually posed, in the case of the United States, as authentic Americans. And then, over the years... by the way, this wasn't just them getting involved in the weeks running up to the election. They started their influence operations in the United States in 2013.

[00:35:46]

They built up these tribal communities on social media and, well, basically played identity politics, built up their pride in their distinct identity. And interestingly, this wasn't just Russians targeting, you know, right-wing kind of Trump supporters.

[00:36:07]

They did it across the political spectrum.

[00:36:10]

And as a matter of fact, they disproportionately focused on the African-American community.

[00:36:15]

So they built these fake groups, pages, communities, which they imbued with this pride in your distinct identity.

[00:36:26]

And then as we got closer to the election, those groups were then sporadically injected with lots of political grievances, some of them legitimate to make these groups feel alienated from the mainstream.

[00:36:37]

And again, the primary focus of their influence operations on social media was the African-American community, who they were basically targeting so that they felt so disenfranchised and disconnected from Hillary, from America at large, that they wouldn't go and vote in the election.

[00:36:53]

Right.

[00:36:54]

And what has happened now four years later is that those operations are still ongoing, but they've become far more sophisticated.

[00:37:02]

So in 2016, it might have been a troll farm in St. Petersburg. But in 2020, one operation, exposed earlier this year through a joint investigation by CNN, Twitter, and Facebook, was that the Russian agency which is in charge of the social media operations, it's called the Internet Research Agency, the IRA, had basically outsourced their work to Ghana, where they had set up what looked ostensibly like a legitimate human rights organization.

[00:37:35]

They had hired employees in Ghana, real, authentic Ghanaians, and then told them, you know, you're going to have to kind of build these groups and communities.

[00:37:44]

And here are basically the same memes, the same ideas that they'd used in 2016. They were basically recycling them in 2020.

[00:37:55]

So I start with Russia because what is really interesting is that their strategy of information warfare is actually this phenomenon where they flood the zone with a lot of information, bad information, across the political spectrum.

[00:38:13]

So they're not just targeting, you know, Trump voters, for example.

[00:38:16]

And this chaos, this bad information, this chaotic information has the effect of what's called censorship through noise. They do censorship through noise. So this chaotic, bad information overload gets to the point where we can't make decisions in our own interest, of protecting ourselves, our country, our community.

[00:38:37]

And that very spirit of information warfare has come to characterize the entire information ecosystem.

[00:38:45]

I mean, I start with Russia. I map out how their tactics are far more potent.

[00:38:50]

But you cannot talk about the corrosion of the information ecosystem without recognizing that the same chaotic spirit has come to imbue our homegrown debate as well.

[00:39:03]

So I actually think, you know, of course, the Russians are intervening in the U.S. election in 2020. What's also very interesting is that other rogue and authoritarian states around the world are looking at what Russia is doing and copying them.

[00:39:16]

China is becoming more like Russia, but this is also happening at home.

[00:39:21]

And arguably the domestic disinformation, misinformation and information disorder is far more harmful than anything that foreign actors are doing.

[00:39:31]

Yeah, I want to cover some of that ground again, because it's easy not to understand at first pass just how sinister and insidious this all is, because the fact that we can't agree as a society that Russia interfered in the 2016 presidential election is one of the greatest triumphs of the Russian interference in our information ecosystem. The fact that you have people on the left over-ascribing causality to Russian influence, and you have people on the right denying any interference in the first place, and the fact that each side can sleep soundly at night convinced that the other side is totally wrong.

[00:40:21]

That is itself a symptom of how polluted our information space has become, a kind of singularity on the landscape where everything is now falling into it. And it's happening based on the dynamics you just sketched out, wherein if you mingle lies of any size and consequence with enough truths and half-truths or, you know, background facts, that suggests a plausibility to these lies. Or at least you can't ever ascertain what's true. It leads to a kind of epistemological breakdown and a cynicism that is the goal of this entire enterprise.

[00:41:01]

It's not merely to misinform people, which is to say have them believe things that are false. It is to break people's commitment to being informed at all, because they realize how hopeless it is. And so we all just tune out and go about our lives being manipulated to who knows what end. So, you know, some of the history which you go through in your book relates to the fact that long ago, long before they had any tools really to work with, and certainly before social media,

[00:41:32]

the Russians planted the story that AIDS was essentially a bioweapon cooked up in a US lab, you know, with the purpose of performing a genocide on the black community. And they targeted the black community with this lie. And to this day, you know, a disproportionate number of people in the black community in the U.S. believe that AIDS was made in a lab for the purpose of wiping out black people. But the reason why that was so clever is because it has an air of plausibility to it, given the history of the Tuskegee experiments, the syphilis experiments where African-Americans who had syphilis were studied and not given the cure, even once the cure, penicillin, emerged.

[00:42:20]

They were then studied to the end of their lives with what amounted to the ethical equivalent of the Nazi cold water experiments, trying to see the effects of tertiary syphilis on people. And it was an absolutely appalling history. And it's in the context of that history that you can make up new allegations that should seem patently insane. They're so evil, but they don't seem patently insane, given the points of contact to a surrounding reality that that is fact based.

[00:42:55]

And so it is with, you know, the current leveraging of identity politics in the US, where they create Black Lives Matter Facebook groups that are fake. And, you know, I think there was one protest in Times Square that had like 5,000 or 10,000 people show up, and it was completely fake. I mean, the organizers were fake. You know, they were Russians; there was no one on the ground who was actually a real leader of the thing.

[00:43:24]

And people went to this protest never realizing that they were characters in somebody's dreamscape. Absolutely.

[00:43:33]

This is why it is so dastardly.

[00:43:35]

And as you pointed out, the Russians, or even the Soviets going back to the Cold War, very quickly identified race relations as a sore point for the United States.

[00:43:46]

And they abused that to great effect.

[00:43:50]

And Operation Infektion, the lie that you already correctly pointed out, that the CIA invented the HIV virus as a way to kill African-Americans, was something that in the 1980s took about 10 years to go viral.

[00:44:07]

But when it did, oh boy, did it grab hold of the imagination, to the extent that it still poses a challenge when you're trying to deal with HIV public health policy today, where you have communities, African-American communities, who disproportionately believe that the HIV virus is somehow connected to a government plan to commit a genocide.

[00:44:30]

And in 2016, I suppose what happened is that the strategy was the same.

[00:44:36]

Right? We want to play identity politics. We want to hit the United States where it hurts. We know that race is the dividing factor.

[00:44:44]

But in 2016, it became so much more powerful, because Operation Infektion, the HIV lie, was a single lie.

[00:44:52]

Whereas in 2016, and in what's happening in 2020, there are numerous groups, communities, pages, where it's not only about spreading one lie, but it's actually about entrenching tribal divisions, entrenching identity politics. And in the context of what's happened in 2020, very interestingly, some of the other kind of information operations that have come out, that have been exposed, unsurprisingly, given your interest in kind of the culture wars and wokeness,

[00:45:23]

is that a lot of kind of unemployed American journalists who had lost their jobs due to COVID were now working for a kind of social justice-oriented, left-wing news network in favor of BLM. And it turned out that actually that entire network was fabricated and the Russians were behind it. So these unwitting Americans, who genuinely have good intentions, are being co-opted into something that is actually being run by Russian intelligence.

[00:45:56]

And I suppose with our information ecosystem right now, it's so much easier to actually infiltrate public life in the United States in a way that wouldn't have been possible in the 1980s.

[00:46:08]

So we don't even know... we're only starting to see the impact of these operations on society. That's not to say that, you know, the Russians created the problems with race. Of course not.

[00:46:21]

But do they exploit them? Absolutely.

[00:46:24]

And are other countries, other rogue and authoritarian nation-states, seeking to do the same? Absolutely.

[00:46:31]

Russia is the best at this kind of information warfare, but other countries are learning quickly.

[00:46:37]

And what's been really interesting for me to watch is, for example, how China has taken an aggressive new interest in pursuing similar disinformation campaigns in Western information spaces.

[00:46:51]

This was something that they didn't do until about last year, when the protests started in Hong Kong, and then obviously this year with COVID.

[00:47:00]

I think you say in your book that Russian television, RT, is the most watched news channel on YouTube. Yes, it is.

[00:47:09]

So this is another example to me of how quick they were to recognize the architecture of this new information ecosystem. Right. Which developed... that's characterized by... If you'd like to continue listening to this podcast, you'll need to subscribe at samharris.org. You'll get access to all full-length episodes of the Making Sense podcast and to other subscriber-only content, including bonus episodes and AMAs and the conversations I've been having on the Waking Up app. The Making Sense podcast is ad-free and relies entirely on listener support.

[00:47:48]

And you can subscribe now at samharris.org.