
Transcript

[00:00:00]

In July 2016, Hillary Clinton seemed like she was on her way to the White House. She was leading Donald Trump in the polls by a wide margin, and in any other year her staff would have felt pretty confident. But around the same time, they also found themselves dealing with something unprecedented.

[00:00:18]

Russian government hackers penetrated the computer network of the Democratic National Committee and gained access to the entire database of opposition research.

[00:00:28]

Jake Sullivan was Hillary's top policy adviser. He and others suspected the DNC hack was part of a larger information war being waged by Russia to tear down Hillary Clinton and get Donald Trump elected. And even as Trump goaded Russia ("Russia, if you're listening, I hope you're able to find the 30,000 emails that are missing"), a lot of people thought Jake and the Clinton campaign were crazy, including, sometimes, themselves. We had a sense that maybe we were conspiracy theorists. You know, maybe this couldn't possibly be happening, because we hadn't seen anything quite like it before, and because we would go talk to the senior producers of networks and the editors of major newspapers and tell them.

[00:01:17]

And they'd look at us like we were wearing tinfoil hats.

[00:01:20]

What they were seeing went way beyond the hacked emails. Like Facebook groups, set up by fake accounts, spreading lies about Hillary's civil rights record and urging African-Americans not to vote. Crackpot social media conspiracy theories about Hillary getting caught in some kind of criminal act by President Obama, who had made her his pawn. Online videos suggesting she was suffering brain damage from a fall in 2012.

[00:01:45]

Is Hillary on the verge of a mental breakdown due to stress? Are all her strange outbursts linked to a medical condition?

[00:01:54]

Eventually, these fake stories would get picked up by outlets like Fox News, or at least by guests when they appeared on Fox News, like Rudy Giuliani. Go online and put down "Hillary Clinton illness."

[00:02:06]

Take a look at the videos for yourself.

[00:02:07]

Then the rest of the media would report on the controversies, which made it all seem reasonable, even mainstream. It was a symphony of lies, and the evidence pointed to an invisible conductor: Russia. But still, a hostile nation trying to determine the outcome of a U.S. election, deploying social media platforms most often used to spread cat videos? To a lot of people, it seemed like a conspiracy theory. So we had moments of genuine self-doubt about how extensive and systematic this was, or maybe we were, you know, making more out of it than it was.

[00:02:51]

Turns out they weren't. US intelligence has concluded that Russian President Vladimir Putin ordered his military to help Donald Trump win the election.

[00:03:02]

The 37 page indictment says the Russians, beginning in 2014, used fake social media posts and advertisements to sway opinion in favor of then candidate Donald Trump.

[00:03:13]

Well, Russian intelligence officers are charged with hacking into the email servers of the Democratic Party and the Hillary Clinton campaign.

[00:03:26]

I'm Ben Rhodes, and welcome back to Missing America, a look at the political diseases sweeping across the world in the absence of American leadership.

[00:03:35]

This week, disinformation.

[00:03:39]

In the years after the election, the world learned just how much Russia had created many of those fake websites, fake Facebook groups and fake news.

[00:03:49]

And you can be sure they're still doing that as I speak.

[00:03:53]

But Russia isn't the only actor using online lies to create real-world chaos. From Southeast Asia to the United States, disinformation is a growing part of the political landscape.

[00:04:04]

Today, we'll hear about how quickly it infected one country. The Internet had barely started taking off in Myanmar.

[00:04:11]

And still there was sort of this really powerful effect of social media just amplifying something that anybody could have posted.

[00:04:20]

And we'll learn what the U.S. could do about it, if our own president weren't a constant source of disinformation himself. Leading the fight in a virtual war, on this episode of Missing America. On January 5th, 2017, I was sitting in the Oval Office, watching leaders of the U.S. intelligence community brief President Obama and Vice President Biden about the scale of Russia's intervention in our election. I wasn't surprised.

[00:04:48]

I'd seen the Russian government's disinformation machine at work two years earlier during their war with Ukraine.

[00:04:55]

New video shows the fiery wreckage just moments after the downing of Malaysia Airlines Flight 17.

[00:05:00]

You may remember this story. A civilian passenger plane was shot down, probably by accident. It had been hit by a Russian missile over the part of Ukraine controlled by Russian-backed separatists, advised by the Russian military. But Russia's foreign ministry quickly deflected the blame with an avalanche of lies.

[00:05:20]

Russia has repeatedly denied any involvement in the shooting down of MH 17, and the Kremlin has promoted many alternative theories on what happened.

[00:05:30]

They said a Ukrainian combat aircraft did it. Ukrainian surface to air missiles were in the area. The missile was fired from territory controlled by the Ukrainian government.

[00:05:41]

This is conclusive evidence that Ukraine was not just involved in this tragedy, but it also manipulated the international investigation.

[00:05:51]

Never mind that the stories contradicted each other. Never mind that they could all be debunked.

[00:05:56]

The official investigation would drag on for over a year before reporting all the facts.

[00:06:01]

Meanwhile, Russia's state-run news channels and fake social media accounts flooded the Internet with disinformation within days, and the lies moved faster and farther than the truth. I remember being frustrated that our government had no way to stop this disinformation; we had little capacity to fight back. Russia seemed to have thousands of people pumping out fake news. They were well-funded. They were willing to lie. I had a small press staff, a handful of official Twitter accounts, and an obligation to back up any assertion with, you know, actual evidence.

[00:06:38]

No wonder that in our 2016 election, Russia was able to pull off the intelligence coup of the century. But since then, we've learned the damage that can be done even without Russia's resources. What's interesting to watch now, as you look at these disinformation campaigns, basically in every major political context, is how simple and replicable and yet hard to stop this strategy is.

[00:07:08]

That's what's so alarming: it's not some super-sophisticated, highly resourced, can-only-be-done-once-every-few-years kind of operation. It's something that can be copied, scaled, adapted and executed basically on a moment's notice, exploiting political divisions in any society, in any context.

[00:07:38]

To get a sense of the danger involved, look no further than Myanmar, in Southeast Asia. It may be the purest example of how much damage social media disinformation can do, and how quickly, and how the institutions most able to stop it don't. You may still know Myanmar as Burma, its British colonial name. For decades it was ruled by a military dictatorship that kept the country cut off from the rest of the world. Think North Korea, only twice the size.

[00:08:09]

The military also cracked down repeatedly on an ethnic Muslim group called the Rohingya, in Myanmar's Rakhine state. They were a convenient minority against whom to rally the country's Buddhist majority. Muslims and Buddhists would clash. Then the military would crack down on the side of the Buddhists. There would be deaths.

[00:08:29]

And mass displacement. Simmering tension between the Rohingya and the larger Buddhist community exploded in June of 2012, killing hundreds of people and leaving thousands of homes burnt to the ground.

[00:08:42]

By the mid-2010s, that tension still existed. But in other ways, Myanmar was changing.

[00:08:50]

The government had opened up space for independent media and civil society. Political prisoners were released.

[00:08:56]

One was even elected to lead the country, routing the military's chosen candidate. I remember traveling to Myanmar with President Obama around this time, and tens of thousands of people greeted his motorcade.

[00:09:09]

It was a hopeful time; democracy was taking root. Jes Petersen is a Danish social entrepreneur using technology to empower people. He moved to the country just in time to watch the political transformation.

[00:09:23]

But there was also another reason that, I think, for me personally, made it a really interesting time to be in Myanmar, and that was the rise of technology. Until, I suppose, early 2014, mid-2014, barely anybody in Myanmar had access to the Internet. SIM cards were super expensive. There was not really any 3G or 4G.

[00:09:45]

Fixed Internet connections were just not a thing, perhaps except for the wealthy elite. Until, as part of its democratic opening, the government allowed two telecom companies to set up mobile networks across the country in 2014.

[00:10:00]

They both launched.

[00:10:01]

And so basically overnight the cost of a SIM card dropped from hundreds and in some cases thousands of dollars to basically the equivalent of a dollar and a half.

[00:10:12]

So you could take that SIM card and put it in a smartphone to go on the Internet, which is something that most people hadn't been able to do before.

[00:10:19]

Sagi Liang was also working to help young Burmese use tech to start companies and create jobs. She remembers 2014 as an optimistic moment for Myanmar. After all, the Arab Spring movement had used Twitter to help topple a dictator in Egypt. When the whole country have access to Internet, so much hope and excitement. You know, people see technology as a force for civic engagement. So at that time, technology is hope for the country. That hope turned out to be short-lived, and a big reason was Facebook.

[00:10:56]

See, in Myanmar, people largely access the Internet from smartphones. But many had never owned a smartphone before. They'd also never downloaded apps from the App Store; they didn't even know what an App Store was. So whoever sold them the phone would load it with an app or two, one of which was almost always Facebook.

[00:11:16]

When people go to the shop, Facebook is often installed on the phone. So that was the first application that people, you know, have access to, from having no information about the Internet at all. So it's become: Facebook is the Internet in Myanmar. Yeah.

[00:11:32]

So Facebook and the Internet are indistinguishable. Yeah. It bears repeating: in Myanmar, Facebook is the Internet. Many people never use any other website or app. Facebook is their entire online experience. Imagine living your whole life having access to little information other than government propaganda.

[00:11:54]

And then overnight in 2014, you think you have access to all of the information in the world. And it's Facebook.

[00:12:03]

A lot of people that I talk to, the first time when they saw things on Facebook, they thought that everything on Facebook is the same as a newspaper. Yeah, right. Whatever is put on Facebook must be true.

[00:12:15]

Sagi says she had friends who thought that, like a newspaper, every article on Facebook had been vetted by some kind of Facebook editor and a fact-checking staff. That was people's perception of Facebook, because they were just being exposed to technology.

[00:12:28]

So it's really easy for bad actors to use that against them.

[00:12:34]

Sure enough, in a country still divided by ethnic conflict, bad actors did use Facebook to throw fuel on the fire almost immediately.

[00:12:44]

Jes Petersen: Back in the middle of 2014, I think it was in July, in Mandalay, in the north of Myanmar, there was a blog post that popped up on the Internet falsely accusing a Muslim shop owner of raping a young Buddhist girl. What happened very quickly was that the content from this blog post moved onto Facebook, where it started spreading like wildfire. Lots of people started taking to the streets. There were riots that broke out between Muslim and Buddhist groups.

[00:13:13]

The crowd threatened to kill all Muslims, saying they want to get rid of them. They want to avenge the death of a Buddhist man who was killed by Muslims during riots that started on Tuesday.

[00:13:24]

What happened eventually was that the government resorted to shutting down Facebook in and around Mandalay, to sort of kill the conversation about all of this, until the riots died down and stopped.

[00:13:38]

And I think, you know, this was at a time when the Internet had barely started taking off in Myanmar. And still there was sort of this really powerful effect of social media just amplifying something that anybody could have posted. The hate speech and conspiracy theories continued, month after month, for years. Jes walked me through another moment when violence threatened to erupt, in 2017.

[00:14:05]

There were sort of two sets of chain messages going around Facebook. One of these chains of messages was warning people that Muslim extremists were planning a terrorist attack. Sort of non-specific, just broadly: be careful in the next few days, there are Muslim jihadists who are planning a terrorist attack. And at the same time, another chain of messages was going around Facebook warning Muslims that Buddhists were planning an anti-Muslim movement and riots. And obviously, it was quickly pretty clear to people monitoring this and looking at this that it was likely to come from the same place, and to have the aim of creating tension and riots and perhaps violence between Muslim and Buddhist groups.

[00:14:52]

Who was behind all this? What was the motive? The New York Times would later report that Myanmar's military played a central role, part of a concerted campaign to stir up Buddhist nationalism and a deeper distrust of the Muslim Rohingya. But at the time, all Petersen knew was that this was a disaster waiting to happen. What happened then was that my colleagues and I, and a bunch of other organizations and people in our community here in Yangon, reached out to Facebook about this and told them: listen, folks, you should take a look at this, because these messages are going around, and we know from before that this kind of thing can have really serious consequences.

[00:15:31]

So, you know, we think you should put a stop to it.

[00:15:34]

Here's the good news. Facebook took down the messages. It deactivated the fake accounts that spread them. Just like that.

[00:15:43]

This one disaster was averted. But Jes knew there was bound to be another one, because the incident he'd flagged pointed to a much bigger problem: the fact that he had to alert Facebook about it in the first place. See, at the time, Facebook deployed around 7,500 human moderators to keep an eye out for fake news in countries across the globe. But almost none of them were monitoring Myanmar. Sagi again: I think, like, at the beginning, Facebook saw Myanmar as a new market that hasn't been explored, right?

[00:16:19]

So they came to the country without thinking of what the impact of Facebook would be for the population. If you look at the data, around 2014 Facebook had only about two or three content moderators for the country. For the whole country? For the whole country.

[00:16:33]

Two or three people? Two or three people, for a country of, you know, 60 million people.

[00:16:37]

Yeah, two or three people who monitored hate content and reviewed the content on Facebook.

[00:16:44]

None of those monitors actually lived in Myanmar, by the way. The company didn't even have an office there.

[00:16:51]

In other words, Facebook had become the dominant source of information in the entire country, but civilians, many of whom were totally new to the Internet, were expected to police the country's millions of posts. Not surprisingly, that wasn't enough to stop the disinformation. It continued on Facebook all through the summer of 2017, ginning up hatred on both sides of the conflict.

[00:17:16]

I think what's scary is not just what they said; they also call for action in that message. Right. It's not just "I hate the Rohingya."

[00:17:24]

They also asked people to do something about that. Yeah, that's the scary part.

[00:17:30]

With predictable consequences.

[00:17:33]

The world's fastest-growing humanitarian crisis, as thousands of Rohingya refugees are spending a fourth night stranded near the border with Bangladesh. In August, Rohingya militants attacked police checkpoints.

[00:17:47]

In response, Buddhist extremists and the military attacked Rohingya villages, burning their homes, killing people, and driving hundreds of thousands of men, women and children out of Myanmar. The U.N. called it a textbook case of ethnic cleansing. A director at Human Rights Watch in Asia placed the blame partly on Facebook for letting its platform become a clearinghouse for propaganda. He called the company an absentee landlord. Since then, Facebook's been shamed into hiring more content monitors in Myanmar, and it's made efforts to flag and limit posts that might be disinformation.

[00:18:26]

But it still doesn't have an office in the country, where monitors could be attuned to what's happening on the ground. And most importantly, Facebook's algorithms are still designed to favor the sensationalist posts that tend to attract views, mainlining hate into users' news feeds. Meanwhile, even more of Myanmar's people have joined Facebook. And there's an election coming up. I think technology will play a big role in the upcoming 2020 election. Yeah, and I'm a bit scared, actually, from what we saw with what happened with the Rohingya. Because in this upcoming 2020 election, we know that most of the young people will be on social media, and the political parties will use social media to get their message across.

[00:19:13]

And we know that there are going to be a lot of bad actors playing a big role in this.

[00:19:19]

Myanmar is an extreme case of the damage that disinformation can do. But wherever you live in the world, the chances are you are consuming disinformation on social media. So if companies like Facebook won't protect us from disinformation, and if more and more governments are propagating it, what's to be done? That's where we come in. Stay with us. Missing America is brought to you by Policygenius. September is National Life Insurance Awareness Month. With everything going on, a lot of people aren't even aware that it's possible to buy life insurance at all.

[00:20:03]

The good news is it's still easy to shop for life insurance right now, and if you have loved ones depending on your income, you probably should. Right now, you could save $1,500 or more a year by using Policygenius to compare life insurance policies. When you're shopping for a policy that could last for a decade or more, those savings really start to add up. What is Policygenius? It's an insurance marketplace built and backed by a team of industry experts.

[00:20:31]

Here's how it works. Step one: head to Policygenius.com.

[00:20:35]

In minutes, you can work out how much coverage you need and compare quotes from top insurers to find your best price. Step two: apply for your lowest price. Step three: the Policygenius team will handle all the paperwork and the red tape. Policygenius works for you, not the insurance company. So if you hit any speed bumps during the application process, they'll take care of everything. They even have policies which allow eligible customers to skip the in-person medical exam and do it over the phone.

[00:21:04]

That kind of service has earned Policygenius a five-star rating across over six hundred reviews on Trustpilot and Google. So if you need life insurance, head to Policygenius.com right now to get started. You could save $1,500 or more a year by comparing quotes on their marketplace. Policygenius: when it comes to insurance, it's nice to get it right. Missing America is brought to you by Sun Basket. If you've been listening to every episode of Missing America, and I hope you have, you know we're fans of Sun Basket over here.

[00:21:34]

It is the way to skip the grocery store and eat delicious, healthy food without having to go out. With Sun Basket, get fresh, ready meals delivered to you each week. Sun Basket meals are made with fresh organic produce, sustainable seafood, and meats that are free of antibiotics, hormones and steroids. None of the bad stuff, all the good stuff. Their chefs have won Michelin awards and James Beard awards, so you can take the night off and let them cook for you.

[00:22:00]

Here are the greatest hits: the pappardelle pasta with wilted spinach, sweet peas and fresh ricotta; the Southwestern turkey and sweet potato skillet; the cauliflower mac and cheese; and many, many more. That's just the tip of the iceberg. The meals come prepared and heat up in as little as six minutes. They're ready to heat and eat, which means no mess in your kitchen. There are paleo, vegetarian, Mediterranean and gluten-free options, too. I've talked to you about the butter chicken.

[00:22:26]

Talked to you about my cauliflower stir-fry. The last meal I made from Sun Basket was the Hanoi steak stir-fry with zucchini and peppers. It's hard to get that taste of good Vietnamese food in your kitchen, but with this meal,

[00:22:38]

I got it in less than 30 minutes, because it was already done for me by Sun Basket. So check it out. Right now, Sun Basket is offering $35 off your order when you go to sunbasket.com/missing and enter promo code MISSING at checkout. That's sunbasket.com/missing, promo code MISSING at checkout, for $35 off your order. So when I tell you the United States needs to lead the fight against disinformation, I know it might be hard to fathom why, especially since it's now pretty well known that our own president's reelection campaign is busy spreading disinformation on a daily basis.

[00:23:25]

The Times reports that a consultant for Team Trump has created a fake campaign website for Biden. That's unflattering, to say the least. And in the last three months, that fake website has become the most popular Biden website on the Internet.

[00:23:40]

But I have to remind you: Facebook, Twitter, Google, all these platforms Russia used to hack our last election, the same platforms Myanmar's military used to spread hateful lies, are almost all created and run by American companies. It's not only our responsibility to rein in the Frankenstein monster we've created; we're the ones best positioned to do it. Even if a lot of the best ideas for how to do it happen to come from overseas. Marietje Schaake spent 10 years representing the Netherlands in the European Parliament, and her advice to us is to start doing something Europe's already taken steps toward doing: regulate social media platforms. Because why wouldn't we?

[00:24:30]

In many ways, it is almost unprecedented, and hard to imagine, how such a powerful sector, with billions and billions of dollars in revenue and profit, has remained so unregulated.

[00:24:44]

And I think regulation is inevitable, not for its own sake, but to preserve the rule of law online, to preserve democratic rights and human rights online. And logically, Americans should lead in these regulations, if they believe that those values are integral to their quality of life.

[00:25:02]

There's a big obstacle to regulating these companies, though: America's current president is as eager to weaponize regulations as he is disinformation itself. Remember his executive order this May, aimed at regulating social media? They try to silence views that they disagree with by selectively applying a fact check. The censorship and bias is a threat to freedom itself. Therefore, today I'm signing an executive order to protect and uphold the free speech and rights of the American people.

[00:25:34]

Of course, Trump's order was intended to cow social media companies, so they'd let more right-wing disinformation, including his own, continue to spread unchecked. A Wild West where any attempt to clamp down on hateful lies is called censorship. Trump's order probably won't take effect. Crafting regulations is Congress's job. But in hearings with social media CEOs, some members of Congress barely seem to understand how these platforms work in the first place.

[00:26:04]

How do you sustain a business model in which users don't pay for your service? Senator, we run ads. I see. That's great. So how many data categories do you store on the categories that you collect? Senator, can you clarify what you mean? You understand this better than I do.

[00:26:24]

But maybe you can explain to me why that's complicated.

[00:26:27]

So in this fight, progressives will have to make sure that Congress understands the scale of the problem and the need for action. And then, Schaake says, we have to frame the debate over regulation carefully: not as an attempt to censor, but to protect.

[00:26:43]

The way in which I think about the role of government is not to regulate a platform or what people call regulate the Internet, but to actually regulate for free expression, to regulate against discrimination, to regulate for fair competition. And these, you know, non-discrimination, freedom of expression, fair competition, they are actually not that controversial in our society.

[00:27:07]

And how do you regulate a platform like Facebook for freedom of expression without restricting that expression? Schaake says you don't regulate the speech. You regulate the algorithms that determine what kind of speech is rewarded.

[00:27:22]

I think we can't live in a world anymore where we think the technology is something that is neutral.

[00:27:27]

Paul Duan agrees with Schaake. He used to work in Silicon Valley; now he's a tech activist and entrepreneur in Paris. He'll give you one example.

[00:27:36]

If you look at Facebook's news ranking algorithms, if you look at the way that Google will rank news, or YouTube, very often they will say that the algorithm is neutral, because it's just promoting the videos that the A.I. has decided will have the most engagement and will bring in the most ad revenue. The reality is, that's not true, right? Because the content that tends to drive more engagement tends to be content that is more controversial, content that is more sensationalistic.

[00:28:04]

Conspiracy theories, right? Yeah.

[00:28:06]

And so in this case, you have a direct conflict between what the market is optimizing for and what we may say are the values of our societies.

[00:28:16]

In other words, social media algorithms push disinformation to maximize profits, regardless of whether it's destroying society. So regulating these algorithms is in the public interest, and not just ours, because disinformation is an international plague. In 2019, EU countries created something called the Rapid Alert System to identify and squash disinformation campaigns before they can spread. The system was triggered a few months ago, when Europe was flooded with disinformation about the coronavirus. This kind of international cooperation is going to be increasingly important. So says Graham Brookie.
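The dynamic Duan describes, a ranker that optimizes purely for predicted engagement and therefore surfaces sensational content first, can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration; the post titles, scores, and the `rank_by_engagement` function are invented for this example and do not describe any platform's actual system.

```python
# Toy sketch of engagement-based feed ranking (hypothetical, illustrative only).
# Each post carries a predicted-engagement score. Sensational content tends to
# score higher on engagement, so a ranker that optimizes engagement alone
# pushes it to the top of the feed, exactly the conflict Duan points out.

def rank_by_engagement(posts):
    """Return posts sorted by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

posts = [
    {"title": "City council passes annual budget", "predicted_engagement": 0.12, "sensational": False},
    {"title": "SHOCKING conspiracy EXPOSED!!!",    "predicted_engagement": 0.91, "sensational": True},
    {"title": "Local school wins science fair",    "predicted_engagement": 0.08, "sensational": False},
]

feed = rank_by_engagement(posts)
# The sensational post lands at the top of the feed, even though nothing in
# the code mentions content at all: "neutral" optimization rewards outrage.
```

The point of the sketch is that the ranker contains no explicit preference for conspiracy theories; the bias comes entirely from what the engagement signal correlates with.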

[00:28:57]

He's director of the Digital Forensic Research Lab, which exposes and explains disinformation campaigns worldwide.

[00:29:03]

Disinformation is a collective problem, and it doesn't respect our neatly defined borders. So with the problem of disinformation, we're always looking for areas where we can create global standards. Whether that's, you know, data protection in Europe that the companies have to apply to the entire world, or telecoms policy in Brazil that now has to be applied to Europe or to India or wherever.

[00:29:28]

A couple of years ago, the European Union drafted a version of this kind of international set of standards.

[00:29:35]

The EU Code of Practice was supposed to compel social media platforms in Europe to police disinformation more vigorously and respond to it faster. Facebook, Twitter and other platforms all signed on to the code, but it hasn't had that much effect, not least because compliance is voluntary. Similarly, a slew of big companies voluntarily pulled their advertising from Facebook this summer. It was a boycott, after Facebook let lies spread across the platform in the wake of the Black Lives Matter protests.

[00:30:12]

But what's preventing them from plunging their advertising dollars right back into Facebook as soon as Black Lives Matter isn't front-page news? Mark Zuckerberg himself reportedly told his staffers advertisers would return, quote, soon enough. Civil rights groups say that they remain unconvinced that Facebook is doing enough to combat hate speech on its platform.

[00:30:32]

Representatives discussed the company's handling of hate speech with Mark Zuckerberg and other Facebook executives on Tuesday. Many of them called the meeting, quote, disappointing. And it was just last week

[00:30:44]

that a self-described militia on Facebook posted a call to arms against BLM protesters in Kenosha, Wisconsin. Just like Jes Petersen in Myanmar, users warned Facebook that the post could lead to violence. And Facebook did finally take down the malicious page, after this happened.

[00:31:04]

Two people were killed during another night of Black Lives Matter protests in Kenosha, Wisconsin. Investigators say it may have been a vigilante attack carried out by a young white man.

[00:31:15]

This is why regulation needs to come from the government and be policed by it. And the regulations had better have teeth. Jake Sullivan: Well, particularly for the social media platforms, it seems to me that we have to be talking seriously about removing the immunity they have under U.S. law from being held liable for the content on their sites. Facebook is receiving massive amounts of revenue from the advertising it's selling on every one of these pages being loaded, as these conspiracy theories and information warfare operations are being spread, and they should have to take some responsibility for it.

[00:31:52]

Now, imagine applying that principle internationally. How might Facebook have responded differently in Myanmar if it could have been held legally liable for spreading hate that contributed to the ethnic cleansing of an entire people? So how can America lead if we have a disinformer-in-chief sitting in the White House? It won't surprise you that I believe we cannot solve this problem so long as Donald Trump is president. But I also know people around the world are more aware than ever that disinformation is not a tinfoil-hat conspiracy theory.

[00:32:29]

They're more aware than ever that social media is not always good for us. And that means, if this election goes our way, there's hope to make social media better. Jake Sullivan again: In a way, we've been a few years behind the curve of Europe, and we're just now catching up on everything from privacy to extremist content online to disinformation. And so I do think there is a real opportunity, if you've got a different president, to have a conversation with the Europeans about some common set of regulatory approaches that would allow people to enjoy and make use of social media, but that would curb some of these excesses and these abuses.

[00:33:09]

Second of all, I've said it before on the show and it bears repeating: getting rid of Trump won't solve our problems. That's just the beginning of our work. Remember where we started: Russia's attack on the 2016 election. That happened in part because our government didn't have the right tools to respond. But the hard truth is, we were an easy mark. We were polarized. We had right-wing media eager to echo the disinformation, and mainstream media that couldn't resist reporting on it.

[00:33:41]

Russia just poured gasoline on fires that were already there. Graham Brookie: The Internet's not written in stone. You've got to engage at all times. And the other side is really, really good at engaging at all times on stuff that's easy, and a lot of times vitriolic, and viral. I mean, the term "going viral" is an informative term here. So how do you beat vitriolic content that's going viral? It's not just regulation. The first thing is, you've got to have a better story.

[00:34:12]

Yeah, just full stop. If you don't have a better story, you're competing with a backward-looking negative story about how dark and nasty the world is and how scary other people are. Sure, government has a role to play, but so do citizens: by rejecting disinformation and lies, by resisting the hopelessness and apathy that disinformation breeds, and by telling stories that recognize our shared humanity. At our best, that's what America has always done. Think about the United States of America: we have a really good story.

[00:34:50]

Called the Declaration of Independence. We hold these truths to be self-evident, that all men are created equal, that we're endowed with certain inalienable rights, that among these are life, liberty and the pursuit of happiness. It was just a good story that they were telling about what could be. And then people were attracted to that story.

[00:35:13]

And that led to independence, and it led to immigrants from around the world who wanted that vision for themselves.

[00:35:22]

We get to choose what stories to believe, what stories we tell, what stories we spread. After all, whether we live in America or Russia or Myanmar, we're all just people, with the choice to be kind or cruel, to do good or evil, whatever the platform. Sagi leaves us with this: I do think that is really normal. When we have some new thing come in, we don't know how to respond to that. There's no law or regulation for Facebook.

[00:35:51]

Right. But I do believe in the power of citizens and government and society. If we come together, we will find a way to respond to this. She's right. Even for Americans, we're still in the early days of social media. Facebook has only existed for 16 years. But we have to develop the antibodies against disinformation before it's too late. This week, we talked about how disinformation made the sectarian resentment in one country worse. Next week, we'll explain how sectarian resentment gets built in the first place, and why we're seeing more and more of it around the world.

[00:36:38]

Overnight, perfectly normal people turned into savages. People who you called your best friends, your neighbors. As Trump and the GOP polarize America, how can we lead the charge against dangerous polarization abroad? The problem, and some solutions, on our next episode. Missing America is written and hosted by me, Ben Rhodes. It's a production of Crooked Media.

[00:37:11]

The show is produced by Andrea Gardner Bernstein.

[00:37:14]

Rico Gagliano is our story editor. Austin Fisher is our associate producer. Sound design and mixing by Daniel Ramirez. Production support and research from Nimmy Brewery and Sydney Rapp. Fact-checking by Justin Kozko. Original music by Marty Fallot. The executive producers are Sarah Geissman, Larry Smith and Tanya Somanader.

[00:37:38]

Special thanks to Alison Falzetta, Tommy Vietor, Jon Lovett and Dan Pfeiffer. Thanks for listening.