[00:00:00]

This is Sam Dolnick, I'm an assistant managing editor at The New York Times, our newsroom has been empty since March, but we've been busier than ever before. The pandemic has changed how we work, but it hasn't changed what we do. This is why we became journalists to bring to light real verified information when the stakes couldn't be higher. We can't do this work without our subscribers. If you'd like to subscribe, please go to NY Times dot com slash subscribe and thanks.

[00:00:30]

From The New York Times, I'm Michael Barbaro. This is The Daily.

[00:00:40]

Today, the nation's biggest social media companies are determined to avoid the mistakes that they made during the 2016 election. But in the process, they've ignited a different kind of firestorm. My colleague Kevin Roose reports from San Francisco. It's Wednesday, October 21st. You know, Kevin, actually, it's been ages since we had you on the show. Ages. Yeah. And I wonder whose fault that is.

[00:01:16]

You don't call. You don't write. I thought we had something.

[00:01:20]

I think it's that you didn't intersect with the news. That's true. But here you are intersecting. There's been a lot going on.

[00:01:28]

So, Kevin, my sense is that the big social media companies, which you have been covering for a really long time, have been preparing very diligently, very carefully, very expensively for the 2020 election and for the possibility of a major moment of misinformation, for some kind of act of interference, given how much they failed to do that back in 2016. So can you summarize what those preparations have looked like since 2016?

[00:01:58]

These three big social media companies, Facebook, Twitter, YouTube, have invested tons of time and money in trying to keep foreign interference from happening on their platforms again. There have been new policies, new teams. They've hired tons of new people and moderators. And basically their goal is to just avoid being played again, to avoid the kind of foreign interference attempt that they were seen as having allowed on their platforms in 2016.

[00:02:32]

Right. And my sense is that beyond just being alert to this, they want to be more responsive, right? They want to be less hands off. So if something happens, the goal is not just to notice it, but to actually do something. Is that right? Exactly.

[00:02:48]

They want to act in a way that is consistent with their policies, but also that is fast. That gets to these problems before they spiral out of control and become huge election interference issues.

[00:03:03]

It's not just what they do. It's also how quickly they're able to do it.

[00:03:07]

So I guess, in a sense, in the weeks leading up to the election, they are kind of just waiting for the first big test case for these systems. Yes.

[00:03:17]

But the kind of fear that they've had is this kind of October surprise, this introduction of something new, some attempt to kind of steer the narrative about the election at the last possible moment. And that came for them last week in the form of this New York Post article that appeared on Wednesday morning.

[00:03:39]

And in broad strokes, what was this? So at a sort of broad level, what The Post publishes is a story alleging that there's this laptop that was used by Hunter Biden, the son of presidential candidate Joe Biden, that made its way to investigators and ultimately to Rudy Giuliani, the president's lawyer, and that when they inspected this computer, it had emails on it that the Post described as incriminating. Now, there are still lots of questions about these emails.

[00:04:12]

We have not been able to verify their authenticity. We don't know who actually got access to them and how they made their way up to Rudy Giuliani. There are still so many questions about the provenance of these materials and whether or not they're real. But that's just the sort of nutshell version of what the Post publishes on Wednesday morning. And so what are the big social media companies, the three you identified, Twitter, Facebook, YouTube? What are they thinking?

[00:04:43]

in their very well-prepared offices, about this story?

[00:04:49]

Well, I've spoken to some people at the companies, and basically what they were thinking is: here we go again. To them, it seemed to have a lot of the same hallmarks as the 2016 hack and leak campaign that Russia carried out, where they hacked into email inboxes belonging to prominent Democrats and released those emails in a coordinated fashion to try to steer the discussion around that year's election. And so when these platforms think about threats like this, they basically have three options for what to do next.

[00:05:21]

They can do nothing. They can let this story run its course and trust that, you know, people will work out the facts and that the systems will work as designed. Mm hmm. They can step in really aggressively and essentially ban the story from their platforms and say, we're going to take this down. We're not going to put any links to it. You're not going to be allowed to link to it. And we're going to lock your account if you do link to it.

[00:05:43]

And that's sort of the nuclear option. And then there's all these decisions that they can make in the middle of that between doing nothing and taking it down. And those would include things like putting a label on it or putting a little fact check underneath it or reducing its distribution through their algorithms rather than banning it outright. And these platforms know that they have to do something quickly and that whatever decision they make, the longer they wait, the harder it's going to get to restrain or to reel in this narrative that is already starting to go viral.

[00:06:19]

Mm hmm.

[00:06:20]

So how did these companies react in that very short period of time? What did each of them do?

[00:06:25]

So the initial reaction from these platforms is a little bit all over the place.

[00:06:29]

So YouTube basically does nothing. They say: given what we know about it right now, this story is allowed on YouTube, and we'll continue to evaluate it. Next, we have Facebook, and they come out several hours after this story is posted. And they say that they are going to demote the story, basically slow its spread in their algorithms, until it can be evaluated by third-party fact checkers. So they're basically saying, we're going to pump the brakes on this story until the people that we trust to determine whether or not these things are true can look at it. So they don't block it.

[00:07:09]

They're kind of throwing a sheet over it. They're not blocking it. They're just kind of putting it on ice for the moment. And then you have Twitter, which makes the most aggressive call in the early hours after this story is posted. They say: we are not even going to allow people to link to this story, because it violates our rules against sharing private information, because there was some private information contained in these emails that The Post had excerpted, and also because it was a violation of their hacked materials policy.

[00:07:42]

So they were basically treating this as if it were a hacker sharing some passwords that they had gotten from some data dump.

[00:07:51]

So Twitter is taking the kind of nuclear option, as you described it.

[00:07:54]

Not only is Twitter banning people from linking to this story, but it's locking the accounts of the people who do link to it, including some pretty prominent people like Kayleigh McEnany, the White House press secretary.

[00:08:08]

And for Twitter, this is a pretty big enforcement action. They're basically taking no chances with this.

[00:08:18]

Right. So they are choosing to do the exact thing that everyone said social media companies did not do in 2016, when information of dubious origin began circulating, for example, John Podesta's emails. They are just clipping this story's wings. They are making sure it cannot be shared.

[00:08:36]

Right. And their theory on why they're doing this is that it's better for them to act too aggressively and let up later than to let something go and then try to catch up to it.

[00:08:48]

Mm hmm. And so, Kevin, what is the reaction to this decision by these companies?

[00:08:54]

So among many people, including a lot of people on the left, frankly, there was relief that, after years of criticism for not having done enough to stop interference in the 2016 election, the platforms were being attentive and proactive, doing something rather than nothing to avert a potential similar crisis this year.

[00:09:20]

But then there were people, many on the right. This is a dark moment.

[00:09:24]

This was mass censorship on a scale that America has never experienced, not in 245 years.

[00:09:29]

It's both insidious and infuriating. People who were very offended by these decisions by the platforms. Make no mistake, Twitter, Facebook, they are not arbiters of truth. They're all engaging in censorship, so you're kept in the dark. Cold, calculated political actors.

[00:09:50]

Josh Hawley, the senator from Missouri, one of the most vocal critics of these big tech platforms, came out, saying, "If Republicans don't stand up and do something about this, these companies are going to run this country," and said that this was the dawn of a dangerous new era in American history.

[00:10:09]

Silencing the media is a direct violation of the principles of the First Amendment. Ted Cruz, senator from Texas, also said that this was an affront to free speech.

[00:10:21]

We're seeing Silicon Valley billionaires, frankly, drunk with power.

[00:10:26]

So the entire sort of Republican establishment goes nuts over this. They are calling for subpoenas against the leaders of Twitter and Facebook.

[00:10:38]

The Senate Republicans need to ask Jack Dorsey, what is your policy so we can decide whether or not we're going to start regulating this.

[00:10:45]

They are calling for legal protections for these platforms to be revoked.

[00:10:48]

You're not a real platform. You're just another liberal editor.

[00:10:52]

And they are essentially treating these companies as if they are themselves interfering in a U.S. election by acting to prevent a possible interference attempt.

[00:11:03]

And let me be very clear. Twitter is interfering in this election. Big tech.

[00:11:08]

They want to run America. We've got to stop them. We've got to do something right now. They are interfering by trying to stop interference. Complicated. Very, yeah. The whole thing is crazy. And of course, finally, the president himself weighs in. It's like a third arm, maybe a first arm, of the DNC. And he attacks

[00:11:37]

these companies and repeats his call for the repeal of the legal protections that these companies have.

[00:11:44]

But it's going to all end up in a big lawsuit.

[00:11:46]

And there are things that can happen that are very severe. And he basically says that they are trying to rig the election for Joe Biden by suppressing this story. So this does not go smoothly. No, it's kind of a damned if you do, damned if you don't situation for them, where they're criticized for not doing enough to protect against election interference, but then, when they try to act to protect against potential election interference, they're criticized for that, too. So all this criticism, all these questions surrounding this story, really leads them to try to figure out, like, did we make the right call here?

[00:12:24]

And essentially Facebook and YouTube sort of stick by their decisions, like YouTube doesn't take down content. Facebook continues to kind of limit this content without blocking it entirely.

[00:12:36]

But Twitter starts actually reversing its original decision. They sort of land on this position of we think that this article has been so widely discussed that we no longer think it makes sense to block links to it. But in the future, we will put labels on materials that might have been hacked. So we will essentially take a middle ground position on stories like this in the future.

[00:13:05]

Mm hmm.

[00:13:05]

And I think, from the platforms' point of view, this story and the issues it raised were actually a little more complicated than they originally thought.

[00:13:36]

But you could use a snack right about now. How about a toasty grilled cheese sandwich? Just be warned, if you happen to achieve gooey, cheesy perfection, you may be inspired to upgrade your tiny, drab kitchen. Only you won't be able to do it alone. In this moment of newfound passion, the people of U.S. Bank want to help. No matter what you're cooking up, they're dedicated to turning your new inspiration into your next pursuit. U.S. Bank, equal housing lender, member FDIC.

[00:14:06]

So, Kevin, what made this New York Post story, as you just said, such a uniquely complicated situation for these big social media companies?

[00:14:16]

So I think all the platforms, Facebook, YouTube, Twitter, would agree that foreign election interference is bad, that it's part of their job, and something they're very committed to doing, to stop foreign interference attempts. But foreign election interference is not always very obvious.

[00:14:35]

So in 2016, you had trolls in St. Petersburg who were literally buying Facebook ads in rubles. But there are much more subtle ways that a government or an entity could try to influence a U.S. election.

[00:14:50]

And one possibility is that they could use a hack and leak operation where they steal information and then distribute it.

[00:15:00]

But instead of going directly out with it or through an organization like WikiLeaks, they could go through a major American news organization like The New York Post.

[00:15:10]

And these platforms, they don't find it particularly hard to take action against sort of cyber attacks and things that are, you know, pretty blatantly trying to manipulate their services. But in this case, it's more like trying to figure out if The New York Post can be trusted or not, which is a very uncomfortable position for them to be in.

[00:15:34]

Right. Because all of a sudden a company like Twitter or Facebook would be in the position, perhaps throughout an election, of routinely deciding whose confidential sources are trustworthy and whose articles based on documents should be allowed to have their links shared or blocked. And that could become a pretty slippery slope pretty quickly.

[00:15:58]

Yeah, and after Twitter made its decision to block links to the New York Post story, they heard from a lot of journalists, not just in the U.S. but around the world, who worried that this policy of not allowing links to hacked materials could endanger their ability to report on things involving confidential sources and whistleblowers. And that's part of the reason that they walked back their initial call. Right, I mean, at the end of the day, I think it's safe to say we don't want social media companies to be the gatekeepers of journalism.

[00:16:31]

We want journalists to produce good journalism that applies careful standards and thoughtful judgments to what information gets out and where it comes from. And I guess this is still an open question.

[00:16:43]

The New York Post may not have done that in this case.

[00:16:46]

There's still a lot we don't know, to be clear. I mean, our colleagues have done some reporting and there seem to be some sort of red flags in the process behind this particular story. Apparently, the reporter who wrote most of the story didn't want his byline attached to it. Things like that. Not a good sign, not a good sign. So very unorthodox process behind the story. But I think it does point to the fact that these platforms are reluctantly being asked to not just sort of keep their platforms from being manipulated, but in cases like this to kind of referee journalism in a way that they're very uncomfortable doing.

[00:17:23]

And that arguably shouldn't be their job at all. But because of the nature of these hack and leak campaigns, they sometimes have to. So, Kevin, listening to you,

[00:17:34]

I'm wondering: are these social media companies and their policies on this kind of content making journalists

[00:17:42]

approach this with greater rigor, and perhaps with the fear that they would publish something that would be blocked or deplatformed? And would that become an incentive for everyone in the news media, who of course want their content to be shared, to behave better? I don't think the platforms set out to improve journalism. I don't think that's one of their goals here.

[00:18:09]

But I think in this case, they may actually have done reporters a favor by coming out early and aggressively to say there is something fishy about this story and their decision to act on it changed the tenor of some of the coverage that followed.

[00:18:28]

Instead of just repeating what was in these alleged emails, the story became sort of about process and journalistic rigor and platform governance and all of these sort of meta topics that I think ultimately shed more light on what happened than just going back through the emails and printing the most salacious parts.

[00:18:51]

And that's very different from what happened in 2016 with the emails stolen from the Democratic Party. And again, we don't know if any emails were stolen from anybody in this case, but back in 2016, those emails were made public. And news organizations, including The New York Times, including me as a reporter, went through them and generated stories about them. And it wasn't about where they had come from. It was about what they had revealed about the characters in the emails.

[00:19:17]

Right. And I think that points to one of the major shifts that has happened since 2016. It's not just the platforms that have been worried about a repeat hack and leak operation and making plans for what they're going to do. It's also news organizations. At the Times,

[00:19:33]

I and a group of other disinformation reporters and researchers put together a list of guidelines for how we would handle a hack and leak that resembled the one we saw in 2016.

[00:19:45]

Mm hmm. And is there anything you can say about that process? Yeah, it's a five-step process, and we gave it a silly acronym: it's the EMAIL method.

[00:19:57]

Of course. It stands for evidence, motive, activity, intent and labels. And it's just the kind of thing that we are considering internally as sort of a checklist for the process that we go through when something like a hack and leak does emerge.

[00:20:14]

Kevin, I'm curious, with all the diligence from news organizations like The Times and the crackdown by the big social media companies on this New York Post story, did it work? Did it, at the end of the day, limit the reach of this story, whose origins we are still trying to figure out and which are suspicious?

[00:20:37]

Yes and no. I think just in pure statistical terms, like the story still traveled very widely.

[00:20:43]

It was among the highest performing articles on social media that day. It's been getting wall-to-wall coverage on Fox News and from other right wing media outlets. So in that sense, if the goal was to reduce the visibility of the story, I think the answer is, no, it didn't stop this story from getting out or being discussed, and it may have, in fact, drawn more attention to it than it otherwise would have gotten.

[00:21:11]

But I think the thing that did change is the kind of attention that was being paid to the story.

[00:21:16]

And it represents, I think, a real break from 2016 when the story became all about the emails, all about John Podesta and the DNC and Hillary Clinton when we were late, frankly, to turn our attention to the bigger story behind that story, which was that Russia was trying to interfere in our election.

[00:21:39]

Hmm. So, a bit messily and with some high-profile reversals, with plenty of angry partisans, and with a story that was supposed to be losing steam but that has definitely made its way around the Internet, are we seeing some meaningful improvements in this social media system over 2016? Yeah, I think we are. And all of this is super messy and complicated, and there will no doubt continue to be mistakes and reversals and people claiming partisan bias and threats to free speech. All of that's going to continue.

[00:22:21]

And, you know, there are still two weeks until the election. So we don't know what could happen between now and then.

[00:22:27]

But right now, I think the big picture is that we are all now much more aware of how we can be manipulated, whether we are executives at a social platform like Facebook, Twitter or YouTube, whether it's us as journalists at The New York Times or whether it's just voters, people who consume the news and are trying to make sense of what's happening. I think we are all now much more conscious than we were in 2016 of all of the ways that we could be manipulated or tricked or baited or taken advantage of.

[00:23:02]

And I think that increased awareness, that consciousness of the risks that we face is a good thing no matter what happens with any one particular story.

[00:23:24]

Thank you, Kevin. Appreciate it. Thanks for having me. We'll be right back.

[00:24:12]

Here's what else you need to know today. On Tuesday, the U.S. Department of Justice sued Google for allegedly violating antitrust law, accusing it of maintaining an illegal monopoly over search and search advertising.

[00:24:35]

The lawsuit represents the most significant legal challenge to a tech company's market power in a generation.

[00:24:43]

Google achieved some success in its early years, and no one begrudges that. But as the antitrust complaint filed today explains, it has maintained its monopoly power through exclusionary practices that are harmful to competition.

[00:24:57]

The lawsuit accuses Google and its parent company of using exclusive business contracts and agreements to lock out rivals. One such contract paid Apple billions of dollars to make Google the default search engine for iPhones. As a result, the lawsuit said, both competition and innovation suffer.

[00:25:19]

So the Justice Department has determined that an antitrust response is necessary to benefit consumers. If the government does not enforce the antitrust laws to enable competition, we could lose the next wave of innovation. If that happens, Americans may never get to see the next Google.

[00:25:39]

In a statement, Google called the lawsuit deeply flawed and said that rather than helping consumers, it would hurt them.

[00:25:55]

That's it for the day. I'm Michael Barbaro. See you tomorrow.
