
Transcript

[00:00:05]

Tell me what Internet is. Can you explain what Internet is? What do you write to it like mail? How do I become a member of Internet?

[00:00:15]

Now, as you may or may not know, I have poured 12 years of my life into my YouTube channel, this YouTube channel. Heart, mind, body, soul, money, resources, travel, expertise. And I stand by what I have done.

[00:00:32]

And then let me tell you what happened in a 24-hour period. I have a strike against my channel. OK, so Kevin, not long after you and I went down to West Virginia and talked to Caleb, things at YouTube started changing, right?

[00:00:54]

Apparently, you see, I had violated community standards. So as stories like Caleb's started coming out about how YouTube and its recommendation algorithm may be pushing people toward these extreme views. This was a direct shot across the bow.

[00:01:12]

YouTube has started telling some video makers that their content is against YouTube's new policies.

[00:01:17]

What they want to do is to have me start to self-censor. Like Stefan Molyneux.

[00:01:22]

And so I've just been told I'm on double secret probation. Gavin McInnes. Some sort of thirty-day watch on YouTube where if I do anything wrong, I'm done forever.

[00:01:32]

People who had been associated with the alt-right. What content could possibly be acceptable?

[00:01:38]

It also affected people like Steven Crowder. The demonetization, specifically targeting our opinions as unpopular speech. Who didn't really have the same association with the alt-right.

[00:01:48]

But just like Stefan Molyneux and Gavin McInnes, his videos got demonetized for being, quote, what they now term borderline.

[00:01:55]

Like they couldn't make any ad money off of their YouTube videos anymore.

[00:01:59]

Yeah, meaning it's gotten a lot harder for some of them to make a full time living making YouTube videos.

[00:02:04]

It renders us unable to create a show for you. And if it keeps going, the channel will cease to exist. I am going to need to figure out what to do. I am begging you. Philosophy begs you.

[00:02:16]

The future begs you. Please help out the show at freedomainradio.com. They are shutting down the discussion.

[00:02:24]

Other people, like Alex Jones. They disappeared me! Just got banned altogether.

[00:02:30]

If this isn't 1984, baby, I don't know what is. So, a few months ago. Give me, give me a level.

[00:02:43]

Kevin, our colleague Julia Longoria flew out to San Francisco. We are here in San Bruno, California, a little south of San Francisco. And the two of us drove down to YouTube's headquarters. That building that looks like a big insurance company or something.

[00:03:03]

That's YouTube and why are we here?

[00:03:08]

And we went there to talk to YouTube CEO Susan Wojcicki. She's someone who has seen kind of the entire evolution of the modern Internet. She's got the steering wheel of this giant, powerful artificial intelligence that runs YouTube.

[00:03:23]

Hey, guys, how's it going? Hey, do you guys have a suspect? We're here for an interview. Sorry I got you a noise. Thank you. Thanks.

[00:03:35]

And anything that she does with that power, it has all these downstream consequences. So a couple of years ago, I guess now they had a shooting here, a person who was a YouTube creator who was upset about her channel and ad money and things like that. She actually came on to this campus with a gun and shot three people and shot herself. It was a really horrible, traumatic thing for the people here. And so as a result, they've really stepped up security.

[00:04:08]

So I'm not surprised that we're getting scoped out by the security guys. You guys can follow.

[00:04:17]

OK, so they lead us inside.

[00:04:19]

So this is the lobby and we go down this set of back stairs.

[00:04:25]

I've been here, but not to the basement. It's just studios down here. And that's where we meet YouTube CEO Susan Wojcicki.

[00:04:35]

Hi. Hi to you.

[00:04:37]

Before we get into the interview itself, what is Susan like as a person? Like what was your first impression of her?

[00:04:44]

Let's grab you a chair. So how are you doing? OK, how are you?

[00:04:48]

She doesn't really have, like, tech CEO vibes.

[00:04:52]

I didn't major in computer science. I have a history and literature undergraduate degree. She doesn't really get the same kind of attention as Mark Zuckerberg or Jack Dorsey. What was the earliest memory for you of the Internet and what you thought it was? Well, I first saw the Internet probably in my late 20s. But she's actually been around the tech world for longer than either of them.

[00:05:13]

And of course, I remember Netscape. Like, Google actually started in her garage.

[00:05:19]

What does that mean? It's sort of a wild piece of Silicon Valley trivia. She was working at Intel and living in Silicon Valley. And these two Stanford grad students named Sergey Brin and Larry Page, they were looking for a place to house their brand new startup, this search engine called Google. And eventually they convinced her to come over and help them build it as a business. And she became Google's 16th employee.

[00:05:48]

And so, you know, then you became history's most successful landlord. And you, you know, you got to Google and started working in the tech world. And in 2014, you know, made the decision that you were going to come over and run YouTube.

[00:06:07]

And what do you remember about YouTube when you came in. Like what, what was it like?

[00:06:12]

Well, first of all, you know, when I was first asked whether or not I wanted to have this role, I had been running our advertising business. And so Larry asked me, Larry Page, and I immediately said, yes, I have always had a love for the visual arts. I was a photographer. I love creative. I had created Google's image search. And I could just see clear as day that YouTube had huge potential in front of it.

[00:06:36]

And so in many ways I thought, oh, I'll go to YouTube and I'll get to work with all these fun entertainment creators.

[00:06:43]

Probably seemed like you were going from a very serious job to a very fun job.

[00:06:46]

Yeah, YouTube at the time was really much more of an entertainment site.

[00:06:55]

It was not seen as a very serious site.

[00:07:01]

Yesterday, the Internet blew up over this video of a rat carrying a whole slice of pizza.

[00:07:07]

And actually one of the areas that I really pushed our team a lot on was actually freshness because I would get frustrated.

[00:07:13]

I would come to YouTube repeatedly, and I would keep seeing the same videos.

[00:07:19]

One of the areas I pushed them on, and have consistently pushed them on, is actually exploring new areas, which is, of course, the hardest area for us: to discover interests that you haven't necessarily told us you're interested in, or that you might not even know you're interested in.

[00:07:37]

But we think they're interesting. Like that exploratory part is a really important part.

[00:07:42]

And I do think we've gotten better at it and we certainly have gotten better at predicting what people are interested in.

[00:07:47]

So part of why I wanted to talk to Susan was to ask her about this 2015 to 2016 period, this period when YouTube was trying to engineer this new algorithm that ended up opening the door to more polarization and extreme views, and was also right when Caleb was getting introduced to all of these new characters. And it was kind of striking that almost right off the bat she owned the fact that she had driven that change.

[00:08:16]

I want to talk more about the recommendations thing. As you know, I've been very interested in it.

[00:08:20]

And one of the people I talked to and wrote about was a guy named Caleb Cain. I'd just like to sort of ask the most blunt possible question: what do you think of that story?

[00:08:33]

I mean, I thought of the story. You were trying to understand how recommendation systems were working and what the impact of them was. And that was.

[00:08:43]

And what was her take on Caleb's story? Well, she didn't really get into the specifics of his story. But. And it wasn't just you.

[00:08:51]

There are many people that have raised questions or concerns on how our recommendation systems work, and we wanted to make sure we're taking that seriously.

[00:09:00]

She did acknowledge that YouTube had taken from stories like his that they needed to start making some changes.

[00:09:06]

And we have taken it seriously. And I think, you know, we've made a lot of changes to. And I want to talk about all those changes, but I'm struck by the fact that you said that one of the challenges for the algorithm and your design of it was getting people to explore new interests. That strikes me as something that happened to him, where he went to YouTube looking for self-help videos. He was going through a rough patch. He wanted to find something to cheer him up.

[00:09:30]

And then he got introduced to these people who would do self-help, but then they would also talk about politics, and they would go on each other's shows. And they were creating this kind of bridge between, like, the self-help part of YouTube and the more political part of YouTube. Was that something that you observed happening on the platform?

[00:09:49]

I mean, it's interesting that you say that, because I guess I want to say, going back to my early initial days, that we couldn't get people interested in news or get people interested in politics. We had no indication this was something that people were interested in. Like, people were interested in gaming and music and entertainment. They came to laugh, they came to escape in many ways.

[00:10:11]

She went back to this idea that YouTube just wasn't seeing data that suggested that people were looking for politics on YouTube.

[00:10:20]

I acknowledge that there are many political commentators and a lot of political views that have emerged on the platform, but that was not an initial part of how YouTube worked. And I guess the only reason I'm bringing that up is because I do want to say that, you know, YouTube, for all its creators, for, you know, millions and millions of creators, there is a set of creators who do find success with that. But it's a small set.

[00:10:39]

And in the end, she said that essentially like this kind of political content, it's still a really small percentage of everything else on YouTube.

[00:10:50]

And is that true? Because it seems like there's a lot of it. I guess, I mean, there's no way to know for sure without having access to their internal data.

[00:11:02]

But like, even if it were true that only one percent of YouTube videos consisted of this kind of political content because YouTube is so big, that would still translate to millions and millions of people around the world watching it.

[00:11:18]

And we've talked to some people who were at YouTube earlier in its history, around the time that you came and even before then, who said, you know, that there was sort of this obsession with growth, that there was a very strong push to expand the watch time on the platform, and that any challenges that were brought to management around that, these things just weren't given a real hearing. Does that resonate with you at all?

[00:11:44]

And I asked about Guillaume, too, about the red flags that he had tried to raise back when he was at YouTube.

[00:11:49]

I mean, I think I've certainly heard people say that. And I. I mean. You know, like, I came from a company that was very focused on quality and quality of information. It was always, like, the most important thing. What we always prioritized was quality.

[00:12:10]

And she didn't really deny it. But she pivoted pretty fast to her work at Google. And this concept of quality, which is basically like the term that Google uses to talk about its search engine and how it wants people to get good, reliable information when they go looking for something.

[00:12:28]

And so I tried to figure out how do you reconcile that with a company that is a more entertainment based company? Right. So what does that mean to have quality? If the main thing that you're doing is gaming videos, cat videos and music videos, then what does that really mean for quality?

[00:12:49]

And I think what she's getting at is that at the time, like YouTube just didn't really think that it was capable of doing much harm.

[00:12:56]

And I guess the reason I'm bringing that up is that one of the biggest realizations for me was that we needed to start adapting and changing and that we needed a very different set of ways of managing information than you want to manage entertainment.

[00:13:11]

Was there a moment that crystallized that for you? Well. The moment celebration became a nightmare. In 2016. People were screaming, kids were crying. There was a terrorist attack that had happened in France. Terror tonight in the French resort town of Nice.

[00:13:31]

Susan says that in 2016, on Bastille Day in the coastal city of Nice, just as the fireworks ended, a truck plowed into the crowd. An ISIS terrorist attacked a crowd.

[00:13:43]

At least 84 people were killed, dozens of others were hurt, and the government there declared a state of emergency.

[00:13:49]

And I remember reading that and being, you know, just extremely upset and thinking our users need to know about it.

[00:13:58]

YouTube decided that for people in France, it was actually going to push news about the attack onto their home pages. The attack turned the streets into a scene of chaos and for people looking for information about the attack.

[00:14:11]

The mayor of Nice warning people to stay indoors tonight for fear of another terror attack.

[00:14:15]

They were actually going to prioritize videos from what they considered authoritative sources, but that didn't perform very well on our platform.

[00:14:25]

She says that basically the engineers at YouTube were telling her, like the users don't want to actually see it. No one is clicking on these videos. That's just not what they want to see.

[00:14:34]

And so what do you do as a platform? Do you show it to them anyway? And so that's actually I remember that very clearly because that was the first time I said to them, you know, it doesn't matter. We have a responsibility. Something happened in the world and it's important for our users to know.

[00:14:48]

And according to her, she said, basically, like, too bad, we're showing it to them anyway.

[00:14:53]

And it was the first time we started using the word responsibility, and the fact that we needed to put information on our site that was relevant, even if our users were not necessarily engaging with it in the same way that they would with the entertainment videos.

[00:15:06]

So if I'm understanding this right, what Susan is saying is that the Nice attack marked the first time that YouTube took this idea of responsibility and prioritized it over watch time. Right. At what point did that sense of responsibility extend beyond, like, a big news event?

[00:15:28]

It took a little while. I mean, I think over 2017 and 2018 and into 2019, I think pressure was starting to build on YouTube.

[00:15:38]

CNN reports that YouTube ran ads from large brands like Adidas, Amazon and Hershey before videos which promote extreme content.

[00:15:46]

There were reports in the Times and other places. You're about to meet a man who says he was radicalized by alt-right figures via their persuasive YouTube videos.

[00:15:57]

Regulators, parents. YouTube is one of the greatest engines of extremism that we might have ever created at this scale. Advertisers. They were all chiming in.

[00:16:09]

Good morning, everyone. The Subcommittee on Consumer Protection and Commerce will now come to order.

[00:16:15]

Congress actually held hearings about these Internet platforms.

[00:16:20]

Congress has unfortunately taken a laissez-faire approach to regulation of unfair and deceptive practices online over the past decade, and platforms have let them flourish. And invited experts, including former Google employees.

[00:16:37]

So there you are. You're about to hit play on a YouTube video, and you hit play, and then you think you're going to watch this one video, and then you wake up two hours later and say, oh, my God, what just happened? And the answer is because you had a supercomputer pointed at your brain. To talk about how YouTube was essentially built to pull people into these polarizing rabbit holes.

[00:16:56]

It's happening not by accident, but by design. And that's when YouTube started making these big changes. Last year. They disappeared me!

[00:17:05]

So I've just been told I'm on double secret probation.

[00:17:08]

Apparently, you see, I had violated community standards.

[00:17:12]

You know, one of the biggest changes that we made from our recommendation system, and it was probably one of the later changes we made.

[00:17:19]

And I think it's because it's a harder one to grapple with, which is that we realized that there was a set of content that, even if people were repeatedly engaging with it, we thought it was important to make the recommendations more diversified and to show them more quality content alongside.

[00:17:38]

So what she essentially said is that there's a certain kind of video where, like even if a lot of people are watching it and it's generating all this viewing time, the site will actually intervene.

[00:17:49]

You know, if a video has hate speech in it or if it's made by like a neo-Nazi who's denying the Holocaust happened, that will come totally down off YouTube. But there's this other class of videos that YouTube has a harder time categorizing.

[00:18:06]

You know, we began to identify content that we called borderline content and that if users were repeatedly engaging with this borderline.

[00:18:14]

Content, we will show them other content alongside it that we've ranked as more high quality. And these videos, like, they don't explicitly violate YouTube's rules, but they're also, like, not the kind of thing that YouTube wants to boost and recommend.

[00:18:29]

So what they've done is essentially to train algorithms to identify these kinds of videos and then demote them in people's feeds and in their sidebars so that they don't get seen as often.

[00:18:44]

And what that has done is it's actually had a 70 percent reduction of views of borderline content in the U.S. And we've been in the process of rolling that out globally. It's in most English language countries right now and a few large markets like France and Germany. But we continue to roll that out.

[00:19:02]

And how do you define like borderline like what's borderline versus just, you know, edgy jokes? How do you sort of separate the two?

[00:19:10]

Yeah, I mean, it is a complicated and nuanced area. And I mean, we have a whole process and an algorithm to define it.

[00:19:17]

Right. And I guess what I'm wondering is, like, these are sort of interventions in an automated system, right? Like, you set up this automated system, let it run, realized, like, some of the things that it produces, realized that maybe those aren't creating an optimal experience for people. And then humans come in and sort of tinker with it to produce different results. Do you feel like, when you. Hopefully we do more than tinker.

[00:19:40]

Well, we really have a lot of advanced technology, of course, and scientists to make sure we do it right. Right.

[00:19:46]

I guess a sort of overarching question is: do you think there was an overreliance in YouTube's early days on automation, on machines? Do you think there was sort of an underappreciation for the role that sort of judgment and human intuition played in this stuff?

[00:20:04]

No, I mean, I think both are really important. And if we look at how our systems work, we definitely have, we have basically a system that is very heavy with humans, that is extended with machines. There is an automated component.

[00:20:22]

But then there's also, of course, a human component.

[00:20:25]

So is she saying that humans are now doing more work than the AI? No. I mean, like, yeah, like, they do have more people looking at videos than they used to.

[00:20:40]

We have a number of raters that all watch and review videos. And based on their feedback, we identify a set of content that looks borderline, and then based on.

[00:20:49]

But YouTube is so big they could hire 100 times as many people and they still wouldn't be able to watch a fraction of everything that's being uploaded to YouTube on any given day.

[00:21:01]

So it's still the case, then, that most videos that I'm recommended, that anyone who goes to YouTube is recommended, those recommendations are coming from different algorithms being run by an artificial intelligence.

[00:21:15]

Yeah, the robots are still very much running the show. But what she's saying is that they're trying to give humans a bigger role in supervising them.

[00:21:26]

OK, and you can't see, Andrew, but she's waving us off.

[00:21:28]

Well, we have, like, a couple more minutes. We're good. OK, well, we'll start wrapping it up. I do have some more questions. One is, I wanted to ask about the shooting here in 2018.

[00:21:40]

I wonder what that was like for you.

[00:21:44]

Yeah. I mean it was first of all, just a horrible event to go through.

[00:21:50]

We have a report of something with a gun. This will be from the YouTube building.

[00:21:57]

Like when something like that happens to you, you can't really process that it's happening.

[00:22:03]

A flood of YouTube employees streaming out towards safety, some still clutching tightly to their laptops, turning what was an ordinary, normal lunch hour at this tech campus into an all too familiar shooting scene in America.

[00:22:17]

You don't know.

[00:22:18]

Is there one shooter or there five shooters? Where are they?

[00:22:20]

New details about the woman police say shot three people at YouTube's headquarters before taking her own life.

[00:22:27]

I'm being discriminated against on Facebook, on YouTube. And, like, old videos that used to get many views.

[00:22:33]

I stopped getting views. She accused the website of filtering her channels to keep her videos from getting views, something she blamed on new, closed-minded YouTube employees.

[00:22:43]

I'm very thankful in some ways, like, that.

[00:22:47]

You know, nobody was killed.

[00:22:50]

I think it also just showed any kind of information company, you know, has some risk of people being upset with information that's reported. And so, you know, my key takeaway was that we need to make sure that our employees are always safe and that we're doing everything we possibly can.

[00:23:06]

And I wonder, what was that a moment for you where you realized the stakes of the decisions you made here in a different way? Did it change the way you looked at your responsibility?

[00:23:17]

I mean, information is a tough business. I mean, you are in the information business yourselves, and you understand that sometimes people can get really upset, and they can get really angry about ways that they're covered or ways that information is displayed.

[00:23:33]

And just like a journalistic organization like yourselves, you need to do what's right. I want to look back on this time and feel that I made the right decisions.

[00:23:42]

I don't want to say, oh, I allowed this type of content because I was afraid. Yeah.

[00:23:47]

I mean, I remember sort of that day, and, you know, other shootings since that day that have been sort of related in some way to the Internet, and just feeling, like, a sense of loss.

[00:23:58]

Like, I grew up on the Internet. I love the Internet. I have wondered if we're ever going to get back to a time when these things feel fun and useful, and like they're not leading to these culture wars and polarization. Like, do you think we will ever feel like that about the Internet again? I love the Internet. I think that the Internet has enabled so many different opportunities for people that wouldn't have had them. And I think you're right, like, in the last couple of years, there's been a lot of scrutiny.

[00:24:31]

And it's because of the size that we are and because people realize the significance of our platform. And I recognize that. And I take that very seriously.

[00:24:43]

My focus on responsibility is probably one of the most important things I'll do in my life. And why? Because I think we live at this time where there's tremendous change.

[00:24:53]

And so, yes, we've had years of all this fun and gaming and cat videos, but there are a lot of really important decisions to be made about what this future will hold and what will platforms be and how will they operate in the future.

[00:25:05]

And those rules haven't been written yet or we're in the process of writing them.

[00:25:11]

Yeah, well, thank you. Thank you. Appreciate you taking the time. Thank you. All right. So, Kevin, this is where things stood a couple of months ago. But then came the coronavirus. Right. This is Sarah Koenig, host of the Serial podcast. I want to tell you about our new show, Nice White Parents. It's reported by Chana Joffe-Walt, who's made some of the best, most thought provoking, most emotional radio stories I've ever heard. Back in 2015.

[00:26:03]

Chana wanted to find out what would happen inside this one public school in her neighborhood during a sudden influx of white students into a school that had barely had any white students before. And then, not satisfied that she fully understood what she was seeing, she went all the way back to the founding of the school in the 1960s and then forward again up to the present day. And eventually Chana realized she could put a name to the unspoken force that kept getting in the way of making the school better.

[00:26:28]

White parents. I've been waiting so long to tell people about this show, and now I can finally say it: go listen to Nice White Parents. Nice White Parents is made by Serial Productions, a New York Times company. You can find it wherever you get your podcasts.

[00:26:44]

A handful of people is spreading the idea on social media that the rollout of 5G cell towers is responsible for the covid-19 epidemic, some of the towers have even been set on fire in the U.K. So this April, as we were all dealing with the coronavirus and quarantining at our houses, I started to see.

[00:27:04]

Let's talk about what's really going on.

[00:27:07]

As usual, something just since the beginning hasn't seemed right with this coronavirus.

[00:27:12]

The Internet was filling up with misinformation. You're saying that silver solution would be effective. Totally eliminates it, kills it and deactivates it within 12 hours.

[00:27:23]

There were all these miracle cures. Just drinking hot water can kill it.

[00:27:27]

If you drink bleach, you will not get the coronavirus. Vitamin C can kill it. No problem. You can prevent this from happening with vitamin C.

[00:27:36]

And then there were these conspiracy theories. Does anyone else find it coincidental that Dr. Fauci is on the leadership council for the Bill and Melinda Gates Foundation? It's all coming out about Fauci working with Gates.

[00:27:49]

He knew all about it.

[00:27:50]

Bill Gates or the Jews are behind this. We are quite literally watching a bio warfare drama play out before us.

[00:27:59]

Stuff about, like, is this being caused by 5G towers? Yes.

[00:28:02]

5G cell phone towers causing coronavirus. Anybody want to make one guess as to where the first completely blanketed 5G city in the world was? Exactly, like, the exact sort of thing that Susan would consider low quality content.

[00:28:22]

Right.

[00:28:23]

Another incredible sight in this surreal ordeal. As we battle this coronavirus pandemic, the Capitol steps the scene of a protest.

[00:28:35]

Of course, like a lot of misinformation, this didn't just stay online.

[00:28:40]

Are you concerned about this virus? I was in the beginning, until I did my research and found out the realities and the media's overreach on it. And it is not as serious as they made it out to be.

[00:28:54]

It actually contributed to these protests that were going on at state capitols all around the country, these sort of anti-lockdown protesters who were refusing to go along with social distancing and the other recommendations that medical and health authorities were making about how to keep society safe.

[00:29:12]

The disease is not that serious that we should quarantine. We don't quarantine for the flu. We should not quarantine for covid-19. Have I got you there? Hello. Yes, I'm in my closet, in my office.

[00:29:31]

I tell my kids.

[00:29:33]

So in early April. Are you recording on your end? Or can you record on your end, like, a memo or something? And I don't normally record.

[00:29:41]

I got in my very fancy home studio.

[00:29:44]

Susan, if you could hold the phone as if you're talking on the phone, that'll be the right placement.

[00:29:49]

Julia and I got Susan back on the phone. The last time we met, we talked about all of the ways that you were trying and had tried to improve the quality of the information on YouTube. And since then, like, there's this coronavirus, which seems like maybe kind of the highest stakes possible version of that mission, where the difference between people getting good information and bad information can literally be life or death. So what are you doing now to make sure that people are getting good information about the virus?

[00:30:26]

From the very beginning, we took this extremely seriously and we took our team and decided what are all of the things that we can do to be as responsible as possible?

[00:30:36]

And right away, she said, like, all these coronavirus conspiracy theories and miracle cures. Just drinking hot water can kill it. Very quickly.

[00:30:44]

We were reviewing all of the videos, so we are very proactive. And anything that would be harmful or anything that would promote medically unproven methods. If you drink bleach, you will not get the coronavirus. We have been removing that type of content. You can prevent this from happening with vitamins.

[00:31:01]

She said YouTube has been aggressively hunting down and deleting them as fast as they can.

[00:31:07]

Hi, I'm Dr. John Brooks with the CDC. We put links to the CDC.

[00:31:13]

Hello, everyone, and welcome to this regular press conference on covid-19 from Geneva. The World Health Organization. Or to the equivalent authority globally, in the home feed, under videos that were about coronavirus, and in our searches.

[00:31:29]

Let's work together to keep ourselves healthy, our families healthy and our communities healthy.

[00:31:35]

On every video about the coronavirus, they like put this little link that directs people to like the CDC website or other health authorities.

[00:31:45]

So the 5G story is complete and utter rubbish. It's nonsense. It's the worst kind of fake news.

[00:31:51]

We have a new shelf that we launched that gives users information from authoritative sources.

[00:31:58]

Far fetched conspiracy theories like this are really damaging because they undermine public health efforts to stop the spread of the virus by convincing some people that it's not the real problem.

[00:32:08]

They created this feature that basically, like when you search for information about the coronavirus, the first thing you see is this section called Top News that contains essentially like information from authoritative, trusted sources.

[00:32:26]

Staying inside saves lives. Stay home. They also got popular YouTubers to make these social distancing PSAs.

[00:32:35]

I really think togetherness is the superpower of our species. Do it together. We will keep each other company. So they weren't just hearing it from YouTube. They were hearing it from their favorite creator.

[00:32:45]

Coming over each day can be a different thing, which they then promoted to all their subscribers.

[00:32:52]

And when we looked at the combined number of subscribers that they have, it's over one point five billion subscribers.

[00:32:59]

You can slow the growth of this and save lives. We have everyone at YouTube working throughout this crisis to make sure that they are following what's happening, making changes on the policy, making sure that we are taking down content, making sure that we're adjusting whatever changes we need to our algorithm.

[00:33:22]

And as she's going through her list of everything that YouTube is doing on the coronavirus, it really strikes me like how much work it takes to make sure that this algorithm that YouTube has built, that it's not leading people in a dangerous direction.

[00:33:38]

I mean, I could go on we did a lot of different steps here to make sure we were doing everything possible.

[00:33:43]

Yeah, I mean, just personally, like I've been impressed by how hard it's been to find misinformation about the coronavirus on YouTube. And I guess I'm wondering, like, why isn't it always like this? Like what is preventing YouTube from taking this approach all the time to every subject?

[00:34:02]

Well, we do take a responsibility approach to everything that we do. What's a little bit different about this one is there is a very clear authority, which is the World Health Organization, and there are recommendations that health organizations are making. And as a result, it's very clear to us that anything that would contradict social distancing, for example, would be a violation of our policies.

[00:34:27]

And in a lot of other areas, there may be some agreement on the science, but there could be politically different views. And so we want to enable a broad range of views.

[00:34:37]

So Susan's saying that she's basically comfortable doing this kind of work when it comes to something like the coronavirus, because it's just science and there are clear authority figures to turn to.

[00:34:47]

But politics is a completely different story. And maybe there's some obvious answer to this. But like, why is that?

[00:34:55]

We want to enable a broad range of views. We want to make sure that the full spectrum of political views or social ideas are represented there. But in this case, it's very clear that there's an authority and information behind that authority.

[00:35:09]

I mean, I think on an ideological level, there are a lot of people at YouTube who still want it to be this open platform where every kind of view is welcome and is equal. And I think there's also this more practical angle to it, which is that they don't want to be seen as putting their thumb on the scale. They don't want politicians and regulators, people who might try to break them up or enforce new rules on them to think that they're taking a partisan stand.

[00:35:36]

Right. But I feel like because of the time that we're living in, in part due to Internet platforms like YouTube and the powerful AIs that have been designed to capture and keep our attention at all costs, like now everything feels polarizing and political, even everything around the coronavirus.

[00:35:58]

And so it sounds like Susan is saying, you know, OK, we recognize that we helped let this genie out of the bottle. Right. We didn't mean to. We were just trying to entertain. Mm hmm. We're now going to try and put it back. But I just wonder, like in the environment that they helped to create, if that is even possible. Yeah, I mean, it's kind of a reckoning for them. YouTube is part of Google and I've been at Google for over 20 years now.

[00:36:30]

And, you know, it's an information company, which means our goal, our mission is to deliver users the most relevant information. Right. And that it's accurate. And that's true for YouTube, too, that we want to deliver accurate, useful information. And I think in the information area, it's very important.

[00:36:46]

Yeah. And I guess it strikes me as like there's a trust crisis right now. I mean, people just don't necessarily trust the institutions that maybe they would have years ago to sort of give them accurate information. So you're kind of trying to rebuild trust, it strikes me, like placing content from organizations like the CDC and the WHO prominently on YouTube. Do you feel like that's part of what you're sort of trying to do right now, is to kind of help people on YouTube understand that they actually can trust these sort of mainstream authorities?

[00:37:22]

Well, part of what we want to do for YouTube is make sure that we have all of the voices represented on the platform. And YouTube started out with a lot of individuals just posting videos about what was happening in their lives. But what we've really done over the last couple of years is reach out to make sure that news organizations, trusted scientific and health authorities, and government officials all also have the resources needed to be able to come onto the platform on YouTube. And what I've seen happen with covid-19 is it's really accelerated.

[00:37:56]

Public health officials recognizing that YouTube is an important medium to be communicating with users about anything involving health.

[00:38:05]

And so, you know, in many ways this I think this would have happened naturally. It might have taken a few years. It just accelerated a few years. And we plan to continue to work with all of these organizations to make sure they can get the right information out to their users.

[00:38:23]

Thank you. Thank you. I think we're good. What Susan is saying, this change in YouTube, it's actually like a fundamental shift in the YouTube universe, like for its entire existence, YouTube has been defined as an alternative media space.

[00:38:43]

And the people that were there were like these insurgents and upstarts, these kind of like rebels and misfits.

[00:38:50]

And now YouTube is basically saying when it comes to certain things, we're going to let the establishment win, which is tricky because the establishment isn't always right, like the World Health Organization has changed its guidance on things like face masks during the pandemic.

[00:39:07]

So the ramifications of this change, they're not totally clear yet. But what is clear is that this is going to be a huge battle.

[00:39:16]

Old school media does not like Internet personalities because they're scared of us.

[00:39:22]

We have so much influence and such a large, because the Internet culture that grew up on YouTube, not only has it gotten bigger than mainstream culture. Why is this an article? Because clickbait, that's why. And not only does it distrust a lot of mainstream institutions, if there's anything I learned about the media from being a public figure, it's how they blatantly misrepresent people for their own personal gain. But over time, it's basically come to despise them. I'm still here.

[00:39:53]

I'm still making videos. Nice try, Wall Street Journal. Try again, motherfuckers.