[00:00:00]

The following is a conversation with Cristos Goodrow, vice president of engineering at Google and head of Search and Discovery at YouTube, also known as the YouTube algorithm. YouTube has approximately one point nine billion users, and every day people watch over one billion hours of YouTube video. It is the second most popular search engine behind Google itself. For many people, it is not only a source of entertainment, but also how we learn new ideas: from math and physics videos, to podcasts, to debates, opinions, and ideas from out-of-the-box

[00:00:35]

thinkers and activists on some of the most tense, challenging, and impactful topics in the world today. YouTube and other content platforms receive criticism from both viewers and creators, as they should, because the engineering task before them is hard and they don't always succeed, and the impact of their work is truly world-changing. To me, YouTube has been an incredible wellspring of knowledge. I've watched hundreds, if not thousands, of lectures that changed the way I see many fundamental ideas in math, science, engineering, and philosophy.

[00:01:12]

But it does put a mirror to ourselves and keeps the responsibility for the steps we take in each of our online educational journeys in the hands of each of us. The YouTube algorithm has an important role in that journey of helping us find new, exciting ideas to learn about. That's a difficult and exciting problem for an artificial intelligence system. As I've said in lectures and other forums, recommendation systems will be one of the most impactful areas of AI in the 21st century, and YouTube is one of the biggest recommendation systems in the world.

[00:01:47]

This is the Artificial Intelligence Podcast. If you enjoy it, subscribe on YouTube, give it five stars on Apple Podcasts, follow on Spotify, support it on Patreon, or simply connect with me on Twitter at Lex Fridman, spelled F-R-I-D-M-A-N.

[00:02:03]

I recently started doing ads at the end of the introduction. I'll do one or two minutes after introducing the episode, and never any ads in the middle that can break the flow of the conversation. I hope that works for you and doesn't hurt the listening experience. This show is presented by Cash App, the number one finance app in the App Store. I personally use Cash App to send money to friends, but you can also use it to buy, sell, and deposit Bitcoin in just seconds.

[00:02:29]

Cash App also has a new investing feature. You can buy fractions of a stock, say one dollar's worth, no matter what the stock price is. Brokerage services are provided by Cash App Investing, a subsidiary of Square, and member SIPC. I'm excited to be working with Cash App to support one of my favorite organizations, called FIRST, best known for their FIRST Robotics and Lego competitions. They educate and inspire hundreds of thousands of students in over one hundred and ten countries and have a perfect rating

[00:02:59]

on Charity Navigator, which means that donated money is used to maximum effectiveness. When you get Cash App from the App Store or Google Play and use code LEXPODCAST, you'll get ten dollars in cash, and Cash App will also donate ten dollars to FIRST, which, again, is an organization that I've personally seen inspire girls and boys to dream of engineering a better world. And now here's my conversation with Cristos Goodrow. YouTube is the world's second most popular search engine behind Google, of course. We watch more than one billion hours of YouTube videos a day, more than Netflix and Facebook video combined.

[00:03:55]

YouTube creators upload over 500,000 hours of video every day. The average lifespan of a human being, just for comparison, is about 700,000 hours. So what's uploaded every single day is just enough for a human to watch in a lifetime. So let me ask an absurd philosophical question.

[00:04:16]

If, from birth, when I was born (and there are many people born today with the Internet), I watched YouTube videos nonstop.

[00:04:24]

Do you think there are trajectories through YouTube video space that can maximize my average happiness, or maybe education, or my growth as a human being? I think there are some great trajectories through YouTube videos, but I wouldn't recommend that anyone spend all of their waking hours, or all of their hours, watching YouTube.

[00:04:48]

I mean, I think about the fact that YouTube has been really great for my kids, for instance, my oldest daughter, you know, she's been watching YouTube for several years. She watches Tyler Oakley and the Vlog Brothers. And I know that it's had a very profound and positive impact on her character and my younger daughter. She's a ballerina. And her teachers tell her that YouTube is a huge advantage for her because she can practice a routine and watch like professional dancers do that same routine and stop it and back it up and rewind and all that stuff.

[00:05:24]

Right.

[00:05:24]

So it's been really good for them. And then even my son, he's a sophomore in college. He got through his linear algebra class because of a channel called 3Blue1Brown, which, you know, helps you understand linear algebra, but in a way that would be very hard for anyone to do on a whiteboard or a chalkboard. And so I think that those experiences, from my point of view, were very good, and so I can imagine really good trajectories through YouTube.

[00:05:55]

Yes.

[00:05:55]

Have you looked at, do you think broadly about, that trajectory over a period of time? Because YouTube is growing up now. So over a period of years. You just kind of gave a few anecdotal examples, but, you know, I used to watch certain shows on YouTube.

[00:06:11]

I don't anymore. I've moved on to other shows. And ultimately you want people, from YouTube's perspective, to stay on YouTube, to grow as human beings on YouTube. So you have to think not just about what makes them engage today or this month, but also over a period of years. Absolutely.

[00:06:30]

That's right. I mean, if YouTube is going to continue to enrich people's lives, then, you know, it has to grow with them, and people's interests change over time. And so I think we've been working on this problem.

[00:06:47]

And I'll just say it broadly as, like, how to introduce diversity and introduce people who are watching one thing to something else they might like. We've been working on that problem all the eight years I've been at YouTube. It's a hard problem because, I mean, of course, it's trivial to introduce diversity that doesn't help. Yeah, right. Like a random video.

[00:07:11]

I could just randomly select a video from the billions that we have. It's likely not to even be in your language.

[00:07:18]

So the likelihood that you would watch it and develop a new interest is very, very low.

[00:07:25]

And so what you want to do when you're trying to increase diversity is find something that is not too similar to the things that you've watched, but also something that you might be likely to watch. And that balance, finding that spot between those two things is quite challenging.
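To make that balance concrete, here is a minimal sketch in Python of how one might score a candidate: trade off the model's predicted watch likelihood against similarity to the user's history. This is our own illustration, not YouTube's code; every name and the weighting are assumptions.

```python
import numpy as np

def diversity_score(candidate_emb, history_embs, p_watch, novelty_weight=0.3):
    """Trade off predicted watch likelihood against similarity to history.

    candidate_emb: embedding of the candidate video (hypothetical).
    history_embs:  rows are embeddings of videos the user already watched.
    p_watch:       model-predicted probability the user watches the candidate.
    """
    # Cosine similarity to the closest video the user has already watched.
    sims = history_embs @ candidate_emb / (
        np.linalg.norm(history_embs, axis=1) * np.linalg.norm(candidate_emb))
    # A good "diverse" pick is likely to be watched AND not a near-duplicate:
    # too similar and it adds nothing, too dissimilar and p_watch collapses.
    return p_watch * (1.0 - novelty_weight * sims.max())
```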

[00:07:46]

So the diversity of content, diversity of ideas, it's a really difficult thing.

[00:07:53]

It's almost impossible to define.

[00:07:55]

Right. Like, what's different? So how do you think about that? So, two examples: I'm a huge fan of 3Blue1Brown, say. And then one kind of diversity:

[00:08:08]

I wasn't even aware of a channel called Veritasium, which is a great science, physics, whatever channel.

[00:08:15]

So one version of diversity is: show me Derek's Veritasium channel, which I was really excited to discover, actually. And I watch a lot of his videos.

[00:08:24]

OK, so you're a person who's watching some math channels and you might be interested in some other science or math channels. So, like you mentioned, the first kind of diversity is just to show you some things from other channels that are related, but not all from the 3Blue1Brown channel; throw in a couple of others. So that's maybe the first kind of diversity that we started with many, many years ago.

[00:08:53]

Taking a bigger leap, I mean, the mechanism we use for that is we basically cluster videos and channels together, mostly videos. We do almost everything at the video level.

[00:09:07]

And so we'll make some kind of a cluster via some embedding process, and then measure, you know, what is the likelihood that users who watch one cluster might also watch another cluster that's very distinct. So we may come to find that people who watch science videos also like jazz. Hmm. This is possible. Right.

[00:09:33]

And so, because of that relationship that we've identified through the embedding, and then the measurement of the people who watch both, we might recommend a jazz video once in a while.

[00:09:48]

So there's this cluster in the embedding space of jazz videos and science videos. And so you kind of try to look at aggregate statistics, where if a lot of people that jump from the science cluster to the jazz cluster tend to remain as engaged, or become more engaged, then that means those two are related, and people should hop back and forth and they'll be happy.

[00:10:14]

Right. There's a higher likelihood that a person who's watching science would like jazz than that a person watching science would like, I don't know, backyard railroads or something else. Right.

[00:10:25]

And so we can try to measure these likelihoods and use that to make the best recommendation we can.
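As a toy illustration of measuring those likelihoods (our assumption of how the aggregate statistics could be computed, not the production system), one can count, over many users, how often viewers of one cluster also watch another:

```python
from collections import Counter
from itertools import combinations

def cluster_cowatch_likelihood(user_cluster_history):
    """Estimate P(also watched B | watched A) from aggregate watch logs.

    user_cluster_history: dict of user_id -> set of cluster ids they watched.
    """
    cluster_users = Counter()  # users who watched each cluster
    pair_users = Counter()     # users who watched both clusters of a pair
    for clusters in user_cluster_history.values():
        for c in clusters:
            cluster_users[c] += 1
        for a, b in combinations(sorted(clusters), 2):
            pair_users[(a, b)] += 1
            pair_users[(b, a)] += 1
    return {(a, b): n / cluster_users[a] for (a, b), n in pair_users.items()}

# If likelihood[("science", "jazz")] comes out much higher than
# likelihood[("science", "backyard railroads")], an occasional jazz
# recommendation to a science viewer is the better bet.
```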

[00:10:33]

So, OK, so let's talk about the machine learning of that. But I have to linger on things that neither you nor anyone has an answer to.

[00:10:41]

There are gray areas of truth. For example, now, I can't believe I'm going there, but politics. It happens so that certain people believe certain things and they're very certain about them.

[00:10:57]

Let's move outside the red versus blue politics of today's world. But there are different ideologies. For example, in college, I read quite a lot of Ayn Rand, I studied her, and that's a particular philosophical ideology. I found it interesting to explore. OK, so that was that kind of space. I've kind of moved on from that cluster intellectually, but nevertheless, it's an interesting cluster. I was born in the Soviet Union. Socialism, communism is a certain kind of political ideology.

[00:11:24]

That's really interesting to explore. Again, objectively, just there's a set of beliefs about how the economy should work and so on.

[00:11:31]

And so it's hard to know what's true and not, in the sense that people within those communities are often advocating that this is how we achieve utopia in this world, and they're pretty certain about it.

[00:11:43]

So how do you try to manage politics in this chaotic, divisive world? Not policy, but any kind of ideas, in terms of filtering what people should watch next, and in terms of also not letting certain things be on YouTube. This is an exceptionally difficult responsibility, right?

[00:12:06]

Well, the responsibility to get this right is our top priority.

[00:12:11]

And it first comes down to making sure that we have good, clear rules of the road.

[00:12:19]

Right. Like just because we have freedom of speech doesn't mean that you can literally say anything. Right. Like we as a society have accepted certain restrictions on our freedom of speech. There are things like libel laws and things like that. And so where we can draw a clear line, we do. And we continue to evolve that line over time. However, as you pointed out, wherever you draw the line, there's going to be a borderline. And in that borderline area, we are going to maybe not remove videos, but we will try to reduce the recommendations of them or the proliferation of them by demoting them.

[00:13:03]

And then alternatively, in those situations, try to raise what we would call authoritative or credible sources of information.

[00:13:12]

So we're not trying to... I mean, you mentioned Ayn Rand and communism.

[00:13:19]

You know, those are two, like, valid points of view that people are going to debate and discuss. And, of course, people who believe in one or the other of those things are going to try to persuade other people to their point of view. And so we're not trying to settle that or choose a side or anything like that. What we're trying to do is make sure that the people who are expressing those points of view and offering those positions are authoritative and credible.

[00:13:51]

So let me ask a question about people I don't like personally.

[00:13:56]

You heard me.

[00:13:57]

I don't care if you leave comments on this. But sometimes they're brilliantly funny, which is trolls.

[00:14:05]

So, people who kind of mock. I mean, the Internet is full of mock-style comedy, where people just kind of make fun of things, point out that the emperor has no clothes, and there's brilliant comedy in that.

[00:14:20]

But sometimes it can get cruel and mean. So on the mean point, and sorry to linger on these things that have no good answers, but actually, I totally hear you that this is really important and you're trying to solve it. But how do you reduce the meanness of people on YouTube?

[00:14:44]

I understand that anyone who uploads YouTube videos has to become resilient to a certain amount of meanness. I've heard that from many creators. And we are trying, in various ways, comment ranking, allowing certain features to block people, to reduce, or make that meanness or that trolling behavior less effective on YouTube. Yeah. And so, I mean, it's very important, but it's something that we're going to keep having to work on. And, you know, as we improve it, maybe we'll get to a point where people don't have to suffer this sort of meanness when they upload YouTube videos.

[00:15:33]

I hope we do. But, you know, but it just does seem to be something that you have to be able to deal with as a YouTube creator.

[00:15:42]

Now, do you have a hope, you mentioned two things, and I agree with you.

[00:15:46]

So one is like a machine learning approach of ranking comments based on how much they contribute to the healthy conversation, let's put it that way.

[00:15:58]

And then the other is almost an interface question of how does the creator filter, block, or how do humans themselves, the users of YouTube, manage their own conversation?

[00:16:14]

Do you have hope that these two tools will create a better society, without limiting freedom of speech too much? What do you mean, limiting, sort of curating speech?

[00:16:29]

I mean, I think that that overall is our whole project here at YouTube.

[00:16:34]

Right. Like, yeah, we fundamentally believe, and I personally believe very much, that YouTube can be great. It's been great for my kids. I think it can be great for society. But it's absolutely critical that we get this responsibility part right. And that's why it's our top priority. Susan Wojcicki, who's the CEO of YouTube, says something that I personally find very inspiring, which is that we want to do our jobs today in a manner so that people 20 and 30 years from now will look back and say, you know, YouTube, they really figured this out.

[00:17:11]

They really found a way to strike the right balance between the openness and the value that the openness has and also making sure that we are meeting our responsibility to users in society.

[00:17:26]

So the burden on YouTube actually is quite incredible. And one thing is that people don't, I don't think, give enough credit to the seriousness and the magnitude of the problem.

[00:17:36]

I think so.

[00:17:37]

I personally hope that you do solve it, because a lot is in your hands, and a lot is riding on your success or failure. So besides, of course, running a successful company, you're also curating the content of the Internet and the conversation on the Internet. That's a powerful thing.

[00:17:57]

So one thing that people wonder about is how much of it can be solved with pure machine learning.

[00:18:06]

So, looking at the data, studying the data, and creating algorithms that curate the comments, curate the content; and how much of it needs human intervention, meaning people here at YouTube sitting in a room and thinking about what is the nature of truth, what are the ideals that we should be promoting, that kind of thing.

[00:18:32]

So algorithm versus human input. What's your sense?

[00:18:37]

I mean, my own experience has demonstrated that you need both of those things. Algorithms, I mean, you're familiar with machine learning algorithms.

[00:18:47]

And the thing they need most is data. And the data is generated by humans. And so, for instance, when we're building a system to try to figure out which are the videos that are misinformation or borderline policy violations, well, the first thing we need to do is get human beings to make decisions about which of those videos are in which category. And then we use that data and basically, you know, take that information that's determined and governed by humans and extrapolate it or apply it to the entire set of billions of YouTube videos.
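A minimal sketch of that loop, assuming a scikit-learn-style classifier (the feature pipeline and all names are illustrative, not YouTube's): humans label a sample, a model extrapolates to everything else.

```python
from sklearn.linear_model import LogisticRegression

def extrapolate_human_labels(labeled_features, human_labels, corpus_features):
    """Apply human review decisions to a corpus far too big for humans.

    labeled_features: feature vectors for the videos reviewers examined.
    human_labels:     1 if reviewers judged a video borderline, else 0.
    corpus_features:  feature vectors for the entire video corpus.
    """
    clf = LogisticRegression(max_iter=1000)
    clf.fit(labeled_features, human_labels)          # learn from human decisions
    return clf.predict_proba(corpus_features)[:, 1]  # score every video
```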

[00:19:31]

And we couldn't do it without the humans, and we couldn't use the humans alone to get to all the videos of YouTube.

[00:19:41]

So there's no world in which you have only one or the other of these things.

[00:19:48]

And just as you said, a lot of it comes down to people at YouTube spending a lot of time trying to figure out what are the right policies, you know, what are the outcomes based on those policies, are they the kinds of things we want to see? And then, once we get an agreement or build some consensus around what the policies are, well, then we've got to find a way to implement those policies across all of YouTube.

[00:20:18]

And that's where both the human beings, we call them evaluators or reviewers, come into play to help us with that. And then, once we get a lot of training data from them, then we apply the machine learning techniques to take it even further. Do you have a sense that these human beings have a bias in some kind of direction? Sort of. I mean, that's an interesting question.

[00:20:44]

We do, in autonomous vehicles and computer vision in general, a lot of annotation, and we rarely ask what bias the annotators have. You know, even in the sense that they're better at doing certain things than others. For example, people are much better at certain kinds of segmentation: at segmenting cars in a scene versus segmenting bushes or trees.

[00:21:15]

You know, there are specific mechanical reasons for that, but also because, semantically, it's a gray area. And just for a lot of reasons, people are just terrible at annotating trees.

[00:21:26]

OK, so in that same kind of sense, do you think, in terms of people reviewing videos or annotating the content of videos,

[00:21:34]

is there some kind of bias that you're aware of or seek out in that human input? Well, we take steps to try to overcome these kinds of biases, or the biases that we think would be problematic. So, for instance, we ask people to have a bias toward scientific consensus. That's something that we instruct them to do.

[00:21:59]

We ask them to have a bias towards demonstration of expertise or credibility or authoritativeness. But there are other biases that we want to make sure to try to remove. And there are many techniques for doing this. One of them is you send the same thing to be reviewed to many people. And so, you know, that's one technique. Another is that you make sure that the people who are doing these sorts of tasks are from different backgrounds and different areas of the United States or of the world.
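A toy version of the first technique (sending the same item to many reviewers), assumed rather than taken from YouTube: aggregate independent judgments and keep the majority, so no single reviewer's bias decides the outcome.

```python
from collections import Counter

def majority_label(ratings):
    """ratings: labels for one video from several independent reviewers."""
    label, count = Counter(ratings).most_common(1)[0]
    agreement = count / len(ratings)
    # Low agreement can flag the item for escalation or re-review.
    return label, agreement
```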

[00:22:33]

But then, even with all of that, it's possible for certain kinds of what we would call unfair biases to creep into machine learning systems, primarily, as you said, because the training data itself may come in a biased way. And so we've also worked very hard on improving the machine learning systems to remove and reduce unfair biases when it goes against or involves some protected class.

[00:23:06]

For instance. Thank you for exploring with me some of the more challenging things. I'm sure there are a few more that we'll jump back to.

[00:23:14]

But let me jump into the fun part, which is maybe the basics of the, quote unquote, YouTube algorithm.

[00:23:23]

What does the YouTube algorithm look at to make recommendations for what to watch next? And when you search for a particular term, how does it know what to show you next? Because it seems to, at least for me, do an incredible job of both. Well, that's kind of you to say.

[00:23:44]

It didn't used to do a very good job, but it's gotten better over the years. Even I observed that it's improved quite a bit.

[00:23:53]

Those are two different situations, like when you search for something, YouTube uses the best technology we can get from Google to make sure that the YouTube search system finds what someone's looking for.

[00:24:07]

And of course, the very first thing that one thinks about is, OK, well, does the word occur in the title, for instance? But there are much more sophisticated things where we're mostly trying to do some syntactic match, or maybe a semantic match, based on words that we can add to the document itself. For instance, you know, maybe is this video watched a lot after this query?

[00:24:40]

Right. That's something that we can observe and then, as a result, make sure that that document would be retrieved for that query.
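A sketch of that "watched a lot after this query" signal, with made-up field names: mine the search logs for query-to-video watch counts, and let strong associations retrieve a video even when its title never mentions the query terms.

```python
from collections import Counter, defaultdict

def query_video_associations(search_log, min_watches=100):
    """search_log: iterable of (query, watched_video_id) pairs."""
    counts = defaultdict(Counter)
    for query, video_id in search_log:
        counts[query][video_id] += 1
    # Videos frequently watched after a query become retrievable for it,
    # as if the query words had been added to the document itself.
    return {q: [v for v, n in c.items() if n >= min_watches]
            for q, c in counts.items()}
```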

[00:24:51]

Now, when you talk about what kind of videos would be recommended to watch next, that's something, again, we've been working on for many years. And probably the first real attempt to do that well was to use collaborative filtering. So can you describe collaborative filtering?

[00:25:16]

Sure. It's just, basically, what we do is we observe which videos get watched close together by the same person. And if you observe that, and if you can imagine creating a graph where the videos that get watched close together by the most people are sort of very close to one another in this graph, and videos that don't frequently get watched close together by the same person or the same people are far apart, then you end up with this graph that we call the related graph that basically represents videos that are very similar or related in some way.

[00:25:57]

And what's amazing about that is that it puts all the videos that are in the same language together, for instance, and we didn't even have to think about language. It just does it, yeah, right. And it puts all the videos that are about sports together, and it puts most of the music videos together, and it puts all of these sorts of videos together just because that's sort of the way the people using YouTube behave.
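Here is a toy construction of that related graph (a simplification we are assuming; the real system is surely more involved): videos co-watched by the same person get an edge, and the edge grows heavier the more people co-watch them.

```python
from collections import Counter
from itertools import combinations

def build_related_graph(watch_sessions):
    """watch_sessions: list of lists of video ids watched close together."""
    edges = Counter()
    for session in watch_sessions:
        for a, b in combinations(set(session), 2):
            edges[frozenset((a, b))] += 1  # each co-watch strengthens the edge
    # Heavy edges mean "closely related"; absent edges mean "far apart".
    # Same-language and same-topic videos end up clustered with no explicit
    # language or topic features, purely from viewer behavior.
    return edges
```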

[00:26:25]

So that already cleans up a lot of the problem.

[00:26:29]

It takes care of the lowest hanging fruit, which happens to be a huge one of just managing these millions of videos.

[00:26:37]

That's right. I remember, a few years ago, I was talking to someone who was trying to propose that we do a research project concerning people who are bilingual. And this person was making this proposal based on the idea that YouTube could not possibly be good at recommending videos well to people who are bilingual. And so she was telling me about this, and I said, well, can you give me an example of what problem you think we have on YouTube with the recommendations?

[00:27:16]

And so she said, well, I'm a researcher in the US, and when I'm looking for academic topics, I want to see them in English. And so she searched for one, found a video, and then looked at the watch next suggestions. And they were all in English. And so she said, oh, I see. YouTube must think that I speak only English. And so she said, now, I'm actually originally from Turkey.

[00:27:41]

And sometimes when I'm cooking, let's say I want to make some baklava, I really like to watch videos that are in Turkish. And so she searched for a video about making baklava and then selected it. And it was in Turkish, and the watch next recommendations were in Turkish.

[00:27:55]

And she just couldn't believe how this was possible.

[00:27:58]

And how is it that, you know, YouTube knows that I speak both of these two languages and put all the videos together? And it's just sort of an outcome of this related graph that's created through collaborative filtering.

[00:28:10]

So, for me, one of my interests is just human psychology, right? And that's such a powerful platform on which to utilize human psychology to discover what people, individual people, want to watch next. But it's also just fascinating to me.

[00:28:28]

You know, Google Search has the ability to look at your own history, and I've done that before, just looked at what

[00:28:36]

I've searched for over many, many years. And it's a fascinating picture of who I am, actually. And I don't think anyone's ever summarized that.

[00:28:46]

I personally would love that.

[00:28:48]

A summary of who I am as a person on the Internet, to me, because I think it reveals, I think it puts a mirror to me, or to others, you know, that's actually quite revealing and interesting. You know, just maybe, it's a joke, but not really, the number of cat videos I've watched, videos of people falling, you know, stuff that's absurd, that kind of stuff.

[00:29:16]

It's really interesting.

[00:29:17]

And of course, it's really good for the machine learning aspect, to figure out what to show next. But it's interesting. Hey, have you, just as a tangent, played around with the idea of giving a map to people?

[00:29:33]

Sort of, as opposed to just using this information to show what's next, showing them, here are the clusters you've loved over the years, kind of thing?

[00:29:42]

Well, we do provide the history of all the videos that you've watched. Yes. So you can definitely look through it and search through it to see what it is that you've been watching on YouTube.

[00:29:52]

We have, actually, at various times experimented with this sort of cluster idea, finding ways to demonstrate or show people what topics they've been interested in or what clusters they watch from. It's interesting that you bring this up, because in some sense, the way the recommendation system of YouTube sees a user is exactly as the history of all the videos they've watched on YouTube.

[00:30:23]

And so you can think of yourself, or any user on YouTube, as kind of like a DNA strand of all your videos.

[00:30:35]

Right. That sort of represents you. You can also think of it as maybe a vector in the space of all the videos on YouTube. And so, you know, now, once you think of it as a vector in the space of all the videos on YouTube, then you can start to say, OK, well, which other vectors are close to me and to my vector?

[00:30:57]

And that's one of the ways that we generate some diverse recommendations, because you're like, OK, well, these people seem to be close with respect to the videos they watched on YouTube. But, you know, here's a topic or a video that one of them has watched and enjoyed, but the other one hasn't. That could be an opportunity to make a good recommendation.
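A minimal sketch of that idea, under our own simplifying assumptions (a user is the mean embedding of their watch history; nothing here is the production recommender):

```python
import numpy as np

def neighbor_recommendations(user_histories, video_embs, user_id, k=10):
    """Suggest videos that similar users watched but this user has not.

    user_histories: dict of user -> set of watched video ids.
    video_embs:     dict of video id -> embedding vector.
    """
    def user_vec(u):
        return np.mean([video_embs[v] for v in user_histories[u]], axis=0)

    me = user_vec(user_id)
    scored = []
    for u in user_histories:
        if u == user_id:
            continue
        other = user_vec(u)
        cosine = float(np.dot(me, other) /
                       (np.linalg.norm(me) * np.linalg.norm(other)))
        scored.append((cosine, u))
    neighbors = [u for _, u in
                 sorted(scored, key=lambda t: t[0], reverse=True)[:k]]
    # Candidates: what my nearest neighbors enjoyed that I haven't seen.
    seen = user_histories[user_id]
    return {v for u in neighbors for v in user_histories[u]} - seen
```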

[00:31:19]

I gotta tell you, I mean, I know I'm going to ask for things that are impossible, but I would love to cluster the human beings. Like, I would love to know who has similar trajectories as me, because we'd probably want to hang out.

[00:31:32]

There's a social aspect there. Like, actually finding some of the most fascinating people. Some of the people I find have, like, no followers, and I start following them, and they create incredible content. And on that topic, I just love to ask: there are some videos that just blew my mind in terms of quality and depth, and just in every regard are amazing videos, and they have, like, fifty-seven views.

[00:31:58]

OK, how do you get videos of quality to be seen by many? So, the measure of quality, is it just something...

[00:32:09]

Yeah, how do you know that something is good? Well, I mean, I think it depends initially on what sort of video we're talking about. So, in the realm of, let's say, you mentioned politics and news, in that realm, you know, quality news or quality journalism relies on having a journalism department.

[00:32:35]

Right.

[00:32:35]

Like, you have to have actual journalists and fact checkers and people like that. And so in that situation, and in others, maybe in science or in medicine, quality has a lot to do with the authoritativeness and the credibility and the expertise of the people who make the video.

[00:32:54]

Now, if you think about the other end of the spectrum, you know, what is the highest quality prank video or what is the highest quality Minecraft video? Yeah, right. That might be the one that people enjoy watching the most and watch to the end.

[00:33:10]

Or it might be the one that, when we ask people the next day after they watched it, were they satisfied with it? And so we, especially in the realm of entertainment, have been trying to get at better and better measures of quality, or satisfaction, or enrichment, since I came to YouTube. And we started with, well, you know, the first approximation is the one that gets more viewers. But, you know, we both know that things can get a lot of viewers and not really be that high quality, especially if people are clicking on something and then immediately realizing that it's not that great and abandoning it.

[00:33:57]

And that's why we moved from viewers to thinking about the amount of time people spend watching it, with the premise that, like, you know, in some sense, the time that someone spends watching a video is related to the value that they get from that video. It may not be perfectly related, but it has something to say about how much value they get. But even that's not good enough, right? Because I myself have spent time clicking through channels on television late at night and ended up watching Under Siege 2 for some reason.

[00:34:32]

I don't know. And if you were to ask me the next day, are you glad that you watched that show on TV last night?

[00:34:39]

I'd say, yeah, I wish I would have gone to bed or read a book or almost anything else, really. And so that's why some people got the idea a few years ago to try to survey users afterwards.

[00:34:52]

And so we get feedback data from those surveys and then use that in the machine learning system to try to predict not just what you're going to click on right now, or what you might watch for a while, but what, when we ask you tomorrow, you'll give four or five stars to.
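A sketch of that progression in miniature, with an entirely made-up blending rule (the real weighting between clicks, watch time, and survey responses is not public): the survey answer, when it exists, outranks the weaker proxies.

```python
def training_label(clicked, watch_fraction, survey_stars=None):
    """Produce a training target for one (user, video) impression.

    clicked:        whether the user clicked the recommendation.
    watch_fraction: fraction of the video actually watched (0.0 to 1.0).
    survey_stars:   1-5 next-day survey rating, when one was collected.
    """
    if survey_stars is not None:
        return survey_stars / 5.0     # stated satisfaction beats raw time
    if clicked:
        return 0.5 * watch_fraction   # watch time is only a weak proxy
    return 0.0                        # no click, no evidence of value
```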

[00:35:10]

So, just to summarize: what are the signals, from the machine learning perspective, that can be used? As you mentioned: just clicking on the video, views, the time watched, maybe the relative time watched, clicking like and dislike on the video, maybe commenting on the video... All of those things. All of those things.

[00:35:32]

And then the one I wasn't actually quite aware of, even though I might have engaged in it, is the survey afterwards, which is a brilliant idea.

[00:35:41]

Are there other signals? I mean, that's already a really rich space of signals to learn from. Is there something else?

[00:35:48]

Well, you mentioned commenting; also sharing the video, if you think it's worthy to be shared with someone else, you know, within YouTube or outside of YouTube.

[00:35:59]

Let's see. You mentioned like and dislike.

[00:36:02]

How important is that? It's very important, right? It's predictive of satisfaction, but it's not perfectly predictive.

[00:36:13]

Subscribe: if you subscribe to the channel of the person who made the video, then that also is a piece of information, and it signals satisfaction.

[00:36:24]

Although over the years we've learned that people have a wide range of attitudes about what it means to subscribe.

[00:36:33]

We would ask some users who didn't subscribe very much, but who watched a lot from a few channels, we'd say, well, why didn't you subscribe? And they would say, well, I can't afford to pay for anything.

[00:36:47]

And, you know, we tried to let them understand, like, actually it doesn't cost anything, it's free. It just helps us know that you are very interested in this creator. But then we've asked other people who subscribe to many things and don't really watch any of the videos from those channels. And we say, well, why did you subscribe to this if you weren't really interested in any more videos from that channel? And they might tell us,

[00:37:15]

just, you know, I thought the person did a great job, and I just want to kind of give them a high five. Yeah.

[00:37:20]

And so, yeah, that's where I said, I actually subscribe to channels where I just think this person is amazing. I like this person, I really want to support them. That's how I click subscribe, even though I may never actually want to click on their videos when they release them.

[00:37:40]

I just love what they're doing and it's maybe outside of my interest area and so on, which is probably the wrong way to use the subscribe button. But I just want to say congrats.

[00:37:50]

It's great work. Well, so you have to deal with the whole space of people that see the subscribe button totally differently.

[00:37:57]

That's right. And so, you know, we can't just close our eyes and say, sorry, you're using it wrong, we're not going to pay attention to what you've done. We need to embrace all the ways in which all the different people in the world use the subscribe button or the like and dislike buttons.

[00:38:14]

So, in terms of the signals machine learning is using for search and for recommendation, you've mentioned titles and metadata, like text data that people provide: description and title, and maybe keywords. Maybe you can speak to the value of those things in search, and also to this incredible, fascinating area of the content itself.

[00:38:39]

So, the video content itself, trying to understand what's happening in the video. YouTube released a data set that, you know, in the machine learning, computer vision world, this is just an exciting space. How much are you playing with that currently? How much is your hope for the future of being able to analyze the content of the video itself?

[00:38:59]

Well, we have been working on that also since I came to YouTube. Analyzing the content of the video, right. And what I can tell you is that

[00:39:11]

our ability to do it well is still somewhat crude. We can tell if it's a music video; we can tell if it's a sports video. We can probably tell you that people are playing soccer. We probably can't tell whether it's Manchester United or my daughter's soccer team. So these things are kind of difficult, and we can use them in some ways. So, for instance, we use that kind of information to understand and inform these clusters that I talked about, and also maybe to add some words, like soccer, for instance, to the video if it doesn't occur in the title or the description, and it's remarkable how often it doesn't.

[00:39:56]

But one of the things that I ask creators to do is, please help us out with the title and the description.

[00:40:04]

For instance, we were a few years ago having a live stream of some competition for World of Warcraft on YouTube. And it was a very important competition. But if you type World of Warcraft in search, you wouldn't find it.

[00:40:22]

Well, Warcraft was in the title, but World of Warcraft wasn't in the title.

[00:40:25]

It was match number seven or eight, you know, A team versus B team, and World of Warcraft wasn't in the title. It's just like, come on, help us out.

[00:40:33]

Being literal, being literal on the Internet is actually very uncool, which is the problem.

[00:40:39]

Oh, is that right?

[00:40:40]

Well, I mean, in some sense, some of the greatest videos... I mean, there's a humor to just being indirect, being witty and so on. And machine learning algorithms want you to be, you know, literal, right? You just want to say what's in the thing, be very, very simple. And in some sense, that gets away from wit and humor. So you have to play with both.

[00:41:04]

Right. So, but you're saying that, for now, sort of the content of the title, the content of the description, the actual text, is one of the best ways for the algorithm to find your video and put it in the right cluster. That's right.

[00:41:20]

And I would go further and say that if you want people, human beings, to select your video in search, then it helps to have, let's say, World of Warcraft in the title. Because why would a person... you know, if they type World of Warcraft and they get a bunch of videos, all of which say World of Warcraft except the one that you uploaded, well, the person is going to think, well, maybe somehow search made a mistake.

[00:41:46]

This isn't really about World of Warcraft.

[00:41:48]

So it's important not just for the machine learning systems, but also for the people who might be looking for this sort of thing. They get a clue that it's what they're looking for by seeing that same thing prominently in the title of the video.

[00:42:03]

OK, let me push back on that. So I think from the algorithm perspective, yes. But if they typed in World of Warcraft and saw a video with the title simply "Winning," and the thumbnail has, like, a sad orc or something, I don't know...

[00:42:20]

Right. Like, I think that gets your curiosity up. And then, if they could trust that the algorithm was smart enough to figure out somehow that this is indeed a World of Warcraft video, that would have created the most beautiful experience, I think, in terms of just the wit and the humor and the curiosity that we human beings should have.

[00:42:43]

But you're saying, I mean, realistically speaking, it's really hard for the algorithm to figure out that the content of that video will be World of Warcraft.

[00:42:51]

And you have to accept that some people are going to skip it. Yeah, right. I mean, and so you're right.

[00:42:57]

The people who don't skip it and select it are going to be delighted. Yeah, but other people might say, yeah, this is not what I was looking for.

[00:43:06]

And making stuff discoverable, I think, is what you're really working on and hoping for. So, yeah.

[00:43:13]

So, from your perspective, put stuff in the title and the description. And remember the collaborative filtering part of the system: it starts by the same user watching videos together, right? So the way that they're probably going to do that is by searching for them.

[00:43:30]

That's a fascinating aspect, like an ant colony. That's how they find stuff. So, I mean, to what degree, for collaborative filtering in general, is one curious ant, one curious user, essentially just a person who is more willing to click on random videos and sort of explore these cluster spaces? In your sense, how many people are just watching the same thing over and over and over, and how many are the explorers, who just kind of click on stuff and then help the others in the ant colony discover the cool stuff?

[00:44:07]

Do you have a sense of that at all? I really don't think I have a sense of the relative sizes of those groups.

[00:44:12]

But I would say that, you know, people come to YouTube with some certain amount of intent. And as long as they use YouTube to try to satisfy that intent, that certainly helps our systems, right? Because our systems rely on kind of a faithful amount of behavior. And there are people who try to trick us.

[00:44:35]

Right. There are people and machines that try to associate videos together that really don't belong together. But they're trying to get that association made because it's profitable for them. And so we have to always be resilient to that sort of attempt at gaming the system.

[00:44:54]

So, speaking to that, there are a lot of people that, in a positive way perhaps, I don't know, I don't like it, but want to try to game the system to get more attention. Every creator, in a positive sense, wants to get attention, right?

[00:45:08]

So how do you work in this space when people create more and more sort of clickbaity titles and thumbnails? Veritasium's Derek has made a video which basically describes that it seems what works is to create a high-quality video, a really good video that people would want to watch and want to click on, but to have clickbaity titles and thumbnails to get them to click on it in the first place. And he's saying, I'm embracing this fact.

[00:45:38]

I'm just going to keep doing it. And I hope you forgive me for doing it and you will enjoy my videos once you click on them.

[00:45:45]

So in what sense do you see this kind of clickbait style? An attempt to manipulate, to get people in the door, to manipulate the algorithm, or play with the algorithm, or game the algorithm?

[00:46:00]

I think that you can look at it as an attempt to game the algorithm. But even if you were to take the algorithm out of it and just say, OK, well, all these videos happen to be lined up, and the algorithm didn't make any decision about which one to put at the top or the bottom, but they're all lined up there, which one are the people going to choose? And I'll tell you the same thing that I told Derek. You know, I have a bookshelf, and it has two kinds of books on it.

[00:46:29]

I have my math books from when I was a student, and they all look identical except for the titles on the covers. They're all yellow, they're all from Springer, and for every single one of them, the cover is totally the same.

[00:46:44]

Yes, right. Yeah. On the other hand, I have other more pop science type books and they all have very interesting covers. Right.

[00:46:52]

And they have provocative titles and things like that.

[00:46:56]

I mean, I wouldn't say that they're clickbaity, because they are indeed good books, and I don't think that they crossed any line. But, you know, that's just a decision you have to make, right? Like, the person who wrote Classical Recursion Theory, Piergiorgio Odifreddi, was fine with the yellow title and nothing more.

[00:47:19]

Whereas I think other people, who wrote a more popular type of book, understand that they need to have a compelling cover and a compelling title. And, you know, I don't think there's anything really wrong with that.

[00:47:35]

We do take steps to make sure that there is a line that you don't cross. And if you go too far, maybe your thumbnail is especially racy, or, you know, it's all caps with too many exclamation points,

[00:47:50]

we observe that users are kind of, you know, sometimes offended by that.

[00:47:58]

And so, for the users who are offended by that, we will then depress or suppress those videos.

[00:48:07]

Which reminds me, there's also another signal, where users can say... I don't know if it was recently added, but I really enjoy it:

[00:48:14]

just saying something like, I don't want to see this video anymore. Like, there are certain videos that just cut me the wrong way.

[00:48:26]

Like, they just jump out at me, and I'm like, I don't want this.

[00:48:28]

And it feels really good to clean that up, to be like, that's not for me.

[00:48:35]

I don't know, I think that might have been recently added, but that's also a really strong signal. Yes, absolutely, right.

[00:48:40]

We don't want to make a recommendation that people are unhappy with.

[00:48:46]

And that particular one makes me feel good as a user in general, and as a machine learning person, because I feel like I'm helping the algorithm with my interaction. I don't always feel like I'm helping the algorithm; like, I'm not reminded of that fact. Like, for example, Tesla and Autopilot. You know, Elon Musk creates a feeling for their customers, for people that own Teslas, that they're helping the algorithm of Tesla. Like, they're all really proud

[00:49:12]

they're helping it learn. I think YouTube doesn't always remind people that you're helping the algorithm get smarter. And for me, I love that idea. Like, we're all collaboratively... like Wikipedia gives that sense, that we're all together creating a beautiful thing. YouTube doesn't always remind me of that. This conversation is reminding me of that.

[00:49:34]

But, well, that's a good tip. We should keep that fact in mind when we design these features. I'm not sure I've really thought about it that way, but that's a very interesting perspective.

[00:49:44]

It's an interesting question of personalization: I feel like when I click like on a video, I'm just improving my experience.

[00:49:56]

It would be great, and people are different, but it would make me personally feel great if I was also helping the YouTube algorithm broadly learn something.

[00:50:04]

You know, what I'm saying is that, I don't know if that's human nature, but you want the products you love, and I certainly love YouTube, you want to help them get smarter and smarter, because there's some kind of coupling between our lives together being better. If YouTube is better, then

[00:50:23]

my life will be better. And that's that kind of reasoning. I'm not sure what that is, and I'm not sure how many people share that feeling. That could be just a machine learning feeling. But on that point, how much personalization is there in terms of next video recommendations?

[00:50:39]

So, is it kind of all really boiling down to clustering, like, find the nearest clusters to me and so on, that kind of thing? Or how much is personalized to me, the individual, completely?

[00:50:52]

It's very, very personalized. So your experience will be quite a bit different from anybody else's who's watching that same video, at least when they're logged in. And the reason is that we found that users often want two different kinds of things when they're watching a video. Sometimes they want to keep watching more on that topic, or more in that genre. And other times, they're just done and they're ready to move on to something else. And so the question is, well, what is the something else?

[00:51:29]

And one of the first things one can imagine is, well, maybe the something else is the latest video from some channel to which you've subscribed. And that's going to be very different for you than it is for me, right? And even if it's not something that you subscribe to, it's something that you watch a lot. And again, that'll be very different on a person-by-person basis.

[00:51:51]

And so even the watch next, as well as the homepage, of course, is quite personalized.
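A hedged sketch of those two intents for watch next, with assumed data structures (not YouTube's): on-topic candidates come from the related graph; move-on candidates come from channels this particular user subscribes to or watches a lot, which is what makes the result so personal.

```python
def watch_next_candidates(current_video, related_graph, subscriptions,
                          frequent_channels, latest_upload):
    """latest_upload: assumed helper, channel -> its newest video id."""
    keep_watching = list(related_graph.get(current_video, []))  # same topic
    move_on = [latest_upload(ch) for ch in subscriptions]       # move on
    move_on += [latest_upload(ch) for ch in frequent_channels]
    # A downstream ranker would score and interleave these two intents.
    return keep_watching + move_on
```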

[00:52:00]

So we mentioned some of the signals, but what does success look like? What does success look like in terms of the algorithm creating a great long-term experience for a user? Or, put another way, if you look at the videos I've watched this month, how do you know the algorithm succeeded for me?

[00:52:20]

I think, first of all, if you come back and watch more YouTube, then that's one indication that you found some value from it. So just the number of hours is a powerful indicator?

[00:52:30]

Well, I mean, not the hours themselves, but the fact that you return on another day.

[00:52:38]

So that's probably the most simple indicator. People don't come back to things that they don't find value in, right? There are a lot of other things that they could do. But like I said, I mean, ideally, we would like everybody to feel that YouTube enriches their lives and that every video they watch is the best one they've ever watched since they started watching YouTube. And so that's why we survey them and ask them, like, is this one to five stars?

[00:53:09]

And so our version of success is every time someone takes that survey, they say it's five stars. And if we ask them, is this the best video you've ever seen on YouTube, they say yes, every single time. So it's hard to imagine that we would actually achieve that. Maybe asymptotically we would get there. But that would be what we think success is.

[00:53:33]

It's funny, I've recently said somewhere, I don't know, maybe tweeted, that Ray Dalio has this video on the economic machine, I think it was called.

[00:53:44]

It's a 30-minute video, and I said it's the greatest video I've ever watched on YouTube. Like, I watched the whole thing and my mind was blown. It's a very crisp, clean description of how, at least, the American economic system works. It's a beautiful video. And I just wanted to click on something to say, this is the best.

[00:54:04]

This is the best thing ever.

[00:54:05]

Please... I can't believe I discovered it. I mean, the views and the likes reflect its quality, but I was almost upset that I hadn't found it earlier, and I wanted to find other things like it.

[00:54:17]

I don't think I've ever felt that about a video before, that this is the best video I've ever watched.

[00:54:21]

And that was that. And to me, the ultimate utopia, the best experience, is where I don't see any videos I regret, where every single video I watch is one that actually helps me grow, helps me enjoy life, be happy, and so on.

[00:54:39]

Well, that's a heck of a... that's one of the most beautiful and ambitious, I think, machine learning tasks.

[00:54:49]

So, when you look at a society, as opposed to the individual user, do you think about how YouTube is changing society, when you have these millions of people watching videos, growing, learning, changing, having debates? Do you have a sense of what the big impact on society is? Because I think it's huge. Do you have a sense of what direction we're taking this world in?

[00:55:13]

Well, I mean, I think, you know, openness has had an impact on society already. There's a lot of... What do you mean by openness?

[00:55:22]

Well, the fact that, unlike other mediums, there's not someone sitting at YouTube who decides, before you can upload your video, whether it's worth having you upload it, or worth anybody seeing it, really. And so, you know, there are some creators who say, like, I wouldn't have had this opportunity to reach an audience. Tyler Oakley often said that, you know, he wouldn't have had this opportunity to reach this audience if it weren't for YouTube.

[00:55:59]

And and so I think that's one way in which YouTube has changed society. I know that there are people that I work with from outside the United States, especially from places where literacy is low. And they think that YouTube can help in those places because you don't need to be able to read and write in order to learn something important for your life. Maybe you know how to do some job or how to fix something. And so that's another way in which I think YouTube is possibly changing society.

[00:56:38]

So I've worked at YouTube for eight, almost nine years now. And it's fun, because I meet people and, you know, you tell them where you work, you say you work on YouTube, and they immediately say, I love YouTube. Yeah, right. Which is great.

[00:56:54]

Makes me feel great. But then, of course, when I ask them, well, what is it that you love about YouTube, not one time ever has anybody said that the search works outstandingly or that the recommendations are great. What they always say when I ask them, what do you love about YouTube, is they immediately start talking about some channel or some creator or some topic or some community that they found on YouTube and that they just love. Yeah.

[00:57:24]

And so that has made me realize that YouTube is really about the video and connecting the people with the videos and then everything else kind of gets out of the way.

[00:57:39]

So, beyond the video, it's interesting, because you kind of mentioned creators. What about the connection with just the individual creators, as opposed to just individual videos? So, like, I gave the example of the Ray Dalio video, where the video itself is incredible. But there are some people who are just creators that I love.

[00:58:04]

One of the cool things about people who call themselves YouTubers, or whatever, is they have a journey. Usually, almost all of them, they suck horribly in the beginning, and then they kind of grow, you know, and then there's that genuineness in their growth.

[00:58:18]

So, you know, YouTube clearly wants to help creators connect with their audience in this kind of way. So how do you think about that process of helping creators grow, helping them connect with their audience, develop not just individual videos, but the entirety of a creator's life on YouTube?

[00:58:35]

Well, I mean, we're trying to help creators find the biggest audience that they can find.

[00:58:41]

And the reason why, you brought up creator versus video, the reason why the creator and channel are so important is because if we have a hope of people coming back to YouTube, well, they have to have in their minds some sense of what they're going to find when they come back to YouTube. If YouTube were just the next viral video, and I have no concept of what the next viral video could be, one time it's a cat playing a piano and the next day it's some children interrupting a reporter,

[00:59:18]

and the next day it's, you know, some other thing happening, then it's hard for me to tell, when I'm not watching YouTube, to say, gosh, I really, you know, would like to see something from someone or about something, right? And so that's why I think this connection between fans and creators is so important for both, because it's a way of sort of fostering a relationship that can play out into the future. Let me talk about kind of a dark and interesting question, in general.

[00:59:55]

And again, a topic that you, or nobody, has an answer to. But social media has a sense of, you know, it gives us highs and it gives us lows, in the sense that creators often speak about having burnout and having

[01:00:14]

psychological ups and downs, and challenges mentally in terms of continuing the creation process. There's a momentum, there's a huge excited audience, that makes creators feel great.

[01:00:25]

And I think it's more than just financial. I think it's literally just that they love that sense of community. It's part of the reason I upload to YouTube. I don't care about money, never will. What I care about is the community. But some people feel like they have to keep up this momentum, even when there are times in their life when, for some reason, they don't feel like creating. So how do you think about burnout, this mental exhaustion that some YouTube creators go through?

[01:00:55]

Is that something you have an answer for? Is it something you even think about? Well, the first thing is, we want to make sure that the YouTube systems are not contributing to this sense, right?

[01:01:05]

And so we've done a fair amount of research to demonstrate that you can absolutely take a break if you are a creator and you've been uploading a lot.

[01:01:17]

We have just as many examples of people who took a break and came back more popular than they were before, as we have examples of going the other way. Yeah. Can we pause on that?

[01:01:27]

For a second... The feeling that people have, I think, is that if I take a break, everybody, the party will leave, right?

[01:01:36]

So if you can just linger on that. So, in your sense, taking a break is OK? Yes, taking a break is absolutely OK. And the reason I say that is because we can observe many examples of creators coming back very strong, and even stronger, after they have taken some sort of break. And so I just want to dispel the myth that this somehow necessarily means that your channel is going to go down or lose viewers. That is not the case.

[01:02:12]

We know for sure that this is not a necessary outcome.

[01:02:16]

And so we want to encourage people to make sure that they take care of themselves.

[01:02:20]

That is job one, right? You have to look after yourself and your mental health.

[01:02:26]

And, you know, I think that in some of these cases it probably contributes to better videos once they come back. Right. Because, I mean, I know myself, if I'm burnt out on something, then I'm probably not doing my best work, even though I can keep working until I pass out. And so I think that taking a break may even improve the creative ideas that someone has.

[01:02:56]

OK, I think that's a really important myth to dispel. I think it applies to all of social media. Like, literally, I've taken a break for a day every once in a while.

[01:03:08]

Sorry if that sounds like a short time, but even with email, just taking a break from email, or only checking email once a day, especially when you're going through something psychologically in your personal life or so on, or really not sleeping much because of work deadlines, it can refresh you in a way that's profound. And so the same applies here. And it actually looks different when you come back; you see the brighter side of things, with some coffee, everything.

[01:03:37]

The world looks better. So it's important to take a break when you need it. So you've mentioned that the YouTube algorithm isn't, you know, E equals MC squared; it's not a single equation. It's potentially more than a million lines of code. Is it more akin to what successful autonomous vehicles are today, which is basically patches on top of patches of heuristics, with human experts really tuning the algorithm, plus some machine learning modules?

[01:04:15]

Or is it becoming more and more a giant machine learning system with humans just doing a little bit of tweaking here and there? What's your sense? First of all, do you even have a sense of what the algorithm is at this point, and to whatever extent you do have a sense, what does it look like?

[01:04:32]

Well, we don't usually think about it as the algorithm, because it's a bunch of systems that work on different services. The other thing that I think people don't understand is that what you might refer to as the YouTube algorithm from outside of YouTube is actually a bunch of code and machine learning systems and heuristics, but that's married with the behavior of all the people who come to YouTube every day.

[01:05:01]

So the people are part of the code, essentially. Exactly right. Like, if no people came to YouTube tomorrow, then the algorithm wouldn't work anymore. Right. So that's a critical part of the algorithm.

[01:05:12]

And so when people talk about, well, the algorithm does this, the algorithm does that, it's sometimes hard to understand. Well, you know, it could be that the viewers are doing that, and the algorithm is mostly just keeping track of what the viewers do and then reacting to those things in sort of more fine-grained situations. And I think that this is the way that the recommendation system and the search system, and probably many machine learning systems, evolve: you know, you start trying to solve a problem.

[01:05:45]

And the first way to solve a problem is often with a simple heuristic. Right. And, you know, you want to say, what are the videos we're going to recommend? Well, how about the most popular ones?

[01:05:56]

Right. And that's where you start. And over time, you collect some data and you refine your system so that you're making fewer mistakes, and you're building a system that can actually learn what to do in different situations based on some observations of those situations in the past. And you keep chipping away at these heuristics over time. And so I think that, just like with diversity, you know, the first diversity measure we took was: OK, not more than three videos in a row from the same channel.

[01:06:31]

Right. It's a pretty simple heuristic to encourage diversity, but it worked. Who needs to see four, five, six videos in a row from the same channel? And over time, we try to chip away at that and make it more fine-grained, and basically remove the heuristics in favor of something that can react to individuals and individual situations.
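To make that kind of rule concrete, here's a minimal sketch of what a "no more than N in a row from the same channel" pass over a ranked list could look like. All names are hypothetical; this illustrates the heuristic described above, not YouTube's actual code.

```python
# Minimal sketch of a "no more than N in a row from the same channel"
# diversity rule applied to a ranked list of recommendations (hypothetical).

def enforce_channel_diversity(ranked_videos, max_run=3):
    """Reorder candidates so no channel appears more than max_run times in a row."""
    result = []
    deferred = []  # videos pushed back because they would extend a run
    for video in ranked_videos:
        run = 0
        for prev in reversed(result):
            if prev["channel"] != video["channel"]:
                break
            run += 1
        if run >= max_run:
            deferred.append(video)  # re-queue instead of extending the run
        else:
            result.append(video)
            # try to re-insert deferred videos now that the run is broken
            # (simplified: only checks immediate adjacency on re-insert)
            still_deferred = []
            for d in deferred:
                if result[-1]["channel"] != d["channel"]:
                    result.append(d)
                else:
                    still_deferred.append(d)
            deferred = still_deferred
    return result + deferred  # anything left over goes at the end

videos = [{"id": i, "channel": c} for i, c in enumerate("AAAABAC")]
print([v["channel"] for v in enforce_channel_diversity(videos)])
# ['A', 'A', 'A', 'B', 'A', 'A', 'C']
```

The point of a pass like this is exactly what's described above: it's crude, it treats every viewer the same, and the natural evolution is to replace it with a learned model of how much repetition each individual viewer actually wants.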

[01:06:58]

So, you mentioned that we know something worked. How do you get a sense, when decisions are A/B tested, that this idea was a good one and that one was not so good? How do you measure that, and across which time scale, across how many users, that kind of thing?

[01:07:19]

Well, you mentioned A/B experiments, and just about every single change we make to YouTube, we make only after we've run a big experiment.

[01:07:30]

And so in those experiments, which run from one week to months, we measure hundreds, literally hundreds of different variables, and measure changes with confidence intervals on all of them, because we really are trying to get a sense for, ultimately, does this improve the experience for viewers? That's the question we're trying to answer. And an experiment is one way, because we can see certain things go up and down. So, for instance, if we noticed in the experiment that people are dismissing videos less frequently, or they're saying that they're more satisfied, they're giving more videos five stars after they watch them, then those would be indications that the experiment is successful, that it's improving the situation for viewers.
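For a concrete picture of that kind of measurement, here's a rough sketch of comparing one metric between the control and treatment arms of an A/B experiment, with a 95% confidence interval on the difference. The data and the decision rule are made up for illustration; this is not YouTube's experiment framework.

```python
# Sketch: 95% confidence interval for the difference in a per-user metric
# between control and treatment arms of an A/B experiment (illustrative).
import math

def mean_and_var(xs):
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / (n - 1)  # sample variance
    return m, var, n

def delta_confidence_interval(control, treatment, z=1.96):
    """Return (delta, (lo, hi)): treatment minus control with a 95% CI."""
    mc, vc, nc = mean_and_var(control)
    mt, vt, nt = mean_and_var(treatment)
    delta = mt - mc
    se = math.sqrt(vc / nc + vt / nt)  # standard error of the difference
    return delta, (delta - z * se, delta + z * se)

# hypothetical per-user satisfaction ratings in each arm
control = [3.8, 4.0, 3.5, 4.1, 3.9, 3.7]
treatment = [4.2, 4.0, 4.3, 3.9, 4.4, 4.1]
delta, (lo, hi) = delta_confidence_interval(control, treatment)
print(f"delta={delta:+.2f}, 95% CI=({lo:+.2f}, {hi:+.2f})")
# a launch rule might require the entire interval to sit above zero
```

With hundreds of such metrics per experiment, the hard part is deciding which movements matter and accounting for the fact that some intervals will exclude zero by chance.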

[01:08:25]

But we can also look at other things. We might do user studies, where we invite some people in and ask them, what do you think about this? What do you think about that? How do you feel about this? And other various kinds of user research. But ultimately, before we launch something, we're going to want to run an experiment, so we can get a sense for what the impact is going to be, not just on the viewers, but also on the different channels and all of that. An absurd question:

[01:08:54]

Nobody knows the answer, but it's interesting. Maybe there's an answer. If I want to make a viral video, how do I do it? I don't know how you make a viral video. I know that we have, in the past, tried to figure out if we could detect when a video was going to go viral. For those, you take the first and second derivatives of the view count and maybe use that to do some prediction. But I can't say we ever got very good at that.
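The derivative idea is easy to sketch: with an hourly view-count series, the first difference is the growth rate and the second difference is the acceleration, and a video whose growth is itself accelerating is a candidate for going viral. The numbers and threshold below are made up for illustration.

```python
# Sketch: discrete first and second derivatives of a view-count series
# as a crude "might be going viral" signal (threshold is hypothetical).

def discrete_derivatives(views):
    """First and second differences of an hourly view-count series."""
    d1 = [b - a for a, b in zip(views, views[1:])]  # growth rate
    d2 = [b - a for a, b in zip(d1, d1[1:])]        # acceleration
    return d1, d2

def looks_viral(views, accel_threshold=1000):
    d1, d2 = discrete_derivatives(views)
    # views are not just rising: the rise itself is rising
    return len(d2) > 0 and d2[-1] > accel_threshold

hourly_views = [100, 300, 900, 2700, 8100]  # hypothetical hourly totals
d1, d2 = discrete_derivatives(hourly_views)
print(d1)                         # [200, 600, 1800, 5400]
print(d2)                         # [400, 1200, 3600]
print(looks_viral(hourly_views))  # True
```

As the conversation notes, a signal like this detects virality once it's underway; it says nothing about how to cause it.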

[01:09:27]

Oftentimes we look at where the traffic was coming from. You know, if a lot of the viewership is coming from something like Twitter, then maybe it has a higher chance of becoming viral than if it were coming from search or something. But that was just trying to detect a video that might be viral. How to make one? I have no idea. So you get your kids to interrupt you while you're on the news.

[01:09:54]

Absolutely.

[01:09:55]

But that's after the fact. For one individual video, ahead of time, predicting is a really hard task. But after the video went viral, in analysis, can you sometimes understand why it went viral? From the perspective of YouTube broadly, first of all, is it even interesting for YouTube that a particular video is viral, or does that not matter for the experience of people?

[01:10:22]

Well, I think people expect that if a video is going viral and it's something they would be interested in, then they would expect YouTube to recommend it to them. Right.

[01:10:33]

So if something is going viral, it's good to just let people ride the wave of its virality.

[01:10:40]

Well, I mean, we want to meet people's expectations in that way, of course. So, like I mentioned, I hung out with Derek Muller a while ago, a couple of months back. He's actually the person who suggested I talk to you on this podcast. All right. Well, thank you, Derek.

[01:10:57]

At that time, he just recently posted an awesome science video titled Why Are 96 Million Black Balls on This Reservoir?

[01:11:07]

And in a matter of, I don't know how long, but like a few days, it got 30 million views and it's still growing. Is this something you can analyze and understand, why it happened, with this video or any one particular video like it?

[01:11:23]

I mean, we can surely see where it was recommended, where it was found, who watched it, and those sorts of things. So it's actually... sorry to interrupt.

[01:11:33]

It's the video that helped me discover who Derek is. I didn't know who he was before.

[01:11:38]

So I remember, you know, usually I just have all of these technical, boring MIT and Stanford talks in my recommendations, because that's what I watch. And then all of a sudden there's this black balls in the reservoir video, with, like, an excited nerd, and I thought, why is this being recommended to me?

[01:11:57]

So I clicked on it and watched the whole thing, and it was awesome. And then a lot of people had that experience, like, why was I recommended this? But of course they all watched it and enjoyed it. What's your sense of this wave of recommendation that comes with a viral video, that ultimately people get to enjoy after they click on it?

[01:12:17]

Well, I think it's the system, you know, basically doing what anybody who's recommending something would do, which is: you show it to some people, and if they like it, you say, OK, well, can I find some more people who are a little bit like them? OK, I'm going to try it with them. Oh, they like it, too. Let me expand the circle and find some more people. Oh, it turns out they like it, too.

[01:12:35]

And you just keep going until you get some feedback that says: no, now you've gone too far, these people don't like it anymore. And so I think that's basically what happened.
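As a toy sketch of that expanding-circle dynamic: imagine users as points in a taste space, and keep widening the audience around whoever liked the video until the like-rate in a new batch drops. Everything here, the similarity measure, the thresholds, the data, is illustrative, not YouTube's recommender.

```python
# Toy "expanding circle": widen a video's audience around the people who
# liked it until feedback says we've gone too far (all values hypothetical).
import random

def similarity(u, v):
    """Closeness in a 2-d taste space (negative Euclidean distance)."""
    return -((u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2) ** 0.5

def expand_audience(users, liked, seed, min_like_rate=0.5, batch=100):
    shown = set()
    frontier = [seed]  # the current circle is built around these fans
    while True:
        # rank not-yet-shown users by closeness to someone in the frontier
        candidates = sorted(
            (u for u in range(len(users)) if u not in shown),
            key=lambda u: max(similarity(users[u], users[f]) for f in frontier),
            reverse=True,
        )[:batch]
        if not candidates:
            break
        likes = [u for u in candidates if liked(u)]
        shown.update(candidates)
        if len(likes) / len(candidates) < min_like_rate:
            break  # feedback says: these people don't like it anymore
        frontier = likes  # the next circle grows around the new fans
    return shown

random.seed(0)
users = [(random.random(), random.random()) for _ in range(1000)]
liked = lambda u: users[u][0] > 0.6  # hypothetical ground-truth taste
reached = expand_audience(users, liked, seed=0)
print(len(reached), "users reached before the like-rate dropped off")
```

The real system works with vastly richer signals than a single like-rate, but the stop-when-feedback-turns-negative shape is the same one described above.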

[01:12:45]

Now, you asked me about how to make a video go viral or make a viral video.

[01:12:52]

I don't think that if you or I decided to make a video about ninety-six million balls, it would also go viral.

[01:12:59]

It's possible that Derek made the canonical video about those black balls in a lake. And he did, actually.

[01:13:08]

Right. And so I don't know whether or not just following along is the secret.

[01:13:14]

Right? Yeah, but it's fascinating. I mean, just like you said, the algorithm sort of expanding that circle and figuring out that more and more people did enjoy it and that sort of phase shift of just a huge number of people enjoying it.

[01:13:28]

And the algorithm quickly, automatically, I assume, figuring that out. I don't know, the dynamics, the psychology of that, is a beautiful thing.

[01:13:36]

And what do you think about the idea of clipping?

[01:13:42]

Too many people annoyed me into doing it; they kept requesting it.

[01:13:46]

They said it would be very beneficial to add clips of, like, the coolest points, and actually have explicit videos, like a short clip of just that moment, the way other podcasts are doing it. As opposed to just timestamps for the topics, you know, they want the clip. Do you see YouTube somehow helping creators with that process, or helping connect clips to the original videos? Or is that just on a long list of amazing features to work towards?

[01:14:17]

Yeah, I mean, it's not something that I think we've done yet, but I can tell you that I think clipping is great, and I think it's actually great for you as a creator.

[01:14:29]

And here's the reason. If you think about, I mean, let's say the NBA is uploading videos of its games. Well, people might search for Warriors versus Rockets, or they might search for Steph Curry, and so a highlight from the game in which Steph Curry makes an amazing shot is an opportunity for someone to find a portion of that video. And I think that you never know how people are going to search for something that you've created. So I would say you want to make clips and add titles and things like that, so that they can find it as easily as possible.

[01:15:15]

Do you ever dream of a future, perhaps a distant future, when the YouTube algorithm figures that out, sort of automatically detects the parts of the video that are really interesting, exciting, potentially exciting for people, and sort of clips them out, in this incredibly rich space?

[01:15:34]

Because if you think about even just this conversation, we probably covered 30, 40 little topics, and there's a huge space of users who would each find, you know, 30 percent of those topics really interesting, and that space is different for everyone.

[01:15:50]

It's something that's beyond my ability to clip out, right? But the algorithm might be able to figure all that out, sort of expand into clips. Do you think about this kind of thing? Do you have a hope, a dream, that one day the algorithm will be able to do that kind of deep content analysis?

[01:16:07]

Well, we've actually had projects that attempt to achieve this, but it really does depend on understanding the video well, and our understanding of the video right now is quite crude. And so I think it would be especially hard to do it with a conversation like this one. You might be able to do it with, let's say, a soccer match more easily. Right. You could probably find out where the goals were scored. And then, of course, you need to figure out who it was that scored the goal.

[01:16:41]

And that might require a human to do some annotation. But I think that trying to identify coherent topics in a transcript, like the one of our conversation, is not something that we're going to be very good at right away.

[01:16:59]

And I was speaking more to the general problem, actually, of being able to do both the soccer match and our conversation without explicit annotation. My hope was that there exists an algorithm that's able to find exciting things in video.

[01:17:17]

So Google now on Google Search will help you find the segment of the video that you're interested in.

[01:17:23]

So if you search for something like how to change the filter in my dishwasher, then if there's a long video about your dishwasher and this is the part where the person shows you how to change the filter, it will highlight that area and provide a link directly to it.

[01:17:40]

And, from your recollection, do you know if the thumbnail reflects that? Like, what's the difference between showing the full video and the shorter clip? Do you know how it's presented in search results?

[01:17:51]

I don't remember how it's presented.

[01:17:53]

And the other thing I would say is that right now it's based on creator annotations. Got it.

[01:18:00]

So it's not yet the thing I'm talking about, but folks are working on the more automatic version. It's interesting; people might not imagine this, but a lot of our systems start out by using almost entirely the audience behavior, and then, as they get better, the refinement comes from using the content. I know there are privacy concerns, but I wish you would explore the space of, sort of, putting a camera on the users, if they allowed it, to study them. I did a lot of emotion recognition work and so on, to study actual, sort of, richer signals.

[01:18:44]

One of the cool things about uploading 360, like VR, video to YouTube, and I've done this a few times, I've uploaded videos of myself, this horrible idea, some people enjoyed it, but whatever, videos of me giving a lecture shot in 360 with a 360 camera. And it's cool because YouTube allows you to then watch where people look; there's a heat map of where, you know, the center of the VR experience was. And it's interesting because that reveals to you what people looked at.

[01:19:14]

And it's not always what you were expecting. In the case of the lecture it's pretty boring; it is what you'd expect. But we did a few funny videos where there's a bunch of people doing things, and yeah, everybody tracks those people. You know, in the beginning they all look at the main person, and then they start spreading around and looking at other people. It's fascinating. So that kind of thing is a really strong signal of what people found exciting in the video.
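For what it's worth, a heat map like that is conceptually simple to aggregate: bin each viewer's gaze direction (the yaw and pitch of their viewport center) per second and count. The data layout and bin sizes below are assumptions for illustration, not YouTube's actual pipeline.

```python
# Sketch: aggregate 360-video gaze samples into a per-second heat map
# (hypothetical data layout; illustrative only).
from collections import Counter

def heatmap(gaze_samples, yaw_bins=36, pitch_bins=18):
    """gaze_samples: (second, yaw_deg in [0, 360), pitch_deg in [-90, 90])."""
    counts = Counter()
    for second, yaw, pitch in gaze_samples:
        ybin = int((yaw % 360) // (360 / yaw_bins))
        pbin = min(int((pitch + 90) // (180 / pitch_bins)), pitch_bins - 1)
        counts[(second, ybin, pbin)] += 1
    return counts

samples = [(0, 5.0, 0.0), (0, 8.0, 2.0), (0, 170.0, 10.0), (1, 12.0, 1.0)]
print(heatmap(samples).most_common(1))
# [((0, 0, 9), 2)]: at second 0, the hottest cell is near yaw 0, the speaker
```

The hard part isn't the counting; it's that ordinary, non-360 playback gives you no equivalent gaze signal, which is exactly the gap discussed next.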

[01:19:38]

I don't know how you get that from people just watching, except that they tuned out at some point. It's hard to measure.

[01:19:47]

That this moment was super exciting for people. I don't know how you get that signal. Maybe comments. Is there a way to get that signal of, this is when their eyes opened up? Like for me with that video.

[01:19:59]

Right? Like, first I was like, OK, here's another one of these, like, dumbed-down-for-you videos, and then you start watching it.

[01:20:06]

And it's like, OK, this is a really crisp, clean, deep explanation of how the economy works. That's when I, like, sat up and watched. Right, that moment. Is there a way to detect that moment?

[01:20:16]

The only way I can think of is by asking people to just label it.

[01:20:20]

Yeah. You mentioned that we're quite far away in terms of doing deep video analysis. Of course, Google, YouTube... you know, we're quite far away from solving the autonomous driving problem, too. It's... I don't know.

[01:20:37]

I think we're closer to that. Well, you know, you never know. The Wright brothers thought they weren't going to fly for fifty years, three years before they flew.

[01:20:47]

So what are the biggest challenges, would you say? Is it the broad challenge of understanding video, understanding natural language, the challenge before the entire machine learning community of just being able to understand data? Or is there something specific to video that's even more challenging than natural language understanding?

[01:21:09]

What's your sense of the biggest challenge? Video is just so much information, and so precision becomes a real problem. You know, you're trying to classify something and you've got a million classes, and the distinctions among them, at least from a machine learning perspective, are often pretty small.

[01:21:39]

You know, you need to see this person's number in order to know which player it is, and there are a lot of players. Or you need to see the logo on their chest in order to know which team they play for. And that's just figuring out who's who. Then you go further and say, OK, well, was that a goal or was it not a goal? Is that an interesting moment, as you said, or is that not an interesting moment?

[01:22:08]

These things can be pretty hard, so.

[01:22:11]

OK, so Yann LeCun, I'm not sure if you're familiar with his current thinking and work. He believes that what he refers to as self-supervised learning will be the solution, sort of, to achieving this kind of greater level of intelligence. In fact, the thing he's focusing on is watching video and predicting the next frame. So, predicting future video. Right.

[01:22:35]

For now, we're very far from that. But his thought is, because it's unsupervised, or, as he refers to it, self-supervised, you know, if you watch enough video, essentially, if you watch YouTube, you'll be able to learn about the nature of reality, the physics, the common sense reasoning required, by just teaching a system to predict the next frame. So he's confident this is the way to go.

[01:22:59]

So, from the perspective of just working with this video, do you think an algorithm that just watches all of YouTube, stays up all day and night watching YouTube, would be able to understand enough of the physics of the world, of the way this world works, to be able to do common sense reasoning and so on?

[01:23:23]

Well, I mean, we have systems that already watch all the videos on YouTube, right, but they're just looking for very specific things, right. They're supervised learning systems that are trying to identify something or classify something.

[01:23:38]

And I don't know if predicting the next frame is really going to get us there, because, I'm not an expert on compression algorithms, but I understand that that's kind of what video compression algorithms do: they basically try to predict the next frame and then fix up the places where they got it wrong. And that leads to higher compression than if you actually put all the bits for the next frame there.

[01:24:03]

So I don't know if I believe that just being able to predict the next frame is going to be enough, because there are so many frames, and even a tiny bit of error on a per-frame basis can lead to wildly different videos.

[01:24:19]

So the thing is, one way to do compression is to describe through text what's contained in the video. That's the ultimate high-level compression. Traditionally, when you think of video or image compression, you're trying to maintain the same visual quality while reducing the size. But if you think of deep learning from a bigger perspective of what compression is, you're trying to summarize the video. And the idea there is, if you have a big enough neural network, then by watching the video, trying to predict the next frame, you'll be able to form a compression that actually understands what's going on in the scene.

[01:24:59]

If there are two people talking, you can reduce the entire video to the fact that two people are talking, and maybe the content of what they're saying, and so on. That's kind of the open-ended dream.
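To make the self-supervised setup concrete, here's a toy next-frame predictor in PyTorch: the training target for each example is simply the frame that follows, so no human labels are involved. The architecture and random "video" are stand-ins; this illustrates the training signal LeCun describes, not a system that could actually learn physics from YouTube.

```python
# Toy self-supervised next-frame prediction: the "label" is just the next frame.
import torch
import torch.nn as nn

class NextFramePredictor(nn.Module):
    def __init__(self, context=4):
        super().__init__()
        # stack of `context` grayscale frames in, one predicted frame out
        self.net = nn.Sequential(
            nn.Conv2d(context, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, frames):  # frames: (batch, context, H, W)
        return self.net(frames)

model = NextFramePredictor(context=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

video = torch.rand(100, 1, 64, 64)  # stand-in for real video frames
for step in range(10):
    t = torch.randint(0, 100 - 5, (1,)).item()
    context = video[t:t + 4].squeeze(1).unsqueeze(0)  # (1, 4, 64, 64)
    target = video[t + 4].unsqueeze(0)                # (1, 1, 64, 64)
    loss = loss_fn(model(context), target)            # error vs. the real next frame
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Christos's objection maps directly onto this setup: pixel-level error compounds across thousands of frames, which is why he doubts this alone yields understanding rather than just a good codec.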

[01:25:12]

So I just wanted to sort of express that. It's an interesting, compelling notion. But it is nevertheless true that video, our world, is a lot more complicated than we give it credit for.

[01:25:24]

I mean, in terms of search and discovery, we have been working on trying to summarize videos in text, or with some kind of labels, for eight years at least, and progress has been kind of slow.

[01:25:41]

And so if you were to say the problem will someday be one hundred percent solved, and eight years ago it was zero percent solved, where are we on that timeline, would you say, for summarizing a video?

[01:25:55]

Well, maybe less than a quarter of the way. So on that topic, what does YouTube look like 10, 20, 30 years from now?

[01:26:08]

I mean, I think that YouTube is evolving to take the place of TV. You know, I grew up as a kid in the 70s, and I watched a tremendous amount of television. And I feel sorry for my poor mom, because people told her at the time that it was going to rot my brain and that she should kill her television. But anyway, I mean, I think that YouTube is, at least for my family, a better version of television, right?

[01:26:38]

It's one that is on demand. It's more tailored to the things that my kids want to watch. And also they can find things that they would never have found on television. And so I think that at least from just observing my own family, that's where we're headed, is that people watch YouTube kind of in the same way that I watch television when I was younger.

[01:27:03]

So, from a search and discovery perspective, what are you excited about in the five, 10, 20, 30 years?

[01:27:10]

Like, what kind of things? It's already really good.

[01:27:13]

I think it's achieved a lot. Of course, we don't know what's possible.

[01:27:18]

So there's the task of search, of typing in text, or discovering new videos by the next recommendation. I personally am really happy with the experience I continue to have. I rarely watch a video that's not awesome, from my perspective. But what else is possible? What are you excited about?

[01:27:38]

Well, I think introducing people to more of what's available on YouTube is not only very important to YouTube and to creators, but I think it will help enrich people's lives, because there's a lot that I'm still finding out is available on YouTube that I didn't even know about. I've been working on YouTube for eight years, and it wasn't until last year that I learned that I could watch USC football games from the 1970s. I didn't even know that was possible until last year.

[01:28:11]

And I've been working here for quite some time. So, you know, what was broken about that? This stuff was already on YouTube when I got here, and it took me seven years to learn that. So I think there's a big opportunity there.

[01:28:23]

And then, as I said before, you know, we want to make sure that YouTube finds a way to ensure that it's acting responsibly with respect to society and enriching people's lives. So we want to take all of the great things that it does and make sure that we are eliminating the negative consequences that might happen. And then lastly, if we could get to a point where all the videos people watch are the best ones they've ever watched, that would be outstanding.

[01:28:57]

Do you see it, in many senses, becoming a window into the world for people? Especially with live video, you get to watch events. I mean, it's really the way you experience a lot of the world that's out there, better than TV in many, many ways. So do you see it becoming more than just video? Do you see creators creating visual experiences and virtual worlds? I'm talking crazy now, but sort of virtual reality and entering that space. Is that, at least for now, totally outside of what YouTube is thinking about?

[01:29:30]

I mean, I think Google's thinking about virtual reality. I don't think about virtual reality too much. I know that we would want to make sure that YouTube is there when, or if, virtual reality becomes something that a lot of people are interested in. But I haven't seen it really take off yet.

[01:29:55]

Well, the future is wide open. Christos, I've been really looking forward to this conversation. It has been a huge honor. Thank you for answering some of the more difficult questions I've asked. I'm really excited about what YouTube has in store for us. It's one of the greatest products I've ever used and continues to be. So thank you so much for talking to me.

[01:30:13]

It's my pleasure. Thanks for asking me. Thanks for listening to this conversation, and thank you to our presenting sponsor, Cash App. Download it, use code LexPodcast, and you'll get ten dollars, and ten dollars will go to FIRST, a STEM education nonprofit that inspires hundreds of thousands of young minds to become future leaders and innovators. If you enjoy this podcast, subscribe on YouTube, give it five stars on Apple Podcasts, follow on Spotify, support it on Patreon, or simply connect with me on Twitter.

[01:30:44]

Now, let me leave you with some words of wisdom from Marcel Proust: "The real voyage of discovery consists not in seeking new landscapes, but in having new eyes." Thank you for listening, and hope to see you next time.