[00:00:00]

The following is a conversation with Kevin Scott, the CTO of Microsoft. Before that, he was the senior vice president of engineering and operations at LinkedIn, and before that, he oversaw mobile ads engineering at Google. He also has a podcast called Behind the Tech with Kevin Scott, which I am a fan of. This was a fun and wide-ranging conversation that covered many aspects of computing. It happened over a month ago, before the announcement of Microsoft's investment in OpenAI that a few people have asked me about.

[00:00:34]

I'm sure in the future I'll talk with one or two people about the impact of that investment. This is the Artificial Intelligence podcast. If you enjoy it, subscribe on YouTube, give it five stars on iTunes, support it on Patreon, or simply connect with me on Twitter at lexfridman, spelled F-R-I-D-M-A-N. And I'd like to give a special thank you to Tom and Delonte, big cousin, for their support of the podcast on Patreon.

[00:01:04]

Thanks, Tom, in L.A. Hope I didn't mess up your last name too bad. Your support means a lot and inspires me to keep this series going. And now, here's my conversation with Kevin Scott. You've described yourself as a kid in a candy store at Microsoft because of all the interesting projects that are going on.

[00:01:41]

Can you try to do the impossible task and give a brief, whirlwind view of all the spaces that Microsoft is working in, both research and product?

[00:01:54]

If you include research, it becomes even more difficult.

[00:01:59]

So, I think, broadly speaking, Microsoft's product portfolio includes everything from a big cloud business, a big set of SaaS services. We have, you know, some of what are among the original productivity software products that everybody uses. We have an operating system business. We have a hardware business where we make everything from computer mice and headphones to high-end personal computers and laptops. We have a fairly broad-ranging research group, where we have people doing everything from economics research.

[00:02:48]

Like, there's this really, really smart young economist, Glen Weyl, who my group works with a lot.

[00:02:52]

He's doing this research on these things called radical markets; he's written an entire technical book about this whole notion of radical markets. And the research group sort of spans from that to human-computer interaction to artificial intelligence.

[00:03:13]

And we have GitHub, we have LinkedIn, we have a search, advertising, and news business, and probably a bunch of stuff that I'm embarrassingly not recounting in this list. Gaming too, Xbox and so on?

[00:03:29]

Yeah, gaming for sure. I was having a super fun conversation this morning with Phil Spencer. When I was in college, there was this game that LucasArts made called Day of the Tentacle that my friends and I played forever. And we're doing some interesting collaboration now with the folks who made Day of the Tentacle.

[00:03:55]

And I was completely nerding out this morning with Tim Schafer, the guy who wrote Day of the Tentacle, just a complete fanboy. Which, you know, sort of happens a lot. Microsoft has been doing so much stuff, at such breadth, for such a long period of time that, you know, being CTO, most of the time my job is very, very serious.

[00:04:19]

And sometimes I get caught up in how amazing it is to be able to have the conversations that I have, with the people I get to have them with.

[00:04:31]

You have to reach back into the sentimental there. So what's the radical markets idea in the economics research?

[00:04:38]

So the idea with radical markets is: can you come up with new market-based mechanisms to, you know, well, we're having this debate right now: does capitalism work? Do free markets work?

[00:04:57]

Can the incentive structures that are built into these systems produce outcomes that are creating sort of equitably distributed benefits for every member of society?

[00:05:12]

You know, I think it's a reasonable set of questions to be asking. And so one mode of thought there is, if you have doubts that the markets are actually working, you can sort of tip towards, okay, let's become more socialist: have central planning, and governments or some other central organization making a bunch of decisions about how work gets done and about where the investments, and the outputs of those investments, get distributed.

[00:05:45]

Glen's notion is: lean more into the market-based mechanisms. So, for instance, and this is one of the more radical ideas, suppose that

[00:05:59]

You had a radical pricing mechanism for assets like real estate, where you could be bid out of your position

[00:06:12]

in your home, for instance. So if somebody came along and said, you know, I can find higher economic utility for this piece of real estate that you're running your business in, then you either have to bid to stay, or the thing that's got the higher economic utility sort of takes over the asset. Which would make it very difficult to have the same sort of rent-seeking behaviors that you've got right now, because if you did speculative bidding,

[00:06:52]

You would very quickly lose a whole lot of money. And so the prices of the assets would be sort of very closely indexed to the value that they can produce. And because you'd have this real-time mechanism that would force you to mark the value of the asset to the market, it could be taxed appropriately. You couldn't sit on this thing and say, oh, this house is only worth ten thousand bucks, when everything around it is worth 10 million.

[00:07:24]

So the incentive structures, or the prices, match the value much better. Yeah.

[00:07:30]

So Glen does a much, much better job than I do at selling it, and I probably picked the world's worst example, you know. And it's intentionally provocative.

[00:07:41]

So this whole notion, you know, I'm not sure whether I like this notion that we can have a set of market mechanisms where I could get bid out of my property. But if you're thinking about something like

[00:07:56]

Elizabeth Warren's wealth tax, for instance, it would be really interesting how you would actually set the price on the assets, and you might have to have a mechanism like that if you put a tax like that in place.
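As an aside, the self-assessed pricing scheme Kevin describes here is usually called a Harberger tax in Glen Weyl's radical-markets work, and its core loop is simple enough to sketch in a few lines. This is only a toy illustration of the idea as described above; the asset, the names, and the 7 percent tax rate are all invented for the example.

```python
# Toy sketch of a self-assessed ("Harberger") tax mechanism.
# Each owner declares a price for their asset, pays tax on that
# declared price, and anyone may take the asset by paying it.
# Declaring too low invites a buyout; too high inflates your taxes.
from dataclasses import dataclass


@dataclass
class Asset:
    owner: str
    declared_price: float  # the owner's self-assessed value


TAX_RATE = 0.07  # illustrative annual tax rate on the declared price


def annual_tax(asset: Asset) -> float:
    """Tax owed each year on the self-assessed value."""
    return asset.declared_price * TAX_RATE


def try_buyout(asset: Asset, bidder: str, bid: float) -> bool:
    """Anyone who offers at least the declared price takes the asset."""
    if bid >= asset.declared_price:
        asset.owner = bidder
        asset.declared_price = bid  # new owner is now taxed on their bid
        return True
    return False


house = Asset(owner="alice", declared_price=10_000)
print(annual_tax(house))          # a low declaration keeps the tax low...
try_buyout(house, "bob", 10_000)  # ...but anyone can buy you out at it
print(house.owner)
```

The two incentives the mechanism creates are both visible in the sketch: declaring a low price cuts your tax bill but lets anyone take the asset at that price, while declaring a high price protects your ownership but raises what you owe.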

[00:08:11]

It's really interesting that that kind of research, at least tangentially, is touching Microsoft Research. Yeah. That you're really thinking that broadly. And maybe you can speak to this.

[00:08:24]

This connects to AI. So we have a candidate, Andrew Yang, who talks about artificial intelligence and the concern that people have about automation's impact on society. And arguably, Microsoft is at the cutting edge of innovation in all these kinds of ways, and so it's pushing forward. How do you think about combining all our conversations together here, radical markets, socialism, the innovation that Microsoft is doing, and then Andrew Yang's worry that all that will result in job loss for the lower end of the workforce and so on?

[00:09:03]

How do you think about that?

[00:09:04]

I think it's sort of one of the most important questions in technology, maybe even in society, right now: how is AI going to develop over the course of the next several decades, and what's it going to be used for, and what benefits will it produce, and what negative impacts will it produce, and who gets to steer this whole thing? You know, I'll say it at the highest level.

[00:09:34]

One of the real joys of getting to do what I do at Microsoft is that Microsoft has this heritage as a platform company. And, you know, Bill has this saying from a bunch of years ago: the measure of a successful platform is that it produces far more economic value for the people who build on top of the platform than is created for the platform owner or builder.

[00:10:04]

And I think we have to think about AI that way.

[00:10:08]

Like a platform. Yeah. It has to be a platform that other people can use to build businesses, to fulfill their creative objectives, to be entrepreneurs, to solve problems that they have in their work and in their lives. It can't be a thing where there are a handful of companies, sitting in a very small handful of cities geographically, who are making all the decisions about what goes into the AI, and then, on top of all this infrastructure, building all of the commercially valuable uses for it.

[00:10:48]

So I think that's bad from a, you know, sort of economics and equitable-distribution-of-value perspective, sort of back to this whole notion of, do the markets work? But I think it's also bad from an innovation perspective, because I have infinite amounts of faith in human beings: if you give folks powerful tools, they will go do interesting things.

[00:11:15]

And it's more than just a few tens of thousands of people who should have the interesting tools; it should be millions of people with the tools.

[00:11:22]

So, you know, think about the steam engine in the late 18th century. It was maybe the first large-scale substitute for human labor that we'd built as a machine.

[00:11:36]

And in the beginning, when these things were getting deployed, the folks who got most of the value from the steam engines were the folks who had capital, so they could afford to build them, and they built factories and businesses around them, and the experts who knew how to build and maintain them.

[00:11:55]

But access to that technology democratized over time. Like, now an engine is not a differentiated thing. There isn't one engine company that builds all the engines, where all of the things that use engines are made by this one company.

[00:12:11]

And they get all the economics from all of that. That's not the world we got. Like, we're sitting here in this room, and even though we don't think about it, there are probably things like the MEMS gyroscopes that are in both of our phones. They're like little engines, you know, sort of everywhere.

[00:12:30]

They're just a component in how we build the modern world. The AI needs to get there. Yeah.

[00:12:35]

So that's a really powerful way to think of it: if we think of AI as a platform, versus a tool that Microsoft owns, as a platform that enables creation on top of it, as a way to democratize it. That's really interesting, actually. And Microsoft, throughout its history, has been positioned well to do that.

[00:12:55]

And, you know, the tie-back to this radical markets thing:

[00:12:59]

so my team has been working with Glen on this, and with Jaron Lanier, actually. Jaron is, like, the sort of father of virtual reality. He's one of the most interesting human beings on the planet, like a sweet, sweet guy. And so Jaron and Glen and the folks on my team have been working on this notion of data as labor, or, as they call it, data dignity as well.

[00:13:30]

And so the idea is that, again, going back to this sort of industrial analogy, if you think about data as the raw material that is consumed by the machine of AI in order to do useful things, then we're not doing a really great job right now in having transparent marketplaces for valuing those data contributions. And we all make them. Explicitly, like, you go to LinkedIn, you sort of set up your profile on LinkedIn; that's an explicit contribution. You know exactly the information that you're putting into the system.

[00:14:07]

And you put it there because you have some nominal notion of what value you're going to get in return, but it's only nominal. You don't know exactly what value you're getting in return. The service is free; there's some low amount of perceived value. And then you've got all this indirect contribution that you're making just by virtue of interacting with all of the technology that's in your daily life.

[00:14:30]

And so what Glen and Jaron and this data dignity team are trying to do is: can we figure out a set of mechanisms that lets us value those data contributions, so that you could create an economy, and a set of controls and incentives, that would allow people to, maybe even in the limit, earn part of their living through the data that they're creating?

[00:14:58]

And you can sort of see it in explicit ways already.

[00:14:59]

There are these companies, like Scale AI, and there are a whole bunch of them in China right now, that are basically data-labeling companies. So if you're doing supervised machine learning, you need lots and lots of labeled training data.

[00:15:15]

And the people who work for those companies are getting compensated for their data contributions into the system.

[00:15:23]

And so it's easier to put a number on their contribution, because they're explicitly labeling. Correct. But you're saying that we're all contributing data, and always have been, and it's fascinating to start to explicitly try to put a number on it. Do you think that that's possible?

[00:15:39]

I don't know. It's hard.

[00:15:40]

It really is, because, you know, we don't have as much transparency as I think we need into how the data is getting used. And it's super complicated. I think we as technologists sort of appreciate some of the subtlety there. The data gets created, and then, you know, the data exhaust that you give off, or the explicit data that I am putting into the system, isn't valuable

[00:16:20]

atomically. It's only super valuable when you sort of aggregate it together in large numbers. That's true even for these folks who are getting compensated for labeling things for supervised machine learning: you need lots of labels to train a model that performs well. And so, you know, I think that's one of the challenges. Because this data is getting combined in so many ways, how do you figure out, through these combinations, how the value is flowing?
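One way the research community has tried to formalize "how the value flows" when data is only valuable in aggregate is to score each contribution by how much it changes the quality of the model trained on the whole pool, for example leave-one-out or Shapley-style data valuation. Here is a minimal leave-one-out sketch on a toy 1-nearest-neighbor classifier; every number and label in it is invented for illustration, and real systems would need far more robust estimators.

```python
# Toy leave-one-out data valuation: a point's "value" is how much
# test accuracy drops when that point is removed from the pool.
from statistics import mean

# Tiny 1-D training pool (feature, label) and a held-out test set.
train = [(0.0, "a"), (0.1, "a"), (0.9, "b"), (1.0, "b"), (0.95, "a")]
test = [(0.05, "a"), (0.92, "b")]


def accuracy(dataset):
    """1-nearest-neighbor accuracy on the test set."""
    if not dataset:
        return 0.0

    def predict(x):
        return min(dataset, key=lambda p: abs(p[0] - x))[1]

    return mean(1.0 if predict(x) == y else 0.0 for x, y in test)


def leave_one_out_values(dataset):
    """Value of each point = accuracy drop when it is left out."""
    full = accuracy(dataset)
    return [full - accuracy(dataset[:i] + dataset[i + 1:])
            for i in range(len(dataset))]

vals = leave_one_out_values(train)
for (x, y), v in zip(train, vals):
    print(f"point ({x}, {y!r}): value {v:+.2f}")
```

In this toy pool, most points score zero because a near-duplicate neighbor covers for them, and only the point whose removal actually flips a test prediction carries measurable value: exactly the "worthless atomically, valuable in aggregate" property being described.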

[00:16:53]

Yeah, that's tough. Yeah. And it's fascinating that you're thinking about this. I wasn't even going into this conversation expecting the breadth of research, really, that Microsoft broadly is thinking about, that you're thinking about at Microsoft.

[00:17:09]

So if we go back to '89, when Microsoft released Office, or 1990, when they released Windows 3.0: in your view, and I know you weren't there the entire history, how has the company changed in the 30 years since, as you look at it now? The good thing is, it started off as a platform company.

[00:17:35]

It's still a platform company. The parts of the business that are thriving and most successful are those that are building platforms. The mission of the company now, the mission has changed in a very interesting way. You know, back in '89, '90, they were still on the original mission, which was put a PC on every desk and in every home. And it was basically about democratizing access to this new personal computing technology. When Bill started the company, integrated circuit microprocessors were a brand new thing.

[00:18:14]

And people were building homebrew computers from kits, the way people build ham radios right now.

[00:18:25]

And I think this is sort of the interesting thing for folks who build platforms in general.

[00:18:30]

Bill saw the opportunity there, and what personal computers could do. And it was sort of a reach. You just have to imagine where things were when they started the company versus where things are now. And in success, when you've democratized a platform, it just sort of vanishes into the platform; you don't pay attention to it anymore. Operating systems aren't a thing anymore. They're super important, completely critical.

[00:18:55]

And when you see one fail, you sort of understand. But it's not a thing where you're waiting for the next operating system thing in the same way that you were in 1995. Right, 1995: we had the Rolling Stones on the stage with the Windows 95 rollout. It was the biggest thing in the world.

[00:19:16]

Everybody lined up for it, in a way that people used to line up for.

[00:19:20]

But, you know, eventually, and this isn't necessarily a bad thing, the success is that it sort of becomes ubiquitous. It's everywhere. When their technology becomes ubiquitous, human beings just sort of start taking it for granted. So, the mission now, which Satya rearticulated five-plus years ago when he took over as CEO of the company:

[00:19:46]

our mission is to empower every individual and every organization in the world to be more successful. And so, again, that's a platform mission, and the way that we do it now is different.

[00:20:03]

We have a hyperscale cloud that people are building their applications on top of. We have a bunch of AI infrastructure that people are building their AI applications on top of.

[00:20:13]

We have a productivity suite of software, like Microsoft Dynamics, which, you know, some people might not think is the sexiest thing in the world, but it's helping people figure out how to automate all of their business processes and workflows, and helping the businesses using it to grow and be more successful.

[00:20:36]

So it's a much broader vision, in a way, now than it was back then, when it was this sort of very particular thing.

[00:20:44]

And now we live in this world where technology is so powerful, and such a basic fact of life, that it both exists and is going to get better and better over time, or at least more and more powerful over time. So what you have to do as a platform player is just much bigger, right?

[00:21:07]

There are so many directions in which you can transform. You didn't mention mixed reality, too. Yep. That's probably early days, or it depends how you think of it, but if we think on a scale of centuries, it's the early days of mixed reality. Oh, for sure. And with HoloLens, Microsoft is doing some really interesting work there. Do you touch that part of the effort? What's the thinking? Do you think of mixed reality as a platform too, when we look at what the platforms of the future could be?

[00:21:38]

So it's fairly obvious that AI is one. I mean, you just sort of say it out loud and, you know, people get it.

[00:21:49]

But we also think of mixed reality and quantum as these two interesting, you know, potential ones. Quantum computing?

[00:21:58]

Yeah. OK, so let's get crazy then. So you're talking about some futuristic things here. Well, the mixed reality stuff is really not even futuristic; it's here. It is. Incredible stuff.

[00:22:10]

And look, it's having an impact right now. One of the more interesting things that's happened with mixed reality over the past couple of years, which I didn't clearly see coming, is that it's become the computing device for folks who haven't used any computing device at all to do their work before.

[00:22:34]

So technicians, and service folks, and people who are doing machine maintenance on factory floors. Because they're mobile, and they're out in the world, and they're working with their hands, sort of servicing these very complicated things, they don't use a mobile phone, and they don't carry a laptop with them, and they're not tethered to a desk.

[00:23:00]

And so mixed reality, where it's getting traction right now, where HoloLens is selling a lot of units, is for these sorts of applications for these workers. And, I mean, people love it. They're like, oh my God. For them, it's the same sort of productivity boost that an office worker had when they got their first personal computer.

[00:23:25]

Yeah. But you did mention, it's certainly obvious, AI as a platform, but can we dig into it a little bit? How does AI begin to infuse some of the products in Microsoft? So, currently, providing training of, for example, neural networks in the cloud, or providing pre-trained models, or just even providing computing resources, whatever different inference you want to do using neural networks. How do you think of infusing AI as a platform that Microsoft can provide?

[00:24:03]

Yeah, I mean, I think it's super... it's like everywhere. And we run these review meetings now where it's Satya and members of Satya's leadership team, and a cross-functional group of folks across the entire company, who are working on either AI infrastructure or who have some substantial part of

[00:24:33]

their product work using AI in some significant way. Now, the important thing to understand is, when you think about how the AI is going to manifest in an experience, in something that it's going to make better: I think you don't want the AI-ness to be the first-order thing. It's whatever the product is, and the thing that it's trying to help you do; the AI just sort of makes it better.

[00:25:03]

And, you know, this is a gross exaggeration, but, yeah, people get super excited about where the AI is showing up in products, and I'm like, do you get that excited about where you're using a hash table in your code? It's just another tool. It's a very interesting programming tool, but it's sort of an engineering tool. And so it shows up everywhere. So, like, we've got dozens and dozens of features now in Office that are powered by fairly sophisticated machine learning. Our search engine wouldn't work at all if you took the machine learning out of it.

[00:25:41]

And, increasingly, you know, things like content moderation on our Xbox and xCloud platform. When you say moderation,

[00:25:55]

you mean like the recommendations, like showing what you want to look at next? No, no. It's like anti-bullying stuff. So the usual social network stuff that they have to deal with. Yeah, correct.

[00:26:04]

But it's really targeted towards a gaming audience, so it's a very particular type of thing, where the line between playful banter and legitimate bullying is a subtle one, and you have to figure it out. It's sort of tough. I'd love it if we could dig into it, because you also led the engineering efforts at LinkedIn. Yep. And if we look at LinkedIn as a social network, and if we look at Xbox gaming and its social components, there are very different kinds of, I imagine, communication going on on the two platforms, and the line in terms of bullying and so on is different on the two platforms.

[00:26:48]

So how do you, I mean, it's such a fascinating philosophical discussion of where that line is. I don't think anyone knows the right answer. The Twitter folks are under fire now, Jack at Twitter, for trying to find that line. Nobody knows what that line is. But how do you try to find the line for trying to prevent abusive behavior, and at the same time let people be playful and joke around and that kind of thing?

[00:27:19]

I think in a certain way, if you have what I would call vertical social networks, it gets to be a little bit easier. If you have a clear notion of what your social network should be used for, or what you are designing a community around, then you don't have as many dimensions to your content safety problem as you do in a general-purpose platform.

[00:27:50]

I mean, so, like, on LinkedIn, the whole social network is about connecting people with opportunity, whether it's helping them find a job, or find mentors, or helping them find their next sales lead, or just allowing them to broadcast their professional identity to their network of peers and collaborators, their professional community. In some ways that's very, very broad,

[00:28:28]

but in other ways it's sort of narrow. And so you can build AIs, like, machine learning systems, that are

[00:28:40]

capable, within those boundaries, of making better automated decisions about what is an inappropriate and offensive comment, or a dangerous comment, or illegal content, when you have some constraints. Same thing with the gaming social networks: it's about playing games and having fun. And the thing that you don't want to have happen on the platform is why bullying is such an important thing. Bullying is not fun.

[00:29:10]

So you want to do everything in your power to encourage that not to happen.

[00:29:16]

And, yeah, I think it's sort of a tough problem in general, and one where, I think, eventually we're going to have to have

[00:29:26]

Some sort of clarification from our policymakers about what it is that we should be doing, like where the lines are. Because it's tough. In a democracy, you want some sort of democratic involvement; people should have a say in where the lines are drawn. You don't want a bunch of people making unilateral decisions. And we're in a state right now, for some of these platforms, where you actually do have to make unilateral decisions, because the policymaking isn't going to happen fast enough to prevent very bad things from happening.

[00:30:09]

But we need the policymaking side of that to catch up, I think, as quickly as possible, because you want that whole process to be a democratic thing, not some sort of weird thing where you've got a non-representative group of people making decisions that have national and global impact. It's fascinating, because the digital space is different than the physical space in which nations and governments were established.

[00:30:37]

And so what policy looks like globally, what bullying looks like globally, what healthy communication looks like globally, is an open question, and we're all figuring it out together.

[00:30:49]

Yeah. I mean, with, you know, sort of fake news, for instance. And deepfakes, and fake news generated by humans?

[00:30:59]

Yeah.

[00:30:59]

So we can talk about deepfakes. I think that is another, sort of, very interesting level of complexity. But if you think about just the written word: we invented papyrus, what, three thousand years ago, where you could put word on paper. And then 500 years ago we get the printing press, where the word gets a little bit more ubiquitous.

[00:31:28]

And then the printed word really didn't get ubiquitous until the end of the 19th century, when the offset press was invented, and then it just sort of explodes. And the cross product of that and the industrial revolution's need for educated citizens resulted in this rapid expansion of literacy and the rapid expansion of the word.

[00:31:53]

But we had three thousand years up to that point to figure out, like, what's journalism, what's editorial integrity, what's scientific peer review. You built all of this mechanism to try to filter through all of the noise that the technology made possible, to get to something that society could cope with.

[00:32:21]

And if you think about just the PC: the PC didn't exist 50 years ago. And so in the span of, you know, half a century, we've gone from no ubiquitous digital technology to having a device that sits in your pocket where you can sort of say whatever is on your mind to, well, Mary Meeker just released her new slide deck last week.

[00:32:49]

You know, we've got 50 percent penetration of the Internet to the global population. There are like three and a half billion people who are connected now.

[00:32:58]

So it's crazy, like, inconceivable how fast all of this happened.

[00:33:03]

So, you know, it's not surprising that we haven't figured out what to do yet.

[00:33:08]

But we've got to really lean into this set of problems, because we basically have three millennia worth of work to do about how to deal with all of this.

[00:33:19]

And probably what amounts to the next decade's worth of time to do it in.

[00:33:24]

So, since we're on the topic of tough, challenging problems: let's look at a tool in AI that Microsoft is looking at, face recognition software. There are a lot of powerful, positive use cases for face recognition, but there are some negative ones, and we're seeing those in different governments in the world. So how does Microsoft think about the use of face recognition software as a platform, in governments and companies? How do we strike an ethical balance here?

[00:33:59]

Yeah, I think we've articulated a clear point of view.

[00:34:04]

So Brad Smith wrote a blog post last fall, I believe, that sort of outlined very specifically what our point of view is there. And, you know, I think we believe that there are certain uses to which face recognition should not be put, and we believe, again, that there's a need for regulation there. The government should really come in and say, you know, this is where the lines are.

[00:34:34]

And we very much wanted figuring out where the lines are to be a democratic process. But in the short term, we've drawn some lines ourselves, where we push back against uses of facial recognition technology. You know, the city of San Francisco, for instance, I think has completely outlawed any government agency from using facial recognition tech, and that may prove to be a little bit overly broad.

[00:35:03]

But for certain law enforcement things, I would personally rather be overly cautious in terms of restricting use of it until we have defined a reasonable, democratically determined regulatory framework for where we could and should use it.

[00:35:25]

And the other thing there is, we've got a bunch of research that we're doing, and a bunch of progress that we've made, on bias there. There are all sorts of weird biases that these models can have, all the way from the most noteworthy one, where you may have underrepresented minorities who are underrepresented in the training data, and then you start learning strange things. But there are even other weird things; I think we've seen in the public research that models can learn strange things, like "all doctors are men," for instance.

[00:36:11]

Yeah. I mean, so like, it really is a thing where.

[00:36:17]

It's very important for everybody who is working on these things, before they push publish, launch the experiment,

[00:36:27]

push the code online, or even publish the paper, that they are at least starting to think about what some of the potential negative consequences of this stuff are.

[00:36:42]

I mean, this is where the deepfake stuff I find very worrisome, just because there are going to be some very beneficial uses of GAN-generated imagery. And funny enough, one of the places where it's actually useful is that we're using the technology right now to generate synthetic

[00:37:14]

visual data for training some of the face recognition models, to get rid of the bias. So that's one super good use of the tech, but...

[00:37:25]

You know, it's getting good enough now that it's going to challenge a normal human being's ability to tell the difference. Right now you can sort of assume it's very expensive for someone to fabricate a photorealistic fake video, and GANs are going to make it fantastically cheap to fabricate a photorealistic fake video. And so what you assume you can trust as true, versus be skeptical about, is about to change. And we're not ready for it.

[00:37:56]

I don't think we are. The nature of truth, right. It's also exciting, because I think both you and I would probably agree that the way to take on that challenge is with technology. Yeah, right. There are probably going to be ideas of ways to verify which kind of video is legitimate and which is not. So to me that's an exciting possibility, most likely for just the comedic genius that the Internet usually creates with these kinds of videos.

[00:38:27]

Yeah. And hopefully it will not result in any serious harm. Yeah.

[00:38:31]

And it could be. I think we will have technology that may be able to detect whether or not something is fake or real, although, yeah, the fakes are pretty convincing, even when you subject them to machine scrutiny.

[00:38:51]

But, you know, we also have these increasingly interesting social networks that are under fire right now for some of the bad things that they do. One of the things you could choose to do with a social network is use crypto in the networks to have content signed, where you could have a full chain of custody that accompanied every piece of content. So when you're viewing something and you want to ask yourself, how much can I trust this?

[00:39:28]

You can click something and have a verified chain of custody that shows, oh, this is coming from this source, and it's signed by someone whose identity I trust. Yeah, yeah.

[00:39:41]

I think having that chain of custody, being able to say, here's this video, it may or may not have been produced using some of this deepfake technology, but if you've got a verified chain of custody where you can trace it all the way back to an identity, you can decide whether or not you trust that identity. Oh, this is really from the White House, or this is really from the office of this particular presidential candidate, or it's really from Jeff Weiner, CEO of LinkedIn, or Satya Nadella, the CEO of Microsoft. That might be one way that you can solve some of the problem.

[00:40:17]

And so that's not super high tech; we've had all of this technology forever. But I think you're right, it has to be some sort of technological thing, because the underlying tech that is used to create this is not going to do anything but get better over time, and the genie is sort of out of the bottle. There's no stuffing it back in.
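The signed-content idea described here can be sketched in a few lines. This is a toy illustration, not any Microsoft design: a real deployment would use public-key signatures (e.g., Ed25519) so anyone can verify without shared secrets, but HMAC from the Python standard library is enough to show the chain-of-custody shape, where each link binds the content hash, a handler identity, and the previous link. All names below are made up for the example.

```python
# A toy sketch of signed content with a verifiable chain of custody.
# Assumption: real systems would use public-key signatures; HMAC with
# shared keys is used here only to illustrate the shape of the idea.
import hashlib
import hmac
import json

def record_hash(record):
    """Stable hash of a custody record (a dict of JSON-safe values)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def sign_step(content, identity, key, prev):
    """Append one link: who handled the content, a hash of the content,
    and a pointer to the previous link, all bound by a signature."""
    record = {
        "identity": identity,
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "prev_hash": record_hash(prev) if prev else None,
    }
    record["sig"] = hmac.new(key, record_hash(record).encode(), "sha256").hexdigest()
    return record

def verify_chain(content, chain, keys):
    """Check that every link matches the content, points at the previous
    link, and carries a valid signature from a known identity."""
    prev = None
    for record in chain:
        body = {k: v for k, v in record.items() if k != "sig"}
        if body["content_sha256"] != hashlib.sha256(content).hexdigest():
            return False
        if body["prev_hash"] != (record_hash(prev) if prev else None):
            return False
        expected = hmac.new(keys[record["identity"]],
                            record_hash(body).encode(), "sha256").hexdigest()
        if not hmac.compare_digest(expected, record["sig"]):
            return False
        prev = record
    return True
```

Tampering with the content, or with any link, makes verification fail, which is exactly the property that would let a viewer trace a video back to an identity they trust.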

[00:40:39]

And there's a social component, which I think is really healthy for a democracy, where people will be skeptical about the things they watch. Yeah. In general, skepticism is good. Creating global skepticism about whether people can trust what they read encourages further research. I come from the Soviet Union, where basically nobody trusted the media because you knew it was propaganda, and that encouraged the kind of skepticism that drove further research about ideas, as opposed to just trusting any one source.

[00:41:19]

Well, look, I think that's one of the reasons why the scientific method and our apparatus of modern science are so good: you don't have to trust anything. The whole notion of modern science, beyond the fact that this is a hypothesis and this is an experiment to test the hypothesis.

[00:41:42]

And there's a peer review process for scrutinizing published results. But stuff is also supposed to be reproducible, so it's been vetted by this process. And you're also expected to publish enough detail that, if you are sufficiently skeptical of the thing, you can go try to reproduce it yourself.

[00:42:01]

And, I don't know what it is.

[00:42:04]

I think a lot of engineers are like this, where your brain is sort of wired for skepticism. You don't just first-order trust everything that you see and encounter, and you're sort of curious to understand the next thing. But I think it's entirely healthy.

[00:42:24]

It's a healthy thing; we need a little bit more of that right now. So, I'm not a large business owner, I'm just a huge fan of many of Microsoft's products. I mean, I generate a lot of graphics and images, and I still use PowerPoint to do that. It beats Illustrator for me, even as a professional, which is fascinating. So I wonder, what does the future of, let's say, Windows and Office look like?

[00:42:59]

Do you see it? I mean, I remember looking forward to XP; it was exciting when XP was released. Just like you said, I don't remember when 95 was released, but for me XP was a big celebration. And when 10 came out, I was like, OK, it's nice, it's a nice improvement. So what do you see as the future of these products?

[00:43:20]

You know, I think there's a bunch of exciting stuff. On the Office front, there are going to be these increasing productivity wins coming out of some of these high-powered features. The products sort of get smarter and smarter in a very subtle way; there's not going to be this big bang moment where, you know, Clippy is going to reemerge and it's going to... Wait a minute.

[00:43:45]

OK, we'll have to wait... Wait, wait, wait, Clippy could be coming back? But quite seriously, on the injection of AI: there's not much, or at least I'm not familiar with, assistive-type stuff going on inside the Office products, like a Clippy-style personal assistant. Do you think there's a possibility of that in the future?

[00:44:09]

So I think there are a bunch of very small ways in which machine-learning-powered assistive things are in the products right now. There are a bunch of interesting things: the auto-response stuff is getting better and better, and it's getting to the point where it can auto-respond with, OK, look, this person's clearly trying to schedule a meeting, so it looks at your calendar and automatically tries to find a time and a space that's mutually interesting.

[00:44:44]

We have this notion of Microsoft Search, where it's not just web search, but search across all of your information that's sitting inside of your Office 365 tenant, and potentially in other products.

[00:45:04]

And we have this thing called the Microsoft Graph that is basically an API layer that gets you hooked up across the entire breadth of what were information silos before they got woven together with the graph. That is getting plumbed, with increasing effectiveness, into some of these auto-response things, where you're going to be able to see the system automatically retrieve information for you.

[00:45:35]

Like, I frequently send out emails to folks where I can't find a paper or a document or whatnot; there's no reason why the system won't be able to do that for you. And I think it's building towards having things that look more like a fully integrated assistant. But you'll see a bunch of steps before then. It will not be this Big Bang thing where Clippy comes back and you've got this manifestation of a fully powered assistant.

[00:46:10]

So I think that's definitely coming. All of the collaboration and coauthoring stuff is getting better. It's really interesting: if you look at how we use the Office product portfolio at Microsoft, more and more of it is happening inside of Teams as a canvas, and it's this thing where collaboration is at the center of the product.

[00:46:38]

And we built some really cool stuff,

[00:46:43]

some of which is about to be open sourced, that provides framework-level things for doing coauthoring.

[00:46:53]

So is there a cloud component to that? On the web? Forgive me if I don't already know this, but with Office 365, for the collaboration we do in Word, do we still send the file around?

[00:47:07]

No, we're already a little bit better than that. And the fact that you're unaware of it means we've got a better job to do of helping you discover this stuff. But yeah, it's already got a huge cloud component. And part of this framework stuff, we've been working on it for a couple of years.

[00:47:31]

I know the internal code name for it, but I think when we launch it'll be called the Fluid Framework. What Fluid lets you do is, you can go into a conversation that you're having in Teams and reference part of a spreadsheet that you're working on, where somebody is sitting in the Excel canvas working on the spreadsheet with a chart or whatnot. You can embed part of the spreadsheet in the Teams conversation, where you can dynamically update it, and all of the changes that you're making to this object are coordinated, and everything is updating in real time.

[00:48:11]

So you can be in whatever canvas is most convenient for you to get your work done.
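The coauthoring behavior described above, a spreadsheet fragment embedded in a conversation and updating live in every canvas, boils down to a shared object with change notifications. Here is a toy sketch of that pattern; it is not the actual Fluid Framework API, and all names are invented for illustration.

```python
# A toy illustration of the shared-object pattern behind real-time
# coauthoring: several "canvases" observe one object, and any edit is
# pushed to all of them immediately.
class SharedCell:
    """A single collaboratively edited value with change notifications."""

    def __init__(self, value=None):
        self._value = value
        self._observers = []

    def subscribe(self, callback):
        """Register a canvas that wants to see every change."""
        self._observers.append(callback)

    def set(self, value):
        """Update the value and notify every subscribed canvas."""
        self._value = value
        for callback in self._observers:
            callback(value)

    def get(self):
        return self._value

# Two canvases (say, an Excel chart and a Teams conversation) watch the
# same cell; an edit made in either context shows up in both views.
excel_view, teams_view = [], []
cell = SharedCell(0)
cell.subscribe(excel_view.append)
cell.subscribe(teams_view.append)
cell.set(42)
```

A production system additionally has to merge concurrent edits and sync over the network, which is where the hard distributed-systems work lives; the observer pattern above only captures the user-visible behavior.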

[00:48:17]

So out of my own curiosity as an engineer, I know what it's like to lead a team of 10, 15 engineers. Microsoft has, I don't know what the numbers are, maybe 50, maybe 60 thousand engineers?

[00:48:32]

I don't know exactly what the number is. It's a lot; it's tens of thousands.

[00:48:35]

That's more than 10 or 15, is what I mean. You've led teams of different sizes, mostly large groups of engineers.

[00:48:47]

What does it take to lead such a large group into continual innovation, to continue being highly productive, and yet develop all kinds of new ideas? What does it take to lead such a large group of brilliant people?

[00:49:06]

I think the thing that you learn as you manage larger and larger scale is that there are three things that are very, very important for big engineering teams. One is having some sort of forethought about what it is that you're going to be building over long periods of time.

[00:49:28]

Not exactly; you don't need to know that, like, I'm putting all my chips on this one product and this is going to be the thing. But it's useful to know what sort of capabilities you think you're going to need to have to build the products of the future, and then invest in that infrastructure. And I'm not just talking about storage systems or cloud APIs.

[00:49:49]

It's also, what does your development process look like? What tools do you want? What culture do you want to build around how you're collaborating together to make complicated technical things? Having an opinion and investing in that just gets more and more important, and the sooner you can get to a concrete set of opinions, the better off you're going to be. You can wing it for a while,

[00:50:17]

at small scales. When you start a company, you don't have to be super specific about it. But the biggest miseries that I've ever seen as an engineering leader are in places where you didn't have a clear enough opinion about those things soon enough, and then you just go create a bunch of technical debt and culture debt that is excruciatingly painful to clean up. So that's one bundle of things. Another bundle of things is that it's just really, really important to

[00:50:56]

have a clear mission. Not just some cute crap you say because you think you should have a mission, but something that clarifies for people where it is that you're headed together. I know it's probably a little bit too popular right now, but Yuval Harari's book Sapiens: one of the central ideas in it is that storytelling is the quintessential thing for coordinating the activities of large groups of people.

[00:51:37]

Once you get past Dunbar's number. And I've really, really seen that just managing engineering teams. You can brute force things when you're less than 120, 150 folks, where you can sort of know and trust and understand what the dynamics are between all the people. But past that, things just sort of start to catastrophically fail if you don't have some set of shared goals that you're marching towards.

[00:52:07]

And so, even though it sounds touchy-feely, and a bunch of technical people will balk at the idea that you need to have one, the mission is very, very, very important.

[00:52:20]

Yuval Harari is right: stories. That's the fabric that connects all of us, these powerful stories. And that works for companies too; it works for everything. I mean, even down to, if you really think about it,

[00:52:34]

a currency, for instance, is a story. A constitution is a story. Our laws are stories.

[00:52:40]

I mean, we believe very, very strongly in them, and thank God we do. But they are just abstract things, just words. If we don't believe in them,

[00:52:52]

they're nothing.

[00:52:53]

And in some sense, those stories are platforms, and the kind of stuff Microsoft is creating is platforms on which we define the future. So, last question, to get philosophical, maybe bigger than even Microsoft: what do you think the next 20, 30-plus years look like for computing, for technology, for devices? Do you have crazy ideas about the future of the world?

[00:53:21]

Yeah, look, I think we're entering this time where we've got technology that is progressing at the fastest rate that it ever has, and you've got some really big social problems, society-scale problems, that we have to tackle. So I think we're going to rise to the challenge and figure out how to intersect all of the power of this technology with all of the big challenges that are facing us, whether it's global warming, whether it's the fact that the biggest remainder of the population boom is in Africa for the next 50 years or so.

[00:54:03]

And global warming is going to make it increasingly difficult to feed the global population, in particular in this place where you're going to have the biggest population boom. I think AI, if we push it in the right direction, can do incredible things to empower all of us to achieve our full potential and to live better lives.

[00:54:32]

But that also means focusing on some super important things, like how you can apply it to health care to make sure that the quality, cost, and ubiquity of health coverage get better and better over time.

[00:54:52]

That's more and more important every day, in the United States and the rest of the industrialized world.

[00:55:00]

In Western Europe, China, Japan, Korea, you've got this population bubble of aging working-age folks who, at some point over the next 20, 30 years, are going to be largely retired, and you're going to have more retired people than working-age people. Then you've got natural questions about who's going to take care of all the old folks and who's going to do all the work. And the answer to all of these sorts of questions, where you're running into constraints of the world and of society, has always been: what tech is going to help us get around this?

[00:55:40]

When I was a kid in the 70s and 80s, we talked all the time about the population boom: we're not going to be able to feed the planet. And we were right in the middle of the Green Revolution, this massive technology-driven increase in productivity worldwide. Some of that was taking things that we knew in the West and getting them distributed to the developing world.

[00:56:12]

And part of it was things like smarter biology helping us increase yields.

[00:56:20]

We don't talk about overpopulation anymore, because we more or less figured out how to feed the world. That's a technology story. And so I'm super, super hopeful about the future and the ways we will be able to apply technology to solve some of these super challenging problems.

[00:56:45]

One of the things I'm trying to spend my time doing right now is trying to get everybody else to be hopeful as well, because, back to Harari, we are the stories that we tell. If we get overly pessimistic right now about the potential future of technology, we may fail to get all of the things in place that we need to have our best possible future.

[00:57:13]

And that kind of hopeful optimism, I'm glad you have it, because you're leading large groups of engineers that are actually defining that, writing that story, helping build that future, which is super exciting. Yeah. And I agree with everything you said, except I do hope Clippy comes back.

[00:57:33]

We miss him. I speak for the people. So thank you so much for talking to us. Thank you so much for having me.

[00:57:39]

It was a pleasure.