[00:00:00]

You don't need to track people to make money, because it's all based on contextual advertising, so the better user experience would be not tracking anybody at all.

[00:00:16]

Hello and welcome. I'm Shane Parrish, and this is the Knowledge Project podcast, exploring the ideas, methods, and mental models that will help you learn from the past and what other people have already figured out. You can learn more and stay up to date at fs.blog/podcast. We also have a newsletter. It's called Brain Food. It comes out every Sunday. It's short and sweet, and contains recommendations for articles we found online, books we're reading, quotes we find interesting and worth thinking about, and so much more.

[00:00:45]

It's all signal, no noise. And it's one of the most popular things we've ever done. And it's free. You can learn more at fs.blog/newsletter. That's f-s dot blog slash newsletter. Most of the guests on the show are subscribers to the weekly email as well. So check it out. On the show today is Gabriel Weinberg, the founder and CEO of DuckDuckGo and author of Super Thinking, a giant book of mental models. We're going to explore data privacy, touch on a bit of cybersecurity, go deep on mental models, and explore how you can help your kids think better.

[00:01:19]

You don't want to miss this. It's time to listen and learn. Before we get started, here's a quick word from our sponsor. Farnam Street is sponsored by MetaLab. For a decade, MetaLab has helped some of the world's top companies and entrepreneurs build products that millions of people use every day. You probably didn't realize it at the time, but odds are you've used an app that they've helped design or build: apps like Slack, Coinbase, Facebook Messenger, Oculus, Lonely Planet, and so many more.

[00:01:56]

MetaLab wants to bring their unique design philosophy to your project. Let them take your brainstorm and turn it into the next billion-dollar app, from ideas sketched on the back of a napkin to a final shipped product. Check them out at metalab.co. That's metalab.co. And when you get in touch, tell them Shane sent you. Man, I'm so happy to be sitting here with you. I'm happy to be here. When did you know that you wanted to go to MIT?

[00:02:23]

So I grew up in Atlanta, where the closest thing I had to the engineering world was Georgia Tech. And I went to an engineering summer camp there.

[00:02:33]

And I thought at an early age I wanted to be like a computer engineer, but I honestly had no knowledge of the different schools.

[00:02:41]

So I think I had an irrational want to go to MIT and didn't really understand the landscape of the schools. But around maybe freshman year of high school, I had my mind set: for some reason, I'm going to MIT.

[00:02:53]

And that was the only school you applied to?

[00:02:56]

So I applied early to MIT and got in early and then I was going to apply to other schools, but I just stopped applying because I figured I'd just go.

[00:03:05]

And you did a double degree. What were your degrees in?

[00:03:08]

So I was an undergrad in physics, and I met my wife there. She was really good at math. And then I started a company right out of school. I actually graduated early.

[00:03:20]

It's a different story. I did this company for a few years and then went back to graduate school at MIT in technology and public policy, which is this interdisciplinary economics, law, and public policy degree. You couldn't do it in a year?

[00:03:35]

Exactly. Well, my wife stayed there for nine years. She got a Ph.D. in operations research. So I was already hanging out at MIT.

[00:03:42]

So even after I graduated, while I was running my company, they had a library and I would go audit classes. And her department was in the interdisciplinary area. Operations research is like statistics and logistics. And there happened to be this other cool interdisciplinary program in her building. So I was like, I'm going to apply here. And I actually ran my second company while I was doing this degree, still working in the library and that kind of thing.

[00:04:05]

What was that first company? The first company I started was an educational software company right out of school. The idea was to use the Internet to increase student achievement by increasing parental involvement. Good in theory, 15 years too early.

[00:04:24]

That stuff is still just happening now. I know you have kids, and it's still hard to know what they're getting in school. Yeah, ironically.

[00:04:30]

Oh, totally. So that's what we were trying to do.

[00:04:31]

We built this whole portal system for lesson plans, things like that. So that ultimately didn't go anywhere. The second company was a really early, effectively social networking company that we eventually sold to Classmates.com.

[00:04:46]

It was kind of the antithesis of Facebook and their privacy practices. Like, we had a paid business model.

[00:04:55]

We didn't collect much information. It was mainly about getting in touch with old friends. But we also just got destroyed by Facebook, and so we eventually sold it, and then I moved on to my current company, which is DuckDuckGo.

[00:05:10]

And so let's dive into that. I mean, let's start with the privacy issues online, in terms of not just DuckDuckGo, but in terms of Facebook.

[00:05:20]

And what should people understand about online privacy that they don't? Take something as innocuous as a search: what happens behind the scenes?

[00:05:30]

Yeah, I mean, there are so many aspects of privacy and harms. And I think it's interesting.

[00:05:38]

Awareness has really started to increase over the last two years, but it's all over the map in terms of what people really care about. So in 2013, when our company started to take off, that was the Snowden revelations. That was DuckDuckGo? Yeah. Do you want to explain that just for a second? Sure.

[00:05:54]

So, DuckDuckGo is a search engine that doesn't track you. It's an alternative to Google. And in the last couple of years we've expanded, and now we offer a suite of privacy tools. It's one download, called Privacy Essentials, or our privacy browser, and it includes private search; tracker blocking, so it blocks Google and Facebook trackers across the Web; and encryption, which helps stop your ISP from snooping on you. But in any case, even though privacy was a main thing for our search engine, it wasn't the main attractor to it until 2013, when the Snowden revelations happened.

[00:06:30]

Right. And then that was all about, I remember, government surveillance. Right. So it's like, is the government spying on you, which is now becoming an increasing thing. China's doing some crazy things. But in the intervening years, corporate surveillance has really spiked, and they're actually related, because government surveillance actually uses the data you give to corporations. That's where they get most of the data from anyway. And so they're linked. But corporate surveillance, namely corporations tracking you all over the Web and building big profiles about you, has additional harms.

[00:07:04]

So we're talking about corporations tracking you as you go along the web, not corporations tracking your employees. That's right.

[00:07:10]

Effectively, as you browse the web, there are hidden trackers on all the pages you visit. What's a tracker? A tracker is... so you go to, say, a random site, The New York Times or something. You expect to interact with The New York Times, but in reality, there are about 30 other companies sitting on that page that The New York Times embedded into it.

[00:07:30]

Now, some of them are for like analytics, like who's looking at this page?

[00:07:35]

Some of them are for advertising, like they run Google ads on their page. Some of them help them run their site. But the privacy policies of these companies generally say they can use your data for whatever they want. So they may help The New York Times with X, Y, or Z purpose, but then they're aggregating all that information into big profiles that they can go sell.
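To make that concrete, here's a minimal sketch in Python of how you could inventory those hidden third parties yourself: fetch a page and list every outside host that its scripts, images, and iframes load from. This is just an illustration of the idea, not DuckDuckGo's actual tracker detection, and the packages used (requests, beautifulsoup4) are common choices rather than anything named in the conversation.

```python
# Sketch: list the third-party hosts embedded in a page.
# Assumes `pip install requests beautifulsoup4`.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def third_party_hosts(url: str) -> set[str]:
    page_host = urlparse(url).netloc
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    hosts = set()
    for tag in soup.find_all(["script", "img", "iframe"], src=True):
        host = urlparse(tag["src"]).netloc
        # Relative URLs have no host; same-host resources are first-party.
        if host and host != page_host:
            hosts.add(host)
    return hosts

if __name__ == "__main__":
    # Each printed host is a separate company the visitor never chose.
    for host in sorted(third_party_hosts("https://www.nytimes.com")):
        print(host)
```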

[00:07:53]

And ultimately this ends up in large profiles about you that are being bought and sold. Even something as simple as Google Analytics can then be used to affect your search engine ranking, because Google would know that a page on the site is popular and therefore could bump it up. Yeah, and so there are a few kinds of negative effects of that. One is all the creepy ads that follow you around, which is the one people notice. A second one, the one you're just mentioning, is called the filter bubble.

[00:08:18]

So when you're on search results, say on Google, or on Facebook's news feed, or even things like Netflix and whatnot, they're using your search history or your profile information or your browsing history to show you certain things they think you're going to click on.

[00:08:34]

But by doing that, they have to hide things that they don't think you'll click on, and those can end up being opposing viewpoints.

[00:08:40]

We've done several studies on political terms, and people in different areas see completely different results when they search for things like gun control or a political candidate's name. Effectively, especially in search engines, your expectation is that you're seeing the results, but you're not getting the results. It effectively biases you when you're doing research.

[00:09:02]

That's one harm. The other harms people are justifiably really upset about are things like data breaches, which relate to identity theft.

[00:09:11]

And so people would just like to reduce their digital footprint online, kind of block all this extraneous information that's getting out there about them. And that's kind of what we allow them to do.

[00:09:20]

Do you think that everybody should have the same search results?

[00:09:24]

So, yeah, I think by default. Because with DuckDuckGo, that's what happens? Yeah. So on DuckDuckGo, we don't have a filter bubble, and if you search for the same topic as somebody else at the same time, you'll get the same results. I think things can be made opt-in. Like, we have a region setting where you can say, I want more Canadian results. But everybody wants more Canadians. Or do they?

[00:09:49]

You should opt in to something like that. Right.

[00:09:52]

And so there are some notions like that, but I don't think it should be personalized the way it is now, without you opting into it.

[00:09:58]

Do you think we're moving to a world where the least of our worries is personalized search, and we're moving into more personalized content, where you have almost the same fundamental content, but now it can shift perspectives based on whether you're a Democrat or a Republican and play into that? Or is that too sci-fi?

[00:10:20]

I mean, I think that's generally the danger that I see coming. The correlation-causation piece is hard here, but polarization has just increased, at least in the US, and it's correlated with all of this social unrest, social media, and tracking. And so the thought is that at least they're interrelated, in the sense that people are just in their own bubbles. You know, they're not really seeing these opposing viewpoints.

[00:10:47]

Is exposure to opposing viewpoints enough? Probably not. But it's a necessary but not sufficient condition, right? If you don't have any, you're definitely not going to engage with the other side.

[00:11:00]

And so, aside from the annoying ads that follow us around, is that the Facebook tracking pixel? Sort of, yeah. There are all these companies lurking behind webpages.

[00:11:12]

Right. If you go and analyze which companies are there, Google's on about 80 percent of the top million sites. A lot of that is Google Analytics, but also Google ads and stuff like that. Facebook's on about 25 percent, and that's the Facebook pixel: it could be the Facebook login, or sites running Facebook ads, or sites that just want to track their audience and then serve ads on Facebook. And so Facebook and Google are the biggest trackers.

[00:11:39]

I remember last year we took the Facebook tracking pixel off Farnam Street. I was so tired of people targeting our audience.

[00:11:47]

Yeah. It was like, oh, that's really weird, when somebody pointed it out to me, and then I was like, OK, we'll just delete that. That's the key thing you're getting at: you put it on for your purposes, but it can be used by everybody. Yeah. That's the real problem. If these companies were just a service provider for you, that would be fine. But the data is leaking beyond the use that you set up for it.

[00:12:08]

Yeah, I remember the first time I went into my parents' Facebook account and tried to create an ad targeting fans of Farnam Street, and I was like, what, this is crazy. My mom is not connected to Farnam Street at all. It was a little weird. How do you think this plays out? The Snowden stuff came out, which is more about what governments were doing to protect themselves.

[00:12:32]

Actually, maybe that's a good discussion.

[00:12:34]

What do you feel the role of government should be in data privacy?

[00:12:39]

So we're having kind of a moment now in the U.S. What happened is Europe took the lead on this, and they made something called GDPR, the General Data Protection Regulation, which is actually a restatement of something else they made in 1995. They've been at this for decades. But they're really putting in some rights and saying privacy is a fundamental right that should extend online, and you should have the right to know which companies are tracking you, know what information they're collecting on you, and have a right to opt out of sharing it.

[00:13:12]

Among other things. It's a long regulation. Yeah. Then in the US, there has never been a general privacy regulation. There are some specific ones, like HIPAA for medical data, and there's stuff for financial data, but there's nothing general like that. So California, through a ballot initiative last year, passed the first one, which goes into effect in 2020. And that set a clock for the federal government to either make their own and preempt California, or have to deal with California's law.

[00:13:46]

And so that's kicked off this process that's happening right now, over the next nine months before that law goes into effect, of thinking about what federal privacy legislation should look like. And to answer the question: yeah, I think the government needs to play a role here, because my general proposition, which I've said for years, is that if data can be collected and can be used to make profits, it's going to be used that way, unless the government stops it.

[00:14:14]

And it's an externality, in general. Data breaches are a good example. All these companies are collecting information about you. They generally have terrible security practices, Equifax, et cetera, et cetera, and eventually they get hacked. All the data gets out there, tons of people have identity theft, and the companies didn't really pay to protect the data. And so that's challenging for everybody. And so the government definitely has a role to clean it all up.

[00:14:41]

Do you think that the data collected by these companies... and maybe I'll add some context to this. The way that I'm thinking about this problem is, if you're a company like Google or Facebook or Amazon, you get a lot of data that other companies don't have access to. That data can be used to extend your competitive advantage. It can be used to provide better recommendations that the guy in his garage can no longer do. Does the government have a role in equalizing the playing field?

[00:15:13]

And what does that mean, and how does it look? Does it mean that data becomes a public good, like a public utility? How do you see that?

[00:15:21]

Yeah, so you're asking a separate question. Independent of data privacy legislation, which gives individuals rights over their data, there's a whole antitrust, competition question going on. I actually recently testified in front of the U.S. Senate on this notion, about general privacy legislation. But my message was exactly what you're saying: there's a duopoly in digital advertising, and Google and Facebook own basically all digital advertising on the Web, which relates to all these data profiles.

[00:15:49]

And the more data they have, the better advertising they have and the more targeting they can do. So they're locking up all the growth in the industry. And that's what's putting media companies out of business, as well as small-business ad tech players and other advertisers.

[00:16:05]

No one can really compete with it. So, yes, there's an inherent network effect in it, because they're getting all the data and all the eyeballs, and it's a feedback loop that keeps compounding. And so unless the government breaks that feedback loop, I don't see another way out of it. What I suggested to the Senate was a few things.

[00:16:27]

One is, they're collecting all this data from different business lines: Facebook bought Instagram, they bought WhatsApp; Google runs Google Analytics.

[00:16:36]

You know, they run DoubleClick, they run a bunch of other things. The government (and Germany is starting to do this already) can prohibit them from sharing data across their business units. Right. That's kind of like a lot of financial firms. Exactly, like a Chinese wall situation. That's kind of a weaker version, which I think would work, of the break-them-up kind of argument: spin off some of these different business units. And then there are other things that you could do, too.

[00:17:03]

But fundamentally, if you don't break up these data monopolies in some way, I think it has reduced competition, and there's no easy way for other people to compete.

[00:17:13]

Do you think the solution is breaking up the data monopoly, versus the monopoly writ large, in terms of the size of a company and maybe the scope of its influence?

[00:17:27]

It's funny, because I think this is where it gets complicated. There are actually different issues. There's the data monopoly, but independent of that, some of the companies have had bad practices within their core markets. Right. Like, we've had some trouble competing in search just because of issues that are specific to search, nothing to do with the data monopoly. And Europe has come out and fined Google a bunch of times related to those practices.

[00:17:59]

So I view them as independent, and generally it's been independent agencies pursuing them in the government. The reality is these companies are huge. They do a lot of different things. And so different government agencies need to be on top of it.

[00:18:13]

I want to push back just a little bit on whether they're independent, because maybe I'm thinking about this wrong and you can correct me.

[00:18:19]

But one of the reasons that they get so big is because of the data. Yeah, no, no, they're related.

[00:18:24]

OK, yeah. I just think that you can address them separately, sort of.

[00:18:28]

Well, also, if you clean up one, it won't necessarily clean up the other. Say you regulated the privacy and data monopoly piece: they're already so big that they can still use their weight to do other monopolistic practices, which would be bad for competition outside of their monopoly piece. So you need someone else looking at those competitive issues.

[00:18:50]

What do you think of Apple's strategy to be more privacy focused?

[00:18:55]

Do you think that that is a good business strategy?

[00:18:59]

Yes, I do. I think that, in general, people have had this misconception that people say they care about privacy but don't really care.

[00:19:10]

They won't do anything. We've been running our own research on this for years now, really trying to dig into what people are actually willing to do and what they actually do. And there are two realities in there. One is, as awareness has increased, people's willingness and interest in doing something has greatly increased, and we've seen about a third of people take some actual, significant action to reduce their digital footprint online in some way, shape, or form. A third of all people, or a third of a subset?

[00:19:40]

Now, a third of all people, at least in America, but we think it's similar in other major developed countries, across socioeconomic lines.

[00:19:49]

Yeah, that's a whole other interesting thing. People think it's like, oh, young people don't care about privacy and old people care, or vice versa. And we didn't find that to be the case either. Across all demographics, a decently similar percentage have been taking action. I don't know what exactly threads them together. It's not education, it's not income, it's not age, it's not political affiliation.

[00:20:09]

None of the main factors that we get on the general survey stuff spiked. But we consistently see this growing percentage of people trying to take actual action. Some of the actions they take, they don't realize, actually don't help them. But they're trying to. But they're actually trying. Exactly. And Apple recognizes this. And I think they also view it on principle, kind of like we do: privacy is a fundamental right and people deserve it.

[00:20:37]

And if there's no alternative, then people have no choice. That's kind of what was going on. People had wanted to take action; there just was no action for them to take. Right. If you have two options and both of them are bad, you still have to pick: people want a smartphone, you know. But now that Apple, say, is offering a smartphone that has privacy, that group of people can choose an Apple smartphone, or they can choose DuckDuckGo as a search engine.

[00:21:01]

I think that people need actual tools that work. If our search engine were terrible, they're not willing to make too much of a sacrifice. Right. But let's suppose it's equal. At that point, it's somewhat of a no-brainer, at least for this group.

[00:21:16]

You get to switch, you get to reduce your digital footprint and you still get all the tools and good search results.

[00:21:24]

Do you think people will pay for privacy? Is this the future, where, like Apple, for example, with their integration and their solutions, you tend to pay a higher price for that? With DuckDuckGo, do we end up paying for that in the future?

[00:21:41]

So there are two ways to look at this.

[00:21:42]

In the Apple case, yes, I think people are paying for additional services that are private. In search, you're already paying for it on Google, because you're paying with your data.

[00:21:56]

It just isn't monetary. Yeah, and in effect, it is, because there was a really interesting story the other day in The New York Times about one of their writers who quit Facebook for five months, recounting their story of doing so. And one of the things that happened is he started spending 50 percent less. Oh, that's interesting. Yeah, because he wasn't being exposed to the ads, exactly. And I think there are two components there.

[00:22:24]

One, he wasn't being exposed to all the ads, and some of the ads are just purely manipulative. I mean, if you think about A/B testing, and it's hard to know where this line is, but they're manipulative in two ways. One, advertisers hyper-target.

[00:22:39]

So they find the exact group that is willing to buy this thing, whatever it is. And two, they test different messages over and over and over again, different images, until they find the exact emotional trigger, all these different influence mental models, if you will, of things that will actually get you to buy.

[00:22:55]

So some of those things he maybe didn't buy because he just didn't see the ad; some of those things maybe he didn't buy because he didn't see that manipulative message. And so that manipulation piece was some portion, I don't know exactly how much, of that 50 percent.

[00:23:09]

That was money directly out of his pocket, right? Yeah. So he did pay for that. He was paying for that; he just didn't sign his credit card up to them. Right. But some decent portion was because of using that free product. So one answer is, you're already paying.

[00:23:26]

The second answer is, for something like Google: if we charged for our search engine, I just think a lot fewer people would use it. There would be some people who pay for it, and maybe we'd make the same amount of money, but we'd just expose fewer people to it, because it's a big friction to take out your credit card. Free versus not free.

[00:23:42]

So you're trading, almost, wide influence versus running a profitable business, in a way. Yeah. So to directly answer the question: in some markets, definitely, ones that already have a precedent of paying, people will pay a little extra for more privacy. In a market that is already free, it's hard to see a mainstream provider go paid, because there's so much friction going from free to paid.

[00:24:09]

That said, just to be clear, then the question is, how do we make money? Advertising in and of itself is not a bad thing. We make this distinction (this is what I said to the Senate) between contextual advertising and behavioral advertising. Contextual advertising is just advertising based on the context of the page.

[00:24:26]

So you search for something. Say you search on DuckDuckGo for a car; you get a car ad. Right? I don't need to know anything about you to serve that ad. That's different from the ads that follow you around based on all of your history.

[00:24:40]

Contextual advertising used to be the way the Internet worked, and it could go back to that.

[00:24:43]

Imagine, on The New York Times again, you look at an article, and the ad is just based on that article; or an ad is based on the video you're watching, the content of the video, nothing to do with you. And so contextual advertising doesn't have all these privacy problems of manipulation and filter bubbles and all that.
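A toy sketch of that distinction in code, with made-up ads and profiles: contextual selection looks only at the words on the current page or query, while behavioral selection consults an accumulated user profile. Nothing here is from DuckDuckGo or the Senate testimony; it just pins down the two models being contrasted.

```python
# Hypothetical ad inventory: each ad is a bag of keywords.
ADS = {
    "car_ad": {"keywords": {"car", "sedan", "dealer"}},
    "shoe_ad": {"keywords": {"shoes", "running", "sneaker"}},
}

def contextual_ad(page_words: set[str]) -> str:
    """Pick an ad using only the content of the current page or query."""
    return max(ADS, key=lambda ad: len(ADS[ad]["keywords"] & page_words))

def behavioral_ad(user_profile: set[str]) -> str:
    """Pick an ad using an accumulated profile of the user's history."""
    return max(ADS, key=lambda ad: len(ADS[ad]["keywords"] & user_profile))

# Contextual: the query alone decides; nothing is known about the user.
print(contextual_ad({"best", "sedan", "car", "2019"}))    # -> car_ad

# Behavioral: the very same car query could yield a shoe ad if your
# browsing history says "runner", which is where the follow-you-around
# ads (and the privacy problems) come from.
print(behavioral_ad({"running", "marathon", "sneaker"}))  # -> shoe_ad
```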

[00:25:01]

And my argument is basically that we've been in this feedback loop of trying to improve behavioral advertising over the last 10 years, but no one's been doing that for contextual advertising.

[00:25:11]

If we had, contextual advertising might be just as good right now. And in fact, there are some decent experiments that have just come out. After GDPR in Europe, The New York Times decided to ditch all the behavioral advertising and put up contextual advertising, and they increased revenue. That's one of the first data points on this. And so I think this actually has legs, and you could return to a world of contextual advertising. That's all DuckDuckGo does.

[00:25:37]

And if you think about it, that's really still what Google does most of the time on Google search, because all the search ads are still based on keywords.

[00:25:44]

They're just using your search history in addition, to target ads on YouTube, Gmail, and the rest of the Internet. But they don't really have to for the search ads, where they make most of their money. It's still all contextual.

[00:25:55]

Why do you think revenue went up at The New York Times?

[00:25:58]

It strikes me as surprising. Yeah. I don't know the whole story, but my guess is it went up because they're first-party ads now; they're selling them themselves, right, instead of using a broker. They're cutting out a whole middleman there. Right.

[00:26:14]

So if you even get near the same advertising rate, then you're cutting out the 15 to 20 percent you're paying to a broker. Yeah, though sometimes the spots could be worse for different advertisers, depending on the network.

[00:26:26]

I know we get less money on Farnam Street because I'm in charge of sponsorship, which isn't based on page views. And there's no tracking code, nothing embedded in there where companies get information on you. Whereas if we used something like Google AdWords, based on page views, we'd probably get more money. But then you tie into this whole tracking contextualization, and it would be kind of creepy to go read an article on mental models and see the book that you searched for last week...

[00:26:52]

Pop up in the sidebar.

[00:26:54]

Exactly. I mean, for you in particular, there's a good example: Daring Fireball, John Gruber. Yeah, he sells his own ads directly. And I think if you have a niche audience like you do, you might actually make more money if you start selling them yourself. I've got to talk to him about pricing. Yeah. Yeah, I want to come back to that. Oh, cool. I want to come back to privacy and hardware, and I want to take the role and expand it.

[00:27:23]

So thus far we've been talking about the individual and the role of the ecosystem, sort of e-commerce.

[00:27:30]

I want to talk about how that role shifts as we move up the stack. And by up the stack, I mean: you have a local government, you have a state government, you have a country, and then globally. Where do you stand on privacy?

[00:27:47]

Should the government be able to break into your phone? Do you think those companies should make that physically impossible, or should the government have a key? Walk me through it. Yeah.

[00:27:59]

So there are a lot of nuances in here, but I'm of the view that privacy comes down to a fundamental principle.

[00:28:09]

Privacy is a fundamental right; you have a right to private communications. In the past, you could write a letter, you could write things down on paper. Now it's in your phone. I still think you should have that fundamental right.

[00:28:22]

Just to be clear, that's not just a U.S. right. You believe that's fundamental worldwide.

[00:28:27]

Yeah. You should be able to, like, send an email to somebody anywhere in the world and have that.

[00:28:33]

That's not currently the case with email. Yeah, but yeah, I think so.

[00:28:36]

I mean, this gets into, you don't want to tell other countries how to run their government.

[00:28:47]

You know, maybe it's crossing some line. But at least in the US, the idea has always been that you have a private thought space. You should have private space. Your home is your home.

[00:28:58]

And without really good cause, no one should invade that privacy. I think if you start invading privacy in that way, it leads to all sorts of bad effects. The chilling effect is a big one: people change their behavior once they feel they're being watched or don't have privacy. There are all sorts of good examples of this. To bring it back to search: right after Snowden, there were two studies that were really interesting.

[00:29:27]

One is, people stopped searching terrorism-related terms, because presumably they felt they might be investigated if they searched terms like al-Qaida; searches like that went down. Wikipedia also saw fewer views of those articles. That's interesting. I never knew that. Additionally, someone did a similar analysis over the same time period and found that health-related terms also went down on Google. So people stopped searching as much for their own health symptoms and things like that. And the thought was, now that they were aware their data was generally being tracked, they were worried about it leaking and coming back to haunt them, maybe from insurance companies or other things.

[00:30:10]

And so it went down.

[00:30:12]

But that is just immediately harmful for people, because they're not searching for their health information. Those are two minor examples, but when you get into things like China, you get into much more major ones: people are afraid to say anything bad about the government, or to have public discourse. So I think you want a private space, to be able to communicate with people completely privately. And so I do think that end-to-end encryption, which is kind of what you're talking about, like what's built into iMessage...

[00:30:43]

Now, that was the Apple-FBI case. I should be able to send you a text message, and no one should be able to read it except the two of us. Now, the counterargument to that is, well, what if people are sending really bad things? I think that's a bit of a distracting argument, for a couple of reasons. One is that there's actually more data about you online than ever before.

[00:31:07]

So what does that mean? You're putting out all sorts of data online. It's much easier to surveil you than it has ever been in the past. Your phone is sending out location information, which isn't necessarily encrypted. Even on the texts that we send each other, there's a record that it went through the Internet, at least. So maybe you can't read the message, but you can probably figure out from the metadata that we talked.
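A small illustration of that point, assuming nothing beyond the widely used Python cryptography package: even with the body strongly encrypted end to end, the delivery envelope still reveals who talked to whom and when. The envelope format here is hypothetical.

```python
# Sketch: end-to-end encryption hides the body, not the metadata.
# Requires `pip install cryptography`.
import time
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # shared only by the two endpoints
cipher = Fernet(key)

envelope = {
    # Metadata the network necessarily sees to deliver the message:
    "from": "alice@example.com",
    "to": "bob@example.com",
    "timestamp": time.time(),
    # The body, which only the endpoints can decrypt:
    "body": cipher.encrypt(b"meet at noon"),
}

# An observer on the path learns that Alice messaged Bob at this time,
# even though the body is unreadable without the key.
print(envelope["from"], "->", envelope["to"], envelope["body"][:16], "...")

# Only a holder of the shared key recovers the content.
print(Fernet(key).decrypt(envelope["body"]))
```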

[00:31:31]

All of this information didn't exist in the past. And so if you look at the overall information law enforcement and the government have to surveil people, even with

[00:31:41]

end-to-end encryption existing, it's way higher than it was 10 years ago, and way, way higher than it was 20 years ago. So you believe the government should be able to use metadata?

[00:31:51]

I think we should restrict it a lot more. I'm just saying that the general argument that they need every last piece of information is not a good one, even on their own terms, because they have more information, and all the tools that exist are much better than they were 20 years ago. No, I think the metadata should be restricted too. I really think we should be able to communicate privately without anyone knowing. But the reality of the current tools is that it is discoverable, because when we send a message to each other across the Internet, unless we're using even more sophisticated tools, it's figure-out-able that we talked.

[00:32:26]

So, and I'm just trying to think through this: individuals could communicate completely privately. Governments could communicate completely privately.

[00:32:37]

Well, that's the other point, about the key. If you enable a back door, which is what some governments are requesting, it is really naive to think that that back door would only be accessed by the government that wants it. And that was one of Apple's major points: if I build this back door for you, U.S., China is going to use it. And not only that, it's probably going to get leaked, and all sorts of bad actors are going to have it.

[00:33:04]

For example, and these are out there: the NSA made all sorts of hacking tools. Those eventually got leaked, and they're all floating around the Internet now. Tools that the NSA made for themselves are now available to all sorts of other people. And so all this stuff has a tendency of coming out and being used and exploited if it exists. And that's Apple's point: the only way to protect against that is to have no back door.

[00:33:30]

So do you think that will be fundamentally impossible? I mean, I think in the San Bernardino case with the iPhone, Apple basically said it wasn't possible. Then they admitted it was possible, but they weren't going to do it.

[00:33:44]

Do you think we move to a world where they actually want to be able to say it's impossible for us to do it?

[00:33:53]

I think that would be ideal. I don't know how possible that world actually is, because you have to manufacture the phones and things like that, but it's becoming harder and harder to break, built into hardware and things like that. On the Internet, the closest thing we have to that is Tor. Can you explain that for us? Yeah. You can basically go into a special browser that kind of works like it does in the movies, where your traffic bounces off things before it gets to the person.

[00:34:27]

It makes like three hops to random places before it gets to the other side. And so if both of us are using Tor to communicate, it's extremely difficult to break that. And as far as we can tell, Tor itself isn't really broken. There have been some exploits in the Tor browser, like using Flash. But news organizations now, to receive anonymous tips, use something called SecureDrop, which works over Tor, and it's worked decently well.

[00:34:59]

And so that's what you need to do, in the non-hardware world, to communicate without anyone even knowing.
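Here's a minimal sketch of that layering idea. Real Tor negotiates circuits and uses different cryptography, so treat this purely as an illustration of "three hops, each peels one layer," built on the same cryptography package as above.

```python
# Sketch of onion routing: the sender wraps the message in one
# encryption layer per relay, so no single relay sees both who sent
# the message and what it says.
from cryptography.fernet import Fernet

# One key per relay in the three-hop circuit (entry, middle, exit).
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes, keys: list[bytes]) -> bytes:
    # Encrypt for the exit relay first, then middle, then entry,
    # so the entry relay's layer ends up outermost.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

onion = wrap(b"anonymous tip", relay_keys)

# Each relay, in order, peels exactly one layer and forwards the rest.
for i, key in enumerate(relay_keys):
    onion = Fernet(key).decrypt(onion)
    print(f"relay {i} peeled a layer")

print(onion)  # only after the final hop is the plaintext visible
```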

[00:35:04]

How do you think about government's role, not with their citizens, but in terms of the world? Walk me through your thinking. I don't want to use specific countries, but should one country be able to eavesdrop on another country that they feel might pose a significant threat to their citizens?

[00:35:30]

I mean, I don't think there's any stopping countries from doing that. They've done that from the beginning of time. And so they're going to have an incentive and desire to have espionage agencies. I don't think it's really possible to prohibit that.

[00:35:45]

But then the tools exist, the very tools that you would use against your citizens, you would be using against foreign entities.

[00:35:55]

They're making all the tools they can. In that process, what I'm suggesting is that people like Apple, and DuckDuckGo, should be the opposite of that, and enable citizens to do the best they can to avoid that kind of surveillance, either corporate or government. What got you so interested in privacy?

[00:36:14]

Interesting. I'm interested in general public policy of all kinds. You know, to tie it to the mental models book, I'm interested in everything, and I have this degree in general public policy and technology policy. I started this company and kind of backed into search privacy in particular. I had sold my other company. I was trying to find something that I really wanted to do, and I got really interested in search in general.

[00:36:40]

And eventually I started this search engine that wasn't initially private.

[00:36:46]

I hadn't thought about it. It was more just trying to make better results. And then I got some initial messages after launching, like, well, what's your policy on search privacy and stuff? And so, out of just interest, I took more of a deep dive on it and decided: oh, wow, this is really personal data, arguably the most personal. When you search, you kind of give your financial problems and personal problems to your search engine.

[00:37:13]

You don't need to track people to make money, because it's all based on contextual advertising. So the better user experience would be not tracking anybody at all.

[00:37:20]

And so I just made that unilateral decision. It was just me; the company was a one-person company. And that's basically how I fell into it.

[00:37:29]

Funny aside: remember when Gmail used to display ads based on what was in your email? Yeah. Yeah.

[00:37:36]

So a couple of friends of mine and I figured this out. What we would do is put keywords at the bottom of the email, but in white text, so people would get ads that had nothing to do with the email.

[00:37:47]

We could control what ads they were seeing based on the keywords we were putting in.
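For the curious, the trick amounts to something like this; the keywords below are arbitrary examples, not the ones actually used.

```python
# Sketch of the white-text trick: keywords hidden at the bottom of an
# HTML email, invisible to the reader but visible to a keyword-matching
# ad scanner.
hidden_keywords = ["sailboat", "mortgage", "espresso machine"]

body = """
<p>Hey, lunch on Friday?</p>
<p style="color:#ffffff;font-size:1px">{}</p>
""".format(" ".join(hidden_keywords))

print(body)  # the reader sees the lunch invite; the scanner sees it all
```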

[00:37:51]

How does search work? I'm really curious about the back end. Walk me through it.

[00:37:58]

We can go into as much detail as you want. Walk me through how DuckDuckGo builds a search database, how it determines relevance for a search term at a particular time, and then how that relevance changes over time.

[00:38:15]

So I started out originally doing all the build-your-own-index kind of stuff, and then quickly realized it is really expensive.

[00:38:27]

And building your own index means going out, touching every Web site, keeping a copy of it... Yep. Creating metadata based on the site. Yep. I just want to make sure that everybody understands the terms. Yeah, right. Crawling the whole Internet, basically. And I realized a few things. One is, there were a bunch of companies who had raised a bunch of money, all trying to do that, basically compete with Google at their core game.

[00:38:51]

And I realized it was somewhat unnecessary, because there were a few different Web indexes that existed at the time, and they had reached diminishing returns in quality. So you could get good Web index results from Bing, which wasn't Bing at the time, it was MSN. You could get them from Yahoo's index. At the time, Ask Jeeves was still crawling, and there were a bunch of other ones.

[00:39:19]

So it was unnecessary to focus on that; you could treat it as a commodity and instead try to differentiate on other things. I also realized that we were one of the first people to do instant answers. And the major thing that's happened in the last 10 years in search is that about half the time when you search, you don't click on the actual Web results; you just get the answer right in the results.

[00:39:40]

Yeah. Not the autocomplete stuff, but where you get weather or product instant answers or sports scores or local restaurant listings or whatever it is. Those don't come from Web indexes. Those are Wikipedia information; those are vertical indexes. And no one was doing that, and we started. So what I originally figured out was, I'll just try not to spend all my time on the Web index and focus on instant answers. And the first thing I focused on was indexing Wikipedia and getting really good knowledge graph answers over time.

[00:40:22]

The other thing I ended up doing, and this was way back in 2007: there was a lot of spam in web search back then. There were all sorts of content farms and things like that.

[00:40:30]

So we did crawl the web looking for spam results and then removed them from the index. It was like a negative index. In any case, what's basically happened is, we're a small organization, like 65 people.

[00:40:46]

And now it's gotten so much more expensive to crawl the web and maintain a deep index. It's probably hundreds of millions of dollars a year just to maintain it, arguably maybe a billion. And most of those indexes went away.

[00:41:00]

Also, the individual verticals have gotten more expensive. Yelp is best in class for restaurants. And you have maps; there are only a couple of map providers. So what we've decided to do instead is maintain privacy at all costs, but try to work with the best partners in all these different verticals. And so our basic technology, therefore, is getting a query, trying to figure out what the best instant answer provider is, and who all the providers are that we need to work with to deliver you the set of results for any given search.

[00:41:35]

It might be two or three different providers. For the Web results, does Google give you access to theirs? We currently don't work with Google at the moment, but there's still Yahoo and Microsoft. We do work with Google on YouTube, because it's the only place where the videos are.

[00:41:51]

Video search is another big one. We work with Yelp and TripAdvisor, and there are literally like four hundred different partners, because it goes into all the different long-tail stuff. Some of the stuff we built ourselves. We still use Wikipedia and build that ourselves, and things like Stack Exchange and Stack Overflow; we build our own index of that.

[00:42:10]

So a bunch of these we actually do ourselves, but we also work with all these other partners, and then on any given search we try to figure out: is this a local search, is this a programming search, what kind of search is this? And then give you results back.
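A hypothetical sketch of that dispatch step, to make the shape of it concrete. The classification rules and provider names below are invented placeholders, not DuckDuckGo's actual pipeline.

```python
# Sketch: classify a query into a vertical, then route it to the
# matching provider.
import re

def classify(query: str) -> str:
    if re.search(r"\b(in|near)\s+\w+", query):
        return "local"            # e.g. "pizza in new york"
    if re.search(r"[<>+]|\bpython\b|\berror\b", query):
        return "programming"      # special characters matter here
    if re.search(r"\b(news|election|today)\b", query):
        return "news"
    return "web"

PROVIDERS = {
    "local": "yelp_api",          # placeholder provider names
    "programming": "stackoverflow_index",
    "news": "news_partner",
    "web": "web_index_partner",
}

for q in ["pizza in new york", "python list + list", "election today"]:
    vertical = classify(q)
    print(f"{q!r} -> {vertical} -> {PROVIDERS[vertical]}")
```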

[00:42:24]

And then we have to make decisions like, how newsy is this search? Where should we display news results? Should they be at the beginning?

[00:42:31]

Should they be the fourth result? That kind of thing. You mentioned indexing Wikipedia, and then you mentioned a knowledge graph. What does that mean?

[00:42:40]

So with Wikipedia, you read the pages programmatically.

[00:42:46]

Unfortunately, Wikipedia is somewhat unstructured. It would be better if it was way more structured.

[00:42:51]

What's the difference? Structured would be, you know: here is the person's name, here is their age, and everything like that.

[00:43:02]

Wikipedia is edited by anybody, and it's got all these different rules, but it's basically just a lot of text with some markup on it.

[00:43:09]

And so if you index it, you have to figure out all the different edge cases for all the different markup and somehow extract the name and the age and all that stuff. Right. How do you do that? There's a lot of code.

[00:43:24]

So do you literally just write code for each of the exceptions? There are a lot of case exceptions, yeah. They try to standardize it, but it's not standardized as well as it should be. Yeah. It's a large code base. But you're effectively standardizing it for them, in a weird way.

[00:43:42]

Or am I misunderstanding? Yeah, we're kind of reverse engineering it, if you will: turning it into a standardized format for our own purposes.
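As a toy example of that reverse engineering, here's how you might pull "name = value" fields out of one well-behaved infobox. The real work, as he says, is the long tail of edge cases (nested templates, references, per-template rules) that this ignores.

```python
# Sketch: extract structured fields from a Wikipedia infobox's wikitext.
import re

wikitext = """{{Infobox person
| name        = Ada Lovelace
| birth_date  = 10 December 1815
| occupation  = Mathematician
}}"""

def parse_infobox(text: str) -> dict[str, str]:
    fields = {}
    # Each "| key = value" line becomes one structured field.
    for key, value in re.findall(r"\|\s*(\w+)\s*=\s*(.+)", text):
        fields[key] = value.strip()
    return fields

print(parse_infobox(wikitext))
# {'name': 'Ada Lovelace', 'birth_date': '10 December 1815',
#  'occupation': 'Mathematician'}
```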

[00:43:51]

And then could you give that information back to Wikipedia and have them adjust the page at least in like metadata or something?

[00:43:56]

There have been some projects like that, and they haven't gone very well. OK. Yeah, other people have tried. There's a thing called Wikidata and such. But yeah, that's effectively what we do for Wikipedia.

[00:44:07]

And so the graph then is like, the author is this, the relevant dates are this. Yeah.

[00:44:12]

And then how do you pick things out? When you read text through the computer, what is the algorithm considering in terms of trustworthiness of the site?

[00:44:24]

What is the algorithm considering in terms of how to find the relevant passage in the text for the query that you're typing?

[00:44:34]

It really depends on the vertical.

[00:44:36]

And so, like, can you walk me through a couple? Yeah.

[00:44:39]

So local search, for example, restaurants and whatnot. When you do a local search, you usually have a "what" and a "where" clause. So it's like, I want pizza in New York, or whatever. Right. I mean, that's broad.

[00:44:55]

It could match millions, yeah.

[00:44:59]

And so for those local searches, we have to figure out the what and the where, pull that out kind of semantically from the query, and then do different things with them, like figure out what the location is. And even that gets more complicated, because if you type Paris or something about Paris, are you talking about France? Are you talking about Paris, like, Indiana? There are a bunch of Parises in the U.S., I don't know if I have that right.

[00:45:23]

And same thing with the what: is pizza a category, or is it the name of a restaurant?

[00:45:29]

And so in these verticals you have different indexes and different rules and so forth. Stack Exchange, for example, or Stack Overflow as another example, which is programming answers: that has a lot of special characters that you have to do stuff with. People type in programming queries with pluses and less-than and greater-than signs, and in other contexts you would ignore those. Right. But there they're super relevant. Yeah, they're super relevant.
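A minimal sketch of the what/where split, with an invented two-entry gazetteer standing in for real location data. The "prefer the biggest city" rule is just one plausible default, not what DuckDuckGo actually does.

```python
# Sketch: split a local query into "what" and "where", then
# disambiguate the location against a (hypothetical) gazetteer.
GAZETTEER = {
    "paris": [("Paris", "France", 2_100_000), ("Paris", "Texas", 25_000)],
    "new york": [("New York", "New York", 8_400_000)],
}

def parse_local_query(query: str):
    what, _, where = query.lower().partition(" in ")
    candidates = GAZETTEER.get(where, [])
    if not candidates:
        return what, None
    # Naive disambiguation: default to the most populous match, and let
    # a region setting or the user's location override it in a real system.
    return what, max(candidates, key=lambda c: c[2])

print(parse_local_query("pizza in paris"))
# ('pizza', ('Paris', 'France', 2100000))
```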

[00:45:57]

So what ends up happening is, you really need to first decide what domain the query is in, and then apply different rules, different relevancy rules and search rules, per that domain, and do different things with it. How should we think about search moving forward?

[00:46:18]

I'm just trying to contextualize a couple of things here, one of which is an argument that Google would probably make: if you're standing in New York and you search for pizza, you probably want local, contextualized results, and without your permission, we'll just give them to you, because that gives you a better experience. One thing about local, which is interesting, is you can do that.

[00:46:36]

So, without tracking people, on DuckDuckGo, your location is generally sent over by your browser automatically. And we can do that contextually and throw away the location after that query. So how does that work? Your location is sent by your browser? Yeah. So no matter what browser you're using, Chrome, Internet Explorer, there's some location built into your IP address.

[00:46:56]

When you connect to anything, you're sending your address over, so you can get a rough location based on that. There's also something called browser location, where sites request: can we use your location? With an app or a site, that'll send a more granular location based on the GPS on your phone. And on DuckDuckGo, if it's a very hyper-local query, we'll ask for that: would you allow us to use your location?

[00:47:21]

But we promise to throw it out immediately. We don't store it at all. We just use it to bring you back the local restaurants around where you are and then it's gone. So there's no location history or anything like that.
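In code, the promise is essentially that the coordinates never outlive the request. A hypothetical sketch; the search function is a placeholder, and the point is simply that nothing is written to any store.

```python
# Sketch: use the browser-provided location for one query, then let it go.
def hyperlocal_search(query: str, lat: float, lon: float) -> list[str]:
    results = [f"{query} result near ({lat:.2f}, {lon:.2f})"]
    # No logging, no user ID, no location history: when this function
    # returns, the coordinates have only ever lived on the stack.
    return results

print(hyperlocal_search("pizza", 40.71, -74.01))
```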

[00:47:33]

And where is search going, in your mind, in the next five years? Is it going to voice? Is it going more mobile? Is it going to let you search based on images? Any of these trends that we hear a little bit about in the media, are they what you're witnessing?

[00:47:50]

So what's interesting about the history of search is that those trends are real, but they've never supplanted earlier kinds of search queries.

[00:47:58]

The rise of mobile search queries has been great in the last 10 years, but desktop search queries actually haven't decreased. People are just searching more. So that's one.

[00:48:09]

And more of that new share is mobile. Yeah, definitely.

[00:48:12]

Voice has been slower than I think people predicted, but it's definitely a new thing, like people aren't searching less on mobile devices; it's just other queries. So that's one thing. The second thing is the rise of instant answers. Getting things more instantly, not wading through results, has just continued and continued. And I imagine that'll keep continuing.

[00:48:37]

Does that mean that Google, Facebook, Amazon, and to some extent DuckDuckGo get to play gatekeeper? Like, do you get to pick winners?

[00:48:45]

There's a big gatekeeping role. And I think that's been behind a bunch of the competitive complaints.

[00:48:51]

That's an argument in search right now: the user experience is arguably better if you give somebody one result. But it's a gatekeeping role.

[00:49:03]

How do you think that that should be handled?

[00:49:06]

So what we've been trying to do, and we have the luxury here because we don't make our own ones...

[00:49:13]

...in all cases, all the time, is to give people choice. So when we have map results, for example, people can choose which provider they want for directions when they go off of DuckDuckGo onto a directions provider.

[00:49:29]

And so that's the model I think would be good: giving people more choice. But even that has its own problem, because there are hundreds of providers, and you enter the paradox-of-choice model, or Hick's law.

[00:49:44]

You can't give people a billion choices. Yeah.

[00:49:47]

What if I want to start up a company? Yeah, exactly. Well, then, you can't give a dropdown of a hundred things, because you'd just confuse people. And so at some point you do have to make choices. We've been trying to work with the biggest partners that have the most breadth and best results.

[00:50:04]

And so the way that you think about making that choice is, who's the biggest partner with the best sort of results?

[00:50:10]

But even then... we're not here yet, because it's not the case in every country, but you can imagine, if you were really into Amazon versus Target or something, we give you a choice: would you prefer Target results or Amazon results, or Yelp or TripAdvisor results? You'd have a favorite brand you could choose, and then, OK, you see more of that when you get those answers.

[00:50:33]

But then you're still sort of anti-competitive to new, up-and-coming providers. Yeah, I mean, I think it's inherently a problem with instant answers.

[00:50:42]

So do you think, in a weird way that we haven't had before, and I'm just thinking out loud and throwing this out there...

[00:50:48]

...do you think bigger gets bigger? I mean, I think in general, when you have data network effects, you have winner-take-most markets. Because I'm thinking, if I wanted to compete with Wal-Mart, I could open a store and I could compete with them.

[00:51:03]

And people would see my store and they would find it.

[00:51:05]

They would try to buy from it. But online, that's not necessarily the case. Just because you create a website or you have a better product doesn't mean that you get noticed.

[00:51:15]

No, I mean, there definitely are equivalent ways to get noticed. But what you're basically saying, which I think is generally true, is that capitalism in general has scale effects in a number of ways: economies of scale, network effects, et cetera. And that is a feedback loop that makes things bigger unless some disruption happens. And I think that naturally happens. And there's a good argument to show that that's happened a lot, at least in U.S. markets, in the past 20 years.

[00:51:47]

And not even just online; even offline. Right. Lots of industries, for example, come to mind. I just read this book, The Myth of Capitalism, that had all sorts of examples. They're kind of escaping me now, but eyeglasses was one of them, and I think oil refining, because they just control...

[00:52:16]

Effectively, the lenses are all controlled by one company.

[00:52:19]

Yeah. It just ends up being scale. It's all the same effects. Companies get bigger, they buy up other companies. There have been a lot of acquisitions, nothing nefarious necessarily, right? But over time the concentration becomes more and more and more, and that creates less innovation.

[00:52:38]

And that's kind of the general argument.

[00:52:41]

That's interesting, because as you're talking about that, I'm thinking one of the big national debates is this wealth inequality. Yeah.

[00:52:47]

And they actually directly tie this, in that book, to the same thing. OK, yeah.

[00:52:51]

Because I was going to say, we don't tend to talk a lot about company inequality, where you have this big disparity between the, I don't want to call it winners and losers, but the bigger companies get bigger almost as a necessity of them being bigger.

[00:53:07]

They can make more acquisitions. They have more preferable terms with people. They can be front and center. They can buy advertising more than other companies. They can generate more data that they can use to make their products better.

[00:53:20]

Yeah, there's a direct tie to inequality there, in a couple respects. The biggest one is that a lot of the newer companies, especially online companies, have a lot fewer employees than the older companies. There's just more leverage online, because digital has no marginal cost, generally, right. And so it takes fewer employees to reach the same market cap.

[00:53:42]

And so there are fewer people benefiting from that.

[00:53:45]

Like comparing Berkshire Hathaway, which has like four hundred thousand employees to Google, which I don't even know what they have, but it's nowhere near that.

[00:53:52]

Yeah, exactly. And then even that the subset of the, you know, employees who are making lots of money is lower RAND. And that's very different than, say, like a Walmart who employs like a million people or something in the U.S., you know. And so over time, the rise of online and bigger companies has created some of the wealth inequality. At least that's the theory.

[00:54:14]

Before we move on out of this area, I want to talk about some of the things that people can do to tangibly reduce their footprint if they want to. Downloading DuckDuckGo is obviously one of them.

[00:54:28]

But what are the other things people can do to reduce their footprint or increase their privacy online? Let's start there.

[00:54:36]

Yeah, so DuckDuckGo-wise, that's been our goal, is to give you, like, the one download you need, right. And so on iOS, Android, Chrome and Firefox, you can download the browser extension or our browser, and it has all the kind of essentials for search and browsing.

[00:54:54]

In addition, we have a blog at spreadprivacy.com that has a device tips section. And so for any major device you have, could be laptop, desktop or phone, there's a set of settings that you should probably change, you know, on your iPhone or your laptop, that would generally help you. That kind of helps you on the main search and browse side of using the Internet.

[00:55:21]

Then there are all the other services that you use that aren't search and browse. And for all these services, there are generally more private alternatives, some really private. But also, just getting off of Google and Facebook helps you not put all your eggs in one basket. And so for email, there's ProtonMail, which is very private. I use something called FastMail. FastMail is not built just for privacy; they're just an alternative paid email provider.

[00:55:53]

Right. You know, and so moving your email off of Gmail, the company that provides your search. Yeah. You want to separate your data as best you can.

[00:56:05]

Independent of all of that, you know, there are a bunch of these people-search sites where you look up your name and you see, like, Shane Parrish, whatever age, whatever. You know, you can opt out of most of those.

[00:56:21]

OK. There are some paid services that will do it for you, but there are some lists where you can just go through and request your removal yourself. I'm actually super happy with all that.

[00:56:31]

This guy in Australia, Dieter Brummer or something, was like a TV star.

[00:56:36]

And so the number one result for Shane Parrish is, like, that guy. That's great.

[00:56:41]

That's awesome. Yeah. So, I mean, those are the things, I think. If you want to get more extreme, there are other things, like we're talking about using Tor or something like that. But in terms of just, like, no sacrifice, very seamless, you know, I would use something like DuckDuckGo, a tracker blocker, private search, more encryption, and then tweak your settings. Do you have Instagram, Facebook yourself? No.

[00:57:07]

So I quit Facebook, I want to say, like, seven years ago, maybe more.

[00:57:14]

Around 2010. And I don't use Instagram. I don't really use social media, except I use Twitter.

[00:57:23]

Do you think that there's a correlation between unhappiness and the use of social media?

[00:57:30]

So I do. I will say, I hedge it with some more mental models, I really try to go with the thinking grey model of not totally making up my mind on anything. But all the research that I've seen, and I haven't in-depth read all the studies, you know, shows that quitting Facebook has had very positive effects for people. So not only the wallet effect, which you talked about earlier, but just, people feel less isolated, you know, they feel less lonely, in a counterintuitive way, when they quit Facebook.

[00:58:06]

And so what explains that?

[00:58:09]

Like, in your mind, what are the probable reasons that would explain the majority of it?

[00:58:16]

Yeah, I mean, I think it would be interesting, though I'm not the expert, I haven't done the research. But my guess is twofold. One is there's just an opportunity cost for your time. So you were just spending a lot of time wasting time on Facebook; you substitute that time with something else, which is probably better time spent that makes you happier, whether that's talking to other people or, you know, reading something else that's better.

[00:58:42]

At least that was certainly the case with me. The second is the actual content that you are engaging with is sometimes pretty toxic. And so there are kind of two effects of that that have been talked about a lot. One is, like, you're seeing this kind of Instagram, other people's awesome life effect. You're seeing other people being seemingly happier than you or something like that. But it's often kind of a ruse, because they're posting, like, the fake photos or the best things that are coming out of their life.

[00:59:12]

The other is just, like, you know, I remember just seeing endless political debates that were very, like, emotionally charged.

[00:59:24]

Yeah. Yeah. And it was just like it wasn't constructive debate. It was, like, ad hominem attacks and things like that. And that's just a lot of negativity. And so you're cutting out negativity. I think that's always a good idea.

[00:59:40]

You know, it's kind of like the, you know, you're the average of the five friends you hang out with kind of thing. You are what you eat, in a way. And so if you reduce negativity and replace it with positivity, you're just going to be happier.

[00:59:56]

Do you think we can have constructive online debates and what do they look like? I think so.

[01:00:02]

I think it's possible. Have you ever listened to Intelligence Squared, that podcast? It's pretty good. I only listen to one.

[01:00:10]

The Knowledge? Yeah, fair enough. Oh, they've done maybe two hundred of these. It's a structured debate podcast. They have a topic, and they've had a bunch of social media ones, like, is social media good for you or bad for you kind of thing. They have four really good experts on, and then they play both sides of it in a moderated debate.

[01:00:33]

It's really good, and it's totally civil discourse. We actually listen to it with our kids a decent amount. How old are your kids? Ten and seven. And, yeah, so I think something like that would be good.

[01:00:49]

I've worked on a project with somebody else who's been working for a couple of years on this very topic, and we've been going back and forth on different ways you can kind of do that. I don't think he's really hit the nail on the head yet, but I do think it's possible. To your point, do you think it's possible to have a national conversation about a topic in a constructive way? I don't know exactly what that means, because it's just so many people, you know. What we were kind of envisioning, and part of the problem, there's kind of a bunch of mental models in here...

[01:01:30]

But, like, people effectively just talk past each other in a lot of these kinds of debates.

[01:01:35]

And so take something like climate change. There are actually, like, you know, ten different debates within climate change, probably more. And I made a flowchart one time to try to isolate, OK, if you're talking about individual arguments, which part of the argument are you in?

[01:01:50]

Like, OK, do you believe it exists? If you do believe it exists, do you believe it's manmade? If you believe it's manmade, do you believe we can do anything about it? You know, do you believe there's harm? Do you believe it's imminent harm? You know, breaking it down into its elements.

[01:02:03]

Yeah.

[01:02:03]

And so, like, in any particular argument, what you really want to do, what I try to do at least, is if you really want to engage in debate with somebody, and suppose you think you disagree on something, you want to break down the argument into premises and find which one you actually disagree on. Something like climate change...

[01:02:28]

There's like a ton of different underlying premises.

[01:02:31]

And so you have to go down and be like, oh, we agree on that. Do we agree on that? Do we agree on that? Right. And that's not generally what happens when you have a public debate. What happens is people throw out their fact, and maybe someone else throws out another fact.

[01:02:43]

They're arguing, like, totally different things. And so you can't have a productive debate unless you're arguing on the same premise. And so that's what I was hoping this kind of software for structured debate would do: help people zoom in and figure out, what do you really believe on this topic? Which premises do you believe or not? And then, if you're going to engage in an argument about it, let's define the scope of what premise we're arguing over. Could a search engine do that?
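To make that concrete, here's a minimal sketch in Python of the kind of premise tree he's describing. The names and the climate-change encoding are hypothetical, an illustration of the idea rather than anyone's actual software:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Premise:
    """One node in a debate: a claim plus the sub-premises it rests on."""
    claim: str
    agreed: Optional[bool] = None  # None = not yet discussed
    sub_premises: list["Premise"] = field(default_factory=list)

def first_disagreement(premise: Premise) -> Optional[Premise]:
    """Walk the tree and return the most specific premise the participants
    actually disagree on: the real scope of the argument."""
    if premise.agreed is False:
        for sub in premise.sub_premises:
            deeper = first_disagreement(sub)
            if deeper is not None:
                return deeper
        return premise
    return None

# The chain of climate-change premises he lists, encoded as a tree.
climate = Premise(
    "We should act on climate change",
    agreed=False,
    sub_premises=[
        Premise("Climate change exists", agreed=True),
        Premise("It is manmade", agreed=True),
        Premise("It causes imminent harm", agreed=False),
        Premise("We can do something about it"),  # not yet discussed
    ],
)

print(first_disagreement(climate).claim)  # -> It causes imminent harm
```

The point of the exercise is that "do we disagree?" becomes a question about one specific node, not about the whole topic.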

[01:03:10]

Like, if I Google climate change, why couldn't you take that? You had broken that down into, like, eleven different things.

[01:03:15]

Why couldn't the result be sort of like, on the left you get one side, in the middle you get the argument, and on the right you get the other side? And, like, that as a result you can click through. That would be kind of ideal.

[01:03:29]

That would be like embedding this software that this gentleman's building. It was kind of context, you know, to really give you context on the right. And then once you get into a premise, you can have a more reasoned discussion, because you can be like, OK, what is all the evidence, pro and con, on different aspects of this premise? And then you can cite where the sources come from, and people can comment individually on those sources and say, like, OK, well, the main fact that everyone's behind on this is here.

[01:03:59]

And then people can be like, well, I disagree with the methodology of that study, or whatever it is, you know, but you're having a much more reasoned debate. And if you approach a topic, you can come in and say, OK, here are all the different parts of this; I can get up to speed on this topic really easily, right. And that's kind of the idea. Whereas some of these topics now, you just don't even know where to start.

[01:04:20]

Why don't the newspapers do that? Sort of like, instead of publishing one side, publishing both sides of an issue simultaneously, with the middle. It would be great.

[01:04:31]

There are a few sites online that have tried. There's one called, like, ProCon, where you're kind of researching for this. Vox does these interesting explainers, you know, which are pretty good, but nothing really captures this the way I think it should be. So to answer your question, I think yes, I think it is possible, OK, but nothing's really hit upon it yet. All right.

[01:04:51]

If I wasn't doing this, I might work on that problem. Maybe you should do both.

[01:04:57]

And maybe let's talk mental models.

[01:05:00]

You have a book coming out, Super Thinking. I read it, it's great. What made you create the book?

[01:05:11]

So, in a way, it's been ruminating for like twenty years. Like, I had wanted to write, and started writing, different versions of this book for a while, but it would literally lapse for like five years. I could get into it if you want.

[01:05:27]

But then what happened was, maybe four years ago, DuckDuckGo started to take off more, and we have not really hired a lot of people from outside the company; we've really tried to grow people within the company. And so we have this executive team that, you know, needed to step up and learn more. And I was trying to understand, OK, how do I train our executives? Like, what do they need to know to really, actually make good strategic decisions more of the time?

[01:05:58]

Because that's really what executives do.

[01:06:01]

And I got to thinking, OK, what I really think it is, which is really counterintuitive, is I think they need to know, like, these three hundred different topics.

[01:06:13]

The mental models. And I think you've come onto the same notion, which is, like, if you knew all these things and internalized them and could pattern match relatively effectively, like, this situation applies to these three mental models, then you can skip lower levels of thinking and have this whole, like, higher-level conversation really quickly.

[01:06:31]

Right. Whereas if you don't know these mental models, you're, like, starting from scratch every time. You kind of don't know the design patterns of strategic thinking. And so I started making a list. And I initially made a list of, like, a hundred things. And I asked somebody, how many of these do you know? You know? And they were like, I know, like, 30 of them.

[01:06:54]

Were these one hundred things that you applied? Or did you make this list like, you went through the university 101 courses and asked, what are the big ideas? Or out of your head?

[01:07:01]

I was like, these are the hundred things that keep coming up again and again, mental models that I continually use.

[01:07:09]

And so did you always think like that, or was this something that evolved? Like, did you come out of school going, oh, I'm going to think about this in mental models?

[01:07:16]

Or did you... This is the longer-term history. So I came out of school, my undergrad.

[01:07:23]

I took a lot of courses in different disciplines. And so the original book I wanted to write was this kind of interdisciplinary guidebook, right. You know? And I was like, I think a lot of people would benefit, because they only have one major, right. You specialize.

[01:07:38]

Yeah, you specialize immediately. But then that totally kind of ruins your education, in a way, and you're really missing, you know, all these things. So I was like, what if we just made this interdisciplinary textbook, like a survey course of everything?

[01:07:52]

And I eventually gave up on that. Why?

[01:07:56]

I don't know. It just wasn't coming together. And I started my company and things.

[01:08:00]

And then maybe five years after that, I was thinking about it again. I had come onto Munger at that point; before, though, I hadn't come onto him. And I was thinking, well, another formulation I had of this concept was, you know, I keep coming back to problems, like, problems in front of me, and when you solve them, you want to, like, run down a list of mental models.

[01:08:22]

Like, is this a critical mass problem, or does this have to do with a power law? Is that how you solve them, like, you have a mental checklist that you iterate through?

[01:08:30]

Well, I was doing that at the time, and I was like, wouldn't it be cool if there was something like a problem solver's handbook? You know, when you're faced with a problem, here's, like, a checklist to help jog your memory, to see if you can come up with solutions. So I started down that road, and then I couldn't get that to work either. So then, five years after that, it was more like this problem.

[01:08:53]

And then I ended up with a list of 100 things, and I asked that guy, and he knew, like, 30. And I was like, OK, well, that to me is what people's core curriculum should be.

[01:09:04]

They should learn the rest that they don't know. So then I went and tried to more systematically list all of them. So, like, I went back to Munger, and, you know, he has that quote about 80 to 90 models, but no one ever lists them all, right. So I'm like, OK, well, I'm going to list them all. And then I more systematically tried to brainstorm my own.

[01:09:24]

And then I started going down the rabbit hole of looking in depth at each one, like, what are the related articles on Wikipedia, and all these different things, just jogging my memory of other things, most of which I already knew, some of which I learned from scratch. And then I put out a blog post, mental models I use repeatedly. It ended up being about two hundred and fifty, and that did really well; people really took to it.

[01:09:53]

And for a while I was just using that in our reviews internally and saying, like, anything you don't know on this list, you should go read about it. And I linked to all the Wikipedia articles, so it's like, here's a starting point, you know. And then internally we use them, you know, consistently when we're faced with different problems. And then my publisher from my first book reached out and was like, well, that post was really successful, would you consider writing a book on this?

[01:10:19]

And I was like, I just wrote a book, and it was really difficult and took me five years to write. That was the previous book. And so I don't really have time for this. But my wife had just kind of left her job as a statistician and was interested in doing it.

[01:10:33]

And so we decided to do it together. It ended up being an extreme amount of work, and I somehow, like, forgot how much work it was. And it took two and a half years, basically, to turn that list into an actual book. But now it's done, so I'm happy about that. I have so many questions here.

[01:10:53]

But let's start with, like, what are the big five sort of mental models? You carry the hundred with you as well, but what are the five out of those hundred that you think matter most?

[01:11:02]

Yeah, I'm curious what you think, too, because you also have a big list, right? And so what we ended up doing is, you know, the original list was, like, a big list organized by discipline, right? And then when we started on the book, we realized that's actually not the best way to do it, because even though they came out of one discipline, they're actually interrelated, often across disciplines.

[01:11:27]

And so we set out to organize... They're all connected. They're all connected, exactly. So we ended up trying to group them by, really, how are they connected?

[01:11:35]

And so we ended up with nine chapters that lay out about three hundred. And so there are, you know, roughly 40 in each chapter, but they're in a narrative, related fashion, right. And so those general chapters end up being a lot of the kind of general formulations of things you should know. So, like, the first chapter is on bias in general, the second chapter is on unintended consequences, the third chapter is on using your time wisely, that kind of stuff.

[01:12:04]

So within all of those, though, if you're going to boil it down... some of those are, like, business and management, and so some of that doesn't apply to everybody all the time. But the one that really comes up a lot to me is opportunity cost, which is where I started. You know, what I would just tell the executive people, and I still do, is that you shouldn't argue that something's important and therefore we should do it.

[01:12:28]

It should always be: it's important and it's more important than all these other things. You know, and just that frame of mind changes almost everything, because almost every decision you ever make about how to spend your time is, like, am I spending it in the best way relative to my other options? Right. So opportunity cost comes up.

[01:12:45]

Another one comes up a ton, which is more rare, I don't think it gets mentioned as much: forcing function. OK, so a forcing function is an idea that is often operationalized as something in your calendar, where you've preset a time or mechanism where something's going to happen, and it forces you to think critically about something. So, like, in a company that can be a one-on-one standing meeting that you have every week to talk about something, you know. Or, you know, board meetings often serve this function, because it's a forcing function to get you to compile all your metrics and stuff.

[01:13:20]

You know, even if the meeting is not that useful, compiling all that stuff is that useful, right. And you can use it in all sorts of ways, like scheduling in time to go to the gym or whatever it is.

[01:13:29]

What's an example of a forcing function you use that you find super valuable? Yeah.

[01:13:34]

So another mental model that we kind of run through that is this one, our one-on-ones.

[01:13:38]

So in our company, everybody has a career advisor, and they have to have a one-on-one with them every week. And that's an unstructured meeting. The advisee, it's their meeting, but it's a forcing function to get them to think about anything that is going on that week, what's really on their mind. Another one we use is every project needs to have a summary update at the end of every week, kind of like, what happened, what's next?

[01:14:14]

And it's, you know, again, a forcing function to think critically about kind of what's going on. We have kickoff calls and postmortems for every project.

[01:14:25]

Are you also doing, like, agile sprints in the background, or, like, how do the projects get... You know, that's a whole different subject. I mean, because objectives and projects can be more exploratory or not, or may have external deadlines or not, we try to stay away from deadlines if we can. Yeah, but we do have these kickoffs, sometimes midmortems or premortems, right. It's like a forcing function to get people to think critically about what's going on.

[01:14:51]

So we had opportunity cost, forcing function. What would be next on your intuitive list, now that I've put you on the spot?

[01:14:59]

Probably one I mentioned earlier, which is thinking grey. Yeah. And so, yeah, I know, I should have probably brought a list.

[01:15:10]

Oh no, no, no. Yeah, but maybe that's more realistic, because it's the ones that actually stick with you. But I think that one comes up a lot.

[01:15:17]

The idea there is, like, you know, you could view this as confirmation bias, too. Once you commit to a decision, it's really hard to escape confirmation bias, right, because you're, like, committed to that decision.

[01:15:30]

Your psychology wants you to look for things that confirm your decision. If you force yourself to actually not make a decision, or not commit to absolutely believing in something, right, so, I'm leaning this way, but I'm willing to entertain other evidence, then it doesn't become part of your identity, right. And then you're a free thinker. And so that's this concept of thinking grey. People sometimes mean something slightly different by that, which I also agree with, which is this notion of...

[01:16:04]

Well, I categorize that as something else in the book, called black-and-white thinking, where it's like things are either, you know, one way or the other, there are only two options, right? Usually that's complete nonsense. I mean, there are lots of options, there's a whole degree of things. Nothing's usually black and white. You know, in the podcast we did with Annie Duke, she had a really interesting way of thinking about this, which was, just by default, say you're 99 percent certain on everything.

[01:16:33]

Never one hundred percent. Yes. And the reason I love that, right, is, well, you signal to the other person that you are open to changing your mind.

[01:16:41]

And the conversation becomes, like, what would cause you to change your mind? Yes. So you get a more productive debate out of it, but you're also less likely to, sort of, ignore things that are contrary to your beliefs. I love that.

[01:16:53]

That quote from Munger, it's my favorite one, is like, you know, I always want to know the other side of the argument better than the other person. Yeah, yeah, yeah. And that's what I think this thing is all about. It's like, you want to understand all sides of the argument perfectly. Right.

[01:17:09]

And you use these terms internally? Yeah. Yeah. You actually use the terms. And you have, like, I wouldn't say a checklist, but do you have, like... what is your decision-making sort of methodology, or is there one?

[01:17:23]

It really varies.

[01:17:24]

So what we do is, we have a somewhat unique way of running the company, because it's all remote. Everyone's remote; it was remote at my first company, too. And so as a result, and also people are in way different time zones, there's a lot of asynchronous communication. And the way we've run effectively is to make everything really transparent. So the projects are all open to anyone to follow. And to make all that work effectively

[01:17:54]

so that it isn't chaos, every project has a standard scoping template, and so it has, like, a background, objective, impact and complexity assessment.

[01:18:04]

And so when we have a project and have a kickoff call or postmortem, it's in that framework of, like, what was the objective of this project? What are the success criteria, which is another way we define the objective, and we can talk about what it should be. Did we meet it? You know, there's a way to frame that. And you do that before you start the project? Yeah, before we start the project. That's rare.
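As a rough illustration, a scoping template along the lines he describes could be encoded like this in Python. The field names are inferred from the conversation, not DuckDuckGo's actual template:

```python
from dataclasses import dataclass

@dataclass
class ProjectScope:
    """Standard scoping doc: written before kickoff, revisited at the postmortem."""
    background: str              # why this project exists
    objective: str               # what done looks like
    success_criteria: list[str]  # measurable framing of the objective
    impact: str                  # expected payoff, to weigh opportunity cost
    complexity: str              # rough effort/cost assessment

    def postmortem(self, met: list[bool]) -> str:
        """Frame the postmortem against the criteria preset at kickoff."""
        return "\n".join(
            f"{'MET' if ok else 'MISSED'}: {criterion}"
            for criterion, ok in zip(self.success_criteria, met)
        )
```

The design point is just that kickoff and postmortem are framed against the same criteria, written down before the work starts.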

[01:18:26]

Yeah. Well, the other thing is, we're a small company. I mean, I think a lot of companies are like this, but we're competing against companies that have infinite resources relative to us. And so this notion of opportunity cost, we don't have a lot of resources to waste, right. And so one of our kind of core values, we have three articulated, is question assumptions.

[01:18:48]

And what are the other two? Validate direction and build trust. And so with question assumptions, we're basically asking, like, should we be doing this at all? Is there a simpler way to do this? You know? And so we're asking right from the beginning, almost trying to say, maybe we shouldn't do this project, you know. And the only way to do that is to write out all the reasoning, and then kind of discuss it, and then really question, well, maybe we could do this a different way.

[01:19:21]

That's cheaper, better, faster, you know.

[01:19:23]

OK. And so there's no list of, like, mental models on the office wall that you're, like, walking through?

[01:19:28]

Well, it's funny, because I do have... I've gotten to be friends with the guy who runs The School of Thought, OK, if you know it. So he's got some

[01:19:41]

websites. One's called, like, Your Logical Fallacy Is, and it's a list of all the logical fallacies kind of thing. He's got another list of all the biases. And so I have two posters of his on my door.

[01:19:53]

OK. And so those are lists of mental models, but there aren't, like, lists from the book or anything.

[01:19:58]

Do you think awareness or maybe to what extent does awareness of.

[01:20:04]

Because I think people get this list of cognitive biases and their inclination or default is to be like, oh, OK, well, I have a list of them and I'll go through this checklist.

[01:20:14]

But they're really good at explaining, like, why our thinking leads us astray, but they seem less good at helping us, in the moment or with foresight, avoid that thinking.

[01:20:28]

Yeah, they're really good at explaining why we were stupid. They're really bad at, sort of, preventing stupid.

[01:20:34]

And my experience has been a lot of times, not always, but if you have a list of them, the smarter you are, the more you just convince yourself that they don't apply.

[01:20:45]

So you're actually not even open anymore, because you go through this list, and the smarter, more intelligent you are, the better the story you tell yourself about why you're not overconfident.

[01:20:55]

Yeah, so I think there's some real truth to this. I mean, this is why we have the core value of questioning assumptions. And so I think what you need, often, is somebody else to call you on this stuff.

[01:21:09]

Yeah. And then you, or the organization, need to be open to, you know, thinking about it and changing your mind. Right.

[01:21:17]

That's the culture we're trying to create. And so I do think it's actually pretty hard to do on your own, to your point. Yeah. You know, I think it's better if you have somebody to talk to. And so, for example, I am lucky in that I walk pretty much every morning with my wife for an hour. And that's also how this book got started.

[01:21:42]

We talk about all sorts of things, so many things, but also just current events, and we apply these models and stuff. But she helps, you know, call me on my biases in dinner-table conversations.

[01:21:55]

Yeah. Yeah. And vice versa, you know. And does she have a double degree?

[01:22:00]

Yeah, might do. Yeah.

[01:22:03]

There's a whole chapter in this book about statistics. She was the main one on that, and she was worried that she'd get some wrong, you know, she'd say something wrong. Because what's funny about these mental models is you don't need to be the world's expert to apply them, right. You know, but if you're going to write a book on it, you also don't want to get it wrong; it has to be accurate. Yeah.

[01:22:24]

Yeah. First, what is validate direction? And how does it relate to feedback loops, and what sort of feedback loops do you set up?

[01:22:32]

Yeah. So validate direction is, you know, this gets to your point about, like, decisions, and, is the project going in the right direction? And so what we basically try to set up is another mental model we use really heavily at DuckDuckGo, it could be in the top five, which is the directly responsible individual, the DRI, which we really took from Apple but then expanded. It's that every task, every project, every objective has one person who owns it, and they are the directly responsible individual.

[01:23:07]

And it avoids this diffusion of responsibility, and the bystander effect, other mental models, which is basically, like, if you're on an email with five people, no one responds, because no one felt it was addressed to them. Yeah, same thing in a meeting. If you have action items and no one's assigned, the task often doesn't get done, right, because people are like, well, I thought he was going to do it. Exactly. The worst meetings are when you spend an hour and everybody walks away going, oh, we decided what?

[01:23:35]

But nobody knows who's doing it. Exactly.

[01:23:37]

So to avoid that, we assign a driver to everything, and projects also have to have DRIs.

[01:23:44]

So someone will go on with their tasks. We have the kickoff call, and they're like, OK, I know what my objective is, I'm going to go off and do this thing. But they could go off and spin their wheels, head off in the wrong direction, they could do all sorts of things. And so what we try to say is, there are a lot of other smart people in the company who may be able to do things faster, better, or know the right answer.

[01:24:04]

And so generally, you should be coming back and validating your direction. And the easiest way to do that is to just write out what you're doing. Yeah. And not stop yourself. Just say, look, I'm going to go do this.

[01:24:17]

Does anyone have any ideas, or object to what I'm going to do? If no one objects, I'm just going to keep moving forward. You know, the default is, this is going to happen. But you can... Yeah, exactly.

[01:24:26]

Chime in if you want. And that works really well, because it doesn't stop people in their tracks, but anyone can follow anything. And sometimes people call out certain people that they know might have information about this, you know, or knowledge. And if they do, they can chime in, and if not, they don't. And so that's really the core operationalizing of validate direction.
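Here's a minimal sketch, in Python, of the driver-plus-default-proceed pattern he's describing. The names and the objection window are hypothetical, just to make the mechanics concrete:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class DirectionCheck:
    """A validate-direction post: the DRI writes down the plan, anyone can
    object, and once the window closes the default is to keep moving."""
    dri: str                               # directly responsible individual
    plan: str                              # what the DRI intends to do next
    posted: datetime
    window: timedelta = timedelta(days=2)  # how long others can chime in
    objections: list[str] = field(default_factory=list)

    def object_to(self, who: str, why: str) -> None:
        """Anyone following the project can register an objection."""
        self.objections.append(f"{who}: {why}")

    def proceed(self, now: datetime) -> bool:
        """No objections and the window has passed: default is to proceed."""
        return not self.objections and now >= self.posted + self.window
```

The design choice worth noticing is that silence means go: the post never blocks the DRI unless someone actively objects.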

[01:24:48]

How does this relate to being a parent? Like, how do you teach your kids not only mental models, but sort of, like, prepare them with general knowledge about the world in multiple disciplines? Yeah, that was one...

[01:25:02]

As we said, the original reason I was writing the book was to train executives. But then, when I got into it with my wife, our impetus was more...

[01:25:10]

We think that all kids, including our kids, should learn all these things, and they generally don't learn them in school. And so... Oh, yeah, yeah.

[01:25:20]

I mean, they learn a lot of the concepts in school, but in a domain-dependent way; they don't learn how to apply them. And then some of them they literally don't learn, right.

[01:25:28]

Like, these things, I either had to come across them randomly, or I was really lucky to go to this interdisciplinary graduate program where I was exposed to a lot of them.

[01:25:38]

Right. But had I not done that, I would have been stumbling along, not, you know, knowing these things for a long period of time. So with parenting...

[01:25:51]

We've been trying to really explain things to our kids in an adult way. So, like I mentioned, we listen to this debate podcast with them, and it's an adult show, it has nothing to do with kids, you know, really complicated topics. And we just play it and pause it and ask the kids what they know and what they think about it.

[01:26:13]

You know, this morning... so, I've been driving my kids to school every morning, and these are just, like, random examples, tactical examples.

[01:26:24]

We first came upon a bunch of kid podcasts that we listened to, but then I realized, like, oh, my kids like listening to the adult podcasts just as much. So we started doing The Daily, the New York Times daily podcast, and they do a really good job of having kind of deep dives on topics. This morning it was about Brexit and Theresa May and the whole situation going on there.

[01:26:51]

And so we just listen to The Daily on the way to school and talk about it.

[01:26:55]

What do you talk about? Like, what does that look like? It's like, you're saying what's going on, that's the first level. You know, then it's like, OK, why are they wanting to leave? Why do you think they want to leave the European Union? You know, like, what are their options? Why is she resigning? You know, all that kind of stuff.

[01:27:13]

Just trying to, like, pick apart things that we know more intuitively as adults, or that we're told in the podcast in an adult way, and trying to make sure they understand it and the logical chain, like, OK, well, how did they get to this decision point? And do you sort of, like, nudge them toward multiple perspectives, like, how does the EU feel?

[01:27:35]

And, yeah, that's why I like the debates in particular, right. Because you get really intelligent people talking on both sides, right. And you get, like, well, what do you like about their argument versus their argument? And, well, what about what this person said versus that person said? Yeah, agreed.

[01:27:49]

And so are you ever, like, trying to resolve things, or are you just leaving it open? Totally open. Yeah. And do you follow up? Not really. Maybe we should follow up, I don't know.

[01:28:03]

And what are the other things you do, sort of, maybe around the dinner table or with your kids, to get them thinking in different sorts of ways that schools don't teach? Yeah. Well, I'm really lucky, we just switched schools, and I really love our kids' new school. So I feel happy that our school may actually start to do a lot of the things that we want to do.

[01:28:27]

So I'm more encouraged by the kids' school at the moment. But my kids are two very different kids.

[01:28:36]

So there are different strategies for each. My oldest is really into movies. And so one entry point with him, and I also like movies a lot, is we've just been watching a lot of movies and then talking about them in the same way.

[01:28:48]

Movies are great because the production value, the thought put into every minute of a good movie... it's a really, really well-produced thing. There's a lot going on there that you can, like, talk about, and they're also very entertaining. So we've been watching a lot of movies and kind of talking about them, all different kinds. My youngest is seven, right? Yeah, seven. He's really into math, and he's gotten into programming a bit, and into watching, like, science podcasts and things like that.

[01:29:22]

I mean, science channels, like, on YouTube. Is he doing, like, Scratch programming?

[01:29:28]

Yeah. They're in, like, these Johns Hopkins correspondence courses. So he took a Scratch one this last semester, and now he's taking a cryptography one. And my seven-year-old, yeah, he's going to come out with, like, a blockchain app. He's definitely more advanced than I was at this age. I'm super jealous of the opportunities they have. Oh, yeah, it's great. And, yeah, my oldest is in a Python one.

[01:29:54]

But he more wants to talk about, like, the YouTube videos he's watching, you know. And so, I guess what I'm trying to say is, I try to engage with them at what they're interested in. Right, I do the same thing. You try to bring it back to them in terms of whatever they're interested in, but relate the concept to it.

[01:30:12]

Exactly. Let's talk about that. The other thing we did relatively recently is I put on all of Crash Course. You know it? It's really good. It's with PBS, but two brothers basically made these online animated courses, with them talking, talking heads, on economics and biology and chemistry and film and all these different things, on YouTube. What was that called again? Crash Course. Again, it's really well done. I put them all on DVD, and so we drive to school every morning.

[01:30:51]

Right? It's like thirty minutes. And so, you know, not every day do we do podcasts; sometimes they watch Crash Course. You know, I do the same thing. I guess it's all the same kind of general concept: find really good content.

[01:31:01]

Right. And then talk about the content with them in a way that gets them reflecting. Yeah, I guess it's all the same technique, really, whether it's a video or a course or a movie or a podcast, you know. It's like, find something really engaging that you can talk about, and relate things back to it.

[01:31:16]

What made you switch schools? You said you switched schools. What was the criteria for the new school, and, like, what made you leave?

[01:31:22]

We were at a public school, and it was generally a good school.

[01:31:29]

I don't know what it's like in Canada, but in the US it's state by state. They have gifted programs, and they had something called a GIEP, which was like an individualized education plan for the gifted kids and so on. But things were just moving really slowly. They were bored a lot. Our youngest, who's really kind of good at math, you know, he skipped kindergarten and then skipped another grade in math and then tested out of that grade.

[01:32:00]

And they didn't really know what to do with him. And he was just kind of bored, I guess. Yeah.

[01:32:06]

And so this new school, it's kind of brand new, it's only four years old. It's a school for gifted education, totally, but it's not like you have to be super smart to get in; they're willing to take a lot of people who, I think, are willing to just engage with the ideas. And it's more project-based learning. And so they do two things I like. One is they don't waste any time during the day.

[01:32:34]

It's just kind of like they do way more. And so they have all the regular stuff, but they also have coding every day, a project class every day. So cool. And then in regular things, like science or something, they're doing a project related to that thing. And they also kind of force the kids to do it, and they fail if they fail, right. They don't, like, coddle them with it.

[01:32:54]

Right.

[01:32:55]

And then they have all these other kind of random things, like, you know... Do they do things to, like, build resilience? Because, as you know, for a lot of gifted kids, things come very naturally to them, and then when they're faced with a struggle... like, what do they do to build grit or resilience?

[01:33:12]

The main thing they do is let them fail at things and just make them try again until they get it. And I think that goes a long way, seemingly, although we're relatively new, so... Well, and are the kids liking it?

[01:33:26]

The kids love it. Yeah.

[01:33:27]

Because they're both, like, super creative and love projects.

[01:33:32]

And so, like, half the day is doing different projects. Like, Eli's in the fourth grade; they're building an escape room in their project class for the semester. They split the fourth and fifth grade, which actually runs like a middle school, into two groups, and they're making an escape room for each other. And they have to, like, build it and then test it. Yeah. And so they're just doing that.

[01:33:58]

And he loves that kind of stuff. Oh, that's so cool. And so I'm excited for them. I wish I'd gone to that school. Oh, my God.

[01:34:05]

But seriously, we didn't start programming in school until grade twelve, and it was, like, Pascal.

[01:34:13]

Yeah, I mean, I'm thirty-nine, so, yeah, we're the same in that. That was exactly my experience as well.

[01:34:20]

But it was great, because, I mean, I started much earlier than that, so the coding class became, like, super easy.

[01:34:26]

It was a great way to get my grades up for university. I'm with you.

[01:34:29]

That was my path as well. Listen, this has been a phenomenal conversation. I want to thank you for taking the time, and maybe we can follow up.

[01:34:38]

Yeah, love to. Hey, guys, this is Shane again. Just a few more things before we wrap up. You can find show notes at farnamstreetblog.com/podcast. That's F-A-R-N-A-M-S-T-R-E-E-T blog dot com slash podcast. You can also find information there on how to get a transcript.

[01:35:04]

And if you'd like to receive a weekly email from me filled with all sorts of brain food, go to farnamstreetblog.com/newsletter. This has all the good stuff I've found on the Web that week, that I've read and shared with close friends, books I'm reading, and so much more. Thank you for listening.