Transcribe your podcast
[00:00:00]

Welcome, welcome, welcome to Armchair Expert, Experts on Expert. I'm Dick Rather, and I'm joined by Princeton Mouse. Welcome, Dick Rather. That's a rather rough name. I can't believe I've made it this far with that name.

[00:00:14]

I had to overcome adversity after adversity. We have talked a bazillion times since we both saw The Social Dilemma, which is on Netflix. We really were affected by it. And that, of course, became a gateway drug to doing Rabbit Hole, the New York Times podcast, which we loved. And so we had an opportunity to talk to Tristan Harris, who was front and center in The Social Dilemma. And we think you will enjoy him immensely. Tristan is a computer scientist who spent three years as a Google design ethicist, developing a framework for how technology should ethically steer the thoughts and actions of billions of people from screens.

[00:00:51]

He is now co-founder and president of the Center for Humane Technology, whose mission is to reverse human downgrading and realign technology with humanity. He is really the head honcho on this topic. He is. Yeah, he is the — what would we say? The vanguard. He's the vanguard.

[00:01:10]

He really is. He was the first one out there really sounding an alarm. Yeah. And we are all indebted to this huge ethical, moral compass. So please enjoy Tristan Harris. We are supported by Sleep Number. On the mattress I slept on last night, I got a beautiful night's sleep. My Sleep Number is eighty right now, which means that my body is healed up nicely, and my SleepIQ score was ninety. Whoa, I love it.

[00:01:38]

You know, I've been going to bed a bit earlier and waking up earlier, and I'm loving it. It's great. Now, quality sleep is more important than ever as we head into winter. Not only is it a natural immunity booster, but it also helps with energy and recovery. When we sleep, our brains begin a highly orchestrated, intricate dance of complex activities that fuel our waking life. Throughout the night, we cycle in and out of two stages of sleep: slow-wave sleep and rapid eye movement.

[00:02:00]

In fact, the brain is doing four times more work than the heart at night. The brain can process complex stimuli while asleep, and it uses this information to help you make decisions when you're awake. Learning, dreaming, flushing out harmful toxins. The space between brain cells increases during sleep, which allows the brain to flush out toxic molecules that have accumulated during the day. Discover the Sleep Number 360 smart bed for proven quality sleep. And now, during the Ultimate Sleep Number Event, save 50 percent on the Sleep Number 360 Limited Edition smart bed

[00:02:28]

for a limited time, only at Sleep Number stores or sleepnumber.com/dax. We are supported by Bob's Red Mill. So, Monica. Yeah, in my Cedar Mill recording with Aaron, Scott Johnson joined, and he just started pontificating on how much he loves Bob's Red Mill because of listening to the show. And he's incorporated it, and it keeps him full into the evening. It's so good. It's also so easy. All I've got to do is fill the little container up with some hot water.

[00:02:58]

Put the lid on for a minute, pull it off. Bingo, bango, bongo, I'm having a beautiful breakfast. What's really exciting, Monica, is that Bob's Red Mill has this really, really great one-to-one baking flour that's gluten free. And the holidays — they are knocking at the back door. They sure are. So I'm going to have the family chef baker using a lot of Bob's Red Mill so we can have all my treats this holiday season without triggering any gluten intolerance.

[00:03:23]

I urge everyone to learn more about lots of awesome and delicious products and recipes at bobsredmill.com/dax. And while you're there, be sure to enter for a chance to win an exclusive Bob's Red Mill prize pack and an Armchair Expert T-shirt. That's bobsredmill.com/dax.

[00:03:57]

Hey, Dax, how are you doing? Good, how are you doing? Good. You know Monica, Monica Padman. Hey, Monica, how are you doing? Good. So excited to have you. We've been wanting to have you on for years, I think. I think this is years in the making. Yeah. Excellent.

[00:04:12]

I'm just excited to meet you, talk to you and go through some of the many amazing things you are an expert on. First of all, I'm curious, where are you?

[00:04:21]

I am in Scottsdale, Arizona, at the moment because believe it or not, my house and everything that I owned burned down in the recent Santa Rosa fires.

[00:04:29]

No way. Oh, no. Yeah. So I've lost everything. Oh — do you normally live in Santa Rosa?

[00:04:36]

I normally live in the North Bay. I had moved up to Santa Rosa, which was my deceased family's home, during the quarantine and pandemic. And then these fires three weeks ago just took everything. So... Oh, my God.

[00:04:49]

Awful. No, but it's also happening right in the middle of this film coming out. And so the most important thing I can do for the world is just keep focusing on the film.

[00:04:59]

So that's where my attention is going — that being The Social Dilemma. Yeah.

[00:05:04]

OK, great, great, great. I wanted to make sure that I wasn't in the dark about another film. We have seven films out right now. Oh my God. You're like a mini studio. A mini-major. Santa Rosa is Charles Schulz country, right?

[00:05:17]

That's exactly right, yeah.

[00:05:18]

Did any of that stuff burn down? Didn't he have, like, an ice rink he had built, and he used to have a Christmas pageant and stuff?

[00:05:24]

I think they did. And I don't really remember everything there. The airport is named after him, the Charles Schulz Airport.

[00:05:30]

And there are cute little life-size figurines of the Peanuts characters, if I recall.

[00:05:34]

That's right, all around the center of downtown. I think I have a picture of my then-two-year-old hugging the Snoopy statue. Yeah. Oh, yeah.

[00:05:43]

Where did you grow up? Did you grow up in Santa Rosa? I grew up originally in San Francisco.

[00:05:47]

And when I was about 11, my mother wanted to move up to Santa Rosa to fulfill her dream of riding horses and being near the state parks. So I went to high school up in Santa Rosa and then later back down the bay to Stanford for college to study computer science.

[00:06:02]

And what was happening career-wise for Mom in San Francisco? Or are you, like, a second-generation technology person?

[00:06:09]

No, the opposite, actually. I was the only one in my family who really even touched a computer. And I was just, you know, obsessed, as a child born in the year of the Macintosh would be. I was born in 1984, so I didn't see the Macintosh ad. But what's really interesting, to loop the story back around, is that the co-founder of our nonprofit, the Center for Humane Technology, is Aza Raskin, whose father, Jef Raskin, invented the Macintosh project — or started it, at least — at Apple.

[00:06:34]

Oh, really? Yeah. And my life has been largely defined by, I think, the impact of the Macintosh on my early childhood.

[00:06:42]

And yeah, that's kind of an interesting place to theoretically start, which is prior to the Macintosh. And again, I'm not super savvy about this, but I will say: what used to be seemingly kind of a utilitarian pursuit of people who liked making simple programs on these early computers — with the Mac comes, in my opinion, a culture or a movement. It, like, transcends that space a little bit. Was that your experience with it? Did it feel like more of an emotional connection to that thing?

[00:07:13]

It actually did, yeah.

[00:07:15]

I didn't know any of the lore behind the Macintosh — you know, who Steve Jobs was, or Andy Hertzfeld or Bill Atkinson and some of these characters who, you know, invented and put all of this culture and soul into this computer. But it's funny, because not even knowing that history, I felt this kind of weird, almost mystical connection to the creativity behind it. Yes, it really was — you know, as the saying goes, and as we say in the film, the Macintosh and these computers were a bicycle for our minds. They would be giving us kind of leverage for the conceptual creativity and capacity that we could have as human beings.

[00:07:49]

And that's a really optimistic view of technology, obviously, that I still believe is possible. We just went astray with these business models that kind of pulled us into this mini dark ages.

[00:07:57]

Well, and then to drill down a little bit on what you just said, which I think could be useful: yes, the bicycle is a tool — you do a great job of describing that. Meaning, a tool — a rake, a lawn mower — sits in your shed until you want to use it to perform some task that it's going to assist you in. And that is not how we interact with these tools, our smartphones, our computers. They are actually engaging us at all times.

[00:08:24]

Right. We didn't wake up necessarily with a game plan of like, you know, let's try to get five, six hours on that thing today. No, no. It handles that for you.

[00:08:32]

Yeah, that's exactly right. I mean, it's important to state — if you go back to the Marshall McLuhan kind of theories of what is media, what is technology — it's an extension of thinking and action. So all technologies are extending and reshaping the way we make sense of the world and the kinds of actions that we might take. So a bicycle is going to change your basic sense-making and choice-making, the menu of options that might occur to your mind.

[00:08:53]

Where do I want to go today, or what do I want to do today? Because your bicycle extends that range. So it's important to say that first. However, as you said, a bicycle doesn't have an agenda about what it wants from you, right? Right.

[00:09:04]

Well, it wants you to lubricate the chain, keep the tire pressure at the optimum level.

[00:09:10]

I guess at some existential level it wants to survive, but it doesn't have any means of doing that except by being able to make you care about that. But, you know, a bicycle doesn't have an avatar, voodoo-doll-like model of each of us that uses trillions of data points to figure out, predictively, how do I get you to drive more to places like McDonald's and less to places like parks and playgrounds, because I need to make money from you using me in a very particular way.

[00:09:36]

And that's the difference, as we say in the film, between a kind of tools-based technology environment and an addiction- and manipulation-based technology environment. That's the thing that's really changed, and that we talk about in The Social Dilemma: the business models of Facebook, YouTube, TikTok, Twitter, LinkedIn, Instagram, et cetera, all depend on us using them in very particular ways that involve hours a day of screen time. But that's not really the core harm — I'm sure we'll get into this more.

[00:10:03]

It's really the erosion of the life support systems that make a society work. Because we need our mental health. We need a common view of reality. We need shared facts and truth. We need to have a basic compassion for each other. And each one of those dimensions of our society is something that is not inside the business model's interests. And that's really why we have to change the broader system.

[00:10:25]

All right. Let's go back to the bike analogy. So, yes, the bike changes culture in that you used to only be able to have lunch, you know, a mile away — whatever you're willing to walk; now going five miles away is an option. So because the bike has no agenda, you'll just end up at some coffee shop or some restaurant, and you'll interact with people there, and they'll be of all varieties, because it hasn't selected for anything.

[00:10:49]

And then you might like the food or not, and then you may ride your bike to another place, but you are generally in charge of: oh, I didn't dig what was going on at that coffee shop, but I kind of liked what was going on at that one. You're still in the driver's seat. Whereas the AI bike is going to take you to a coffee shop it knows you'll like, because it's so, so good at predicting what you will like, because it has memorized every choice you've made over the last ten years online.

[00:11:14]

Right. I think people lose sight of this: we don't have a hundredth of the sense of what we are like that our computer does, because we don't have the capacity to even remember all the times we made decisions. But having seen all of our decisions accumulated over ten years, it is very easy to recognize a pattern within that, if you can see it from that thirty-thousand-foot view, right? So we know, 90 percent of the time, if you're making a choice to eat at three p.m. and you didn't eat breakfast, we know pretty much you're going to the shittiest place imaginable, and so on and so forth.
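To make that thirty-thousand-foot-view point concrete, here is a minimal sketch in Python. Everything in it — the log format, the choices, the numbers — is invented for illustration; it's just frequency counting over a long history of logged decisions, the crudest version of what a real recommender would do.

```python
from collections import Counter, defaultdict

def build_habit_model(choice_log):
    """choice_log: iterable of (hour_of_day, choice) pairs, e.g. (15, 'fast_food')."""
    by_hour = defaultdict(Counter)
    for hour, choice in choice_log:
        by_hour[hour][choice] += 1
    return by_hour

def predict(by_hour, hour):
    """Return the most frequent past choice at this hour and its empirical probability."""
    counts = by_hour[hour]
    if not counts:
        return None, 0.0
    choice, n = counts.most_common(1)[0]
    return choice, n / sum(counts.values())

# Ten years of 3 p.m. decisions make the pattern obvious from above,
# even though the person can't remember any individual choice:
log = [(15, "fast_food")] * 900 + [(15, "salad_bar")] * 100 + [(8, "coffee")] * 1000
model = build_habit_model(log)
print(predict(model, 15))  # ('fast_food', 0.9) -- "90 percent of the time..."
```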

[00:11:51]

Right. And so the bike is now telling you what you like and predicting what you will like. And it will rule out ever giving you something you don't like, which is so often how we grow and evolve: we don't think we like a French restaurant, but a friend drags us, and then, lo and behold, escargot is not bad. It shouldn't be good, but by God, someone made me try it.

[00:12:12]

Now I like it. That's exactly right. And then the question is also: what is the perceptual tool that a bike uses to look at us, to know what we like? Because, for example, let's give the bike an eyeball, and let's call that eyeball "time spent." So the only way it knows what we like is the amount of time that we spend somewhere. Now, let's say there's a certain place that we could go, like the Conspiracy Theory Cafe.

[00:12:34]

And whenever we go to the Conspiracy Theory Cafe, you spend five hours there talking to mind-blowing people who will tell you about aliens and government conspiracies and COINTELPRO and MKUltra. It's fascinating, and it keeps you there for hours and hours and hours. And let's say the bike doesn't have any other way to know what you like except by the time that you spent.
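A hedged sketch of that "eyeball called time spent," with made-up places and numbers: if dwell time is the system's only perceptual channel, the Conspiracy Theory Cafe wins the ranking no matter what the rider would endorse on reflection.

```python
# The system's ONLY signal about you: average minutes you dwell at each place.
avg_minutes_spent = {
    "park": 40,
    "french_restaurant": 90,
    "conspiracy_theory_cafe": 300,  # five hours of mind-blowing talk
}

def recommend():
    # Maximize predicted time spent; nothing else about the person is visible.
    return max(avg_minutes_spent, key=avg_minutes_spent.get)

print(recommend())  # conspiracy_theory_cafe -- time spent is not the same as what you want
```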

[00:12:53]

It's erroneous data, in that if its goal is to give you what you want — what you want isn't necessarily what you spend time doing. So first and foremost, I hope — that would be arrogant to say, but it would be great — if it had gotten to you how often we talk about The Social Dilemma. We just absolutely loved it. It terrified us. And we've also found a couple of great companion pieces that go along with it. I would argue Feels Good

[00:13:15]

Man is a great kind of companion piece. Have you seen that documentary? No, I haven't seen it. No. It's about how the artist who created Pepe the Frog had his cute little creation — which he had done many comic books and shorts with — just get owned by the alt-right, and what role Pepe the Frog had in the last election, which is so fascinating. So that's a really good, detailed account of how this stuff can work and drive people further and further and further in an increasingly fundamentalist, militaristic, you-name-it direction.

[00:13:53]

And then have you heard Rabbit Hole, the New York Times podcast? Rabbit Hole — yeah, I know Kevin very well, too. Oh, OK, great. We loved it. Seeing your movie and then those other two, for us, caused critical mass, where we're just like, oh my goodness, wow, wow, wow, wow, wow.

[00:14:13]

And I got to give you a lot of credit, because I do remember when you first hit the scene giving interviews, I imagine it was six, seven years ago. When did you, quote, defect from Google?

[00:14:25]

Yeah, well, the first internal presentation at Google that I was sort of calling a moral call to arms was back in the twenty twelve, twenty thirteen time period. But the first TED Talk and the 60 Minutes piece, which were probably the biggest public exposure, were in twenty seventeen. So that was about three years ago.

[00:14:42]

OK. And I remember at that time seeing you and thinking many things. One: wow, this guy must have some integrity, because he's most certainly said goodbye to many millions of dollars by doing this. That was kind of my first thought — I'm very financially driven and fearful, so that was my first thought. Second was: oh, wow, he's explained that we've transitioned into an attention economy, which at that time was a bit of an abstract premise to me.

[00:15:06]

And I thought, well, how does one monetize time spent? It's still going to be a weird economy. But sure, in the Yuval Harari sense, it's all a story, and whatever we buy into works. So, OK. It wasn't until I noticed Netflix started publishing — as a result of how good their platform was — how many hours had been spent watching Adam Sandler movies. Right. It was an insane amount of hours. I can't remember now, but it was in the hundreds of millions, or maybe even a billion hours, whatever it was.

[00:15:32]

I thought to myself, oh, wow, now I get it. Adam Sandler has a value that's very quantified, because a billion hours have been spent staring at him. And I'm like, OK, he was right. That was about a year and a half after I first heard you talk. And again, I still didn't understand the full implications of what you were saying. And I can't imagine that, A, you knew the full implications, and B, that you could have even told us at that time, because I didn't even understand the attention economy.

[00:15:59]

So backing all the way up to you, working at Google as an ethicist. Right. You're in charge of what the ethics of the products are.

[00:16:06]

Is that an accurate description?

[00:16:08]

Well, yeah. I mean, just to make sure I don't overstate my level of authority: I was concerned about how you design products that are going to influence two billion people's attention, whether you want to or not. Like, you know, you have the power of a god, like Zeus, but if you bump your elbow, you might scorch half the earth if you're not really thoughtful about, you know, where the attention really goes. And there hadn't been a role or a name for what I was researching, which was called, at the time, design ethics.

[00:16:35]

How do you ethically design for two billion people, sort of steering the nervous systems and the kind of Paleolithic cognitive biases that we have buried in our brains? How do you ethically push those buttons? Because no matter what choice you make — if you use a red notification color versus a blue notification color — you're going to get very different kinds of outcomes in how the human nervous system reacts. And so really, the role became kind of a new field of design ethics.

[00:17:01]

How do you ethically design based on a more vulnerability-centric view of the human animal?
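As a toy illustration of why "no matter what choice you make" is true — this is not Google's actual process, and the numbers are invented — here's the bare A/B test behind a decision like notification color. Whichever arm wins, the designer has pushed a button on the nervous system; there is no neutral option.

```python
import random

def simulate_user(color):
    # Assumed effect, purely for illustration: red triggers more reflexive opens.
    open_rate = 0.22 if color == "red" else 0.15
    return random.random() < open_rate

def ab_test(n_per_arm=100_000):
    """Measure the open rate per notification color across simulated users."""
    return {
        color: sum(simulate_user(color) for _ in range(n_per_arm)) / n_per_arm
        for color in ("red", "blue")
    }

random.seed(0)
print(ab_test())  # e.g. {'red': ~0.22, 'blue': ~0.15}: different outcomes either way
```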

[00:17:06]

Yes. And while you were at Google, you at a certain point wrote this document — it's a very Jerry Maguire moment, I would say. And as you came in to work, you noticed it was on many people's computers, and people were really talking about it, and they were all saying, oh my gosh, I'm so glad someone said this. We, too — these are problems we've created that are now in our homes.

[00:17:30]

I look at my phone much too often. I'm addicted to it. In your case, I think you realized: well, I'm very addicted to my email.

[00:17:37]

I can't stop checking it and interacting with it. Which is funny, by the way, because a lot of people in the film didn't understand that line — like, who in the world is addicted to their email? Because most people just get junk. But maybe if you're from Silicon Valley, big time.

[00:17:49]

Yeah, yeah, yeah. And it just doesn't end. It is a 24/7 news cycle. It's an obligation list, right?

[00:17:57]

You're always behind in getting back to someone. And so if you don't actually have, in your own mind, a way to conceptualize what it means to be complete, then basically I've obligated you for the rest of your life, because you're never going to get back to all those people. And that's the thing I was also seeing at Google back then: the growth rate of what was going to happen, because at a place like Google, you actually get bombarded with emails from thousands of people.

[00:18:19]

You get just so much information. It felt like kind of getting a front row seat to the future where you're seeing, oh, my God, what is this going to look like when everyone's getting thousands of emails and thousands of notifications? We better do something now to protect people who are going to be coming along with us. We're handing these phones out to the developing world. Yeah. What are we going to do to the collective attention of two billion people?

[00:18:41]

So that's kind of where the concerns came from. It felt like an opportunity: we could change this before it got too bad, and we could really make a difference. And Google was in this unique position — because of Android, and because of Chrome as a browser, they kind of were shaping the rules of the attention economy. We could make a choice, almost like a government or a city can make a choice, about whether there are sidewalks and stop signs and traffic lights for attention — or do we just sort of let it run?

[00:19:08]

There are no speed limits, there are no stop signs, everyone just goes as fast as they can. And it's a race to the bottom for who can, you know, get wherever they want as fast as possible, no matter if you crash into kids or whatever along the way. And that's kind of what the initial effort at Google was about.

[00:19:22]

So what happened is there was much fervor at Google amongst the employees, and it had been brought up to the higher-ups numerous times. And you at that moment thought, wow, this is going to really change things, and I'm very excited about this, and I'm glad it was received this way. And then virtually nothing happened. Right. It just kind of passed, like any super hot-button topic in the news cycle, where it's like, that's all we care about for thirty-six hours, and then we don't, because there's too much other shit to care about.

[00:19:52]

Correct. Yeah. I mean, think about how you hold the attention of a company to say that the attention economy itself is important. Well, there are no external forces driving an agenda that says, hey, this is the number one thing we've got to work on. And, you know, I do want to really give credit to Google for some of their generosity in letting me even try to work on this for several years, because there were a couple of executives who especially carved out space for me to even do that research.

[00:20:17]

But it's really unfortunate that we weren't able to get anything really done. And that's what led me to realize — you know, there's always this question: do you change things from the inside or the outside? You could be inside of Exxon, in charge of a billion-dollar budget for oil exploration and how much investment goes into renewables. You know, if you could just change Exxon by one percent, that would make this huge difference, because Exxon is massive. Or do you go to the outside, to Extinction Rebellion and to Greenpeace, and try to change how Exxon operates by putting pressure on it?

[00:20:45]

And the same thing was true at Google, right? Like, I was there on the inside, and it was so seductive: if I could only change it by two percent, you know, then you could change how Android works for a billion people — that would change the world. That would be incredible. And that's why I stayed for two and a half or so years trying to do so. But I really realized that it took this external movement, both with 60 Minutes back in the day,

[00:21:06]

later TED Talks, and now with The Social Dilemma — which, by the way, we just heard the Netflix numbers, I think two days ago: The Social Dilemma was watched by thirty-eight million households in the first twenty-eight days, which I think has broken records for documentaries on Netflix. And if you assume that many families watched the film together, that's over 40 million — forty-five, maybe 50 million people. So it's unbelievable, the kind of global public attention that this issue now has.

[00:21:32]

And just to loop back to the story: 60 Minutes, that piece with Anderson Cooper about brain hacking — really talking about the addiction aspect of the attention economy — is what led Apple and Google to launch these digital well-being screen time features. When you look at your phone, it shows you how long you spent on different apps, how many times you checked your phone. Those things changed because of this external pressure. So it was really a demonstration that you needed the external pressure.

[00:21:56]

And in response to having seen you on 60 Minutes, which is my all-time favorite show, I had to go out and find an actual app that would keep track of how much time I was spending. At that time, that wasn't even an option from the manufacturer.

[00:22:10]

Well, not only was it not an option, they were actively denying that option. Right. Like apps were trying to get made, but they were not letting them in the marketplace at that time.

[00:22:20]

Yeah, that's right. There's an app, actually — strangely, Tim Kendall, who's in the film, the former president of Pinterest, who also brought the business model of advertising to Facebook — he's that insider who's in the film The Social Dilemma — he actually now runs a company called Moment, which was one of those first companies that did let you screenshot your battery page and would reverse-engineer how much time you were spending on different apps. And this is a perfect example, by the way, of why we need the technology companies to take responsibility, because we can't build our own solution to this problem.

[00:22:51]

It's sort of like saying: we're building a nuclear power plant in your neighborhood, and by the way, if you have a problem, you have to go design and sew your own hazmat suit in case something melts down. Yeah, that's actually what's inhumane about our current system — we have systemic problems and we put the responsibility on you, the individual. Just like, you know, BP saying, you know, we really care about climate change.

[00:23:11]

So we built this carbon calculator, so now you can calculate how much carbon you're using — as opposed to: we need to change our practices and accelerate the transition.

[00:23:18]

Yeah, that's a great analogy. Right. What are you going to do, create your own car at home? Create your own everything? OK, I was just going to point out the irony — I love that this was stated in the documentary — which is that it is both utopia and dystopia. So the notion that the problem is getting slowly addressed on the platform Netflix is ironic. Right. And I think it became clear in Rabbit Hole: while certainly there are some nefarious folks, whatever, in general, I don't think there were malicious intentions with many of these things.

[00:23:56]

I think many of these algorithms were created with a goal in mind that did not seem bad, but then, just in practice, turned out to be something else that no one could have foreseen, or just did not foresee. And I'm also a little bit sympathetic to these companies, because you want them to own it, and yet they don't want to be liable. This is where I think there's a little bit of an issue in our justice system: there's not a lot of latitude for companies to acknowledge their errors and then try to rectify the situation without having to deal with punitive damages.

[00:24:30]

Right. It's a little bit of a catch-22 for these companies. And I do think many of these companies are led by very ethical people.

[00:24:37]

You're getting to exactly the right point, which is: let's say you find out that your product causes mental health problems for teenagers. Well, are you ever going to proactively go out and say, hey, we're working on reducing the mental health problems for teenagers? You're never going to say that. And then also, working on that problem means fundamentally changing the success metrics of your company. Say you're Instagram, and the success metric is time spent, and high time spent is correlated with, you know, teenage depression — which it is, obviously — isolation and self-harm and teen suicide.

[00:25:07]

There are some horrible metrics that have gone up with the increased use of especially the teenage apps. You're not going to be able to admit that that problem exists. You know, the famous writer Upton Sinclair said you can't get someone to question something that their salary depends on them not seeing. And I think you talked about how, in many cases, these were unforeseen consequences. But then later on, they were known consequences that were not dealt with.

[00:25:33]

And I actually think this is why the companies secretly kind of rely on outside pressure — like the film The Social Dilemma, like 60 Minutes, like many of these really hardworking activists I know who've been in this space for a long time, the civil society groups who are screaming at the top of their lungs: look what's happening in Myanmar, look what's happening in the conspiracy theory correlation matrix in Facebook groups, look what's happening in YouTube recommendations.

[00:25:55]

You know, it's these groups pushing from the outside who are driving, I think, some of the most change. At the same time, I want to say: I know, and we at the Center for Humane Technology regularly work with, many of the insiders who are leading the various key products and features inside these companies. And we find really good-hearted people who are trying to do the best they can. Yeah, but they have kind of created a Frankenstein, where once, you know, you've steered, let's say, 50 million people into QAnon or other fringe conspiracy theory groups,

[00:26:25]

the damage has kind of already been done. Right. And we can get more into some of those aspects, which I think are really the most existential. You know, there are many different societal ills coming from technology, but the real breakdown is our inability to have faith or trust in a shared information environment — to believe the same things and to even interpret reality in the same ways. Because, as you know, with your brain, once you have a hammer, everything looks like a nail.

[00:26:49]

And technology has, sort of like Moses, parted us into two different hammers-seeking-nails infrastructures, where we have, you know, polarized societies around the world, because Facebook profits more the more that each of us has our own individual, personalized Truman Show reality, as opposed to one shared reality that we can actually have a conversation about.

[00:27:08]

Yeah, because in the shared reality, you're going to get information you don't find pleasing, and you're going to get information that challenges your assumptions. And most people aren't going to spend a lot of time having their assumptions challenged. It's kind of antithetical to what we enjoy about the Internet.

[00:27:26]

Stay tuned for more Armchair Expert, if you dare. We are supported by Hims.

[00:27:34]

Hims is a wellness brand for men. Hair loss: it's a problem, and one I experienced when I was about twenty-eight years old. As I've told you many times, shooting the pilot for Punk'd, I saw the back of my hair and thought, what the hell is happening back there? This is not uncommon: sixty-six percent of men start to lose their hair by age 35, and once you've noticed thinning hair, it can be too late. The solution: forhims.com.

[00:27:53]

It's a one-stop shop for hair loss, skin care and sexual wellness for men. It's time to write a new chapter, one in which you have hair. Now, I use Hims: I get my pills to keep my hair, I get my shampoo, and it's all working beautifully. As you can see, I still have hair. Thanks to science, baldness can be optional. No more awkward in-person doctor visits or long pharmacy lines.

[00:28:12]

Just answer a few quick questions, a medical professional will review, and if they determine it's right for you, they can prescribe you medication to treat hair loss that is shipped directly to your door. Today, Hims is giving you their best offer yet: if you're not happy with your results after ninety days, Hims will give you a full refund. And right now, armcherries can get their first visit absolutely free. Go to forhims.com/dax. That's forhims.com/dax. Prescription products require an online consultation with a health care provider who will determine if a prescription is appropriate.

[00:28:40]

Restrictions apply; see website for full details and important safety information. Remember, that's forhims.com/dax. We are supported by Brooklinen. Our friend just got Brooklinen.

[00:28:53]

She just ordered some Brooklinen sheets, and she used the armchair promo code, which gave her a great discount. As you know, their sheets are impeccable — they're beautiful, they're luxurious. I love being in them. But they also have products that take you from your bed to your bathroom. They've got comfort covered, and this holiday's no exception: gifts that are soft, gifts that are cozy, gifts that give you the feeling of complete and utter serenity.

[00:29:17]

And lucky for you, Brooklinen's biggest sale of the year is coming soon. There are always people on your list who are tough to please. Their selection really takes into account different needs and preferences when it comes to bedding, towels, you name it. They've got candles, silk masks, scrunchies and robes. So cozy up to Brooklinen's biggest sale of the year, happening right now. Brooklinen is so confident in their product that all their sheets, comforters, loungewear and towels come with a 365-day warranty. Get huge savings and free shipping during Brooklinen's biggest sale of the year, only at brooklinen.com.

[00:29:46]

That's B-R-O-O-K-L-I-N-E-N dot com, and use promo code ARMCHAIR to let them know our show sent you. That's brooklinen.com, promo code ARMCHAIR. And if you can't wait for the sale, or if you're just hearing this and it's post-Black Friday and Cyber Monday, you can still use the promo code ARMCHAIR at brooklinen.com for 10 percent off and free shipping in the US anytime.

[00:30:17]

So I want to quickly, if I can, in layman's terms, just explain kind of what happened with YouTube, as I understand it. In Rabbit Hole, they get this great person who volunteered to be a part of it. And I will say, in general, when I hear about a QAnon person, I'm seeing them at the end of the line, right? If they're very alt-right now — white nationalist, misogynist — I'm meeting them as that person.

[00:30:40]

And it's hard for me to imagine that they might not have been that person five years ago. And so this guy turns over his entire viewing history, and you can just watch him be led, you know, a little bit at a time, further and further away from where he started. And he starts as an environmental science major — he's probably, you know, centrist at worst or something. And just because the algorithm had figured out that it's not enough to give people what they want, you also have to give people something new.

[00:31:15]

And they know if you enjoy this thing, you'll enjoy this incremental shift over to the right. Now, again, none of this, I don't think, was malicious or had a bad intention, but the algorithm, once it takes off, just gets so perfect at knowing exactly where to bring you to get you to spend more and more time. And this guy kind of wakes up as a really far alt-right misogynist, an extremist who is sympathetic to white nationalist movements because "they have a right to be proud of themselves," you know. And he just wakes up and goes, whoa —

[00:31:50]

Oh, my God, who am I?

[00:31:53]

Yeah. Can I give you two more examples of that? Yeah. It's important to rewind the clock and realize that, you know, we're now more than 10 years into this mass psychology experiment where three billion people's thoughts have been wired up to supercomputers that are steering what we all watch and look at. And as you said, no matter where you start, if you imagine a spectrum where on one side you have the calm Walter Cronkite sort of section of YouTube, calm, rational discourse, thoughtful, slow, maybe a little bit boring, but, you know, some kind of shared water cooler reality.

[00:32:22]

On the other side of the spectrum, you have Crazy Town, or extreme town: you have conspiracy theories, anorexia videos, insanity — you know, whatever is going to keep you glued. No matter where you start — you can imagine you could start even in the calm section, or you could start in Crazy Town — if I'm YouTube, along that spectrum, what's the next set of videos I'm going to recommend to you? Am I going to recommend you towards the calm section, or am I going to recommend you towards Crazy Town?

[00:32:46]

It's always going to tilt the floor towards Crazy Town. So imagine these platforms taking the entire floor of humanity and just tilting it by three degrees. Several examples of this: if you were a teenage girl and you landed at a dieting video on YouTube — and maybe you typed that in, right, you started searching for diet videos — then YouTube, trying to calculate for that right-hand sidebar... which, we should lay out one more fact for people.

[00:33:09]

What percentage of YouTube's traffic comes from that right-hand sidebar of recommendations — like, if you had to guess — versus what you initially searched for?

[00:33:18]

Yeah, versus what you search for, what you pick out at the very beginning, right?

[00:33:21]

Well, I'm skewed because I listened to Rabbit Hole. Yeah, tell us. Tell us. So you know. Well, I don't even know exactly. I just imagine it's in the 80s or 90s or something.

[00:33:29]

So the last time YouTube released this number — because I think they got scared by how good it was; they used to be very proud of it — it was more than 70 percent of the watch time. Of a billion hours a day, that's 700 million hours controlled by what a machine is recommending. So if I told you, you know, AI is controlling the world — well, AI is controlling the minds that are controlling the information flows that go into each of us, that then make up the choices we make downstream.

[00:33:56]

So it's kind of a mind control happening way upstream. And that might sound salacious, but I'm sure we can defend the legitimacy of the claim that these things really have taken control. And as you said, you know, if a teen girl watches this dieting video, on the right-hand side, what does it recommend? Anorexia videos, or what are called thinspiration videos, because they're very good at keeping teenagers' attention — for those people who look like they clicked on those.
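Here's a schematic, with invented numbers, of the "tilt the floor three degrees" dynamic: if the ranker's objective is expected watch time, and content slightly farther toward Crazy Town holds attention slightly longer, every recommendation hop drifts the viewer the same direction, regardless of starting point.

```python
def predicted_watch_minutes(p):
    # Assumption for illustration: extremeness holds attention a bit longer.
    return 10 + 30 * p

def next_video(position):
    """position: 0.0 = calm Walter Cronkite end, 1.0 = Crazy Town.
    Candidates near the current video compete on predicted watch time."""
    candidates = [min(1.0, max(0.0, position + step)) for step in (-0.1, 0.0, 0.1)]
    return max(candidates, key=predicted_watch_minutes)

position = 0.2          # starts near the calm section, like the environmental-science major
for _ in range(8):      # eight autoplay / sidebar hops
    position = next_video(position)
print(round(position, 2))  # 1.0 -- the floor tilts one way no matter where you start
```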

[00:34:20]

And this is the equivalent of driving down the 5 freeway in L.A. You know, according to YouTube, we should give you what you pay attention to: while you're driving down and you see a car crash, everyone's eyes — literally, in every single car, their eyeballs veer to the right — they see the car crash. So according to these tech companies and the AI, everyone loves car crashes; we should just feed you a bottomless supply of car crashes.

[00:34:42]

And this is the exact way these systems work. You know, for parents who, I think, right now are forced to have YouTube be the new babysitter: you sit your kids down to maybe watch some videos, and they're watching World War Two videos or something, and then, next thing you know, it recommends Holocaust denial videos, because those were really good at keeping kids' attention. And then they come to the dinner table saying the earth is flat, the Holocaust didn't happen.

[00:35:05]

And you say, why does the world feel like it's going crazy everywhere all at once? It's really because these things, for ten years, have been steering us, just slowly, bit by bit, into this crazy view of reality. I'm saying this — I know it can sound frightening or dystopian — but it's really important, because if we can all collectively see that this has happened, we can snap our fingers and say we can wake up from this trance.

[00:35:26]

Right now we have a very artificially inflated view of our polarization, of how much we should believe conspiracy theories, because this is what it's been doing. Yes. Yeah.

[00:35:34]

I also just want to make a little bit of an analogy as a writer of movies. First and foremost, let's acknowledge that we are storytelling machines — that's how we pass on all the information that kept us alive. So we have this great, great capacity to retell and to understand stories. Now, a principle in writing that is undeniable is that you will never see a movie with a car chase at the beginning and then a second, lesser car chase in the middle.

[00:36:01]

The second car chase will always be longer, more spectacular, and have more explosions. And then the third car chase, at the end, has to be the biggest, craziest one, because you have to overdeliver — you have to raise the ante each time, or we, as story followers, will lose interest. You can't go backwards. So that's another way in which these YouTube videos are hacking into something very primitive in how we think, how we communicate, and what we're attracted to.

[00:36:30]

I have just one part of this that I would love to say out loud, which is: I don't think I'm unique in believing that I'm in charge of my opinions, and that I'm in charge of what I think is just and unjust, ethical or not. I think we all have a sense that we're really anchored in a self, and we all dramatically underestimate just how persuadable we are. There's this kind of faux arrogance we all have that we know who we are at our core and that we couldn't possibly be led astray.

[00:37:05]

And the fact is, that's just not the case. Right. You had many classes on persuasion at Stanford. You know, could you just kind of give us a sense of how fallible and vulnerable we are to persuasion? Before we get into the fallibility and vulnerability — which we always talk about, and which is really an important frame —

[00:37:22]

I mean, the biggest, most obvious example of us not choosing our own thoughts, the contents of our own minds, is the fact that we speak in a language that we didn't choose to speak. I don't speak English because I chose to; I speak English, in the accent that I have. There's actually a great New York Times quiz where, if you just answer twenty-five questions about which words you use for various objects in our environment, it'll actually pinpoint the exact zip code that you live in, because the word choices are that specific to our geography.

[00:37:52]

Yeah — literally, with something like 20 questions about which words you would use for various objects, like what you call the dividing line in a street, or what you call the access lane, these kinds of things, you can actually pick out the exact place that you live. Which speaks to the fact that, you know, we are all operating with accents and dialects of language that we didn't choose.
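For the curious, the quiz's logic can be sketched in a few lines. The tiny "profiles" below are fabricated stand-ins for the real dialect-survey data, but the principle is the same: score each region by how many of your word choices it shares, and pick the best match.

```python
# Fabricated regional word-choice profiles (the real quiz is built on survey data).
profiles = {
    "bay_area": {"road_divider": "median", "highway": "the 101", "soda": "soda"},
    "socal":    {"road_divider": "median", "highway": "the 5",   "soda": "soda"},
    "midwest":  {"road_divider": "boulevard", "highway": "I-94", "soda": "pop"},
}

def locate(answers):
    """Return the region whose profile agrees with the most answers."""
    def score(region):
        return sum(profiles[region].get(q) == a for q, a in answers.items())
    return max(profiles, key=score)

print(locate({"road_divider": "median", "highway": "the 5", "soda": "soda"}))  # socal
```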

[00:38:10]

We just grew up in the spaces that we did. But on the persuasion front — you know, my background originally, as a kid in San Francisco with my mother: she would take me to the magic shop, and I was really fascinated by magic, because even as a child, you could do something where adults didn't understand how the trick worked.

[00:38:28]

And I found that fascinating. You did a really impressive bit of coin work, and we enjoyed that. I love — she's so horny for magicians, it's crazy. I love magic. Yeah. We had one at my birthday this year.

[00:38:39]

Yeah, that's amazing. I feel so lucky to have gotten to know certain magicians I've had such a huge, you know, magician crush on for a while — like Apollo Robbins or Derren Brown — over the last couple of years. It's just astonishing when you really see the level of craft: a magician knowing something about how other people's minds make meaning that other people don't know about themselves. That's what makes the illusion work — the fact that there is an asymmetry of knowledge.

[00:39:06]

The asymmetry of knowledge is that I know something about you that you don't know about yourself; if you knew the thing that I know, then the trick doesn't work, right? And magic is a very visceral experience of being influenced. You know, maybe it's been a while since a listener out there has seen a magic show, but I was actually recently at the Magic Castle down in L.A., before coronavirus hit, and, you know, even being a magician myself, you're still just blown away by the craft of what these people can do.

[00:39:33]

And later, I studied in a class at Stanford, in the Persuasive Technology Lab — a persuasive design course — which originally was really about how to use persuasion for good. How do you embed persuasion into technology? For example, could you help people build habits for exercising or for cheering up their classmates? This is BJ Fogg's class.

[00:39:55]

We interviewed him — he's a sweetie pie. We like him. He's wonderful.

[00:39:58]

And there's a false narrative out there that somehow we are enemies or something. Oh — he spoke highly of you.

[00:40:03]

He said there were, like, two people he was most proud to have had in his class.

[00:40:09]

One of them is you, and the Instagram guys. You guys were in the same class, right?

[00:40:13]

And how bizarre that is. We were. I was in the class with the co-founder of Instagram, Mike Krieger, who's been a very close friend of mine, and there's actually a photo of the three of us at South by Southwest in 2011, which felt like a very historic photo somehow when it was taken. You know, just to complete that story: Mike and I, before he built Instagram, worked on a project together in that persuasion class called Send the Sunshine, which was about — you know, in periods of bad weather, people get something called seasonal affective disorder.

[00:40:46]

I have it — self-diagnosed, but I definitely have it. I have it today. It's cloudy here.

[00:40:50]

I think hers has a biological component, having come originally from India — a lot of my other Indian acquaintances have suffered from it at a disproportionate level. This is an armchair anthropological observation, but please continue.

[00:41:04]

Well, you know, our environment really matters in terms of how uplifted we feel, from the cleanliness of our room to the weather outside. So this project — and keep in mind, this is back in 2006, even before the iPhone, so it's kind of a futuristic idea — was basically: let's say you have two phones, two feature phones, and the weather service just knows that you live in a zip code where you've had bad weather for five or six or seven days in a row.

[00:41:29]

What it does is it texts your other friend and says, hey, your friend Dax has had bad weather. Do you want to take a photo of the sunshine? Because we know you're in a zip code with good weather. Send some sunshine to him. And so you get this view of this kind of puppeteer who's trying to orchestrate something really positive — you know, you're trying to orchestrate compassion or love or thinking-of-you kind of vibes.
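A sketch of the Send the Sunshine logic exactly as described — feature-phone era, so think SMS. Every type, threshold, and function here is a hypothetical stand-in, not the project's real code.

```python
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    zip_code: str
    phone: str

@dataclass
class Weather:
    bad_streak_days: int
    is_sunny_today: bool

BAD_WEATHER_STREAK = 5  # days of gloom before the system nudges a friend

def check_and_nudge(user, friends, weather_by_zip, send_sms):
    """If `user` has had >= 5 straight bad-weather days, ask one friend in a
    sunny zip code to photograph the sunshine and send it along."""
    if weather_by_zip[user.zip_code].bad_streak_days < BAD_WEATHER_STREAK:
        return
    for friend in friends:
        if weather_by_zip[friend.zip_code].is_sunny_today:
            send_sms(friend.phone,
                     f"Your friend {user.name} has had bad weather for days. "
                     "You have sunshine -- want to take a photo and send it their way?")
            return  # one orchestrated moment of compassion is enough

dax = Person("Dax", "90001", "+1-555-0100")
pal = Person("Tristan", "85251", "+1-555-0101")
weather = {"90001": Weather(6, False), "85251": Weather(0, True)}
check_and_nudge(dax, [pal], weather, lambda phone, msg: print(phone, msg))
```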

[00:41:50]

Right. And so this is an example of persuasive technology that could be used to help bring people a little bit closer together. You might even say — to make sure we're charitable and even-handed with Facebook — that the birthdays feature is reasonable. Right. I mean, to be able to know there's this one day you get to celebrate your friends, and it's helpful to have computers remind us and tell us what those days are. That's persuasive technology. And you start to realize that persuasion is kind of everywhere; our world is a choice architecture.

[00:42:16]

You know, the height of shelves in a supermarket controls what we look at and how often we buy some things over others. You know, the fact that when you're at the end of a grocery store line, there's that last bit of chocolate and sweets and gum and things like that. You know, end caps. End caps, yeah. And so we are already living in physical choice architectures, physically persuasive environments. But now, if you put the phone in our hands, this phone totally reshapes the menu of choices that we are picking from.

[00:42:45]

And it provides, in any moment of anxiety or boredom, an instant excuse. You always have something sweeter on life's menu you can choose, which is to look at your phone rather than be with yourself. When we had Thich Nhat Hanh — the famous spokesperson for world peace and for mindfulness, a Zen Buddhist monk from Vietnam who, I believe, passed away recently — we brought him to Google. And the way he described his concerns about technology is that it's never been easier to run away from ourselves.

[00:43:15]

Yeah, because these things provide an infinite, you know, excuse for the quick hit of pleasure or novelty, which is way more satisfying than the discomfort of whatever is plaguing us, you know, when the lights go down.

[00:43:28]

The stupid example I give is: my wife and I went to this hotel for two days, and I just said on the way up, I'm not going to look at my phone for two days. The first day, I was bored, agitated, grumpy. The second day, I found paper and some pencils, and I drew all day, and I loved it. And I was like, oh my God, when's the last time I just sat and drew? And it was, like, all right.

[00:43:52]

And I was shocked at my own level of being satiated by the never ending blinking machine.

[00:43:58]

Yeah, I had a friend who created something called Camp Grounded, which was actually a three-day experience that revolved around unplugging from your phone. But it was much deeper than that. When you arrived at this camp, there were three people in hazmat suits who would come up to you and take your phone and put it in a plastic sealed bag along with your driver's license. And they would do this whole ritual to kind of take it away. And then they give you a new name for the weekend.

[00:44:22]

So if your name is Tristan, my name would be, say, "Presence" for the three days that I'm there. And the entire time you're there, everyone has a different name. You can't talk about work. There's no technology, and there's no time — they also take your watch away. Time is also an interesting thing, a technology that really restructures our relationship to pressure and attention. So you don't have a watch, you can't talk about time, you can't talk about work, you don't talk about age, and you just have a new name.

[00:44:50]

And so you kind of get to see who am I when everything is just taken away. And it's very uncomfortable and people have these brief moments of kind of feeling anxious and everything.

[00:45:01]

But then it becomes this kind of freeing human experience where — orgy? By day two, I have to imagine you started fucking like the first original humans.

[00:45:10]

Boy, that's an interesting — you know, it didn't go in that direction. Maybe a different redwood forest would have steered people differently. But it's actually amazing, because when there's no competition for attention, everyone cares a lot more about each other and gets to be a lot more curious about each other, because there's nothing else.

[00:45:29]

The other people are the source of entertainment at that point. It turns out people are really interesting — and people actually have the patience, not just one way, but both people have the patience, to get curious about who each other are.

[00:45:42]

To your point earlier. Yeah, you don't have this endless bit of homework, which is your email, all the responses you owe, and you don't have this pressure of what's supposed to happen at this time and that time. Yeah, there's a lot that goes away right there completely.

[00:45:56]

Yeah. And this experience was created by Levi Felix, who was an amazing human who unfortunately passed away a few years ago from brain cancer. We were actually the same age. He created these weekends at Camp Grounded, where hundreds of people had totally life-changing experiences. And it wasn't just from the disconnection from technology; it's about reconnecting with what it means to be human, including being in a redwood forest with people.

[00:46:20]

Your next story you tell, if the person dies at the end, that'll be the third strike. OK, so that was back to back. No more. The Buddhist monk. The Buddhist monk!

[00:46:31]

Yes, yes, yes, yes, yes. I forgot about that. Yes. Or at least tell a couple where the people are still thriving. OK, so I have some thoughts.

[00:46:41]

I watched it this morning while I was working out, and just one sentence — again, I want to repeat it, it's so profound that people should really ask themselves about it — which is: if you're not paying for the product, you are the product. I feel like that's the easiest way to assess what's happening. It's kind of like the old poker table saying: if you can't recognize the fish, you're the fish. That's a powerful way to think about this. Right. And I've heard people who concentrate in your field talk about this kind of fork in the road we had, which was: we made a decision to either pay for this service or to receive it for free from other people who would be footing the bill, which would be advertisers and this and that.

[00:47:26]

So we didn't choose the paid version. Of course, I'm in a position where I could pay for the paid version, and I would like the paid version. But it is inherently a little undemocratic, isn't it? If that were the model, it would just exacerbate the income inequality. How would we get around that? Is that part of what a potential solution could be — one that we picked that had a set of ethics, and then we just had to pony up and pay for it?

[00:47:51]

Yeah, well, maybe first, for the audience, it might be good to explain why, when we are the product, this is such an unaffordable outcome. One thing we like to say is: free is the most expensive business model we've ever come up with, because when the product is free, as you say, we are the product being sold. And specifically, Jaron Lanier is very quick to point out in the film that the subtle, imperceptible change in our beliefs and behavior and identity — that is the product.

[00:48:21]

So the ability to shape just by one percent or zero point one percent what we're thinking, feeling and doing, that ability to shift someone is the very thing that's being sold.

[00:48:32]

Don't go further for one second. That's a big concept that I want to let set in. And yes, I think a lot of us have a kind of baseline understanding that we're the product, in that we're being advertised to and we're going to go spend money and buy these products. But that's incomplete, is what you're saying: there is something more dangerous on the table, which is, not only will you buy this product, but I can change your mind about this product — whatever that product may be; it could be a political product, it could be anything — and the real value for anyone that would invest in this is to lead the world somewhere they would want it to be led.

[00:49:15]

That's right. Yeah. The ability to shift minds.

[00:49:17]

Now, when I say this, a lot of people might think, well, I'm not influenced by advertising. Right. Not just some people — almost everyone believes that they're not influenced. There's something called the Dunning-Kruger effect: everyone believes that, you know, they're better. I think 90 percent believe that they're better-than-average drivers. But of course, that distribution doesn't work out.

[00:49:35]

Also with Dunning-Kruger, the stupidest person talks the most, right? That, in general, is what they've found. Yeah, there are so many aspects of the ways we overestimate our capacities. But this is critical: it's not just about the advertising. Facebook's incentive is that you are more profitable when you are addicted, anxious, polarized, attention-seeking, outraged and disinformed than if you were actually a thriving citizen of our democracy, because in each of those cases, you're spending time on the platform, right?

[00:50:07]

If you are not addicted, you're not worth as much as someone who's addicted. If you're not anxious in a way that causes you to check in more often with an unconscious habit, you're not as profitable as someone who is anxious and checking in. If you're not attention-seeking, if you don't care about the number of people who looked at the video you posted and how many comments it got, you're worth less than if you are attention-seeking.

[00:50:28]

So in each of these cases, we are worth more if we're domesticated into a certain new kind of human. You can think of it as just like we domesticate cows. I'm sorry if this feels disgusting to people, but I think it's important to really make sure the metaphor lands: just like we don't have regular cows or wild cows anymore, we have the kinds of cows that give us the most milk and the most meat, because that's the kind of cow that fits our economy.

[00:50:53]

Well, we are becoming the kind of human that has shorter attention spans, that is more addicted, more distracted, more outraged, more polarized, more attention-seeking and more disinformed, because each of those things is part of what makes the companies more money. Again, not because anyone at Facebook twists their mustache and says this is how we're going to make all this money, but because of the machines, the supercomputers that they point at our brains to calculate what thing they can show us and how quickly they can show it to us.

[00:51:21]

Should it autoplay? Should it wait five seconds, or should we let people hit retweet instantly, or should we have them wait and meditate on a mountain and then share only when they know it's true? All the worst aspects of society are more profitable than the best aspects of society, which is actually what puts each of us in the same boat as human beings. The good news is no one wants this system, because it's an economy based on cannibalizing our own life-support systems, the things that make us really human.

[00:51:49]

Whether you're a Republican or a Democrat, whether you're a child or an 80-year-old, each of us is in the same boat together, because it's really preying on the lower-level human instincts in each of us.

[00:52:00]

Well, and when we're suffering, we start seeking comfort. And generally, when suffering and feeling fearful, we seek comfort in the quickest, easiest, most disposable ways, which of course then creates even more suffering. And, you know, it's like the sugar spiral people get in, or how a lack of exercise leads to less exercise. We seem to always be either spiraling up or spiraling down. That's exactly right.

[00:52:28]

And I think a humane technology world is one in which you get virtuous cycles, where more virtuous behavior creates more virtuous profits, creates more virtuous habits, creates more virtuous ways of living, creates a more virtuous society. And so what we're looking for, as you said, is not this sort of downward spiral of human downgrading, where the more the machines get smarter at predicting our behavior, manipulating us and making us more predictable, the more money they make.

[00:52:53]

So they get bigger machines that can better predict our behavior, and then it's sort of a vicious loop: they gain increasing power, and we become more predictable. Let's reverse that, so that humans become more free, more wise, more thoughtful, more virtuous, and the machines are more in service of us, as opposed to using us as the resource to be mined.

[00:53:11]

Here's where we get into some big philosophical issues with all of this, which is that we live in a capitalist market economy. How do you incentivize that? How could that ever be incentivized?

[00:53:24]

Well, I think this now gets to the question you were asking about, you know, hey, we could have paid subscription models for these products. Like, how much have people, your listeners, paid for a Facebook account recently? Nothing. But Facebook is worth seven hundred and fifty billion dollars, because obviously our attention is the thing that's being strip-mined for profit. So what if the business model instead was subscription, and we paid ten dollars a month, let's say?

[00:53:48]

Yeah. You know, it's important to say that basic phone service is not free either; we have to pay for access to a network. Yeah. So this is really not a radical idea. You know, with Netflix and peak entertainment, the rise of, say, Game of Thrones, we get Game of Thrones because we pay for subscription television, instead of the kind of race to the bottom for clickbait.

[00:54:10]

So what we're really paying for is a world that we want to live in. Now, obviously, any time you have a paid-access economy, it introduces inequalities, and we want to make sure that we're equalizing. However, to say that we should keep what we have now in the name of equality is like Coca-Cola saying, well, how else are we going to give the entire world diabetes if we don't, you know, use sugar?

[00:54:31]

Right. They're delivering... Coke's going to come after you, come on, you're on their hit list. We're delivering the wrong product.

[00:54:37]

We have to make sure that we're not delivering garbage that downgrades human civilization so that we can't actually survive as a species. We want to deliver, ubiquitously and democratically, the kind of self-reinforcing, virtuous, humane technology that actually has our interests at heart.

[00:54:54]

I thought of something today while pounding the bench press and listening to your movie: has anyone thought about this technique? There seems to be an agreed-upon analysis of happiness, in that the UN always rates Sweden as number one, and God knows what the last-place one is. There seem to be some pretty dependable...

[00:55:15]

...metrics. We can measure happiness, right? And we can measure suffering; we can measure, on some level, flourishing. So why isn't there an independent government agency that studies what a year on a platform results in? Let's say that on every one of these platforms, Facebook, Twitter, Instagram, one year from the day you joined, you would be asked a series of questions, and then they would give all of these different products a rating. So: people on this platform got 30 percent more miserable.

[00:55:49]

If I see that as I'm signing into Facebook, versus I go to Instagram and, let's say, I'd get two percent happier, I feel like that would be the equivalent of putting ingredients on labels and listing caloric intake and fat and all that. No one has a sense of whether this thing's going to make them more miserable or not, and I trust that people wouldn't pick a product that's going to make them more miserable. How could that work, or work in tandem with something?
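A minimal sketch of the rating idea being floated here, with entirely made-up survey numbers: each platform surveys users at sign-up and again a year later, and the published label is just the average change.

```python
# Illustrative sketch of the "well-being label" idea: survey users at
# sign-up and one year later, publish the average change. All names and
# numbers here are made up for illustration.

def wellbeing_label(signup_scores, one_year_scores):
    """Scores are 0-100 well-being survey results for the same users."""
    deltas = [after - before for before, after in zip(signup_scores, one_year_scores)]
    avg = sum(deltas) / len(deltas)
    return f"{avg:+.0f} point average change in reported well-being after one year"

print(wellbeing_label([70, 60, 80], [50, 40, 60]))  # "-20 point average change ..."
```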

[00:56:15]

Yeah, well, this is definitely a line of thinking that people have gone down, and it's a really important one. I think we have to make sure we're setting the right goal for what our objective is. If the goalpost is happiness, the brave new world that Aldous Huxley conceived of, where everyone has soma and pleasure, you know, a world where we're all just getting happy, is not a world where we necessarily solve climate change, deal with racial injustice or, you know, make a more equal and fairer society.

[00:56:41]

Obviously, a world where Facebook is delivering happiness in that way might just be kind of amusing ourselves to death, or sort of distracting us. So I think what we need to make sure of is that our technology environment, if we set the right goalposts, conditions humans to have the kinds of capacities that can solve our most pressing problems, including having the well-being, mental health, relationships and connection that enable us to be full and thriving. That would require people to trust science, which, more and more...

[00:57:09]

There are many, many metrics people are a part of whose purpose is community, you know, not just the "I'm on two hits of ecstasy and you gave me a lollipop" measure of happiness. But again, if there was one we could agree upon that was truly about flourishing, using all the cognitive-behavioral things we now know... and sorry, please continue.

[00:57:30]

Yeah, that's exactly right. I mean, you quickly kind of reverse-engineer Maslow's pyramid and you end up with, OK, how do we make sure technology is enabling the kind of full, holistic thriving, flourishing and embodiment of our wisest values? Then you quickly enter this kind of abstract philosophical conversation about what it means for us to be wise, and how do we know that we're flourishing? And can Facebook, or should we even wait for technology, to deliver on all those benefits?

[00:57:55]

Shouldn't humans also have the capacities, intrinsic in themselves, to achieve their own ends, and so on?

[00:58:02]

Can I interrupt you for one second? Because it's a point you made earlier in the documentary: it occurred to you at one point, wow, not only are two billion people being affected by this, but they're ultimately being affected by a handful of 25-to-30-year-old white males. And so there is, in anthro we'd say, naive realism: a tendency to believe that your hierarchy, your Maslow pyramid, in your culture is one that would transcend cultures, when in fact, if you're an anthro person, you know that's not true.

[00:58:34]

So, yeah, when you're designing something, my suggestion is that implicit in it is the assumption that other cultures would want our Maslow pyramid.

[00:58:44]

And that's exactly what we've seen, a kind of digital colonialism, especially since these companies, especially Facebook with its Free Basics program, have gone into countries like Myanmar or Ethiopia and made a deal with the telecom providers. So if you were getting your very first phone and you've never been on the Internet, your phone comes with Facebook. That's the deal they do with the telco provider. And you get Facebook access free, but everything else costs money.

[00:59:09]

And this creates an asymmetry where now Facebook has crowded out competitors and alternatives, and people's first experience with the Internet is Facebook. There is no learning about security or how to type in usernames and passwords; all of that goes by the wayside. And Facebook has colonized them with its view of reality, with, you know, white males mostly in California choosing the way everybody else lives. And that has had enormous negative consequences.

[00:59:38]

And you say in the doc, they think that's the Internet. They think the actual Internet is just Facebook.

[00:59:44]

That's exactly right. And it's important, because in some of these countries where they do these Free Basics deals, like Ethiopia, there are at least six major languages. And I'm pointing out Ethiopia because many people are looking at it as kind of Myanmar number two, meaning the second at-risk country, because of fake news that's spreading and driving up civil conflict. So you end up in a place with all these different languages, and do you think that Facebook has content moderators for the six major languages of Ethiopia?

[01:00:15]

For the hundreds of dialects on different continents, do you think they have content moderators for that? So if you think YouTube recommending some conspiracy theory, you know, some Holocaust-denial thing, in English is bad, at least there YouTube takes the whack-a-mole stick and whacks some of the bad apples, because they have content moderators who are paid to do that in that language. Now open it up to Facebook managing 80 elections around the world in a given year, and you have hundreds of languages and dialects, especially in developing countries that don't get nearly the level of attention we're now seeing in the U.S. with the US election, where there are obviously war rooms set up within Facebook and Google to try to deal with adversarial threats.

[01:00:54]

We're entering a period where this is an unprotected infrastructure in which the bullies tend to win, the worst of us. The hate, the outrage, the accusations, the black-and-white thinking, the incivility, the dehumanizing kind of speech. That's the stuff that gets the most clicks, which then makes up the default information environments of these places. I'm sorry to be so dark, by the way.

[01:01:15]

It's important that people get, I think, the real world outcomes of this. That's right.

[01:01:20]

And it's much bigger than addiction and time spent. Right. I mean, we're talking about the fabric and survivability of every society.

[01:01:28]

Stay tuned for more armchair expert, if you dare.

[01:01:33]

We are supported by Square. Just as we're all trying to adapt to everything going on this year, small business owners everywhere are having to figure out new ways of doing business. If you run a business and are thinking about shifting online, Square can help with tools like a free online store. Set it up in minutes so your customers can shop from anywhere, with pickup, delivery or shipping. Once you're up and running, Square can help you promote your new online store with tools for social ads and email marketing.

[01:02:02]

Everything works together, all from one place. You just need a Square account to get started. See all the ways Square can help at square.com/go/dax. That's square.com/go/dax. We are supported by Better Help. If you think you may be depressed, or you're feeling overwhelmed or anxious, Better Help online counseling offers licensed professional therapists who are trained to listen and to help with issues including anxiety, grief, depression, trauma, anger and more.

[01:02:32]

Now, as you guys know, I'm a raging addict. I also have some childhood issues, and nothing has been more helpful than talking with some outside, objective folks who can help me navigate this. That includes, for me, therapy. It includes 12-step. Monica, you too are under the guidance. Absolutely, couldn't live without it. Now, all you have to do to get started with Better Help is fill out a questionnaire to help assess your specific needs and then get matched with your counselor.

[01:02:59]

In under forty-eight hours. You can easily schedule secure video or phone sessions, plus exchange unlimited messages to communicate with your therapist at your convenience. Everything you share is confidential. Better Help is an affordable option, and Armcherries get ten percent off your first month with the discount code DAX. Get started today at betterhelp.com/dax. Talk to a therapist online and get help.

[01:03:32]

Yes. And so you just kind of touched on one of my fears, when I have a pessimistic view of all this, which is: wouldn't it be impossible to have human ethical oversight of any of this?

[01:03:45]

Because the volume of information and of users is just so great that no company or government agency could ever monitor it. Only the A.I. can handle the volume and process it. And so I hate to be nihilistic about it, but I just don't know, the volume is just so great. I compare it to the Department of Homeland Security: with the Patriot Act, they start gathering all this data on all these phone calls and whatnot, and people are worried about it.

[01:04:13]

But in practice, who gives a shit? No one's around to synthesize all of it. No one can go through all this data; it's just collected for the sake of collecting it. There's no human apparatus that could possibly monitor it.

[01:04:27]

Yeah. So you're getting at the paradox that's possibly underneath all of this, which is that the whole premise of managing our attention and our information with technology is that we automate it, meaning we have machines choosing what we see, because it would cost a lot of money to hire editors, hundreds of them, paying them a hundred thousand dollars a year, to pick what's true, credible or real. That's the whole premise of the business model.

[01:04:58]

One thing I might share that I don't think I've talked about in any other interview is that I actually worked with Jimmy Wales, the founder of Wikipedia, for a period earlier in my life and career, when I was just finishing computer science at Stanford. I worked with him on his for-profit Wikipedia spinoff called Wikia. And Wikia was the idea that they would do Wikipedia, but more like, if you went to a library, you would have the encyclopedia section.

[01:05:24]

That's Wikipedia, but then you have the rest of the library and the magazine rack, meaning people writing their own content, just like Wikipedia, where you have regular people filling in not just articles that are encyclopedic, but maybe the Lost wiki or the Star Wars wiki. And the premise of why I'm bringing this up is that this was an attractive business model, because we, the regular humans, are the ones doing all the writing for free.

[01:05:48]

Yeah. Yeah, right. Notice that no one who writes Wikipedia gets paid for writing Wikipedia. Yeah.

[01:05:53]

Similar to YouTube as well. Yeah. They're just creating content. Exactly.

[01:05:57]

And so the brilliance of these business models, what makes these companies so profitable, is that there are no editors, there are no content creators or journalists you have to pay, who went to journalism school, have a media-ethics background, double-check things and follow the right journalistic protocol. You can just get 15-year-olds to basically put on makeup in front of their YouTube channels and make ten thousand dollars a month, and that becomes the new normal for teenage girls. So the premise here, as you've said, is: let's say you're Facebook and you've got trillions of items of content rushing through your system.

[01:06:28]

They would have to hire thousands and thousands and thousands, if not millions of content moderators.

[01:06:33]

Can I add one stat in there? Yeah, from the Rabbit Hole one. I forget the exact number, but it's very close to this.

[01:06:39]

A single day of global YouTube viewing amounts to one hundred and fourteen thousand years of content.
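That stat checks out as rough arithmetic if you assume YouTube's widely reported figure of about one billion hours watched per day, which is an outside number, not one quoted in this conversation.

```python
# Back-of-envelope check of the "114,000 years per day" figure.
# Assumes ~1 billion hours watched per day (YouTube's widely reported
# statistic, an outside number not quoted in this interview).

HOURS_WATCHED_PER_DAY = 1_000_000_000
HOURS_PER_YEAR = 24 * 365  # 8,760 hours in a calendar year

years_per_day = HOURS_WATCHED_PER_DAY / HOURS_PER_YEAR
print(f"{years_per_day:,.0f} years of content viewed per day")  # ~114,155 years
```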

[01:06:48]

Well, the principle in computer science here is that there's just way more information causing consequences than there is monitoring of what's inside that information, or of what consequences are being created. It's like a car that's driving faster than your steering wheel and your eyeballs can keep up with: how many people you're running over, how fast it's going, how many stop signs you've bashed into. Your eye works more slowly than your car is knocking things down.

[01:07:16]

Yeah, and that's the problem: Facebook is this kind of Frankenstein where it's creating, again, some positive consequences. Let's remember the organ donors and the blood donors and the families reconnecting, the high-school relationships people haven't seen in 20 years. That's fine. But there's also this unmitigated harm that they really don't have control over. And that's why, I think, the directors and filmmakers called the film The Social Dilemma, because it is a deep dilemma that we now live infused with these social systems that are running our digital habitats.

[01:07:50]

These are the places where we spend hours of our lives now. They're not just products we use; they're the habitats we use to make meaning. You know, how bad is Portland? Is it a war zone of people shooting each other, or is it just two blocks in a city and it's a beautiful, peaceful day? The way we know the answer to that is by looking through the tiny binoculars of social media glasses, right? Yeah.

[01:08:10]

And as these things have infused themselves into our way of making meaning and saying what's going on in the world, if they're causing more consequences than they can ever get a handle on, we have to take much more radical actions about what we can and can't have in our information society, and first of all, as a society, become conscious that this is even happening, which is why I think the film was so important to make. I think that's happening, and I think it's slowly becoming comparable to the awareness that smoking cigarettes gives you cancer. You know, that took decades and it was slow, but people seem to really understand it now.

[01:08:44]

So I do feel like the awareness variable in the equation is happening. My fear is that the horse cannot be put back in the barn.

[01:08:54]

You know, this metaphor of putting the genie back in the bottle, or the horse back in the barn, obviously comes up. There's actually one time in history where we did put, not the genie, but the pill back in the bottle. And this was in the history of Johnson & Johnson, where there was...

[01:09:09]

Oh, Tylenol. Yeah, well, so in the 1980s there was poison being put into Tylenol, whether into the capsules or the jar, I think it was poison tampering. And so some people were dying because of Tylenol bottles carrying these tampered-with, poisoned tablets or something. And Johnson & Johnson had a choice. They could have said there's no problem; deny, deflect, delay; Tylenol is fine, keep buying it, we're sure it's safe.

[01:09:35]

They could have said it's not happening. They could have said, look how much we're doing to try to remove it, you know, sounds familiar with Facebook. And they didn't do that. Instead, what they actually said was, we are going to take this off the shelf until we can prove that it's safe. And they took it off the shelf, I think for something like five or six weeks; you can look it up on Wikipedia.

[01:09:55]

And, of course, what happened was their stock price tanked in the short term. But because they had done a high-trust action, where they were honest with the American public about what was actually going on, and then invented the tamper-proof top for the jar, that's why people trusted them afterwards. So that's, I think, the model we could apply here. So in Twitter's case, we're about to go into a U.S. election.

[01:10:22]

There are groups like Accountable Tech and others that are pushing to ask Twitter to "unfriend October." So basically, just admit that the trending topics list is a completely gameable machine. It only takes a few thousand bots, or a hundred to ten thousand people, to do what's called brigading, where you all simultaneously post about a topic that you want to get the media to cover.

[01:10:43]

This is how the Fyre Festival got popular. They all posted this yellow image or something at the same time. That's exactly right.

[01:10:51]

And so this is very gameable. And so if you want to exert influence, keep in mind: if you were a KGB agent in the 1980s and 1990s, one fourth of your time, one fourth of your time, was dedicated to inventing fabricated stories that would be very plausible and would take down your adversaries. So imagine the kind of practice adversaries have when one fourth of their time is spent coming up with plausible stories that already confirm your existing bias.

[01:11:15]

Have you seen the HBO two-part doc about the troll factory and how they came to be so effective?

[01:11:21]

I haven't seen it yet, but I know he did the one on Scientology, called Going Clear, which I think has tremendous parallels to what we're talking about.

[01:11:28]

Yeah. You know, it was originally basically designed to just alter how people felt about Russia. And then, of course, they found a great application for it in Ukraine and in Crimea. And now they realize they have this incredibly powerful tool, and they just deploy it everywhere. And you recognize just how sinister it is, how effective, how few people are required. That's one of the scariest aspects: there's only a floor full of people doing this.

[01:11:56]

So, yeah, it's highly gameable, as you say.

[01:11:59]

It's incredibly gameable, and it doesn't take that many resources. We just did an interview on our podcast, Your Undivided Attention, with someone from the Institute for Strategic Dialogue, where they actually worked out how much it would cost to run an influence campaign that would reach every single user online in Kenya using, like, Facebook or Twitter. And the answer is less than the cost of a used car; it's about ten thousand dollars. In other words, if you want to run an influence campaign and you're from one of the wealthy Western countries, for chump change you can basically reach the entire world.
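A rough plausibility check on that figure: both inputs below are outside assumptions, not numbers from the interview (an order-of-magnitude estimate of Kenya's internet users, and a low-end social-media ad rate), but they show how the arithmetic can land at about ten thousand dollars.

```python
# Rough plausibility check on the "less than a used car" figure.
# Both inputs are outside assumptions, not numbers from the interview.

KENYAN_INTERNET_USERS = 20_000_000    # assumed order-of-magnitude estimate
COST_PER_THOUSAND_IMPRESSIONS = 0.50  # assumed low-end social ad CPM, in USD

campaign_cost = KENYAN_INTERNET_USERS / 1000 * COST_PER_THOUSAND_IMPRESSIONS
print(f"${campaign_cost:,.0f}")  # $10,000 to show one ad to every user once
```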

[01:12:31]

And that's what we're seeing in many different countries, in Southeast Asia and on the African continent. It's actually Russia, that same troll factory, that has been running campaigns in something like seven major African countries to try to influence public affairs and elections there, and it's incredibly cheap. And I think one way to think about this is: while we in the US are obsessed with protecting our physical borders, building walls and spending something like a trillion dollars to revitalize our nuclear fleet and our physical security, we've left the digital borders wide open.

[01:13:01]

Yes, because so much of our economy and our society runs in the physical world, our physical antennas, our electricity grids, our roads; you know, those are all protected by passport controls, and the Department of Defense will shoot down a plane from Russia or China that tries to come in. But if they fly an information plane into the United States, they're met with a white glove by the Facebook algorithm, which says, yeah, exactly which zip code would you like to target?

[01:13:26]

And they can now get us to, quote, think what they want. Which is far crazier than being invaded. Exactly, it's not as if they're trying to actually bomb us; they get us talking. There's actually evidence from about a year ago of Russia going into US veterans' groups from the Vietnam War; they had pages with more than two hundred thousand followers, where they were able to sow discontent among US veterans. Now, it's the equivalent of this.

[01:13:51]

When you really think of the mismatch, our defense industry, the military-industrial and national security and intelligence communities, have been really struggling with this, because as we have virtualized our country, as we have made our country work primarily in the digital world and less in the physical world, we've essentially lost all the security protections. It would be like you're a bank, and you spend a trillion dollars hiring physical bodyguards, these super buff guys who surround your entire bank.

[01:14:18]

And meanwhile, you leave your computer system on the default password, that's lowercase "password." You don't change the default password, and anybody can hack in. What matters more, surrounding your bank with the physical bodyguards, or leaving the default computer system wide open to hacking? And that's exactly how our adversaries see Facebook and YouTube and Twitter: the back doors to our country are left wide open for attacking, and they're just doing it all in the culture-war space. And I think, again, as you said, the key here is to recognize, first of all, that this has happened to us. It's almost like we've been bombed by a business model.

[01:14:52]

But this is not a theory; it's not theoretical. This has really happened to us, and we have an urgent need here, not just, you know, a mental-health need or a teenagers' issue or an addiction issue, but a national-security issue, and we have to fix this problem incredibly fast. And to your point, these are the richest companies in the history of humanity. So they could spend billions and billions and billions of dollars, whatever it takes, to try to address these problems. That's what we need them to do.

[01:15:21]

And right now, hiring thirty thousand content moderators between Arizona and the Philippines is not a solution to this problem.

[01:15:27]

Well, it's going to have to be another algorithm that we put some faith in, and then it, too, will have some unintended outcome that we'll have to adjust for yet again. I imagine the capacity can only be met by the technology, ironically. I do want to ask two more philosophical questions. One: there is a Darwinian argument to be made here, which is that if we require all these American companies to do this, all that's going to happen is that the many foreign companies that make these products are going to have a huge advantage over us.

[01:16:01]

And they're going to get way more of our attention, because we can't compete with their algorithms. So, you know, it is a daunting proposition to go first. Right. And I know this is a shitty argument, and it's used in the environmental arena, which is like, well, why would we not have a ton of CO2 emissions if everyone else is going to? We live on the same planet. So some of this is, who's going to go first?

[01:16:22]

And certainly some people will have a Darwinian advantage over us.

[01:16:26]

Yeah, that's exactly right. And ever since the beginning, in 2013 at Google, the concern that we always brought up is that this is a game-theoretic challenge: if I don't do it, the other guy will. You know, the US says we're not going to build semi-autonomous AI drone weapons, and we try to sign a treaty with China saying, you're not going to build the drone weapons either. But secretly I'm going to build them, and you're going to build them, because I don't actually trust you, and you race into a game-theoretic escalation.

[01:16:52]

But at least in that case, you have mutually assured destruction. In this case, let's say you have, like you said, a humane social network that doesn't depend on unchecked virality and the fame lottery and the promise of the sort of attention-seeking narcissism that powers our social media system today. That felt personal, but continue, even if not intended.

[01:17:15]

Then the idea is, can we have a social media infrastructure that is for the people, by the people, that is really in the democratic interest? And it will be heavily competed against by unregulated social media that appeals to you: you'll reach millions more people with it, you'll get way more positive feedback, you'll get Snapchat beautification filters, you know, enhancements. You'll get all of the kind of Las Vegas enhancements that reach deeper down the brainstem.

[01:17:45]

And we can use a real-time example: TikTok. So TikTok is now competing with all these other platforms. And it was on Rabbit Hole: it took PewDiePie a decade to get to one hundred million followers, and now, with TikTok, someone hit sixty million in, I don't know, three months, whatever it is. So again, this isn't theoretical. It's happening currently.

[01:18:02]

That's exactly right. And the curve shortens every year, right? The number of years or months it takes for someone to go from zero to one hundred million followers. The companies are competing to provide that addiction to the fame lottery, right? Because if I can give you that faster, if I'm TikTok, I'll outcompete Instagram: for the same video you would post to Instagram and get a thousand likes, I can say, well, on TikTok you'll get ten thousand likes.

[01:18:27]

They're competing on how much they can give us this inflated sense of social approval. And again, I use these metaphors often to try to create that intentional disgust, so people just say, we don't want any of this; this isn't the system we want to live by. So that we can actually look at a country like Taiwan, where you have an amazing human being who I think doesn't get enough attention, Audrey Tang, who's the digital minister of Taiwan and created an online democratic society, meaning a digital democratic society using online tools that sort discourse for unlikely consensus.

[01:19:00]

So instead of sorting by what gets the most clicks or the most shares or the most comments, what they sort their system by is this: when two people who would tend to disagree on topics converge and align on something, that's what gets upregulated in their sort of democratic feed. Oh, I love that. And they have a transparent government process, which has led to, I think, the country handling covid better than almost any other country.
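A minimal sketch of that "unlikely consensus" sorting idea, loosely in the spirit of the Polis tool used in Taiwan's vTaiwan process; the data and the scoring rule here are illustrative assumptions, not the actual algorithm.

```python
# Minimal sketch of "unlikely consensus" ranking: a comment rises only if
# BOTH of two normally opposed groups agree with it, instead of rising on
# raw clicks or outrage. Illustrative only; not Taiwan's actual algorithm.

def bridging_score(votes_group_a, votes_group_b):
    """Votes are +1 (agree) / -1 (disagree) from two groups that usually disagree."""
    def approval(votes):
        return sum(1 for v in votes if v > 0) / len(votes) if votes else 0.0
    # Score by the *minimum* approval across the two groups.
    return min(approval(votes_group_a), approval(votes_group_b))

comments = {
    "outrage bait": ([1, 1, 1, 1], [-1, -1, -1, -1]),          # one side loves it
    "common-ground proposal": ([1, 1, -1, 1], [1, 1, 1, -1]),  # both sides mostly agree
}

ranked = sorted(comments, key=lambda c: bridging_score(*comments[c]), reverse=True)
print(ranked)  # ['common-ground proposal', 'outrage bait']
```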

[01:19:25]

They have some of the fewest cases. And I really think people should go check out her work, either on our podcast, Your Undivided Attention or on the TED podcast. She has an amazing interview.

[01:19:35]

Does she speak English? Because we would love to have her on. I don't speak Taiwanese.

[01:19:39]

So, yeah, she's actually spent most of her life in the United States; she actually worked on Apple's Siri team. Oh, OK. And what she's doing is really a model for how you battle disinformation, keeping in mind that Taiwan is also heavily under threat of Chinese disinformation. Yeah. So they've really gone through the gauntlet of intense threats and come out saying, here is a model for a large country, millions, tens of millions of people, without a homogeneous culture.

[01:20:05]

There are different tribes, different languages, actually. And they've successfully created an online democratic model that works for the people, by the people, and of the people. And that's what we need to bring to the United States. And it's funny, because this is one of those rare cases where, you know, coming from Stanford and Silicon Valley, you would visit places abroad and people would say, oh yeah, we see Silicon Valley, but we could never do that here in Nigeria, or we could never do that here in Taiwan, or wherever.

[01:20:30]

This is one of those weird things where, when I tell you about this example of Taiwan, you might be thinking, oh, we could never do that here in the United States. But we have to prove that it's possible. And, you know, maybe, I don't think the Trump administration would really do this, and I say this in a nonpartisan way, but I think this would be a great thing for a Biden administration to take hold of: to really say, how do we modernize our democracy to have a safe, secure, nationally protected, digital democratic environment where we don't downgrade our collective citizenry, we don't downgrade our attention spans or our mental health, we don't feed our attention seeking, our narcissism, our polarization?

[01:21:04]

We actually realize what happened, and we reverse course. That's the most optimistic hope I have for what we could do from here on out.

[01:21:11]

Last two questions, I promise. Thanks for all your time. One: I'm also, I guess I'd pat myself on the back and say, a realist about the market. We're also talking about constraining one of the most explosive, wealth-creating and successful sectors of American ingenuity and export. So it is a little scary to think about pulling back what these companies make and the wealth they create, which, of course, disseminates all throughout our country.

[01:21:43]

That seems a little scary just from a financial point of view.

[01:21:47]

Yeah, well, I think the inconvenient truth here, which is actually why that was a great title for that film, is that when we build an economic system that profits from the very problems we want to eliminate from our society, it's untenable.

[01:22:01]

Yeah, and this was also true in history. There's a great book called Bury the Chains by Adam Hochschild about the ending of slavery in the British Empire, because back then the entire world economy was powered by slavery. Yeah, right. Yeah. So if you wake up one day and have the very obvious insight that this is inhumane and wrong, what do you do when seventy-five percent of the world economy is coupled to something you cannot easily decouple from? Well, the inspiring story is that this book, Bury the Chains, describes how, over something like 60 years, the British Empire dropped its GDP two percent per year to decouple its economy from slavery.

[01:22:45]

And it did that without a civil war. And it did that by networking with the Quakers in the US and different groups, and through testimony, pioneering many of the activism techniques that are now being used in the climate movement and many other movements. And I think that book offers a kind of blueprint for, hey, we've done this before. We've actually had a situation where our economy was directly tied to a harm, and I'm not making a moral equivalence here about the content of the harms we're talking about.

[01:23:14]

Now, I think you're saying that Mark Zuckerberg is a slave owner.

[01:23:17]

We heard you loud and clear, and that'll be the headline, hopefully, that accompanies this interview in the clickbait model of the attention economy they have created. That's what would probably get created off of this more nuanced...

[01:23:30]

...interview. But I think that book offers a blueprint for, hey, you know, humanity, we've done this before.

[01:23:37]

Yeah, I'm a pessimist, and yet I can also recognize that every time we think we can't get off of horse-and-buggy production, the car comes along, and then we think we can't get off of that. We always find a way to stay busy and we always find a way to feed ourselves, and it's only gotten better and better despite all these hurdles. Now, this is my last question, because I just had a real juicy argument with my friend Eric, because I'm in favor of many of the takedowns of videos on YouTube.

[01:24:02]

And he, coming from more of a right perspective, is like, well, yes, so basically all of this forward movement, what you would consider good, would generally just be liberal, kind of progressive thought, and we would really just mostly be policing against QAnon and all these things that, of course, I find repugnant. And so he has a point, an unavoidable point: yes, I probably think some videos should stay up that he would disagree with, and vice versa.

[01:24:31]

There probably would be a liberal bent to all this, wouldn't there? If I'm on the right, I might think, well, yeah, I'm going to leave it to Tristan, who I'm certain is going to vote for Biden. So really, all you guys are asking is for us to let you on the left decide what we can all view. Is there a nonpartisan argument for this, or reassurance that you can give?

[01:24:54]

Yeah, well, I think the challenge we're facing here on content moderation is a crisis of trust. Who do you trust to make decisions about what can or can't be broadcast to millions of people? Do you trust Mark Zuckerberg, the individual, to do that? Do you trust low-paid content moderators in Arizona, paid minimum wage, to do that? Do you trust the psychological sweatshops in the Philippines that do content moderation to determine what is actionable content to take down?

[01:25:27]

We don't have a clean answer to this problem. That's kind of the Frankenstein aspect: you have systems that are blasting off dangerous content rockets at exponential scale. And as I said, if you have the power of gods, you have to have the wisdom, love and prudence of gods. You can't be Zeus and accidentally bump your elbow and scorch half of Earth if you don't have a moral compass and good guidance. So I think the problem is we've created the means by which dangerous, viral, intentionally malicious and conspiracy-driven campaigns can actually outcompete truth, and we don't have a way of adjudicating it.

[01:26:01]

So the current standard that platforms use when it comes to foreign manipulation is something called coordinated inauthentic behavior, because oftentimes what Russia or China or Iran will do is not create a new influence campaign. They'll actually just find, let's say, Texas secessionists. They'll say, hey, there's this group of Texas secessionists; those are US free-speech-protected individuals; they've already got a group. But we're going to dial up the free speech folks.

[01:26:30]

I mean, dial up the Texas secessionist folks. And we're going to dial down the "let's everybody get together and make this work" folks. And by changing the dials, it's not really about who speaks; it's about who gets heard, which is the point of the attention economy. Yeah. Freedom of speech is not the same as freedom to reach millions of people.

[01:26:48]

Well, that's what I was arguing with Eric. I'm like, you don't have a right to be on a company's feed. You just don't, any more than you have a right to put a billboard of your political point of view on my home. There are barriers all over the place. The only thing you're protected from is the government limiting your speech, and you can walk out in the street and say whatever you want. But there is no constitutional right to be on other people's platforms.

[01:27:15]

It's just not free speech. That has nothing to do with that. Yeah.

[01:27:18]

And, you know, we're in this difficult situation where, after a foreign actor meddles in a local group and stirs up a culture war, once that culture war has started, they don't have to continue. The culture war now exists, right? So now your country is fighting itself of its own accord, and I can slowly walk into the background and leave the situation, and you're still fighting. And I think that's what's actually dangerous, and why I think awareness is so critical: we have to ask how much of the polarization and conflict in our society exists because we would naturally walk outside and hate each other and fight each other...

[01:27:54]

...and how much of it is that we are now ten years into this mass warping of our social psyche, where we see each other with less compassion and less common humanity, because social media has polarized us so deeply into a narrower and narrower, more certain version of our own view.

[01:28:10]

I sound like a broken record when I do this, but I feel it's so important to always bring this up: it's very easy, because the headline-getters are Pizzagate and QAnon and Wayfair. But make no mistake: if you're on the left and you're liberal and progressive, the Russians were creating fake protests that were attended by people on the left, too. Their only desire is conflict. They don't necessarily have a position; they want discord. That's what their incentive is.

[01:28:40]

So you on the left are every bit as susceptible to being steered deeper into the left as the right is.

[01:28:47]

Can I give you even one more example of what you're talking about? The director of the film The Social Dilemma, you know, he's an environmentalist; his previous two films were about the coral die-offs in the Great Barrier Reef and, in Chasing Ice, the melting glaciers up north. And he found out in the research for the film that one of the groups Russia targeted was actually US pro-environmentalist groups that were anti-fracking, and that Russia was specifically trying to dial up the reach of anti-fracking groups.

[01:29:17]

Why? Because what happens if US anti-fracking groups are successful and we don't frack for oil locally? We have to buy it from them. That's right. And Russia is basically an oil company posing as a country, so that is totally in their interests. So this speaks to the fact that many of the things he had shared and retweeted, he said, that were anti-fracking could have very easily come from Russia. And again, so people really understand: this is not a partisan conversation.

[01:29:41]

There are many more countries in the game now, because this is the new means of geopolitical warfare. It's information warfare. It's incredibly cheap. And we're seeing now Iran, Saudi Arabia, the UAE, Israel and China and many other countries that are now in the game.

[01:29:55]

And for me, I guess the one check valve I'd encourage people to have is this: the moment you've consumed something and your opponent is no longer human in your eyes, or they are evil or a demon, you've probably been pushed past your own point of view and been firmly anchored in us-versus-them. And there's no going forward once you're there, once it's "them" and "us." I think that's the thing people need to monitor: after they've consumed something, if their ire toward the opposition is irrational and those people are no longer human, that's a good warning flag.

[01:30:34]

I think a good measure there is to think back in your life to anything you felt certain about, earlier beliefs where you had a very sharp view that you don't hold anymore. I think that's a good anchor feeling to really lock onto, because none of us should be certain without being curious and open-minded. One of the hidden factors here is not just having access to good information; it's actually being able to trust new information when it comes in.

[01:31:00]

Because right now, you could tell people on the left about something bad that Biden did, but when it's provided, they don't even trust the information if it's negative toward Biden. And the same thing is true on the right. And that's where we've lost it, if we're not even willing to update.

[01:31:14]

Are you in a relationship? I am. OK, right. So I hope you've had this experience. We have kids, so it's like, let's say I claim Tylenol is not bad for kids, and my wife says yes, it is. So I'll search "Is Tylenol not harmful for kids?" and she searches "Is Tylenol harmful for kids?" And by George, we both are vindicated. Almost every debate we have, Monica and I, we game these debates in the manner in which we search.

[01:31:39]

We get exactly what we were hoping for.

[01:31:42]

That's such a good example. You know, if you type in "climate change is not real," you'll get a bunch of results. If you type in "climate change is real," you'll get a bunch of results. And the point is that there are infinite people arguing on both sides. So what we really need is discernment, and almost a lasting score of who the most trustworthy dialectical thinkers are, who are doing synthesis, who are proving they can steel-man the other side.

[01:32:05]

Yes, they actually know the other side's arguments, and they can speak in terms of a dialectic, giving power to both sides and then trying to offer synthesis. And imagine if our newsfeeds were rankings of the most synthesis-level speakers, instead of whoever got the most outrage, instead of winning with black-and-white thinking, which is how it works today.

[01:32:21]

It's so desirable, from our point of view, and it's something we desperately hope to be doing at all times on here: making a real, honest, sincere argument for your opponent. And in doing that, I guarantee you'll find more empathy, sympathy and understanding, because you're forced to make it. And then trust. Yeah. And then they will trust you more. And if they can give you that same courtesy, you will trust them more.

[01:32:47]

Monica, you were just about to... I was just going to say that this synthesis model is actually the antidote to the left-right situation you were bringing up. They're not only going to be able to take down right-wing stuff, because it's going to be the middle-ground stuff that rises to the top.

[01:33:02]

Yeah, that's exactly right. Yeah. And transcendent thinking. So, you know: is Biden, if he favors a climate-change sort of deal, going against the fracking that's done in Pennsylvania? Yes. We have to make sure that we have a real answer. You can't have just a pro-environment, anti-fracking position; you need a thing that protects jobs, you know. But we also don't just want the pro-fracking people who are going to ruin the environment.

[01:33:23]

We need to find a synthesis of how we care for jobs, the environment, children, air quality, all at the same time. Put all the values on the chalkboard and say, what's the solution that gets us all of these things, as opposed to winning by polarizing and gerrymandering people into predefined belief groups, which allows you to win elections?

[01:33:41]

Yes. And another thing: I would want people to immediately admit to themselves that on any one of these very, very hard topics being addressed at a governmental level, or any level, the best option is, at best, going to be a 60 percent versus 40 percent call. This "I'm right and you're wrong" almost doesn't exist in any of these complex conversations. There isn't a fracking right or wrong; there's a sliding scale of trade-offs. How much shit are we buying from Russia?

[01:34:12]

How many oligarchs are we empowering?

[01:34:14]

How many foreign wars do we have to start to make sure we secure our oil supply? Yes.

[01:34:19]

So every issue is incredibly complicated, in a very stimulating way; I mean, that's what I love about it. But anyone who thinks they have a position that is 100 percent accurate is being naive.

[01:34:30]

I think, I think I love you. Also, you're a fucking hero, man.

[01:34:35]

The first few times I heard you, I didn't understand the full scope of what you were talking about.

[01:34:41]

And you're really, if not the vanguard, among the vanguards who are helping us understand the outcome of this ten-year-long experiment. So I thank you from the bottom of my heart. When I think of my children, I'm grateful that there's someone like you.

[01:34:55]

I thank you both so much. It's really, really been a great conversation. I really hope people hold on to some hope as well, that we can change the system. You know, I think of it like a body that doesn't know it's eating itself. If only one neuron wakes up, the body doesn't stop eating itself. But if all the neurons wake up, then we can stop eating ourselves. And the thing that gives me hope is that with this film, The Social Dilemma, something like 40 to 50 million people have seen it. That's, you know, how big was the civil rights movement?

[01:35:23]

How big was the gay rights movement? You know, we actually have a constituency. If enough people see what we've talked about for the last hour and a half, we can actually solve this problem. And that's what gives me some hope. I wish I could turn my inbox inside out so people could see just how common the feeling that we need to change this is. Yeah, and the question is, how do we harness that?

[01:35:42]

And that's what we're trying to do, you know, with our work at the Center for Humane Technology.

[01:35:45]

I was just going to say, if people want to support the great work you're doing, I imagine the Center for Humane Technology is a nonprofit?

[01:35:52]

Yeah, we're building a movement. The Center for Humane Technology is a nonprofit, and we're trying to create a story bank of ways that people can share their own stories, so the movement can see itself, so that everybody can see everyone else's stories. Please go to humanetech.com, and you can get involved in various ways there. We also have another podcast, called Your Undivided Attention, that dives into some of the topics that are in the film, too.

[01:36:13]

So listen to Your Undivided Attention, and go to humanetech.com and either watch a story, share a story, donate some money, or watch The Social Dilemma.

[01:36:28]

If you're listening to this, odds are you probably have. But yes, please go to all these places, and I commit right now to going to those places and starting to help. So thank you so much, Tristan.

[01:36:37]

Thank you both so much. Really, my pleasure. Yeah, thanks. Hope we talk again with better news. With better news. I'm sorry if this was... I'm really sensitive to that.

[01:36:47]

It's just that I want people to see it. The bigger the pain, the bigger the motivation. So it's like, just show people.

[01:36:53]

It is not hyperbolic to say that, minimally, the seeds have been planted for a civil war. No, I do not believe there will be a civil war. But the seeds are in the ground.

[01:37:02]

There's no question. I think, in my way of seeing this, the film is meant to be a big pause button, as we were trending in that direction. Not that the film is going to stop people from that, that's bound to happen, but it's a kind of metacognitive perch that lets everybody climb up and say, look what's happened to us. Don't go that way. That's what's going to happen if you do. That's not what we want.

[01:37:20]

And my biggest hope is just that, you know, it's hopefully contributing a little bit to cooling off some of those tensions. We had some people say that instead of watching the presidential debates for 90 minutes, they had a better conversation with family members they couldn't agree with by watching The Social Dilemma together for 90 minutes. Yeah, that was a better use of the time, kind of healing some of the divides where they couldn't talk to each other.

[01:37:41]

I agree.

[01:37:42]

I think we can all see ourselves in that documentary. I know I can. All right. Well, again, thank you so much. Thank you both. Really great to meet you. Yeah. Yeah.

[01:37:54]

And now my favorite part of the show, the fact check, with my soulmate, Monica Padman. Oh, toe recording! You're getting so advanced at your job that you can now activate our recorder with your toe.

[01:38:09]

I know I'm very dexterous in my toe regions.

[01:38:14]

Oh, phalange dexterity. Don't look too closely, because my toenail looks weird right now, but it does work that way. It's grown out a little bit.

[01:38:22]

No, because it started to break, so I had to break one half off. So now there's a point, a stupid point, at the end.

[01:38:30]

You know, it's interesting looking at your foot: you look like a puppy upstairs, generally, like a young puppy, and then your foot looks like a woman's foot. Yeah, that looks like a woman's foot. Oh, my God.

[01:38:42]

Like, like a mom's foot? No, no, no, no. Just a little lady's. Yeah. Like, if I were younger and I was at the pool at a hotel and I saw a woman, maybe she's got a martini glass and a cigarette with a cigarette holder and, like, a pinup-style bathing suit from the 50s, high-waisted.

[01:39:00]

That might be the foot attached.

[01:39:03]

If you had a foot fetish, would you like it? Of course. Well, actually, I can't even really comprehend the foot fetish thing, so it's hard for me to guess the interest. Like, I have no opinion on what feet I like.

[01:39:16]

Like, you could just keep bringing them past me and I'd go, yeah, that's nice, or that's not nice. But I couldn't give you criteria for what's nice and not nice.

[01:39:25]

We have a friend, who I won't out, who has a foot fetish. Oh, oh. Oh yeah. Did you ever catch him, like, perving out at your feet?

[01:39:34]

He told me I had beautiful feet. Oh. Did you feel good or scared?

[01:39:38]

I felt good, since he has a foot fetish and he's an aficionado. Yes.

[01:39:43]

Since he knows a lot about feet and what's attractive and what's not, and he liked mine.

[01:39:46]

Oh, wow. Whose doesn't he like? Mine, obviously. He didn't tell me. He didn't say, I don't like this person's feet. Right. OK, but he wasn't mean about it. He just said I had beautiful feet. And he knows all of our shoe sizes.

[01:39:59]

Oh my God. Now do you ever catch him staring at your feet when we're all swimming?

[01:40:05]

I've never caught him, but we were swimming when he told me I had beautiful feet. And you probably catch people looking at your boobs sometimes, right?

[01:40:13]

Now, what's weird is, I mean, I know people are, but I'm never catching someone locked on target.

[01:40:20]

Like, I think people do a good job of not just, like, staring. Oh, they probably just take a quick glance. Have you caught any of the girls staring at them? I would imagine the girls are just as prone to lock in on them, for sure, maybe even more so. Like, if I could see a big bulge in Charlie or Ryan's bathing suit, I'm probably on it longer than any of the females. Yeah, definitely.

[01:40:44]

Again, I'm not very observant, so maybe everyone's staring.

[01:40:48]

You don't have the greatest eyesight, as we've discussed, too. So maybe when you look, you just see a blur and you can't really pinpoint where the eyes are. I don't really see eyes. Can you tell what I'm staring at right now?

[01:41:00]

My eyes? Your kneecap. Oh yeah. I wouldn't have guessed that.

[01:41:05]

Really? Yeah. Do this. Oh, my God. Oh my God. This is real. OK, I'm gonna close my eyes. You look somewhere else. Can I open? Yeah. Action. My phone? No. Well, my hair? Yeah. Yeah, you switched kneecaps. Oh, wow. I'm trying to keep this as clean as possible because, like, if I'm staring at your elbow, everyone knows your elbow is roughly at the same latitude as your boobs.

[01:41:28]

So people will be like, really? You're going to stare at her elbow? And so I'm really trying to be somewhere very safe. So, your knee in this example. And I might go, oh, I got one: shoulder. Yep. Boom.

[01:41:42]

Wow. I got good at this game quick. Yeah. Now, I've had a big, big development. I virtually defected. It's like I have a new nationality now. Yeah. I tell people... oh god, I hate to say this, because I was so proud of my resilience: I've switched to an iPhone.

[01:41:59]

You have! I've switched to the cult. To the cult? Well, I call it a club, OK? The reason I don't call it a cult is because it's the majority. It's not like a faction. It's like everyone has an iPhone.

[01:42:13]

I agree. I agree. But there are militant members of the group that are almost in a cult, like Ryan. Ryan will tell you within sixty-one days when the next product's coming out. Oh wow.

[01:42:25]

And he's already got plans to get it. But he already has the 12, and so does Amy. Yeah, I know. Yeah. Like, he's the first person to have it, so he's kind of on the cult end of the spectrum, I'd say, of the group. OK, so primarily I got it because my children lose the Apple TV remotes. I think I've complained about this before, but we have six of them, and at any given time the whereabouts of one is known.

[01:42:52]

Yeah, and it'll show up somewhere crazy like the laundry room. So now you got to go around to all six and see which one it works with.

[01:42:59]

It's so annoying.

[01:43:01]

Yes, and annoying to me. My only defense against that has been that I can pull my iPad out and control the Apple TV with my iPad. I don't want to fucking lug around my iPad all the time when I want to watch telly. So I was thinking, I've just got to have a remote in my pocket. That's really what drove this entire thing. Wow. And receiving videos from you guys who are in the cult, because so many times you guys take great videos and you text them to Kristen.

[01:43:28]

Right. And we can't see shit. It's the lowest-grade video you can imagine. We can't tell what's going on. We can hear, and then it's some shadows, kind of like when you're looking to see if people are looking at your honkers.

[01:43:39]

Wow. It's just a shadow. So what you're seeing is what I see in normal life. Yeah, that's a bummer.

[01:43:45]

And I'm sick of that. Yeah. But I've got to say, Samsung: great product, really. I'm going to miss my S10.

[01:43:51]

So you are, because you're still kind of hanging on. You carry around two phones.

[01:43:55]

Yeah, currently I do. I have my S10 on the armrest of the La-Z-Boy, and my new iPhone, which I need to come up with a name for, is in my groin. Well, my belly.

[01:44:06]

Yeah.

[01:44:06]

You know, I have a fear, OK? I have a fear on behalf of your iPad. I have a feeling you're not going to be spending any more time on that iPad. You abandoned your son, and you replaced your son with a new baby, and that baby is cuter, slimmer, bigger and smaller and more efficient, and it's pulling you further away from him.

[01:44:30]

So I think it's going to be sad.

[01:44:35]

Well, that's a very realistic concern to have for the pad, my son. But I can tell you right now, like, I'm going to the dunes in a few days, right?

[01:44:43]

Yeah. So what I'll do is I'll bring the pad, and I'll lie in bed at night and I'll watch Netflix in the motor home on the pad. So I'll still watch a lot of content on it, or I'll use it when I'm here in the attic and you're not here, to watch TV.

[01:44:58]

Yeah, but you watch TV with me in here. Before, we watched the election. Yes. That was a crucial time. We kept that on like a ticker tape, where you just had it on in the background.

[01:45:08]

That's how my family lives their life. That's literally their life. They just have CNN or MSNBC on in the background of their life, always.

[01:45:16]

Wow. And that makes them feel safe, even though it should make them feel very scared, because it's just updates of calamity across the globe.

[01:45:23]

When I'm home for Christmas, I am there for, like, two weeks, and it's the most informed I am the whole year. Oh, I know everything.

[01:45:33]

And you try to carry that. But don't you find, and don't they find, that at times when I try to watch, it's just the same fucking thing for 14 hours? That's what's maddening. It's like, you have all these different hosts and programs, but they talk about the exact same thing.

[01:45:49]

I don't know how that interests anyone.

[01:45:51]

I know, it's just in case. Oh, it's the slot machine. It's in our head. Ding, ding, ding.

[01:45:59]

Tristan has got a good ding, ding, ding, ding, ding. Right.

[01:46:03]

Segue! That was... He is... it's too hard, because we have so many brilliant guests. But the way his brain works is astonishing. Yeah.

[01:46:13]

We should give him one. Instead of, like... we're always tempted to say someone's the smartest person we've talked to, and that's just becoming inefficient to do. Yeah. I'm going to say he's a freak. I know that's not good. No.

[01:46:25]

Like, he's freakishly smart. People don't like to be called freaks. Oh, they don't? No. What about Freaks and Geeks? Yeah, that was a negative. Wouldn't you love to have been on Freaks and Geeks?

[01:46:36]

Of course. Yeah. I'd be known forever as one of the freaks. I would have, yes.

[01:46:42]

I bet Tristan would as well. I'm a freak on the dance floor. That's a big call. That's good. Yeah. Or a freak in the sheets. Yeah. Everyone wants to be called a freak between the sheets. That's true. He's a freak between the ears.

[01:46:55]

It's so great that he is devoted to this notion. Yeah, because he certainly walked away... well, actually, I don't know his finances, but my assumption is he walked away from a pile of money, being as smart as he is in that industry. I agree. Maybe he's still, like, a genius investor, though. I hope so.

[01:47:12]

Yeah. I hope he's rich. I hope he's a freak investor. I hope he's a freak in the streets. I hope he's a freak on Wall Street and a freak in the sheets.

[01:47:21]

No, I hope he's a freak on Wall Street and a geek in the sheets. Why? I don't know. Because you've got to change the word.

[01:47:26]

What do you think a geek in the sheets is like?

[01:47:29]

I think he'd be meticulous about your anatomy. Oh, your honkers and your ding-ding.

[01:47:36]

OK, I'm going to start listing some facts now.

[01:47:40]

So first fact, did they have a Charles Schulz ice skating rink in Santa Rosa?

[01:47:47]

I don't know. I know for sure there was one, OK? And was it at the museum research center? I don't know where it was located. I'm going to guess it was there. OK: there's a museum dedicated to the works of Charles Schulz, creator of the Peanuts comic strip. It opened in 2002 and is in Santa Rosa. The museum is home to many of the original Peanuts strips, as well as other artwork by Schulz and two works by Japanese artists.

[01:48:13]

Whatever. I mean, no offense to Japanese artists, it just doesn't seem relevant to Charles Schulz. I don't believe he was Japanese in any way.

[01:48:22]

OK, there's a three... wow, there's a three-and-a-half-ton wood sculpture depicting the evolution of Snoopy.

[01:48:32]

Oh, and a 22-foot-high ceramic mural made of three thousand five hundred eighty-eight Peanuts strips, which combine to form the image of Lucy.

[01:48:42]

Oh, wow, wow. That's from the Japanese artists. I guess they are relevant now.

[01:48:48]

Dang it. OK, yes, those are by Japanese artists. OK, Japanese people are really good at art. Sure, I guess. How many Japanese people are good at art? Oh, that's not fair to say.

[01:48:58]

I'm not... Yeah, yeah. Yeah, it's the same as saying they're all bad at something. All right. So I can tell you what Japanese people are bad at. What?

[01:49:06]

Having blonde hair. OK, that's true. They're not good at having blonde hair.

[01:49:10]

Well, they're not good at producing blonde hair. That's right. When they dye their hair blonde, it still looks good.

[01:49:17]

Good point. That's a really good point. Counterpoint. Oh I have another one. Oh. Japanese are really, really bad at being born in China.

[01:49:28]

Hmm. Historically, I guess a majority of them are probably not good at being born in China, but some are good at being born in China.

[01:49:40]

I know, there have been plenty of Japanese people born in China. Fuck. Exactly. I know one. OK, the Japanese are terrible at flying. And I don't mean in an airplane, I mean individually, like a bird. Yes, OK. Although someone's going to say there's a Japanese person in a wingsuit. Mm. Oh, this is hard. It's really hard. I'm glad it's hard. OK, OK.

[01:50:05]

OK. The quiz from The New York Times that can pinpoint your exact zip code after asking twenty-five questions about, like, your word choice and stuff is called "How Y'all, Youse and You Guys Talk." Oh, should we try it? Yeah.

[01:50:20]

Although what's going to be tricky is that, you know, I've intentionally changed some of mine. Like, I say "y'all" a lot. So I think that's the best, because often I'd say "you guys" when there's women present. Or, one time it happened that a person had transitioned to female, and I said, hey guys, sorry I was late. And then I just panicked. I was like, oh boy. Did it come up?

[01:50:45]

No, I don't think the person cared at all, but I just panicked, OK? And then I just thought, I'm going to get rid of "guys" because it's gender-specific. I'm just going to go with "y'all," and it rolls off the tongue for me.

[01:50:57]

Well, it obviously rolls off the tongue for me. Yeah. I wish you said it more. You don't say it very often.

[01:51:01]

I used to say it exclusively. And you probably tried to break yourself of the habit when you came to Los Angeles?

[01:51:06]

You did? Or it just happened organically. It's great. It's really useful. It's better than "you guys." You're right. I don't like that we're calling a bunch of girls "you guys." It's kind of patriarchal. Yeah. OK, let's play. Let's try. OK, it's twenty-five questions. OK, great. We'll do you first. Do you first? No. OK. OK.

[01:51:28]

How would you address a group of two or more people? First question: what you just talked about.

[01:51:32]

Well, I'm going to be honest. Be honest, not what you've changed to. OK: "you guys." Do you want to hear the list, though? Oh yeah. There's a lot. Maybe there's... "You all." "Youse." No, that's New Jersey. "You lot." No. "You guys." Yeah. "You 'uns." Oh, that's real deep South. "Yins." "You." Other. "Y'all." OK.

[01:51:54]

You guys, you guys.

[01:51:55]

What do you call the small gray bug that curls up into a ball when it's touched? Roly-poly.

[01:52:01]

Is that an option? It is, yeah. Pill bug, doodle bug, potato bug, roly-poly, sow bug. Oh. Basketball bug, twiddle bug, roll-up bug, wood louse, millipede, centipede, "I know what this creature is but have no word for it." Oh, "I have no idea what this creature is." That could be telling.

[01:52:24]

Maybe it doesn't exist in the Pacific Northwest. Exactly. What do you call the thing from which you might drink water in a school? Drinking fountain. Options are bubbler, water bubbler, drinking fountain, water fountain, other. Oh, oh.

[01:52:40]

So good, isn't it? It is. OK. What do you call the large wildcat native to the Americas? Cougar. OK: mountain lion, cougar, puma, mountain cat, panther, catamount, mountain screamer, painter. Mountain screamer!

[01:52:56]

That's got to be Appalachian Mountains. "We all hear that mountain screamer last night? Kept me up well past midnight. I didn't get a damn bit of shut-eye last night. That mountain screamer's hollerin' out back my door. I shut all the windas."

[01:53:13]

That's pretty good. Thank you. Rare that you like my thing. I just love that.

[01:53:17]

I was proud that you also changed "windows" to "windas." "I was busy doin' some worsh and I heard out here this mountain... what is it? A screamer? A mountain screamer's startin' somethin' out in the back. I shut all the windas and shut down the worsh." What's a worsh? Warsh. Oh, warshin' your clothes.

[01:53:35]

OK, how do you pronounce this word? C-R-A-Y-O-N. "Cran." OK, say it again. "Cran." Give me that blue cran. Crayon, like... So are you saying it with one syllable? Yeah. OK: "got a new box of crans"? We say "new box of crayons." Crayons, crayons.

[01:53:59]

But that's still kind of... OK, here are the options. With one syllable, rhymes with "man": "cran." Yeah, that's what you're saying, huh? With two syllables, sounds like "cray-ahn." No. With two syllables where the second syllable rhymes with "dawn": "cray-awn." No.

[01:54:16]

Fuck that. Oh, there's two more. The first one is "cray-an," rhymes with "ran." That's closer to what you're saying. "Cran," sounds like "crown"? No. OK, so yours is one syllable, rhymes with "man." All right. What do you call it when rain falls...

[01:54:30]

...while the sun is shining? The devil's beating his wife. But I know that's from Kentucky, because that's where my grandma is from.

[01:54:37]

Oh, dear. Oh, my God. Oh, my God. I did not expect that. Like, I'm glad it's on there, too.

[01:54:44]

I count it as the only thing I know to call it. OK, what are the options? I would call it "sun shower," I think. Let's see. Oh, OK. Yep, sun shower. OK: "the wolf is giving birth."

[01:54:55]

Oh wow. "The devil is beating his wife." Yeah. "Monkey's wedding."

[01:55:01]

Oh, I wish we said monkey's wedding. Like that: "fox's wedding." "Pineapple rain," "liquid sun," "I have no term or expression for this," or other.

[01:55:11]

OK, if it weren't for my grandma, I don't think I'd have a term for it. You wouldn't? I don't think so. But let's go with it anyway.

[01:55:17]

OK, this one is going to be an anomaly. Oh: what do you call a drive-through liquor store? We don't have them. I've never heard of such a thing. OK, the other ones are brew thru, party barn, bootlegger, beer barn, beverage barn, "we have these in my area, but I have no special term for them," and "I have never heard of such a thing." You didn't have them in Georgia? I have never heard of that either.

[01:55:38]

What do you call a traffic situation in which several roads meet in a circle? Roundabout. Like: rotary, roundabout, circle, traffic circle, traffic circus, "I have no word for this," other.

[01:55:50]

Oh, this is weird, though. They just added roundabouts to Michigan, like, ten years ago. Yeah. Now when I go home, I'm more often in a roundabout than on any other road. I don't like roundabouts, personally. People have a big issue.

[01:56:02]

I've seen the people in Michigan, they're still adjusting to them. They're like... they cannot figure out what's supposed to happen.

[01:56:08]

OK, how do you pronounce the second syllable of this word? I'm going to spell it, because I don't want to... Yeah, don't lead the witness. Exactly. P-A-J-A-M-A-S. Pajamas, with the vowel in "jam" or with the vowel in "palm"? "Jam." OK. What do you call something that is across both streets from you at an intersection, or diagonally across from you in general?

[01:56:30]

Kitty corner. Not catty corner? I hate catty corner. "Catty corner" is mine. Oh, I hate that. I hate kitty corner. OK: kitty corner, kitacorner, cattycorner, catty corner, that's mine, kitty cross, kitty wampus. Ooh, kitty wampus!

[01:56:49]

Did you know I said "skatty wampus" during Spazz, and Laurissa said that wasn't a word? She said it's "catawampus."

[01:56:55]

I got it from a girl's sketch in the Groundlings. Oh, I got it from you. Right. So it's a very catchy word.

[01:57:03]

It's "skatty wampus." I love it. I don't really give a fuck whether it existed prior. Yeah. Like, the word we said the most in childhood, my brother and I, was "bungie." Bungie? No, it's actually the buildup of poo in one's butt cheeks. OK, yeah. And so it's not a word, but we used it the most, and I love it. Bungie. Like, if you saw someone scratching their butt, you'd go, oh boy, that guy's got bungie.

[01:57:26]

And when we were kids, we were always on bungie patrol. Because I've heard "bungie butt."

[01:57:32]

Oh, I don't... we never said "bungie butt," we just said "bungie." I wonder if it's on this list. OK: what do you call a buildup of excrement in the butt cheeks?

[01:57:42]

What do you call the insect that flies around in the summer and glows in the dark? Lightning bug. Lightning bug, firefly, "I use lightning bug and firefly interchangeably," peenie wallie. No. "I have no word for this," other. I do use lightning bug quite a bit. But do you want me...?

[01:57:59]

We also say fireflies. Do you want me to do "interchangeable"?

[01:58:02]

I think so. OK, I think that'd be most honest. How do you pronounce this word? B-E-E-N. "Bin." Yeah. Mm hmm. With the vowel in "sit," or with the vowel in "see," "been"? No. With the vowel in "set," "ben." "Have you ben?" "Have you ben?" Yeah, "I've ben." Yes. Yeah.

[01:58:25]

What do you call an easy high school or college class?

[01:58:28]

An easy high school or college class? Basket weaving. No, no, no. OK, like, the options are gut, crip course — crip course? — bird, blow-off, meat, or other. Meat? Or other. I guess blow-off would be the only one I've heard in that list.

[01:58:50]

Yeah, it's a blow off class. Yeah. Yeah, it's basket weaving.

[01:58:55]

That's what my family said. What do you mean? Well, it's like an easy... basket weaving, back in the 60s. Maybe they offered it when my parents went to high school. You could take basket weaving. Who can't get a fucking A in basket weaving? Oh, that's so funny. Yeah. A chimpanzee can weave.

[01:59:12]

I bet I wouldn't be very good at it. You've got to know what you're doing and everything.

[01:59:15]

Well, how do you pronounce these words? There's three of them. Oh boy. M-A-R-Y, "Mary"; M-E-R-R-Y, "merry"; and M-A-R-R-Y, "marry." OK, you pronounce them all the same? Yes. "All three are pronounced the same." "All three are pronounced differently." "M-A-R-Y and M-E-R-R-Y are pronounced the same, but M-A-R-R-Y is different." OK.

[01:59:41]

I say all those things the same. OK.

[01:59:43]

What is the distinction between dinner and supper.

[01:59:46]

We say dinner, but I purposely say supper now.

[01:59:50]

So should I say "I don't use the word supper"? Because you do. But you do it on purpose. Exactly.

[01:59:54]

OK, supper is an evening meal. Dinner is eaten earlier. Supper is an evening meal. Dinner is the main meal. Dinner takes place in a more formal setting than supper. There's no distinction. They both have the same meaning. I don't use the word supper. I don't use the word dinner.

[02:00:06]

OK, what do you call the area of grass in the middle of some streets? Median. Boulevard, midway, traffic island, island, neutral ground, "I have no word for this," median, or other. Oh, good.

[02:00:18]

I'm always relieved when the word I say is on the list. Yeah. And so far yours have all been. Yeah.

[02:00:25]

What do you call the long sandwich that contains cold cuts, lettuce and so on? I know a lot of words for it.

[02:00:30]

I'm going to try to think of which would be most natural. Yeah, there's hoagie, footlong, Dagwood. I love Dagwood.

[02:00:39]

Right, footlong. That's the option, or hoagie would be next. Footlong is not, because sometimes they're only six inches. Yeah. So hoagie, I guess. Hoagie. OK. Sub, that's mine. Oh, sub. It's sub. Grinder.

[02:00:58]

Oh, I've heard grinder. Really? That one... hoagie, hero. Oh yeah, hero. That's the thing. Yeah. That's the thing. Yeah. Oh my God. Yeah. Poor boy. Oh, bomber. Ooh, bomber. Italian sandwich, baguette, sarney. "I have no word for this," other.

[02:01:17]

OK, let's call them bombers. OK. Yeah, that sounds fun. Oh, I'd go for a bomber then.

[02:01:21]

Monkey fucking... what was it? Monkey's wedding. Wedding monkeys. Gay marriage.

[02:01:26]

OK, how do you pronounce the first syllable of this word? L-A-W-Y-E-R. "Loy" — I hired a lawyer. Lawyer. "Loy," so rhymes with "boy." Yeah. There's also rhymes with "flaw," and "I use both pronunciations." And this is a long test.

[02:01:44]

I know we're almost kind of done.

[02:01:49]

What do you call the night before Halloween? Devil's Night. There's only one name for the night before Halloween.

[02:01:56]

Gate night, trick night, mischief night, cabbage night, goosey night, devil's night, "I have no word for this," other. I would say I have no word for this.

[02:02:04]

OK: "What are y'all doing for cabbage night?" That is very simple: cabbage. Yeah, that's what you just said, cabbage night. Oh my God. Because what do you want on cabbage night? Oh, well, duh. We're making soup.

[02:02:16]

What do you call the sweet spread that is put on a cake, frosting or icing? Frosting. Frosting, icing, "frosting and icing refer to different things," both, neither, other. What do you call the rubber-soled shoes worn in gym class or for athletic events?

[02:02:30]

Sneakers. You, too? Sneakers. Shoes, gym shoes, sand shoes, jumpers, tennis shoes.

[02:02:38]

That's what my... yeah, tennis shoes. That's what we would say, more than whatever I said: "Go grab your tennis shoes." Yep.

[02:02:46]

Running shoes, running shoes, runners, trainers. How do you pronounce C-A-R-A-M-E-L?

[02:02:53]

Oh, "carmel." "Carmel," "carmel." OK, so two syllables: "car-mel," not "car-a-mel." That's how I say it, really.

[02:03:01]

It sounds so fancy to me. It sounds like, ooh, bougie bougie.

[02:03:07]

Do you pronounce the words "caught" and "cot," C-O-T, the same?

[02:03:14]

"I got a cot." Or, "he got caught sleeping." Hmm, differently? Yeah. Wow, you pronounce them the same? Yeah. Oh wow. "He caught me." "Pull out the cot." "The ball is at the court."

[02:03:28]

What do you call the area of grass between the sidewalk and the road?

[02:03:32]

Grass between the sidewalk and the road? What are some options? Berm, parking, tree lawn, terrace, curb strip, beltway, verge, "I have no word for this." I don't think I have a word for that.

[02:03:45]

Last question.

[02:03:46]

What would you call a sale of unwanted items on your porch, in your yard, et cetera?

[02:03:51]

Well, now I'm so fucked up from the last twenty-five years in L.A. But I think a garage sale. I would have a garage sale, too.

[02:03:58]

Yeah, OK. Tag sale, yard sale, garage sale, rummage sale, thrift sale, stoop sale, carport sale, sidewalk sale, jumble, jumble sale, car boot, car boot sale.

[02:04:10]

A jumble? That's going to be, like, New Orleans or something. I don't know. I wish I could do a good one. I'm going to submit yours.

[02:04:15]

All right, I'm going to try to practice an accent. Holy shit. Detroit! No.

[02:04:22]

Oh. Oh yeah.

[02:04:27]

OK, so it has a map of most similar. Least similar.

[02:04:31]

OK, and then it has... these maps show your most distinctive answer for each of these cities: Detroit, Grand Rapids. So it got it. Wow.

[02:04:44]

That is crazy. It's like... it didn't give me a zip code, but it's crazy. It didn't give you a zip code, but still, it's still crazy. I'm still crazed. Yeah. Wow.

[02:04:54]

Wow. Try that quiz. Yeah. How interesting.

[02:04:58]

Or do you think they got so bored during ours, they're like, I don't want anything to do with that quiz?

[02:05:01]

I hope that they did it in their own heads. Then they don't get their answer. That's why they need to go do it also. Yeah.

[02:05:07]

So I hope you wrote down your answers so you can do it quickly. Exactly. Anything else? Can you tell me about myself? Oh, no, no, no, no. It's a fear of financial insecurity.

[02:05:19]

You talked about happiness metrics and how Sweden is ranked the number one country in happiness. First right now is Finland, OK, then Denmark, Norway, Iceland and the Netherlands.

[02:05:31]

Mm hmm. How many languages are there in Ethiopia? Tristan thought maybe five. I saw a figure that said 86 individual languages indigenous to Ethiopia.

[02:05:41]

Oh, boy. OK, yeah, it's a lot.

[02:05:44]

How long did it take Johnson & Johnson to take Tylenol off the shelves back in the day, when it was being poisoned and tampered with? I can't find an amount of time, but they did recall thirty-one million bottles. It was one of the first major recalls in American history.

[02:06:01]

It was in the Chicago metropolitan area.

[02:06:04]

Seven individuals died from Tylenol that had been deliberately laced with cyanide.

[02:06:10]

Oh, my God. Of the many layers of murderers I can't really relate to, it's in descending order, right? Like, a guy who catches a guy sleeping with his wife and loses his mind and kills — that seems, like, OK. Yeah, that could happen. Yeah. Onward and downward. Even a serial killer is relatable before this person. Like, to just randomly poison someone, and you'll never see the results. Yeah, it seems very...

[02:06:39]

It is weird. Yeah. It's kind of like... you know, it's a bad analogy, I'm going to ditch it. I was going to say, like, if you hit a button and you could blow up a star that you never see, why even do it? And say you do it: "I blew up a star." I guess they tell me, when I hit this button, the star will blow up. But I don't know.

[02:07:00]

Well, I'm sure this person follows the news.

[02:07:02]

The news is, like, the payoff there. But there was a good chance that it wasn't even going to be detected.

[02:07:10]

I know, that was the big risk. That was part of the joy they got when it was.

[02:07:16]

I don't understand it. I'm glad you don't understand it. Yeah, we don't have the mind of a killer.

[02:07:21]

Yeah. I mean, I guess some of the aspects of killing I understand. Like, I wouldn't do it, but I understand feeling inferior and wanting to be smarter than everyone and prove that you could do something and not get caught. I understand that motivation. I wouldn't kill to do that. But I understand that, like, oh, I want to prove I'm smarter than everyone. Do you? I mean, conceptually? I can understand it.

[02:07:51]

I think that there's so much... I mean, that's so entitled. Yeah.

[02:07:56]

Well, it's a terrible, terrible murder.

[02:07:59]

But I can't say that I will... I can stand firm on that. I was like, oh my God. Was that everything? That was everything for Tristan. Oh man, I really enjoyed him.

[02:08:10]

And boy, oh boy, I think the tipping point has happened because of that movie and a few other things. I feel like people are really starting to broadly understand the risks of all this technology, which is encouraging. And I think we're going to see some governmental regulations that are going to help curb it. And I regret now when I said to Bill Gates, do you find it ridiculous that these folks who have created these amazing things have to sit in front of a Senate subcommittee with some asshole saying his email gets bounced back when he sends it to his son?

[02:08:41]

Right.

[02:08:43]

But, yeah, it's important.

[02:08:45]

Yeah, yeah, yeah, yeah, yeah.

[02:08:46]

I regret it. All right. That's a big thing to admit.

[02:08:49]

Well, I'm admitting it. Whew, excuse me. I'm going to go play on my iPhone, name TBD.

[02:08:54]

How are you going to...? You could call it an Epiphone, but that sounds kind of racist.

[02:09:00]

How about... uh, a pony. iPony. That rhymes, man. Oh.

[02:09:10]

So you're going to call it iPony? Yeah, I think... iPony.

[02:09:16]

By the way, if anyone has not heard the Leon Bridges version of "Pony," it is so good. That's the second time we've talked about Leon Bridges.

[02:09:24]

We love him. We love him. Should he come on? I would love it. Do you think he would talk to us on one of those steel microphones, so it'd sound all sixties and distorted, the way his music does? It would be kind of like a time-travel interview. Like, we have an interview from the sixties today. Queen's Gambit. Oh, my gosh, I love you.

[02:09:43]

Bye.