[00:00:01]

Joe Rogan podcast, check it out. The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day. Hello, Jonathan. Good to see you, sir. Good to see you again, Joe.

[00:00:15]

The same problems that you talked about when you were here last, that I've referenced many times since on the podcast, have only been exacerbated, unfortunately. And that's why you wrote this, The Anxious Generation. And it could not be more true: how the great rewiring of childhood is causing an epidemic of mental illness. I don't think anybody can dispute that.

[00:00:41]

Yeah. When I was on last time, there was a dispute. There were some psychologists who said, Oh, this is just a moral panic. They said this about video games and comic books. No, this is not a real thing, they said. Now they don't.

[00:00:55]

Yeah, I think it was pretty obvious. I think it was only their preconceived notions that were keeping them from admitting it before or at least looking at it before. Or maybe they don't have children. It could be that. I think a lot of older people, particularly boomers, they're a little bit disconnected from them because unless they're addicted to Twitter, they're not engaging in this stuff.

[00:01:16]

And they're often thinking, When I was a kid, we watched too much TV and we turned out okay. But part of the message of the book is that social media and the things kids are doing on screens are not really like TV. They're much, much worse for development.

[00:01:29]

Yeah. And even the watching too much TV, I don't agree that they turned out okay. I think it had a pervasive effect. It did, but nothing like this.

[00:01:39]

Well, that's right. Because when we were watching TV, I'm a little older than you. I was born in 1963. So I grew up watching a lot of TV, maybe an hour or two a day, weekdays, and then two or three hours on the weekends. But it was a bigger screen. You're watching with your sisters or with your friends. You're arguing about things, you're eating. So it's actually pretty social. But now kids are spending... The latest survey, Gallup, finds that it's about five hours a day just on social media, just social media, including TikTok and Instagram. And when you add in all the other screen-based stuff, it's like nine hours a day. And that's not social. It's private on your little screen. You're not communicating with others. In all these ways, the new way that kids are digital is really not like what we had when we were watching TV.

[00:02:28]

It's also an extraordinary amount of wasted resources. I'm always embarrassed when I look at my phone, when I see my screen time: four hours. That's four hours I could have done so many different things with.

[00:02:40]

That's right. And so there's the concept of opportunity cost. It's this great term that economists have, which is the cost of... If you invest an hour of your time and $100 to do something, how much does it cost? Well, $100, but you could have used that $100 and that hour for something else. So what are the things you gave up? And when screen time goes up to, now it's about nine hours a day in the United States, nine hours a day, not counting school... Average? Average.

[00:03:09]

Is that for a certain age group?

[00:03:11]

We're talking teenagers, not little kids. But 13 to 15, up to 17, that range, that's when it's heaviest. It's around nine hours a day. And so the opportunity cost is everything else. Imagine if somebody said to you, Joe, you've got a full life. Here, you have to do this additional thing nine hours a day. That's insane. That would push out everything else, including sleep.

[00:03:33]

When you are now talking to people that agree that this is an issue, what changed?

[00:03:46]

You mean what changed? Why is there now more agreement? Yes. In 2019, when I was last here with you, my book, The Coddling of the American Mind, had just come out. Back then, people were beginning to sense that this internet, the phones, the social media that we were all so amazed by... There was a very positive feeling about all this stuff in the early part, like in the 2000s. Sentiment was beginning to turn, but there was a big academic debate, because when you look at studies of kids who spend a lot of time on screens, do they come out more depressed? The answer is yes, but the correlation is not very big. So there was a big argument among researchers, and that's when I got into this around 2019, really getting into that debate. And I think that Jean Twenge and I really had good data showing there is an issue here. And then COVID came, and that confused everything, because basically, when I was on with you last time, 2019, I was saying, what kids most need is less time on their devices and more time outside playing unsupervised.

[00:04:48]

Let them be out unsupervised. That's what we needed, 2019. COVID comes in, boom, exactly the opposite. What did kids get? No more time unsupervised. You can't even go out. I mean, in New York City, they locked up the playgrounds. They locked up the tennis courts. It was insane. No time outside with your friends. Oh, spend your whole day on screens. So that made everything worse. But people thought, oh, yeah, the kids are really messed up now from COVID, but they were wrong. COVID was terrible for a lot of kids, but when you look at the mental health trends over the last 20 years, COVID was a blip. Actually, you know what? I've got some charts. If you don't mind, I'd like to actually show these because- Did you send the data to Jamie so he could pull it up? I haven't sent it yet, but I'll... Oh, right. Because you want... Yeah. Do you want to stop and do that?

[00:05:34]

Yeah, let's pause real quick. Okay. I'm sorry. Jamie will give you the email address. Okay, we're back.

[00:05:40]

All right.

[00:05:41]

What are those things?

[00:05:43]

Oh, so these are stickers for your kids. So as part of the book, I'm trying to launch a movement called Free the Anxious Generation. Here you go. You have two younger kids. And so I've teamed up with the artist who did the book cover, Dave Cicirelli, who's created these incredible artworks. There's going to be billboards. He's putting together a twelve-foot-tall milk carton, which is going to be traveling around different cities with this.

[00:06:12]

Missing childhood. Do they do that anymore, the milk carton thing? No, I don't think so.

[00:06:18]

I don't know what your kids think about social media and whether they think it's a good thing or bad thing, but we are hopeful that members of Gen Z are going to start, and they are starting, to advocate that, you know, this is messing us up.

[00:06:32]

Okay, so here's the graph.

[00:06:34]

Okay. So this is the graph that I showed last time I was on. And what it shows, because I know most of your listeners have probably just listened to the audio: from 2005 to 2010, the rate of depression in girls was flat. About 12% of American girls had a major depressive episode in the last year. And for boys, it was about 4% to 5%. And it's flat. There's no change. Then all of a sudden, around 2012, 2013, the numbers start rising, especially for girls. And it goes all the way up to 20% for girls. So that was a huge rise, and that's what I showed you last time.

[00:07:07]

What is the difference between boys and girls?

[00:07:10]

Girls suffer from more internalizing disorders. That is, when girls have difficulties, they turn it inwards, they make themselves miserable. Girls suffer from higher rates of anxiety and depression. That's always been the case, especially once they hit puberty. Boys, when they have psychological problems, they tend to turn it outwards. They engage in more violent behavior, deviant behavior, substance use. So for boys, it's called externalizing disorders. But you can see both boys and girls are getting more depressed. It's just that the effect is bigger for girls.

[00:07:42]

So boys have gone up to about 7%, and girls were way up to 20%.

[00:07:46]

That's right. And that was 2019.

[00:07:49]

So one out of five girls.

[00:07:50]

That's what it was. That's right. Was. Was. That's right. And then COVID comes in. So we can have the next slide. So then COVID comes in. And now this is the exact same data set, just this federal data. I just got a few extra years of data. And what you can see is that it goes way the hell up. And if you look at the 2021 data point, you can see that little peak at the very top there. That's because of COVID. That is, COVID did increase things. It did make kids more depressed. But as you can see, it's a blip. COVID was just a tiny effect compared to this gigantic increase. On the last slide, it was 20% of girls. Now it's almost 30% of girls who had a major depressive episode in the last year. And for boys, it's up to 12%, which is still quite a lot. It's more than a doubling, although much less than for the girls.

[00:08:44]

It's still, even if you look at boys, or excuse me, if you look at girls from 2018, pre-COVID, that ramp is very steep, the upward ramp.

[00:08:53]

That's right. And that might be TikTok. So what happens is a lot of things change around 2011, 2012. 2010 is when you get the front-facing camera on the iPhone, it's when Instagram is founded, it's around when kids are getting high-speed data plans. So my argument in the book is that we had a complete rewiring of childhood between 2010 and 2015. In 2010, most of the kids had flip phones. They didn't have Instagram, they didn't have high-speed data. So they would use their flip phones to get together with each other. They communicate with each other. By 2015, about 70 to 80% have a smartphone. Most of them have high-speed data, an unlimited plan, Instagram accounts. And this really messes up the girls. So that's what I think happened between 2010 and 2015. TikTok becomes popular really more in '18, '19, '20. And it's so new, we don't have good data on just TikTok. But I suspect that that extra acceleration might be due to TikTok.

[00:09:55]

What specifically about TikTok?

[00:09:59]

So this is something I'm just really beginning to learn. I don't even have much on it in the book. Kids love stories, and stories are great all around the world. People tell children stories. There are myths. We see plays, we see television shows. I asked my undergrads at NYU, I said, How many of you use Netflix? Almost everybody says yes. How many of you wish Netflix was never invented? Nobody. Nobody. Watching stories is not a bad thing. TikTok is not stories. It's little tiny bits of something. And they're short, they don't add up to anything. They're incoherent. They're often disturbing and disgusting. People are being hit by cars, people are being punched in the face. It's much more addictive and with no nutritive value. They're not really stories. It seems to be much more addictive. Kids really get hooked on it, much more so than Netflix or anything else. It depends on what you're watching, but I suspect that so many of them are consuming stuff about mental illness. It has a variety of effects that we don't even understand yet.

[00:11:10]

Now, I know that there's a push right now to ban TikTok, and there's a lot of people that are very torn on this because they don't want to give the government the ability to ban social media. What is the argument about banning TikTok? What specifically are they talking about? The main thing they want to do is separate it from ByteDance, the company that owns it, and just make it an American company. So it can still operate, I suppose. So it's a data issue?

[00:11:38]

Well, it's a national security issue. So thank you, let's separate the national security issue from the mental health issue. I have a lot of libertarian friends. I have a lot of libertarian sympathies. I would be uncomfortable about the government banning a company or a product because it's harmful to children. I personally think we should just have age verification. We should not have kids on certain things. But if it was just a question of, this is really bad for children, let's ban it. No, I don't think I would support that. But TikTok is different because it is a Chinese-owned company. And as many of your listeners will know, China, it says in whatever, it doesn't have a constitution, I don't think. But by law, every Chinese company must do what the Chinese Communist Party tells it to do. And that's what's so scary. Now, Instagram Reels and YouTube Shorts, they might have similar effects to TikTok. But the Chinese government can literally tell ByteDance to change what kids are seeing, and they do that in China. They tell them in China, you have to have this content and not that content. There was an incredible episode of...

[00:12:50]

You had Tristan Harris on. Tristan Harris has this amazing podcast episode where they go into the national security risks, and they show that the day that Russia invaded Ukraine, TikTok in Russia changed radically. The government was on... TikTok was on it: we're going to do what Putin wants us to do. The idea that the most influential platform on American children must do what the Communist Party tells it to do, at a time when we have mounting tension with China and the possibility of a war. I mean, as Tristan says, imagine if in the 1960s, the Soviet Union owned and controlled CBS, ABC, NBC, and all the kids' programs. We would never have allowed that. I hope, listeners... I really strongly support this bill. I think Representative Mike Gallagher was one of the ones proposing it, or at least certainly advocating for this issue. I hope people will not see it as a TikTok ban, but they'll see it as an urgent national security move to force ByteDance to sell to a non-Chinese owner.

[00:14:02]

And specifically, what are they pointing to when they say national security risk? What specifically have they seen?

[00:14:09]

So a lot of it seems to have to do with the data question. Facebook pioneered this model in which the person using the product is not really the customer. They don't pay the money. They're the product. The user is the product, not the customer. And they give them data. And the data can be used for all sorts of purposes, especially marketing and advertising. TikTok has enormous amounts of data, and they can get all psychological on it because they know exactly how long you've hesitated, how much you like certain kinds of videos. Many people have written articles on how TikTok seems to have known they were gay before they did, that kind of thing. TikTok has extraordinary amounts of data on most Americans, certainly most young Americans, and they say, Oh, but we don't share it. It's in a server over here in Singapore, I don't know where, but it's not China. Oh, come on, come on. There's no way it could possibly be the case that the data is really separated and not available to the Chinese Communist Party.

[00:15:10]

What are they pointing to in terms of the danger of this data that makes them want to have it sold to an American company?

[00:15:18]

I don't know the motivation behind the bill. I don't know whether it's that the Chinese would have some access to data on American citizens, or whether it's what most alarmed me when I heard the Tristan Harris podcast, which was the ease of influencing American kids to be pro this or pro that on any political issue.

[00:15:41]

You're seeing that with Palestine and Gaza?

[00:15:44]

Yeah, I think so.

[00:15:45]

You're definitely seeing that now. It's very obvious. Well, it's very obvious with many things with TikTok, trans stuff, and there's a lot of different things that they're encouraging. And people that are opposed to that are being banned, which is also very odd, specifically female athletes. We had Riley Gaines, who was the female athlete that competed against Lia Thomas. And she has said that biologically male athletes should not be able to compete with biologically female athletes because they have a significant advantage. And she was banned from TikTok just for saying that.

[00:16:24]

Yeah, that's right. So this relates to the larger issue that we talked about last time and that I hope we'll continue to talk about today, which is that social media has brought us into an environment in which anyone has the ability to really harm anyone else. There's an extraordinary amount of intimidation available via social media. This has led the leaders of all kinds of organizations to run scared. Greg Lukianoff and I saw this in universities. Why don't the university presidents stand up to the protesters who are shouting down visiting speakers? Isn't there a grown-up in the room? Then we saw it in journalism, newspapers, and editors who wouldn't stand up for journalistic principles. I think what has happened here is that social media allows whoever is angriest and can mobilize the most force to threaten, to harass, to surround, to mob anyone. And when people are afraid to say something, that's when you get the crazy distortions that we saw on campus, or that Riley Gaines was seeing, too, just that people are afraid to speak up. And in a large, secular, diverse democracy, we have to be able to talk about things.

[00:17:36]

And so that's part of why we're in such a mess now. I've argued that it's when social media became super viral, after 2009, 2010, you get the like button, the retweet button. Social media wasn't really bad or harmful before then, not entirely. But by 2012, 2013, it had really become as though everyone had a dart gun. Everybody could shoot everyone. And that's when we began teaching on eggshells in universities, because our students could really do a lot of damage if we said one word they didn't like.

[00:18:03]

And it's not just the students, which is really disturbing. We've talked about this before. There was an FBI security specialist who estimated that somewhere in the neighborhood of 80% of the Twitter accounts were bots, which is very strange, because that means that they're mobilizing specifically to try to push different narratives. Yeah, that's right.

[00:18:26]

So people say, Well, now Twitter is the public square, or things like that. It's not a public square. It's more like the Roman Colosseum. It's more like a place where people say things and the fans in the stands are hoping to see blood. To move our discussions onto platforms like that, that can be manipulated, that anyone, it doesn't have to be a foreign intelligence service, it could be anybody who wants to influence anything in this country or anywhere in the world. For very little money, they can hire someone to create thousands, millions of bots. And so we're living in this funhouse world where everything is weird mirrors, and it's very hard to figure out what the hell is going on.

[00:19:12]

Have you ever sat down and tried to figure out a solution to this other than trying to encourage people not to use it? Jamie, did something happen to the volume? It just dropped lower. Okay, so what was I just saying? We're talking about solutions other than asking kids to not use it, which is very hard to do.

[00:19:31]

Yeah, that's right. So when we're talking about the democracy problems and the manipulation of politics or anything else, those are really, really hard. I have a few ideas of what would help, and we're not going to do them because all of them are things the left likes and the right doesn't, or vice versa.

[00:19:46]

What are those ideas, though?

[00:19:47]

Things like identity authentication. If large platforms had something like know-your-customer laws, that is, if you want to open an account on Facebook or on X, you have to at least prove that you're a person. I think you should have to prove that you're a person in a particular country, over a certain age. You prove those to the platform, not directly. You go through a third party. So even if it's hacked, they wouldn't know anything about you. You establish that you're a real person, and then you're cleared, go ahead, you open your account, you can post without... You don't have to use your real name. If we did that, that would eliminate the bots. That would make it much harder to influence. That would give us much better platforms for democracy.

[00:20:29]

Is that possible to do internationally?

[00:20:32]

Well, the platforms can certainly require whatever they want for membership. Right now, they are legally required to ask you if you're over 13. If you're 13 or over, they ask it, and then they accept whatever you say, and that's it, you're in. Those rules could be changed, and they could be required to do more. And they're based in the United States, but their users are all around the world. So, yeah, that could be done.

[00:20:58]

So one of the things that people are nervous about when it comes to authentication is that if you could do that, then you could target individuals who wouldn't be allowed to be anonymous. So you would eliminate the possibility of whistleblowers.

[00:21:14]

No, no, no. The point is that you just have to establish that you are a person. It doesn't mean that you have to post under your real name. And even if you want ultra-high security, say for dissidents in repressive countries, they could just communicate by secure channels with a journalist who posts for them. So I understand the concern, and there are values to having anonymity. But I think what we're seeing now is that the craziness, the way it's affecting things, is making it harder for democracies to be good, vibrant democracies, and it's making it easier for authoritarian countries like China to be powerful and effective authoritarian countries. So I think we have to start weighing the pluses and minuses, the costs and benefits here.

[00:21:58]

But how would you implement that internationally? Say, if you're talking about people in Poland, just to pick a country.

[00:22:08]

Well, the platforms can do whatever they want, but then, yes, if a company starts in Poland, then the US Congress would have no influence on that.

[00:22:17]

China could pretend, and they could falsify the data that shows that these are individuals, if they wanted to empower a troll farm.

[00:22:27]

Oh, I see. You're saying even if American companies did this, the Chinese could still get around it. Yeah, that's true. You're never going to have a perfect system. But right now, it's just so easy and cheap and free to have massive influence on anything you want. But the larger question here was, you asked me, what can we do? What I'm saying is there are some things, like identity authentication, that I think would help. But yes, there are implementation problems. There's all kinds of political questions. So my basic point is, man, those problems, I don't know that we can solve, but we can do better. And I should point out, a lot of these have to do with the basic architecture of the web. We moved from Web 1, which was just putting up information. It's amazing, you can see things from everywhere. Then to Web 2, which was directly interactive. Now you can buy things, you can post stuff. And it's Web 2 that gave us these business models that have led to the exploitation of children and everyone else. I'm part of a group, Project Liberty. If you go to projectliberty.io, that's trying to have a better Web 3, where people will own their own data more clearly.

[00:23:32]

As the architecture changes, it opens us up to new possibilities and risks. There are some hopes for a better internet coming down the pike. Actually, I just wanted to put all this stuff out there about democracy to say this is really hard. But when we talk about kids and mental health, this is actually amazingly doable. We could do this in a year or two. And the trick, the key to solving this whole problem with kids, is to understand what's called a collective action problem. So there are certain things where if you have a bunch of fishermen and they realize, oh, we're overfishing the lake, let's reduce our catch. And if one person does that and no one else does, well, then he just loses money. But if everyone does it, well, then actually you can solve the problem and everyone can do fine. With social media, what we see over and over again is kids are on it because everyone else is. Parents are giving their kids a phone in sixth grade because the kid says everyone else has one and I'm left out. Over and over again, you see this. When you ask kids, How would you feel if I took your Instagram or TikTok away?

[00:24:43]

Oh, I'd hate that. I'd hate that. But then you say, Well, what if it was taken away from everyone? What if no one had it? They almost always say, that would be great. There's an academic article that showed this with college students. I did it as a test with my students at NYU. And in a review of The Anxious Generation in the Times of London, the UK Times, the woman ended by asking her 16-year-old, would you have liked there to be a social media ban until you were 16? I think the daughter was 18 at the time. This was last month. And the daughter says, Would everyone else be off it, too? And she says, Yes. And then the daughter says, Yeah, I would have rather liked that. And so you have this consumer product that the people using it, they don't see value in it. They're using it because everyone else is. And there's evidence suggesting it's messing up their mental health. So anyway, this is a solvable problem if we act together. And that's what the book is about.

[00:25:38]

How would you do that, though? Would you get all the parents to do it? Would you get the social media companies to do it? How would you do that?

[00:25:45]

I'm not counting on the social media companies or Congress. I'm assuming we'll never get help from either one. Now, I hope I'm wrong about Congress. But as a social psychologist, I'm trying to point out we can actually solve this ourselves. The simplest one is this. I propose four norms. If we can enact these four norms ourselves as parents and working with schools, we can largely solve the problem. We can certainly reduce rates of mental illness a lot. The first norm is the simplest: no smartphone before high school. Now people say, Oh my God, but my kid needs a phone. Sure, give him a flip phone. The millennials had flip phones, and they were fine. Flip phones did not harm millennials' mental health. They're good for communication. You text, you call, that's it. So the first rule is no smartphones before high school. And as long as a third of the parents do this, well, then the rest of the parents are free to say, when their kid says, Mom, I need a smartphone, some other kids have one, then you can say, Well, no, here's a flip phone. You'll be with the kids who don't have one.

[00:26:44]

Oh, and by the way, you're also going to get a lot more freedom to hang out with the other kids. So we don't need everybody, but we need to break the feeling that everyone has to have one because everyone else has one.

[00:26:55]

Yeah, that sounds great on paper. I can't imagine that most parents would agree to it because there's just so many parents that don't pay attention.

[00:27:06]

That's true.

[00:27:07]

Especially in families where two people are working. Yeah.

[00:27:11]

No, you're right. When we look right now, married parents are trying harder to keep their kids off. These things are good babysitting devices in the sense that the kids are off doing their thing. You don't have to think about them. So it is true that this would not be adopted universally at first. But I think we could still develop a norm that it's just not appropriate for children to have a smartphone. They should have flip phones. I think that any community that wants to do this... because what I find over and over again is that most parents are really concerned about this. This is across social classes. Most parents are seeing the problems. I don't have to convince parents to change their minds about something. What I'm trying to do with the book is show them, here are four norms that are pretty easy to do if others are doing them. And these are going to make your kids happier, less mentally ill.

[00:28:09]

Yeah. Like I said, it sounds like a good suggestion. I just don't imagine, with the momentum that social media has today and the ubiquitous use, that kids are going to give it up. They're not going to want to give it up. I think there's a lot of kids that have had problems that, if you talk to them alone and you say, Wouldn't it be better if social media didn't exist, if they've been bullied or what have you, they'd say yes. But the idea of getting a massive group of people to adopt this is highly unlikely.

[00:28:40]

Well, you may be right, but I'm encouraged because whenever I speak to Gen Z audiences, and I've spoken to middle schools, high schools, college audiences, I always ask, Do you think I got this wrong, or do you think this is a correct description of what's happening? They agree. They're not in denial. They see the phones are messing them up. They see that social media is messing up the girls, especially. So even in middle school, certainly high school, the kids actually agree that this is a problem. And so if it was offered to them... Let's do the other three norms. Let's get all four on the table. Okay, please. All right, so the first is no smartphone before high school. Second is no social media until 16. That one's going to be a little harder to do. But for the big platforms like Instagram, the places where you're posting and the whole world is seeing and strangers are contacting you, I think the age, which is currently 13 and not enforced, needs to go up to 16. Here, it would be nice if Congress would raise the age to 16 and make the companies enforce it.

[00:29:39]

But even if they don't, as long as many other parents are doing it, then me, as a parent, my kids are 14 and 17, as long as many other parents are saying 16 is the age, then it's very easy for me to say that also. That's the second norm.

[00:29:53]

Yeah. Again, if you could get them to say it. And I think the kids would push back so hard because so many other kids are on it, and that's how they interact with each other.

[00:30:03]

Joe, you're just reiterating the collective action problem. You're just saying they'll react because all the other kids are on it. Yes. So it does require a big push. But I think we're ready. I don't think we were ready in 2019. It wasn't as clear. But now that we're through COVID, now that the numbers are through the roof, I think we're ready. And if it starts in some places, not others, that's okay with me. That's the way it's going to be. And then we'll see whether it spreads.

[00:30:24]

Then we'll see the data.

[00:30:26]

Yeah, because look at smoking. Smoking is highly addictive. It was very common up through the 1990s, and now it's very rare in high school. Very few high school kids smoke. So it's possible to change norms.

[00:30:36]

And what was the third?

[00:30:37]

The third is phone-free schools, and this one is happening. This is already happening. So I've published articles in The Atlantic and on my Substack, afterbabel.com, bringing together the research. When kids have a phone in their pocket in school, they're going to be texting. Because if anyone is texting during the school day, they all have to check, because they don't want to be out of the loop. They don't want to be the one who doesn't know. So when kids started bringing smartphones into school instead of flip phones, academic achievement actually went down. Kids are stupider today than they were 15 years ago. I mean, stupider meaning measuring their academic progress. After 50 years of improvement, it turns around after 2012. And this is true in the US and internationally. So there's just no reason why kids should have the phone on them. They should come in the morning, put it in a phone locker or a Yondr pouch, go about their day. And guess what? The schools that have tried it, after a week or two, everyone loves it. The kids are like, Oh, wow, we actually talk in between classes.

[00:31:36]

We have five minutes in the hallway, we actually talk. And you hear laughter. Whereas right now in a lot of schools, it's just zombies looking at their phones in between as they're walking from class to class. Yeah.

[00:31:46]

So the assumption is that from 2012, kids are just much more distracted?

[00:31:52]

Oh, my God. I mean, look, Joe, I think I heard you say in one of... Yeah, it was a conversation you had a few weeks ago with a comedian, a friend of yours, and I think this was a direct quote from you: My fucking phone runs my goddamn life. Does that sound like you? Yeah, it sounds like me. Okay. So as adults, we have a fully formed prefrontal cortex. You and I had a normal childhood. Our brains developed. We have the ability to stay on task. And man, it is hard with notifications coming in. There's always so many interesting things you could do instead of what you need to do. So it's hard enough for us as adults. Imagine if you didn't have a normal childhood where you developed executive function, where you develop that ability as a teenager, because puberty is when the prefrontal cortex, the front part of the brain, rewires into the adult configuration. So the fact that we're scrambling kids' attention at the time when they're supposed to be learning how to pay attention, I think, is terrible.

[00:32:50]

Where do you think this is going? This is my concern: that this is just the beginning of this integration that we have with devices, and that the social media model... it's been immensely profitable. Yes.

[00:33:07]

Oh, my God.

[00:33:08]

Incredibly addictive. And there's a massive amount of capital that's invested in keeping us locked into these things. Where do you think this goes from here? Have you paid attention to the technology?

[00:33:22]

Like AI? Yeah. Yes. So let me just draw a very, very sharp, bright line between adults and children. I'm very reluctant to tell adults what to do. If adults want to spend their time on an addictive substance or device or gambling, I'm reluctant to tell them that they can't. When we're talking about adults, I think where this is going is, well, where it's gone so far is everything that you might want becomes available instantly and for free with no effort. In some ways, that's a life of convenience, but in other ways, it's messing us up and it's making us weaker. So you want sexual satisfaction. Okay, here you go. Free porn. And it gets better and better and more and more intense. You want a girlfriend or boyfriend who you can customize? You have that already. Advances in robotics are such that it's just a matter of time before AI girlfriends are put into these incredible female bodies that you can customize. So I think the adult world, for young adults especially, is going to get really, really messed up. And again, I'm not saying we need to ban it now, but what I'm saying is, for God's sakes, don't let this be 11-year-old children's lives.

[00:34:37]

Let's at least keep children separate from all this craziness until their brains develop, and then they can jump into the whirlpool and the tornado. But the fact that our 11-year-old girls are now shopping at Sephora for anti-wrinkle cream or all sorts of expensive skin treatments, this is complete insanity. So let's at least protect the kids until they're through puberty.

[00:35:01]

Well, that would be nice.

[00:35:04]

That would be nice. It's essential, I think.

[00:35:06]

It's just the way I see adults being so hooked on these things. There's so many adults that I know that are engrossed in this world of other people's opinions of everything they think and say. And it just doesn't give you enough time to develop your own thoughts and opinions on things. That's right. So many people are dependent upon other people's approval. And there's just so many people that are addicted to interacting with people online and not interacting with exceptional people in the real world. Yeah, that's right.

[00:35:40]

One way to think about this is, let's look at junk food, which became very popular after the Second World War. The manufacturing of food became very good. There were science labs. At Frito-Lay, they studied the exact degree of tensile strength that a chip should have before it snaps. How do you make this? What's the perfect crunch? They designed the foods to be as addictive as possible. And in the '70s and '80s, Americans switched over to a lot of junk food and we became obese. A huge increase in obesity. And that kept going on for a few decades. As I understand it, obesity has finally turned around a little bit. And many people are still eating huge amounts of junk food, but at least some people are beginning to say, You know what? I'm going to resist that. We have deep evolutionary programming for fat and sugar. The companies played to that. They hijacked those desires, and they got us hooked on junk food. But after 50 years, we're making some progress in pushing back and having healthier snacks and eating less.

[00:36:42]

What's the root of that progress?

[00:36:44]

I don't actually know the numbers. I just know a few years ago, I saw something that for the first time, obesity actually went down in the United States. I don't know that that's still true today, but this was like three or four years ago. Before COVID, I saw something.

[00:36:54]

Do we know what caused it to go down?

[00:36:56]

I don't. I'm just assuming that this is an issue that we dealt with as a society, and we didn't know what we were doing at first, and we got hooked, and then came the efforts to educate people and to develop healthier alternatives. So again, I should have looked at the data before I came here, but I'm just using this as an analogy.

[00:37:20]

I'm sure Jamie can find something.

[00:37:23]

Okay, yeah. Look it up online. I'm surprised. Is obesity still rising in the United States, or is it actually a little lower than it was 10 years ago? That's the question. I quickly found this study here, but I haven't even got a chance to look at it yet.

[00:37:33]

This is the second time I've done this. Something about this is giving me anxiety. I'm spilling this. Update on the obesity epidemic: After the sudden rise, is the upward trajectory beginning to flatten?

[00:37:45]

Okay, so it's a question, in what year was this?

[00:37:46]

Do you think it's people recognizing that they're developing health issues and they're taking steps to discipline themselves and mitigate some of these issues? Or is there some information push that's leading in that direction?

[00:38:02]

That I don't know, because it's not my field. But I would say that that is probably a necessary precondition: understanding the problem and developing in people a desire to change it. And then it's hard to change. I love chips, I love chocolate, I love ice cream. It's hard to change. But over time, a society adapts. And now the question is, will we adapt to social media? Because the desire for sugar and fat and salt is very deep. The desire for others to think well of us, to hold us in esteem, I would say, is just as deep and much more pervasive. It's much stronger, I would say. Because as adults, we're very concerned. When I put out a tweet, I know all this stuff. I know how terrible this is for me to check. I'm busy. I've got things to do. But I'll go back and I'll check how the tweet is doing 30 seconds later. And then I'll check again five minutes later. So it's hard for me to resist that. What are people saying about the thing that I just said? But the question is, will we adapt to it in some way so that we begin, as with junk food... We're still going to be consuming junk food, but maybe we'll keep a lid on it.

[00:39:14]

I don't know. But what I can say with not confidence, but I think is the case, is as long as our kids are going through puberty on social media and video games, and they're not developing executive control, I do not think they will be able to keep a handle on this. As adults.

[00:39:30]

I do not either. Again, as you're saying, we are adults. We grew up without the Internet, and we grew up without all these problems, and it is hard. I try to tell all my friends to use my strategy, which I call post and ghost. I don't read anything. I just post things. I post things, I don't read comments. That's really smart. It's made me immensely more happy. It's a massive difference. I very rarely use Twitter. The only reason why, or X, whatever. The only reason why I use it is to see information. To see things. I don't read anything about myself, and I certainly don't... I very rarely post at all. And if I do post, I certainly don't read what... Because first of all, I'm aware of this number, this FBI security specialist, the 80% figure, and I see it all the time. There's so many times where I'll see any social issue, any political issue, anything that's in the zeitgeist. When you see someone post about it, you'll see these people posting, and I'll look at it. It's like a couple of letters and a bunch of numbers, and I will go, Okay, is that a real person?

[00:40:32]

And then I go to their page, Nope, not a real person. How many of them are there? Oh, I haven't done that. There's a lot of them. There's a lot of them, especially when it comes to things like Ukraine, Israel, Gaza.

[00:40:42]

Right, because those are areas where various actors are, or various parties and actors in countries are trying to manipulate us. Yes.

[00:40:49]

And they're doing it.

[00:40:51]

They're doing a great job of it.

[00:40:52]

They're very focused. It's really incredible. It's incredible to see the impact that it has when you see 50 posts, 50 comments, and 35 of those seem to be not real people.

[00:41:05]

That's right. I think your strategy is very wise, and for this reason: when social media began, you would put something up and then people could comment on it. Okay, that goes until about 2013, 2014. I think it's 2013 when Facebook introduces threaded comments. So now you put something up, someone says some horrific, nasty, racist, whatever thing in the comment thread, and now everyone can reply to that comment, and people can reply to the... So you get basically everyone fighting with each other in the comment thread. And what social media is good for is putting out information quickly. I'm a professor, I'm a researcher. I am engaged in various academic disputes and debates, and Twitter is amazing for finding current articles, for finding what people are talking about. So the function of putting information out is great, but the function of putting something out and then watching everyone fight in the comments, that's why I use the analogy of the Roman Colosseum, with the gladiators. That's just sick. Nothing good comes from that.

[00:42:09]

My concern is that we are paddling upriver and that there's a raging waterfall that's powering this whole thing that you cannot fight against, and that we are moving in a direction as a society, with the implementation of new, more sophisticated technology, that's going to make it even more difficult unless you completely opt out. And some people are going to opt out, but it's going to be like My 600-lb Life. People that are realizing, oh, my God, what have I done? I've ruined my body. I've ruined my life. How do I slowly get back to a normal state? And it's going to take a tremendous amount of effort. I think about the amount of effort, the amount of focus that people have on comments and things. If you're addicted, if you're currently deep into it right now, where you're tweeting constantly... There's people that I follow that I know they're tweeting 12 hours a day. Yeah, that's right. It's sad. It's so sad. Yeah, they're addicts. My fear is that this is only going to get greater.

[00:43:18]

Yeah, I share that fear. And if current trends continue, it's really not clear how we get out of this. Something might break in a big way. Humanity has faced many crises before. That doesn't mean... as they say, past performance is no guarantee of future success. But we've faced many crises, and we've always come out stronger.

[00:43:41]

I've never faced anything like this.

[00:43:43]

That's right. This is a rewiring. This is a rewiring. That's right. Exactly. That's right. So we've faced many external threats. We've faced diseases, we've faced wars, those have come and gone. But this is a rewiring of the basic communication network of society in ways that link up with so many of our deepest motivations. This is a challenge unlike any we've ever faced. I think what we really need, I'm speaking as a university professor, is we really need great social science. We need great sociologists. We need people really studying this. But it's all happening so fast. And then there are the problems in universities, of political concerns sweeping in. I fear that we're heading towards this... Well, you said going upstream to a waterfall. I think it's going downstream. We're about at the top of a waterfall, about to go over the edge.

[00:44:37]

That, too. Yeah. Well, we're trying to paddle, but that's the direction we're moving. Yeah, that's right.

[00:44:42]

That might be the case. So yeah, we live at a very interesting time in history, when in the '90s, the future looked so bright, and now it doesn't.

[00:44:52]

My fear is that we are not going to stay human, that we will no longer be human. We'll be something different. And I think the implementation of technology is what's going to facilitate that. I think about how many years we are away from Neuralink, or something similar to it, that's going to change how we interact completely. And then it's not going to be a question of whether or not you opt out, whether you pick up your device. Your device is going to be a part of you. And there'll be incentives, whether it's performance incentives, whether you're going to have more bandwidth, whatever it is.

[00:45:32]

They have a competitive advantage.

[00:45:33]

Yeah. That's the real fear of something like Neuralink or whatever. If they can figure out a Neuralink that doesn't require surgery, if they could figure out something that does that without surgery, the advantage of having that in a competitive sense, in terms of business and technology and industry, is going to be massive. And it's going to be so difficult to get people to not do that, that it's going to be like phones. I remember when I moved to Los Angeles in 1994, I bought a Motorola StarTAC, and I was like, look at me. I had a phone in 1989.

[00:46:09]

Oh, wow. One of the big ones that went to a satellite?

[00:46:11]

It was actually connected to my car in 1989, and it was very advantageous. My friend Bill Blumenreich, he owned the Comedy Connection. He owns the Wilbur Theatre now in Boston. And I got a lot of gigs from him because he could call me when someone canceled. He had an advantage. Someone got sick and they said, hey, can you get to Dartmouth at 10:00 PM? I'm like, fuck, yeah. And so I got gigs from that. We joke about it to this day that I was the first guy that he knew that had a cell phone. It was a huge advantage. And I remember when I had one in '94, I was like, this is great. I can call my friends. I don't even have to be home. There were so many positives to it. And it gave you an advantage. It gave you an advantage. You didn't have to be home. If there was a business thing that I had to deal with, there was something going on with my career, I could deal with it on the phone at Starbucks or wherever I was. My fear is that this is going to be that times a million.

[00:47:07]

It's going to be that you have to have it in order to compete, just like you have to have an email today. You have to have a cell phone, too.

[00:47:15]

That's right. Yes, we are certainly headed that way. And I think the word human is a very good word to put on the table here. Some things seem human or inhuman. And when you simply connect people... Zuckerberg sometimes says, How could it be wrong to give more people more voice? If you're simply connecting people, making it easier for them to contact each other, I think that's mostly going to have good effects. And that happened with the telephone. We all got telephones, and we could do all sorts of things. We could coordinate with our friends. Telephones are great. But when it became not technology making it easier for this guy to reach you, or me to communicate with you, but rather a way to put things out to try to gain prestige, performing in front of thousands or maybe millions of people, now it changes all of our incentives. It changes the game we're playing. What games are we playing as we go about our day? And the more people are playing the game of, I'm struggling to get influence in an influence economy where everyone else is on these seven platforms, so I have to be on them too, or they have an advantage over me.

[00:48:21]

That is the way that things have been rewired already. Already we're there. Now you're raising the possibility that the next step is more hardware-based, that it's going into our bodies, and I think that is likely to happen. And so I hope what we'll do now, and I hope my book, The Anxious Generation, will promote, is a pause. Let's think where we are. Let's think what we've done. Let's look at what has happened. When our kids got on phones and social media, we thought, Oh, this could be amazing. They can connect. They can form communities. It's going to be great. And now it's clear, no, it's been horrible. It's been really, really terrible. As soon as they got on, their mental health suffered. They might feel like they have a community, but it's much worse than what it replaces. So I think what we're seeing is the techno-optimists, the futurists, who say, Oh, it's going to be amazing. We'll have Neuralink. We'll have all this technology. We'll be able to do everything. Here's where we have to heed, I think, the warnings of the ancients, of religious authorities, of those who warn us that we are leaving our humanity and we're stepping into an unknown zone where, so far, the initial verdict is horrible.

[00:49:33]

So if we keep going without putting on some brakes, yeah, I think we're going to a horrible place.

[00:49:40]

Yeah. My fear is that it won't be horrible.

[00:49:44]

Oh, it will feel good.

[00:49:45]

Yeah, that it would be amazing. So my fear, my genuine fear, is the rewiring of the mind in a way that can enhance dopamine, enhance serotonin, and do things that can genuinely make you feel better. In the short run. Yes. And that we will decide that this is a better thing. Regardless of how you feel about SSRIs, most people think that they're being dispensed too readily and that too many people that get on antidepressants could have solved that issue with exercise and diet, because a big part of the reason why people are feeling shitty in the first place is their body's failing.

[00:50:32]

Yeah, and having less sex. I read recently that the SSRIs are suppressing sex drive in many people. Yes.

[00:50:38]

So there's that. There's a lot of issues that come along with those, and yet there's an immense profit in making sure that people take those and stay on those. My fear is that if you can do something that allows people to have their mind function, have their brain, their endocrine system, have all these things function at a higher level, then everyone is going to do it. You would not want to just be natural and depressed if you could just put on this little headset and feel fantastic. And maybe it could be a solution to so many of our society's issues. Maybe bullying would cease to exist if everyone had an increase in dopamine. It sounds silly, but if dopamine increased by... Look, if you have an entire society that's essentially on a low dose of MDMA, you're not going to have nearly as much anger and frustration. And you also are not going to have as much blues. You're not going to have as many sad songs that people love. You're not going to have the literature that people write when they feel like shit. It's unfortunate, but also as a whole, as a society, it probably would be an overall net positive if people did not want to engage in bullying, did not want to engage in violence, did not want to engage in theft, were more charitable, more benevolent.

[00:52:05]

If you look at things in that direction, that's my concern: that the rewiring of the mind, which is essentially what we're seeing right now, is a rewiring we didn't anticipate. It has negative consequences. We thought about it in a positive way. Oh, this is going to be great. We're all going to be connected. How would it be bad that people could have more voices, like Zuckerberg says? But my fear is that it's going to just change what it means to be a human being. And my genuine feeling is that this is inevitable, and that as technology scales upward, this is unavoidable.

[00:52:42]

Right now, it certainly feels that way. And while I'm not optimistic about the next 10 years, I share your vision of what's coming, but I'm not resigned to it. People always say to me, I go around saying, We need to do these four norms, we can do them. And people say, Oh, that ship has sailed. Like, Oh, the train's left the station. But if a ship has sailed and we know that it's going to sink, we can actually call it back. I've been on airplanes where it leaves the jetway, and then they call it back because they discover a safety issue. So we are headed that way, I agree. But I think we can... I mean, we humans are an amazingly adaptable species. I think we can figure this out, and there are definitely pathways to a future that's much better. These technologies could, in theory, give us the best democracy ever, where people really do have the right voice. It's not just the extremes who are super empowered, as it is now. We're at a point in space and time, let's say right now, and I can imagine a future that's really fantastic.

[00:53:53]

But how do we get there? And are we able to get there? Is there a path? Or is it like there's no path from A to B? So I don't know, but I think we sure as hell have to try. And the first thing we have to do is not be resigned and just say, oh, well, the world's going to hell. What are you going to do about it? It's too big. So let's start. I have a proposal. Let's start with the one area that we can all agree on, which is our kids. It's the most amazing thing. In Congress, any issue, if the right likes it, the left won't, and vice versa, except for this one. This is the only thing in Washington that's really bipartisan. The senators and congressmen have kids. They see it. So let's test the proposition that all is lost and we're helpless. Let's test that proposition, and let's test it in the place where we're most likely to succeed, which is rolling back the phone-based childhood and replacing it with a more play-based childhood. Actually, I said there are four norms. We talked about three. If you don't mind, I'll put in the fourth norm now.

[00:54:54]

The first three are about phones. No smartphone before high school, no social media until 16, phone-free schools. But if you take away the phones and you don't give kids back each other and playtime and independence, what are they going to do? You're going to keep them at home all day long without screens? The fourth norm is more independence, free play, and responsibility in the real world. This is a thing that you and I talked about last time. I think we actually had a small disagreement, where I'm a big fan of Lenore Skenazy, the woman who wrote Free-Range Kids. She and I co-founded an organization called Let Grow. Parents, please go to letgrow.org. All kinds of ideas for how to help your kid have more independence, which makes them more mature, makes them less fragile. So this fourth norm, this is the harder one. This is the one where we have to really overcome our fears of letting our kids out. Actually, let me ask you. I think last time, I talked about this and I said letting kids go for sleepovers and spend more time with other kids, unsupervised.

[00:56:00]

And then you said, I think you said, No, I'm not letting my kid go to sleepovers because I don't trust the other families. Does that sound familiar to you?

[00:56:06]

I don't believe that's what I said. I think our concern was with people wandering around, with kids being free to walk home in cities.

[00:56:14]

Yes, you had that also. We did talk about sleepovers. My kids have sleepovers.

[00:56:18]

They've always had sleepovers. If you know the parents and you trust the parents, it's a great way to give the kids independence and have them interact with other people. Good.

[00:56:25]

So tell me, what was your policy with your kids, with all three, on when you let them out, like they could go out the door, get on a bicycle, walk seven blocks to a friend's house without any adult with them? Do you remember what age or grade?

[00:56:38]

No, I don't. I mean, it's fine if you live in a good neighborhood. The problem is, child predators are real.

[00:56:47]

Not really. Not anymore. What I mean is- What do you mean? Well, when you and I were growing up, there were childhood predators out there in the physical world, approaching children. I think you told a story about one who approached you when you were doing magic tricks. So there were child predators out there. That's true. They're all on Instagram now. The kids aren't out. And Instagram especially makes it super easy for them to get in touch with children. So this is my point. I can summarize the whole book with a single sentence: We have overprotected our kids in the real world and underprotected them online.

[00:57:26]

I would agree with that.

[00:57:27]

So yes, child predators are terrible, but guess what? We actually locked up most of them. When you and I were growing up, they weren't all locked up. There were just eccentrics who were exposing themselves. Remember flashers? That doesn't happen anymore, because if you do that now, you're going to jail for a long, long time. So we actually locked up most of the predators, and they know: don't approach kids on a playground, approach them on social media.

[00:57:49]

I don't know if we are doing that.

[00:57:50]

And there's this new push. Oh, yeah. No, once you're identified as a sex offender, you are gone for a long time, and then you're a registered sex offender. No, we've really done a lot since the '90s to make the real world safer.

[00:58:04]

But there is push against that. You're aware of this term, minor attracted persons that's being pushed?

[00:58:10]

Disgusting.

[00:58:10]

Disgusting and freaky. It's disgusting.

[00:58:11]

It's completely disgusting.

[00:58:12]

It's such a bizarre term that I have to imagine it's only being pushed by people who don't have children. And they're pushing this thing that it's an identity and that it's not the fault of the person who has this issue. What's the root of that? Have you investigated that?

[00:58:30]

Yes. Not that specific issue, but I can... Look, I study moral psychology. That's my academic discipline. I study the roots of it evolutionarily, historically, and in child development. What is our moral sense? And there are different moralities, and in some ways that's good, and left and right push against each other. I'm very open to different moralities. But when a group makes something sacred and they say this is the most important thing and nothing else matters other than this, then they can go insane, and they lose touch with reality. Again, I don't know the history of this particular movement, that horrible term. But there is a certain morality which is all about oppression and victimhood. And once someone somewhere said, Oh, men who are attracted to boys or little girls are victims, I don't know what. In some little eddy of weird morality, someone put that forward as a new victim class, because we've been trying to address victimhood all over the place. Once someone puts that up as a new victim class, you have to change the terms. This is very Orwellian. You change the terms.

[00:59:48]

Then some others who share this morality, which is focused on not making anyone feel marginalized, not allowing any labels that will slander someone or make them look bad... I think people who approach children for sexual goals, I am very happy to have them slandered and labeled and separated. But I suspect that some people, once this is framed as a group that's being marginalized, say, Well, we have to defend them, without thinking about what the hell they're actually saying.

[01:00:23]

It seems purely an academic thing. It seems like this is something with people that only exist in an academic space, where it's almost like an intellectual exercise in understanding oppression. You can't apply it in the real world. It's just too fucked up. The consequences of it are horrific. Yeah, that's right. Normalizing victimizing children.

[01:00:52]

That's right. Now, the one thing... Before we go any further with this particular topic, I would want to point out one of the problems that our social media world has given us, which is: somewhere in all of the academy, in all the universities, some philosopher, let's say, proposed that term or raised an idea. This has been going on for thousands of years. Someone in a conversation proposes a provocative idea: What if we think about this as a minor-attracted person? They put that idea out, and then other people say, No, that's really stupid, and it doesn't catch on, because this is not an idea that's going to catch on, even in the academy. But I think where we are now is, I'm guessing someone proposed this, somebody else got wind of it, posted it online, and now you have a whole media ecosystem going crazy about this terrible idea. So maybe can you look up minor-attracted person? Is this just a thing that came from one academic talk, or is this an actual movement?

[01:01:52]

Well, I've seen politicians discussing it.

[01:01:55]

No way. Wait, wait, wait. As in decriminalizing or destigmatizing it?

[01:02:00]

Oh, God. There was a recent politician that went viral for this discussion.

[01:02:06]

All right. They may be wrong.

[01:02:07]

It was more than one. There were two specific women that were doing that, and I didn't investigate whether these women had families or what it was, but this push is to try to alleviate bullying or alleviate shame or alleviate the stigma that's attached to what they're calling an identity.

[01:02:31]

Yeah, that's right. Actually, that brings us to the issue of identitarianism, which I think is a useful term for us these days. I think a lot of what's happened on campus is the move to focus on identity as the primary analytical lens in a number of disciplines, not in most disciplines, but in a lot of the humanities, the studies departments. So putting identity first and then ranking identities and saying some identities are good, some are bad. This really activates our ancient tribalism. I think that the liberal tradition, going back hundreds of years, is really an attempt to push back against that and to create an environment in which we can all get along. As I see it from inside the academy, we've always been interested in identity. It's an important topic. There's a lot of research on it going back many decades. But something happened in 2015 on campus that really elevated identitarianism into the dominant paradigm, not dominant in that most people believed it, but dominant in the sense that if you go against it, you're going to be destroyed socially. That's what cancel culture is. That's what Greg Lukianoff and Rikki Schlott's new book, The Canceling of the American Mind, is about.

[01:03:45]

So, yes, it's the people who are putting identity first, and that's their religion and their morality. I mean, they're welcome to live in the United States, but when they get influence in universities or in school boards, yeah, bad stuff will happen.

[01:04:01]

It's just bizarre the effect that it does have when people push back against identity politics. It's a small, very vocal minority that pushes this agenda, and it's not the majority of people. The majority of people mostly disagree with these ideas.

[01:04:20]

Yeah, absolutely. This is, again, a really important point about how our society has changed. Those of us from the 20th century still think in terms of public opinion: Do most people believe this or do most people not believe it? And most people are sane. Most people are not at all crazy. Most people are pretty reasonable. And I think what's happened since social media became much more viral in 2009, 2010, is that the extremes are now much more powerful, and they're able to intimidate the moderates on their side. So on the right, the center right, what I call true conservatives or Burkean, Edmund Burke conservatives, they get shot at and they get excluded, and there aren't many of them in Congress anymore. And on the left, you have the far left, the identitarian left, shooting darts into people like me, into anybody who questions. So they shoot their moderates. And what you have is, even though most people are still moderate and reasonable, our public discourse is dominated by the far right, the far left, and all these crazy fringes. I mean, it can be neo-Nazis on one side, and then these identitarians defending minor-attracted people on the other side.

[01:05:26]

So don't lose faith in humanity. Well, don't lose faith in humanity. Recognize that we've moved into this weird, weird world because of social media, in which it's hard to see reality and in which people are afraid to speak up. And so we get warped ideas rising to dominance, even though very few people believe them.

[01:05:48]

And I think this is where bots come into play. Yeah, they can really amplify it. Because I really do believe that this is being amplified, whether it's by foreign governments or by special interest groups or by whoever it is that's trying to push these specific narratives.

[01:06:04]

Absolutely. And this can bring us right back to TikTok and the national security threat. Vladimir Putin was a KGB agent in the 20th century. And the KGB, going back, I think it was in the '50s, they had a meeting or something where they decided that they were going to take what I think is called active measures. They were going to try to mess up American democracy. They'd spray-paint racial slurs. They'd put swastikas on synagogues. They saw that we're a multi-ethnic democracy. We were making a lot of progress towards tolerance. And the Russians, the Soviets, were trying to put a stop to that and make us hate each other. So they were doing that since the 1950s. And it was expensive. They had to fly people over or they had to try to win people over. You couldn't scale the operation. But that's the tradition that Vladimir Putin comes from. Now, the Soviet Union falls in 1991. I think he's in Berlin. I can't remember where he was, but he was very influenced by this and the humiliation of the Soviet Union. He rises to power again in the 21st century. Do you think he suddenly no longer wants to mess with American democracy?

[01:07:10]

Did he suddenly drop that desire? We basically handed him the tools. We said, Okay, you can open as many Facebook accounts as you want, Twitter accounts. Open as many as you want. There's no identity authentication. There's no age verification. Create bots all you want and have them mess with us. And Renée DiResta has a book coming out soon. She really did amazing work to get to the bottom of this. They started running tests in 2013. They created accounts on all these platforms long before, but they started running tests. Could they get Americans to believe that an explosion had occurred at a refinery in Louisiana? Yes, they made it all up and people believed it. Could they get Americans to believe some extreme BLM post that was completely outrageous? Yes. Same thing to enrage people on the left. We know that the Russians are messing with us. We know that the Russians know our weak point. And by Russians, again, I don't mean the Russian people. I mean Vladimir Putin. The government. The government. We're handing them the tools and the instruction book for how to divide us, how to weaken us, how to make us lose our resolve and our will.

[01:08:21]

Have you seen Yuri Bezmenov give a speech?

[01:08:23]

Oh, is that the- Yeah. Yes.

[01:08:25]

I've seen that. That conversation about ideological subversion. That's incredible. And he did this in the 1980s. I think it was '84. And he was talking about how the work is already done. That's right. And that it's just a matter of these generations now going into the workforce with Marxist ideas and with all this ideological subversion that the Soviet Union had injected into the universities.

[01:08:46]

That's right. That could be right. I mean, it is chilling to watch, and it is prophetic. But they were playing a long game. I mean, the communists planning the communist revolution, they were patient, and they were playing the long game.

[01:08:58]

Yeah, as is China. Yeah, that's right. They're very smart. There's so much more... Because they're a dictatorship, they have complete control over what they choose to do. They don't have to meet with subcommittees. They don't have to have congressional hearings. They can just do it.

[01:09:18]

Okay, that's a good point, because that brings us to the big difference between democracies and autocracies. Back in the 1930s, the West was in economic collapse. There was the Soviet Union, and then the Italian fascists, and then Hitler, the German fascists. They were making rapid economic progress. The criticism of democracy has always been: It's chaotic. There's no good leadership. They can't plan ahead. And that's all true. But why did we triumph in the 20th century over all these other models? Because democracy gives us a degree of dynamism, where we can do things in a distributed way. We have people just figuring stuff out. We have an incredibly creative economy and business sector. And so democracies have this incredible ability to be generative, creative, regenerative, unless you mess with their basic operating system and say, let's take this environment in which people talk to each other, share ideas, take each other's ideas, compete, try to build a better company. Let's take that. And let's change the way people talk so that it's not about sharing information. It's about making them spend all day long, nine hours a day, competing for prestige on social media platforms.

[01:10:37]

And in a way that empowers everyone to complain all the time. This, I think, really saps the dynamism. What I'm suggesting, and I haven't thought this through, is that whatever the magic ingredient was that made democracy so triumphant in the 20th century, Western liberal democracy, American-style democracy, whatever made it so triumphant, is being sapped and reduced by the rapid rewiring of our society onto social media.

[01:11:05]

Yeah, I would agree with that. And I think it's also being influenced, again, by these foreign governments. Absolutely. That have a vested interest in us being at each other's throats.

[01:11:12]

Why wouldn't they? It's so cheap.

[01:11:15]

It's so cheap, it's so effective, and it seems to be the predominant way that people interact with each other. That's right. Wouldn't you say that you've been attacked? What have you specifically been attacked about?

[01:11:27]

Oh, it's just in the academic world, if you say anything about any DEI-related policy, you'll be called racist or sexist or homophobic or something. I was always on the left. I was always a Democrat. Now I'm nothing. I'm an extremely alarmed patriotic American citizen who sees my country going to hell.

[01:11:53]

I'm in that camp.

[01:11:54]

A lot of us are. A lot of us are politically homeless now. Yeah. But I started my career in political psychology. My original work was on how morality varies across cultures. I did my dissertation research in Brazil, and then I did some work in India. It was only in the '90s that our culture war heated up, and I began to see that left and right were like different countries. We had different economics textbooks, a different American history, a different US Constitution. It was like different worlds. I began actually trying to help the left stop losing elections, like in 2000, 2004. As a Democrat, I thought I could use my research in moral psychology to help the Democrats understand American morality, which they were not understanding. Al Gore and John Kerry did a very bad job. So I've all along been critical of the left, originally from within the left. And that's a pretty good way to get a bunch of darts shot at you. Nothing terrible ever really happened to me. Lots of people have been truly canceled, shamed, lost their jobs, considered suicide. So nothing like that has ever happened to me. But when there's some minor thing on...

[01:13:09]

People take a line out of one of your talks, they put it up online with a commentary about what an awful person you are. Thousands of people comment on it or like it or retweet it. It hurts. It's frightening in a way like nothing else I've ever known.

[01:13:22]

And how many of those people were even real people? Yeah, that's right.

[01:13:25]

This is the real question.

[01:13:26]

That's right. Because it really is in dispute. It was one of the major disputes when Elon bought Twitter.

[01:13:31]

That's right.

[01:13:32]

I mean, one of the things that's come out of Elon buying Twitter, and thank God he did, as much as people want to talk about the negative aspects, which are real. I've seen racism and hate go up on Twitter. I've seen it being openly discussed, which is very disturbing. But what we did find out is that the government was involved in this, that the federal government was interfering with people's ability to use these platforms for their speech.

[01:13:59]

Because of COVID, yes, that's right.

[01:14:01]

But I feel like that's just a test run. If they're able to implement that for that, then they can implement it for so many different things: dissent about foreign policy issues, dissent about social issues. There's so many different ways they can do it, if they can somehow or another frame it in a way that this is better for the overall good of America.

[01:14:22]

Yeah, that's right. So that's why I never talk about content moderation. I'm not interested in it. There has to be some, but most people focus on the content, and they think if we can clean up the content or change the content, or, in those Senate hearings we saw a couple of months ago, if we can reduce the amount of suicide-promoting or self-harm-promoting content that our kids are seeing, then all will be well. No, it's not primarily about the content. I agree with you that the government was influencing these platforms to suppress views they thought were wrong, some of which turned out to be right. I'm a big fan of my friend Greg Lukianoff, who runs the Foundation for Individual Rights and Expression. So I think we shouldn't be thinking about social media like, well, how do we keep the wrong stuff off and only have the right stuff. I think almost only about architecture. How is this platform designed? And can we improve it in ways that are content neutral? Can we improve it in ways that aren't going to advantage the left or the right, but are going to make it more truth-seeking?

[01:15:31]

And so Frances Haugen, the Facebook whistleblower, when she came out, she had all kinds of ideas about settings, things that Facebook could have done to reduce the incredible power of the extremes. The farthest-right 3%, the farthest-left 3%, and then a bunch of just random weirdos who post a lot. They have extraordinary influence. And that's not a left-right thing. That's about: do we want an information ecosystem that super-duper empowers the extremes and silences the middle 80%? Hell no. So that's the regulation that I favor, focusing on making these platforms less explosive and more useful.

[01:16:09]

And there's also this discussion that comes up a lot about algorithms. Algorithms have essentially changed the entire game, because it's not just what's online, it's what you interact with more frequently, and that's accentuated. And the problem with that is most people interact with things that rile them up. And so you're developing these platforms that are immensely profitable that ramp up dissent and ramp up anger and ramp up arguments. And in the case of yourself, instead of just debating you on these issues and doing it in a good-faith manner, Jonathan Haidt believes this, this is why I disagree, I think this and that, instead, they'll label you as whatever. That's right. Racist, sexist, homophobic, Islamophobic, xenophobic, whatever they can say, whatever pejoratives they can throw at you, this reductionist view of your perspective that makes it incredibly negative. That's right. And then you'll get bots that interact with that, that push that.

[01:17:18]

That's right. So Twitter only went to algorithms, I think, in 2017. Before then, it was just people who would tweet a lot. People talk a lot about algorithms as the cause of the whole problem. And they're not the cause of the problem, but man, are they amplifiers. And I think that's what you're saying. They're just super-duper amplifiers of whatever craziness would be there even without them. And so that certainly is shaping what we receive, what our children receive. And so this is some of the stuff that I think, again, we have to really protect our children from. To have a company able to microtarget their exact desires, even when they don't know what their desires are, it's a degree of control and influence over children in particular that I think they should just be protected from.

[01:18:14]

Do you think that if you looked at algorithms, do you think that it's an overall net negative? Could the argument be made that algorithms should be banned?

[01:18:25]

Yeah, no, I don't think so. I mean, algorithms are there for a reason. We all know on Amazon, if you look up a book, it's going to suggest some other books you might be interested in. And it's pretty darn good. Like, yeah, you're right, I would be interested in that. So no, I would never say, oh, we can't have algorithms. I mean, that would just be a Luddite move to make. I think, again, as a social psychologist who studies morality, I just see everything going up in flames. Here's a metaphor that I sometimes use. Suppose you're the California Department of Parks and you have a hundred years of experience fighting forest fires. You know everything about the wind, the humidity, the seasons. You've got it down to a science, and you're doing the best you can to keep forest fires under control. And then one day, God decides to just mess with the world and changes the atmosphere from 20% oxygen to 80% oxygen. And if we suddenly were in a world where 80% of the atmosphere was oxygen, everything would go up in flames. Every electronic device would be burning right now. So that's what happened after 2009, 2010.

[01:19:28]

That's what happened once we switched over to be about... I would say the retweet button, that move to virality, is even more guilty of causing the problems than the algorithms. I don't know that it's necessarily one versus the other, but that's the way I see it. We're in a world where the technology is so quick to ramp up whatever will most engage us, and that's mostly emotions such as anger. So, yeah, that's why it feels like everything's burning.

[01:19:58]

And it doesn't seem like it's slowing down. It seems like it's ramping up, and it seems like they've gotten more efficient at the use of algorithms and all these different methods, like retweeting and reposting and different things that accentuate what people are upset about and what people get riled up about.

[01:20:18]

Yes, I think it is accelerating, and for two reasons. One is that it's just the nature of exponential growth. It's the nature of progress. I think in the 19th century, a guy named Adams gave us the Adams Curve. He was noticing, like, wow, the amount of work we're able to do now that we're harnessing steam and coal keeps growing and growing and growing. At some point, it's going to be going up so fast that it'll go up an infinite amount every day or something. You reach an asymptote, you reach a point at which it's insane. Many people think that we're now at the singularity. We're at the point at which things are changing so fast that we just can't even understand them. We haven't yet mentioned the word AI. Now you bring in AI, and of course, AI could unlock extraordinary material progress. Marc Andreessen has been arguing that. But as a social scientist, I fear it's going to give us material progress and sociological chaos. It's going to be used in ways that make our already unstable social structures and systems even less stable.

[01:21:22]

Well, what's very bizarre that we're seeing with the initial implementation of it, specifically with Google's version of it, is that it's ideologically captured.

[01:21:30]

That was so horrible. And that was so irresponsible of Google to do. So no, I'm glad we have a chance to talk about this, because I'm really horrified by what Google did in introducing Gemini. And just to give a little background here. I'm sure many of your listeners know Google Gemini was programmed to answer in ways that basically the most extreme DEI officer would demand that people speak. And so if you ask for a picture of the founding fathers, they're multiracial or all Black. Even Nazis had to be multiracial or Black. There's two things to say about this. The first is that Google must be an unbelievably stupid company. Did nobody test this before they released it to the public? Obviously, Google is not a stupid company, which leads me to my next conclusion, which is: if Google did such a stupid, stupid thing, so disgraced a product that it's banking so much on, it depends a lot on the success of Gemini, and alienated half the country right away, practically on the first day, they couldn't be that stupid. I think what's happening to them is what happened to us in universities, which is what I've called structural stupidity.

[01:22:42]

You have very smart people. But if anyone questions a DEI-related policy on campus, they get attacked. That's what most of the early blowups were. I think you probably had Bret Weinstein on here. That's what happened to Erika Christakis and Nicholas Christakis at Yale. If people wrote these thoughtful, caring memos opposing a policy, there would be a conflagration, they'd be attacked, and they would sometimes lose their jobs. So that's what happened to us in universities in 2015, to usher in our now nine years of insanity, which I think might be ending. I think last fall was so humiliating for higher ed that I think we might be at a turning point. But my point is, for Google, I suspect that Google was suffering from an extreme case of structural stupidity, because surely a lot of those engineers could see that this is terrible. This is a massive violation of the truth. And part of Google's brand is truth and trust. So I suspect they were just afraid to say anything. And that's why Google made this colossal blunder of introducing woke AI at a time when we desperately need to trust our institutions that are related to knowledge.

[01:23:56]

And Google was trusted, and now they've lost a lot of it.

[01:23:59]

And it's not just Google. It's ChatGPT.

[01:24:01]

But ChatGPT is not as explicit.

[01:24:04]

It's not as explicit, but it does do certain things. If you ask it to say something positive about Donald Trump, it refuses. You ask it to say something positive about Joe Biden, it'll gaslight you.

[01:24:14]

No, that's right. And there was recently, was it David Rozado or whoever it was who put out some listing of how far left each of the different AI products is. So you can certainly say that ChatGPT is not politically neutral, but you wouldn't say from that that the people at ChatGPT or OpenAI are stupid. You would not look at this product and say, How could they be so dumb as to have it be left-leaning? But with Google, you have to say, How could they be so dumb as to produce Black Nazis for us?

[01:24:46]

Right. I just don't think they played it all out. I think this ideological subversion, this thing that they've done with DEI and with universities and the education system, it just seemed like they had to apply that to artificial intelligence, because you're essentially giving artificial intelligence these protocols. You're giving it these parameters in which it can address things. And if you're doing it through that lens, this is the inevitable result of that. You're going to get Black Nazis.

[01:25:22]

Oh, no, I don't know about the Black Nazi. I don't think it goes that extreme.

[01:25:26]

So to the extent that- But if you say DEI, if you apply that to everything across the board and don't make exceptions in terms of historical accuracy, the founding fathers of America being all Black.

[01:25:40]

Yeah. Again, I'm not an expert in AI, but large language models are basically just consuming everything written and then spitting stuff back out. It might just be a reflection of what most of the written material is like. The people on the left are dominant in universities. They probably publish more books, whatever.

[01:25:57]

But there's nothing written about Black Nazis. That's right.

[01:26:00]

What I think is going on here is that I could see AI seeming to lean left even if it wasn't programmed to lean left. That might just be the data input that it takes. But to get Black Nazis, somebody had to program in those commands. Somebody had to consciously say anything about representation, everything's going to look like a Benetton ad. No, it's not even like a Benetton ad. Benetton ads had much more diversity in the 1980s and '90s. I would agree that in the Gemini case, clearly someone deliberately programmed in all kinds of rules that seem to come from a DEI manual, just without much thinking.

[01:26:35]

How do they come back from that?

[01:26:38]

I don't know. That's a good question. I don't know how deep the rot runs. I don't know how bad things are. Google used to have an amazing corporate culture. Oh, boy, look at this.

[01:26:46]

Apple is in talks to let Google Gemini power iPhone AI features. Oh, my God. Go back. Oh, sorry.

[01:26:53]

I was adding that, too.

[01:26:54]

Yeah, go back. Companies considering AI deal that would build on search pact. Apple also recently held discussions with OpenAI about a deal.

[01:27:05]

On this news, a big investment happened, too.

[01:27:10]

Magnificent Seven adds 350 billion on Gemini's reported iPhone deal. So because Google has implemented AI into their phones, specifically Samsung. Samsung's new Galaxy S24 Ultra has a bunch of pretty fantastic AI features, one of them being real-time translation, the ability to summarize web pages instantaneously, summarizing notes, bullet points, very helpful features. Another one is the ability to circle any image and it automatically will search that image for you. Like, what is that? Circle it. Boom. The Samsung phone will immediately give you a result and tell you what it is. So very helpful. But now this becomes something that Apple has to compete with. So Apple has decided to try to implement AI, but it has to outsource.

[01:28:05]

Yeah. No, it is alarming. I guess the point that I'd like to add on, which I hope will be useful for people, is part of what we're seeing across our institutions is a loss of professional responsibility, a loss of people doing their jobs. I don't mean base-level employees. I mean leadership. Institutions have important roles to play. Companies have missions. Universities must be completely committed to the truth, research, discovery. Journalists must be committed also to the truth and to methods for finding the truth. And what we've seen in the 2010s, especially, is many of these institutions being led away from their mission, their purpose, towards the political agenda of one side or another. So I think this is what we're seeing, and if we're going to make it through this difficult period, we need some way to find the truth. The more we've gone into the internet age, the harder it is to find the truth. It's incredible, we just say, Hey, look this up, and we've got it. But on anything contested, it's just very hard to find the truth. And so that's why I'm especially disappointed in Google. I always loved Google.

[01:29:21]

I thought it was an incredible company. And for them to so explicitly say, Our mission is political. It's not to help you find the truth. That, I thought, was so disappointing.

[01:29:32]

Yeah, it is disturbing when a large company decides their mission is political. To which side? To who? Is it the truth? Is that your main politics? Or is it you decide that one side is good overall, net positive, the other side is net negative, and whatever you can do to subvert that other side is valuable?

[01:29:55]

That's right. And so that's a mindset in which the ends justify the means. Yeah. And so part of the genius of American liberal democracy was to calm down those tribal sentiments to the point where we could live together, we could celebrate diversity in its real forms, we could get the benefits of diversity. And that was all possible when we didn't feel that the other side was an existential risk to the country, that if the other side gets in, it's going to be the end. And that's a very powerful image. And that's an image that helped Donald Trump win. There was an essay, I think by Michael Anton, called The Flight 93 Election. If you're on Flight 93 being hijacked to crash into Congress, and if you do nothing, you're going to crash into Congress, you'll do anything. And so he framed it as a Hail Mary pass, that patriotic Americans were supposed to vote for Donald Trump. That mindset of the ends justify the means, that the situation is so dire that even violence is justified, that is really frightening. And that's my concern, is that we could be headed that way.

[01:31:01]

We have not had much political violence. There's been an uptick, but very little compared to, say, 1968 to '73. That period was much more violent. So I'm hopeful we'll avoid that. But once you say the ends justify the means, and we can cheat, we can lie, we can subvert the company's purpose because the end we're fighting for is so noble, well, the other side is going to do the same thing. And before you know it, your culture war becomes a real war.

[01:31:26]

Yeah. And you're seeing that in the news, how it's implemented in the news. I'm sure you're aware of this recent Donald Trump speech where he talked about a blood bath.

[01:31:35]

Oh, God.

[01:31:36]

What the actual phrase was. See if you can find that, Jamie, because it's actually important to highlight how not just inaccurate but deceptive the media was in their depiction of what he said, and that they were taking this quote out of context and trying to say that there's going to be a civil war if he doesn't get elected, which is not what he was talking about at all. See, pull it up, because it's so disturbing that they would... First of all, that they would think that they could get away with it in this day and age, with all the scrutiny and with social media and all the independent journalists that exist now, which is one of the more interesting things about the demise of corporate media, the demise in trust. Trust in corporate media is at an all-time low. And so this has led to a rise in true independent journalists, the real ones out there, the Matt Taibbis, the Glenn Greenwalds, the people that are actually just trying to say, what is really going on and what are the influences behind these things and why are these things happening? But this one was bizarre.

[01:32:46]

When I saw it, then I saw the actual speech. Let's play the actual speech. Yeah, I have the actual speech. The headlines are different, but I'll just play this. Let's play the actual speech. To China, if you're listening, President Xi, and you and I are friends, but he understands the way I deal, those big monster car manufacturing plants that you're building in Mexico right now, and you think you're going to get that, you're going to not hire Americans, and you're going to sell the cars to us. Now, we're going to put a 100% tariff on every single car that comes across the line, and you're not going to be able to sell those cars. If I get elected... Now, if I don't get elected, it's going to be a bloodbath for the whole... That's going to be the least of it. It's going to be a bloodbath for the country. That'll be the least of it. If this election, if this election isn't won, I'm not sure that you'll ever have another election in this country. Does that make sense? I don't think you're going to have another election in this country. If we don't win this election, I don't think you're going to have another election or certainly not an election that's meaningful.

[01:33:47]

And we better get out or we better... I actually say that the date, remember this, November fifth, I believe it's going to be the most important date in the history of our country. I believe that.

[01:34:02]

So that's what he said. Well, that sounds pretty bad. That sounds like the Flight 93 election argument. If I don't win, the country is over.

[01:34:09]

Yeah, but what he's talking about is this subversion of our economy and the subversion of our democracy, that we will never have an election again. I don't think he's saying that it'll be a bloodbath in terms of a civil war. He's saying the economy is going to be destroyed.

[01:34:25]

I was listening for that. I was thinking maybe he meant it as a metaphor. It's a long speech. I didn't hear any. The bloodbath is- It's an unfortunate term, but I don't think he's saying it's a civil war. It sounded to me like he was. It sounded to me like, if he doesn't win, there will be violence. You have to really give him a hell of a lot of benefit of the doubt.

[01:34:46]

He's talking about the economy.

[01:34:47]

He was talking about- No.

[01:34:49]

He was talking about China- He was talking- Building plants. He's talking about all these things and saying that if he doesn't get elected, it's going to be a bloodbath. It's going to be a mess. I think he would elaborate on that if he was saying there'll be violence. I don't think that's what he's saying. I think he's saying destruction of our economy, the destruction of our...

[01:35:08]

He makes a lot of asides. So he was talking about the economy. That's true. And then he said, if I'm not elected, and then he makes an aside about what would happen to the country. Look, we might disagree on this. We surely disagree on our prior. It's surely the wrong way to say it.

[01:35:24]

Surely. We both agree on that. It's an unfortunate term to use.

[01:35:28]

For him to, yes, that's right. But it doesn't sound to me as though the media took that one out of context.

[01:35:32]

I just rewatched the longer video on closed captioning.

[01:35:36]

The video we watched cuts it off right after he says bloodbath. It does continue to say.

[01:35:41]

It's going to be a bloodbath for the whole... That's going to be the least of it. It's going to be a bloodbath for the country. That'll be the least of it. But they're not going to sell those cars. They're building massive factories. A friend of mine, all he does is build car manufacturing.

[01:35:56]

Okay, so he's back on the economy. Yes. He was talking about the economy there. But the aside was not about the economy. The aside was him making one of these typical asides about how important he is. All right, Joe, I think we're not going to settle this. Look, I do agree that the media, as a progressive, left-leaning institution like universities, has violated its duty to the truth many times and thereby lost the trust of much of the country. Most of the people who work in these industries, I think, are wonderful and are trying to do a good job. But the net effect, and this is my point about structural stupidity, is that during our culture war, institutions that have had very little viewpoint diversity have been subject to hijacking by those with a political agenda. So I agree with you about that, although I disagree with you about what that comment from Donald Trump meant. It sounded to me like it was not taken out of context.

[01:36:54]

Well, he was talking about the economy, though, specifically.

[01:36:58]

I know, but in the aside, he wasn't.

[01:37:00]

In the aside, he elaborated about the economy.

[01:37:04]

No, he just makes this aside about the bloodbath. But that's the least of our problems. Now, back to what I was saying about the economy. All right, look, we're not going to settle this one.

[01:37:12]

It's a terrible term. It's a very unfortunate term. If he had said it would be a disaster instead of a bloodbath, that would have been the better term.

[01:37:21]

Yes, that would have been a reasonable thing to say.

[01:37:23]

But he's filled with hyperbole. He's talking about... He's trying to excite people about the idea. You're right.

[01:37:27]

Words matter when you're a presidential candidate. They do.

[01:37:29]

You're right. But no argument there. I'm in no way saying that that was the correct thing to say. But the way they phrased it, the way they just tried to make it seem like that was the only thing that he was talking about.

[01:37:45]

Okay. I'm just not going to say anything else on this.

[01:37:47]

I get it. But what you're saying is that these people are good people, but that they are ideologically captured. Is that what you're saying?

[01:37:57]

What I'm saying is that most people are, wherever you go. But in the social media age, it's no longer about what most people are like. It's about how much power the extremists have, because anyone now has the power to hijack, threaten, intimidate. That's my concern. That means it's actually more easily fixable, because it would be one thing if 90% of journalists were rabid left-wingers who didn't give a damn about journalistic integrity, but that's just not true. Most of the journalists I've met are really good journalists. They really care about sourcing and accuracy. It's the same with professors. Many people, especially those who listen to conservative sources, might think that professors are mostly tenured radicals who care more about Marxism than about educating their kids. That's just not true. What is true is that the minority that have extreme views now have a much bigger platform. They have more power. But most people are reasonable wherever you go.

[01:38:53]

Is the issue that the reasonable people are afraid of pushing back against the radical people?

[01:38:58]

Yes, exactly. That's it.

[01:39:00]

That's the issue. Because there really are consequences in terms.

[01:39:03]

That's right. People say, Well, you've got tenure. What are you worried about? The answer is, Yeah, we've got incredible security, but everybody is afraid of being publicly shamed, humiliated, attacked, and mobbed. The people who go through it, I mean, it's incredibly painful. They have to take sleeping pills at night. They sometimes contemplate suicide, and in one case that I know of, committed suicide. So, yes, that's exactly the problem. That's, I think, the effect not of the original social media platforms like MySpace or early Facebook, but of the hyper-viral ones that we got in the 2010s. Yes.

[01:39:38]

And the result of that, in terms of people terrified of being attacked, is what you got with those people from Penn, from Harvard. We're talking about this rampant anti-Semitism on campus, where people were actively calling for the death of Jews, saying that this does not constitute harassment unless it's actionable, which is just- Yes, that was stunning.

[01:40:05]

Insane. It's not wrong unless they act on it.

[01:40:07]

What is that like as a person when you are an academic and you are a professor, when you see that from these, especially from somewhere like Harvard.

[01:40:18]

Yes, I'm a professor at NYU. I was at UVA for 16 years. I love being a professor. I love universities. I'm also Jewish, and I can understand the argument that those presidents were making. The argument was a very narrow technical argument about whether students should be allowed to say, From the river to the sea, Palestine will be free. I understand why it would have been reasonable for them to say, Well, we're not going to punish students for saying that. That is political speech that's protected under the First Amendment. I understand the point that they were making, but they were such screaming hypocrites in making that point, and this is what The Coddling of the American Mind was all about. How did it happen that if a professor or administrator writes a single word that a student objects to and calls racist, suddenly this person is out of a job? Really? You're going to fire someone or let someone be tormented and fired because they said something that someone interpreted in a certain way? That led us to be super hyper crazy sensitive about every word we say, because you never know when it'll explode and cause a scandal. For the presidents to say, Oh, yeah, anything anyone ever said between 2015 and yesterday would be punished if anyone was bothered by it.

[01:41:39]

But from the river to the sea, oh, yeah, sure. That's constitutionally protected.

[01:41:41]

It wasn't just from the river to the sea. It was the literal expression, death to Jews.

[01:41:48]

Yes, that's right.

[01:41:49]

That's what they were specifically defending, saying unless it's actionable, which is insane. Unless you commit actual genocide.

[01:41:56]

Is that what you're saying? No, I'm sorry, Joe, you're right. The question... Right. The deeper question is about political speech, but you're right that, as Stefanik, I believe, was asking them, it was about calls for genocide. Yes. Calls for genocide, it seems to me, again, I'm not a First Amendment lawyer. Maybe on the First Amendment, legally, you can't be arrested for it. But for God's sakes, on a university campus where you're trying to make everyone feel included, you can't even comment, not just about the calls for genocide, but about the actual events on October seventh. That, I think, is what really brought higher ed to a nadir, a low point in public esteem, like literally a low point in public esteem.

[01:42:37]

I think it was a wake-up call for a lot of people that were on the fence about how big the issue is. Because these are the same people that call for you being kicked out of the university if you deadname someone. Yeah, that's right. These are the same people that, if you use the wrong pronouns...

[01:42:56]

Yeah, that's right. I'm actually... So last semester was the worst one ever for higher education. Data from Gallup and Pew show that the public... Higher ed used to have an incredible brand, a global brand. We were the best. Everyone wanted to come here. Scientific innovation, all the top academics were here in the United States. In 2015, people on the left had a very high opinion of higher ed, and actually people on the right had a moderately high opinion of it. Then since 2015, it's dropped not just among people on the right, but among centrists and moderates as well. Higher ed really lost the trust of most of the country. I was running an organization called Heterodox Academy. I started it with some other social scientists, and it advocates for viewpoint diversity. That's why I was a target sometimes, because here I am saying we need viewpoint diversity. We need some conservatives, some libertarians. We need to not all be on the same side politically.

[01:43:50]

Which is an amazing thing to fight against.

[01:43:53]

That's right. We're the experts in why diversity is beneficial, and the important diversity, it turns out, is viewpoint diversity.

[01:44:03]

Well, it's also the most important aspect of an open and free society is the ability to debate things.

[01:44:08]

Yeah, that's right. Democracy is based on it.

[01:44:10]

And find out who's right or whose ideas resonate the most, who makes the most sense, who has thought about this further, and who has the more enlightened and educated perspective, who has more information, who has more balance. That's right.

[01:44:27]

I think we hit a low point in the fall in such a way that I'm actually optimistic that things are going to change. Because I've been concerned about these issues in universities, the culture issues, since 2014, 2015, when Greg Lukianoff and I wrote our first Atlantic article, titled The Coddling of the American Mind. Every year, it's gotten worse and worse and worse. There was never a turnaround until last year. As with the Emperor's New Clothes, people can see that something is stupid and crazy and wrong, but they won't say anything. But then when somebody does, everybody can speak. For the first time since 2015, I'm feeling that people understand: You know what? Wait, that was crazy. What happened to us? That was crazy. People were saying crazy stuff. Let's put our heads above the parapet. Let's start sometimes saying, Maybe that is not right. So I think that things are actually going to turn around, maybe not at the Ivies. Although there are movements of faculty there saying, no, let's return to academic values, the pursuit of truth. So what I'm hoping, what I think is likely to happen, is we're going to see a split in the academic world.

[01:45:30]

That is, there are already schools like Arizona State University. There are schools that have basically said no to all the crazy stuff, and they're focusing on educating their students. I think we're going to see more students going that way. The University of Chicago is another model. I think there are a few schools that departed while almost all the other schools went in the same direction. But I think now that's going to change, and it can change actually pretty quickly, because most of the university presidents don't like this stuff. I've spoken to many of them. All the crazy politics, the activist students, it made their jobs very difficult. So I'm actually hopeful that we're going to start, and we are starting, to see some university presidents standing up and saying, it's not okay to shout down every conservative speaker. No, we're not going to allow that. So we'll see. If I come back on in a year or two, we'll see. But I think things are actually beginning to get better for the first time since 2015.

[01:46:22]

Well, I hope you're correct. And I do agree that the pushback was so extreme that some action is likely to take place. I think the first step of that has got to be to allow people with differing perspectives to debate and not shout them down. And also to show that shouting people down and setting off fire alarms is shameful.

[01:46:46]

It's disgraceful. That's right. That's where we have to get to.

[01:46:48]

In higher education institutions.

[01:46:49]

That's right. If there was any punishment, the students would change very quickly. The students are very concerned about getting a job, about their futures. And what the early presidents who didn't do anything conveyed was: You can yell and scream all you want. Nothing will happen to you. You can bang on the glass and frighten speakers. Nothing will happen to you. You can throw rocks through windows. Nothing will happen to you. And of course, that just brought us more obnoxious behavior on campus and shame to higher ed in the eyes of the country. So we had a brand that was based on extreme excellence and truth. I think we damaged our brand very severely. I think finally now there's a reckoning and a realization of what we've done. I think we're going to see a recovery, an uneven recovery. But I do think that a year or two from now, the mood... Who knows what's going to happen with the election and whether there'll be a bloodbath. No, don't take that out of context. I was just referring to the early part of our conversation, which you're not quoting when you quote this. Yeah, let's say disaster.

[01:47:51]

Yeah, disaster. It could be a disaster. But about certain things, I'm actually pretty pessimistic like you. At least on the future of universities, I do think for the first time I'm actually optimistic. I wasn't optimistic a year or two ago.

[01:48:02]

Well, that's great, because you're on the ground, so you would really understand more than most. Do you sense that with students, there's also a recognition that this is a gigantic issue? What was the reaction of students? I mean, not specifically Jewish students, but Jewish students must have been the most horrified by this.

[01:48:23]

Oh, my God. Yes. Stabbed in the back is the way many of us feel. What I've found all along, as I say, is most people are reasonable. When all this stuff was breaking out in 2015, 2016, most students just wanted to get an education. They didn't want to take part in this. Right. And now I find... Of course, I teach in a business school. I teach at NYU Stern. Our students are pretty pragmatic. They want to get a job. Most of them are from immigrant backgrounds. They're not here to protest the latest political-They're here to succeed. They're here to succeed. That's right. So that is an aspect of Gen Z that gives me hope, is that they see the problems. They see the problems with social media. They see the problems with the extreme activists. What we have to change is not the average student. We have to change the dynamics so the average student feels freer to speak up.

[01:49:10]

How can that be done?

[01:49:11]

I founded two organizations to do that. One is Heterodox Academy. We need more viewpoint diversity among the professors, or at least we need more toleration of people who are centrist or libertarian. That's the faculty side, what we need to do, and also the culture on campus. But I also co-founded another organization called the Constructive Dialogue Institute with a woman named Caroline Mehl. What we did is we took some of the insights of moral psychology and some of the content from my book, The Righteous Mind, and it evolved. It's now six 30-minute modules that teach you about moral psychology. Why are we divided? What do liberals believe? What do conservatives believe? Why do conversations go wrong? How can you start more skillfully? Why do you need to listen first? There's a lot of Dale Carnegie wisdom in there, and it's really effective. If people go to constructivedialogue.org, the program is called Perspectives. It's being used in, I think, more than 50 universities now. So there are things that we can do, but it's going to take leadership and good psychology.

[01:50:12]

That's so important, what you just said. And I think that if those programs gain momentum, people will recognize that it's really beneficial to all to have these ideas debated. If you truly believe that ideas opposing yours are evil, you should be able to debate them. The only way to do that is for people to have the ability to express themselves and for you to counter the points that they make.

[01:50:41]

Exactly. This is what many commentators on the left have been pointing out since 2015. Van Jones has an amazing talk. He's a progressive, Democratic, well-connected, smart person. There's a great talk he gave at the University of Chicago. I have a quote on this in The Coddling of the American Mind, where he talks about how the move to protect students from bad feelings, the move to protect their emotional safety, is really bad for the students. But then his talk goes on, and he says, This is actually really bad for the Democrats. It's really bad for young activists to drown out opposition, to not listen to the arguments, to not get stronger. A lot of what's happened on campus, I think, is what you might call a Pyrrhic victory. A Pyrrhic victory is one where you won the battle, but that made you lose the war. I think when your side is able to wipe out opposition, it might feel like a victory at first, but it's ultimately going to weaken you. The same thing is going on on the far right. There's a lot more fear and really bad consequences for people who dissent on the right, too.

[01:52:00]

But if we're talking about universities, that's more an issue of what's been happening on the left.

[01:52:04]

Are there any universities that don't have a left-leaning perspective? Sure. What universities?

[01:52:12]

Not in the top 20 or 50, I would say. That's a problem. That's right. Yeah, it is. But let's put it this way. First of all, there are lots of religious universities, Christian universities, that don't have this problem. Large state schools tend to have much less of it because, again, most people are reasonable. The great majority of faculty want to do their research and teach their classes. They don't want to get involved in this stuff. For some reason, the problem is especially severe at the Ivy League schools. That's what's really surprising. I thought it was just the elite schools generally, but no, the Ivies are the places where the worst anti-Semitic threats, intimidation, and even some violence, or threats of violence, are happening. Something about the Ivies makes them more extreme.

[01:52:54]

What do you think that is?

[01:52:56]

Well, I think it's in part the region. Greg Lukianoff and FIRE have really been tracking this for a long time. Most of the shout-downs happen in the Northeast and along the West Coast, and then around Chicago. That's where most of the really nasty stuff happens. This is not happening at the great majority of American universities. It's not happening at top schools in the South. It's not happening at top schools in the Southwest. So it is in part about where it is. And then I think also the Ivy League is full of really rich kids. There's a statistic from a number of years ago that the top schools have more people from the top 1% of the income distribution than from the bottom 60%. So there's a real concentration, especially in the Ivies, of rich kids who don't need to worry as much about getting a job and have the bandwidth to devote themselves to politics while they're students. God.

[01:53:58]

I just fear for the kids who come out of that, too. These young people who come out with these distorted perspectives and have to rewire their view of the world once they get out. It's almost like taking someone out of a cult and trying to undo the indoctrination.

[01:54:24]

That's right. It's almost impossible to do that, especially if most of what's coming in is coming from TikTok, not from your parents or your friends or your teachers. Back to the problem. That's right. Again, back to the question of the TikTok ban. The issue here is not, should we ban TikTok? The issue is, should American law require a divestiture of TikTok from a Chinese corporation that is beholden to the CCP?

[01:54:49]

Which seems logical. There's an issue happening in Texas currently, where one of the porn sites has pulled out of Texas because the state requires age verification. And so there's all this pushback about whether or not they should be able to require age verification. You have to be 18 to use porn websites, which I think is very reasonable.

[01:55:13]

Yes, it's insane that we're even debating it.

[01:55:17]

We're running a mass psychology experiment on children by giving them smartphones with large screens and instantaneous access to porn. That's right.

[01:55:27]

I forget the exact number, but a very large number of boys are on Pornhub or other porn sites every single day. And again, as we were talking about before, in puberty, the prefrontal cortex, the brain, is really rewiring itself. This is when, for straight kids, a boy is supposed to be developing the ability to talk to a girl. It's hard, because boys and girls think a little differently. It's awkward. There are always mistakes. They need to be practicing. But instead, they're exposed to this diet of just horrible, horrible stuff. And the girls see it, too. The girls are not on as much, but they're all exposed to it. We now see that many more members of Gen Z don't want to get married, don't want to have children, and aren't having as much sex. I understand it. If that's what you think sex is when you're an 11-year-old and you see this stuff, you're not going to be like, oh, I want that to happen to me.

[01:56:17]

It's also so distorted, the relationships in these porn videos. It's bizarre fantasy.

[01:56:26]

And it's about step siblings. Why is so much of it about stepsisters?

[01:56:29]

There's a lot of stepmoms, too.

[01:56:31]

Right. So the whole thing is sick. And once again, I'm not going to tell adults what they should do with their spare time. But for God's sake, I am going to try to tell companies that they can't just have access to my kids from the age of nine or ten and do what they want with them. I don't know the details of the Texas law, but I think we've got to do something to age-gate pornography. I just can't see it otherwise. I mean, yes, there's a libertarian argument on the other side that, oh, we should never require identification from anyone for anything. Well, if that's the way you're going to go, no restrictions, then either we have to keep kids off the Internet entirely, which is insane, we can't keep them off the entire Internet, or we have to say, you know what, maybe some companies should be held liable. Maybe Congress was wrong to grant them blanket immunity from lawsuits for what they're doing to our kids. I think we should change that.

[01:57:20]

Do you think at a certain point in time, all this is going to become more obvious? And do you think the trend is that it's becoming more obvious to people, whether it's to politicians or to parents?

[01:57:34]

Yes.

[01:57:35]

Over time, the negative effects of it are just so obvious.

[01:57:40]

Yes, and I think that is happening right now. We're right at the beginning of the tipping point. I'm confident about this because the tipping point began in Britain last month. Parents everywhere are fed up. They all see it. They don't know what to do, but they're all frustrated. In Britain, some parents put up a website about delaying smartphones, and people rushed to it. They had a WhatsApp group for parents to come together, and thousands and thousands joined right away. In Britain, the government has actually mandated phone-free schools, which is one of my four norms. Whenever you have a situation where most people hate something but they're either afraid or confused, that can change really, really quickly. It's like the fall of the Berlin Wall, the fall of the Iron Curtain. We thought it was going to be there forever, but most people hated it. I traveled behind the Iron Curtain in 1987, and everybody hated it. So once the Berlin Wall fell, it fell everywhere very quickly. I think the same is going to be true for social media and the digital environment for children. I think that 2024 is going to be for the digital environment what 1989 was for Soviet communism.

[01:58:47]

Parents are fed up. The data is in. There's no doubt that there's an epidemic now. The evidence that it's caused by social media is a lot stronger than it was a few years ago. People are ready to act. Congress is ready to act. So I'm actually hopeful. Again, I think universities are now actually getting better, now that they've been through all that. And I think that the situation around kids and digital media is going to change radically this year. That's my goal in writing the book, in writing The Anxious Generation. I have this amazing collaborator, the artist Dave Cicirelli. So these stickers here that I gave you, I don't know if we can hold them up. I'll just hold them up for my camera.

[01:59:28]

It's a milk carton with a child on it, and it says Missing Childhood.

[01:59:34]

So my friend Dave Cicirelli is a great artist in New York City. He designed the cover for the book, and he and I had planned a guerrilla art campaign with posters linking Instagram to cigarettes a couple of years ago. So Dave had the idea to really go big. He has built a 12-foot-tall milk carton of the design you just showed. It's going to be on the National Mall in Washington this Friday. If you're in DC, check it out. It's coming to New York City, the northeast corner of Union Square. I'll be there on March 25th. We're starting a national movement, and lots of organizations are joining us, to encourage parents to work together. Because as I said, we can escape this if we work together. It doesn't have to be all of us. But if a lot of us say, we're not going to give our kids smartphones till 14, we're not going to let them open an Instagram or TikTok account until they're 16, we're going to ask our schools to go phone-free, and we're going to give our kids a lot more independence of the sort that we had in a much more dangerous world.

[02:00:42]

If we do those four norms, we really can turn this around. I'm confident we are at the tipping point right now. Even within a few months, by July and August, or let's say by September, when school starts again in the fall, I think there's going to be a different vibe about phones and the role of technology in kids' lives.

[02:01:00]

Well, I hope you're right, Jonathan. I really appreciate you. I really appreciate you writing this and spending so much time on this and thinking about it so thoroughly. The Anxious Generation, how the great rewiring of childhood is causing an epidemic of mental illness. It's available right now. Go get it, folks. Listen to it, read it, absorb it, take it in. Thank you very much.

[02:01:21]

Really appreciate you. Thank you, Joe. It's always fun to talk with you.

[02:01:23]

Fun to talk to you, too. Thank you. All right. Bye, everybody.