[00:00:00]

Why do the rates of depression and anxiety skyrocket right around 2012, especially for girls? That's Jonathan Haidt, a social psychologist who's been studying the effects of social media on teen mental health.

[00:00:12]

Millennials are not really more depressed than previous generations, but suddenly kids born in 1996 and later are very different from the millennials. And this is a real puzzle and this is a very interesting psychological and demographic puzzle.

[00:00:27]

Jonathan is a careful researcher. He doesn't leap from data to definitive conclusion. More often, he leaves room for debate like this.

[00:00:34]

Well, a piece of the puzzle is social media. But there's one trend that Jonathan argues is remarkably clear: Generation Z, the kids who grew up on social media, are being swept up in a current of mental health issues unlike anything researchers have seen since World War Two. You never get really sharp lines between generations.

[00:00:53]

The only one I know of is 1946. So if you were born when, you know, the soldiers came home, there's a baby boom. You're born in 1946, you're different from kids born in 1944. OK, so that's like a really sharp line. Big changes in American history.

[00:01:08]

If you've seen the film The Social Dilemma, you probably remember Jonathan's searing presentation on teenage depression and suicide rates, both of which skyrocket along with social media usage. But the correlation is so distressing, you might reasonably ask: is that correlation or causation? It seems unimaginable that the stuff we see on our screens could drive such devastating trends. What mechanics of manipulation could lead to a generation-wide shift in depression, self-harm and suicide? Well, when we talk about how technology is manipulating us, people think we're talking about the rhetoric of an advertisement or whether Russia or China are creating comments that are persuasive or not persuasive.

[00:01:44]

But all of this misses the core mechanism.

[00:01:47]

It's social feedback, a very powerful lever for influencing what we do. I'll give you an example. When TikTok was trying to figure out how to get users away from Instagram, they inflated the amount of social feedback that we get. When you post a video or a photo, maybe I get ten likes and one or two comments on Instagram. What if for the same video I get a thousand likes and twenty comments on TikTok? Which of those two products is going to be more persuasive at keeping me coming back?

[00:02:16]

One of the ways they do this is they don't actually label what a heart or a like or a view is. They just put a big heart and then have a big number next to it. And so these companies are in a race to the bottom to manufacture the kind of social approval that, developmentally, kids are seeking. And it doesn't help to tell kids that it doesn't really matter, because they like how it feels and because their social status among their friends is based on the fact that they get more likes and views than their friends do.

[00:02:44]

And so it's kind of like, you know, the way that we inject a cow with growth hormone so it produces more milk.

[00:02:49]

TikTok is injecting into our videos a kind of social feedback growth hormone that is inflating the amount of feedback that we get from others, which is more and more addicting. And that's the problem in the race to the bottom of the brainstem: each company is forced to go deeper into this social approval mechanic.

[00:03:08]

The other aspect of how this attention economy evolves is to find cheaper and cheaper ways for us to create the content.

[00:03:15]

We are the unpaid laborers who will generate content for free, because we will post about our cats and our dogs and our beach photos to get social feedback rewards, and we will generate the attention that will make money for the advertisers. One of the diabolical things about TikTok is that they actually invite each of us to create content for advertisers. When you open up TikTok and you go to the Discover tab, you're going to see a list of hashtags for things like the Patte Challenge or the Doritos Dance.

[00:03:44]

And each hashtag shows on the right-hand side the number of views: twenty-one billion, six hundred million, one point six billion. It doesn't say whether those are views or likes or real people. They obviously can't be real people, because the numbers are too big. But they give you the sense that there's a large audience awaiting you, if only you were to post a video. And so when you do hashtag Doritos Dance, you show someone that they're going to reach one point six billion people dancing while they eat Doritos.

[00:04:12]

Now we are the useful idiots who are generating advertisements for Doritos, and we have hundreds of millions of teenagers who will happily do the creative work for Doritos.

[00:04:22]

Do we want a world where your children are the unpaid laborers to generate advertising for other kids? Do we want a world where this is the future of children's development? The easiest standard of moral and ethical behavior is not what I would just endorse for myself, but would I endorse it for my own children? And many tech executives don't allow their own kids to use social media. That should tell you everything. If you can't even meet that standard, just stop.

[00:04:49]

And we want to make sure we're not doing naive moral panics here. I mean, we have worried about every new medium from radio to television and how they've affected children. But I

[00:04:59]

also want you to keep in mind, as we crunch the numbers and argue the data, that this is the environment our children are growing up in. At some point, if you work at Coca-Cola and the best you can do is just have it be, you know, sugar-inducing and diabetes-creating, but just the minimal amount, we're still in the wrong conversation.

[00:05:18]

The question is, what's good for people? That's the question of humane technology, not what's less bad for people. And too often the tech companies ask us to take the bad with the good. You've heard these arguments before.

[00:05:29]

I obviously care about this for my own two girls. That's Mark Zuckerberg in an interview with Fox News. And the research is pretty clear.

[00:05:35]

What it says is that all Internet use is not the same; all screen use is not the same. If you're using it to interact with people, then that is associated with all of the positive aspects of well-being that you'd expect. You feel more connected, less alone, happier, and over time, healthier, too.

[00:05:52]

And we've heard where some of these arguments can take us.

[00:05:54]

Do you believe nicotine is not addictive? I believe nicotine is not addictive, yes. Mr. Johnston? Congressman, cigarettes and nicotine clearly do not meet the classic definitions of addiction. There is no intoxication.

[00:06:09]

We'll take that as a no. And again, time is short.

[00:06:12]

If you could just... When it comes to kids and potential harm, our standards need to be higher. I don't believe that nicotine or our products are addictive. I believe nicotine is not addictive. And our burden of proof lower. I believe that nicotine is not addictive. I believe nicotine is not addictive.

[00:06:29]

Today on the show, we asked Jonathan Haidt, a professor of business ethics at NYU, to lead us through a more nuanced and academic debate on teens and tech, without losing sight of the more critical question.

[00:06:41]

Are the kids all right? I'm Tristan Harris. And I'm Aza Raskin. And this is Your Undivided Attention. John, welcome to Your Undivided Attention. Thank you, Tristan. There's a lot of moral panic, seemingly, about how technology is affecting young people and teenagers and mental health. A lot of headlines ask, are smartphones ruining a generation? A lot of debate back and forth. And we've actually never covered this topic explicitly on this podcast. So do you want to take us back into how you got into this, a little bit on your background?

[00:07:30]

Yeah, sure. I'm very happy to go through it, because it's really a branch off of my main research, but it's been a really fascinating branch. So I study morality. That's what I've always done. I picked that topic in graduate school when I was at the University of Pennsylvania. I studied morality and how it varies across cultures, beginning in the early 1990s. And then in the early 2000s, as the American culture war was heating up, it began to be clear that left and right are like different cultures, warring cultures.

[00:07:58]

And so then I began to study political polarization. That's the main line of my work. And then along the way, my friend Greg Lukianoff came to me in 2014 and said, John, weird stuff is happening on college campuses. Greg is the president of the Foundation for Individual Rights in Education, defending free speech rights on campus. And Greg had noticed that suddenly, for the first time in his career, college students were demanding protections from speakers and books and words, and they were using the same arguments that Greg had learned to stop using when he learned cognitive behavioral therapy for his own depression.

[00:08:33]

And I'd begun to notice this weirdness on college campuses, this new moral culture of safe spaces, trigger warnings, microaggressions. And so because I study moral psychology, moral culture, there was a natural match there, and I began to see this weird new pattern. So that got me into studying what was going on on college campuses. And when Greg and I wrote an article in The Atlantic in 2015, which the editors titled The Coddling of the American Mind, we didn't like the title, but it sure stuck.

[00:09:00]

So that got us into studying what is happening to college students. They have rising rates of depression. Why is that? And so that's what that article was about. We thought that there are ways of thinking that are very harmful, that are self-destructive, that encourage people to think of themselves as victims. And we speculated.

[00:09:17]

We had one line in the article about how college students who arrived on campus around 2014 were also the first generation to really get on Facebook and other social media around the time it came out, around 2007, 2008, when they were in middle school. So we speculated, well, maybe, you know, maybe that had something to do with it. But there was no evidence back then. Well, in the couple of years after that, what Greg and I learned is that one of the biggest things that happened on college campuses is that Gen Z arrived around 2014.

[00:09:47]

So the millennials are not really more depressed than previous generations. But suddenly kids born in 1996 and later are very different from the millennials. Jean Twenge, who's been studying generations for a while now, she comes out with a big article in The Atlantic called Are Smartphones Ruining a Generation? And she reviews the evidence that, well, actually, yes, the smartphone generation, growing up on smartphones, does seem to impact mental health. That was 2017. And she has a book called iGen. When Greg and I read that, that was a big missing piece of the puzzle.

[00:10:20]

So for me, this has been really a gigantic puzzle with enormous social ramifications. Jean Twenge's research at least suggested that, well, a piece of the puzzle is social media. And another piece is the overprotection, which is what Greg and I had been focusing on. So that's what got me started.

[00:10:37]

I think it's important for people to know, in your book, you are not coming from a background of: we really have to care about kids, they're all so vulnerable, we have to make sure we're coddling them. The point of your book, The Coddling of the American Mind, is that we've been overprotective. So just to name that for people as we start to veer into the territory of how do we deal with and protect or care about the mental health of especially teenage girls.

[00:10:57]

This isn't starting from a perspective of: we need to be so delicate, they're so delicate, we have to be so careful with them. Do you want to talk just a little bit more about that side? Because I think it qualifies why your concern, when it comes to social media and teen girls, is coming from the opposite direction.

[00:11:10]

Yeah, well, that's right, because the core psychological idea, the most important psychological idea in the book, is antifragility. It's such a useful idea and everybody knows it. We all understand that the immune system is an open system that requires exposure to pathogens in order to develop immunity. That's how a vaccine works. And most people understand that if you raise your kid in a bubble because you're afraid of bacteria, and so you never let the kid be exposed to bacteria, that doesn't help.

[00:11:37]

We need to be exposed to bacteria. And psychologically speaking, if you protect your kid and you say, I'll make sure you never get lost, I'll make sure that you're never teased or threatened by other kids, well, you're not helping the kid. Obviously, bullying that goes on for days is terrible. But kids have to have normal conflicts, to get lost, to get scared sometimes and then find their way back. We need this. Kids must have a lot of negative experiences to develop normal strength and toughness.

[00:12:03]

So I start from that position that we need to let kids out, we need to let them have all kinds of negative experiences and not protect them, and then they learn to protect themselves. So there's going to be an interesting twist when we get to the question of, well, shouldn't they be out on social media being publicly shamed?

[00:12:17]

Wouldn't that be good for them?

[00:12:19]

But we're getting ahead of the story. So, OK, let's put it on the table here. What do we mean by social media, and why is it sometimes bad? And let's be clear, obviously, social media does enormous good. Facebook in particular is very good at getting groups to organize and do things. I would never want to do a blanket thing like, oh, social media is terrible, or, you know, the Internet is terrible. So let's be clear about what are the mechanisms here that make a little part of what we do online harmful, both to democracy and to teen mental health.

[00:12:47]

In writing this article in The Atlantic last fall with Tobias Rose-Stockwell, who knows a lot more about social media than I do, what I learned, what I really began to see.

[00:12:56]

The evolution here is that when social media began, Friendster and MySpace and the Facebook, they were just like glorified address books: look, here's me, look at all the friends I have, look at all the bands I like. So that's not toxic. That's just public display. And sure, you boast about your popularity, but that's not bad for democracy and that doesn't drive people to suicide. The big change, the period where everything got transformed, is 2009 to 2012 or '13. In 2009, Facebook adds the Like button and then Twitter copies it; Twitter adds the Retweet button and then Facebook copies it.

[00:13:34]

And now the platforms have enormous amounts of information about what people will click on, what engages them. So now they algorithmicize their news feeds. And so suddenly now everything is custom tailored to you, to maximize the degree to which you will stay on, you will click, you will forward something. And the net effect is, first of all, for mental health: in 2009, most teens were not on these platforms every day, and by 2011 they were.

[00:14:02]

So that's the two-year period where teens' social life goes from mostly face to face. Now, of course, they're texting a lot, you know, it's not like the old days, but now it's these platforms where you create content and other people rate your content, and other people like it or ignore it. And then you look and you're watching, and you're watching the number go up or not, and you're feeling shame because your post didn't get many likes.

[00:14:27]

This is when everything changes, 2009 to 2011. That's the transformative period for teen mental health and also for democracy, because by 2011, 2012, we've now created what Tobias calls the outrage machine. We have the ability now for anything to happen, and anybody, an individual or an organization, can distort it, repackage it in a way that triggers outrage, retweet it, and then it can go viral very quickly. And now we're in a state of perpetual outrage. This is not about forming a group of dog walkers in a neighborhood.

[00:15:01]

This is about a way of engaging that maximizes public performance, which means we all become brand managers trying to manipulate other people, in a way, all linked together, so that things can move very, very quickly and we can all be immersed in outrage forever and ever. The world changed between 2009 and 2011, '12, and then mainstream media now has no choice but to look into this. So this is the key period that people need to focus on.

[00:15:32]

I think it's important to go back and just have everyone remember, if you happened to use Facebook back in 2005, 2006: there's a famous talk that Mark Zuckerberg gave at Stanford, where I was at the time, where he was asked what Facebook is. And he said it's a social utility, it's an address book. It's just a page that you go check, a Facebook page, and you see what's on your friend's wall. To speak personally to my own experience and many others at that time, especially I think with the launch of photo tagging, that's really when things revved up.

[00:16:01]

But just to sort of say, that's very, very, very different than this infinite scrolling feed of more like that, click, more like that, click, more like that, click, and getting that instant approval and validation and having that tight feedback loop. Because if we take the argument, the detective case, back to: is there a problem with social media and impacts on, say, mental health? Oftentimes the way this started was with this flat term, screen time. Screen time is the problem, and it's this debate about hours.

[00:16:30]

And it's like the glowing rectangle in front of your kids: that glowing rectangle is going to give your kids cancer. And this feels kind of like other moral panics that we've had in history. TV: that glowing rectangle's going to melt your brain. You know, Elvis is shaking his hips. You know, whatever the thing was, we were worried about just the eyeball and the glowing rectangle.

[00:16:47]

And you're saying something explicitly different. I want to maybe track some of that debate, because even until recently, screen time has been used as the vehicle for this debate about what is good or bad. Yeah, that's right.

[00:16:58]

That's where it gets really interesting. You know, as a scientific detective story, as a sociological detective story, it's really important to note that any time there's a new technology that the young people use, the older people freak out about it. This was true for novels in the 18th century. That was true for radio, television, comic books, video games. And in general, you know, there's a very common dynamic. And so especially once the iPhone comes out in 2007, the touch screen technology is so much more addictive, I would say.

[00:17:30]

And here I'm speaking as a psychologist, almost as a behaviorist. The day I got my first iPhone, my two-year-old son was able to master the input-output. The fact that I didn't put all my money in Apple on that day is one of the biggest mistakes of my life, because it is an amazing interface and it's much more pleasing than going through a keyboard to a computer screen. So, yeah, this was shaping up to be a classic moral panic where all the kids were on their phones and the adults were saying, well, this is going to melt their brain. Now, if we just focus in on depression and anxiety.

[00:18:02]

OK, so we have to be clear. What are the input variables?

[00:18:05]

Is it screen time? Is it social media?

[00:18:08]

And what are the outcome variables? Is it depression? Is it laziness and failure to launch? You know, what are we talking about? And so most of the research has focused on depression and anxiety, because that's the big mystery. That's the giant thing that has to be explained. Why did girls' rates of depression and anxiety skyrocket around 2012? And there is no other explanation that anyone's been able to offer for why then and why mostly girls.

[00:18:37]

So understandably, people point their finger at social media. Well, kids are doing a lot of things on their iPhones, not just social media. So the original panic was about smartphones, and the editors at The Atlantic make up the titles. I don't think Jean Twenge made up the title, Are Smartphones Ruining a Generation? That was something that the editors made up, but they were having to play into the clickbait economy to get it.

[00:18:58]

So they title the article in a way which then creates a full closed-loop system.

[00:19:01]

Exactly. That's right. Because had they had a more low key title, it wouldn't have been such a panic.

[00:19:07]

Well, same thing with your article, The Coddling of the American Mind. The title probably had to use an extreme word like coddling. And again, ironically, Facebook and the social media feeds and Twitter are responsible for the naming of your book, which then people.

[00:19:19]

Oh, my God. Because we should bring people back to that.

[00:19:21]

Jean Twenge, who wrote that article in 2017, Have Smartphones Ruined a Generation?

[00:19:25]

She's received so much anger and blowback for being so extreme. And just the title alone, I think, also instigates this kind of outrage trolling machine, which again, ironically, she probably experienced on social media as the same kind of trolling and shaming behavior that unfortunately other people face.

[00:19:41]

That's right. So, you know, we evolved in a world where Newton's laws applied. I forget which one it is: for every action, there's an equal and opposite reaction. But after 2012, that's no longer true. For every action, there's an opposite reaction that's multiplied by a factor of two or three. And we're all immersed in outrage all the time.

[00:19:58]

So thank you for pointing that out, that both Jean's and my articles, of course, had big impact because they were in The Atlantic and because The Atlantic did this.

[00:20:05]

But this is the very problem.

[00:20:07]

It also shows the way that social media, or the race-to-the-bottom-of-the-brainstem attention economy, sort of leaks out. Even if you're not on social media, the world still is. And so you live in that world. There's no escaping it.

[00:20:18]

That's right. And people will tell you about it. So there are those who think that social media is harmful. Then there are the skeptics. So the skeptics make a good case that if you look at history, there's all these moral panics; why should this be different? So I agree with them that the burden of proof is on people like me and Jean. We can't just say, look, it came in around 2011 and in 2012 depression rates go up.

[00:20:39]

See, we're done, we proved it. Like, no, that's not enough. That's a correlation. It does not show causality.

[00:20:44]

So what Jean did in iGen and in other work is she looked at the data. Almost all the data here is correlational, but some of it is time-lagged. And some of it, you can dig into the data and show it's not just that, like, historical event A happened and then historical event B happened. You can show that it only happened for people who are heavy users. So at least the straightforward correlations are there.

[00:21:03]

So we're not talking about screen time. We're talking about: it's not the light users or the medium users; there's a disproportionate effect for the heavy users. You really get some kind of lift. Is that right?

[00:21:11]

Exactly. So to the extent that there's evidence of harm, the clearest evidence is the graphs that show, based on the number of hours per day along the X axis, what is the rate of depression shown on the Y axis. And the lines are not straight. They are curves. So typically someone who uses social media two hours a day is not doing any worse than someone who doesn't use it at all, but somebody who is at four or five hours a day is.

[00:21:36]

And so you generally get these curves. So it's heavy use, not light use. And the curves are generally bigger for girls. And in a few studies I've seen, the effects are biggest for young girls in middle school. So, sort of round one, we have Jean's article and, oh my God, smartphones are destroying a generation.

[00:21:52]

The glowing rectangles, the glow, it's glowing rectangles. That's right. And then we have the skeptics. Amy Orben and Andrew Przybylski are two of them. They published a big article in Nature Human Behaviour in January 2019. And they do a big analysis of some of the same big data sets that Jean Twenge looked at, and they say, look, we did this giant analysis with 60,000 combinations of variables. And yeah, we do find a relationship between the amount of time a kid spends using devices and their mental health.

[00:22:21]

But it's tiny, it's microscopic. It's the same size as we find in the data set for eating potatoes. It's not zero. But for those who understand correlation coefficients, we're talking correlation coefficients around 0.02, 0.03, in that ballpark. They're statistically significant in a giant sample, but they're so small that you can basically ignore them. And this article came out right after Greg and I had published our book, and I thought, wow, were we wrong about this?

[00:22:45]

Because in our book we say, you know, we've got this mystery, what happened to Gen Z, and we think it's overprotection and social media. And so I created a Google Doc where I put all these articles that were coming out on both sides. And I invited Jean to join me as a curator, because she knows a lot more about the substance of them. This is not my area of expertise. And as soon as we posted it, we got some pushback, people saying, oh, come on, this isn't even a real thing.

[00:23:09]

There isn't even really a mental health crisis. That's another moral panic.

[00:23:13]

And we had to say, wait, wait, what do you mean? And they say, oh, it's just self-report.

[00:23:16]

Like, sure, kids are saying they're depressed, but, you know, that's just because they're really comfortable talking about it now, more than older generations. So, OK, that's a valid objection. So I had to go back and make a second Google Doc where I gathered all the evidence as to whether there's actually a mental health crisis. And once you put it out there, you have evidence on depression, anxiety, self-harm and suicide. Now, if it's just depression and anxiety self-reports, you could say maybe it's just a change in diagnostic criteria.

[00:23:46]

But when you have hospital admissions, this is not subjective interpretation. This is kids who are brought to the hospital because they're bleeding, because they cut themselves deliberately. And when you have suicide data, this is as objective as can be. I mean, sometimes there's some play in whether something is called a suicide. But the fact that they all line up, same magnitude, same timing, means there's no doubt about this. There is a mental health crisis. It is very serious.

[00:24:13]

For suicide, boys and girls are both up a lot. For self-harm, it's only the girls. Boys don't generally self-harm. They either kill themselves or they don't. Girls will self-harm. It's more of a social thing and an anxiety reduction thing.

[00:24:22]

So I think that's a big step forward, just to establish this is real and this is really big. And it's not just in America. Same thing in all the other Anglo countries. We haven't looked everywhere in the world, but in Britain, Canada, Australia, New Zealand, it's smaller in the South Pacific places, but the UK and Canada are very similar to the U.S. So we created that Google Doc, and there's been zero pushback on that once we put it up there.

[00:24:44]

So now it's very widely accepted. There really is a crisis here. So let's talk about the Orben and Przybylski study, because that is the most widely cited study, specifically the one saying there actually isn't something to worry about here.

[00:24:54]

I mean, I remember the New York Times article saying, no, it looks like smartphones are not ruining a generation or something like that. Yeah.

[00:25:00]

So the one in the Times was, I think, don't freak out about screen time during the pandemic.

[00:25:04]

And it seems to give the all-clear signal to parents, to say, don't worry about it, let your kids do what they want. But what Jean and I found, because we were puzzled, like, wait a second, how can there be no effect in their study when there are big effects in others? Well, if you dig into it, what you find is this.

[00:25:18]

The hypothesis here is: heavy use of social media by girls is associated with depression. Let's call that a pitchfork. Let's take that pitchfork, stick it in the ground, and now let's do sixty thousand analyses.

[00:25:32]

We'll just pile analysis upon analysis on top of that, each one being like a straw. And before you know it, you've got this giant haystack of analyses, 60,000 of them. Almost all of them have nothing to do with that pitchfork. So almost all the analyses are about television use or video games or all sorts of other digital device activities. So only a few of the analyses are actually about social media. It's mostly about screen time. And on the outcome side, they don't just use the questions about depression.

[00:26:00]

They have this one measure of mental health; it's not even mental health. It's got thirty-two questions on all sorts of things. Only four or five are actually about depression. So this one scale accounts for thousands and thousands of analyses when only a couple are relevant. Anyway, there's a lot of other stuff like this. I'm not accusing them of anything. It's not like they're trying to be devious. The analysis they did is so impressive, but yet only a little bit of it is actually relevant to the hypothesis.

[00:26:24]

So to come back to our detective story, we have a crime, or, you know, a body, as it were. We have a giant increase in rates of depression, anxiety, self-harm and suicide starting around 2012. Who done it? And it's as though some of the crime scene data was sent to one lab, which analyzed it and came back saying, nah, doesn't really look like there's any evidence here. And then Jean and I say, well, actually, we think they did the wrong lab test.

[00:26:50]

If you do the right lab test, you actually get evidence that the culprit, or the accused, is in the right place at the right time. And this is with a dose-response model. That is, the kids who use it more are the ones who suffer more. It reminds me...

[00:27:05]

In your book, The Coddling of the American Mind, I mean, part of it in terms of argumentation is: we have this desire for simplicity, this desire for quick, easy answers. And the thing that we really want people to do is move towards complexity and nuance, like, what is the complex and nuanced perspective? But in this case, it almost feels like there's been a weaponized use of complexity, because we took 60,000 possible variations and, as you said, sort of hid a pitchfork of some very obvious, very clear harm in there.

[00:27:31]

That's right. But, you know, let me respond to your point about nuance. I should be careful about calling it, you know, weaponized and using battle metaphors. But science is actually advancing as it should, in that you have people making a claim. You have critics who say, no, that's wrong. And then you have the first batch saying, well, actually, no, your rebuttal had some errors. And the nuance that we're advancing to, I think, is actually pretty good.

[00:27:49]

Here it is: screen time is not a good measure. And here Amy Orben has been, I think, really good on this. She's had a lot of articles saying stop talking about screen time. And she actually has convinced me about that in my debate with her. Now, screen time still matters overall, in the sense that parents need to decide and kids need to decide: do you want to spend all day on your screen? But if we're talking about screen time causing depression or anxiety, no, it looks like it doesn't.

[00:28:13]

So if we just focus on depression and anxiety, I think we are homing in on the idea that screen time is not the problem, but social media is. We're not accusing all screen-time activities. We're actually now focusing on, you know, we think this is the guy that did it. So it's not resolved. But I think we got the guy.

[00:28:30]

So we've gone through the detective story with these statistical models. But the content that's beneath the words "social media" is different for each application. And on a given day and in a given year: are we talking about Facebook? Are we talking about Instagram, or are we talking about TikTok? Are we talking about Facebook in 2009, third quarter, where they changed the algorithm and all the weights are different? I think what's really hard about this is how do we move the debate and our conversation to kind of a common-sense orientation of, OK, if I'm a 12-year-old kid, I'm forming my identity. If I'm a teenage girl, I'm especially attuned to my physical appearance.

[00:29:05]

And I post a photo and I don't use a filter on it. And I see that the photo that doesn't have as much of my skin showing doesn't get as many likes as when I had a lot more skin showing. I actually will delete that. This is a known behavior: the teenage girl will delete the photo that doesn't get very many likes, because she's worried about how she'll be perceived, given all of her other ones have this high social rating.

[00:29:27]

And so with the kind of basic mechanics, it's almost like saying, well, with climate change, we could do a million statistical models, or we can just look at the mechanism that says this tends to amplify that. And I'm curious what you think about that, because there's so many nuances of what we can say here. I mean, obviously, people will say things like, but look at all the creative things that people are doing on TikTok. Look at all these amazing videos.

[00:29:47]

But we can look at key mechanics, at the content beneath the words "social media," that I think we can clearly say are harmful. What do you think about that?

[00:29:55]

Yeah. So Nir Eyal, he wrote the book Hooked. He and I actually became friends during a debate over whether social media is harmful. We have daughters the same age, so we became friends. But Nir has this thing he calls the regret test. If you ask consumers, do they regret their involvement with the product, and they say yes, well, that's pretty damning. You know, the whole moral basis of capitalism is that it creates wealth and allocates resources in ways that satisfy people's wants.

[00:30:20]

And if it's doing things that people don't want or, you know, catching them up in behaviors that they wish they didn't have, well, that's pretty damning. There was a study done on users of Moment, and it looked at the percentage of users who are happy with the amount of time they spend on each app. And at the top, the most happy, in order, were FaceTime, mail, phone, messages, and Messenger.

[00:30:44]

In other words, to the degree that technology helps us talk to our friends, that's great. There's nothing wrong with that. Nobody wishes they spent less time on FaceTime with their friends. But at the other end, at the bottom, was Instagram at 37 percent: only 37 percent of Instagram users are happy with the amount of time they spend. Tinder is 40 percent, Facebook is 41 percent, Reddit is 43 percent. So I think this is very, very important.

[00:31:07]

I think this really shows there's something wrong here. And now let's dig deeper. OK, so what is it about those programs that not just people regret using, but what is it that actually is the mechanism of harm? And here, you know, if people over 18 choose to do something, if they choose to gamble or try heroin, that's their choice. I don't want to get involved in that. But the Internet, and this was pointed out to me by Beeban Kidron, a member of Parliament who studies this in the UK.

[00:31:31]

The Internet was not built with children in mind, yet a third of the people on the Internet are children under 18. If we really take this seriously and say, well, what kind of Internet would we have built if we knew that a third of the people on it would be children, would it look like this? For adults, you know, I don't want to tell adults they can't do something because I think it's harmful. But, you know, for children, it's different.

[00:31:51]

And then the other thing that's crucial here is that social media is not an individual choice. I mean, on one level it is, of course, a choice by the children or the parents. But when my son started sixth grade and everybody else was on Instagram in his middle school in New York City, and I said, no, you can't go on, well, then he was excluded. And presumably none of the other parents wanted their kids on.

[00:32:14]

But we all let our kids on, most people, because the other kids are on it. So the social media companies, either wittingly or unwittingly, have created a trap. Everybody lies about their age. They can get on whenever they want to. Did that answer your question? You did ask, aren't there all these good things?

[00:32:30]

Yeah, of course there are. And if it wasn't for the mental health, suicide and self-harm, I would say, hmm, let's try to add up the pluses and minuses. We're talking between 50 and 150 percent increases in suicide for teenagers in the United States. So given that, I think we can say: you can be as creative as you want on Instagram and TikTok, but maybe wait until at least the legal age of 13, and maybe even longer.

[00:32:55]

I know people who are on, say, the well-being team of Instagram or Facebook.

[00:33:00]

They actually have teams of people who are worried about well-being. They'll hire the statisticians. They'll hire the subjective well-being experts who worked under Ed Diener and Martin Seligman, and positive psychology people.

[00:33:08]

And they hire as many PhDs as you want.

[00:33:11]

But if you were in that room back in 2004 and 2005, when Sean Parker would literally just tell his friends, if we haven't got you yet, we will, because once we get all your other friends on, you will not have a choice, and we can provide those cocaine-like rewards faster than you will... I mean, when someone has set up a service that tapped into those same reward pathways, everything else is almost a distraction, because the well-being team is just there to justify and to try to do the best they can with a product whose entire basis is addiction.

[00:33:42]

And it was never designed with the best interests of society or well-being or the developing child in mind. Never. We didn't get here because people were asking what's best for society. We're now trying to reverse into that position. Many people would say you don't blame a baker for making an addictive croissant, or you can't blame someone for inventing the shipwreck when they invented the ship. You can't invent a ship without inventing the shipwreck. So, I mean, I'm curious how you respond to this notion of: what is the responsibility of technology companies when making these products?

[00:34:12]

Yeah, well, so those are two different arguments. They're both interesting arguments. You can't blame a baker for making an addictive croissant. Sure. And if it's adults buying it, it's fine. But suppose there was a company that provided school lunches, and they realized that if they provide Kool-Aid and sugar powder, the kids love it. There are different responsibilities when you're dealing with kids. And so we do take a more paternalistic approach to kids. And as for the other argument, you can't invent the ship without inventing the shipwreck. Yeah.

[00:34:43]

That's fine. And maybe what will happen here is that this is just like, you know, when the automobile was first invented. I presume they had brakes on the initial automobiles, but they didn't have turn signals. They didn't have windshield wipers. They didn't have seatbelts. And over time they got those, because market pressures were such that people preferred safer cars. And normal market mechanisms meant that you'd improve the product by making it safer. And if that was working, if we saw evidence that social media is getting better and better, you know, every year it gets better for mental health.

[00:35:12]

Every year it gets better for promoting civil discourse and supporting democracy. Well, then I'd say that argument applies here. But, you know, if it's this, you know, Metcalfe's law, if it's the sort of thing where once they get big, they can basically stop these sorts of changes, well, then I would say the shipwreck argument doesn't apply. Ships got better and better because nobody wanted ships to wreck. If there was some commercial interest that benefited from more and bigger shipwrecks, well, we'd be in a different situation.

[00:35:40]

We know that there is a mental health crisis affecting our kids. We got to do something about that. So what do we do?

[00:35:46]

I would suggest we start with some simple experiments. We can find out; there are ways of finding out answers to this. The simplest experiment we can do, and it is urgently needed: if anybody happens to listen to this podcast who knows anybody who works in a school district or a middle school, suggest this simple experiment. Take some school district. They're all coping with the rise of depression, anxiety, self-harm; they're all worried about this. Take some school district and ask them to do a simple experiment: ask half the schools in the district to strongly discourage kids from opening social media accounts, because it has to be done centrally.

[00:36:22]

You can't expect individual parents to ban Instagram. You have to have a school-wide effort to say, just don't let your kids have an account until high school, and have a policy in school of keeping the devices locked away during the day. As long as kids have it in their pocket, they're thinking about it. They're not paying attention to the teacher as much. They're thinking about the drama, and when they go to the bathroom, they add to the drama.

[00:36:44]

So if there are school districts out there that are concerned about this, and they all are, do experiments. Middle school is where I think we really can get a handle on the problem.

[00:36:53]

And if a school district, you know, especially a city, has, you know, ten different middle schools, if five of them do this and five go with the standard policy where they all are on all the time, then we'll see in a year or two.

[00:37:05]

We will see, because social media changes the fundamental fabric of connection. You can't do it one person at a time. It has to be done group at a time, community at a time. So I think we have an emergency, we have a likely suspect, and we have simple scientific methods for trying to figure out: is social media really the culprit? And middle school is the best place to look.

[00:37:27]

What I love about that suggestion is, if we actually ran those experiments and had a set of middle schools where you see the kids making better sense of the world, having better relationships with themselves, that creates a race to the top. Do you want to have your kid in the schools where they're going to be more likely to commit self-harm, or not? Exactly.

[00:37:48]

That's right. We'd know within two years, because the curves are going up and up and up. We are not flattening the curves on mental health. And if some schools are able to flatten the curve and actually bring down the rates of suicide and self-harm, yeah, I think you're going to see a lot more parents wanting to move to that town.

[00:38:03]

I think that's something that we can all hopefully get behind. And maybe that's a centralizing point for how we at least get behind the issues of mental health and kids. John, thanks so much for coming on the podcast.

[00:38:13]

Oh, what a pleasure, Tristan, it is. It's been really fun sharing metaphors with you.

[00:38:22]

Listeners, before you go: one of the most common questions that we've been getting since people watched the film The Social Dilemma is, obviously, what can I do? And many people who see the film or hear this interview will think to themselves, well, I'm just going to have my kid, by themselves, delete their Instagram account or delete TikTok. But of course, what's diabolical about these systems is that they prey on manipulating social exclusion, because now that that one kid is not using TikTok, the rest of their friends still are.

[00:38:52]

And the way that they do their homework or find out about sexual opportunities or gossip, or who's more famous or has higher status in school than the other person, is still happening on TikTok. So one thing we're recommending to people is not just to delete your own Instagram or TikTok account, but to actually start a group migration, just like the birds migrate every year.

[00:39:14]

Can we migrate as a group, as a school, as a set of families, as a set of friends, off of one of these manipulative platforms? You can delete TikTok, and when you make a dance or funny video, you can send it to people you love directly. You can delete Snapchat and use text or WhatsApp instead. And instead of asking yourself and your kids, do I like this app?

[00:39:37]

You can ask, how does this app make me feel, both during and after using it? And these are really powerful conversations to have in your family. For more resources, you can go to our website at humanetech.com, where we have some material for youth, parents and educators.

[00:39:59]

Your Undivided Attention is produced by the Center for Humane Technology. Our executive producer is Dan Kedmey and our associate producer is Natalie Jones.

[00:40:06]

Noor Al-Samarrai helped with the fact checking. Original music and sound design by Ryan and Hays Holladay. And a special thanks to the whole Center for Humane Technology team for making this podcast possible. Our very special thanks to the generous supporters of our work at the Center for Humane Technology, including the Omidyar Network, the Gerald Schwartz and Heather Reisman Foundation, the Patrick J.

[00:40:26]

McGovern Foundation, the Evolve Foundation, Craig Newmark Philanthropies, and the Knight Foundation, among many others. Huge thanks from all of us.