[00:00:01]

From The New York Times, I'm Michael Barbaro. This is The Daily. Today, a look inside a historic set of new lawsuits filed by dozens of states, accusing the country's largest social media company of luring children onto its platforms and hooking them on its products. My colleague, Natasha Singer, has been reviewing the states' evidence and trying to understand the long-term strategy behind the lawsuits.

[00:00:47]

It's Wednesday, November 15th.

[00:00:54]

Natasha, the last time you and I spoke, the state of Utah had just tried to restrict the use of social media by children by passing a first-of-its-kind law. Today you come to us with something related, but on a vastly different scale.

[00:01:11]

Right. When we were talking about Utah, it was a single state trying to restrict social media on behalf of young people. And now what we have….

[00:01:20]

Good morning, everyone.

[00:01:22]

Good morning. Good morning, and thank you for joining me today. There is a youth mental health crisis in America.

[00:01:30]

And we need to act.

[00:01:31]

…is dozens of states banding together. So today, myself and 42 other attorneys general across this nation are announcing collective action. An astonishing coalition of red states, like Texas and Tennessee, and blue, liberal states, like Massachusetts. Today, my office has filed a lawsuit against Meta Platforms. Today, my office has filed a lawsuit against Meta Platforms, Inc., the company…

[00:01:57]

…formerly known as Facebook, for knowingly…

[00:01:59]

…harming the mental health of young social media users. Coming together to sue Meta, which is the social media giant, as you know, that owns Instagram and Facebook and WhatsApp. Right. I do very much see this as a public health crisis, in the same way that tobacco was. The scale of this investigation, the way the states banded together to investigate, and the parallel lawsuits they filed are right out of the big tobacco playbook.

[00:02:32]

Meta has been allowed to addict our children to a product that interferes with their mental and physical health.

[00:02:33]

The attorneys general have said they view the case against Meta as a case about severe health harms and hazards to young people, in the same way that states viewed the health hazards and harms to young people from cigarettes or Juul or opioids.

[00:02:59]

Of course, Natasha, the industries you just mentioned stood accused of quite literally poisoning people, including children, with their products: cigarettes and vaping pens. Meta is a technology company. So does the parallel to those industries end there?

[00:03:22]

I think the answer to your question is both yes and no. Technology is different, and the question at the heart of this case, whether social media is addictive, will have to be proven. It's not clear. The methodology the states are using, all coming together to sue one company, is similar. The content is different. Unlike big tobacco, which it's hard to say anything good about, social media is something lots of people use to connect with their friends, their family, their colleagues, to figure out what celebrities are doing, to look up recipes. There are a lot of good things that happen with social media, and so it is not the same in that way as big tobacco or Juul. When the case was first announced last month, we didn't really know what evidence the states had against Meta, because their legal filings were redacted. It was all blacked out, and we couldn't see anything. But since then, the Massachusetts attorney general has gotten that state's complaint unsealed. And that gave us a much better sense, not only of the Massachusetts case, but of the similar cases that attorneys general in other states were filing.

[00:04:34]

Right. I want to better understand this lawsuit that all these states are involved in. Walk us through the case, Natasha.

[00:04:42]

The case has a really interesting backstory. It starts in the fall of 2020, which is when Netflix released a really ominous docudrama called The Social Dilemma.

[00:04:54]

A lot of people think Google is just a search box, and Facebook is just a place to see what my friends are doing. What they don't realize is there's an entire team of engineers whose job is to use your psychology against you.

[00:05:08]

It featured former Facebook, Google, and Twitter executives warning about how social media platforms hack users' psyches.

[00:05:17]

We get rewarded in hearts, likes, thumbs-up, and we conflate that with value, and we conflate it with truth.

[00:05:23]

And state regulators around the country began to see the documentary.

[00:05:28]

Those kids are the first generation in history that got on social media in middle school.

[00:05:33]

And they began talking about how worrisome social media was for young people.

[00:05:39]

How do they spend their time?

[00:05:41]

They come home from school, and they're on their devices.

[00:05:47]

And so some of them are parents who have seen their own kids use social media, and it's personal for them. I spoke to the attorney general of Massachusetts, Andrea Joy Campbell, and she said: Look, I'm not just the attorney general, I'm the mother of two young boys. And these attorneys general had been watching spiking rates of teen depression, anxiety, and suicide in their states. And they believed that social media was one of the causes. And while the attorneys general were discussing the film and the alarms it raised for them… Facebook is developing a children's version of the popular app Instagram, for youngsters 13 and under. …Meta, Facebook's parent company, announced in early 2021 plans to develop a new service, Instagram for Kids. That caused even more alarm bells, because attorneys general were worried that Meta was trying to create training wheels for Instagram. So more than 40 states got together, and their attorneys general wrote a letter to Mark Zuckerberg asking him to halt the plans to develop Instagram for Kids. Soon after they sent that letter, a whistleblower inside Meta appeared. Many of Facebook's internal research reports indicate that Facebook causes serious negative harm to a significant portion of teenagers and children.

[00:07:11]

Frances Haugen, a former Facebook product manager, had taken tens of thousands of internal documents. She spoke to the Wall Street Journal, and she testified in Congress that the company knew it was making young women feel worse. The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. And it caused a firestorm. Meta announced that it was going to pause the development of Instagram for Kids. And more than 40 states announced that they were going to investigate whether Meta, and particularly Instagram, had deliberately created a platform to addict kids and knew that it could cause them harm.

[00:07:59]

That's how we get to these lawsuits. It sounds like a slow-burning realization by these states' attorneys general that there is a problem inside Meta with Instagram, and that there's something they are uniquely capable of doing about it.

[00:08:20]

Exactly. Over the last two years, the attorneys general have amassed thousands of internal documents from Meta showing how Instagram works. Using those, and also testimony from other whistleblowers, they contend they now have a really strong case, in which they are accusing Meta, particularly Instagram, of deliberately designing addictive features that harm children, of lying about the harms, and also of allowing underage users, under 13, on its platform.

[00:09:01]

Okay, well, let's walk through those three claims in these lawsuits one by one. Let's start with the claim that Meta, and Instagram in particular, designed intentionally addictive features. What's the evidence that the lawsuits outline?

[00:09:19]

I think in simple terms, the attorneys general are accusing Meta, but particularly Instagram, of being an all-powerful social media slot machine that has knowingly ensnared, addicted, and harmed millions of young people. And so if you think about how a slot machine works, and some of the attorneys general have used this phrase, it can be endless. That is one of the things they say is an addictive feature: there's no natural end for young people, and it's really hard to get off. Second of all, they say that Instagram bombards young people with all kinds of notifications. And if you've used Instagram, you have probably seen them. Chad just posted a new photo. Chloe has a new reel.

[00:10:06]

Katie Couric is going live. Right.

[00:10:09]

And so first of all, that is a dopamine hit, because you're going to see something new and you instantly have to go check. Second of all, there's social pressure. If you don't like your friends' posts fast enough, what is that going to mean? And then there's also fear of missing out, because some of those things, like Instagram Live or Stories, are ephemeral. If you don't look now, or in the next 24 hours, you're not going to see it. And so the attorneys general allege that all of this is by design, and that these compelling features have a particularly pernicious effect on young people because they override their developing brains.
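To make that fear-of-missing-out mechanic concrete, here is a minimal sketch in Python of how 24-hour ephemeral content works. The names and the exact 24-hour window are illustrative assumptions, not Meta's actual implementation.

```python
from datetime import datetime, timedelta

# Hypothetical illustration: content that expires creates time pressure,
# because checking later means the post may simply be gone.
STORY_TTL = timedelta(hours=24)  # stories vanish a day after posting

def story_visible(posted_at: datetime, now: datetime) -> bool:
    """A story can only be seen inside its 24-hour window."""
    return now - posted_at < STORY_TTL

posted = datetime(2023, 11, 15, 9, 0)
print(story_visible(posted, datetime(2023, 11, 15, 21, 0)))  # True: 12 hours later
print(story_visible(posted, datetime(2023, 11, 16, 10, 0)))  # False: 25 hours later
```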

[00:10:50]

Right. Here I have to confess that I have lost two or three hours at a stretch on Instagram because of the very features you're describing, and I'm 44 years old.

[00:11:03]

I think we all have, right? You're scrolling down, and there's this nanosecond where you're salivating in anticipation, like Pavlov's dogs, and then you see something new and you get a dopamine hit. And look, there's George and Amal Clooney. Look, there's Khloé Kardashian. There's Lionel Messi waving his pink jersey. Things you absolutely don't need to know.

[00:11:24]

Well, what are the specific claims of harm to children from all these features we're talking about?

[00:11:32]

There are two sets of harms. One is psychological. The lawsuits are not simply saying that spending all this time on Instagram is causing kids to lose sleep, or take time away from school, or distract from homework. What they're saying is that the features they contend are designed to addict kids cause compulsive use of Instagram. And that compulsive use can lead to increased depression among young people, increased anxiety, increased isolation and loneliness, and, particularly among young women, increased dislike of their own bodies. For example, the lawsuit focuses on these cosmetic surgery and beauty filters that you can use if you're on Instagram, because you can use the filters to make your arms look thinner or your boobs look bigger or to give you a stronger chin. And there's a conversation inside Instagram that's cited in one of these lawsuits in which even Instagram executives are saying: we're actively encouraging young girls to hate their bodies with these filters.

[00:12:43]

Right. Because a filter like that basically encourages the people who use it to think that something is wrong and needs to be improved.

[00:12:53]

Right. And apart from the mental health harms, the state lawsuits also allege that there are concrete harms that come from young people using the platforms. For example, a whistleblower recently came forward with internal company documents and said that a survey of Instagram users found that 22% of 13-to-15-year-olds said they were bullied on the platform just within the last week. He also said that about 22% of young users said they had received unwanted sexual advances, again, just within the last week.

[00:13:38]

In the world of these lawsuits, Instagram especially is not a place where you go to innocently post about your life and catch up with your friends. Instagram, according to these lawsuits, is an addictive product whose most popular features make it dangerous for kids. Right.

[00:14:00]

That's what the lawsuits are saying. Not only that, the states are accusing Meta of knowingly concealing those harms.

[00:14:18]

We'll be right back. Natasha, tell us more about the second of the allegations in these lawsuits: that Meta knew Instagram was harmful to kids but tried to hide that from the public.

[00:14:40]

That's one of the really surprising things about these states' complaints. They describe how the company regularly did research on teen mental health, regularly surveyed its users aged 13 to 17, and knew that they were having negative experiences and that they were experiencing harms. Yet the complaints say company executives, from Mark Zuckerberg on down, testified in Congress or gave interviews on TV saying that they cared about the well-being of their youngest users, that they were doing all this work to protect them, and that the platform was safe.

[00:15:20]

In other words, that they knew better than what they were saying in public about what Instagram did or didn't do to kids. They had internal evidence showing it was bad, and they'd go out and say it's not bad or even that it's good.

[00:15:35]

That's what the states contend.

[00:15:37]

I'm curious: what, according to the lawsuits, did Meta do with this data it had about what its users were feeling, this data that often showed these harms?

[00:15:48]

The lawsuits describe how a number of Meta employees were worried by their own internal data on the platform's impact on teen users. And these Meta employees proposed different ways to mitigate the problems and the harms that young people were having. And yet their proposals were often shut down by their bosses.

[00:16:08]

Give us a couple of examples of that.

[00:16:11]

One of the internal projects was about likes. According to one complaint, Meta's research found that teen users often compare their accounts to their friends'. And when they see other people getting more validation, it's a negative comparison. And it led to negative outcomes, like increased loneliness or worse body image or negative mood. And so to try to change that, Meta ran a test program called Project Daisy, where in some cases they basically hid all the like counts you saw on Instagram, except for your own. And then there was another experiment where like counts from really popular accounts were visible, but not those of normal people.

[00:16:56]

And they found that both of those experiments reduced users' experience of negative feelings and negative social comparisons. That's fascinating. And so the solution was: let's hide the likes by default, and it might make teens happier. And around 2020, Meta executives did this whole publicity tour saying that they were going to put this into effect. But ultimately, Meta did not take away the likes.

[00:17:28]

And why not?

[00:17:29]

That's a really important question that the complaints don't directly answer. The implication in the lawsuits is that there was a profit motive for not taking the likes away. But because Meta had publicly said they were going to take the likes away, there was some pressure to do something. And so, at the end of the day, Meta offered an opt-in option: you could choose to hide the likes. And the lawsuit says that Meta knew that was really not going to make a difference, because they studied it, and they found that if you offered people the chance to hide their like counts, less than 1% of people would do it. But if likes were hidden by default, 35% would leave the like counts hidden. And so basically the allegation is that here was something easy that Meta could have done to reduce social anxiety for teens, and they chose not to do it.
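To see why the default matters so much, here is a minimal sketch in Python. All the names are hypothetical; the only inputs are the uptake figures the lawsuit cites: under 1% adoption when hiding likes is opt-in, and about 35% retention when hidden is the default.

```python
# Hypothetical illustration of the opt-in vs. opt-out gap described
# in the lawsuit; the rates below are the ones the complaint cites.
OPT_IN_UPTAKE = 0.01      # share who actively hide likes when shown by default
OPT_OUT_RETENTION = 0.35  # share who keep likes hidden when hidden by default

def users_with_likes_hidden(total_users: int, hidden_by_default: bool) -> int:
    """Estimate how many users end up with like counts hidden."""
    rate = OPT_OUT_RETENTION if hidden_by_default else OPT_IN_UPTAKE
    return int(total_users * rate)

# With a hypothetical 10 million teen users, the same feature reaches
# ~100,000 users as an opt-in but ~3,500,000 as an opt-out default.
print(users_with_likes_hidden(10_000_000, hidden_by_default=False))  # 100000
print(users_with_likes_hidden(10_000_000, hidden_by_default=True))   # 3500000
```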

[00:18:20]

Do we get a sense from these lawsuits of who at the company was blocking these efforts to make Instagram safer and better for kids?

[00:18:30]

Yes, we do. There's one striking example in the Massachusetts lawsuit against Meta, involving those cosmetic surgery filters we talked about before. The lawsuit cites a whole internal discussion between different executives saying: these cosmetic filters are overwhelmingly used by teen girls. We know they're not good for them. Outside experts say that these filters are not good for young women. Let us disallow them.

[00:19:03]

Again, a little bit like the likes: do something easy, low-hanging fruit, just make it go away. Yeah.

[00:19:09]

There is an internal discussion among a handful of top executives at Instagram, and Mark Zuckerberg, according to the lawsuit, is part of this chain. There's supposed to be a meeting with Mark Zuckerberg to discuss getting rid of these filters. One day before the meeting, it's canceled. Then, according to the lawsuit, Mark Zuckerberg himself vetoed the proposal to formally ban these plastic surgery camera filters.

[00:19:38]

In the email, the lawsuit says, he specifically directed staff to relax the restrictions. And he said there was a clear demand for these filters. According to the lawsuit, he said in the email that he had not seen data suggesting that these filters were harmful.

[00:19:57]

Here you have the CEO of the company saying, I don't see a problem here, and I reject the idea of getting rid of these filters.

[00:20:04]

You have the CEO saying, according to the lawsuit, that these filters are popular. But these are selected quotes from emails. We do not have the full correspondence, nor do we know what data he looked at or didn't look at.

[00:20:17]

Natasha, the final allegation you mentioned earlier in these lawsuits is that Meta knowingly allowed very young children onto the platform. What does the lawsuit have to say about that?

[00:20:29]

Meta's terms of use say that you cannot set up an account on Instagram if you are under 13.

[00:20:37]

And the reason for that is there's a federal law that says companies that know they have kids on their platform must get permission from parents before letting them create an account, because that would involve collecting their personal data.

[00:20:52]

Got it.

[00:20:52]

But what the lawsuit says is that Meta made it easy for users under 13 to sign up for accounts, and that initially there was a drop-down menu that automatically generated a date and year of birth for new users that would make them over 13. The default was basically to suggest what birth date to pick in order to be over 13.

[00:21:14]

Which, of course, if you're under 13, you'd pick.

[00:21:17]

Then they changed it, because somebody internally, according to this lawsuit, said it should be neutral. But of course, it's very easy to lie and pick a year that makes you older than 13. The result, the lawsuit says, is that millions of kids under 13 were on Instagram in violation of the federal children's privacy law. If Meta were found to have violated that law, the fines could be more than $50,000 per violation.
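For the mechanics, here is a minimal sketch in Python of what a neutral age gate looks like under the under-13 rule the states invoke. All names are hypothetical; this illustrates the design question at issue, not Meta's actual signup code.

```python
from datetime import date

# Under the federal children's privacy law (COPPA), a service that knows
# it has users under 13 needs parental permission before collecting
# those users' personal data; 13 is therefore the common signup cutoff.
MIN_AGE = 13

def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def can_create_account(birth_date: date, today: date) -> bool:
    """A neutral age gate: nothing is pre-filled, so the form doesn't
    nudge a child toward a birth date that clears the cutoff."""
    return age_on(birth_date, today) >= MIN_AGE

print(can_create_account(date(2011, 6, 1), date(2023, 11, 15)))  # False: age 12
print(can_create_account(date(2009, 6, 1), date(2023, 11, 15)))  # True: age 14
```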

[00:21:50]

So just to summarize this entire case, Natasha: the claim is that Meta designed Instagram to be addictive, knew that from its own research, pretty much lied to the public about that fact, and internally rejected employees' requests to do something about it and make Instagram better for kids. And all the while, it was not doing all that much to stop kids under 13, who are most vulnerable to all the things we're talking about, from using the platform.

[00:22:23]

That is what the majority of state attorneys general in the United States contend.

[00:22:29]

Let's turn to Meta's defense against this lawsuit. What has the company said about the allegations, and, I think more importantly, about the evidence that's contained in these lawsuits?

[00:22:40]

I reached out to Meta to ask what its response was to all these lawsuits. They got back to me, and they said, first of all, that the company has made a major and ongoing investment in protecting young people on its platforms, and that there are now more than 30 tools and resources to protect teens and help keep them safe from potentially harmful content and unwanted contact on its platforms. They also said that the states' complaints were basically cherry-picked. Meta said that the lawsuit was filled with selective quotes from hand-picked documents, and that they didn't provide real context of how the company operates and how it makes decisions.

[00:23:28]

On the other issues, they pointed out that Instagram's terms of service prohibit users under the age of 13, and that when the company finds users under 13, it removes those accounts. And as for the beauty filters we talked about, Meta said that it banned filters that directly promote cosmetic surgery or extreme weight loss. I also asked them about the comparisons to big tobacco that some of the attorneys general were making. And Meta said that it was an absurd comparison. Unlike tobacco, they said, Meta's apps add value to people's lives. So they basically completely rejected the comparison to tobacco.

[00:24:14]

So at the end of the day, Natasha, based on your reporting, how strong is this case that's being made against Meta by the states? And how likely are the states to win?

[00:24:25]

You know, Michael, it's not a slam dunk, because the states are accusing Meta of several distinct and really big things, and those can be hard to prove. It's going to be hard to prove, for instance, that notifications and likes cause addiction, and that addiction leads to depression or isolation. There are studies linking social media use to increased symptoms of depression or feelings of isolation or negative self-esteem. But unlike cigarettes, social media is relatively new. We do not have decades of research, and in particular, we don't have rigorous research showing that the typical use of social media by typical kids directly causes harm. But there are larger legal issues at play here, including Section 230, which is part of the Communications Decency Act. In simple terms, that provision generally allows digital services like Instagram to curate speech and content any way they like and not be held liable. You can see in these lawsuits that the states are carefully trying to avoid talking about content. They're talking about tools, like algorithms. But social media companies have argued that they're entitled under Section 230 to curate content as they see fit. Interesting. And that the algorithms do that curation, and therefore the companies are not liable for the content.

[00:25:59]

Okay, I want to, just for a moment, Natasha, ask us to put ourselves in Meta's headspace, because we've been spending so much time talking about the states' case, the states' evidence, the states' worries about what Meta is doing. If you're Meta and you're watching this case unfold, I wonder if it's likely that they're asking the question: why are all these state attorneys general focused on us? We're not the only social media platform out there where all this stuff is happening. There's TikTok, there's Snapchat, there's many others. So are they feeling a little bit picked on?

[00:26:40]

Right. Why me? What you're saying is absolutely true. TikTok also has these features that the states are concerned about, like endless feeds and likes and beauty filters, and Snapchat has notifications and stories that disappear after 24 hours. In fact, Meta essentially copied some of the features the states are complaining about from other social media platforms. And so it would be completely legitimate for them to say, why are you picking on us? Except that the states are not only picking on Instagram. In 2022, the states announced that they were investigating TikTok for many of the same practices, or similar practices, that they were already investigating Meta for. Tennessee and Colorado are leading that investigation into TikTok, and it's been going on for about a year and a half. And remember, in the Meta case, they announced the investigation two years ago, and only now are they actually filing the lawsuit.

[00:27:41]

Right. So this case isn't the only case. It's just the furthest along.

[00:27:45]

Right.

[00:27:46]

And the implication of that is that this is a moment, the AGs hope, of possible reckoning for these platforms, all of them.

[00:27:55]

I think that is the design of the lawsuits: not only to try to litigate their way into causing Meta to change some of the things we've discussed, but also to attract a lot of publicity. And it's going to reinforce some of the concerns that lawmakers, the surgeon general, and many other people have already been voicing. And so I think it's the beginning of a snowball. They're hoping to use these lawsuits to cause Meta, and then other social media platforms, to change, whether they win the lawsuits or not.

[00:28:37]

Natasha, we started this conversation with the analogy of the state cases against tobacco companies all those years ago. And when I think about those cases, one of the clear outcomes was that everybody started to think of cigarettes as dangerous. I wonder if the states in the case against Meta, even if they don't win in court, would be happy in a world where lots more parents walk around thinking of social media platforms like Instagram as a danger to their kids. Would that be a successful outcome for the states?

[00:29:10]

Yeah, I don't think it would, Michael. I think we already have a lot of parents walking around thinking that social media is problematic, including some attorneys general. And parents are struggling to keep their kids off their devices and not on social media for hours and hours at a time. So, newsflash, social media is problematic. I don't think that's news, right? I think what they want is for the companies to stop using, or dial down, some of the features we talked about: endless feeds, the endless bombarding of young people with notifications. They want that stuff to stop. I think about Jonathan Skrmetti, who is the attorney general of Tennessee and who co-led the investigation into Meta. What he said to me was, in essence: these social media companies know what they did to make their platforms as habit-forming as possible for kids. The companies ought to know where the switches are to turn those habit-forming features off. Really, that's the end game for these attorneys general. They want the companies to either turn these features off or dial them back.

[00:30:23]

Well, Natasha, thank you very much.

[00:30:26]

Thank you.

[00:30:31]

We'll be right back.

[00:30:44]

Here's what else you need to know today. Look at what Hamas is holding inside the hospital. These are explosives. These are vests, vests with explosives.

[00:30:56]

We have hand grenades, Kalashnikovs, and then we have the RPGs.

[00:31:02]

On Tuesday, Israel released a pair of videos that it said were recorded inside Gaza's main children's hospital, showing weapons and explosives purportedly stockpiled there by Hamas.

[00:31:17]

This is Hamas firing RPGs from hospitals. The world has to understand who Israel is fighting against.

[00:31:24]

Israel shared the videos to press its case that Hamas is using hospitals as cover for its military operations, and to justify Israel's operations aimed at evacuating the hospitals, which have sparked outrage. Gaza's Health Ministry, which is run by Hamas, denied nearly every Israeli claim in the video. But the Health Ministry acknowledged that the footage was taken from inside Gaza's main children's hospital. On Tuesday afternoon, the Biden administration said that US intelligence sources had information supporting Israel's claims. A temporary spending bill that would avert a government shutdown at the end of the week was adopted by the Republican-controlled House after more than 200 Democrats crossed party lines to back it. The bill was seen as a major test of the new House speaker, Mike Johnson, who chose keeping the government open over pleasing his party's far right. The bill, which would fund some government departments until mid-January and the rest of the government through early February, did not include the spending cuts that conservatives had demanded, prompting more than 90 of them to vote against it. Today's episode was produced by Alex Stern, Will Reid, and Carlos Prieto, with help from Stella Tan. It was edited by John Ketchum, with help from Michael Benoist, contains original music by Marion Lozano and Dan Powell, and was engineered by Alyssa Moxley.

[00:33:12]

Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly. That's it for The Daily. I'm Michael Barbaro. See you tomorrow.