Transcribe your podcast
[00:00:04]

Listener-supported WNYC Studios. OK, so in the layer cake of mayhem that we find ourselves in — that metaphor makes no sense — I want to update a story that we played a few years ago that is fascinating, but definitely shifting and changing as we speak, and worth tracking. And yet hard to track with all the things that are happening right now. This is a story that aired, I believe, two years ago, but still very timely, as you'll hear.

[00:00:34]

A bit later in the episode, we'll come back around and update it, bring it up to the present. Wait, you think?

[00:00:46]

You're listening to Radiolab. Radiolab. From WNYC. Hey, I'm Jad Abumrad. I'm Robert Krulwich. This is Radiolab.

[00:01:01]

And today we have a story about what we can say and what we can't.

[00:01:07]

And by the way, there's going to be a smattering of curse words here that we're not going to bleep, which I think makes sense given the content of the story. And also, there are some graphic scenes, so if you've got kids with you, you may want to sit this one out. Yeah.

[00:01:19]

Anyway, the story comes to us from producer Simon Adler.

[00:01:22]

So let's start — can we start in 2008? Sure. How about with a song? Yes, please.

[00:01:39]

So, December —

[00:01:40]

— 27th. A sunny Saturday morning. This group of young to middle-aged women gathered in downtown Palo Alto.

[00:01:53]

They're wearing these colorful hats and are singing and swaying directly in front of the glass-doored headquarters of Facebook.

[00:02:07]

Yes, it was a humble gathering of a few dozen women and babies.

[00:02:13]

That right there — Are you the organizer? — is one of the organizers of the gathering. I'm Stephanie Muir.

[00:02:19]

And what are you calling this? It's a Facebook nurse-in. Nursing, as in, like, breastfeeding. The intent was really just to be visible and be peaceful and make a quiet point. What point were they trying to make?

[00:02:34]

Well, so Stephanie and this group of mothers, you know, they were on Facebook, as many people were, and they'd have photos taken of themselves, occasionally breastfeeding their babies. They wanted to share with their friends what was going on so they would upload those photos to Facebook and these pictures would get taken down and they would receive a warning from Facebook for uploading pornographic content.

[00:02:57]

And people were really getting their backs up over this.

[00:03:01]

They wanted Facebook to stop taking their photos down — to say that, well, nudity is not allowed, but breastfeeding is exempt, period.

[00:03:15]

Now, what Stephanie couldn't have known at the time was that this small, peaceful protest would turn out to be — This morning, a face-off on Facebook —

[00:03:26]

— one of the opening shots — Facebook triggered a hornet's nest — in what would become a loud —

[00:03:33]

— raucous — global battle. Embattled Facebook CEO today playing defense.

[00:03:42]

Now, I'm not talking about all the things you've recently heard about — Russian interference, election meddling, or data breaches — but rather something that I think is deeper than both of those: free speech.

[00:03:55]

Facebook has been accused of facilitating violence against Rohingya Muslims. What we can say and what we can't say. — And again, Facebook banned this iconic photograph. — What we can see and what we can't see.

[00:04:06]

You're fucking crazy. Thank you, Mr. Chairman. Mr. Zuckerberg, I've got to ask you: do you subjectively prioritize or censor speech?

[00:04:31]

Congresswoman, we don't think about what we're doing as censoring speech, but there are two types —

[00:04:38]

What really grabbed me was discovering that underneath all of this is an actual rule book — a text document that dictates what I can say on Facebook, what you can say on Facebook, and what all —

[00:04:55]

— 2.2 billion of us can say on Facebook. For everyone on the entire globe? For everyone. One set of rules that all 2.2 billion of us are expected to follow. Wait, is it an actual document? It's a digital document.

[00:05:09]

But yes, it's about 50 pages if you print it off. And in bullet points and if-then statements, it spells out sort of a First Amendment for the globe. Which made me wonder, like, what are these rules?

[00:05:23]

How were they written? And can you even have one rule book? Right, exactly. And so I dove into this rule book and dug up some stories that really put it to the test.

[00:05:35]

OK, I'm intrigued. Now, with the stories we're gonna hear — how many are we gonna do? Three-ish. OK. All right. But let's go ahead with the first.

[00:05:43]

Well.

[00:05:43]

So let's start back on that morning in 2008, the morning that you could argue started it all because in the building, right behind those protesting mothers, there was a group of Facebook employees sitting in a conference room trying to figure out what to do.

[00:06:01]

Cool. So I'm just going to switch. You just read.

[00:06:04]

So I was able to get in touch with a couple of former Facebook employees, one of whom was actually in that room at that moment.

[00:06:11]

Now, neither of these two was comfortable being identified, but they did give us permission to quote them extensively. How's that?

[00:06:18]

Well, that'll work for you? Great. Just so we have it —

[00:06:21]

So what you're going to hear here is an actor we brought in to read quotes taken directly from interviews that we did with these two different former Facebook employees.

[00:06:31]

All right. Ready? So at the time when I joined, there was a small group, 12 of us, mostly recent college grads, who were sort of called the Site Integrity Team. Again, keep in mind, this was the late 2000s — Seismic changes this week in the Internet hierarchy —

[00:06:47]

This was like the deep, dark past. — MySpace.com is now the most visited Web site in the U.S. — Facebook had somewhere in the neighborhood of 10 million users.

[00:06:56]

We were smaller than MySpace.

[00:06:58]

The vast majority of them college kids. And so in those early days, those 12 people, they would sit around in a sort of conference-like room with a big, long table, each of them in front of their own computer.

[00:07:09]

And things would come up onto their screen, flagged. Flagged, meaning, like, a user saw something that they thought was wrong?

[00:07:17]

Exactly. Like, reporting a piece of content that you think violates the community standards. This is Kate Klonick. She's a professor of law at St. John's, and she's spent a lot of time studying this very thing. And she says in those early days, what would happen is a user would flag a piece of content.

[00:07:33]

And then that content, along with an alert, would get sent to one of those people sitting in that room. It would just pop up on their screen.

[00:07:41]

Most of what you were seeing was either naked people, blown-off heads, or things that there was no clear reason why someone had reported — because it was, like, a photo of a golden retriever, and people are just annoying. And every time something popped up onto the screen, the person sitting at that computer would have to make a decision: whether to leave that thing up or take it down.

[00:08:02]

And at the time, if you didn't know what to do, you would turn to your pod leader, who was, you know, somebody who had been around nine months longer than you, and ask, what do I do with this? And they would either have seen it before and explain it to you, or you both wouldn't know and you'd Google some things.

[00:08:18]

It really was just kind of an ad hoc approach.

[00:08:21]

Was there any sort of written standard or any common standard? Well, kind of.

[00:08:25]

They had a set of community standards that, at the end of the day, was just kind of one page long and not very specific.

[00:08:32]

Sorry — the guidelines were really one page long? They were one page long. And basically all this page said was: nudity is bad, so is Hitler, and if it makes you feel bad, take it down.

[00:08:44]

And so when one of the people sitting in that room would have a breastfeeding picture pop up on the screen in front of them, they'd be like, I can see a female breast. So I guess that's nudity.

[00:08:55]

And they would take it down. — Rise up, fight for the right to breastfeed... — Anyway. Now, a dozen or so people in front of their offices on a Saturday —

[00:09:07]

— it probably wasn't causing Facebook too much heartache. But I thought, you know, hey, we have an opportunity here with, you know, over 10,000 members in our group. According to Stephanie Muir —

[00:09:17]

Those protesters were just a tiny fraction of a much larger online group who had organized, ironically enough, through Facebook.

[00:09:25]

So to coincide with the live protest, I just typed up a little blurb encouraging our members that were in the group to do a virtual nurse-in. A virtual nurse-in? Right. What we did —

[00:09:39]

They posted a message asking their members, for one day, to change their profile avatar to an image of breastfeeding and then change their status to the title of our group: Hey, Facebook, breastfeeding is not obscene. And it caught on. — A social networking Web site is under fire for its policy on photos of women breastfeeding their children. — Big time. Twelve thousand members participated, and the media requests started pouring in. — Facebook and breastfeeding... — I did hundreds of interviews for print.

[00:10:12]

Chicago Tribune. Miami Herald, Time Magazine. New York Times. Washington Post.

[00:10:18]

Even Dr. Phil.

[00:10:22]

It was a media storm. And eventually — perhaps as a result of our group and our efforts — Facebook was forced to get much more specific about their rules.

[00:10:36]

So, for example, by then, nudity was already not allowed on the site, but they had no definition for nudity. They just said no nudity. And so the site integrity team, those 12 people at the time, they realized they had to start spelling out exactly what they meant. Precisely.

[00:10:51]

So these people at Facebook were in charge of trying to define nudity. So, I mean, yeah, the first cut at it was visible male and female genitalia, and then visible female breasts. And then the question is, well, OK, how much of a breast needs to be shown before it's nude? And the thing that we landed on was: if you could see essentially the nipple and areola, then that's nudity, and it would have to be taken down.

[00:11:15]

Which, theoretically at least, would appease these protesters. Because, you know, now when a picture would pop up of a mother breastfeeding, as long as the child was blocking the view of the nipple and the areola, they could say, cool, no problem.

[00:11:29]

Then you start getting pictures that are women with just their babies on their chest, with their breasts bare. Like, for example, maybe the baby was sleeping on the chest of a bare-breasted woman and not actively breastfeeding.

[00:11:41]

Okay. Now what, like, is this actually breastfeeding? No, it's actually not breastfeeding. The woman is just holding the baby and she has her top off.

[00:11:48]

But she was clearly just breastfeeding the baby, like, moments before. Well, I would say it's sort of like kicking a soccer ball — like a photo of someone who has just kicked a soccer ball. You can tell the ball is in the air, but there is no contact between the foot and the ball in that moment, potentially.

[00:12:04]

So although it is a photo of someone kicking a soccer ball, they are not, in fact, kicking the soccer ball in that photo.

[00:12:11]

And this became the procedure, or the protocol, or the approach for all these things: we have to base it purely on what we can see in the image.

[00:12:21]

And so they didn't allow that to stay up under the rules, because it could be too easily exploited for other types of content, like nudity or pornography.

[00:12:29]

We got to: the only way you could objectively say that the baby and the mother were engaged in breastfeeding is if the baby's lips were touching the woman's nipple. So they included what you could call, like, an attachment clause.

[00:12:40]

But as soon as they got that rule in place, like, you would see, you know, a 25-year-old woman and a teenage-looking boy, right, and, like, what the hell is going on there?

[00:12:50]

Oh, yeah. It gets really weird if you, like, start entering in, like, child ages. I'm not even gonna bring that up, because it's kind of gross.

[00:12:57]

It's like breastfeeding porn. Is that a thing? Are there sites like that? Apparently.

[00:13:01]

And so this team, they realized they needed to have a nudity rule that allowed for breastfeeding, but also had some kind of an age cap.

[00:13:09]

So then we were saying, OK, once you've progressed past infancy, then we believe that it's inappropriate.

[00:13:15]

But then pictures would start popping up on their screen and they'd be like, wait, is that an infant? Like, where's the line between infant and toddler?

[00:13:23]

And so the thing that we landed on was: if it looked like the child could walk on his or her own, then too old.

[00:13:29]

Big enough to walk, too big to breastfeed? But that could be, like, a year old in some cases.

[00:13:34]

Yeah.

[00:13:34]

And, like, the World Health Organization recommends breastfeeding until, you know, like 18 months or two years — which meant there were a lot of photos still being taken down. Within, you know, days —

[00:13:47]

We're continuing to hear reports from people that their photographs were still being targeted.

[00:13:53]

But Facebook did offer a statement saying, you know, that's where we're going to draw the line. — Facebook isn't budging on its policy. — And keep in mind, through this whole episode — Is this perhaps the next big thing? —

[00:14:05]

— Thefacebook.com — the company was growing really, really fast.

[00:14:09]

— Seems like almost everyone is on it. — And there's just got to be a lot more content. — When we first launched, we were hoping for, you know, maybe 400, 500 people. Now we're at a hundred thousand, so who knows where we're going.

[00:14:22]

— Thousands more people are joining Facebook every day. Sixty million users so far, with a projection of 200 million by the end of the year. — And now more people on Facebook than the entire U.S. population. — And not just within the United States — it was also rapidly growing more international.

[00:14:37]

You know, you were getting stuff from India and Turkey. — Facebook... Facebook... — It's getting big throughout the EU.

[00:14:47]

— Korea's joined Facebook. — So they have more and more content coming in from all these different places, in all these different languages. How are we going to keep everybody on the same page?

[00:15:00]

And so once they saw that this was the operational method for dealing with this — creating this, like, nesting set of exceptions and rules, these clear things that had to be there or had to not be there in order to keep content up or take it down — that, I think, became their procedure.
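
To make that "nesting set of exceptions" concrete, here is a toy sketch in Python of the kind of if-then checklist described above — the nipple/areola test, the attachment clause, and the walking-age cap. The field names and the function are hypothetical illustrations for this story, not Facebook's actual code or terminology.

```python
# A toy sketch of the if-then checklist described in this story.
# Field names and logic are hypothetical simplifications, not Facebook's code.

def review_breastfeeding_photo(photo: dict) -> str:
    """Return 'leave up' or 'take down' based only on what is visible."""
    # Base nudity definition: an exposed nipple and areola counts as nudity.
    if not photo["nipple_or_areola_visible"]:
        return "leave up"          # nothing meets the nudity definition

    # Attachment clause (the pre-2013 version): the baby's lips must be
    # touching the nipple for the image to count as breastfeeding at all.
    if not photo["mouth_touching_nipple"]:
        return "take down"         # bare chest, but not "objectively" breastfeeding

    # Age cap: if the child looks like it could walk on its own, too old.
    if photo["child_looks_able_to_walk"]:
        return "take down"

    return "leave up"              # the breastfeeding exception applies


# Example: a bare-chested mother with a sleeping (not nursing) newborn.
sleeping_baby = {
    "nipple_or_areola_visible": True,
    "mouth_touching_nipple": False,
    "child_looks_able_to_walk": False,
}
print(review_breastfeeding_photo(sleeping_baby))  # -> "take down"
```

The point of the sketch is only the shape of the rules: each amendment in the story adds or removes one of these branches.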

[00:15:16]

And so this small team at Facebook got bigger and bigger — jumped up to 60 people, and then 100.

[00:15:23]

And they set out to create rules and definitions for everything.

[00:15:27]

Can we go through some of the sort of ridiculous examples? Let's. Okay.

[00:15:32]

So, gore. Gore — you mean violence? Kind of. Gore. Yes. So the gore standard was, headline:

[00:15:37]

We don't allow graphic violence and gore. And then the shorthand definition they used was no insides on the outside.

[00:15:44]

No guts, no blood pouring out of something.

[00:15:47]

Blood was a separate issue. There was an excessive-blood rule. They had to come up with rules about bodily fluids.

[00:15:53]

Semen, for example, would be allowed in a clinical setting. But, like, what does a clinical setting mean? And, you know, does that mean if someone is in a lab coat?

[00:16:02]

Mm hmm. One of my favorite examples is like, how do you define art?

[00:16:06]

Because as these people are moderating, they would see images of naked people that were paintings or sculptures come up.

[00:16:14]

And so what they decided to do was say: art with nakedness can stay up — like, it stays up if it is made out of wood, made out of metal, made out of stone. Really? Yeah, because how else do you define art? You have to just be like, is this what you can see with your eyeballs?

[00:16:32]

And so from then on, as they run into problems, those rules just constantly get updated. Sort of like constant amendments? Yeah. Constant amendments.

[00:16:39]

New problem. New rule. Another new problem. Updated rule.

[00:16:44]

In fact, at this point, they are amending these rules up to 20 times a month. Wow. Really?

[00:16:51]

Yeah. Take, for example, those rules about breastfeeding. In 2013, they removed the attachment clause. So the baby no longer needed to have its mouth physically touching the nipple of the woman.

[00:17:03]

Oh, and in fact, one nipple and/or areola could be visible in the photo — but not two. Only one.

[00:17:12]

Then in 2014, they make it so that both nipples, or both areolae, may be present in the photo.

[00:17:19]

So this is what happens in American law all the time. Yes.

[00:17:23]

Yeah. You know, it sounds a lot like common law. So common law is this system dating back to early England where individual judges would make a ruling, which would sort of be a law, but then that law would be amended or evolved by other judges. So the body of law was sort of constantly fleshed out in the face of new facts. Literally, every time this team at Facebook would come up with a rule that they thought was airtight, something would show up that they weren't prepared for —

[00:17:54]

— that the rule hadn't accounted for.

[00:17:56]

As soon as you think, yeah, this is good — like, the next day something shows up to show you, yeah —

[00:18:01]

— you didn't think about this. For example, sometime around 2011 —

[00:18:04]

— this content moderator is going through a queue of things — accept, reject, accept, escalate, accept — and she comes upon this image. Oh my God.

[00:18:17]

The photo itself was a teenage girl, African by dress and skin, breastfeeding a goat — a baby goat. And the moderator throws her hands up and says, what the fuck is this? And we Googled breastfeeding goats and found that this was a thing. It turns out it's a survival practice: according to what they found, this is a tradition in Kenya that goes back centuries.

[00:18:40]

That in a drought, a known way to help your herd get through the drought is — if you have a woman who's lactating — to have her nurse the kid, the baby goat, along with her human kid.

[00:18:55]

And so there's nothing sexual about it.

[00:18:57]

Just good farm business. Good. And theoretically, if we go point by point through this list: it's an infant — well, it sort of could walk, so maybe there's an issue there — but there's physical contact between the mouth and the nipple.

[00:19:11]

But obviously, breastfeeding, as we intended it anyway, meant human infants. And so in that moment, what they decide to do is remove the photo. And there was an amendment, an asterisk under the rule, stating: animals are not babies. We added that so in any future cases, people would know what to do.

[00:19:30]

But they removed it? They discovered it was a culturally appropriate thing that people do, and they decided to remove the photo?

[00:19:36]

Yeah. And that outraged some individuals, who were like, why? Why didn't we make an exception?

[00:19:41]

Because when a problem grows large enough, you have to change the rules. If not, we don't. This was not one of those cases. The juice wasn't worth the squeeze.

[00:19:51]

And, like, if they were to allow this picture, then they'd have to make some rule about when it was OK to breastfeed an animal and when it wasn't. This is a utilitarian document.

[00:20:01]

It's not about being right 100 percent of the time. It's about being able to execute effectively.

[00:20:10]

In other words, we're not trying to be perfect here and we're not even necessarily trying to be 100 percent just or fair.

[00:20:17]

We're just trying to make something that works. — One, two, three, four, five, six, seven, eight... — And when you step back and look at what Facebook has become — like, from 2008 to now, in just 10 years —

[00:20:33]

— I'm just outside the Accenture Tower here in Manila. I don't know if it's the one. —

[00:20:40]

— the idea of a single set of rules that works, that can be applied fairly —

[00:20:45]

It's just a crazy, crazy concept.

[00:20:48]

— Fifteen... hard to keep count. I would say it's about 30 floors. — Because they've gone from something like 70 million users to 2.2 billion.

[00:20:57]

And they've gone from 12 folks sitting in a room deciding what to take down or leave up to somewhere around 16,000 people.

[00:21:04]

— So there's a floor in this building where Facebook supposedly outsources its content moderation. —

[00:21:10]

And so around 2010, they decided to start outsourcing some of this work to places like Manila — where you just heard reporter Aurora Almendral — I mean, I would guess that there are thousands of people in this building — as well as Dublin, where we sent reporter Gareth Stack. — This is where they get their delicious Facebook treats cooked —

[00:21:28]

— everybody's beavering away. — And we sent them there to try to talk to some of these people who, for a living, sit at a computer and collectively click through around a million flagged bits of content that pop up on their screens every day. Wow. I'm just curious, like, what's that like?

[00:21:44]

— Well, can I ask you some questions? — We found out pretty quickly that, no, none of these folks are willing to talk to us about what they do.

[00:21:54]

— So there's a lot of running away from me happening. — Sorry to bother you, do you guys work in Facebook? Do you happen to work in Facebook? — No, I don't. — Sorry to bother you, do you work inside? Sorry, do you work in Facebook? I mean, like, you just came out of there. I know you're lying. — In fact, most people wouldn't even admit they work for the company.

[00:22:15]

Like, why, though? Is there something, like an NDA, that they signed?

[00:22:19]

Well, yes. So when I finally did find someone willing to talk to me — Do you want to be named, or do you not want to be named? I'd rather not. That's totally fine. You know, I'm still in the industry. I don't want to lose my job over this shit. — he explained that he and all the other moderators like him were forced to sign these nondisclosure agreements, stating they weren't allowed to admit that they worked for Facebook.

[00:22:45]

They're not allowed to talk about the work they do.

[00:22:47]

My contract did prohibit it. So why can't they talk about what kind of moderation they do?

[00:22:53]

Well, several reasons. One is that, up until recently, Facebook wanted to keep secret what these rules were so that they couldn't be gamed. At the same time, it creates a sort of separation between these workers and the company — which, if you're Facebook, you might want.

[00:23:08]

Yeah, I knew I signed up to monitor graphic images, just given the nature of the job. But, you know, I didn't really know the impact that it's going to have on you, I'm telling you.

[00:23:20]

So this guy I talked to, he got his first contract doing this work several years back, and for the duration of it — about a year —

[00:23:27]

— he'd show up to his desk every morning, put on his headphones, and click, click, click, click, click, click. — Ignore, accept... I guess four to five thousand every shift, every day. And it's just image and decision, image, decision. — Four to five thousand a day, you just said? — Yeah. It was, like, it was a lot of chaos. Yeah.

[00:23:44]

He said basically he'd have to go through an image or some other piece of content every three or four seconds. Wow. All day long. All day, eight hours a day. — Well, if I can ask, what kind of things did you see? — I don't know if this is even... I don't know if this is even worth describing, actually. — But clicking through —

[00:24:12]

— he came across unspeakable things. Heads exploding, people being squashed by a tank —

[00:24:19]

— people in cages being drowned, a 13-year-old girl having sex with an eight-year-old boy. And it's not just once. It's over and over and over and over.

[00:24:35]

Did this, like, keep you up at night? Absolutely. Absolutely, 100 percent it kept me up at night. He'd catch himself thinking about these videos and photos when he was trying to relax. He had to start avoiding things.

[00:24:51]

There were specific, like, movies that I couldn't watch. I think it was a Quentin Tarantino movie, and there was a head exploding, and I was like, no, no, I have to walk away. I just had to — it actually was too real, what I saw. A different moderator I spoke to described it as seeing the worst side of humanity.

[00:25:22]

You see all of this stuff that you and I don't have to see because they are going around playing cleanup.

[00:25:30]

What a job. Wow. Yeah.

[00:25:32]

And it's worth noting that more and more, this work is being done in an automated fashion, particularly with content like gore or terrorist propaganda. They're getting better at automating that? Yeah — through computer vision, they're able to detect hallmarks of a terrorist video or of a gory image. And with terrorist propaganda, they now take down 99 percent of it before anyone flags it on Facebook.

[00:26:02]

But moving on to our second story here: there is a type of content that they are having an incredibly hard time not just automating, but even getting their rules straight on. And that's hate speech. Oh, good.

[00:26:20]

More laughs coming up. Well, there will be laughter. Oh, really? There will be comedians. There will be jokes. Comedians? Hey. All right. OK. So do we take a break and then come right back? No, I think we're gonna keep going. OK.

[00:26:31]

Testing, one, two, three, four, five. Testing, one, two, three, four, five. I'm Simon Adler.

[00:26:34]

So a couple months back, we sent a pair of our interns out —

[00:26:42]

— Carter Hodge, recording, just standing... and Liza Yeager. — Tickets for tonight? OK. —

[00:26:52]

— to this cramped, narrow little comedy club, the kind of place with, like, fifteen-dollar smashed-rosemary cocktails.

[00:27:06]

— We didn't need to get high-topped tables. — The AC is dripping on us. — But still, it's kind of a dive. It was good. Yeah. All right.

[00:27:18]

And we sent them there to check out someone else who'd found a fault line in Facebook's rule book.

[00:27:23]

— ...The next comedian to come to the stage, please give it up for Marcia! — Yes! You guys... I feel like my first time in the city, I was such a carefree brat. You know, I was young, and I had these older friends, which I was like, very cool. And then you just realize that they're all high. — She's got dark, curly hair, was raised in Oklahoma.

[00:27:54]

— I think... I was raised Jewish. So when you're raised Jewish, you read about Anne Frank a lot, you know. — This will get funny.

[00:28:08]

How did you decide to become a comedian?

[00:28:10]

You know, it was kind of the only thing that ever clicked with me, and especially political comedy. You know, I used to watch The Daily Show every day.

[00:28:17]

And back in 2016, she started this political running bit that I think can be called sort of absurdist feminist comedy.

[00:28:26]

— Now, a lot of people think I'm, like, an angry feminist, which is weird. This guy called me a militant feminist, and just because I am training a militia of women in the woods... — At first, I just had this running bit online, on Facebook and Twitter. She was tweeting or posting jokes, you know, like, "we have all the Buffalo Wild Wings surrounded," you know, things like that.

[00:28:54]

Eventually she took the bit on stage, even wrote some songs. — My dad... —

[00:29:02]

My dad. Anyhow, so about a year into this running bid, Marcia was bored at work one day and logged on to Facebook. But instead of seeing her normal news feed, there was this message that pops up.

[00:29:23]

It says: you posted something that discriminated along the lines of race, gender, or ethnic group, and so we removed that post. And so I'm like, what could I possibly have posted? I really thought it was, like, a glitch. But then she clicked continue.

[00:29:40]

And there highlighted was the violating post. It was a photo of hers.

[00:29:45]

Well, what is the picture? Can you describe it? The photo is me as — what can only be described as — a cherub: a cute little seven-year-old with big curly hair, and she's wearing this blue floral dress.

[00:29:55]

Her teeth are all messed up. And into the photo, Marcia had edited a speech bubble that just says: kill all men.

[00:30:03]

And so it's funny, you know, because... it's funny. You know, whatever — I just thought it was ridiculous.

[00:30:10]

So she searched through her library of photos and found that "kill all men" image. And I posted it again immediately after, like, yeah. And it got removed again — and this time there were consequences. I got banned for three days after that.

[00:30:25]

Then, after several other bans — shoot forward, this is months later — a friend of hers had posted an article.

[00:30:31]

And underneath it in the comment section, there were guys posting just really nasty stuff.

[00:30:36]

So I commented underneath those comments. Men are scum. Which was very quickly removed.

[00:30:44]

When — how...

[00:30:45]

How long did you get banned for this time? Thirty days. Wow. Yeah. I was dumbfounded. So there's a rule somewhere that if I type "men are scum," you take it down? Yes. I'm like, what could it be?

[00:31:02]

And so Marcia called on her, quote, militia of women — Exactly — to find out, like, is this just me? Female comedians who were sort of, like, mad on my behalf started experimenting, posting "men are scum" to see how quickly it would get removed, and if it would be removed every time.

[00:31:20]

And it was. So they started trying other words — Love it. Yeah — to find out where the line was. My friend put "men are duska" — that got removed. "Men are the worst" — removed and banned. This one girl put "men are septic fluid" — banned. But we're only at the middle of the saga. It doesn't end there, because now she's really like, what the hell is going on? Is it sexism?

[00:31:44]

So I just start doing the most bare minimum amount of investigating.

[00:31:50]

She's Googling around trying to figure out what these policies are. And pretty quick, she comes across this leaked Facebook document. So this is when I lose my mind.

[00:32:01]

This is when Mark Zuckerberg becomes my sworn nemesis for the rest of my life. Because what she'd found was a document Facebook used to train their moderators. Inside of it, in a section detailing who Facebook protected from hate speech, there was a multiple-choice question that said: who do we protect — white men or black children?

[00:32:23]

And the correct answer was: white men. Not black children. Not even kidding.

[00:32:29]

White men are protected. Black children are not. That's not a good look.

[00:32:33]

It's racist. Something's going on here. There is absolutely some sort of unaddressed bias or systemic issue at Facebook. — Hello, how are you?

[00:32:47]

I'm doing well. Thank you so much for being on the show. Not long after sitting down with Marcia, Facebook invited me to come out to their offices in California and sit down with them.

[00:32:58]

I'm going to eat one cookie. Oh, they're little — I think I get two, then.

[00:33:04]

Could I just get your name and your title? I'm Monika Bickert, and I lead the policies for Facebook. Monika Bickert is in charge of all of Facebook's rules, including their policies on hate speech. And so I asked her, like, why would there be a rule that protects white men but not black children?

[00:33:23]

We have made our hate speech policies — let me rephrase that. Our hate speech policies have become more detailed over time. But our main policy is: you can't attack a person or group of people based on a protected characteristic — a characteristic like race, religion, or gender.

[00:33:42]

So this takes a couple beats to explain. But the gist of it is that Facebook borrowed this idea of protected classes straight from U.S. anti-discrimination law. These are the laws that make it so that you can't, say, refuse to hire someone based on their religion, their ethnicity, their race, and so on.

[00:34:01]

On Facebook, you can't attack someone based on one of these characteristics.

[00:34:06]

Meaning you can't say "men are trash," nor could you say "women are trash," because essentially you're attacking all men for being men.

[00:34:15]

Oh, is it the "all"? Can I say "Bob is trash," then?

[00:34:18]

You can say "Bob is trash," because, as my sources explained to me, the distinction is that in the first instance you're attacking a category.

[00:34:26]

In the second instance, you're attacking a person, but it's not clear that you're attacking that person because they are a member of a protected category.

[00:34:33]

Oh, so Bob might be trash for reasons that have nothing to do with him being a man. Yeah, he just might be annoying. Right. Okay, so that explains why you take down "men are scum." But why would you leave up "black children are scum"? Why would that not get taken down?

[00:34:48]

So traditionally we allowed speech once there was some other word in it that made it about something other than a protected characteristic.

[00:34:57]

In Facebook jargon, these are referred to as a non-protected modifier — which means literally nothing to me. So she gave us an example of this.

[00:35:08]

So traditionally, if you said, "I don't like this religion's cab drivers," cab driver would be the non-protected modifier, because employment is not a protected category. Huh.

[00:35:21]

And so what the rule stated was that when you add this non-protected modifier to a protected category — in this case, the cab driver's religion — we would allow it, because we can't assume that you're hating this person because of his religion.

[00:35:37]

You actually just may not like cab drivers.

[00:35:40]

So in the case of "black children," "children" is modifying the protected category of "black."

[00:35:46]

And so "children" trumps "black" — age is a non-protected category. OK. So "children" becomes a non-protected modifier, and their child-ness trumps their blackness. You can say whatever you want about black children. Whereas in the case of white men, you've got gender and race, both protected. You can't attack them.
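
As a toy model of the logic being described — a hypothetical simplification in Python, not Facebook's actual rule engine, with invented category lists and function name — the old "non-protected modifier trumps protected category" rule works out like this:

```python
# A toy model of the pre-2017 logic described above. The category lists and
# the function are hypothetical simplifications, not Facebook's rule engine.

PROTECTED = {"race", "ethnicity", "religion", "gender", "sexual orientation"}
NON_PROTECTED_MODIFIERS = {"age", "occupation"}  # e.g. "children", "cab drivers"

def old_hate_speech_rule(target_attributes: set) -> bool:
    """True means 'take it down'; False means 'leave it up'.

    Under the rule as described, one non-protected modifier 'trumps' the
    protected categories, so the attack is allowed to stay.
    """
    attacks_protected = bool(target_attributes & PROTECTED)
    has_modifier = bool(target_attributes & NON_PROTECTED_MODIFIERS)
    return attacks_protected and not has_modifier

print(old_hate_speech_rule({"gender"}))                  # "men are scum"           -> True (remove)
print(old_hate_speech_rule({"race", "gender"}))          # attack on white men      -> True (remove)
print(old_hate_speech_rule({"race", "age"}))             # attack on black children -> False (stays up)
print(old_hate_speech_rule({"religion", "occupation"}))  # "[religion] cab drivers" -> False (stays up)
```

Which is exactly the outcome that struck Marcia as backwards, and which Facebook later amended.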

[00:36:15]

That's just a bizarre rule. I would think you'd go the other direction — that the protected class would outweigh the modifier.

[00:36:22]

Well, they made this decision, as they explained to me, because their default was to allow speech. They were really trying to incorporate, or nod to, the American free speech tradition.

[00:36:34]

And so there's a whole lot of stuff that none of us would defend as valuable speech, but that didn't rise to the level of stuff where we'd say, this is so bad, we're going to take it down.

[00:36:43]

And in this case, their concern was we're all members of like, you know, at least half a dozen protected categories. Like we all have gender. We all have sexual orientation.

[00:36:52]

And if the rule is that any time a protected class is mentioned, take it down —

[00:36:59]

— because it could be hate speech — what you are doing at that point is opening up just about every comment that's ever made about anyone on Facebook to potentially be hate speech.

[00:37:10]

Then you're not left with anything, right. No matter where we draw this line, there are going to be some outcomes that we don't like. There are always going to be casualties. That's why we continue to change the policies.

[00:37:20]

And in fact, since Marcia's debacle, they've actually updated this rule. So now black children are protected from what they considered the worst forms of hate speech.

[00:37:30]

Now our reviewers take how severe the attack is into consideration.

[00:37:36]

But despite this, there are still plenty of people — That is flawed, because you are a social network —

[00:37:42]

— including Marcia — who think this still just isn't good enough. There are not systematic efforts to eliminate white men in the way that there are for other groups. That's why you have protected groups.

[00:37:54]

She thinks white men and heterosexuals should not be protected.

[00:37:58]

Protect the groups who are actually victims of hate speech. Makes sense.

[00:38:03]

Well, yes.

[00:38:03]

Because in hate speech — or thinking about hate speech — there's this idea of privilege, or of historically disadvantaged groups, and that those historically disadvantaged groups should have more protection because of being historically disadvantaged.

[00:38:21]

And the challenge with that, as it was presented to me, was: OK — Thousands of Japanese reinforcements... to cut off the Chinese... — in the 1940s —

[00:38:32]

— you had Japanese soldiers who shot and beheaded tens of thousands of Chinese civilians, killing millions of Chinese during World War Two. At that same time, you had Japanese American citizens — One hundred thousand persons of Japanese ancestry —

[00:38:48]

— all of them being put into internment camps. And so we had to ask ourselves a question, like, are the Japanese a historically advantaged or disadvantaged group? Huh.

[00:38:59]

Japanese Americans, pretty easy to make a case that they were disadvantaged. But in China, it's a totally different story. And this happened at the exact same moment. So you've got two different places, two different cultural stories.

[00:39:12]

And when you have a Web site like Facebook — this transnational community — they realized, or they decided, that ideas of privilege are so geographically bound that there is no way to effectively weigh and consider who is privileged above whom. And they decided, therefore, that we are not going to allow historical advantage or historical privilege into the equation at all.

[00:39:43]

And I think it's very important to keep in mind here, as Americans: these moderators only have, like, four or five seconds to make a decision.

[00:39:55]

In those four seconds, is there enough time to figure out where in the world someone is, particularly given IP addresses can easily be masked? — Go back where you came from. — Is there enough time to figure out a person's ethnicity? — White children are better than black children. — On top of that, we often don't know an individual's race. — Straight people suck. — Other categories are even less clear, like sexual orientation. And they just realized it would be next to impossible to get anybody to be able to run these calculations effectively. When we were building that framework —

[00:40:31]

We did a lot of tests and we saw sometimes that it was just too hard for our reviewers to implement a more detailed policy consistently. They just couldn't do it accurately.

[00:40:43]

So we want the policies to be sufficiently detailed to take into account all different types of scenarios, but simple enough that we can apply them consistently and accurately around the world.

[00:40:56]

And the reality is, anytime the policies become more complicated, we see dips in our consistency.

[00:41:03]

What Facebook is trying to do is take the First Amendment — this high-minded, lofty legal concept — and convert it into an engineering manual that can be executed every four seconds, for any piece of content, from anywhere on the globe. And when you've got to move that fast, sometimes justice loses.

[00:41:26]

That's the tension here. And I just want to make sure I emphasize: these policies, they're not going to please everybody — they often don't please everybody that's working on the policy team at Facebook. But if we want to have one line that we enforce consistently, then it means we have to have some pretty objective, black-and-white rules.

[00:41:59]

When we come back, those rules — they get toppled. This is Danny from Denver, Colorado. Radiolab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at www.sloan.org.

[00:42:50]

Jad. Robert. Radiolab. Back to Simon Adler. Facebook. Free speech. So as we just heard before the break, Facebook is trying to do two competing things at once. They're trying to make rules that are just, but at the same time can be reliably executed by thousands of people spread across the globe in ways that are fair and consistent. And I would argue that this balancing act was put to the test April 15th, 2013. That's right —

[00:43:16]

— Thank you. We have some breaking news. We have some breaking news — otherwise, I wouldn't cut you off so abruptly. —

[00:43:24]

Monday, April 15th, 2013, just before 3:00 in the afternoon, two pressure-cooker bombs ripped through the crowd near the finish line of the Boston Marathon. And as sort of the dust begins to settle, people are springing into action. This one man in a cowboy hat sees a spectator who's been injured, picks him up, throws him in a wheelchair, and as they're pushing him through the sort of ashy cloud, there's a photographer there, and he snaps this photo.

[00:44:17]

And the photo shows the man in the cowboy hat and these two other people pushing this man, whose face is ashen from all of the debris. His hair is sort of standing on end — you can tell that the force of the blast, and the particles that got in there, are actually holding it in this sort of wedge shape. And one of his legs is completely blown off, and the second one is blown off below the knee, other than the femur bone sticking out and then sort of skin and muscle and tendons. It's horrific.

[00:44:51]

Meanwhile, in the Bay Area, on the other side of the country — I remember snippets of the day —

[00:44:58]

— Facebook employees were clustered around several desks, staring at computer screens, watching the news break.

[00:45:05]

— And this has occurred just in the last half hour or so. — I have memories of watching some of the coverage. — Chilling new images just released of the Boston bombings. —

[00:45:15]

I remember seeing the photo published online, and it wasn't long after that someone had posted it on Facebook. From the folks I spoke to, the order of events here is a little fuzzy, but pretty quickly, this photo is going viral.

[00:45:32]

And we realized we're going to have to deal with it. This image is spreading like wildfire across their platform. It appears to be way outside the rules they'd written, but it's in this totally new context. So they got their team together, sat down in a conference room.

[00:45:48]

I don't know, there were probably eight or 10 people thinking about, like, should we allow it, or should we take it down according to their rules?

[00:45:56]

Yeah. So if you recall the no insides on the outside definition that we had in place, meaning you can't see like people's organs or that sort of thing.

[00:46:05]

And if you can, then we wouldn't allow it. And in this photo, you could see — you could definitely see bone.

[00:46:12]

And so by the rules, the photo should obviously come down. Yep. However, half the room says no.

[00:46:18]

The other people are saying this is newsworthy. Essentially, this photo is being posted everywhere else. It's important. We need to suspend the rules.

[00:46:29]

We need to make an exception. Which immediately receives pushback.

[00:46:33]

Well, I was saying that what we've prided ourselves on was not making those calls. There are no exceptions. There are either mistakes or improvements. We made the guidelines for moments like this. To which the other side shoots back: Oh, my God, are you kidding me? Like, the Boston Globe is publishing this all over the place, and we're taking it down? Are you fucking kidding me? Damn the guidelines. Let's have common sense here.

[00:46:55]

Let's be humans. We know that this is important.

[00:46:57]

And, yeah, they're kind of right. But the reality is, like, if you say, well, we allowed it because it's newsworthy, how do you answer any of the questions about any of the rest of the stuff? In other words, this is a Pandora's box. And in fact, for reasons that aren't totally clear, team consistency — team follow-the-rules — eventually wins the day. They decide to take the photo down. But before they can pull the lever, word starts making its way up the chain internally within Facebook.

[00:47:32]

According to my sources, an executive under Zuckerberg sent down an order.

[00:47:35]

We were essentially told: make the exception. Huh. I don't care what your guidelines say, I don't care what your reason is — the photo stands. You're not taking this down. Yes. Yes, that's what happened. This decision means that Facebook has just become a publisher. Well, they don't think so. But maybe they have — they've made a news judgment, and just willy-nilly they've become CBS, ABC, the New York Times, the Herald Tribune, the Atlantic Monthly, and all these things all at once.

[00:48:11]

They've just become a news organization. Yeah.

[00:48:13]

And this brings up a legal question that's at the center of this conversation about free speech. Like, is Facebook a sort of collective scrapbook for us all?

[00:48:23]

Or is it a public square, where you should be able to say whatever you want? Or is it now a news organization?

[00:48:32]

I mean, I'm sorry to interrupt, but let me get to one final question that kind of relates to what you're talking about in terms of what exactly Facebook is.

[00:48:40]

And this question has been popping up a lot recently. In fact, it even came up this past April when Zuckerberg was testifying in front of Congress.

[00:48:48]

I think about 140 million Americans get their news from Facebook. So which are you? Are you a tech company, or are you the world's largest publisher? Senator, I view us as a tech company, because the primary thing that we do is build technology and products. But you said you're responsible for your content, which makes you kind of a publisher, right? Well, I agree that we're responsible for the content, but I don't think that that's incompatible with, fundamentally, at our core, being a technology company, where the main thing that we do is have engineers and build products.

[00:49:23]

Basically, Zuckerberg and others at the company are arguing, no, they're not a news organization.

[00:49:27]

Why? What would be the downside of that? Well, Facebook currently sits on this little legal island where they can't be held liable for much of anything.

[00:49:38]

They're subjected to few regulations.

[00:49:40]

However, were they to be seen in the eyes of the court as a media organization, that could change.

[00:49:46]

But setting that aside, what really strikes me about all of this is: here you have a company that really, up until this point, has been crafting a set of rules that are both as objective as possible and can be executed as consistently as possible. And they've been willing to sacrifice rather large ideas in the name of this. For example, privilege, which we talked about — they decided it was too geographically bound to allow for one consistent rule.

[00:50:18]

But if you ask me, there's nothing more subjective or geographically bound than what people find interesting or important — what people find newsworthy. And I'll give you a great example of this that happened just six months after the Boston Marathon bombing.

[00:50:37]

This video starts being circulated out of northern Mexico, and it's a video of a woman being grabbed and forced onto her knees in front of a camera, and then a man with his face covered grabs her head, pulls her head back, and slices her head off right in front of the camera. And this video starts being spread.

[00:50:59]

— I can't count how many times, like, just reading my Twitter feed —

[00:51:02]

— I've been, like, you know... — One person who came across this video, or at least dozens of others like it, was Shannon Young.

[00:51:08]

My name is Shannon Young. I am a freelance radio reporter. I've been living here in Mexico for many years now.

[00:51:15]

And her beat is covering the drug war. And in doing so, years back, she noticed this strange phenomenon.

[00:51:20]

It first caught my attention in early 2010. She'd be checking social media.

[00:51:25]

You know, you're scrolling through your feed and, you know, you'd see all this news — people saying, you know, there was this three-hour gun battle, and intense fighting.

[00:51:33]

All weekend long, folks were posting about clashes between drug cartels and government forces. But then, when Shannon would watch the news that night — I got a phone call from my friend —

[00:51:43]

— yes, yes, yes, yes... —

[00:51:46]

She'd see reports on the economy and soccer results, but the media wasn't covering it. There'd be no mention of these attacks, nothing to do with the violence.

[00:51:55]

And so she and other journalists tried to get to the bottom of this. Reporters in Mexico City would contact the state authorities, the public information officers, and they'd be like: shootings? Bombings?

[00:52:05]

What are you talking about? Nothing's going on.

[00:52:06]

We have no reports of anything. These are just Internet rumors.

[00:52:10]

The government even coined a term for these sorts of posts. The famous phrase at the time was collective psychosis.

[00:52:16]

These people are crazy because, you know, they didn't want the situation to seem out of control. But then a video was posted.

[00:52:30]

It opens looking out the windshield of a car on a sunny day. The landscape is dry, dusty, and the video itself is shaky — clearly shot on a phone. And then the woman taping starts talking.

[00:52:46]

And this woman, she just narrates as they drive along this highway.

[00:52:55]

She pans the phone from the passenger window to the windshield, focusing in on these two destroyed silver pickup trucks.

[00:53:06]

And she's saying, look at these cars over here.

[00:53:08]

They're, you know, shot up.

[00:53:09]

And, oh, look here, look here. You know, this 18 wheeler is, you know, totally abandoned.

[00:53:15]

It got shot up. At one point, she sticks the phone out the window to show all the bullet casings littering the ground.

[00:53:24]

And she just, you know, turned the official denial on its head. The government was saying there's no violence — here were cars riddled with bullets. It was impossible to dismiss. And from then on, you had more and more citizen journalists anonymously uploading video of the violence.

[00:53:52]

These lo-fi, shaky shots of shootouts, dismemberments, beheadings —

[00:54:01]

I mean, bodies hanging, dangling off of overpasses.

[00:54:11]

To prove to the world that this was really happening. They said, we're not crazy. That's a cry for help. Yeah. Which brings us back to that beheading video we mentioned a bit earlier. Yeah, that video of the beheading — a lot of people are uploading it, condemning the violence of the drug cartels.

[00:54:37]

And when it started showing up on Facebook, much like with the Boston Marathon bombing photo, this team of people, they sat down in a room, looked at the policy, weighed the arguments.

[00:54:47]

And my argument was: it was OK by the rules during the Boston bombing — why isn't it OK now, particularly given that it could help? Leaving this up means we're warning hundreds of thousands of people of the brutality of these cartels. And so we kept it up. However — What I think is utterly irresponsible, and in fact quite despicable of them —

[00:55:10]

When people found out — I'm talking, I have little neighbor kids that don't need to see shit like that — there was backlash.

[00:55:15]

— Is there really any justification for allowing these videos? — People as powerful as David Cameron weighed in on this decision.

[00:55:22]

Today, the prime minister strongly criticized the move, saying we have to protect children from this stuff.

[00:55:27]

David Cameron tweeted: It's irresponsible of Facebook to post beheading videos, especially...

[00:55:33]

People were really upset because of what it was showing. And so, according to my sources, some of the folks involved in making this decision to leave it up were once again taken into an executive's office.

[00:55:43]

And so we went up, and there was a lot of internal pressure to remove it.

[00:55:48]

And I went to my boss and said, hey, look, this is the decision we made. I recognize this is controversial.

[00:55:54]

I want to let you know why we made these decisions. And they made their case: there are valid and important human rights reasons why you would want this to be out there, to show the kind of savagery. And she vehemently disagreed with that.

[00:56:06]

They took another approach, arguing that if we take this down, you're deciding to punish people who are trying to raise awareness.

[00:56:12]

Again, she wasn't budging, and it just didn't get past that. And ultimately, I was overruled and we removed it. Just because there was pressure to do so, the same people that six months prior told them to leave the Boston photo up because it was newsworthy —

[00:56:31]

— said: take the video down. — Facebook this week reversed its decision and banned a video posted to the site of a woman being beheaded. —

[00:56:38]

— In a statement, Facebook said, quote, "When we review..." — But if you want the one from Boston, then you probably should have the one from Mexico, right? Was it a mistake? Yeah, I think it was a mistake, because I felt like, why do we have these rules in place in the first place?

[00:56:58]

And it's not the only reason, but decisions like that were the thing that precipitated me leaving.

[00:57:09]

Leaving? Yeah. Not too long after that incident, a few members of the team decided to quit. OK, we're going to break in here and fast-forward to the present. In our original broadcast of this story —

[00:57:29]

— Simon finished with one final story about a content moderator in the Philippines who, for personal and religious reasons, would ignore the rules entirely and just take down whatever she saw fit, which added just one more layer of difficulty to the whole problem.

[00:57:47]

But I think I can hit record on my end. Okay. I think it's working now. OK.

[00:57:53]

In just the past few weeks, all of these questions about newsworthiness and what Facebook should or shouldn't allow on their platform. All of it has gotten even stranger and harder in ways that are definitely going to be having an impact on the 2020 presidential election.

[00:58:12]

OK, well, so where are we going? We've got to start with President Trump.

[00:58:17]

President Trump has threatened to rein in social media companies claiming they're interfering with free speech.

[00:58:23]

This has sort of been all over the news.

[00:58:25]

— The president tweeted, quote, "Twitter is completely stifling free speech" — but it's also been in the shadow of larger news events going on.

[00:58:33]

Yeah, it's like, I know I've seen the Facebook headlines, but I honestly have not been able to absorb what's happening. The long —

[00:58:39]

— and short of it is: May 26, President Trump penned two tweets — attacking Twitter here, not Facebook — falsely claiming that mail-in ballots will lead to voter fraud and a, quote, rigged election.

[00:58:51]

So, I mean, just to be clear, the evidence for voter fraud is almost nonexistent. Yes. Everyone who's looked at this says this is not an issue. And so in response to these tweets, Twitter decided to do something.

[00:59:04]

They labeled these two tweets with an exclamation point and a bit of text that read, Get the facts about mail-in ballots. The most mild fact check one can imagine.

[00:59:14]

Yes, but it's also a gargantuan step forward compared to anything that they've done with Donald Trump, at least up until that point. I mean, it's the first time that Twitter or Facebook have fact-checked the president of the United States. Which, unsurprisingly... President Trump hit back at Twitter today.

[00:59:32]

He was not happy about it. The president was infuriated. They try to silence views that they disagree with by selectively applying a fact check. Fact check. Fact check. Fact check.

[00:59:43]

And he actually went so far as to draft and sign an executive order threatening to regulate or shut down social media companies that engage in this sort of fact checking. So you've got the left saying Twitter is not doing enough. The right is upset with Twitter for censoring conservative voices. Twitter's then in this position of like, what the heck do we do next? And into that uncertainty, Trump tweets the now infamous line.

[01:00:10]

When the looting starts, the shooting starts.

[01:00:12]

When the looting starts, the shooting starts. This pops up at 11:53 p.m. And according to the reporting, all of the Twitter execs got onto a virtual hangout. And just like many of the cases we just talked about at Facebook, they're like, what do we do about this? First thing they had to consider was that this tweet did break their rules. Just like Facebook, Twitter's got its own set of rules defining what does and doesn't constitute glorifying violence.

[01:00:43]

And according to that, this did glorify violence. But then at the same time, this is the president of the United States. And so after going back and forth, what they decide is that we're not going to take it down because we don't want to censor the president.

[01:00:58]

But what we're going to do is we're going to shield that information and neuter this tweet's ability to spread. Mm hmm. How do you do both of those things? So the first thing they do is, if you're scrolling down your Twitter feed and you come across this tweet, instead of seeing the text of that tweet, all you see is: this tweet has violated the Twitter rules by glorifying violence. And you then have to actually click on the tweet to view it.

[01:01:29]

So there's one click that they've put between you and the information itself. OK, OK.

[01:01:34]

Then let's say you want to retweet it, that you want to help this information spread, you want to put it in front of all of the people that follow you.

[01:01:41]

You click retweet, and instead of just being able to retweet, another box pops up that says, comment on this tweet. At that point, you then have to write your own tweet about Donald Trump's tweet in question.

[01:01:56]

And when you finally are able to tweet this, the way it now shows up for others is your comment.

[01:02:02]

And then again, that text, this tweet glorifies violence. Interesting. So now for someone else to get to it through your feed, they have to click and go through that same rigmarole. Mm hmm. So that's what Twitter decided to do.

[01:02:16]

But here's where you see a real fork in the road, because simultaneously we have a different policy, I think, than Twitter on this.

[01:02:25]

Trump posted the exact same thing on Facebook.

[01:02:28]

And as Mark Zuckerberg explained during an interview on Fox News, you know, I just believe strongly that Facebook shouldn't be the arbiter of truth of everything that people say online.

[01:02:41]

Our policy has been, for several years, that we are not going to infringe upon political speech.

[01:02:47]

I think in general, private companies probably shouldn't be, or especially these platform companies shouldn't be in the position of doing that. And we've been pretty clear on our policy.

[01:02:58]

And so we're not going to do anything. We're going to let this post stand. So we've got the same post, two different companies coming to two very different decisions. And so to make sense of this difference, I made another call.

[01:03:13]

I think that I understood. I mean, I get both decisions, but, and this is Kate Klonick, I just would kind of be happy for some consistency.

[01:03:21]

And the way she tells it, Facebook's decision to leave this post up really goes all the way back to that newsworthiness argument about the Boston Marathon bombing.

[01:03:31]

Yeah, it's a little bit more complicated than that. But because of newsworthiness, when a public figure or someone of influence in some way posts on Facebook like, say, Donald Trump or Michael Jordan, basically because they are a public figure, it's inherently newsworthy.

[01:03:47]

It's kind of a circular definition.

[01:03:50]

Essentially, Facebook is saying, yes, we have rules about hate speech and violence, but anything a famous person says is newsworthy enough that they don't have to abide by those rules, which means that they can just say whatever the hell they want.

[01:04:06]

Pretty much. This was one of the reasons why Donald Trump's statement, Donald J.

[01:04:12]

Trump is calling for a total and complete shutdown of Muslims entering the United States, about a Muslim ban, which would have come down as hate speech if anyone else had said it, was kept up by the platform, because they justified it as, well, he is a candidate running for office, a presidential candidate, and this is newsworthy.

[01:04:34]

And this is where you feel the rub, cause on the one hand, there's plenty of evidence that people in power using charged language can lead to violence.

[01:04:44]

But on the other, I think there's an argument to be made that, isn't it important for us to know what our leaders are saying and thinking, particularly when it's threatening or dangerous? And that argument has basically exploded within Facebook's own ranks.

[01:05:00]

Facebook employees attacking Mark Zuckerberg. Zuckerberg defending his decision not to... There's been an employee walkout. Others have actually resigned.

[01:05:09]

A number of senior Facebook executives publicly sharing their outrage.

[01:05:12]

And then sort of a who's who of former Facebook employees, folks who were there at the beginning, penned an open letter to Zuckerberg urging him to reverse his decision.

[01:05:23]

Yeah, I agree that the decision is dangerous. It's that simple. And so we actually reached back out to one of our sources from the original piece, a former Facebook employee, to get his take on this. Just like before, we're using an actor to conceal his identity. But he says, when you look at the reasoning, the rationale behind this public-figure newsworthiness exception...

[01:05:48]

In addition to it being dangerous and wrong, it was just ridiculously badly done. I mean, if you're going to come up with a pretextual reason, do it more effectively. The idea that you want to protect political speech, I have sympathy with. But you would not do what they're doing.

[01:06:03]

Basically, he's saying they're being incredibly selective in what political speech they're choosing to privilege. You just can't stand there and say you're all in favor of free expression while banning political cartoons for hate speech. And the idea that political freedom requires that the people with the most power in society don't have to follow the rules is obvious nonsense, right? The spirit of the First Amendment is meant to protect US citizens from the powerful, not the other way around.

[01:06:31]

So I don't know. It's repugnant to me. It's like an inversion of everything it's supposed to be.

[01:06:40]

The best parts of the Internet are that they amplify voices that do not traditionally have amplification on their own.

[01:06:49]

But what we're seeing in these types of rules is that they double down on amplifying and bringing the power structures that already exist in society to the Internet, allowing people who are already powerful, who are already newsworthy themselves, even more power to speak.

[01:07:11]

And that's what makes this moment so confusing, because on the one hand, you have videos coming out of Black Lives Matter protests that are shining a light on issues of policing and systemic racism. And we need to see those videos unfiltered. But then at the same time, right alongside that, you have posts from the leader of the free world encouraging violence. And there they are right next to each other, side by side. Now, lastly, I think it's important to keep in mind that none of this is static, that just like all their other rules and policies, this public figure carve out has been and is again being tweaked.

[01:08:06]

I mean, while I was tracking this, these very words I'm saying to you now, I received a notification that Facebook just removed posts from Trump's re-election campaign because those posts featured a symbol used by Nazi Germany. And not long ago, Facebook actually went even further in light of COVID. They actually halted this exception. They said, listen, if you're saying something that is false about the pandemic or about COVID-19, we are going to remove what you said.

[01:08:40]

No matter what. Wow. People were really happy with that decision. And I think that that was right, where we kind of... it was like pre-Boston Marathon days, and maybe where we should have been always. This episode was reported by Simon Adler with help from Tracie Hunte and produced by Simon with help from Bethel Habte. Big thanks to Sarah Roberts, whose research into commercial content moderation got us going big time. And we thank you very, very much for that.

[01:09:55]

Thanks also to Jeffrey Rosen, who helped us in our thinking about what Facebook is. To Michael Chernus, whose voice we used to mask other people's voices. To Caroline Glanville, Ruchika Budhraja, Brian Dukan, Ellen Silver, James Mitchell and Guy Rosen. And, of course, to all the content moderators who took the time to talk to us. And, do you want to sign off? Yeah, I guess we should. Do you want to go first? Yes.

[01:10:18]

I'm Jad Abumrad. I'm Robert Krulwich. Thanks for listening.

[01:10:29]

This is Les from Perth, Australia. Radiolab is created by Jad Abumrad with Robert Krulwich and produced by Soren Wheeler. Dylan Keefe is our director of sound design. Suzie Lechtenberg is our executive producer. Our staff includes Simon Adler, Becca Bressler, Rachael Cusick, David Gebel, Bethel Habte, Tracie Hunte, Matt Kielty, Annie McEwan, Latif Nasser, Sarah Qari, Arianne Wack, Pat Walters and Molly Webster, with help from Shima Oliaee, W. Harry Fortuna, Sarah Sandbach, Malissa O'Donnell, Tad Davis and Russell Gragg. Our fact-checker is Michelle Harris.