
Transcript

[00:00:00]

Hey, this is Jad. Radiolab is supported by IBM. We want the best of both worlds. We want a hybrid. A smarter hybrid cloud approach with IBM helps retailers manage supply chains with Watson AI while predicting demand with ease. The world is going hybrid with IBM. Go hybrid at ibm.com/hybridcloud.

[00:00:23]

Hey, this is Jad. Radiolab is supported by Geico. Do you own or rent your home? Sure you do. Fortunately, Geico makes it easy to bundle your home and car insurance. It's a good thing too, because having a home is hard work. Go to geico.com, get a quote and see how much you could save. Geico.com. Easy.

[00:00:45]

Listener-supported WNYC Studios. Wait. You're OK? You're listening to Radiolab. Radiolab. From WNYC.

[00:01:09]

Three, two, one. Hey, I'm Jad Abumrad. This is Radiolab. Today we have got a special collaboration with The New Yorker magazine and The New Yorker Radio Hour.

[00:01:18]

Very excited about that.

[00:01:20]

So for the last several years, we here at Radiolab and by we I mean mainly Simon Adler, he we have been watching and reporting on Facebook, specifically how Facebook decides and then enforces what people can and cannot post on their site. As many of you know, the way that they do it is they've got this rule book, one single set of rules for the many countries in the globe that define what is possible and what isn't. And then they have a giant army of fifteen thousand souls who have to moderate all the crap that we put on Facebook.

[00:01:59]

Anyhow, in doing so, Facebook has managed to piss off well, just about everybody.

[00:02:07]

I mean, despite all of the time, effort and money that they have thrown at this problem. By taking posts down, they have been accused of censoring voices across the political spectrum and infringing on users' right to free expression.

[00:02:22]

Let them post pictures of nursing mothers! And then, by leaving material up...

[00:02:30]

To incite violence against the Rohingya. They've been accused of helping to incite a genocide in Myanmar, and arguably of helping swing the 2016 U.S. presidential election.

[00:02:47]

And I start here with this wrap-up because, since we last reported on all of that, Facebook has actually made a pretty big shift in how they are going to approach policing, refereeing, the world's speech. It's a shift that is going to have a massive impact on their decisions about what is and is not allowed on the site, including the question, which we'll talk about in a second, of whether former President Trump should be banned indefinitely from Facebook.

[00:03:12]

But more deeply, this is a shift that has Facebook really looking less like a company and, oddly, a little bit more like a government, an unelected government for the entire planet. So with all of that, let me now hand things off to Simon.

[00:03:30]

Hi. Hello. How are you, Simon? Are you rolling on your end? There we go.

[00:03:37]

Now I am really great. I will record myself on my phone.

[00:03:40]

Yeah. So a couple of months back, I called up academic Kate Klonick to talk about this shift and this research project she's been working on, documenting it.

[00:03:50]

I want to be done with this project. So, yeah, this has been your life. Yeah, it has like a little bit too much. So I'm ready to, like, you know, I'm ready to kind of do something different.

[00:04:03]

Kate is a professor of law at St. John's University. She's studied Facebook off and on for years. And she was at it again because back in twenty eighteen, Mark Zuckerberg, the company's CEO, was considering this strange proposal.

[00:04:17]

Yes. Like this crazy project to solve this crisis about content management.

[00:04:23]

I think, you know, I've been kind of on the inside for the last... like, a little over a year.

[00:04:30]

And Kate actually sat down with Mark to talk about all this. She did it over the computer. So you'll hear some clacking of keys.

[00:04:39]

But anyway, as he told her: I've said a bunch of times, I just think that it's not sustainable over time for one person, or even one company's operations, to be making so many decisions, balancing free expression and safety at this scale.

[00:04:59]

Like, I recognize that this is a huge responsibility and I'm not going to be here forever.

[00:05:04]

And, like, I plan to be running the company for a while, but one day I won't be running the company. And I think at that point it would be good to have built up a separate set of independent structures that ensure that the values around free expression and balancing these equities can exist.

[00:05:21]

Oh, interesting. Like, trust me, but I don't necessarily trust the next guy.

[00:05:27]

Right. And so, like a benevolent dictator, he wants to devolve power away from Facebook and himself.

[00:05:34]

And what he'd landed on as a model for how to do this was a Supreme Court for Facebook. I'm sorry...

[00:05:42]

A what, exactly? Like, what? Yeah.

[00:05:45]

So the proposal was pretty simple. It was creating a group of people from all over the world that would basically be the oversight on Facebook and its speech policies.

[00:05:56]

Essentially think of it as like the Supreme Court of the United States. But instead of overruling lower courts decisions, this Supreme Court of Facebook would be able to overrule Facebook's own decisions.

[00:06:10]

It's a hard pitch to make, isn't it? Oh, my God. One hundred percent. You can imagine how that went over. They're like, you want us to do what? That's how I imagine that going. Yeah, but Mark wanted this to happen.

[00:06:22]

And so it happened. It's part of a larger sense, I think, that he sees Facebook becoming more and more like, government isn't even the best term, but like a system of government.

[00:06:38]

I try to use the fact that I have control to basically help implement different forms of governance, like a long term legacy that he knows will not make terrible decisions.

[00:06:56]

This seems to be them catching up and being like, yeah, if you've got three billion users, you're bigger than any company, bigger than any country at that point. Your rules can be as impactful as any government's laws. And so you really need to start thinking of yourself in a new way. Yeah, I think that's right.

[00:07:16]

Has any company ever done anything like this before?

[00:07:19]

I mean, honestly, there's nothing that even kind of comes close. And I don't want to be grandiose about this, but there is a sense in which, what's the word, I felt like I was watching an experiment that, even if it completely and utterly failed, would be remembered and be a lesson for however the world ends up sorting out this problem of online speech.

[00:07:44]

And so once Facebook decided to build this court, they suddenly needed to figure out, like what cases would go to the court, who would be on it, how would they make these decisions?

[00:07:57]

And it became clear that it's not appropriate to have a single person answer these questions on behalf of society or this institution.

[00:08:07]

This is Brent Harris, who led Facebook's effort to build this board, this court.

[00:08:12]

And as one of his first decisions, he said we need to go out and actually listen to a wide array of people about what the problems are and the challenges are that they are finding, and ask them: what do they want this to be? What can we create?

[00:08:26]

And so they held dozens of listening sessions all over the world talking to lay people.

[00:08:31]

But the cornerstone of this process was really six global workshops where they invited experts to come and weigh in.

[00:08:41]

Kate was one of 40 or so people that attended the US workshop. It was held in the basement of the Nomad Hotel in downtown Manhattan.

[00:08:50]

And when she walked in, it was like walking into a technologist's wedding. You come in, and every table is decorated with succulents and bottles of water and an iPad. The iPad is not for you to keep.

[00:09:04]

And in fact, one of the Facebook people joked to me that they used couple-generations-old iPads to make sure no one walked away with them.

[00:09:16]

And so you have an iPad.

[00:09:18]

And ultimately this moderator came out and tried to get the room's attention.

[00:09:23]

And, of course, like, everyone's half listening. Most people are on their phones and, like, whatever else, in part because a lot of people in that room were just very skeptical of what Facebook was doing here. I mean, Kate herself remained somewhat skeptical of this court.

[00:09:38]

This is just something Facebook can scapegoat its really crappy decisions onto.

[00:09:43]

That was my main skeptical point in all of this, that Facebook is essentially erecting what will be just a body to absorb blame. But anyhow, the moderator explained what they were up to, that they brought these experts here to, in essence, design this institution.

[00:10:02]

So: what do you think this should be like? What should it look like? And some of it was, like, an answer to those questions. People brought up case-selection questions, board selection, who picks the board? And I would say a solid third of it was people standing up and holding forth on topics that had nothing to do with why we were there that day.

[00:10:21]

Less of a question and more of a comment like, holy cow. So many of those.

[00:10:29]

Eventually, though, they got to the heart of the matter, like, how should a global board think about these cases that are right on the edge?

[00:10:37]

What we wanted to do was really put people in the shoes that Facebook is in right now and taking these decisions.

[00:10:44]

So they told them like, hey, you are going to play mock court as a group.

[00:10:49]

You're going to have to decide whether a piece of content should stay up on Facebook or come down.

[00:10:55]

And so everyone was asked to open their iPad. So you were asked to, like, we're going to go over the first simulation. And, you'll love this, the first simulation that they did was the Kill All Men simulation.

[00:11:08]

Really? Yes. Wow. Oh, that's great.

[00:11:11]

Oh, this is the thing, the one that you focused on in the last story. I remember there was, like, a song in there.

[00:11:17]

Yeah, that's right. You're totally right.

[00:11:20]

We spent 10, 15 minutes dissecting this piece of content.

[00:11:23]

You know what? You should play this and just be like, here's what they focused on. OK, yeah.

[00:11:28]

I think we only need to do about three minutes of it, but here it is. Going to keep it moving right along. Next to the stage, please give it up for Marcia Belsky!

[00:11:39]

We did this story back in twenty eighteen. It's about comedian Marcia Belsky and a photo she posted.

[00:11:45]

Yeah, I guess I get so mad. I feel like my first time to the city. I was such a carefree brat, you know, I was young and I had these older friends which I thought was like very cool.

[00:11:57]

And then you just realize that they're alcoholics, you know. This is her up on stage. She's got dark, curly hair, was raised in Oklahoma. How did you decide to become a comedian?

[00:12:09]

You know, it was kind of the only thing that ever clicked with me, and especially political comedy.

[00:12:15]

I used to watch The Daily Show every day and inspired by this political comedy, she started this running bit that I think can be called sort of absurdist feminist comedy.

[00:12:26]

Now, a lot of people think that I'm, like, an angry feminist, which is weird. This guy called me a militant feminist the other day and I'm like, OK. And just because I am training a militia of women in the woods. At first, this was just a running bit online. On Facebook and Twitter, she was tweeting or posting jokes, you know, like: we have all the Buffalo Wild Wings surrounded.

[00:12:52]

You know, things like that. Eventually she took this bit on stage, even wrote some songs. Older white men should die, but not my dad. Anyhow, so about a year into this running bit, Marcia was bored at work one day and logged on to Facebook.

[00:13:17]

But instead of seeing her normal news feed, there was this message that pops up.

[00:13:22]

It says you posted something that discriminated along the lines of race, gender or ethnic group.

[00:13:30]

And so we removed that post. And so I'm like, what could I possibly have posted? I really thought it was, like, a glitch. But then she clicked continue.

[00:13:40]

And there highlighted was the violating post. It was a photo of hers. What is the picture?

[00:13:45]

Can you describe it? The photo is me as, look, what can only be described as a cherub: cute little seven-year-old with big curly hair, and she's wearing this blue floral dress.

[00:13:55]

Her teeth are all messed up. And into the photo, Marcia had edited a speech bubble that just says Kill all men.

[00:14:02]

And that's fine because, I think it's funny, you know. Trust me, whatever.

[00:14:07]

Facebook had taken it down because it violated their hate speech policy. I was dumbfounded.

[00:14:16]

And so back to present day, this is the scenario they put in front of these tech elites in the basement of the Nomad Hotel to see really how they would react.

[00:14:30]

Is that hate speech? What does that mean? And should that be up on Facebook or not? Leave it up or take it down? And so people started to discuss.

[00:14:39]

People were like, well, this wasn't funny. And someone else was like, does it matter whether it's funny or not? Back and forth and back and forth.

[00:14:46]

And even so, like, should men be protected?

[00:14:48]

Like men are more protected than other groups.

[00:14:52]

Eventually, though, the room pretty much came to an agreement: that is clearly humor or social commentary, that should be up on Facebook, and it's inappropriate for Facebook to take that down.

[00:15:05]

Yeah, I get that. I mean, I remember when we first did this feeling like, like this is a harmless joke, right?

[00:15:11]

And Facebook should be a place where harmless jokes can get made, because in this case, the joke only works because men are the power structure. If they weren't, it wouldn't be funny.

[00:15:22]

Yeah, it's punching up. There you go. It's punching up. Right.

[00:15:26]

But here's where things get interesting, because as we said, they did six of these expert global workshops: Berlin, Singapore, New Delhi, Mexico City, Nairobi. And in each of them, they ran through this Kill All Men scenario.

[00:15:45]

We ran that case across the world. And something that's very, very striking is we got really different viewpoints about should that be up on Facebook or not. Like, not just at the New York workshop, but in Berlin, another Western liberal democracy.

[00:16:04]

And even Singapore folks supported leaving it up.

[00:16:09]

And, you know, you'd think that folks who'd experienced more authoritarian governments and restrictions on their speech would also be for leaving it up.

[00:16:19]

But it didn't go that way. But I understand that. And I understand that. Can I laugh? Of course I can. Kill All Men, that's the most feminist radical joke that you can make.

[00:16:35]

This is Berhan Taye. She works for an NGO called Access Now.

[00:16:39]

We defend and extend the rights of users at risk around the world. And when she was shown this photo at the global workshop in Nairobi, which had attendees from all across the African continent, her first thought was: this is very funny.

[00:16:53]

And, you know, many of us that are feminists might have said that once, personally. Right? And I, like, I understand that to be a joke. So I'm like, yeah, of course there should be space for humor. And I know why satire is so important.

[00:17:08]

But, but I'm sensing a but. What is it? So, you know, how do I put it? So for me, right now, you know, it's funny, but, you know, humor is a luxury.

[00:17:23]

And we're not immune. None of us are laughing right now. So, yes, we've seen content like that, unfortunately quite prevalent, and we've lived through it. So it's not something that we joke about. Right. What is she, what events in the world is she thinking of when she says that? Well, some very recent history. And so we're going to take a little bit of a detour here to understand why Berhan would want that Kill All Men joke taken down.

[00:17:55]

And along the way, we're going to see, close up, really the life-and-death decisions this global court will have to make. We'll get to that right after a quick break. This is Lauren Furi from Western Springs, Illinois. Radiolab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at sloan.org.

[00:18:36]

Science reporting on Radiolab is supported in part by Science Sandbox, a Simons Foundation initiative dedicated to engaging everyone with the process of science.

[00:18:53]

Hey, this is Jad. Radiolab is supported by IBM. We all want sophisticated but simple, cutting edge made user-friendly. In other words, we want a hybrid, and so do retailers.

[00:19:03]

That's why they are going hybrid with IBM. A hybrid cloud approach with Watson AI helps them manage supply chains while predicting demand with ease, from retail to health care.

[00:19:15]

Businesses are going with a smarter hybrid cloud using the tools, platform and expertise of IBM. The world is going hybrid with IBM. Go hybrid at ibm.com/hybridcloud.

[00:19:28]

Whether you're debuting a new Gonads podcast series or launching a new startup, making big announcements can be daunting. When the stakes are high and attention spans are low, your messaging has to be as powerful and eye-catching as possible. Radiolab is supported by Squarespace, and they're here to help. Squarespace is the one-stop shop for your domain, website and marketing tool needs. They offer beautiful templates created by world-class designers, and everything is optimized for mobile right out of the box. Go to squarespace.com/radiolab for a free trial.

[00:19:59]

And when you're ready to launch, use the offer code RADIOLAB to save 10 percent off your first purchase of a website or domain.

[00:20:07]

True democracy was a myth at America's founding, and today we still have a lot of work to do. I'm Julia Longoria, and in a new weekly show from The Atlantic and WNYC Studios, we explore the powerful ideas that shaped the United States and what happens when those ideas collide with people's real lives. Listen to The Experiment on Apple Podcasts. Jad, Radiolab, here with Simon Adler. Yes, yes, yes, yes, yes.

[00:20:44]

OK, before we went to break, we met digital rights activist Berhan Taye, who was opposed to leaving a joke like Kill All Men up on Facebook. That is correct.

[00:20:54]

So why is that? What was she thinking? Yeah, well, I mean, it comes down to what's been going on in her home country.

[00:21:03]

It's absurdity in Ethiopia right now. In Ethiopia, there's a lot of animosity between different groups. A lot of tension.

[00:21:11]

And looking at just the past four or five years, there you see how these questions of who's punching up and who's punching down can get flipped on their head with the click of a mouse. So to set things up, Ethiopia sits right on the Horn of Africa. It's the second most populous country on the continent. And for a long time, it was considered one of the world's leading jailers of journalists.

[00:21:38]

Politically, the country used to be very authoritarian, very repressive.

[00:21:43]

This is online-activist-turned-academic Endalk Chala, assistant professor at Hamline University. And yes, I can say that me and some of my colleagues were, like, the first people blogging to the Ethiopian public.

[00:21:56]

He was actually forced into exile because of this activism. And the way he tells it, in 2015... the worst unrest in a decade.

[00:22:08]

The demonstrations started as a small scale student protest. Student protest broke out and they start spreading across the country. Thousands took to Ethiopia's streets over the weekend.

[00:22:18]

And watching this unfold from the United States, Dr. Chala noticed that at the center of these protests was this guy, Jawar Mohammed.

[00:22:27]

Yes, Jawar himself is a very tech-savvy guy. He's articulate in English.

[00:22:37]

Dissenting voices are allowed. That is going to be sufficient pressure on the government to break its will.

[00:22:42]

And he had about one point four three million followers on Facebook, making him as powerful as just about any news organization in Ethiopia.

[00:22:53]

Now, a couple of quick things about Jawar. Number one: he is from the Oromo ethnicity, the largest ethnic group in the country. And we'll get more into that in a moment. But first, the other notable thing about Jawar is that at the time that these protests were getting underway, he was actually living in Minnesota. He was in exile there, thousands of miles away from the action. At least 75 people killed.

[00:23:21]

But as these protests intensified, including clashes with the government, people died.

[00:23:28]

Two people were killed in clashes. And he was able to galvanize folks and direct things because of Facebook while living in America, thousands of miles away.

[00:23:48]

So that, sort of amazingly, when these protests succeeded, Hailemariam Desalegn has resigned amid deadly anti-government protests, Jawar was lionized as, well, a hero, one who'd helped usher in a new prime minister.

[00:24:09]

Ethiopia has a new leader, Abiy Ahmed. Abiy Ahmed ushers in a new era in Ethiopia. Since coming to power, Prime Minister Abiy Ahmed has engaged in listening to what the people of the country have to say for the first time in our entire, maybe, 3,000 years of history.

[00:24:29]

Again, Berhan Taye. We actually thought we could be a cohesive, united country.

[00:24:36]

The government freed thousands of political prisoners and journalists, the latest of sweeping measures. It invited dissidents exiled abroad to come back home. It even ended a decades-long conflict with neighboring Eritrea. A promise delivered.

[00:24:51]

I mean, these changes were so profound that Ethiopia's new prime minister, Abiy Ahmed, thanks in no small part to Jawar Mohammed, went on to win...

[00:25:02]

The Nobel committee has decided to award the Nobel Peace Prize to Ethiopian Prime Minister Abiy Ahmed Ali. That's right, the Nobel Peace Prize. So what you've got here is really the promise of Facebook realized, right? Like, a man from thousands of miles away leverages Facebook's power to bring down an authoritarian government and elevate a peace-loving leader. I mean, this is David-and-Goliath level.

[00:25:45]

And as part of all of these reforms: I will be coming back to the country. We have now established our office in Addis Ababa.

[00:25:53]

Jawar Mohammed returned to Ethiopia and was welcomed with open arms. However, while Abiy Ahmed's reform ambitions have increased his popularity, analysts fear that ethnic rivalries in Ethiopia will undermine his reforms.

[00:26:09]

The very forces that brought this change about began pulling in the opposite direction.

[00:26:14]

And I'm sure I'm going to get a lot of reaction for this, because everything is contested in Ethiopia. Every historical fact, everything. You know, you say people are confused, there is information disorder in the United States? That is just, like, child's play when you compare it with Ethiopia. But, yes, the first violence happened in 2018. There were gruesome pictures circulating on Facebook, along with, you know, anti-ethnic-minority sentiment.

[00:26:49]

But what were the ethnic tensions, and what was being said? Yeah, so, how complicated do you want to get? It can get into the weeds; it gets complicated. Well, OK, so as I mentioned, Jawar is part of the Oromo ethnicity, the largest ethnicity in the country.

[00:27:08]

And while the Oromo are the largest, they've also long felt politically and culturally marginalized. And there's this feeling of marginalization, this resentment.

[00:27:22]

This was really at the heart of the revolutionary protests that Jawar had helped lead.

[00:27:29]

So I'm just curious, are you Oromo first or Ethiopian first?

[00:27:33]

I am Oromo first.

[00:27:34]

I mean, many of his posts pointed directly at it.

[00:27:38]

He would say, a little more precisely, how the Oromo were marginalized. And that is absolutely OK with me, because there is some historical truth to it. But he's a guy who would, like, heat up the temperature, ramp up some emotions. As I said, we are forced to fight back, to coalesce together, to come together and fight back.

[00:28:06]

But now, even with the old government out of power and a new Oromo prime minister in power, Jawar Mohammed did not let up. He kept stoking this resentment.

[00:28:22]

To be honest with you, I think that there is a risk of, not civil war, but catastrophic communal violence across the country.

[00:28:31]

And with this inversion of power, statements he was making during the protests sounded very different in twenty eighteen. Like, even just the line "this is our land, this is our homeland" went from being about Ethiopians getting a corrupt government out of power to Oromos getting minorities out of their territory.

[00:28:57]

And quickly, the language began to escalate. He would ramp up with, like: protect your land; minorities, they are aliens; they're going to, you know, they are evil. Until eventually, October 2019. The riots began on the 21st of October 2019 and lasted for several days. A mob took to the streets, burned cars and killed several people they thought were their opponents. Eighty-six people died across the country. What caused this horrific outbreak of violence?

[00:29:32]

A Facebook post by opposition leader Jawar Mohammed.

[00:29:35]

One evening, from his home in Addis Ababa, Jawar Mohammed posted an unsupported claim insinuating that he was going to be killed by the minorities. And in his post,

[00:29:46]

He called on his supporters for help.

[00:29:48]

In response, some of his followers called for war. And, well, Jawar denies that he was intentionally inciting violence.

[00:29:55]

Hate flooded onto Facebook, calling for the killing of all minority groups.

[00:30:01]

Again, Berhan Taye. Content actually telling people, look, if your neighbor is from a different ethnic group, go and kill them. Literally, that was what we were seeing.

[00:30:10]

And then everyone started to take things into their own hands.

[00:30:17]

And, you know, they killed minorities. Everything that could go wrong went wrong. Minorities were brutally murdered, like brutal, brutal, brutal, gruesome violence. Minority communities being brutally targeted by the Oromo, the country's largest ethnic group.

[00:30:50]

When they tried to cut my granddaughter's breast, I took out mine and I begged them to cut mine instead. Then they stopped. But they took her father instead. And since then, the government just has not been able to get back to any sort of peace. At least five people were shot dead by police on Monday. The fatal shooting of a child. And so every couple of weeks, dozens have been killed, and there's just another outbreak of this sort of violence. And people say, presumably, that this is just political change.

[00:31:30]

And that is bullshit for me. I'm sorry, but that is what happened.

[00:31:46]

And so back in Nairobi, in an air-conditioned conference room where this Supreme Court of Facebook training session was underway, Berhan was sitting there staring down at this iPad with a photo on it that says Kill all men.

[00:32:06]

She's like: yeah, this has to come down. You know, I'm not supposed to, you know, even give space to having a conversation about content governance and moderation when it's about humor. And Berhan was not alone in this.

[00:32:21]

Many people felt that was an incitement to violence that could result in actual harm. Again, Facebook's Brent Harris.

[00:32:28]

And that is something that should not be on Facebook.

[00:32:31]

And so I think around 4:00 p.m., to be honest with you, I left.

[00:32:36]

She walked out of the session because, I was just like, no, this does not address the issues that we're talking about today.

[00:32:43]

Damn. What do we do? Because it really is a "we": what do we do if the very thing that people in New York, in an ironic way, say must stay up is the very thing that makes her walk out, because it's just utterly privileged and completely ignorant of the real-life consequences of hate speech? Wow. And keep in mind, these are just mock trials, training sessions really. Like, they ran into this just as they were trying to figure out how to answer these sorts of questions.

[00:33:23]

And now we will get to some of their actual rulings and the Supreme Court itself.

[00:33:29]

Yes, but first, like I think that the tension we're seeing here goes deeper than this one example.

[00:33:37]

I mean, at the core of Facebook is this very American understanding of freedom of expression. And you hear this even in the way Facebook executives talk about the company: more people being able to share their experiences.

[00:33:53]

That's how we make progress together.

[00:33:55]

You know, how many times has Mark Zuckerberg said some version of this: most progress in our lives actually comes from individuals having more of a voice.

[00:34:05]

But when you talk to people from different parts of the world, like, there's not universal agreement on this. I will definitely tell you that I found myself...

[00:34:16]

Oh, my goodness, I was not as liberal as I thought.

[00:34:19]

Again, Professor Endalk Chala. Facebook came and overwhelmed us with information. We didn't have a well-established fact-checking system. We didn't have journalism institutions.

[00:34:32]

We Ethiopia have only imported Facebook. We haven't imported the rest of the institutions and democratic foundations, the economic security, around which such untrammeled freedom of expression is beneficial.

[00:34:51]

So I thought that freedom of expression and technology would help liberate us and get us out of an authoritarian system. Now, I have seen people get angry, and they will take matters into their own hands. That's what happened. So it's about, like, a choice between coexistence or, whatever you want to say, it comes down to that for me. And as I've seen the violence that that speech has caused, I think I would prefer coexistence. And to put that opinion in perspective here: 80 percent of Facebook users are not American.

[00:35:32]

Eight zero. Yeah, really? Yeah.

[00:35:34]

And I think moderation is a very difficult task, one that's being done by people that have no fucking idea about our way of life. And unfortunately, it's us that are being affected over and over again with these things.

[00:35:48]

And you guys, I mean, is there anyone openly advocating for just abolishing Facebook? Yes, but I don't think anybody is taking that particularly seriously.

[00:35:59]

But, I mean, come on. Like, at a certain point, a private company becomes so...

[00:36:05]

Potentially toxic to the very basic functioning of a decent democracy? I don't know, man. I don't know, unless you can somehow break Facebook into a balkanized set of Internets where each one has its own separate rules. But I doubt that's even possible.

[00:36:23]

Well, engineering wise, it is possible. Facebook, in a few rare instances, already does employ some version of this.

[00:36:34]

I spoke to Monika Bickert, who is Facebook's head of global policy, and she explained that there are certain slurs that are outlawed in specific regions but allowed everywhere else.

[00:36:47]

And similarly, they do have to abide by local laws. But she did go on to say that, quote, If you want a borderless community, you have to have global policies and that she doesn't expect that to change.

[00:37:03]

No, no, that's crazy. You'd have to be so astute and so aware of regional context and regional history. I just don't think that's possible. So actually, now that I'm saying it out loud, I think they should be outlawed. I don't know.

[00:37:19]

I suddenly talked myself into a very extreme position, but it suddenly seems like what other solution is there?

[00:37:26]

Well, the solution Facebook has landed on is this Supreme Court.

[00:37:32]

After those global workshops, they took all that feedback and created this independent structure. It's going to have 40 members. It currently has 19. The members represent every continent other than Antarctica, and they're from a wide array of backgrounds. Some are lawyers. Others are free speech scholars, activists, journalists, even a former prime minister of Denmark.

[00:38:01]

And among the first decisions they're going to have to make is whether or not former President Trump will be banned from the platform indefinitely.

[00:38:12]

Facebook has currently banned him, but it will be up to the board to rule on whether that ban should remain or be lifted. And I mean, this decision won't just impact Trump.

[00:38:26]

It could very well have implications for how Facebook will deal with political figures, not just in the United States, but in places like Ethiopia.

[00:38:36]

Hello. Hello, Maina, very nice to meet you, virtually. How are you doing? I'm good. How are you, sir? All right.

[00:38:42]

And well, making the right decisions for the entire planet seems in many ways impossible.

[00:38:51]

When I sat down and talked to several members of this court, of this board, I have to say they did make me a little bit hopeful.

[00:39:00]

Thanks so much for being willing to do this. I hope we can have a little bit of fun here today. I hope so.

[00:39:05]

Yes. I think we should make as much controversy as possible. Oh, wow.

[00:39:09]

OK, well, this is Maina Kiai. He's a member of the board and a former special rapporteur to the United Nations.

[00:39:16]

And he's basically spent his entire life fighting for human rights.

[00:39:20]

And what struck me about him right off the bat is just how not on Facebook he is.

[00:39:27]

I haven't used Facebook or Twitter myself, really. I'm old school. I try to keep my private life private.

[00:39:35]

Why the hell were you chosen to be on the oversight board of a product that you don't even use? Why?

[00:39:41]

Yeah, because all kinds of people are being chosen for it.

[00:39:45]

I mean, that's the beauty of it, isn't it, that we have all kinds of people on the board, all kinds. And he sees the solution here in the incremental progress we've made in the past.

[00:39:54]

You know, look, I see this work as human rights work. And I have gone through, in my life, two different episodes of hate speech using radio: first in Rwanda, then in Kenya as well. The media can be abused. And then how do you rein them in, how do you mitigate them, and how do you mitigate them in a way that doesn't abuse human rights? So the tools and the problems are basically the same.

[00:40:17]

The difference, he said, is that mainstream media, before social media, had been regulated over time, over decades and years, and that regulation informed and guided how information was put out.

[00:40:29]

Just look at the five second delay that live television runs on now.

[00:40:34]

I'm sure when it started, live television and live radio were on the go. So I think those are the questions we now have to deal with at Facebook. But I have confidence that there is enough experience in the world that has dealt with these phenomena.

[00:40:50]

And this feeling resonates with most of the people I spoke to at Facebook.

[00:40:54]

I mean, I spent about fifteen years working on climate before I came to Facebook, and I think the issues here are deeply analogous. Again, Brent Harris. They are human generated. There are major regulatory actions that are needed. There's a serious responsibility for industry to step up and think about the responsibility that it holds. And the solutions that come forward, as we start to figure out how to address these types of challenges, will inherently be incremental. And at times I worry we will kill off incremental good progress that starts to address these issues, because it doesn't solve everything.

[00:41:34]

Is the Paris agreement enough? No. Is it a lot better than what we had before? Yes. Is the Montreal Protocol enough? No. Is it a substantial step forward against this challenge? Yes. And building this board is only one step in a wide array of many other steps that need to be taken on.

[00:41:54]

It sounds to me like what you're saying is this is the first piece in this global governance body Facebook is imagining. Well, if it really works, and people end up believing in it and thinking it's a step forward, then further steps can be taken. Nothing is ever perfect; there are always going to be issues. People will criticize the specific people who are on it, or criticize the process.

[00:42:21]

And I mean, when Kate Klonick, who turned us on to this story to begin with, interviewed Mark Zuckerberg, he said as much: the oversight board is not the end, it is one institution that needs to get built as part of the eventual community governance that I think we will end up having 10 years from now, or however long it takes to build all of this out. It is kind of a concrete step that we could go take.

[00:42:51]

And what they're thinking of in terms of next steps? One would be something like regional circuits, you know, a level of adjudication that is more regional or more localized and sits below this board as a means of taking these decisions. You mean like seven continental courts, or, I don't know, 50 sub-regional courts that feed up to the one Supreme Court? Yeah, that's right. And so what we're watching spring up here is not just a solution to what is truly one of the problems of our moment, but also this wholly new way to organize ourselves and sort of adjudicate our behavior.

[00:43:42]

Look, what we're trying to do is an experiment. I cannot tell you it will work, but I can tell you we will try to make it work as much as possible. And we will make mistakes. I have got no doubt in my mind that, being the humans we are, not yet evolved into saints and angels, we will make mistakes. That's part of the process. The oversight board started officially hearing cases in October.

[00:44:19]

They've already ruled on matters ranging from whether nude photos raising breast cancer awareness should stand to whether a post about churches in Azerbaijan constitutes hate speech. The board will render their decision on President Trump in the next few months. This story was produced and reported by Simon Adler, with original music throughout by Simon. Is it original music by Simon that we're hearing right now? Simon: It is indeed. All right.

[00:44:55]

As we said at the top, this episode was made in collaboration with The New Yorker Radio Hour and The New Yorker magazine. To hear more about the intricacies of how this court came into being, the rulings they've already made and what's coming up on their docket, check out David Remnick and reporter Kate Klonick's conversation in The New Yorker Radio Hour's podcast feed, or head over to The New Yorker Radio Hour online. And on that note, a huge thank you to Kate Klonick, whose tireless coverage of Facebook and their oversight board made this story possible.

[00:45:26]

We'd also like to give special thanks to Julie Aronow, Tim Wu, Noah Feldman, Andrew Marantz, Monika Bickert, John Taylor, Jeff Gelman and all the volunteers who spoke with us from the Network Against Hate Speech. Beautiful job. That's great. All right.

[00:45:58]

Hi, this is Claire Sebree calling from Lafayette, California. Radiolab was created by Jad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Dylan Keefe is our director of sound design. Suzie Lechtenberg is our executive producer. Our staff includes Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, David Gebel, Matt Kielty, Annie McEwen, Sarah Qari, Pat Walters and Molly Webster, with help from Shima Oliaee, Sarah Sandbach and Jonny Moens. Our fact checkers are Diane Kelly and Emily Krieger.

[00:46:35]

This Radiolab episode has been supported by Squarespace. Working on a project that you can't wait to share? Let Squarespace help with that. They have powerful tools to help you tell your story, share your updates and post photos and videos from anywhere. Plus, you can categorize and schedule your posts to make your content work for you. Head to Squarespace Dotcom slash Radiolab for a free trial. And when you're ready to launch, use the offer code Radiolab to save 10 percent off your first purchase of a website or domain.