[00:00:00]

Today's episode of Rationally Speaking is sponsored by GiveWell. They're dedicated to finding outstanding charities and publishing their full analysis to help donors decide where to give. They do rigorous research to quantify how much good a given charity does, for example, how many lives it saves, or how much it reduces poverty, per dollar donated. You can read all about their research, or just check out their short list of top recommended evidence-based charities, to maximize the amount of good that your donations can do.

[00:00:28]

It's free and available to everyone online. Check them out at GiveWell.org.

[00:00:45]

Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense.

[00:00:52]

I'm your host, Julia Galef, and I'm here today with Timothy Lee. Tim is a senior tech policy reporter for Ars Technica. He's also written for The Washington Post and for Vox. And I reached out to Tim to talk about the thorny but increasingly important issue of how much tech companies should be moderating speech on their platforms, which is something that he's been covering very thoughtfully for the last few years. So, for example, people using Twitter for harassment or bullying, people creating subreddits on Reddit that are offensive or that could be considered hate speech, people or bots sharing fake news on Facebook, that kind of thing.

[00:01:37]

So that's what we're going to be talking about today. Tim, welcome to the show. Hey, thanks for having me on. So I guess, first off, I'm curious if my impression is correct that tech companies have been moving in a direction of more actively moderating speech in the last few years, and if yes, why do you think that is? Yeah, I think that's absolutely something that's happened. The biggest change you've seen is that there are some companies, Reddit and Twitter are probably the two most prominent, who used to take a pretty hard-line free speech position of: we're just an open platform.

[00:02:12]

We just help people connect and exchange information, and basically, as long as you're not breaking the law, we're not going to filter or restrict your speech. And they have increasingly backed away from that stance, I think largely because, as they've become more mainstream, there are just norms in the real world that have kind of seeped into the Internet, particularly around issues like race and gender. And also, I think the other thing that's happened is that having the Internet at a very large scale is different from having it when it was kind of a side part of people's lives. The Internet is now so all-encompassing for people that, for example, with online harassment, you know, 15 years ago, if somebody tried to harass you online, you just turned the computer off and it didn't matter.

[00:02:58]

But now, if you're checking your smartphone several times an hour and somebody is harassing you online, it really matters more. And so I think as the stakes have gone up, groups that previously just didn't care that much about what happened on the Internet, groups like civil rights groups, for example, have become much more concerned with these kinds of issues. And so it's been more expensive for companies to take the kind of hard-line free speech position that some of them took in previous years.

[00:03:23]

Right, right. And as social media, for example, has become a bigger part of our lives, it's sort of an increasing sacrifice, or burden, to just go offline. Yes. Although the flip side of that is, if your speech is moderated or censored, then losing that platform is sort of a bigger deal now than it was before. So I feel like the stakes are higher on both sides.

[00:03:54]

Yes, absolutely. To me, one of the most interesting aspects of this issue is trying to define what a social media company is. What kind of entity are they most analogous to? And I've heard sort of three main answers to this question in the discourse. The first is: they're private companies. They're making a product, and they can set whatever rules they want about who can use that product and how, as long as they're not breaking the law and as long as they're not discriminating against protected classes.

[00:04:27]

The second answer is: they're media companies. And the reason that becomes relevant is that, as such, they should be held responsible for their content. So this is often cited in the process of demanding that companies like Facebook take responsibility for the truth, like the veracity or the bias, of the content that's shared on their platform, even if that content was shared by users and not by the company itself, as a traditional media company would. And then the third answer is: they're public utilities, and they should be regulated as such.

[00:05:00]

And I mean, traditionally, public utilities are defined that way because there's only one infrastructure everyone's using, like for electricity or water, and so you don't get natural competition among different providers of that thing. And that's not true of social media. But there are these network effects where everyone wants to be on the same platform, and so that kind of ends up working like a natural monopoly on infrastructure. So, of those answers to the question of what social media companies are, private companies, media companies, public utilities, do you think one of them is closest to the truth?

[00:05:35]

Or would you give a different answer that I didn't list?

[00:05:38]

So I think I would draw the line a little bit differently. I mean, I think the private company thing is clearly true at a kind of basic level. Private companies have the legal right to basically run their platforms however they want. There are very strong legal protections: if a company wants to make, for example, a totally open platform where they're not moderating, they have basically absolute immunity for content that their users post, but the law doesn't say that moderating is illegal either.

[00:06:03]

But I mean, you can answer both legally and also sort of... Yeah, but I guess you're saying they can do whatever they want. They clearly can. The question is what should they do? And then the answer to that... I think two of the categories you said are useful categories. One is a utility, or I would say a platform provider. Right. Like Comcast, for example, is clearly just providing a platform.

[00:06:23]

Nobody expects them to moderate or censor content that flows across their network. And the media company is kind of the opposite extreme, where they are exerting editorial control. You know, The New York Times is a platform where it's The New York Times primarily distributing content that they have chosen to their audience. And then I kind of think there's a third category, which is community, where the function of the company is to help a group of people communicate with each other.

[00:06:52]

But rather than simply being a passive conduit for information, they are trying to cultivate a certain set of social norms, a certain kind of community, so that people are nice to each other and the topic is kind of steered in a particular direction. But they're not necessarily choosing individual pieces of content or individual people to elevate above everybody else. And I think that, in general, social media companies are some mix of media companies and community builders.

[00:07:21]

But I think the specific mix kind of depends on the specifics of the platform. For different platforms, you can make different arguments about how much they're one or the other.

[00:07:30]

Yeah. And on the legal side, do you think that the case for them being public utilities of some sort has any legal merit? Like, I've seen a couple of examples of potential legal precedents for this. There was one case, I don't remember when, probably at least ten years ago, in which some people wanted to do some political protest, or pass around some political petition, at a mall. And the mall didn't want them to. And the court ruled that the mall can't prohibit people from doing that, because even though the mall is privately owned, it has become increasingly the case that malls are like the public squares of our country, which is kind of depressing.

[00:08:12]

But nevertheless, it was true, I guess, more ten years ago than now. And so it's unfairly restrictive of our free speech for malls to regulate what people can and can't say within them. So that kind of thing, it's not directly about public utilities, but it's a precedent for sort of putting restrictions on private companies in terms of how much they can, in turn, restrict speech.

[00:08:40]

Yeah, I don't know of any significant legal momentum in that direction, or really efforts to establish those kinds of precedents. And it doesn't really seem like a good idea to me, because although it's certainly true that Facebook is a very large company that has a lot of influence, it is still true that you can use Twitter, you can use other kinds of platforms. And so I don't think we are, at least yet, at the point where very many people would really want the government pushing companies in that direction of: you have to carry certain kinds of speech.

[00:09:07]

I think almost all the pressure is in the other direction: certain groups of people would like companies to crack down on certain types of speech, as a matter of those companies' discretion, but under a certain amount of pressure from people who would like to see more heavily moderated platforms. How is this issue breaking down along political lines? Like, is it just liberals calling for moderation of hate speech and conservatives pushing back and calling for no moderation? Or is the political breakdown more complicated than that?

[00:09:41]

I think that's the broad outline. One of the interesting things you see on the left is that I really do think you've seen a schism, where different parts of the traditional liberal movement have been brought into conflict over this. So in the early days of the Internet, you had the civil libertarian wing of the left, you know, the ACLU, the Electronic Frontier Foundation, early kind of Internet civil libertarian or cyber-libertarian kind of people and companies.

[00:10:07]

That was kind of the dominant view of left-leaning thought on the Internet. And at the same time, in the offline world, you had groups like the Southern Poverty Law Center, or other kinds of civil rights groups, that were more used to seeing certain kinds of speech as problematic. And as those groups, and the kinds of constituencies they represent, have become more active and visible online, I think you see some intra-left disagreement, where you have some parts of the left pressuring technology companies to more aggressively censor certain kinds of speech.

[00:10:41]

And then you have other parts of the left that are a little less comfortable with that kind of thing. On the right, I think it's mostly reactive. Because most of the recent momentum for restricting speech is aimed at hate speech, quote unquote, which is probably the largest category, conservatives who believe that some of that hate speech is not actual hate speech, but actually kind of garden-variety conservative speech, are worried about that kind of thing going overboard and going after more generic conservative speech.

[00:11:13]

And so I think on the right, the people who are interested in this issue are mostly on what we'd call the free speech side.

[00:11:20]

Right. I did see Tucker Carlson call for the kind of regulation that I was describing a few minutes ago, regulation to protect free speech, which, I mean, I have no idea how representative that is. But it is interesting to me that if conservatives are in favor of, quote unquote, free speech, that puts them in the awkward position of supporting regulation of private companies to protect free speech, which is not a typically conservative position. That's been a little bit strange.

[00:11:50]

I mean, you saw Dennis Prager, a fairly prominent conservative YouTuber, who sued YouTube, arguing that certain YouTube policies were restricting him. I think it was largely over the demonetization of YouTube content, where sometimes, if content is controversial, YouTube won't take it down, but it will stop placing ads against it, which obviously, if your business model is an ad-supported YouTube channel, is a significant burden. He has sued YouTube, arguing that there's a problem with this.

[00:12:23]

I don't think that's gotten a lot of traction, and I think it's been a little bit opportunistic. Tucker Carlson is not exactly the most intellectually rigorous commentator, I think, and can be a little bit opportunistic on these kinds of issues.

[00:12:36]

So, "I support whatever principle happens to serve my self-interest right now." Yeah.

[00:12:42]

And I think what you're seeing happening is that certain tech companies, especially Google and Facebook, have become whipping boys for the right. And so there's a certain kind of populist opportunity: for anything that bashes what are seen as left-wing tech companies, there's a constituency. And the fact that some of the things you might do to bash tech companies are inconsistent with other principles that conservatives have doesn't necessarily stop everybody from taking those opportunities.

[00:13:11]

It's interesting that you say they're whipping boys for conservatives, because it also feels to me like they are whipping boys for liberals a lot of the time, or at least for some significant sections of the liberal side. Like, you know, they're now these powerful elites, and liberals are traditionally suspicious of powerful elites. Do you think that the backlash to tech companies, to the tech titans, is lopsided politically?

[00:13:39]

Well, I think there are different kinds of backlashes. So part of the problem these tech companies have is that they have their fingers in so many different pies that they've been able to alienate, you know, almost every corner of the political spectrum on some issue. Right. So you've seen a kind of totally different part of the left from the two categories I was talking about before: you have certain antitrust scholars and thinkers about economic policy.

[00:14:03]

Thinkers about that kind of power have identified Google and Facebook, and particularly Amazon in this context, as an increasing threat to innovation and Internet openness and so forth. They just don't like the idea of a few companies controlling so much of the content we watch and read. But they are much more focused on structural changes. They would like to see, say, Facebook forced to give up control of Instagram, and it's not clear what effect that would have, in any particular direction, on, say, moderation of hate speech.

[00:14:34]

And maybe if you had smaller, more independent companies, maybe they would be more subject to kind of grassroots pressure to restrict things. Maybe they wouldn't. It's hard to say. But that cuts in a different direction than the question of whether we should have more restriction of hate speech versus a more free speech position.

[00:14:50]

Right. On the legal issue: if companies are publicly traded, and they then have a responsibility to try to maximize their value for their shareholders, couldn't that, at least in theory, cut against the idea that they can just moderate speech however they want, or in response to public pressure? I think the courts are pretty deferential at that level of granularity. I mean, you're certainly supposed to act in the interests of shareholders, but you can easily make arguments on both sides, right?

[00:15:20]

I mean, the case for restricting speech is that if you have a very open platform, you have a lot of problems with harassment, you have certain minority groups feeling like they're unwelcome, and so actually you end up with a smaller audience. And I think there's some evidence for this. Right. The largest, most successful platform is Facebook, which is more aggressive about this kind of thing than Reddit and Twitter. And I think there's an argument that that has actually created a more kind of wholesome, family-friendly environment where a larger number of people feel comfortable.

[00:15:50]

On the flip side, obviously, if you are censoring speech, the people that made that speech are not going to feel welcome, and so overdoing it in the censorship direction could also limit your audience. And so I don't think the courts would want to get involved in trying to second-guess that and say, we feel that the policy you chose wasn't in the interest of shareholders.

[00:16:16]

On the political issue, I've been a little surprised that people who've been pushing for more active censorship, like, I don't know, pushing for Facebook to block fake news, or pushing for sort of more active restrictions on hate speech, or broader conceptions of what hate speech is... I've been a little surprised that they don't seem to be, on the whole, worried about having set these precedents and then having that come back to bite them.

[00:16:53]

So currently, it is the case that tech companies are basically liberal, in the sense that they're run by people who are left or center-left. The restrictions, for the most part, have been on things that the left dislikes. But it totally seems plausible to me that, you know, that could change. And, you know, maybe the tech titan ten years from now is run by a conservative, and he decides that it's hate speech to criticize the president, or something like that.

[00:17:24]

And then there's this precedent of: well, companies can just regulate speech however they see fit, as long as it's technically in keeping with the law. To me, that seems worrying. Like, from my perspective, we should be pushing for some policy that we think will be best overall in the long run, and not just best for our current situation, in addition to wanting to do what's fair, of course. But even if you were just self-interested, like, you just want to promote your own side, it seems like you should be worried about this precedent backfiring.

[00:17:55]

What do you think?

[00:17:56]

Yeah, I think that general worry is definitely important. I think the distinction you were talking about earlier, between platform providers, editorial kinds of decisions, and community building, is important. And I see a company like Facebook a little bit more in the editorial judgment business and in the community building business, in the sense that you're never going to have a Facebook that's really completely open, where, you know, there's just the, like, American Nazi Party page with racial slurs and stuff. At some point, just because of the kind of platform they've built and the kind of experience people expect from it,

[00:18:38]

Facebook is going to be moderating certain kinds of content, and the question is just how far they're going to go. And you can kind of criticize them in either direction for that. But I don't think it's really realistic to say that they should, like, never get involved in limiting that kind of content. I think when you go a little lower on the Internet, to, say, an ISP, which I already mentioned, there's this net neutrality debate where it's kind of the opposite, where most people want to legally prohibit the companies that run the Internet's infrastructure from discriminating among content.

[00:19:05]

But one particularly interesting example here is this website called The Daily Stormer. That's a literal neo-Nazi website that was just a standalone website; it wasn't on anybody else's platform. But activists started pressuring the companies that provide them with kind of basic Internet services: a service called DNS, which controls the domain name that people type in to go to the site, and also companies that provide protection against denial-of-service attacks, to kind of make sure you can stay online even if people try to force you off the Internet.
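As a rough illustration of what that DNS layer does, here is a minimal sketch using only Python's standard library. The domain is a placeholder, and this shows the generic name-to-address lookup every browser performs, not any particular provider's service or API.

```python
import socket

# DNS turns a human-readable domain name into the numeric IP address
# that browsers actually connect to. If a site's DNS provider drops it
# as a customer, the name stops resolving, even though the web server
# itself may still be up and running.
try:
    ip = socket.gethostbyname("example.com")  # placeholder domain
    print(f"example.com resolves to {ip}")
except socket.gaierror:
    # This failure is roughly what visitors hit once a domain has been
    # dropped by its DNS provider: the site becomes unreachable by name.
    print("Name resolution failed; the site is unreachable by name.")
```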

[00:19:42]

And activists pressured those companies to drop this website as a customer. And for several weeks, they were unable to be on the Internet, because they kept switching to new providers and having those providers drop them. And I think that is more problematic, because if you're kicked off Facebook but you can create your own website and create an online forum, and people can kind of congregate there, then whatever kind of out-of-the-mainstream ideas you might want to promote in the future, you're still going to be able to organize and reach people who are interested in hearing the message.

[00:20:11]

Whereas if you have a world where certain kinds of messages are considered totally out of bounds, and it really is literally impossible to have that content reachable on the Internet, then obviously you can spin out kind of worst-case scenarios, where maybe ideas you think are important become seen as out of fashion. So I'm definitely worried about that.

[00:20:30]

Yeah, that reminds me a little bit of the debate over whether bakers should be required to bake cakes for gay weddings, that kind of thing, where one distinction that some people made, which I thought was a pretty good distinction, is: could you easily go to another baker and get your cake made? Or is there, like, one baker per city or something, so that you are kind of at the mercy of that person's political leanings?

[00:20:58]

And of course, you could argue that even if the market is full of lots of different bakers, there should still be a requirement to not discriminate against gay weddings. But that distinction still seems relevant to me. And it's tricky with... Cloudflare, and what was it? Yeah, Cloudflare and, I guess, GoDaddy. It seems kind of wrong to force any one of them to host white supremacist websites.

[00:21:31]

And yet it also seems wrong if these websites, as long as they're technically, you know, not breaking the law, can't have a website on the Internet because no one will host them. Both those situations seem wrong, and I'm not sure how to resolve that tension.

[00:21:48]

Yeah, well, so I think the details of what happened here are important. One of the things that was happening was these denial-of-service attacks, which is: you can kind of go to the Internet underworld and contact people who just have lots and lots of server capacity. In some cases they've hacked into other people's computers and are kind of using stolen bandwidth. But anyway, they just flood targets with traffic.

[00:22:12]

And so that is a little bit like kind of mob rule. If you think about it in a physical context: if a controversial group is trying to hold a rally, and a bunch of thugs, you know, physically shut it down, you do generally expect the police to protect the physical safety of people. So that's a case where it's kind of not clear.

[00:22:32]

It's not exactly a matter of company discretion; there are various ways that third parties could put a ton of pressure on these individual providers. And I think that if you're providing basic infrastructure... it's like if you tried to get the electric company or the water company to, like, shut off service to controversial groups. I think there's a certain layer of the Internet that's kind of like that, where that's just really not the right layer for these kinds of battles to be fought out.

[00:23:04]

And I would rather have a completely open kind of base layer of the Internet, and then have the arguments on sites like Reddit and Facebook, where people kind of understand what's going on, and they know how to go to a different one if they don't like the way the one they're using right now is doing moderation. Got it.

[00:23:20]

Would your view change at all if basically everyone was just using Facebook? Like, Twitter and other competitors kind of died out, and just everyone in the world is on Facebook, and there wasn't really a viable alternative. Although in theory, people could, of course, just, you know, run a message board themselves, or have a comment section on their own personal website. It would just never get nearly the exposure of Facebook. So they would be at a huge disadvantage, but they could still technically host the discussions that they wanted.

[00:23:50]

I think to some extent we are in that world. Right. I mean, Facebook is way, way bigger than anything else, except maybe a couple of other sites; there are two or three sites that are like a huge fraction of the Internet. But I think it's important that, you know, if the American Nazi people, if there's a hundred of them and they want to start a website, they can do that. And so I'm not really bothered by the fact that unpopular ideas have trouble getting people to voluntarily go to their website and sign up.

[00:24:16]

Like, I think Facebook has this traffic firehose, and if they choose not to point it in a particular direction, that seems totally fine to me, as long as there is a relatively straightforward way for anybody who does want a particular kind of information to be able to go and get it. Because, I mean, before the Internet, if you wanted your ideas out, you had to, like, stand on a street corner and hand out pamphlets, and setting up a website is way easier than that.

[00:24:37]

So, assuming you have the basic infrastructure of the Internet working, it's never going to be the case that it's that hard for people with unpopular ideas to get the word out. I wanted to go back to what you were mentioning before about Facebook fighting fake news, and actually to stand up a little bit for the idea of fighting fake news, because I think this is something people misunderstand a little bit.

[00:25:00]

I mean, the thing that I think is important to understand about Facebook is that Facebook is not behaving like a neutral platform. Twitter primarily shows you a reverse-chronological list of what the people you follow are posting, so you see kind of a representative sample of what the people you follow are posting, and Twitter isn't really deciding very much of what you see. Facebook is different: the news feed is based on a proprietary algorithm that Facebook controls, and they use various variables that, in their judgment, make for a better news feed.
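To make that kind of variable-driven ranking concrete, here is a minimal, hypothetical sketch in Python. The signals and weights are invented for illustration, not Facebook's actual algorithm, which is proprietary; engagement signals like clicks and shares, discussed next, are the canonical example.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    clicks: int
    shares: int
    comments: int

# Hypothetical weights for an engagement-only feed. Nothing in this
# scoring function knows whether a post is accurate or good for its
# readers; whatever gets clicked and shared most rises to the top.
def engagement_score(post: Post) -> float:
    return 1.0 * post.clicks + 5.0 * post.shares + 3.0 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first, regardless of content quality.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("a", clicks=120, shares=3, comments=10),
    Post("b", clicks=40, shares=30, comments=25),
])
print([p.author for p in feed])  # "b" outranks "a" on engagement alone
```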

[00:25:41]

And in practice, I think they use variables that are bad for the world. I mean, engagement is a big one, right? How many times do people click and share? I think that pushes people towards content that, you know, would not be considered good by the criteria you'd use if you were, say, running a newspaper. And so when I, at least, suggest they ought to be fighting fake news, the main thing I would like them to be doing is to use the power they're already exercising in a more respectable, responsible way and say:

[00:26:14]

We are, in fact, making editorial decisions here, and we should not be doing the equivalent of, like, you know, putting candy bars in front of every customer because that's the thing that sells the best. We should be thinking about what, in the long run, will make our platform most useful and valuable for our users and for the larger society. That is a really important distinction, although it seems like earlier we were talking about how there's a lot of room for interpretation when you're talking about what actions maximize shareholder value.

[00:26:52]

And in that case, we were talking about, like, harassment or bullying, and there's a very strong case to make that, you know, you are maximizing shareholder value by coming down hard on harassers or bullies, even if that reduces the total number of people using your site. In the case of Facebook sort of willingly stepping away from their engagement-maximizing metrics, it seems a little tougher to just sort of throw up our hands and say, well, who's to say what actually maximizes shareholder value?

[00:27:26]

Like, there's a pretty strong case that increasing engagement maximizes shareholder value, and the case against that is pretty clearly just about benefit to society, which is not about shareholder value. That's a very important thing that I care a lot about. But if we're just talking about maximizing shareholder value, it's a little hard to make a case against engagement, right?

[00:27:49]

I'm not sure that's true. I mean, I think it's certainly possible. Certainly when you're a small site on your way up, maximizing engagement is going to grow your audience and so forth. At this point, I think Facebook might have a sticky enough audience that if they kind of dialed that back a little bit, people might feel a little less dirty, you know, seeing the news feed every day, but still feel enough of a pull that they'd keep coming regularly over the long run.

[00:28:12]

But also, now that I'm thinking about this more, I think it's really important not to overstate this idea about maximizing shareholder value. I mean, if you think about The New York Times, which is a publicly traded company, they do not decide what goes on page one of The New York Times by deciding what's going to maximize shareholder value. Right. They put investigative pieces, and kind of in-the-weeds policy pieces, that they think are important but are not necessarily going to sell the most papers, on the front page all the time.

[00:28:43]

And, you know, The New York Times is structured in a way that the Sulzberger family has these supermajority voting rights that allow them to exercise editorial control over the paper, even though they're not majority shareholders. And, you know, if you don't like that, you can buy a different stock. But I don't think anybody would say that The New York Times has an obligation to strictly do what will maximize the return to shareholders. There are other objectives, and I don't think there's anything in corporate law that requires that. And Facebook is in a similar situation, in that Mark Zuckerberg has effectively dictator-for-life status at Facebook.

[00:29:18]

He has a majority of the voting rights for Facebook and basically can't be fired. And so if he decided he wanted to take a short-term financial hit and, you know, put a lot of high-quality content, however you want to define that, in front of Facebook users, there's certainly nothing legally that shareholders could do about that, and I don't think there'd be any kind of ethical or moral problem with him doing so.

[00:29:40]

Yeah, and actually, just to push back on my own argument for a minute: it occurs to me that when I talk to people running tech companies, one thing that comes up a lot is that one of the hardest things for them is attracting really high-quality employees. They're sort of competing hard on that axis. And a large percentage, I don't know if technically a majority, but at least a large plurality, or a large minority, of employees care about the ethics of the company they work for.

[00:30:14]

And they're also susceptible to social pressure. Like, the people working at Uber have been kind of embarrassed, in the last year or two, to tell people they work at Uber. And that deals your company kind of a major blow in terms of your ability to attract the top employees. So you could really argue that it's in the best interests of the value of the company to do the ethical thing, and that that could become increasingly true the more social pressure is applied.

[00:30:47]

Yeah, absolutely. I mean, you know, people make fun of Google's slogan, I don't know if they still officially have it, "don't be evil," but it is absolutely the case, I mean, I've known engineers who work at Google, that people like working at a company that has a reputation for not being evil. And I think that slogan had a kind of specific meaning at the time they started using it. And one of the things you're seeing, I was saying before that there are kind of two halves of the left that have come into conflict.

[00:31:13]

I also think that in the kind of moral universe of Google employees, who are mostly left of center, you have kind of the same thing, where you have some people at a company like Google who really feel the priority should be making Google a more inclusive, sensitive kind of company, and that those values should also inform how they run their products. And so if there's a lot of harassment, or racist or sexist or other kinds of offensive speech, cracking down on that is actually following the moral values of the Google employee community.

[00:31:48]

On the other hand, you have more libertarian employees who really feel like the important thing about the Internet is its openness and sense of freedom, and so doing too much cracking down on that kind of speech is what actually runs afoul of Google values, or Facebook values, or whatever. And so, like you said, it's not just that the CEO has particular values or that outside groups are pressuring them.

[00:32:14]

I think a lot of people in these companies wrestle with those two values, different employees prioritize them differently, and how these companies behave is ultimately the result of those kinds of internal debates.

[00:32:23]

Yeah. So we've talked about some examples so far in which your take is: moderation is totally justified and good. And we've also talked about, in the case of the Daily Stormer, your take being that moderation, in the sense of refusing to host, is not appropriate, because in that case the service being provided isn't really a platform, it's just infrastructure, the infrastructure of the Internet. I'd be curious to hear any other examples you would put in either the clearly appropriate censorship category, the clearly inappropriate censorship category, or the third category of really tricky gray areas, where you're not sure whether censorship or moderation is appropriate.

[00:33:14]

I guess I would pretty much lean on the kind of infrastructure-versus-community distinction. I think that at the basic level of Internet service providers, DNS providers, web hosts, those kinds of companies, it mostly makes sense for them to take a pretty hard line: we're just a business that provides infrastructure, and we don't get into content. For other companies, I think it's more a pragmatic question: companies need to be consistent and need to be seen as kind of reasonable by their customers.

[00:33:43]

And one of the things that Matthew Prince, who's the CEO of Cloudflare, which is one of the companies that faced this dilemma, has emphasized is that it would be useful to think a little less about free speech and more about due process, especially for companies that need to operate internationally. If you are in a country like Germany, you're just going to have to moderate certain kinds of hate speech, because it's, like, straight-up illegal to have a Nazi website there.

[00:34:09]

But what everybody around the world can agree on is that you ought to have kind of consistent processes, where if you have content that is not consistent with whatever the norms of your community are, people should be able to find out what the rules are, to find out how their content was judged inappropriate, you know, have an appeal process, et cetera. And so I think part of the problem you're seeing with a lot of these large mainstream tech companies, like Facebook, Google, YouTube, et cetera, is that they're just starting to establish these policies.

[00:34:39]

And because there's a lot of internal disagreement about what the policies should be, there's just a lot of stuff that happens where there's no way to figure out why it happened or what the rules are, or there are seemingly inconsistent applications.

[00:34:54]

Yeah, absolutely. And, you know, to some extent it's just inherently difficult, because there's such a volume of material coming in that a person can't spend very much time on any individual decision. But I think one of the challenges that they're going to have to figure out is finding ways to make the process more transparent and more predictable, so that you know that, oh, if I go to Facebook, this kind of content is not allowed, and so I should set up this kind of account on Twitter instead, or actually post it on Reddit, or this one is just going to have to be on some forum.

[00:35:20]

And I think if people feel like they understand... I mean, one good example of this, I think, is with YouTube and the monetization situation, where a lot of conservatives have been complaining that their content gets demonetized, and it seems to just be mysterious why that happens. And my sense from talking to various people involved with this is that often it's just that the advertisers specifically don't want to be attached to controversy; it might not even be YouTube in particular. Oh, can they not even tell if it's coming from YouTube versus the advertisers?

[00:35:51]

I think it's not clear. Yeah, I haven't actually looked into the details of how the interface works, but the advertiser probably doesn't want it known, like, which particular advertiser it was, or whatever. And I'm not sure how it all works. But anyway, the point is that YouTube should be more clear about: here are the circumstances in which your video gets demonetized. And maybe, if it is the case that there's a video that is available for advertising in general, but no advertiser has agreed to pick up the slot, or whatever it may be, there should be ways to show that.

[00:36:24]

But I think that if companies could clearly explain, here are our policies and here's how that policy applied in this particular case, then some of the outrage would diminish, because even if people don't agree with the decision, they would at least say, OK, they're applying it consistently, and I can go somewhere else if I'm not satisfied with what they're doing.

[00:36:42]

Right. That seems especially important to me from the angle of preserving free speech, in that if you're unsure what is going to get you shut down or banned, then if you're at all risk-averse, as most of us are, that creates this pressure to err on the safe side. Like, if there's something that could be controversial or could get you banned, then maybe you just shouldn't say it. And, you know, maybe you end up...

[00:37:13]

Being more cautious than you technically needed to be to avoid getting banned, but you don't know that. And so that seems like an especially bad chilling effect.

[00:37:24]

Yeah, I think that's actually the other thing I would say about these platforms: I think they might be underestimating how much they are creating potential bad blood down the road with conservatives. I think tech companies are generally steeped in the political left, in terms of who their employees are and, you know, the culture of the Bay Area. And so I think there's a danger that conservatives will end up seeing individual decisions about types of hate speech as being overly aggressive, and come to see tech companies increasingly as just hostile to conservative ideas in general, as opposed to hostile to fairly narrow types of hostility toward protected groups of people.

[00:38:07]

And I'm not sure how you avoid that. I think they're just in a really difficult position, because you have an increasingly polarized country where each side kind of sees fairly common types of speech on the other side as a threat to them. Yeah, and one of the virtues of the stronger free speech position is they can clearly say: look, you know, we're just a platform, we're not endorsing any of that speech. But, you know, here are some tools.

[00:38:34]

Twitter has tried to do this to some extent, right? If you don't like somebody, you can block them, you can mute them. But there's a relatively high bar for actually banning people. Right. I think the more you start to actively moderate certain kinds of speech, the more there's an implicit endorsement of the speech you've chosen not to moderate. And so I don't think there's any rule that categorically all tech platforms should or shouldn't do this.

[00:38:58]

But it's something we need to think about really hard, because people pay attention, and people notice when there are inconsistencies in the way things are applied. And so if you start censoring one kind of speech, you have to think about: are there other similar kinds of speech that people are going to expect us to censor? And if not, are we going to seem like hypocrites?

[00:39:15]

What do you think happens when conservatives get fed up and stop trusting tech companies? What, like, do they try to create a competing conservative Facebook?

[00:39:26]

The best example here is Gab, which is a kind of Twitter competitor. When the Daily Stormer was shut down, I think a few of the leaders had Twitter accounts, but anyway, they certainly didn't after the website got shut down, and they all went to Gab. And Gab has had a pretty testy relationship with the rest of the Bay Area. They had an Android app that was kicked off the Android app store.

[00:39:55]

And they have sued Google over this. And so I think they have gotten a little bit of kind of populist grassroots outrage on their behalf, with conservatives rallying behind them against the big evil liberal tech companies. But it also is just a much, much smaller product than Twitter. I do not think it's gotten the kind of momentum that would allow it to become kind of the conservative Twitter, with, you know, tens of millions of users.

[00:40:18]

And I think the network effects of these companies are pretty large. So I think the threat is more that they will end up in kind of the political category that Hollywood is in for Republican politicians. At least until the Trump era, the tech companies were pretty good at influencing the George W. Bush administration, you know, being on good terms with Republicans in Congress. And I mean, there were always, I think, certain tech companies, like Google, for example, that have been seen as leaning a little bit to the left, but they've been pretty effective at having good relationships with people on all sides of the political spectrum.

[00:40:52]

And if tech companies become seen as identified too strongly with liberals and Democrats, then when Republicans are in power, that is just going to be bad for them on other issues that they care about. But it's not clear that there's an alternative, because they're also facing some pressure, as I mentioned before, from people kind of on the far left. And so if they're too careful about not alienating conservatives, they might end up with a kind of Bernie Sanders-type administration in the future that does things they don't like on that side.

[00:41:22]

So they're really in a tough position either way.

[00:41:25]

That is really tough. Well, Tim, before I let you go, I wanted to ask you for a book or blog, or just a thinker in general, who has influenced you in some way over the course of your career.

[00:41:41]

Yes, I was thinking about this recently. There's a writer named Clay Shirky, who I think is still a professor at NYU, and he wrote a book in 2008 called Here Comes Everybody. That was really the first book that I found that explained how the Internet was changing organizational structures and society. And the basic idea was that in the pre-Internet era, if you wanted to have any kind of large-scale organization, you needed a formal organization, like a nonprofit or a company, or the Catholic Church, that would organize people's activity, and that gave authority figures in those institutions a lot of influence over what happened.

[00:42:22]

And so, for example, he talked about the Catholic pedophilia scandal, where in earlier decades that would probably have been swept under the rug, but thanks to the Internet, survivors of child abuse were able to find each other and create kind of grassroots pressure that helped expose it. He has a bunch of examples like that in the book. And when I read it in 2008, it seemed like just a pretty optimistic take about how the Internet is enabling grassroots activism and allowing new kinds of organization that couldn't exist before.

[00:42:54]

And what I didn't appreciate, and which is more obvious now, is that obviously large institutions also do some positive things, and the erosion of the power of institutions also means that a lot of the kind of quality-control functions that large institutions perform also go out the window. And so you get things like, you know, Donald Trump getting nominated as the Republican nominee for president, which I think is hard to imagine in the pre-Internet era. And so I actually would like to go back and read it again and see if the whole thing reads differently now.

[00:43:26]

His thesis was kind of even more true than I appreciated at the time. But I think people notice the positive effects of empowering new groups before they notice the negative effects of eroding the power of groups that, you know, obviously had some flaws, but also were doing some good things that people never appreciated, because they had just been doing them for as long as anybody could remember. Yeah, well said.

[00:43:50]

Great. Well, we'll link to Shirky's book as well as to your work. And Tim, thank you so much for joining us. It's been a pleasure having you.

[00:44:00]

Thank you. It was fun. This concludes another episode of Rationally Speaking. Join us next time for more explorations on the borderlands between reason and nonsense.