[00:00:00]

We hold these truths to be self-evident, that all men are created equal. As a member of Congress, I get to have a lot of really interesting people and experts on to explain what they're talking about. This is the podcast for insights into the issues: China, bioterrorism, Medicare for All. In-depth discussions, breaking it down into simple terms. We hold these truths. We hold these truths with Dan Crenshaw.

[00:00:24]

All right. Ready to go. Kanyon Brimhall. That's quite the name. Thank you, sir.

[00:00:29]

Yeah, it does well with introductions. I like it. Yeah. Are you from Arizona?

[00:00:37]

Yes, sir. I'm like a sixth generation Arizonan, you know, born and raised, and now I'm out in D.C. So it's a very different place, a lot less dry and warm.

[00:00:47]

Yeah, yeah, that's true. It's about to snow here in a second. Did your parents, I mean, was there some Arizona connection to why they named you that?

[00:00:56]

Yeah. So my family, my mom's side, has a lot of Native American heritage, though as you can tell, it was diluted quite drastically by the time it got to me. One of my great-grandparents really wanted to help kind of carry on that family legacy, right, of Native American history and culture. And she basically asked my parents if she could name one of their children. And the name that she came up with was Kanyon. And my parents, you know, being from Arizona, my mom's pretty, pretty hippy.

[00:01:30]

You know, they were like, yeah, that's great. It's very Arizona and has the Native American tie-in. But to the general world, it's just really: I'm from Arizona, my name is Kanyon. It adds up.

[00:01:42]

It does make sense. That's a great introduction to Kanyon Brimhall, and people would be wondering why that is. We usually don't even start the podcast that way, but we did this time, and I like it. But the true topic is not your first name. It is reforming Section 230, which you happen to be an expert on. So Kanyon Brimhall, you're a federal government affairs manager at R Street, which is a think tank.

[00:02:07]

You work with the technology and innovation and the national security and cybersecurity teams to promote pragmatic free-market policy solutions to complex technological issues. So Kanyon, thanks so much for being on. Thank you for having me, sir, it's an honor. So let's just get right to it. You know, what's the best place to even start the conversation?

[00:02:28]

Maybe that's the first question, because I'm tempted to say, OK, first explain the initial purpose of Section 230, why it even exists in the first place. And then let's get into why people think either abolishing it or reforming it would fix a lot of the problems that we see with censorship on social media.

[00:02:50]

Yeah, I think that's great. And I think that the first and foremost thing that we need to do with Section 230 is clarify exactly what it is that we're talking about. Right. Because I think that there's a lot of misinformation that gets thrown around, and we end up in a game of telephone where nobody's really quite sure what we're talking about. So starting off with that historical context: before Section 230, you know, in the budding days of the Internet, there was a lot of litigation regarding content on the Internet.

[00:03:22]

You know, I'm not going to spout off court cases for you. Unfortunately, I wish I could, but I'm not that good at remembering the names of all the defendants and litigants. But long story short, websites were sued for hosting content, especially if they took down some content but left up other content. You know, an example could be people posting defamatory comments on my website, or what I consider, you know, Nazi propaganda or even Antifa propaganda, these kinds of very extremist positions, or death threats, harassment, pornography.

[00:03:58]

You know, general harassment is another good category. And what happened is people started suing the hosts of this information to say: you hosted this information, you should be liable for it. So examples include America Online, MySpace, you know, the earlier social media network that most people don't really talk about anymore, and a host of other services. America Online was one of the big targets because they were one of the predominant players in terms of providing Internet service to people.

[00:04:31]

And they, I don't know if you used AOL yourself, they had their own kind of sandbox environment with a lot of information that they provided to you.

[00:04:41]

It was, yeah, it was kind of funny in hindsight. Yeah. It was simultaneously a website but also an Internet provider, as I recall it. Yeah, exactly. I won't spend too much time on this. But funnily, my mom actually spent several years paying AOL to continue letting her log on to that Web portal, even though she could access her email for free using Internet Explorer or some kind of alternate platform, because she liked AOL.

[00:05:10]

She was used to it. She liked the format that they had, the way that they presented things. So she continued to pay for that. It was worth it to her for several years, until I convinced her otherwise. Yeah, we were all like that. Yeah, exactly. So now she's on Gmail, so she's great. So that was kind of where things were in the early days of the Internet. So Section 230: Congress passed the Communications Decency Act, which was originally an attempt by Congress to rein in a lot of indecent material online.

[00:05:43]

The problem that they saw was there wasn't enough ability for websites to moderate, because if they did, then they were getting sued. You know, an example would be if you took down one Nazi comment but missed another one, you could be sued for the Nazi comment that you missed. You could also be sued for Antifa comments that you didn't take down, because now those are on your website. You've shown that you take down some things, but you didn't take down this.

[00:06:09]

So you are now liable for this piece of content, even if it's completely different than what you have taken down in the past. You took something down, you're now liable for everything, right? So Congress wanted the Internet to not be full of pornography and all kinds of disgusting things on websites. That turned out real well.

[00:06:29]

Yeah, but I would argue that it has, because when you go to Twitter, how much pornography do you see?

[00:06:35]

Yeah, I don't, I have never seen, yeah, on Twitter especially, not, you know, hardcore graphic pornography like what you would expect to see on a pornographic website. Right. So yes, these things exist, but they are in appropriate places for people who pursue those things.

[00:06:54]

Similarly, I would love to also. But anyway, with that history, Congress said: websites, we want you to take down more content. We want you to be able to take these things down. We are going to empower you, using Section 230 and a whole host of other laws, to do that. Most of the other provisions in the Communications Decency Act were eventually found unconstitutional, usually on First Amendment grounds. Section 230 is one of the few sections that has kind of stood the test of time and has been litigated over and over again in a way that enshrined it as a protection.

[00:07:30]

One example that I'll give from the '90s, soon after Section 230 was implemented, of attempts to silence conservative speech by suing the platform that hosted it, was Blumenthal v. AOL. I said I wasn't going to quote court cases, and there I go. But Blumenthal v. AOL, no, not the congressman. They sued saying that AOL was hosting Matt Drudge's reports, the Drudge Report. And because of that, AOL was sued for defamation, because the Drudge Report was deemed by the litigants to be defamatory.

[00:08:15]

Thankfully, Section 230 existed at this point, and the courts were able to step in and say: this is exactly the type of thing that Section 230 was meant to stop. You are not able to sue America Online for defamation because Matt Drudge says something that might be defamatory towards you. So, I don't want to belabor it too long, but I think that's a good place-setting in terms of the history and how we got here.

[00:08:40]

Yeah, well, it's extremely important, because I don't think people quite understand the complicated nature of it and the need to protect the Internet from these sorts of whimsical lawsuits that could happen for all sorts of reasons that even conservatives would disagree with. You know, the goal for conservatives is pretty clear. We basically want Internet companies to abide by the spirit of the First Amendment, OK? And if not that, at least operate within a very clear-cut set of standards.

[00:09:14]

There's a huge difference between AOL and Twitter. You know, I would argue it's one thing for you to host a website like AOL, or even like a CNN website, the entire purpose of which is to publish content. Like, they are publishing content. CNN is publishing content that they control, that they're liable for. AOL is putting services online that they control, that they are liable for. But as a side note, they also allow commentary under these things.

[00:09:49]

So that's quite different than what Twitter is, where the entire purpose of Twitter, at least as they allege, is commentary. Right. It's a social media website, which didn't exist when Section 230 was written. So, you know, I think many people would argue, rightfully, that these are two different things and they require two different sets of regulations. And that's our goal as conservatives, because Democrats have a totally different goal: they actually want more censorship.

[00:10:15]

And I'm not sure how to meet in the middle on that when your goal is entirely different than mine. And so that's a political problem that we don't necessarily need to discuss. But if conservatives had the Senate, the House, and the presidency, and we were trying to figure out how to thoughtfully regulate this so that our goal is met, which is the maximum amount of free speech, while also understanding there are some things that should be taken down, like maybe pornography, maybe human trafficking.

[00:10:47]

Right. Maybe it's, you know, things that are not protected under the First Amendment, like illegal things and inciting violence, things like that. So what's the right way to get there? Do any of the current talking points and pieces of legislation really get us there?

[00:11:04]

Yeah, you know, I don't think so. I think that there are a number of attempts to get at Section 230, but I don't think that any of them succeed in solving the issues that you're outlining. And in particular, I'd like to dive a little bit more into what you're talking about, which is the desire to have social media companies adhere to the spirit of the First Amendment. Right. I think that, you know, in theory, that's great.

[00:11:32]

And I think that in practice they attempt to create a space where people can communicate, share ideas, and those kinds of things. And the core of Section 230, I think, is really strengthening to the First Amendment; all it really says is that a speaker shall not be held liable for someone else's speech. So the First Amendment also protects Twitter, Facebook, and these corporations and their right to free speech. I'm a strong property rights advocate, so I have a really hard time with this concept that because they have reached a certain size, all of a sudden these social media companies become kind of a publicly owned business.

[00:12:18]

We haven't, to my knowledge, used eminent domain to take over these companies. We haven't nationalized Twitter or Facebook, which I think would be a terrible idea. So I don't understand how we can simultaneously be pro property rights and say: you, Twitter, you, Facebook, we're going to restrict your property rights and your First Amendment rights on your platform. Obviously there's a difference in size and scale, but I think of this in terms of: if I owned a restaurant and someone came in and started yelling racist things, would I have the right to kick that person out of my restaurant?

[00:12:57]

Similarly, does Twitter have the right to kick somebody off of their platform if they engage in speech that they fundamentally disagree with? The other part of what you were mentioning that I really think is important to discuss and think about is the question of whether or not a platform should only be able to get rid of unconstitutional or illegal speech. So you mentioned that if they are held to the First Amendment, then speech that doesn't violate the First Amendment should be left up.

[00:13:28]

Is that a fair characterization? Correct. So I think a perfect example of why that doesn't work is pornography. Pornography is not illegal. But I, as a Twitter user, don't want to be flooded with pornography when I go onto the site. I think there's another great example of this, which is Parler, which I assume you're familiar with. They created their platform as an alternative to Twitter and Facebook, and they started off by saying: we will only remove illegal speech.

[00:13:59]

We will not remove speech that is protected by the First Amendment. So what happened? Their platform was flooded with pornography and pictures of feces. Feces, poop, poop everywhere.

[00:14:12]

Imagine if everything you posted on a social media website was responded to with a thousand pictures of poop. That would not be a good way to have a discourse.

[00:14:24]

Have you seen the comments under my Twitter posts? They might not be far from it. I mean literal pictures of poop.

[00:14:33]

I could probably find some. I mean, it's not illegal. I don't think Twitter, especially, well, you know, they have no interest in protecting conservatives either.

[00:14:40]

So that's kind of the point. I mean, it's very horrible stuff. You know, let's not pretend that Twitter isn't some garbage heap of human expression, because it absolutely is. Right. And you can find those kinds of pictures. You can find pornographic pictures on Twitter. It doesn't necessarily come across your feed, because it depends on who you follow. But I have a feeling that if somebody I followed posted that, I don't think Twitter takes it down.

[00:15:06]

Well, but what I'm saying is that that is different than what you're articulating there. This isn't somebody that you follow. This could be random people on the Internet coming to your Twitter page and posting, you know, pornographic pictures. That could be the way the setup at Parler works.

[00:15:20]

I'm not on Parler yet, you know.

[00:15:23]

Yeah, at first their stance was essentially: we will not remove constitutionally protected speech. But they have since learned from that mistake and have amended that policy. And they had to put out a clarification that basically said: well, of course we're not going to just leave up pictures of poop and pornography and things like that; of course that's not what we meant when we said we will leave up all constitutionally protected speech. But that is what they meant.

[00:15:51]

They just didn't understand the unintended consequences of saying: we will not remove it as long as it is legal. There are a lot of legal things that we still want to be able to remove from the Internet in order to make these platforms something that we want to use.

[00:16:05]

Yeah, OK. So yeah, I think we probably end up agreeing, because what I was saying earlier was: you need to have some ability to take things down that are clearly wrong. If you want to go slightly beyond protected speech, fine, I get it. Right. Pornography being an example. You know, maybe clear and transparent standards would be a good start. And by the way, I think if Twitter, Facebook, Instagram had just set clear and definable standards from the get-go, nobody would really be arguing.

[00:16:36]

If you said: look, you can't say these words; these are words that we just don't accept. Right. These are defamatory words, they're racist words, and if you use that word, your post gets taken down. That's a very clear standard to meet. If you say something violent, if you're clearly inciting violence, we will say, well, of course, that's not protected anyway. But you get my point.

[00:16:58]

Of course, the problem is the standards are loose, and they've become unbelievably partisan, and nobody trusts them anymore. And they have other ways of throttling. I mean, and I've seen this, you know, I don't like to act paranoid about it, but I've seen it sometimes on my own page, where you're like, this is just not how this normally goes. Right. There's a clear difference in what's happening with this particular post. I've seen pages similar to mine, as far as growth goes, all of a sudden stop, and I've seen Instagram influencers just stop growing.

[00:17:31]

All of a sudden, and it didn't make any sense; we've been following the exact same growth rate for a year now. So there are very obvious things that happen. And that's, you know, harder to prove, because you're not necessarily taking a post down or fact-checking it, which is a little bit more obvious to people. And so, again, it seems like you and I probably have the same goal here, and we're just not sure how to get there.

[00:17:55]

I, for one, have been very skeptical of the claims that just getting rid of Section 230 would fix all of this and not lead to a bunch of other consequences. Maybe let's start there, you know, because that's in the news right now. The president wants to veto the NDAA because it doesn't have the removal of Section 230 in it. Again, I totally agree with him on the need to do something about this. I'm not sure removing Section 230 is the right answer, and I sure as hell don't think that adding it to the NDAA is the right answer.

[00:18:25]

So, you know, but that's a different question. What happens if you remove Section 230 just in its entirety? Just abolish it. Gotcha.

[00:18:33]

Yeah, and that's a perfect question. I think that we really go back to, in many ways, the Internet before Section 230 that I was outlining earlier, which is: platforms would be sued for the content that creators put onto their platforms. Platforms being a very broad range of services, websites that we communicate with, especially anybody who has a comment section or some ability for users to participate. It would essentially force Internet companies into one of two options. One, moderate everything: essentially create gatekeepers that will preapprove all content before it's allowed on the Internet, getting more similar to a newspaper or news organization.

[00:19:17]

The mainstream media, which would be: we are liable for everything we put on here. We're going to vet it, we're going to check it, we're going to fact-check it and decide. And we'll even say, we don't personally agree with this, we don't want to attach our name to this as our stance, so we're not going to allow that to go up onto our site. The other option would be a totally hands-off approach that does zero moderation, because, as I mentioned earlier, taking down even one piece of content opens you up to litigation for all content on your website.

[00:19:51]

So that's where you end up in a difficult situation. I like to think in concrete terms, so I look at websites, especially ones for children. We absolutely want Section 230 protections for websites designed for children, because that's what allows them to take these things down. Again, going back to pornography: it's not illegal, but we definitely don't want children exposed to pornography. You know, when I was a kid, I used a website called Neopets.com.

[00:20:20]

I don't know if you've ever heard of that, but it's very tailored towards children. It's got fun games and message boards and those kinds of things. Without Section 230, imagine that website and what it would be like if they chose not to moderate at all, which would be one of their best options, legally speaking. That website could be flooded with pornography, child exploitation materials. It could even be something, you know, maybe relatively innocent, like feces.

[00:20:49]

You know, I don't know that we want our children to be exposed to that type of environment. Or they just wouldn't have comment sections.

[00:20:55]

I mean, because I guess the websites you're talking about are websites that allow some interaction by users.

[00:21:01]

Yeah, exactly. There's no real impact if you don't allow any interaction from your users. If it's only a one-way stream from the website to the user, then Section 230, at least as far as I understand, doesn't really apply.

[00:21:15]

Yeah. OK, so I think that makes sense from a legal perspective. You're being put in one of two extreme boxes: you're either getting massive censorship, because you don't ever want to be sued, or you're getting no censorship at all, which, you know, conservatives would be somewhat OK with, I think. But to your point on Parler, that's a good experiment that shows it gets a little wild out there.

[00:21:42]

And those websites can still exist in a Section 230 world. You know, you look at websites like Reddit and 4chan. Reddit has really cleaned up a lot of its subreddits in more recent years, and 4chan essentially got shut down and became 8chan. The idea there being: these are the free, anything-goes types of situations, where you can find a lot of racist, homophobic, defamatory content. You'd probably find a lot of pictures of poop on those websites.

[00:22:14]

You really, really bring it back to the poop. I really do bring it back to that. So I think that those types of websites still exist today, and they can exist, so we haven't foreclosed that avenue of speech. I just don't want that to be all over the Internet.

[00:22:32]

OK, so I want to read what Section 230 actually says. And I would argue, based on this reading, that Twitter, for instance, is surpassing the intent of 230 to an extraordinary degree by limiting political speech or even fact-checking, frankly.

[00:22:51]

And so, because it says: no provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be, and here are the words that are important, obscene.

[00:23:07]

These are all just synonyms: obscene, lewd, I hate this word, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected. Now, I think the intent of that is kind of obvious. I don't think that intent is met by fact-checking what is often an opinion, a subjective opinion, and political speech, which is what's driving everybody crazy. What are your thoughts on that? And could we make that clear in the law and fix this?

[00:23:40]

So I think that the way that was worded, particularly including the "otherwise objectionable" standard, and the reason there are so many synonyms, is because they wanted it to be all-encompassing. Their goal was that some small thing wouldn't fall through the cracks because someone could say, oh, that's not really lewd, it's just filthy. They said: no, none of that's allowed. And even if we didn't think of a reason why it's not allowed, here's a standard that you can apply to that situation.

[00:24:08]

Help me understand a little bit better: what exactly is the core of your question?

[00:24:19]

My question is: could we reword that line there to be more narrow? Like, I don't have a problem with the obscene, lewd, you know, because, for your example of poop, right, I don't really have a problem with these. But the "otherwise objectionable" part does open up the door for quite a few more examples of censorship. Yeah.

[00:24:43]

So I'll give you an example of an otherwise objectionable issue that would not be covered by the other words, were we to get rid of that portion of the standard. My colleague Shoshana Weissmann, who I believe was actually on the podcast with a long conversation. Yeah, yeah.

[00:25:00]

She loves sloths. She's really great. And she has written about this issue of dating sites. Dating sites are dealing with romance scams, where people will create fake profiles to catfish people, to trick them into sending money, essentially wrapping them in a web of manipulation and exploitation. That does not necessarily fall under obscene, lewd, lascivious, filthy, excessively violent, or harassing. Yeah. But it is otherwise objectionable, which is why we have that standard.

[00:25:38]

And this is one of the things that Shoshana has emphasized: if we eliminate Section 230, dating websites would lose the ability to remove fake profiles, because they don't meet any of those other criteria. So it's important to have that "otherwise objectionable" as a catchall. Because here's the other big point that I would make on this: getting rid of Section 230 would be a giveaway to trial lawyers.

[00:26:03]

Yeah, lawyers would absolutely love it if you made it so that more people could sue. Right. You know, you've got Josh Hawley, for example, a former attorney general, and obviously a very legally minded person, saying that he wants to be able to sue social media companies. That's kind of like a, you know, "no kidding" situation. A lawyer wants to sue people. Yeah, of course.

[00:26:29]

It's also worth noting that, you know, Joe Biden has called for eliminating 230. And given that the Democrats have a totally different intent in mind, which is more censorship, it gives me pause when we talk about this, because we have totally different goals here. And I'm sorry, but two totally different goals cannot be met by the exact same action, just logically thinking here. So, yeah, I completely agree.

[00:26:55]

And it always gets me in these hearings when, you know, the Republicans and Democrats will say: look how bipartisan this issue is. We're all here beating up on these tech CEOs; we all agree that we hate Facebook, we hate Twitter; we're so bipartisan. That's not what bipartisanship is.

[00:27:10]

We have two different goals. Exactly. It's not agreeing to hate the same person or company. It's agreeing to solutions. The solutions are where you get to bipartisanship. And they couldn't be more divergent on that, because exactly like you said, Republicans are pushing for more speech to be left up, less, you know, social media interfering and deciding what to take down or tag or any of that kind of stuff. And Democrats want more of that.

[00:27:38]

It's shocking to me to hear in these hearings. For example, congresspeople will bring up specific instances of moderation, and they'll say: I disagree with this moderation decision, you should not have taken down this post. And the companies will say, you know, I'm sorry, just based on our policies, et cetera, et cetera. Right afterwards, a Democratic member will stand up and say: I disagree. I think you should have taken down that post faster.

[00:28:04]

And I'm really upset with you that it was up for even 72 hours. How dare you leave defamatory information up for 72 hours. So, you know, I think that really gets to the core of my concern with the proposals for changing Section 230: they all put the government in some kind of arbiter position, deciding what is and is not acceptable speech, acceptable duration, censorship, all these various things. And that is exactly what the First Amendment is supposed to protect us from.

[00:28:38]

It is supposed to protect us from having essentially a ministry of truth that will tell social media companies they must leave up this content and must take down that content. That is government-compelled or government-forbidden speech. And especially with an incoming Democratic administration, I would think that conservatives and Republicans would realize that giving more power to the federal government to determine what speech is and is not acceptable online is not a good idea.

[00:29:07]

OK, so I'm running out of options here. So, well, here's the next question then. What would you change about Section 230 to modernize it?

[00:29:17]

Because it wasn't written in the scope of social media?

[00:29:21]

You know, I think that argument strikes me as very similar to when people say that the Constitution must be out of date because it wasn't written to assume all of the technologies of today.

[00:29:33]

You know, we can amend, though. We can amend the Constitution with an overwhelming majority of senators. So, absolutely.

[00:29:39]

But I don't think that its age is an inherently limiting factor in it. I'm...

[00:29:45]

Well, OK, well, let's put that aside. Forget about when it was written. But is it working right now, you know, to meet our goals here?

[00:29:54]

Yes, I would say yes, because social media companies, and websites more broadly, are not flooded with a torrent of litigation that causes them to shut down. And, you know, it's easy to get lost in this world of Facebook and Twitter and those kinds of folks, who, quite honestly, could probably handle at least some amount of that litigation. I worry much more about your mom-and-pop websites, who I promise you would fare very differently. But in the regulatory environment we have right now with Section 230.

[00:30:24]

I'm not hearing about mom-and-pop shop websites getting sued into oblivion for someone posting a comment or a review on their website. That is something that we don't have. And the real thing that scares me about changing Section 230 is the flood of litigation. Even the threat of litigation is enough to chill speech and to significantly change how businesses operate, even if the lawsuits are frivolous. Section 230 in its current form prevents those lawsuits from being brought in the first place.

[00:30:58]

And that kind of protection, especially for small players on the Internet, I think has given us exactly the Internet ecosystem that we have today. We have all of the largest, most successful technology companies in the world, sans maybe Spotify in Europe; that's pretty much their main success story. That's because of Section 230. I don't know if you're familiar with Jeff Kosseff. He wrote a book called The Twenty-Six Words That Created the Internet.

[00:31:26]

You read earlier, you know, the wording of Section 230. It's twenty-six words. It's really very simple, and I wish more people would read it. That is the foundation.

[00:31:37]

Do you have those twenty-six words in front of you? Can we read them? I sure do, yes, absolutely. So that's Section 230, subsection (c)(1), "Treatment of publisher or speaker": "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

[00:31:59]

OK, and that makes sense, right? It does. And then let's get back to intent. I'm going to read a quote from Andrew McCarthy. I think he puts this rather well. He said, quote: "Clearly, Congress was not of a mind to grant immunity from speaker/publisher liability to interactive computer services that suppressed content based on their subjective political objections to it. Such a grant would have contradicted Congress's afore-described objective to promote true diversity of political discourse." So that gets to the goal that we have.

[00:32:36]

Right. And so you've pretty well established that you don't think changing Section 230 would really meet that goal, because of all the unintended consequences associated with that. Literally an hour ago, I recorded a podcast with Brendan Carr, commissioner at the FCC. And he tends to agree. I mean, I think he's a little bit more on my side, on, like, maybe we can more narrowly focus Section 230. But he brought up something rather interesting, which might be a better solution, which is looking at consumer protection law. Because the thing is, these companies, Twitter, Facebook, they created their monopoly on our Internet discourse by basically claiming that this is where free speech is, and this is where you can connect.

[00:33:27]

This is where you can talk. And they got everybody in on that, OK. Then, as the years went on, they started to change their minds. Right. And they started to act especially as political actors. So, you know, are they upholding their initial sales pitch? Did they engage in false marketing to the consumer? Is that a better way to look at this?

[00:33:52]

You know, I'm not a lawyer, so I'd be remiss to tell you whether or not that would be successful. From my understanding, it has been ruled by the courts that terms of service, those kinds of voluntary agreements that the company lays out for its users, are not legally enforceable. So I don't know how successful you would be in bringing that litigation. But I definitely think that the FTC's pro-consumer-welfare standard is the proper way to look at these types of situations, particularly when it comes to antitrust situations.

[00:34:31]

I heard in there, you know, "they established their monopoly." And I know this isn't a discussion on antitrust, but I would push back a little bit on that and say the sentence itself, "Facebook and Twitter created their monopolies," is ironic; it seems self-contradictory. Twitter is not a monopoly, because Facebook exists, and vice versa. Also Parler, Gab, any number of other social media networks. So I don't think that these are monopolies to begin with.

[00:35:00]

And then the other part is, that's like suing a company because they have a mission statement saying: we want to make the world a better place, we want to do X, Y, and Z, and then suing them and saying: you didn't fulfill your mission statement. I don't think that's probably going to be very effective.

[00:35:18]

Yeah, well, OK.

[00:35:19]

Well, let's narrow the question then, because I agree that I'm not so sure litigation would be effective there, just based on current law. I think maybe the question is: is that where we focus our policy objectives? Is that where we focus future legislation, basically narrowing how they can actually write out these standards? Because what they do is have very vague standards. And I think we could argue that the enforcement of those standards, because they're so vague to begin with, is extremely arbitrary and extremely subjective, which in turn damages consumer protection rights.

[00:35:57]

I mean, whether it's new policy or new legislation, is that the better avenue to, again, get to the goal that we want to get to: force these companies to actually have coherent standards by which they operate? Which in turn, you know, you could argue is a consumer protection issue, because, yeah, they have these standards, and you have to sign on to them to create an account.

[00:36:25]

But the methods by which they're enforcing them are unbelievably subjective and partisan. That's well known at this point, and it's obvious. So, you know, you might argue that they're already violating consumer protection law. And if they're not, then should we be looking at policy or legislation that defines that better?

[00:36:48]

Yeah. So I think that when we get into these areas of talking about what they promised to people and then how they've changed over time, that's actually really indicative of exactly the problems that I see with putting government into this position.

[00:37:05]

First of all, I would say Facebook was not originally started as some bastion of free speech on the Internet. It was started by Mark Zuckerberg as a way for college students to hook up with each other, you know, if we're being honest about the situation. So even if at some point they had said, yes, we should have these lofty goals, that was never their true goal to begin with.

[00:37:30]

What we have seen over time is actually political influence. Political pressure has forced them to become more and more moderators and determiners of content. You know, I look back to 2016, the Donald Trump election, and how much complaining there was, especially from Democrats, about misinformation and Russian manipulation of our election system, with all of the speech that was proliferating on these platforms. As a result, Facebook and Twitter and other social media platforms have essentially bent to political pressure to say: OK, we're going to moderate more, we're going to take down more content, more speech.

[00:38:10]

And I have a really hard time imagining how we would craft a law that would not just further that exact treatment. You know, if you've got Democratic senators, Democratic members of Congress, President Biden, who you mentioned has come out against Section 230, if they're the ones writing the rules for what is an acceptable term of service, what level of transparency or neutrality you have to meet in order to receive these Section 230 protections...

[00:38:39]

I think we're going to end up in a worse place, not a better place. And just in general, I think that this strikes at the core of, you know, as a defender of the First Amendment first and foremost: these companies have a right to speech. So I don't know that it's appropriate for us to say they can't have certain speech, or they must have certain speech, or their terms of service must be specifically worded in these ways or can't include this information.

[00:39:05]

Yeah, I think that's a really slippery slope that puts the government in control of what speech is and is not acceptable on the platforms.

[00:39:15]

OK, so you've kind of shut down every single solution I've sort of offered here. And so the final question would be: is there a solution, if we both agree that there's a problem, and that we have a specific goal in mind that we agree on, and we're not meeting that goal? Which is basically protecting the spirit of the First Amendment and, more importantly, stopping the arbitrary and subjective censorship of political speech, which arguably, and I think strongly arguably, I know that's not correct

[00:39:45]

English, influences elections in major ways and has extreme consequences. So what is the solution?

[00:39:54]

Yeah. So I am not going to say that there is no bias on social media platforms. I think it's obvious that there is. You know, their user bases skew liberal; there are more liberals on Twitter and Facebook in general than there are conservatives. Their employee bases skew liberal; they're located in Silicon Valley. I think that all just makes sense, and to the average person, that's intuitive, right? Of course there's bias in these situations.

[00:40:24]

My issue is, I don't think that putting that power in the government's hands would make it less biased. And I'm more interested in a world of competing biases, because I don't believe that there is such a thing as a truly neutral arbiter of facts and information. I think a lot of things come down to subjectivity and opinion. So I'm much more interested in ensuring that there is a competitive marketplace where alternatives to Facebook and Twitter can exist. Like Gab, like Parler.

[00:40:55]

So, you know, going back to what happens if we get rid of Section 230: I don't think those websites exist. I think that those websites get destroyed under crushing litigation before they're able to get off the ground. But I think that Facebook and Twitter probably continue to exist, because they're the large incumbent players who are able to engage in regulatory capture. You know, it's not a coincidence that Mark Zuckerberg says: oh, yes, please regulate us.

[00:41:23]

We would love it if all of these things we do voluntarily right now, we would love it if all of our competitors were forced to do those same exact things. Of course they feel that way; that would benefit Facebook and harm their competitors. So there is obviously some amount of censorship online. But the other part that I want to emphasize is: we have to be very careful, especially as conservatives, that we don't shoot ourselves in the foot and make something that is maybe not perfect

[00:41:56]

even worse. You know, I look back to Donald Trump's comments. He said: without the tweets, I wouldn't be here. Straight up, he said that without tweets, without Twitter, he does not think he would be president, and many people have agreed with him. That was a way for him to cut through the mainstream media, to cut through the gatekeepers and the people who would censor him, and get his message directly to his followers. So I think the fact that Donald Trump has used it successfully, and used it to become president, counters the narrative that the Internet is just a wasteland for conservative thought, as if it's not useful for conservatives.

[00:42:37]

If anything, I would caution people: there are analogies made to, well, you know, CNN or newspapers, they're subjected to these regulations, why aren't Facebook and Twitter? Well, how much conservative commentary do you get from those publications? Do you really want how the mainstream media operates to also be how Facebook and social media companies operate? So I think that it's important for us to protect the opportunities for speech that we have, especially for conservative speech that would otherwise be censored.

[00:43:10]

You know, the social justice warriors would make sure that Donald Trump was not heard if we subjected these platforms to increased regulation. So I would actually say that the status quo today with Section 230 is much better for conservatives than the alternative of a world without Section 230, even if it's not perfect. Yeah, yeah.

[00:43:32]

I mean, I definitely see your points in many cases there. And I have to bring that up when I talk about it, too. I say, you know, the media itself is so unbelievably biased. I mean, I've got to deal with it every day: their ability to create narratives by writing objectionable stories and then using that as a sort of persuasion, by listing: look at this, this, and this.

[00:44:02]

And there's only a tiny kernel of truth to each of those, but taken in aggregate it seems to be true. They're very good at spinning narratives in that way. The only way for somebody like me to fight back is through my social media accounts. Right.

[00:44:14]

Exactly. And that is absolutely true. But they're trying to infiltrate that as well. Right. Because I guess the left understands that our only way as conservatives to get our voices heard is on these platforms, and they're trying to chip away at that. And that only seems to be going in one direction. So, I mean, I kind of have to keep asking the question. We have to stop that direction, we have to stop that flow, before we're completely censored out of existence.

[00:44:43]

Like, you know, again, the mainstream media has already done its job by allowing the left to completely control that institution.

[00:44:53]

And even though we have our own conservative media, well, as we've seen with, say, the New York Post story on Hunter Biden, social media companies will disallow them from even promoting their protected First Amendment right of freedom of the press.

[00:45:09]

So, I mean, it's just going in a bad direction. And the only entity powerful enough to stop it is the government itself. I mean, I would love to believe that competition is the answer, but Parler is really just a place for conservatives to talk to each other at this point. It's not reaching your average citizen, whereas Twitter and Facebook do.

[00:45:31]

I mean, they do have a stranglehold. And I understand that, yes, there's two of them, and in theory that's competition. But in practice, in all practical reality, it is not. It is a monopoly run from Silicon Valley by a select group of people that work there who all think exactly the same. So, you know, this is why conservatives are so hungry for some kind of solution, because I think the idealism of the libertarian stance on this, which is that competition solves everything, isn't necessarily working in this case, and we're hungry for a solution.

[00:46:06]

I mean, you even said it's not perfect. So I guess the question is: is there a way to make that regulation perfect? You know, I'm not so sure it means government regulating that speech. I think it means a law that is written, whether it's along the consumer protection angle or along the Section 230 angle, that just more narrowly defines what you can take down. I mean, could it be as simple as saying subjective political speech is not part of this protection?

[00:46:36]

Right. Or you can't do that, because of the overreaching consequences of that. You know, again, the fundamental question of what government can do. I mean, the government already, I mean, Section 230 already uses words like obscene, lewd, filthy, excessively violent.

[00:46:55]

So it's already doing it. So what is the problem with adding a few more words to that sentence, to more narrowly define or exclude what they can do, you know, in order to protect the spirit of the First Amendment?

[00:47:10]

Yes, I think first and foremost, and that's my answer, protecting the spirit of the First Amendment also means protecting the rights of those individuals to take down speech that they don't want on their platform. Yet we get into this tricky situation where, you know, I don't really fully understand exactly where conservatives, who are generally pro property rights, start to think: OK, but you're too big. And again, I would push back on the monopoly idea, because I don't believe that's accurate.

[00:47:39]

And looking back, you know, once upon a time, Facebook was supposedly a monopoly. Or not Facebook; MySpace was supposedly a monopoly. And it still exists. MySpace is still a thing. You could go and make an account right now. You might even still have one if you used it back in the day. But they are not a player in the scheme of things. This idea that Facebook is a monopoly is really kind of surprising to me and seems very shortsighted, because I barely use Facebook myself.

[00:48:10]

The people my generation I know don't. And one of its biggest competitors, if not its biggest, is probably TikTok, which is a rising star, not Twitter, the commonly named competitor. Google, right. Google is commonly touted as a monopoly, or a giant company with outsized power and the ability to dominate markets. Well, OK, what happened to Google Plus, the social network that Google started? It failed, because people didn't like it. So while in this very moment right now it seems hard to imagine a world where Facebook and Twitter aren't the dominant players...

[00:48:47]

I think that in some ways that's a failure of imagination, because if we look to the future, there will almost inevitably be new services, new platforms that will come and challenge them. You know, even the acquisition of Instagram was itself Facebook attempting to innovate ahead of the curve in order to stay in business. And they may not have been as relevant today as they are if they hadn't done that, because Instagram was a budding competitor. So first and foremost, I would challenge those ideas that there is this kind of market dominance, or that it is inevitable that these are the players.

[00:49:22]

And then I would be very skeptical of having the government detail exactly what you can and cannot take down. Because everything is subjective, it is so hard to find an objective "this is wrong, this is bad." You're always going to have someone subjective making these decisions, and I would rather have those subjective decisions be made in the companies than in the government. Especially when the government is largely controlled by liberals who are calling for increased censorship of speech.

[00:49:57]

That really scares me.

[00:49:59]

Yeah, but I kind of feel like we're talking past each other, because I also do not want regulators in the administration, any administration, saying this was right, this was wrong, et cetera, et cetera. Because that's just somebody else giving a subjective opinion. That's not the goal here. The goal here is to maybe force into law that, a, you have to have better standards by which you operate.

[00:50:30]

Right. Because of consumer protection rights, you can't ask people to sign up for something based on this set of standards and then just run willy-nilly with these standards, which are so vague and so impossible to predict, really, because it's impossible to predict what will be taken down, what will be fact-checked, based on those standards.

[00:50:51]

So by simple consumer protection measures, you could argue, you know, the best way to do this is to have clearer standards in place by which we can all operate, that allow for no subjectivity.

[00:51:04]

Right. Again, none of this would be happening if these companies would just say: look, this is what's right and this is what's wrong. You can't say these words. You can't post pictures of poop. You can't do this. Exactly right. And then you wouldn't have all this space for this nonsense. Yeah, yeah, that makes a lot of sense, and I see where you're coming from with that. I think that the issue that we get into is, you know, what are the words that you want to ban?

[00:51:36]

You know, I think this kind of gets back into, like, who can decide.

[00:51:40]

Well, they can decide. Twitter can decide what words you can't say. If they want to say you can't say the word Republican, all right, well, at least that's clear, you know. But the problem is that they don't do that. And again, this is why I tend to agree with Brendan Carr's analysis on this. And he's a lawyer on this. So, you know, that's why we had that conversation.

[00:52:01]

It was a pretty persuasive argument he makes that maybe that's the better avenue, because now it's about, you know, what you're really signing up for here, the standards, and government really has no control beyond that. Because, yeah, as long as the standards are clear, you know, with litigation you could still sue them and say: you are not operating on clear standards.

[00:52:22]

You have to clarify your standards. So I think it's the best of both worlds because it allows, you know, the more libertarian stance to say, well, you can still be that liberal platform, but now you're going to have to explicitly say that you're a liberal platform.

[00:52:35]

And I want to be clear that so far what we've discussed has been pretty much entirely government action and those kinds of things. I will agree with you in the sense that these companies can do better, and I wish that they would. I think that transparency is a big portion of that. And you mentioned towards the beginning of our discussion that we probably wouldn't even be having a lot of these issues, we wouldn't have these discussions to begin with, if Twitter and Facebook provided clear and transparent terms of service that very articulately said: here's what's allowed.

[00:53:08]

Here's what's not allowed. Here's, you know, these specific phrases or terms or types of images; these things are banned, these things are allowed. I think that would be a good thing for these companies to do, because I think it would be good for their business. I think that it would endear them to their users. So from a purely business perspective, wanting to provide the best experience to your users, I think that you should provide transparency.

[00:53:36]

I also don't always agree with, you know, individual moderation decisions, and it would be really nice. Sometimes I see ones where I'm like: dang, why is that up, and this other thing was taken down? Yeah. And it's very inconsistent. Exactly. And I think that also has a lot to do with just the mechanism by which they do takedowns, which is a notice-and-takedown regime. You know, they have millions or billions of pieces of content that get posted on their websites on a regular basis.

[00:54:03]

And the way that they decide what to even scrutinize, let alone take down, is usually based on user reports. So a user says: I don't like that. They flag it, and then that gets put into either an algorithm or into a manual curation system. And as I mentioned earlier, I think it's obvious to everyone that the employee bases as well as the user bases for these companies skew liberal. So of course you're going to end up with more conservative content taken down if the entire regime is: OK, we've got a bunch of liberals on our platform and they're going to flag what they don't like.
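To make that report-driven triage concrete, here is a minimal sketch in Python of the kind of pipeline he's describing. Everything in it, the thresholds, the term list, the stand-in scoring function, is an illustrative assumption rather than any platform's actual system.

```python
# A minimal sketch of a flag-and-review pipeline: users report a post,
# reports accumulate, and flagged posts are routed to either an automated
# classifier or a human review queue. Every name, threshold, and rule here
# is an illustrative assumption, not any real platform's system.
from dataclasses import dataclass

AUTO_REVIEW_THRESHOLD = 3   # user reports before the post is scored at all
REMOVE_SCORE = 0.9          # clear-cut violations are taken down automatically
HUMAN_REVIEW_SCORE = 0.5    # ambiguous scores go to a human moderator

@dataclass
class Post:
    post_id: str
    text: str
    reports: int = 0
    status: str = "visible"  # visible | pending_human_review | removed

def classifier_score(post: Post) -> float:
    """Stand-in for an ML model: returns a probability that the post violates policy."""
    banned_terms = {"spam", "scam"}  # hypothetical policy term list
    hits = sum(term in post.text.lower() for term in banned_terms)
    return min(1.0, hits / len(banned_terms))

def report(post: Post) -> None:
    """Called each time a user flags a post; triages once the threshold is reached."""
    post.reports += 1
    if post.reports < AUTO_REVIEW_THRESHOLD or post.status != "visible":
        return
    score = classifier_score(post)
    if score >= REMOVE_SCORE:
        post.status = "removed"               # automated takedown
    elif score >= HUMAN_REVIEW_SCORE:
        post.status = "pending_human_review"  # queued for manual curation
    # below that, the post stays up despite the reports

if __name__ == "__main__":
    p = Post("1", "an obvious spam and scam offer")
    for _ in range(3):
        report(p)
    print(p.status)  # -> removed
```

Note what the sketch makes visible about the point that follows: whatever sits inside the classifier or the human queue, the pipeline only ever examines what users chose to flag, so the composition of the user base shapes outcomes before any written policy is applied.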

[00:54:34]

I've also heard anecdotal evidence that liberals are more likely to flag content as inappropriate, whereas conservatives are more likely to say: I don't agree with it, but I'm not going to flag it. And just based on how the platforms work, that's going to result in more censorship of conservatives. So I wish, from a free-market perspective, that Twitter and Facebook would voluntarily adopt more transparency metrics, because I think that is how they will stay relevant going into the future.

[00:55:06]

And that is one of the ways that they won't get overtaken by a competitor, because I could see that as a point of competition in the market. If a social media company arises that says, we have very clear and transparent criteria for what we will and will not take down, that is a competitive advantage for that company, with users who will say: thank you, this is exactly what we've been waiting for this whole time. No more arbitrary takedowns, none of this kind of stuff.

[00:55:34]

That's exactly what we want. So that gets back to what I was saying as far as competition and new entrants challenging them. Or it might even be that a new entrant challenges them and then forces Facebook and Twitter to adopt more transparent rules, which I think would be a very good thing. My skepticism comes in when you say the federal government should mandate that they have transparent rules, because then I have lots of questions.

[00:56:04]

What does transparency mean? How vague can I make them? And I, as a corporation, would say: OK, I'm going to try to make these as vague as possible if I'm being held to them legally, because I'm going to want to be able to wiggle out of things later on down the road. So I don't think that a government mandate would result in more clarity or more transparency; I think that it would result in less. You know, I look at things like GDPR in Europe, I'm sure you're familiar with that.

[00:56:34]

The data regulation that basically, you know, we are all now dealing with, because we get a pop-up on every website that says: do you accept cookies? I don't think that has made our information any more secure. It just means that now the companies have us check a box every time we go to a website so that they can get out of liability. So I think that you would end up in a similar situation, where they would do something boilerplate in order to meet the technical requirements of the law that Congress passes.

[00:57:04]

But I think that it would get further from the spirit of what you're going for, which is true transparency. And I think we get that through competition in the market. I hope so. I hope so. I mean, I just have to agree to disagree on that, but it has been a very good and thoughtful and fact-oriented conversation, because there's just so much confusion around this. I mean, if you ask a typical American right now, it really is: just get rid of Section 230 and everything's fine.

[00:57:34]

The reality is far more complicated than that. But we certainly have an interest in improving this. And to me, it certainly is about transparency and clarity of terms of use and standards by which they operate. And, you know, I'm not as hopeful as you are that competition will solve that. It just doesn't seem to be working. It seems to be getting worse. And even with Parler and everybody moving to Parler, I don't think Jack Dorsey cares.

[00:58:08]

I just don't. And if he cares, his board members and his staff certainly don't care. And, you know, again, I don't know what's inside the mind of Jack Dorsey. Maybe it's pressure from his employees that forces them into this. I don't know. I don't really care. I just know that it's not working, and we've got to be very thoughtful about how we get to the objective that we want, which is protecting the First Amendment.

[00:58:40]

And I don't think we ever foresaw that certain politically oriented companies would have so much control over the broader narrative that Americans absorb and that they listen to. You know, we were always worried about the government having too much control over what people heard and what they listened to. But there are entities that are arguably far more powerful than the government in this respect, and they're not acting in good faith. And so it's a difficult problem, as this conversation has shown.

[00:59:16]

Absolutely.

[00:59:18]

Kanyon, anything else to add?

[00:59:21]

Yeah, there's just one thing I was going to mention. You were talking about the New York Post story, and I think that's a really good example of, you know, a mistake on the part of the platforms; I think that Twitter probably should not have taken down that content in the way that it did. But it shows the difficulty of moderating. And content moderation, I would like to emphasize, is very difficult, especially when you have people in Congress on opposite sides of the aisle telling you to do the exact opposite thing with a specific instance of content, where one member is telling you, take it down, and another is telling you, leave it up.

[01:00:01]

I think that they have done the best that they can in a very difficult situation, and they make mistakes. That's absolutely the case. But I would argue that the New York Post story actually illustrates the benefit of social media, because they throttled it, they, you know, limited its exposure, but I still heard about it. You still heard about it. It's still on Twitter today. I can go on Twitter right now and find a thousand copies of the New York Post story.

[01:00:33]

In fact, sometimes when a story gets censored, it actually causes it to gain steam and get more attention. I would argue that New York Post story might have gotten more attention because it was censored by Twitter than if it had not been censored at all. It could have just quietly gone into the night. But censoring it really drew attention to it. And I would challenge people who think that, you know, reforming Section 230 or eliminating Section 230 will improve the situation.

[01:01:00]

I would challenge them to look at the mainstream media. If the mainstream media had decided that that New York Post story was obtained illegitimately, you know, in this instance via hacked materials, do you think that we would be able to go onto the CNN website and find information about the New York Post story? No. They would have scrubbed it. They would have censored it. They would have made sure that there was no mention of it anywhere. So if anything, I think that is an illustration of consumer pressure against Twitter causing them to actually host speech that they had formerly censored.

[01:01:34]

And that's a power that we only have in social media and not in any other form of media. So please, let's not make social media more like mainstream media, because I think there's a lot that we would lose.

[01:01:45]

Yeah, yeah. Definitely a fair argument. All right, Kanyon, thanks so much for being on. Thanks for the thoughtful discussion on this.

[01:01:52]

Really appreciate it, Congressman. Likewise, I really appreciate your thoughtful questions. You know, so often we can get into engaging in rancor and, you know, talking past each other. But I think that you and I have really engaged in a thoughtful conversation. I appreciate that.

[01:02:05]

Yeah, it was great and I think educational as well. All right. Appreciate you being on.