[00:00:00]

Then there was also stuff that Eminem said was in the song that he admitted wasn't true. So, I mean, there was something about, you know, Eminem getting beaten up by someone and the principal in the bathroom, and Eminem's mother beating him over the head with a remote control until his brain fell out of his skull. And Eminem said, well, of course this didn't happen. If my mother beat me until my brain fell out, I wouldn't be here. It's a song. Freedom of speech, fundamental rights, freedom of conscience, academic freedom, freedom of the press, and...

[00:00:35]

The right to listen. You're listening to So to Speak, the free speech podcast brought to you by FIRE, the Foundation for Individual Rights and Expression. Hey, folks, welcome back to So to Speak, the free speech podcast, where every other week, or mostly every other week, we take an uncensored look at the world of free expression through personal stories and candid conversations. Happy New Year. I'm really excited about today's show. Today we are speaking with Jeff Kosseff. He is an associate professor of cybersecurity law in the United States Naval Academy's Cyber Science Department. He's the author of four books, most recently, and the topic of today's conversation, Liar in a Crowded Theater. It's a topic that has been in the news for almost half a decade now with our discussions surrounding mis-, dis-, and malinformation, and some of the stuff that we've been seeing with artificial intelligence and how it can hallucinate and perhaps deliver, if we're talking about generative AI, false results, including false case citations, or citations to nonexistent cases. But Jeff has this kind of reputation as a guy who writes books at the time when the topic of those books is in the news; the books tend to get published just before their topics capture the news.

[00:01:55]

So I don't know if Jeff has any sort of, like, prescience or is in touch with some higher being that gives him a heads up. For example, that anonymous speech, the topic of a book that he published in 2022, was going to be a topic of conversation. We're going to talk a little bit about that at the end of the show. Or that section 230, which governs kind of Internet liability, was going to be a topic of conversation in the courts and elsewhere when he published that book in 2019. I should also mention: a book in 2019, a book in 2022, and a book in 2023. This man is busy. Jeff Kosseff, welcome onto the show.

[00:02:29]

Thanks so much for having me.

[00:02:30]

So I have to ask: you're an associate professor of cybersecurity law at the United States Naval Academy's Cyber Science Department, as I mentioned earlier. How did you get this focus on the First Amendment?

[00:02:43]

Yeah. So I guess I first should say that everything I say is not on behalf of the US government, the Department of Defense, or the Naval Academy. I don't speak for them, and I think they're probably happy about that. So, right out of college, I was a journalist, and I was a journalist while I went to law school as well. And so I was always really interested in the First Amendment. I relied on it both to obtain access to information and also to not get sued by people I was writing about. So it's always been near and dear to my heart. When I practiced law out of law school, one of the things that I did was First Amendment law, doing pre-publication review for media organizations. And that was really the most fun part of my job. And I really got into a lot of the online First Amendment 230 stuff, because I would get a lot of complaints from subjects of coverage, not only about the coverage itself from my clients, but also about the user comments. This was often for TV stations that had websites, and I was able to just write them these letters citing section 230, in one page, and it would magically make it go away.

[00:04:02]

I'd say, hey, we can take it down or keep it up. That's up to us. And I was just so fascinated by the fact that there was this law that was kind of this magic law for websites that said, we're not going to be responsible for user content. So when I went into academia, I was really interested, and there hadn't been any history written of it. How did we get this?

[00:04:26]

Well, we should mention, I guess, for our listeners who maybe aren't familiar with the law, what it exactly is. I know I want to talk about it later in the conversation, but since we've already sort of dived into it, it's probably helpful context.

[00:04:36]

Yeah. So it was passed in 1996, and it says that if you're a website, social media platform, ISP, anything providing Internet access or a platform, that you are not responsible for the content that users and other third parties post. So if you were to go onto Facebook and defame me, I'd be able to sue you, and there would be the standard First Amendment protections, but section 230 would prevent me from suing Facebook. And I was just so interested in, we're really the only country with a protection this strong, and how did we get it? So I thought, I want to write a book. I used to be a journalist, so why don't I write a book about this. And I will say I started pitching the book in 2015, and it was tough to sell, even to academic presses. They were like, no one's ever heard of this. What is section 230? And then it ended up coming out, I guess, just five years ago next month. And that was just as it started to become a presidential campaign issue. So it just kind of went crazy from there.

[00:05:49]

Yeah. The title of the book, The Twenty-Six Words That Created the Internet, I think, is provocative. And was that your choice?

[00:05:57]

It was. I started off with the title The Twenty-Six Words That Shaped the Internet. And as I was writing the book, I thought, I think I could go a little stronger, because when you think about all of the platforms and how they came to be, the foundational first and second generation platforms like Facebook and Myspace never could have gotten off the ground. I mean, Facebook now could survive without section 230. But when Mark Zuckerberg was a Harvard dropout in 2003, 2004, he could not have created a business the way he did around user content if he was going to be potentially legally responsible for everything that people were posting. So I'll say that over the past five years, there have been a number of computer scientists who have tried to tell me that there were actually computer scientists who created the Internet. I give the caveats in the introduction, but it did at least get some attention to the question: what is this?

[00:07:07]

I mean, you could, I guess, say it created the modern Internet in its sort of connected and social nature, but then you're losing a punchy title.

[00:07:16]

Right, exactly.

[00:07:19]

So anyway, that's 230, and you got connected through kind of doing pre-publication review and working with media companies, it sounds like.

[00:07:26]

Yeah, the First Amendment issues always fascinated me as well. And, I mean, 230 and the First Amendment really have, obviously, a lot of overlap. So with the current book, the topic of when the government can ban false speech is something that always fascinated me as well. And it seemed like, really right around the 2020 election, we were starting to get a lot of proposals from people who, frankly, should know better about the government's role in policing misinformation. And it often stemmed from legitimate concerns. I mean, people who said, there are potential authoritarians who are trying to take away our democracy through misinformation, and we need to do something about it. I disagree with their strategies. I think that eroding free speech protections isn't the best way to combat authoritarianism. I think authoritarians like it when there's not free speech. I was starting to hear these proposals, both for the Internet and offline, from people who I frankly expected more from, saying, hey, let's send politicians to jail if they falsely claim to have won an election. It's like, wow, that's kind of scary stuff that we're starting to hear from people who are supposed to be reasonable.

[00:09:03]

Yeah. This isn't a new space, right? I mean, the First Amendment and society at large have had to deal with the effect or impact of lies throughout human history. Frankly, we have new words for them now, mis- and disinformation, for example. But it all comes down to the core ideas: Is a lie protected speech? And in what context might it not be? The title of your book, let's start there. Liar in a Crowded Theater derives from a line in an early 20th century Supreme Court opinion, Schenck v. United States. Can you talk about that opinion and the reason for choosing this title?

[00:09:50]

Well, I'll start with the reason I chose it, because that hadn't initially been the title of the book. I wanted to write about why the First Amendment protects a lot, not all, but a lot of false speech. And as I was writing it, I'm really big into primary documents and court filings and transcripts, because I feel like that's where the best color for the book comes from. It's not the court opinions, but it's what people are filing and saying in court. And for every government, every plaintiff, every regulator that's trying to penalize allegedly false speech, in almost every case, at some point they say, well, just as you can't yell fire in a crowded theater, you also cannot say whatever is at issue. And usually you can say what they're trying to prohibit. So I just thought, okay, I've got to get fire in a crowded theater in here somewhere. And as for the origin of it, there's this misconception that the Supreme Court, sometime in history, decided a case involving someone yelling fire in a crowded theater, and that never happened. I don't know if you can see it hanging up right behind me. There might be a little glare.

[00:11:02]

There's a little bit of a glare.

[00:11:03]

Yeah, I keep it there as a reminder. It was a pamphlet that was distributed by a Socialist Party official in Philadelphia in 1917, and it made a really bad legal argument. It basically said that the military draft violates the 13th Amendment. It's silly. And when you read the whole pamphlet, it's almost insane. But by modern standards, if you read it, you'd say, okay, it's not within a narrow category of First Amendment carve-outs. He could hand it out. But no one's really going to believe it. But at the time, the guy handing it out, Charles Schenck, was prosecuted for violating the Espionage Act because the government said, this poses a clear and present danger to our military efforts. We're in the ramp-up to World War I, and you're now saying that we can't draft people. And he gets convicted. And the Supreme Court unanimously affirms his conviction. One of his main challenges was that his prosecution violates the First Amendment. And Justice Oliver Wendell Holmes, writing for the court, writes, no, this is a clear and present danger. And, yes, the First Amendment protects some speech, but just as you can't falsely shout fire in a theater and cause a panic, you also cannot hand out these materials.

[00:12:38]

So Holmes actually got that from a different case that he was hearing right around the same time. It was an example that a prosecutor had used at trial to justify the prosecution. So, at least as it's most commonly believed, it wasn't even Holmes who came up with that. He basically got it from a prosecutor's argument. But that's a long way of saying that whenever anyone now tells me, you can't yell fire in a crowded theater, I say, okay, well, can I also criticize the military draft? Because that's essentially what you're talking about. You're saying you could go to prison if you say the military draft is unconstitutional.

[00:13:23]

Yeah, because by today's standard, that's crazy. And that's what Justice Holmes was analogizing, right? And it's just a false analogy. The reason I like the title is because you use liar in a crowded theater. And when some people use this analogy to justify censoring speech, usually not related to a fire in a crowded theater, they forget the falsely part. They forget that the person who's shouting fire in a crowded theater is doing so falsely; they're lying. Right. And if there is indeed a fire in a crowded theater, please do shout fire so that people can get out. Right. It's the falsely part, but it's also missing the latter part of Justice Holmes's quote, which is "and causing a panic." So, for example, you could falsely shout fire in a crowded theater. I could shout it right now in this office building. Fire, fire, fire. But nobody's going to actually believe that there's a fire if the context suggests that you're just being satirical or you don't actually mean it. But if it's likely to and intended to cause a panic surrounding a fire that does not exist, then maybe it is unprotected. So there's just a lot of context and import surrounding how that phrase is used or analogized that people just leave out.

[00:14:35]

So I love that your title uses the word liar because it gets to some of that nuance that is often left out when people use this phrase.

[00:14:44]

And I've gotten a lot of pushback about it, because what people say is, well, you could get in trouble for yelling fire in a crowded theater. And under certain circumstances, that's true. So if you're intentionally or recklessly yelling fire in a theater that's crowded and you have no reason to believe there is a fire and people trample each other, then there are false alarm statutes, just like you can't call in a bomb threat. But the point is that it's not really anything innovative to say that the First Amendment is not absolute. Other than Hugo Black, nobody is really going around making that argument, and he's been...

[00:15:28]

Dead for a while, and he wasn't always as absolutist as he claimed to be, for example, in the secondary education cases.

[00:15:37]

But, I mean, nobody's seriously saying that you could perjure yourself and not face consequences. But the point is that the First Amendment, as the Supreme Court has repeatedly said, is not this ad hoc balancing test where we look at whether the harms outweigh the benefits. I mean, that's Europe. That's a lot of other places. But when you say fire in a crowded theater, what you're really saying is the First Amendment doesn't mean anything; it only protects the speech that we want it to protect. That's really what the shorthand is, and that's not the case, fortunately, at least for now. I worry that perhaps if people start saying it enough, the courts will start to accept that, and that's a pretty dangerous place to be. But at least for now, we're not there.

[00:16:27]

What does the First Amendment say about lies? What are the key cases?

[00:16:35]

Well, until 2012, it was kind of messy. There were some defamation cases that had some fairly unhelpful dicta, which talked about there not being First Amendment value in false speech. But this was really in the context of defamation and whether certain types of plaintiffs had to face a much higher evidentiary standard.

[00:17:01]

Yeah, I see the "Heed Their Rising Voices" newspaper advertisement in your background.

[00:17:08]

So that was for public officials and public figures. The Supreme Court said that they have to show actual malice, which is clear and convincing evidence of actual knowledge of falsity or reckless disregard, which is a really high, speech-protective standard. But the Supreme Court in the 70s said that private figures do not have the same burden. And in that case, the Gertz case, they made comments about how false speech does not have First Amendment value. And so there was sort of some uncertainty about, does the First Amendment protect false speech? And we got our answer in 2012. And that was a case involving a man named Xavier Alvarez, and he was a local water commissioner in the Pomona, California area. And frankly, he liked to lie. He made up some pretty fantastical tales about his life. And at his first meeting, this was in 2007, they asked the new board members to introduce themselves, and he said, well, I was a Marine and I received the Congressional Medal of Honor. And that wasn't true. And the Congressional Medal of Honor is the highest military honor. There have only been about 3,500 people who ever received the Congressional Medal of Honor.

[00:18:33]

So it's kind of a stupid thing to lie about, because you could just go on the Internet and find that he did not receive it. Now, unfortunately for him, the FBI got a recording of this meeting, and kind of doubly unfortunately for him, just two years earlier, Congress had passed something called the Stolen Valor Act, which says that if you lie about receiving the Congressional Medal of Honor, you can face up to a year in prison, regardless of your intent, regardless of whether you profit from the lie. And so he challenges the Stolen Valor Act when he's getting prosecuted, all the way up to the Supreme Court. And it's a fractured opinion, but a four-justice plurality and a two-justice concurrence find that the Stolen Valor Act, which basically imposes strict liability for false speech, is unconstitutional. So in that case, we have the Supreme Court very clearly stating that just because speech is false, it is not automatically exempt from First Amendment protections.

[00:19:35]

What are the other categories of speech that would be unprotected and that involve lies? We've already talked a little bit about defamation. I do want to ask you about the Dominion lawsuit here in a moment. But you mentioned perjury earlier; I suspect commercial fraud is another category.

[00:19:52]

Yeah, commercial fraud. Also lying to a federal agent. So if the FBI visits you, and we saw this in some of the prosecutions surrounding various FBI investigations into Russia and so forth, some of the charges were under a section of the criminal code, Section 1001, which says you can't lie to a federal agent. And so people who otherwise might not have been prosecuted did get prosecuted because they allegedly lied to FBI agents. That's also an area where you could face liability. There are also certain torts like false light, which is similar to defamation, where you could face liability. And we talked about perjury as well. So it's not that all false speech is protected; it's just that you need more than just the speech being false itself to be able to either put someone in prison or obtain damages from them.

[00:21:06]

Yeah, and you can look at the falsely shouting fire in a crowded theater analogy as perhaps falling under the incitement standard that was later articulated in Brandenburg, which means that the lawless action needed to be imminent and likely. And in that case, you could imagine some circumstances where it would be. We talked about defamation. I want to ask about the Eminem case. So, Eminem, the rapper, public figure, right? He writes a song for his first album, The Slim Shady LP, about being beaten up by a bully, and that bully, presumably a private figure, later sued him for what he said the bully did to him. And the case was dismissed, because while Eminem said things that were not true in his song lyrics, the court found that they were sufficiently true, or true enough, that Eminem didn't face liability. And this speaks to the substantial truth doctrine. I was wondering if you could talk a little bit about this case and the difficulty in determining what is true and what isn't true, and how the courts and our society think about those questions.

[00:22:19]

Yeah. So when I was practicing media law, when we would unfortunately get to a point where we actually would have a lawsuit, rather than sort of convincing people not to sue, one of the things that we relied on was something called the substantial truth doctrine, which basically says that you don't have a defamation or false light claim, even if not every word is correct, as long as what's known as the gist or the sting of the statement overall is true. And I wanted to illustrate it. And I looked through, really, dozens of cases, and I just happened upon this case involving Eminem. And I'm a fan of Eminem. And what really drew my attention was that the judge wrote her opinion partly in rap verse, which is kind of questionable. So I thought, I want to know more about it. Somehow, even though this was a case from 2003, I was able to get all of the court records, including all the depositions, because this case went to summary judgment. So I was able to get Eminem's deposition, which is very colorful. I quote from it as much as I could. I wanted to fill the whole book with Eminem's deposition, but I had to limit myself.

[00:23:37]

And there was also the deposition of the person who alleges that he defamed him. And what was so fascinating was there was other evidence, like a lawsuit that Eminem's mother filed against the school district while Eminem was in middle school, that named this bully and, you know, said he was hurt while being bullied by this person. And what was so fascinating was looking at the recollections of Eminem and his bully. And there was a lot of stuff that they basically agreed on, like that they went to school together, that the larger group that this bully hung out with did sort of say things to him. There were some sorts of altercations, but the bully maintained that he didn't actually do anything himself. And then there was also stuff that Eminem said was in the song that he admitted wasn't true. So, I mean, there was something about Eminem getting beaten up by someone and the principal in the bathroom, and Eminem's mother beating him over the head with a remote control until his brain fell out of his skull. And Eminem said, well, of course this didn't happen. If my mother beat me until my brain fell out, I wouldn't be here.

[00:24:58]

It's a song, and nobody takes this seriously. And the judge basically agreed with him and said, yes, while all of this didn't happen, overall there was some sort of animosity, and that was what was being described. And she ended up getting affirmed by the state appellate court. So what I find really fascinating about that is, I mean, this was something that happened in the early '80s, and it was 2003 that they were litigating this. And one of the things about truth and falsity is that, especially when you have this amount of time, you're never going to have perfect documentation of what is the precise truth. Now, maybe if you had video recordings and things, but in the typical case like this, you're going to have different people's memories. And so I found it really fascinating reading through the depositions and finding both of them sort of testing what their vision of the truth was.

[00:26:04]

Yeah. And you often find in psychology that the longer time goes by, the more untrustworthy our memories become. You'll find situations where someone recalls an incident that happened to them right after the fact, and then ten years later, they'll recall the same incident and their story changes. And that's not because they're being untruthful, necessarily. It's just because the human memory is fallible and forgets things or reinterprets them. So there's quite a good body of research surrounding that. And here we're talking about a song that was published, what, like 15 years after the alleged incident is supposed to have occurred? And I think you write in your book that the case underscores the reality that a consensus on one absolute truth is often unattainable, and it could be unrealistic and chilling to expect precise accuracy. And I know this as a podcast host. I think any newscaster knows this. When you're talking in real time with people, often you'll get the gist right. But sometimes the facts might be a little bit murky. But it's hard to have a conversation otherwise. And if you have integrity, of course you hope to correct the record or correct the facts, but it's really hard to do in kind of a real-time setting.

[00:27:21]

And I know Eminem wrote the song deliberately, and so he wasn't doing it in real time. But I'm just speaking to the broader kind of challenge in getting things 100% accurate when you're talking in real time. That brings me, I guess, to the Dominion lawsuit. I'd love to hear your perspective on that case. Again, this is the case involving the 2020 election, where Fox News had a number of hosts and guests on air who accused Dominion Voting Systems of fraudulently counting votes, rigging the elections, things of that nature. At least that was the allegation. Dominion sued Fox News over this. It went to court, but just before trial, they settled for an amount of money shocking, I think, to most pundits and spectators who watch this space. And longtime listeners of this show will know that we actually had the two parties in the lawsuit, the firm representing Fox News and the firm representing Dominion, on the podcast for kind of a symposium or discussion after the settlement was reached, which was really interesting. But I'd love to hear kind of your thoughts on that case.

[00:28:39]

Yeah. So we won't know for sure how it would have resolved. I think that if there was any defamation case where the plaintiff could have won, this is really toward the top of my list, because, as a media lawyer, I read through all of the texts and emails from the different Fox hosts and producers that were part of the evidence in that case. And I mean, that's the sort of thing that keeps you up at night. It's almost like the subject line should have been "evidence of actual malice," because it was repeatedly, like, yes, this is totally crazy. And then it continues to go on the air. Dominion did everything right in the ramp-up to litigation, in that it didn't sue right away. What it did was it kept putting out responses and saying, this is false, and this is why, and this is the evidence. And they kept doing it. And while that's helpful counterspeech to sort of counteract the false speech, it's also really good evidence for a plaintiff. And I think that the fact that the judge before trial found, as a matter of law, that the statements were false, that's devastating.

[00:30:06]

And that's not something that typically happens: a judge not leaving the factual determination of truth or falsity to the jury. That made it so much more difficult for Fox if it had gone to trial.

[00:30:23]

Now, I will say the lead attorney in the case, when we had that previous podcast, which was hosted as a First Amendment Salon, said it was really hard to make the arguments we needed to make because the court had essentially eliminated our First Amendment arguments. But he was talking about how he did some jury research. And their core argument was essentially that the president of the United States was saying that the election was stolen, that a lot of his spokespeople were saying the election was stolen. The president and the spokespeople are newsworthy, and it's important to hear what they have to say. And Fox News, even if its hosts thought that the argument was BS, had a duty to report this news story that was happening. And as soon as the court cases that were looking into these allegations, the lawsuits that were being filed by Trump and his team, were resolved, finding that there was no sort of funny business happening in some of these districts, they stopped reporting it. And so the argument was essentially that this was a news story; it was going through the courts.

[00:31:35]

The president of the United States was making these arguments. But as soon as they found out how the courts ruled in those cases, they stopped. And the lead attorney for Fox said that, by all accounts in their research, this was going to be a very compelling argument for the jury, but it never got there.

[00:31:53]

So there's something in the law that I write about in the book called the fair report privilege, which basically says that if you're a media outlet, or just anyone on social media, any speaker, if you fairly and accurately report a public proceeding, so that could be a court filing, it could be a Senate hearing, if it's something in an official government proceeding or filing and you report on it, you cannot be held liable for defamation, even if what you're reporting is inaccurate. And so that's pretty well established. Different states have different levels of that protection. But as a former journalist and a media lawyer, that was actually what I relied on most: if you get it from the documents, that's as close to being bulletproof as anything. Now, there is an extension of that that I write about in the book called the neutral report privilege, which was developed in the late 1970s by a federal court in New York. It extended that and said this privilege isn't limited just to government documents: if any public figure makes a statement that's newsworthy about something in the public interest, the media cannot be held liable for reporting it.

[00:33:20]

Is this a statutory protection or one that's developed through the law?

[00:33:23]

It's basically a First Amendment common law privilege that the Second Circuit recognized. This was a case involving the New York Times, where the New York Times reported the Audubon Society's claims that certain scientists were being paid off, and they weren't. And what the New York Times basically said is, when a public group like the Audubon Society makes this statement, we have to write about it. And the Second Circuit agreed and developed this privilege. There have been some other courts that have adopted it, but a lot of other courts have rejected it. And the Delaware judge in the Fox case also rejected the privilege. And I think, had he allowed the neutral report privilege, there still would have been some difficulties applying it. But I think there's a decent chance that Fox could have used that successfully, because you don't get anything much more in the public interest than what the president of the United States is saying. It's not a bad argument, you know: if you have the leader of the executive branch making claims, how do you not report on that? And I think there's some merit to that.

[00:34:48]

Yeah. The settlement was for $787.5 million, a huge settlement. I live in Washington, DC, and just across the river in Old Town Alexandria, Clare Locke, which was one of the firms representing Dominion in this case, has a huge compound on primo real estate right on the river. So I imagine some of that money is going to pay their lease or their mortgage or whatever they've got. But they're the kind of premier defamation firm, and they're in our area. There is another lawsuit, right, against Fox News from another voting systems company that's still kind of going through the courts.

[00:35:22]

Yeah. There are quite a few lawsuits from Smartmatic and Dominion, not just against Fox, but against other media outlets. And they're in different stages of litigation. But with Clare Locke, I've seen them talk about their strategy in both pre-litigation and during litigation, and they're quite effective. I think part of it is just that the facts will always be better in some cases than others. And this is a really plaintiff-friendly set of facts that you have, but they helped shape that in terms of making sure there was that factual record there. Now, even if Dominion and Smartmatic continue to succeed, I don't necessarily think that will lead to the sort of rebirth of defamation claims as the solution to falsehoods, because this is a very unique situation where you have two plaintiffs that were uniquely harmed by this. A lot of the misinformation involves sort of these general, society-wide harms, and you're not going to be able to use defamation very easily to litigate that. And you also have a situation where there's a lot of money at stake, and so it's much easier to finance the litigation and there's a bigger potential payoff for it.

[00:37:03]

If it's the standard individual going after some troll on social media, there's often not sort of the financial model for that to work.

[00:37:14]

Yeah. Clare Locke has been in the news recently as well, because they were retained by the Harvard Corporation, or Harvard, to write a kind of angry lawyer letter threatening a lawsuit against the New York Post for the New York Post's kind of fishing around about the plagiarism allegations against Harvard President Claudine Gay. And Harvard and Clare Locke were criticized for that as being a threat to a free press and free speech, and just kind of the reporting due diligence, I guess.

[00:37:46]

Yeah, I disagree with a lot of that, and obviously I don't know everything about what led up to it, but typically you're not going to succeed in killing the story with a lawyer letter, and you're often going to Streisand-effect yourself. While the story obviously looked bad from a number of fronts, it looked much worse because you had the most prominent college in the United States threatening a newspaper with litigation for reporting about something that turned out to be right.

[00:38:30]

Yeah. And the continued kind of drip, drip of allegations of plagiarism and the examples that have been put forth in the public sphere make it really hard to make that defamation argument.

[00:38:42]

Yeah.

[00:38:43]

But I want to pivot now to your other book, which is The United States of Anonymous, and talk a little bit about anonymous speech, which has been in the news recently. Nikki Haley suggested during a Fox News interview that anonymous speech on social media is a national security threat and said that if she became president, every person on social media should be verified by their name. It's also been in the news because there are a lot of states passing or considering so-called social media or Internet age verification bills for adult websites or social media websites. I think Utah has one of those, which would, depending on how you interpret the law and the tech that you would implement in order to comply with it, sort of require people to reveal their identity before visiting these sites. So can you talk a little bit about kind of the evolution of anonymous speech, maybe addressing Nikki Haley and some of the age verification bills, since your book was published in 2022?

[00:39:45]

So I wrote the book actually sort of as a follow-up to the 230 book, because one of the sort of follow-up proposals that people would have was to say, okay, well, if you're not going to sue the platform, then you've got to be able to sue the person who's posting on the platform. And to be able to do that, you have to make sure that you know who they are. And so for a while there have been these sort of half-baked proposals to say, well, let's make everyone use their real name on social media. And I think that it's very short-sighted, because, while I have the luxury of being able to use my real name on social media and post what I want, there are a lot of people who, for job reasons or family reasons, can't do that. And so I was concerned about that. So I started looking into the cases about anonymity. And, I mean, it really goes back to the founding of our country. Most of the stuff that was being published about our form of government and criticizing the king was, for very good reason, not published with people's real names.

[00:41:03]

Thomas Paine wrote Common Sense, which was attributed only to "an Englishman." After we had independence and we had trouble getting states to ratify the Constitution, you had Hamilton, Madison, and Jay write the Federalist Papers as Publius, and that wasn't really as much that they feared for their safety, but they knew that their argument would have more weight if their names weren't attached.

[00:41:26]

Yeah. It would make it harder for people to lob ad hominem attacks, because it would force folks to stay focused on the arguments, because there isn't a person associated with those arguments to attack. I think that's part of the same reason why it wasn't really known who was part of the drafting committee of the Declaration of Independence until, like, a decade later. Right.

[00:41:46]

Yeah.

[00:41:47]

Forcing people to focus on the words in the Declaration rather than on who Thomas Jefferson is and any of his perhaps foibles.

[00:41:57]

Yeah, absolutely. And so the Supreme Court eventually recognized this history in a series of cases involving people who were getting in trouble for distributing brochures and pamphlets without having their names printed on them, because there were various state laws that said if you hand out pamphlets, you have to print the name of the author. And the Supreme Court really rooted this First Amendment protection in our tradition of anonymous speech. And that carried on to cases involving people who go door to door canvassing. There were some localities that said you have to wear a name badge, and the court said, no, you can't require that. And then finally, in the late 90s, really, as there were bulletin boards online, primarily Yahoo Finance, which, back in the day, had bulletin boards for every publicly traded company, and anyone could go on with a pseudonym and post whatever they wanted. And it was mainly intended for investors, but what you had were a lot of disgruntled employees who otherwise would never have had a voice going on and criticizing their management, their CEOs, the boards. And this really got under the skin of these executives, because, you think about it...

[00:43:18]

And before then, the only people who really could publicly criticize the executives were journalists. And if the journalists wrote something that upset the executives, they would complain to their editors, but they knew who to deal with. But now they suddenly had their own employees, and they didn't even know who it was, going on the Internet and having the gall to criticize them. And they hated it. So what they started doing was filing these really bogus defamation lawsuits, or trade secret lawsuits, and none of this was defamation or trade secrets. But the point wasn't to actually win. What they wanted to do was just file the lawsuit against a series of John Does, and then get discovery against Yahoo and then the ISPs to track down who was posting this, so they could identify them, shame them, and fire them. And then they'd drop the lawsuit. And this was happening a ton. And finally you had groups like Public Citizen and the Electronic Frontier Foundation get involved, and they started making First Amendment challenges to these subpoenas on behalf of these anonymous posters. And the courts agreed with them and said that if you're going to unmask someone on the Internet, you have to make a very strong showing, because we have this fundamental First Amendment value.

[00:44:40]

And so that carries on to today in the debate. What we saw with Nikki Haley: she actually walked back her comments a little bit. She first said that people have to give their name when they're using the Internet. And then the next day she said she was really just talking about people in other countries.

[00:44:58]

She actually got a lot of blowback, and I was pretty surprised by how many Americans prize the right to speak anonymously. We were following the conversation on social media here at FIRE, and we were just kind of taken aback by how vicious, or loud, I guess I should say, is probably the better word, the response to Nikki Haley's kind of offhand comment was.

[00:45:22]

Yeah, well, I mean, on a lot of the platforms like Twitter and Reddit, anonymity is really important for a lot of people. Now, there are some platforms, like Facebook, that require real names, and they actually will kick people off if it's found that they're not using their real names. But that's Facebook's decision. I don't think it's a good decision. But Facebook is free to have that policy, and Twitter is free to have the policy where they don't require real names.

[00:45:48]

Yeah, I mean, there are some business decisions for why you might want to eliminate anonymous speech, like kind of eliminating the bot problem, which a number of these platforms suffer from. But there are good kind of free speech arguments for why you should allow for anonymous speakers.

[00:46:06]

Yeah, absolutely. But in a lot of other countries, they have real name registration requirements, and they can do that because they don't have the First Amendment. And again, just like with misinformation, there are a lot of reasonable people who think that eliminating anonymity on the Internet will fix all of its problems. What I try to tell them is that it will really shut down the Internet as a method of communication for a whole lot of people.

[00:46:39]

What do you see as kind of the impact of these social media age verification laws or adult entertainment website age verification laws? Do you think the courts are going to buy these arguments?

[00:46:52]

So I think that what we're seeing so far is a lot of failures, for a lot of different reasons, and not necessarily just on the anonymous speech front; also just on sort of the right to receive information, the right to communicate. I think that the social media age verification laws are going to have the biggest problem, because, again, what you're starting to do is say, unless you provide identification, you're not going to be able to use social media. Now, for the adult entertainment sites, I think there are similar arguments, although if the laws are limited to adult entertainment, perhaps some courts might look at them differently. But a lot of the great First Amendment precedents have come from cases involving pornography and the government's attempts to regulate it. Reno v. ACLU, which set the high standard for First Amendment protection on the Internet back in 1997, was about preventing the transmission of indecent material on the Internet. So this isn't a new issue, and I think that the test will be to see if courts start to rethink their First Amendment protections.

[00:48:12]

Well, one of the precedents that's maybe worth considering in this context is, I believe, Brown v. Entertainment Merchants Association, which was the case about violent video games from 2010 or 2011. And I believe Scalia wrote the opinion, which ultimately struck down a California law that restricted the sale of violent video games to minors. And I think there was some dicta, or maybe it was like a footnote, that said if California's law had been more narrowly tailored to younger minors, like people 13 or younger, the law might have survived, but it covered any minor below the age of 18. Is there any precedent right now kind of distinguishing between younger minors and older minors when it comes to the level of scrutiny, I guess, courts give to these sorts of First Amendment questions surrounding privacy and age verification?

[00:49:14]

Other than the dicta from that case, I'm not aware of any. There is for privacy, related to sort of the collection of information. So we have COPPA, the Children's Online Privacy Protection Act, which applies to the collection of information from children under 13, and that effectively shuts a number of websites off for them, because there are a lot of different protections that you have to go through if you're going to be collecting information from them. But I think the difference is this goes beyond collection, and this is about the ability to speak on the Internet.

[00:49:53]

I want to shift now to your 230 book, which we've already talked about a little bit, but I want to get your perspective on how the conversation there is evolving. Right. You have Justice Clarence Thomas saying that 230 should be looked at, but the court kind of punted on it in the Twitter cases last term, at least the 230 question. And you have representatives in both parties, for kind of different reasons and contradictory reasons, arguing to gut 230. But I don't know that anyone actually has an actual interest in gutting 230; maybe this is just me kind of reading intentions. Even the Supreme Court, I think, understands how it would just kind of turn the Internet upside down if it got involved in kind of narrowing 230 or striking down its protections as potentially unconstitutional. So I'm wondering just how you're thinking about that dynamic right now, and whether you think there is actually a viable movement to reform section 230.

[00:50:51]

When the book came out five years ago, it was sort of right after that that we started to get proposals constantly. From the Democrats, it tended to be, we don't like this type of content, so we should get rid of section 230 protection for this type of content. And for the Republicans, it tended to be, let's condition section 230 protections on being neutral, whatever neutral would mean, because we think there is biased moderation, and let's end that. And I think that what the Democrats might be realizing is that a lot of what they've been proposing would be unconstitutional. And I think that the Republicans might be realizing that eroding section 230 could actually lead to fewer venues for controversial speech. Because if you don't have the protection from liability and you're suddenly getting a lot of lawsuits, or a lot of speech that might lead to defamation lawsuits, that's going to be more costly, and you might just shut down those avenues. This is coming from someone who wrote a book saying that section 230 created the Internet. I think that a lot of people are realizing that no matter how important section 230 is, changing it might not be the solution to everything.

[00:52:14]

And I think on the judicial side, when you listen to the argument in the Gonzalez v. Google case, which was the section 230 part of that case, you kind of heard, in real time, the justices realizing they really shouldn't have granted cert in that case. Because the argument that the plaintiffs were making is that if you algorithmically amplify this content, then somehow section 230 should not apply. And the justices were like, okay, well, what does that mean? They were recognizing that the Internet runs on algorithms. It's not sort of this magical unicorn dust. I mean, algorithms are how everything is presented. And the plaintiffs didn't really have a satisfactory answer. They were talking about using thumbnails and stuff that I wasn't quite following. And Justice Kagan eventually said, you know, we're not the nine greatest experts on the Internet. Isn't this something that Congress should be dealing with? Why are you coming to us? So after that, it wasn't really much of a surprise when the court entirely punted on 230.

[00:53:28]

Yeah. And I mentioned earlier the Twitter cases. What I was really referring to was the Gonzalez case, which involved Google, and the Taamneh case, which involved Twitter. They kind of punted. But didn't we just get a case out of the California Supreme Court, I believe, this week involving 230? A state court?

[00:53:51]

Yeah, it was a California state trial court. This is actually where I think the biggest section 230 developments are happening. So this was a case against Snapchat, which was brought by the families of teenagers who had died of fentanyl overdoses, and they alleged that their children had purchased these drugs that were laced with fentanyl on Snapchat. Typically, section 230 would apply if your claim is, we're suing over the content of the messages between the children and the drug dealers. But the lawyers were quite clever, and they made an argument that's been made before and is increasingly accepted, which is: we're not suing over the speech; we're suing over the defective product design. So just like if you buy a ladder with a rung that's loose, you could sue for product liability, what they're saying is that certain features of Snapchat, that it makes it difficult for parents to monitor, that it makes it easy for minors to get access, the deleting function, the sort of ephemeral nature of the messages, that all of that makes it a defective product. And the court, for most of the claims, accepted that section 230 does not protect against those sorts of claims, because it's not about the speech, it's not about user speech, it's about the design.

[00:55:28]

And there is some precedent. So the 9th Circuit, about two years ago, in another case against Snapchat, and this was another tragic case involving teenagers. I had no idea that this even existed on Snapchat, but there was a feature called Speed Filter, which seems like an incredibly dumb thing for them to ever have done, which basically, when you're using Snapchat, taking a picture, it puts over the picture the speed that you're going. And not surprisingly, teenagers started using this to play a game where they see how fast they could drive while they're snapchatting. And there were a number of accidents, including some deaths. And so the families of two teenagers who died while doing this sued Snapchat. And the 9th Circuit said 230 doesn't protect against this, because this is a dangerous product. This wasn't about the actual number that was printed that the users provided. This was about something that allegedly causes these minors to engage in very dangerous behavior. Now, they still have to go and litigate the merits, but the argument is that 230 itself does not protect against it.

[00:56:44]

Well, it almost seems like a too-cute-by-half argument, the defective product design argument. Right? Couldn't you just replace the Internet or social media with the phone or email? That fentanyl conversation that led to the overdoses could have easily happened over the phone, too. Are the phone companies providing defective services because they don't give parents more tools to monitor phone lines? It seems like they're asking these sorts of companies to do more than they could possibly do. Teenagers are always going to find ways to circumvent parental controls through any mode of communication, right? Are schools providing defective products if they're not monitoring the conversations that teenagers are having in the locker rooms, where they're perhaps discussing a drug deal? I don't know.

[00:57:46]

Well, so they're still going to have to litigate all of that, and the plaintiffs have a pretty heavy burden to litigate. So all this says is that 230 doesn't protect against it. So I think it's still going to be a tough case. But I think in some ways, this actually could be good for section 230, because these sorts of extreme cases, I think definitely the Speed Filter case, if they come out the other way, then that's sort of another entry on the list of, this is why section 230 is the worst law ever, it allows these speed filters. So I think that it might not be the worst thing to let some of these products liability cases get litigated, rather than have this be sort of another entry on the long list of harms that section 230 is blamed for protecting.

[00:58:40]

Well, Jeff, we're running out of time here. I want to ask if you have another book in the hopper. Personally, I just kind of want to know what the next big free speech issue is that I'm going to have to deal with, because you seem to know them ahead of time.

[00:58:54]

So I'm actually co-authoring a book for the first time, right now, with Jacob Mchangama, on the future of free speech globally. So what we're doing is stepping back and looking at not just the US, and this is pushing my boundaries, because I focus so much on the US. So I'm really excited to work with someone who has a more global perspective.

[00:59:17]

And Jacob's from Denmark, I believe, the founder of Justitia. He's a senior fellow here at FIRE. He's been on the podcast before, but he has that perspective. And he wrote the book Free Speech, which is a bold and ballsy title, a very authoritative-looking title, about the history of free speech from Socrates to social media. So he's kind of surveyed this landscape.

[00:59:40]

Yeah. And so what we're doing is taking more of a forward-looking approach and asking, okay, what's the state of free speech right now? And, I mean, frankly, globally, it's not great. We're facing a lot of challenges, and Jacob writes a lot about sort of the free speech recession globally. So we're looking at the challenges and really how the United States needs to set an example for the rest of the world in maintaining free speech protections. And we're coming up with some suggestions globally on how to stop the further erosion of free speech protections.

[01:00:23]

The United States is widely regarded as the most speech-protective nation in the world. Is there any place that's, like, a second? In your research, have you found...

[01:00:32]

That it really depends on which area of speech. So some countries have more protections for anonymous speech or for political speech, but there's not, I mean, the United States really stands out overall. I would say it's certainly not located anywhere in Europe right now. They're facing a lot of very big challenges. And their charter, their free speech protections, are more of a balancing act, and we're seeing a lot of laws being passed that really demonstrate the dangers of that.

[01:01:09]

Well, this is a fascinating conversation, and I hope to have you and maybe Jacob back when your book comes out. Do you know when it's going to come out?

[01:01:18]

Gosh, I think it's fall 2025, so we've got this year to write it. So I think that's the target date right now for the publication.

[01:01:32]

Well, that's great. I look forward to checking that one out. Again, our guest today is Jeff Kosseff. He is an associate professor of cybersecurity law in the United States Naval Academy's Cyber Science Department. But as he said at the top, he is not speaking on behalf of the Naval Academy or the Department of Defense. And his most recent book is Liar in a Crowded Theater: Freedom of Speech in a World of Misinformation. Jeff, thanks again for coming on the show.

[01:01:55]

Thanks so much for having me.

[01:01:56]

This podcast is hosted by me, Nico Perrino, and produced by Sam Nieserholzer and myself. It's edited by my colleagues Aaron Reese and Ella Ross. You can learn more about So to Speak by subscribing to our YouTube channel, where videos of these conversations are hosted. You can also follow us on Twitter or Instagram by searching for the handle Free Speech Talk, and like us on Facebook at So to Speak Podcast. And as always, we'd like to hear from you. You can shoot us an email, provide us your feedback, and ask us questions at sotospeak@thefire.org. We take reviews, too. Reviews help us attract new listeners to the show. So if you listen to us on Spotify or Apple Podcasts or Google Play, leave a review there. Again, it's the best thing you can do to support the show. And until next time, I thank you all again for listening.