[00:00:00]

We hold these truths to be self-evident, that all men are created equal. As a member of Congress, I get to have a lot of really interesting people and experts on, and hear what they're talking about. This is the podcast for insights into the issues: China, bioterrorism, Medicare for All, in-depth discussions, breaking it down into simple terms. We hold these truths. We hold these truths with Dan Crenshaw and Gloria Borger. Welcome back, folks.

[00:00:25]

Another special episode of watching hearings with your congressman. So I know you guys aren't all on C-SPAN all the time watching what we're doing. I do try to post this stuff on social media, but people see what looks like a Zoom meeting and they just tune out. I get it.

[00:00:42]

But you're listening to my podcast, and hopefully you gave it five stars.

[00:00:44]

So we're going to play the hearing we did in the Energy and Commerce Committee with big tech. Now, the Democrats wanted to call this hearing because they also hate big tech, but they hate big tech for different reasons. They think big tech spreads too much conservative thought. That's what Democrats think now. They don't call it that. What they say is that it's spreading misinformation. But when you listen to what Democrats are saying, what they really mean is that it's spreading conservative opinions.

[00:01:14]

They don't like that. Now, that's a point that I don't think we bring up enough. You know, we all like to yell at big tech. We're concerned about censorship and the anti-conservative bias of the censorship that they engage in, which I think is definitely true.

[00:01:32]

But there's a much deeper threat than that. OK, you've got to peel it back another layer.

[00:01:39]

Why do they even have this bias? Why do they feel pressure to have this bias?

[00:01:46]

Look, the truth is, Jack Dorsey, Mark Zuckerberg, they didn't start these platforms with that intent. They didn't. I mean, I was there when they started it.

[00:01:55]

I was at one of the first schools in 2004 to get Facebook. It was open. Twitter was open. The early days of social media had pretty vast engagement. And over time, they moderated content more and more, and algorithms started to drive you more towards what they thought you wanted. And this has a ton of consequences. They added more and more people to be sort of the arbiters of these algorithms, to decide what would be censored and what wouldn't, and they continued to tinker with those things.

[00:02:24]

And of course, people have biases, and the vast majority of people who work at these tech companies are left-wingers. That's just true. We can measure that based on donations, things like that. But again, let's peel back the layer real quick. What causes that culture? It's leftism, and it's the Democrat Party that embodies it. Which is what I noted from this hearing: how much the Democrats think big tech doesn't censor enough.

[00:02:54]

That's their issue. Our issue is that you censor too much. Although we do have a lot of Republicans concerned with online bullying, with children getting on these platforms and sort of having their brains rewired in certain ways. That is concerning as well. I do tend to think that, though I'm not so sure I know what I want big tech to do about it. I do know that parents need to, you know, be very careful about what they're letting their kids engage in on these online platforms, because as human beings, I don't think we've figured out how to communicate this way effectively and in a healthy way yet.

[00:03:31]

I think that is true and that's a worthy conversation to have.

[00:03:35]

Republicans also voiced a lot of concern about the illegal activity that continues to be prolific on a lot of these platforms: human trafficking, drug trafficking, things of that nature.

[00:03:46]

I focused on something different. I focused on the cultural problem we have, where there's a growing number of people in this country that do not believe in free speech and free debate. And I kept my attention on Mark Zuckerberg, just because I only get five minutes, right? I would have liked to ask Jack the same thing. You know, do you even believe in these values? And he emphatically said yes.

[00:04:16]

And I'll be honest. I believe him. I believe him. I actually do. And everything I've noted about him in the past confirms that belief. Is he a liberal? Yes.

[00:04:28]

But I think he might be an actual liberal, like, you know, the kind of liberal I disagree with on a vast majority of things.

[00:04:34]

But who still believes that free speech, while it's messy, while it causes a lot of frustration and confusion and problems, is still better than the alternative, which is authoritarianism. And look, we've got to stop pretending like the freedoms we have in America are normal in Western society.

[00:04:56]

They're not. People are being jailed in Canada because they misgendered their daughter. In the UK, Piers Morgan was investigated by the government because he got a bunch of complaints and because he criticized Meghan Markle. This is 1984 authoritarian nonsense that happens in many other European countries, too. They don't have free speech. They don't have a First Amendment. Their right to free speech has long been protected by the culture. There's just been the cultural agreement and precedent that you would value this thing we call free speech.

[00:05:34]

But it's never been protected in their constitutions. We're the only ones who do that. That's a pretty big deal. But our Constitution can only last insofar as the American culture believes in it. That's a fact. We've changed the Constitution over time, usually for good things. Remember, women couldn't vote according to the Constitution. Then we changed it, because the American culture decided that this was no longer relevant, no longer valid. So we had to change it.

[00:06:03]

Don't think that we can't change the Constitution again if the American culture stops believing in the First Amendment. And there's a growing number of people who don't. I witnessed an entire political party speak out against the First Amendment, speak in favor of censoring their political opposition. Now, Democrats know that they can't do that. They know that the First Amendment protects you from them. But they've also found out that there's a really powerful entity out there, unlike any entity before, and that's big tech.

[00:06:34]

Now, again, a lot of us conservatives like big tech. The only reason you're hearing this right now is because of big tech. The only reason you hear a lot of what I tell you is because of big tech. So, look, I'm a little bit more careful than a lot of conservatives about just bashing big tech. Big tech is our best weapon against the left, because the left has managed to infiltrate mainstream media and the education system far more than we ever did.

[00:07:00]

So, you know, does it need some reform? Yeah, and I've laid out in past podcasts what kind of reform is needed, and I lay it out at the end of that hearing, too. But let's be really clear on who the real enemy is. The real enemy is the people pressuring big tech to do their bidding and to infringe on the First Amendment on their behalf, because they can't do it themselves.

[00:07:25]

That's what's happening, guys. That's what I pointed out here. And I wanted to let them know, I wanted to let Mark Zuckerberg know: I'm on your side, if you actually believe what you just told me you believe. So I asked some very pointed questions. I asked him, do you believe that you're the arbiter of truth for political opinion?

[00:07:42]

He says no. And look, I'll be honest.

[00:07:44]

During that hearing, he and Jack Dorsey both pushed back a little bit on Democrats, when Democrats were like, why won't you censor this? Why won't you do this?

[00:07:53]

And they didn't give them much. OK, we have to continue to foster that. We have to continue to say, look, we're the ones who believe in these platforms. We just want them to be more fair, OK? We want them to be more transparent. We want a real appeals process when people get banned. And look, you can make up whatever rules you want for your platform, but they need to be clear and they need to be applied equitably.

[00:08:21]

And I shouldn't be able to easily find examples where they're not applied equitably. And all of us can find those examples all the time.

[00:08:27]

I brought up one during this hearing. You know, I watched both of those videos, the Project Veritas video and the CNN video, and they're exactly the same.

[00:08:37]

It's a reporter with a microphone. It's labeled. It's not some secret recording. And they're just going and confronting somebody at their home.

[00:08:44]

Now, you can argue about that. Again, I would be really annoyed if somebody confronted me at my home.

[00:08:49]

So, you know, again, that's not to say it's not really, really obnoxious. But if we're talking about things that get taken down, then they should both be taken down. Now, I don't think either one should be taken down, but that's what we're talking about. That's what I want to make clear to big tech. I think it's an impossible job to try and monitor and regulate so many forms of speech, hate speech being one of the most annoying ones. And I looked through their definitions of hate speech.

[00:09:25]

Some of it's not so bad, it's OK. Maybe I'll just read Facebook's; Twitter's is actually much shorter. But the thing is, I'll read some of it and then you can be the judge. So Facebook: we define hate speech as a direct attack against people on the basis of what we call protected characteristics: race, ethnicity, national origin, disability, religious affiliation, caste, sexual orientation, sex, gender identity and serious disease.

[00:09:54]

We define attacks as violent. This is important, because what does it mean to attack somebody? We define attacks as violent or dehumanizing speech, harmful stereotypes, statements of inferiority, expressions of contempt, disgust or dismissal, cursing and calls for exclusion or segregation. And it's not over, but let me just point something out here. Just this week, Senator Tammy Duckworth said that she wouldn't vote for any nominees who were white.

[00:10:27]

So if she posted that on Facebook, I would assume Facebook would take that down. Of course, would they? Who knows. That's the problem. And should they? No, they shouldn't. I want to know what she's saying. I want to know the garbage that's coming out of people's mouths. You know what I would suggest to Facebook? List words. List words that you can't say.

[00:10:48]

Again, Facebook has every right. They are protected by the First Amendment. And my goal is to get them, because they're protected by the First Amendment, to transfer that protection to their users to the greatest extent possible. I get that it's not going to be totally free speech as per the law on their platforms. That's their right. But in the spirit of the First Amendment, I would encourage them, and I hope for them, and I've told them this: try to match it as much as possible, be as clear as possible about what's OK and what's not

[00:11:22]

OK on your platform. They go on to say: we consider age a protected characteristic when referenced along with another protected characteristic. We also protect refugees, migrants, immigrants and asylum seekers from the most severe attacks, though we do allow commentary and criticism of immigration policies. Similarly, we provide some protections for characteristics like occupation when they're referenced along with a protected characteristic.

[00:11:45]

We recognize that people sometimes share content that includes someone else's hate speech to condemn it or raise awareness. In other cases, speech that might otherwise violate our standards can be used self-referentially or in an empowering way. Our policies are designed to allow room for these types of speech, but we require people to clearly indicate their intent. If intention is unclear, we may remove content.

[00:12:09]

Well, I mean, there's a lot to unpack there. I think the main issue is it's vague. It's vague. Like, I still stand by this: I don't think hate speech is a real thing. I think incitement to violence is a real thing, and it can be defined rather clearly. Now, they sort of define it that way. You know, they say we define attacks as violent, but then they say dehumanizing speech. And what's dehumanizing?

[00:12:37]

Go through the comments on my social media. There's a lot of dehumanizing stuff. I am constantly berated for having one eye. They never take it down. You know, should they? I don't know. I kind of want to know how garbage human beings think. I mean, I can take it. I have thick skin.

[00:12:55]

I also have the ability to turn off comments if I really don't want to see it. So, look, I'm a bit more of a free speech absolutist. That's just who I am. But I think this description is far too vague for any group of people to properly and fairly enforce. So they do this to themselves. They give themselves this impossible task of trying to regulate speech. There's a reason that we don't do any of this in law, though.

[00:13:24]

There's a reason that in law you have to clearly show intent and action when you're talking about incitement to violence, which is not protected speech. On Twitter, what do they have? Their policy on hateful content says you may not promote violence against, directly attack or threaten other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability or serious disease.

[00:13:52]

That's a pretty reasonable statement. That's a little bit more clear. You may not promote violence against: that's basically incitement to violence. Maybe more detail that's more in accordance with current law wouldn't be a terrible idea. Directly attack, threaten: OK, those are clear words. I'll give them that. Now, is it applied clearly? A lot of people will disagree.

[00:14:15]

That's the problem. This is also a problem. It says we also do not allow accounts whose primary purpose is inciting harm towards others on the basis of these categories. Well... Antifa, Hamas, the ayatollahs. I mean, come on. Now, I guess Antifa is just against conservatives, and according to this, political affiliation is actually not one of their protected classes.

[00:14:47]

So I guess that's how they get out of that one. But under all of these, I would think Antifa content would have to be taken down. I mean, they're basically a domestic terrorist organization. They're constantly engaging in criminal activity. I don't have it in front of me, but there was a longer version for Facebook, and it clearly said, you know, anybody who's promoting criminal activity, we take down. But again, why are there Antifa accounts at all?

[00:15:14]

Again, by these standards, they should be taken down.

[00:15:17]

And I think what drives conservatives crazy is the fact that this is just not applied equitably. And look, if it were up to me, I would say, yeah, Antifa stays up. I want to know what Antifa is doing. I want law enforcement to be able to track Antifa on social media as they're doing all the crazy things.

[00:15:37]

So I'm more of a free speech absolutist on this. These things are just far too difficult to actually enforce, which is the point we're making.

[00:15:48]

And I think big tech can do a much better job getting to the right place on that. Actually, one of the better suggestions that we heard in this came from Zuckerberg, mostly, who more actively advocates for reforms to Section 230. And I think deep down where he's coming from is he just doesn't want to deal with it anymore. And they've talked about sort of an industry standard, an industry group, the same way that,

[00:16:21]

you know, it was agreed upon that on cable television networks, certain words and pornographic images would just be banned. That was decided upon by a sort of industry standard, and maybe something similar is required here. That, again, is more standardized, more transparent, and there's just less guesswork, because it's the guesswork that's driving everybody crazy.

[00:16:45]

Look, you can't have a platform that's a total free-for-all. Even Parler is not a total free-for-all. It's important for people to know that they moderate content. You have to; the user experience would be pretty bad if you didn't moderate any content at all. But at the same time, if you moderate too much, you lose the trust of the users. And you're also selling them a false bill of goods. And I think that people feel like they've bought into these platforms,

[00:17:14]

and then when these standards are not applied equitably, their rights as consumers have been violated. I think that's a fair statement to make. That's the approach that I personally take to all of this. But more importantly, like I said, the real threat here is that there's an entire political movement that wants to weaponize social media platforms to infringe on the First Amendment. We can't let them. And we have to let the social media companies know that as long as they resist that, we have their back.

[00:17:47]

If they don't resist it, they're the enemies of liberty. That's a fact. OK, that was a long buildup. Without further ado, here's my five minutes from this hearing, and I hope you enjoy. Take a listen.

[00:18:02]

I've been on social media longer than anyone in Congress. I think I was at one of the first schools to have Facebook, back in 2004. And it seemed to me that the goal of social media was simply to connect people. Now, the reason we're here today is because over time, the role of social media has expanded in an extraordinary way. Your power to sway opinions and control narratives is far greater than the US government's power ever has been. So I noticed a trend today: there's a growing desire for many of my colleagues to make you the arbiters of truth. They know you have this power, and they want to direct that power for their own political gain.

[00:18:39]

Mr. Zuckerberg, since Facebook was my first love, I'm going to direct questions at you. And this isn't a trick question, I promise. Do you believe in the spirit of the First Amendment, free speech, robust debate, basic liberal values? Yes, absolutely. See, my colleagues can't infringe on the First Amendment. The American people and their speech are protected from government, as they should be. My colleagues, this administration, they can't silence people they disagree with, no matter how much they want to.

[00:19:06]

But I do think they want to. Just in this hearing, I've heard Democrats complain about misinformation when they clearly mean political speech they disagree with. They've complained today that Prager University content is still up. I've heard them accuse conservative veterans of being tinfoil-hat-wearing extremists and say their opinions on climate change that they disagree with should be taken down. This is quite different from the Republican complaint that illegal content needs to be addressed. There's a growing number of people in this country that don't believe in the liberal values of free speech and free debate.

[00:19:38]

I promise you, the death of the First Amendment will come when the culture no longer believes in it. That's when it happens, and it becomes OK to jail or investigate citizens for speech, like has happened in Canada and throughout Europe. Their culture turned against free speech. You, sitting here today as witnesses, are part of the culture. You can stand up for the spirit of open debate and free speech, or you can be the enemy of it. Your stance is important, because it's clear that many want to weaponize your platforms to get you to do their bidding for them.

[00:20:08]

Mr. Zuckerberg, do you think it's your place to be the judge of what is true when it comes to political opinions? Congressman, no, I don't believe that we should be the arbiter of truth. Thank you. And I promise you this: as long as you resist these increasing calls from politicians to do their political bidding for them, I will have your back, and you won't become an enemy of liberty and long-standing American tradition. You might all agree in principle

[00:20:34]

with what I just said. Mr. Zuckerberg, you clearly do, and I appreciate that. I have a feeling the others would answer it as well; I just don't have time to ask everybody. But the fact remains, the community standards on social media platforms are perceived to be applied unequally and with blatant bias. Mr. Dorsey, just one example: I saw a video from Project Veritas that was taken down because they confronted a Facebook executive on his front lawn.

[00:20:58]

But here's the thing. I can show you a video of CNN doing the exact same thing to an old woman who was a Trump supporter in her front yard. I've looked at both videos. It's an apples-to-apples comparison. CNN's remains up; Project Veritas's was taken down. I'll give you a chance to respond to that. I have a feeling you're going to tell me you'll look into it. I don't have an understanding of that case, but I would imagine if we were to take a video like that down, it would be due to doxxing concerns over a private address.

[00:21:29]

The address was blurred out. You don't have it and you don't have the case in front of you, I get that. The point is that there's countless examples like this. I just found that one today. There's countless examples like this. So even if we agree in principle on everything I just went over, you guys have lost trust, and you've lost trust because this bias is seeping through. We need more transparency, we need a better appeals process, more equitable application of your community guidelines, because we have to root out political bias in these platforms.

[00:22:02]

I think, and I've talked with a lot of you offline, or at least your staff, I think there's some agreement there. And I haven't heard anybody in this hearing ask you what you're doing to achieve these goals. So I will allow you to do that now. Maybe, Mr. Zuckerberg, we'll start with you.

[00:22:20]

Sorry, to achieve what? These goals: more transparency, a better appeals process for takedowns, more equitable application of community guidelines. So for transparency, we issue quarterly community standards enforcement reports on the prevalence of harmful content in each category, from terrorism to incitement to violence to child exploitation, all the things we've talked about: how much of it there is, how effective we are at finding it, and stats on that. For appeals, the biggest thing that we've done is set up this independent oversight board, which is staffed with people who all have a strong commitment to free expression, and people in our community can ultimately appeal to them.

[00:23:04]

And that group will make a binding decision, including overturning several of the things that we've taken down and telling us that we have to put them back up, and then we respect that. So there you have it. It would probably be good to talk to them for another hour and grill them on, I think, a more productive conversation about, look, how are you going to get better transparency? How are you going to improve the appeals process when people feel unjustly censored or deplatformed?

[00:23:31]

And how are you going to ensure that these extremely vague standards on hate speech that we went over before can possibly be applied equitably? And you know what, I would have continued to ask that. I only got to ask Zuckerberg, but I wanted to ask Jack the same thing: do you think you're the arbiter of truth? He probably would have said no in that particular hearing, and that's fine. Good. I'm happy we agree in principle. That means we're allies. But don't make us enemies by not living up to that principle.

[00:24:03]

Resist the left, man. That's what I tell these tech CEOs. Resist the cancel culture. Stop letting them threaten you. Right? They threaten you with advertising pulls. You know, they threaten you with public pressure. It's the same with all CEOs: when you stand up to the woke mob,

[00:24:23]

I've never seen anybody who's regretted that. You know, Goya being a great example. Everything got sold off the shelves when they stood up to the woke mob. That's just what happens. It's worth it. It's worth it just for the sake of Americanism, for the sake of the spirit of the First Amendment. If we lose the culture of the First Amendment, we lose it all. And I want to believe that these guys up top at these companies at least started off believing in that. And it's important that we let them know that if they do the right thing, we've got their back.

[00:24:59]

If they do the wrong thing, we don't. But we need them and they need us, so let's keep fighting the fight out there.