[00:00:00]

My name is Frances Robles, I'm a national and foreign correspondent for The New York Times. I think people have a vision of reporters from the movies pecking away at the computer. I don't work in the newsroom. I work on the street. My sources are political dissidents, convicts, fugitives. Once, when an ex-con told me that he had been framed for murder, I spent months investigating it because that's the kind of journalism that I do, the kind that requires bearing witness.

[00:00:23]

If this kind of journalism is important to you, you can support it by becoming a New York Times subscriber. Go to nytimes.com/subscribe. From The New York Times, I'm Michael Barbaro. This is The Daily.

[00:00:39]

It's been four years since the 2016 election demonstrated the powerful role that social media companies have come to play in shaping political discourse and beliefs in America. Since then, there have been growing calls to address the polarization and misinformation promoted on these platforms. While Facebook has been slower to acknowledge a need for change, Twitter has embraced it and said that it made mistakes. But with three months to go until the 2020 election, those changes have been incremental, while Twitter itself is more popular than ever.

[00:01:21]

Today, a conversation with Twitter's CEO, Jack Dorsey, about whether those changes will be enough. It's Friday, August 7th.

[00:01:35]

Hey there. Hey. Hey, Jack, it's Michael. How are you? I'm good. How are you doing? Was that your icon? Was that you as a child, or...

[00:01:47]

That was me as a child. Oh, how old? Maybe three or four. And what should we deduce from the fact that your icon is a child?

[00:01:59]

Well, it's me. It's not just a child. And a reminder: the inner child.

[00:02:06]

The innocent child. Yeah. So if you're ready, I think we are ready.

[00:02:11]

Ready?

[00:02:12]

OK, Jack, I'm going to start with an intentionally provocative question, which is, do you believe that you are one of the most powerful people on Earth right now?

[00:02:24]

No, no, no, no, no, no. And why not? Well, because if it's a reference to the power Twitter has, I think that power is ultimately in the hands of the people who use it every single day. And that's been the thing that is most special about the service: everything that has made Twitter powerful has come from the people using it. The people really push the direction of where the service goes, where it is, and what it wants to be.

[00:03:00]

Our job as a company, and my job as an individual at the company, is to be a checkpoint on that.

[00:03:06]

But isn't that in many ways what you're grappling with, the way in which the people, and in a way the unchecked power of the people, has transformed Twitter? And isn't that a lot of what we're seeing from you now: an effort to kind of moderate that, to rein it in, to intervene in some way, to control that power?

[00:03:25]

I don't think it's unchecked power. First, I think that the people are constantly checking themselves, and that's what you see in public conversation in the first place. What we're dealing with, though, is people gaming systems, people taking unfair advantage of systems, people setting up accounts in order to manipulate conversation. And that's where we really need to focus our energies. Is an audience unfairly earned? Is it captured in some way that isn't consistent with reality, which would be a flaw in our system?

[00:04:02]

But are we also dealing with people checking not just themselves, but each other? And I didn't intend to go here so quickly, but this idea of cancel culture that is so present in our society right now, doesn't that have to do with something that isn't about gaming systems or taking unfair advantage of them, but rather with the incentive structure of Twitter itself?

[00:04:26]

Yeah, I think you're spot on with incentives. I think if we were to do all this over again, I'd rewind to the disciplines that we were lacking in the company in the early days, the ones I wish we would have understood and then hired for, like a game theorist to really understand the ramifications of the tiny decisions that we make, such as what happens with retweet versus retweet with comment, and what happens when you put it next to a like button. What does that mean?

[00:04:57]

So a game theorist, a behavioral economist to help us understand incentives, and then social scientists. Those are disciplines that we lacked that I think ultimately would have been important in helping us think about not just building a product, but building something that people use socially, and the ramifications of that. But I think, you know, to me, these things tend to be pendulum swings. And while we do see a lot of what you're labeling as cancel culture today, I do think it's important that we continue to allow the space for people to express their past and their history and context, because I think context matters so much. Because if we can express that, we can learn from it and then we can really progress and improve as a culture.

[00:05:50]

So you're acknowledging that Twitter has become a difficult place for context. I... it really depends on what part of Twitter you look at.

[00:06:00]

Eighty percent of Twitter is outside the United States. Eighty percent of Twitter doesn't really concern itself with what you're bringing up right now in politics and news Twitter. Certainly there are different ways of using it, some of which are great because they hold power to account. Some are not, because it doesn't allow for an evolution of an individual or an institution, or learning. But there are multiple Twitters happening in parallel all the time. And we in the U.S., and especially in the media, tend to focus on one small sliver of it that does have real impact in the world.

[00:06:39]

But, you know, from a company and service perspective, we have to pay attention to something so much larger.

[00:06:45]

Right. Well, we also have a leader in the United States who engages in that small sliver in a very big way.

[00:06:51]

Yes. And I want to talk about all of this that we're starting to dabble in in much greater depth.

[00:06:59]

But you seem to be saying that some very essential elements of Twitter from the start may contain flaws, have contained flaws or turned out to be flaws.

[00:07:09]

But I'd like to go back with you to the very beginning to understand how it is that we got here.

[00:07:14]

And I want to understand what you thought you were making and what you actually have created. So what did you imagine Twitter was going to be when you created it? And for that matter, why did you create it? What was it supposed to do?

[00:07:30]

Well, I think Twitter is unique, certainly to us, in that it wasn't something we really invented so much as something we discovered. And we kept pulling the thread on it. We didn't really have any specific concept around what it should be or what it shouldn't be. We saw some opportunity in technology based on all of our backgrounds and experience, and we were also kind of thrust into a world, a country, that was just getting access to it, and we built it to use it.

[00:08:04]

And immediately we felt it was incredible. And the moment that we felt there was something there, at least for us, was when we all went to our various homes or dinners or yoga sessions or whatever it was, away from the office, when it finally started working after two weeks, and we all were updating each other about what we were doing. And even though we weren't physically present with each other anymore, we felt together. And there was such an amazing feeling knowing that I would be sending an update and potentially it would buzz some of my friends' pockets and they would take it out.

[00:08:44]

And they would understand in that moment, immediately, what I was going through, what I was thinking. It was very cool. It was instantaneous. And they were brought into that moment, the moment it was happening.

[00:08:54]

But doesn't a text do that same thing? So what felt so exciting was the fact that the reach of this would be unconstrained?

[00:09:01]

Exactly. That the reach was unconstrained. It wasn't me choosing to send an update to these seven people.

[00:09:08]

It was me writing on the wall and people following updates from that wall. And the number of people following those updates could be infinite. There was a change in the model so that it was a broadcast and anyone could tune in, effectively, versus me selecting folks to receive a message. Mm hmm.

[00:09:30]

So you understood early on that the potential was kind of huge. Well, no, I wouldn't necessarily say that. I would say that it felt amazing and it felt electric and it felt very powerful to us. But what really showed us Twitter were the people and how they used it, and how they used it completely differently than how we used it.

[00:09:56]

So I want to talk about that. I want to talk about what I see as the beginning of the transformation of Twitter and the act of tweeting. And I see the start of that really as being the Arab Spring, you know, young people in the Middle East using this new platform to call for change, to document the calls for change, to document governments' response to those calls, and ultimately to bring the change, right, and really alter the course of history, because governments during this period literally fell and rose. And we see similar usage in the United States with Black Lives Matter.

[00:10:34]

And it felt like that development in the usage of Twitter was very much celebrated, especially by progressives, and it felt kind of noble.

[00:10:42]

But this also represents, in my recollection, the beginning of Twitter as a pretty active agent in politics. And that brought a lot of changes to the platform, including hyperpolarization. America's political discourse starts to play out on Twitter and it gets pretty nasty pretty quickly.

[00:10:59]

There's harassment, there's name calling, there's threats, there's anti-Semitism, there's racism.

[00:11:04]

And the idea that everybody who disagrees with me or disagrees with you, you know, that they're just an evil idiot.

[00:11:09]

That becomes a pretty powerful sentiment on Twitter, and it becomes a powerful sentiment kind of quickly. And I know that I just ticked through a lot there. So would you agree with that basic depiction of the transformation? Well, I think it's ignoring everything that happens on the Internet in parallel. Abuse and harassment did not start after this polarization or the political dialogue coming onto Twitter. It's been on the Internet forever. And certainly folks in the early days of Twitter were experiencing abuse, hate, harassment. It just wasn't acknowledged enough, unfortunately, by us in the early days, or by the general population and the general media.

[00:11:54]

So the fact that there wasn't acknowledgement or even observation or stories about it doesn't mean it wasn't happening. It was. And it just wasn't being made visible enough and acknowledged enough.

[00:12:08]

Are you saying that was a fundamental reality of human interaction or at least of human interaction online as opposed to a result of the incentives and the structures of Twitter?

[00:12:19]

Oh, I think it certainly has always been part of the structure of the Internet and some of what it incentivizes, as well. It's something that we certainly also saw play out, and it's not to say that we didn't incentivize it in different ways or amplify the behaviors that already existed, but digital communication has always seen these sorts of attacks and these sorts of trends.

[00:12:51]

Well, let's get into this, because I'm curious when you started to acknowledge that you need to address the incentives and the structures within Twitter. Because if this was understood for years, and it sounds like you're saying it was, why are we really only in the last few months, maybe the last year or so, starting to have this conversation?

[00:13:13]

Why are we only now hearing you acknowledge that the original model had mistakes and that changes are needed?

[00:13:23]

Oh, I've been talking about this for five years at least. But the reason why is, there is no general... Like, when we started Twitter, we were effectively trying to build something that we wanted to use. And then as we saw more people use it, more people wanted to use it. And that was a lot of our focus. And there wasn't enough focus on some of the challenges, specifically with online speech, that came with it. Eventually we hired for that and built policy and built enforcement around it.

[00:13:57]

But it's how we started. And I don't think it could really start any other way back then, just because we were pulling from a lot of decentralized models where there is no policy, there is no enforcement, it is all up to who uses it and the individuals, into something that is now centralized and requires a different approach and a different mindset. And we also didn't necessarily know exactly where this thing was going to go, or if it would exist in a few months.

[00:14:29]

But as we got more and more certainty about what we were building and what we were seeing, reacting to some of the negative aspects of what we were seeing, we recognized those gaps and then hired for them.

[00:14:40]

Do you think that you were too slow to make that move, to move from the focus on growth and kind of corralling all these new people using the service, to the focus on questions of speech and abuse?

[00:14:53]

I think, independent of that, we were probably too slow to assume that we had the right disciplines within the company to handle success. Meaning, if people were using this for more and more conversation, for more and more speech, this is less about building a product and more about how people interrelate with one another, how people converse with one another. And we were a bunch of engineers and designers and product managers. We didn't have the discipline around what we're incentivizing or what we're not incentivizing by the small choices or the large choices that we make.

[00:15:30]

We had to guess and we had to experiment and we had to see. And I imagine, you know, again, if we were to redo anything, we would really look at some of the problems that we were assuming we were going to face and make sure that we have the right skills, and not assume that product managers and designers and engineers have those skill sets.

[00:15:53]

So I want to actually dig into what we mean when we talk about incentives and structures. When it comes to incentives, what were they, as far as you could tell, on Twitter? And how did they play a role in both the popularity and, eventually, the trouble that we're talking about here on the platform? Well, I think choices around showing how many people follow you, and that that number was bolded and big in your profile, certainly incentivized

[00:16:33]

me to make that number go up, right? Or anyone. Like, that's a number that, for whatever reason, this product thinks is important, and that inherently incentivizes people to grow that number as quickly as possible. The decisions we made around having a favorite button on a tweet, and then shifting that to a like button, and that button having a number associated with it. So people wanting or constructing tweets that went viral and spread as quickly as possible through retweet numbers: the bigger those numbers, supposedly the better it is.

[00:17:11]

And what was the problem with that?

[00:17:12]

Specifically, what happens when you incentivize likes and retweets? Well, you know, it can create behaviors where people are writing headlines for people to click, so that eventually people see the ads behind that click. And is that really the right intention, versus informing people about what's happening? And we certainly saw a lot of behaviors where people were constructing tweets just to get as much spread as possible. And then we saw even more sophisticated attacks around that, where people found out ways to game the systems in order to get more visibility and to get their message higher than someone else's.

[00:17:57]

So, yeah, I mean, I think that spread without necessarily substance is an incentive that can be dangerous.

[00:18:06]

So this is coming back to this idea of nuance and whether that's possible, or, now I guess we're talking about whether it's rewarded, on Twitter.

[00:18:14]

Yeah, I mean, you know, one of the things that we are experimenting with is a small little feature where, if you tweet an article that you haven't actually even opened to look at or to read, we will give you a notification that says, hey, you haven't actually looked at this. Are you sure you want to spread it? Because that is a vector for information to spread that might be misleading, and for people to unknowingly participate in spreading misleading information.

[00:18:49]

I think there are people with intent, and then there are a lot of people who are just kind of seeing things, seeing a headline, seeing a particular tweet and saying, oh my God, and then spreading it without knowing what's in it.

[00:19:05]

And that is on us to help. Why not just implement that?

[00:19:09]

That's a pretty interesting idea. I'm trying to imagine what the downside of that would be, of making sure people have actually...

[00:19:16]

So what you're pointing to is some of our biggest issues at the dawn of the company: why just launch something instead of thinking deeply about it and seeing what the ramifications are? There are certainly positives that we can imagine, but there are probably some negatives as well. And, you know, if everyone has access to this and everyone is using it, how does that change the discourse? And maybe it's entirely positive, but maybe it's not.

[00:19:42]

Maybe there are some underlying hidden assumptions that we're making that we need to verify, or things that we're not seeing: new vectors of attack, new vectors of abuse. So it's stepping back and thinking deeply about every single small action that we're taking, having a hypothesis, like what we have with this particular feature, and then testing it and seeing how it plays out on a small scale. And then, as we gain confidence around it, yes, launch it to everyone. You're trying to learn the lessons of Twitter today, even as you think about changing the consequences of that.

[00:20:19]

The one thing I want our company to be incredible at, the one skill I want us to build, is our capacity to learn. It's a cycle of observe, learn, improve. If we can be incredible at that cycle, I'm confident we'll do the right things no matter what challenges we're facing. If we become too rigid, coming up with an idea like we just discussed and saying, let's just launch it and hope for the best, we're going to become more irrelevant or dangerous, and we just can't afford that.

[00:20:56]

Would you agree, and this might seem harsh, that you have not been incredible about that, and maybe not even especially good at it, but you think you will be in the future? About observing, learning, changing? I would agree that we haven't been awesome, but I think we're getting better and better every single day. And I think that is on display publicly, especially in this past year, around everything that we've learned and how we have evolved our policies and evolved our actions and our enforcement.

[00:21:28]

And I would say that the transparency the company has with the world right now is unique and something I'm very, very proud of, and it goes much further than most. So let's talk about algorithms for a moment and how you're thinking about those. What about the intentional surfacing by Twitter of particular types of messages, messages that tend to be hot, emotional to draw lots of eyeballs, controversial? Should that have worked differently? Well, it's a good point.

[00:22:05]

So these algorithms are constantly evolving, so it's not a past tense where we can't change things. But a lot of the algorithms are built on how people engage with the content, in the simplest form. Are people retweeting this tweet? Are people replying to it, or are people liking it? And if you stop there, then you get to a result where some of the most salacious or controversial tweets will naturally rise to the top, because those are the things that people naturally click on or share without thinking about it, or reply to.

[00:22:46]

So there has to be some balancing effect to that. It can't just be a pure read of that signal. It has to take in other signals.
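To make that idea concrete, here is a minimal sketch, not Twitter's actual code or ranking formula, of the difference between a pure read of engagement and a score that balances it with other signals. All field names, weights, and the negative signals used here (reports, mutes and blocks) are illustrative assumptions for this example only.

```python
from dataclasses import dataclass

@dataclass
class Tweet:
    # Illustrative engagement counts; field names are assumptions, not a real schema.
    retweets: int
    replies: int
    likes: int
    reports: int      # hypothetical: user reports of abuse or misleading content
    mute_blocks: int  # hypothetical: viewers who mute or block the author after seeing it

def raw_engagement_score(t: Tweet) -> float:
    """A pure read of engagement: whatever gets shared, replied to, or liked rises."""
    return 1.0 * t.retweets + 0.8 * t.replies + 0.5 * t.likes

def balanced_score(t: Tweet) -> float:
    """The same engagement signal, tempered by negative signals so salacious
    content doesn't win on raw engagement alone. Weights are made up."""
    engagement = raw_engagement_score(t)
    negative = 5.0 * t.reports + 3.0 * t.mute_blocks
    return engagement - negative

# A controversial tweet can out-engage a useful one on the raw score
# but fall below it once negative feedback is taken into account.
controversial = Tweet(retweets=900, replies=1200, likes=400, reports=300, mute_blocks=150)
useful = Tweet(retweets=500, replies=200, likes=900, reports=5, mute_blocks=2)

print(raw_engagement_score(controversial) > raw_engagement_score(useful))  # True
print(balanced_score(controversial) > balanced_score(useful))              # False
```

In this toy example, the more salacious tweet wins if you "stop there" with raw engagement, but loses once other signals are folded in, which is the balancing effect being described.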

[00:22:55]

I think the most important thing was, it was at one point basically a pyramid of those signals. I wasn't fully present in all the decisions around how those were constructed originally, but I'm sure it was because you have to start somewhere and then you have to evolve based on what you learn. I think it does point to a few issues around our problems that we as a company, and also an industry, need to solve for. They are way too much of a black box.

[00:23:25]

They are not written in such a way that they express what criteria they're using to make decisions, or even can express how they made a particular decision. And that's important, certainly, when you consider a ranking algorithm and what you see versus what you don't see. So we need to open up and be transparent around how our algorithms work and how they're used, and maybe even enable people to choose their own algorithms to rank the content, or to create their own algorithms to rank it.

[00:23:58]

To be that open, I think, would be pretty incredible, so that we can all come to better solutions, because it affects society in such large ways. We'll be right back.

You want a diverse portfolio, so you'll have to do a lot of legwork, right? Well, not really. See, with iShares Core ETFs, you don't have to research individual stocks and bonds. That's because iShares Core ETFs are designed to give you broad and balanced building blocks across a range of major asset classes, making diversification easy peasy.

[00:24:43]

Take a closer look at iShares Core ETFs and get a new perspective on your portfolio. Learn more at iShares.com. iShares. Invest in something bigger.

Can you give us a vision for Twitter, for the future of Twitter? Can you describe how its incentives and algorithms in this vision work? I mean, what they reward, and what that version of Twitter actually looks like?

[00:25:07]

Help us see it. Well, what I believe we're building is a conversation layer of the Internet. I believe so fundamentally in the promise of the Internet and what it enables in our world, and what I think Twitter represents is the conversation layer of that, and the public conversation layer in particular. And I think one of the things I get really excited about as I look at the trends of technology is, number one, the trends of translation technology and real-time translation.

[00:25:41]

And I think we're moving from a world where a lot of people had to normalize around the top three languages in order to communicate with each other, such as English and Spanish and Mandarin. And given what we're now able to build with translation technologies, we're actually able to realize a future where, when I come to a service like Twitter, I can express myself in my own dialect and anyone in the world can understand it in real time.

[00:26:12]

But beyond the language thing for a minute, because I want to be super clear about this, give this to me in a way that I think a casual listener would understand. If I'm saying to you right now that, for me and for many people, Twitter feels like a place where people go to share and to comment on emotional and attention-grabbing and often divisive messages, how do I describe that and experience that in three years and five years?

[00:26:39]

What's fundamentally different about it?

[00:26:42]

Well, or maybe it's not. Again, I think that the reason I talked about translation is because you have more voices and you have more people participating, and I think that is important. We don't have enough people participating in this in a way that they're comfortable with.

[00:26:58]

So you want Twitter to be a better reflection of the whole world of more people? Absolutely.

[00:27:04]

Do you think that would solve the civility question that we're dealing with?

[00:27:07]

I mean, it's possible. I'm not saying it solves it, but as we have more representation around the world, then it shifts into another issue. And another challenge is: how do you focus on relevance, and what is relevant to me and what's not relevant to me? And that's where the algorithms come in. And relevance can't just be, did I write something that is contentious or written in a way that is meant to spread, but is it actually valuable and relevant to a certain population or a certain aspect of the population?

[00:27:45]

And there are going to be certain conversations that span communities and span nations and span cultures. But the majority of them are not. The majority of them are going to be more localized. So I think, and I don't know what the time frame is, but you'll see a Twitter that has this blurring of a very localized conversation, whether the locality be cultural or geography-based or a topic. That was not the experience of Facebook in, say, Myanmar.

[00:28:17]

I mean, there are places where social media has been expanded and people have deployed those social media services in pretty horrible ways, in ways that have been genocidal.

[00:28:27]

So isn't it possible that what we're talking about here just spreads across the world? Because if the fundamentals don't change about what is shared, why it's shared, what our eyeballs are drawn to, what the fundamental incentives of this system are in a place like Twitter, then aren't we just about to export it to other places?

[00:28:48]

Isn't more attention on those problems, isn't more acknowledgment of those issues unfolding in real time in public, important, so that more people can try to help solve them? Some people who may not have had access because they didn't understand it in the past now understand it and can jump in, even if they're not particularly in the location. I guess that depends on if you think Twitter, and I'm curious what you think, is a reflection of society, or an amplifier or creator of divisiveness and polarization.

[00:29:23]

It's kind of a chicken-and-egg thing all over again. Yeah, like any tool, it can be both. With all these tools that we build, all technologies, we start using them in one way and we discover all these problems, and then we address those and we continue that iteration. This is not unique to this time. And it goes back throughout our history as a civilization. You can't pick any tool that wasn't used in some way both positive and negative.

[00:29:51]

And the same is true for the tools of the Internet.

[00:29:54]

The reason I'm asking that original question is I want to know how you are going to try to ensure that it's not both, right? How do you ensure, before you grow again globally, that you've solved kind of the root problem? And what does that algorithm look like?

[00:30:09]

I think that's just the wrong way to think about it, that there's one solution. So far I've only heard you really say that the solution was growth and translation.

[00:30:19]

So I just want to, I really want to be sure I give you a chance, and that we understand what you see as, even if it's not just one solution, maybe a very clear, specific set of solutions.

[00:30:30]

It's just a constant. To me, it's a constant iteration. It's a constant push to be steps ahead of how people might utilize this in a negative way, address that and then see new potential use cases that are negative. If we try to develop a perfect system that solves the problem, quote unquote, we're just going to get it wrong and it's going to evolve past it, showing itself in that way in so many different cases.

[00:31:00]

I wonder if I can get an example from you, beyond the experiment you're running on making sure people have opened articles before sharing, of how you might keep iterating for good on Twitter. I mean, there are tons of examples, but one example is another experiment that we've run in the past and that we're running, which is, for any particular article that is shared, you might see one point of view in this direction, another point of view that is slightly different, and another point of view that is completely different, just to show and kind of break through some of the bubbles that we tend to naturally build. Because I think the hope and the hypothesis is that you might see these different takes and it might incentivize you to really dig deep into the article, or actually watch the video that's being shared, so you can have your own informed point of view and share your opinion as well.

[00:31:54]

And the more of that variety and diverse perspective we have across language barriers, across cultural barriers, you know, we get to better answers.

[00:32:05]

Of course, one thing you have done that we haven't talked about a lot here is you've applied a layer of kind of fact checking and flagging of things that are not accurate and may have, you know, significant public consequence, especially when it comes to something like the coronavirus.

[00:32:19]

And as we're talking about your evolution from growth to a focus on questions of speech, I wonder: do you care if groups like conservatives in the U.S. feel like you have a bias against them? I mean, one of your peers, Mark Zuckerberg, does seem to care about that quite a bit. Do you care if some meaningful percentage of Americans feel that you are somehow suppressing them, censoring them, as part of your iterations and your growth?

[00:32:46]

I absolutely do. I come from the very conservative state of Missouri, where my hometown is much more liberal, so it's in this basket of conservatism, and my dad is very much a conservative. So I absolutely do care that we're building a system that does not take our own bias into account, but feels fair. And I think one way to show that is to continue to be a lot more transparent around our decisions, to continue to be clear in our policies, which we haven't been in the past, and to be transparent around our actions and the why behind them.

[00:33:29]

I think that brings us naturally to President Trump and his own use of Twitter, and there's perhaps no one who has better grasped the incentives of Twitter and how to exploit them than Donald Trump. I once sat in his office with him before he was president, and he talked to me about how savvy a user of Twitter he was and is, and he's turned out to be very savvy about it. Would you agree with that?

[00:33:51]

I mean, that Donald Trump is one of the most deft users of this platform you've created.

[00:33:57]

He's definitely used it to great effect, and I wouldn't say he's necessarily the most, because it's really just a question of what your goals are. I would say his usage tends to be consistent. He started in a particular way and that has remained consistent to today.

[00:34:18]

But I guess what I'm saying is he's taken advantage of the existing incentive structure. Does that feel right? And he continues to kind of use it to his advantage. And given the emotional quality of his tweets, it suggests that Twitter is still very much rewarding those incentives.

[00:34:35]

How would you say that? How would you say it's rewarding?

[00:34:37]

Well, versus society rewarding that, versus, you know, the media constantly pointing the cameras on that and putting all the attention on that: how is people using Twitter different from the approach that we're seeing elsewhere? I believe that was a rhetorical question, but TV is a mediated, or somewhat mediated, form. And for the most part, until very recently, when the president wanted to say something that contained misinformation, that contained an outright lie, that called a judge a name, that expressed a racist sentiment, that pretty much was his to do on your platform without any real layer or mediation.

[00:35:25]

And they got tons and tons and tons of retweets and picked up attention. Well, you're completely ignoring the layer of people who push back on any one particular tweet, or reply to it, or spread it with a correction to their followers and to hashtags or search terms, so that there is mediation. But it's a question of the people doing the mediation versus the centralized media doing it.

[00:35:51]

So you're right there. And again, a lot of our policies and focus in this particular area are really focused on the velocity and the spread of information, and the gaming of these systems, and where this might cause harm if people were to see it and take it out of context. So we did label tweets of his that we felt could be harmful, because they may have led people to believe that they were registered to vote when, in fact, they weren't.

[00:36:22]

And we didn't take the tweets down. We presented them, annotated them, with information as to the facts expressed by the various institutions that were doing the work around registering voters. So I think it's important that we do recognize, number one, that these annotations are happening by the crowd in real time, all the time. And number two, there are particular areas, such as voter suppression and election integrity, that we should also take action upon, and we should make that policy as tight as possible and we should make those interventions as infrequent as possible.

[00:37:06]

But the reality is they have to be there.

[00:37:09]

Are you prepared to ban Donald Trump from Twitter if you feel that he has repeatedly violated your rules, your terms of service? And I ask this because you have banned certain figures from Twitter, the most famous being Alex Jones of Infowars, because you all said that he had violated your rules around abusive behavior. So does Donald Trump break those rules? And, you know, I mean that because like I said before, he attacks judges or he calls women dogs or he spreads false and misleading information on a routine basis.

[00:37:39]

So we are independent of any particular account. We hold all accounts to the same rules. But if there are particularly egregious violations of our terms of service, we won't hesitate to take action on the accounts and use every tool that we have, whether it's the U.S. president or any leader around the world. We will take action if we feel it necessary.

[00:38:06]

I know people resist hypotheticals, especially people in your position, but sometimes they can be important. And let's say that it is November third or fourth or fifth or sixth or seventh, because this is going to be a very unusual election, and President Trump takes to Twitter and declares that he is the winner, even though that's not yet clear or accurate. And in some ways, our democratic system at that moment is going to be very severely tested. If he makes that declaration on Twitter, what do you do?

[00:38:31]

Well, I guess I would look for opportunities to learn from the past. So did we not see some of this play out way back during Gore and Bush in Florida, in terms of being confused as to what the end state of the election was and how that evolved? So look for lessons in history and work with our peers and civil society to really understand what's going on, and then make an informed decision. But as you said...

[00:39:08]

It's a hypothetical that we just need to think a lot more about, in terms of the integrity of the conversation around the election and what that means and what that looks like. And this is our number one focus area for the health of the public conversation in this country: the conversation around elections. And you will see us continue to evolve our policy to protect the integrity of the conversation around elections. Listening to you talk, you are very measured, you are calm, you are careful. And you're the CEO of a platform that in its current form thrives on emotion, that's notorious for elevating the most, as we've said, kind of hot, sensational, charged views.

[00:39:56]

A colleague of ours, anticipating this conversation and kind of knowing your tone, joked that if you tweet something that sounds like Jack Dorsey, it probably won't do all that well on Twitter.

[00:40:07]

And honestly, Twitter doesn't seem like you.

[00:40:11]

Exactly. And you don't exactly seem like Twitter. So do you like Twitter right now? Do you like what it's become?

[00:40:17]

Do you like it, what it is as it exists today?

[00:40:22]

One view of Twitter, and I think it's a very specific view, is us all focused on these reactionary, emotional, headline, clickbait tweets, when that's just not the reality of the majority of our usage in the world. And not that it's not important to focus on news and politics and how that changes the discourse. And not that it's not important to help do everything that we can to fix it. But I think the way to do that is to listen, and to use the tool in such a way that we can really understand how society is evolving, how technology is evolving that we can utilize to help these problems in the first place.

[00:41:05]

So, yeah, I don't use Twitter to get as much spread as possible. I use it to listen and to observe and to understand our world and my world and myself. But if Twitter doesn't change meaningfully from its current form, does it remain deeply flawed? It would be silly for us not to change, too. So, yes, it should become irrelevant if it doesn't change, if it doesn't constantly evolve, and if it doesn't recognize gaps and opportunities to get better.

[00:41:42]

So, absolutely. We would earn that irrelevance in that particular case. Well, Jack, thank you very much. We appreciate your time. Thank you. Thanks for doing it. Thank you so much. We'll be right back.

A lot of companies are struggling right now. Zendesk is here to help. Their remote support bundle comes with the basic tools your team needs to stay agile and connected with customers, whether it's by email, phone, chat, help center or social media.

[00:42:45]

And with Zendesk, it takes hours, not weeks, to get up and running. Their customer support software is easy to use and quickly scales to meet changing needs. You can try it now for six months for free. Go to zendesk.com/thedaily to get started.

The Daily is made by Theo Balcomb, Andy Mills, Lisa Tobin, Rachel Quester, Lynsea Garrison, Annie Brown, Clare Toeniskoetter, Paige Cowett, Michael Simon Johnson, Brad Fisher, Larissa Anderson, Wendy Dorr, Chris Wood, Jessica Cheung, Stella Tan, Alexandra Leigh Young.

[00:43:26]

Jonathan Wolfe, Lisa Chow, Eric Krupke, Marc Georges, Luke Vander Ploeg, Kelly Prime, Julia Longoria, Sindhu Gnanasambandan, M.J. Davis Lin, Austin Mitchell, Neena Pathak, Dan Powell, Dave Shaw, Sydney Harper, Daniel Guillemette, Hans Buetow, Robert Jimison, Mike Benoist, Bianca Giaever, Asthaa Chaturvedi, Rachelle Bonja and Liz O. Baylen. Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly. Special thanks to Sam Dolnick, Mikayla Bouchard, Lauren Jackson, Julia Simon, Mahima Chablani, Nora Keller and Desiree Ibekwe.

[00:44:15]

That's it for The Daily. I'm Michael Barbaro. See you on Monday.

The Capture is a new Peacock original series that explores pressing questions about surveillance and misinformation in a post-truth world. Seeing can be deceiving. Hailed by critics as a thinking man's Bodyguard, The Capture is a modern-day spy drama set in London that begins with the arrest of a former soldier and then spirals into a thrilling conspiracy involving manipulated video evidence. All episodes of The Capture are available now on Peacock, the new streaming service from NBCUniversal.

[00:44:59]

Sign up at peacocktv.com to stream now.