[00:00:00]

If we don't have spaces where we can start to come to some common understanding, like the whole democracy thing doesn't work. But how do we do that? How do we speak to each other in a divided society? And I think the exciting thing, the really positive thing is that these problems are solvable. And we know that because they have been solved many times before.

[00:00:22]

I am very pleased to have someone I admire, respect and have known for many years and call a friend, Eli Pariser, who is author of The Filter Bubble and gave a very famous TED talk that led to many of the things everyone now is talking about in terms of echo chambers, the dangers of personalization. Eli also runs Civic Signals, which is a nonprofit that is reimagining the new digital public spaces that we need and importantly is putting on a festival called the New Public Festival, happening January 12th through 14th online.

[00:00:53]

The reason that I wanted to invite Eli on is that a lot of people now — we estimate close to 100 million — have seen The Social Dilemma, which is incredible. And obviously the film enumerates all the harms and the problems with these digital social spaces that we now inhabit. But it doesn't talk about the solutions. And I think what Eli is doing with the festival is trying to say, how do we actually do this, and is this actually tractable?

[00:01:18]

We know this is a civilizational challenge, but there's so many people who are like putting their hands up and like, I'm ready. I want to be part of solving this thing.

[00:01:26]

Our minds have been so engineered to expect so little from the way that we engage online that we've kind of forgotten just how rich the alternative space is.

[00:01:36]

And I hope that the New Public Festival is a first step, one of many steps, towards all of us imagining not just the dilemma but the social solutions. I'm Tristan Harris, and this is Your Undivided Attention. Eli, welcome to Your Undivided Attention. I would love for you to talk a little about why this isn't such an intractable problem.

[00:02:11]

So first off, thank you so much for the work that you've been doing. It's been great through the years to be thinking about this together. When I wrote The Filter Bubble, I was looking ahead at this future where people were increasingly in their own information bubbles and polarized and not able to create a sense of common ground and common facts and was really worried about it, but also didn't have much to say about what do we do about this?

[00:02:39]

Why would that ever be important? That doesn't seem like it would become reality. No, I know, it was a real mess.

[00:02:45]

But to be clear, you actually spoke about this. Was it 2009 or 2010, or something like 2011?

[00:02:50]

Yeah, but it was early, and I got a lot of pushback from folks in tech basically saying, like, this problem's overblown. And it took, I think, until 2016 for people to really start to grapple with it seriously. So I've been doing this research with Talia Stroud, who's a communications professor at the University of Texas at Austin and my co-director at Civic Signals. And the central question that we started with was this question of: if you were a Facebook or a Twitter, you have a news feed, and you wanted to just optimize for a healthy, pluralistic society.

[00:03:25]

Like, let's say you don't care about clicks and eyeballs. You just care about building a healthy, pluralistic society. How would you do that? What would the metrics be? And we walked around that question for a long time. We talked to political scientists and sociologists, and finally Talia had this breakthrough where she said, you know, we're treating this as if it's an information problem — it's about how do we organize and sort information. And really, from a sociological perspective, it's not so much an information problem.

[00:03:51]

It's really a question of human relationships and how people form those relationships, those bonds of trust and connection that the information flows across. And when we started thinking about it that way, we started thinking, well, what if we think about digital platforms as spaces? When you think about people in space, you don't think about little facts zipping back and forth between brains. You think about all the funny, great things that humans do. We sit and we watch each other and we have all these nonverbal cues.

[00:04:21]

There's so much going on that's not just a pure matter of me pushing content to you and you pushing content to me. So that really led to this next realization, which was: hey, the question of how do you get strangers to behave well together? That's one of the oldest civilizational questions there is. Ever since humans started to build permanent towns and cities and settlements, there's been this question of, like, how do you design a common area for people to be able to speak and be seen and heard?

[00:04:53]

You know, one big jumping-off point for us really was urban planning and the sociology of cities, because especially when you get into cities, you start to have strangers interacting. Right? And I think a lot of our digital problems come down to how do you get strangers to behave well together.

[00:05:10]

One way I often think about this is, you know, if you didn't have any zoning laws in a city and you didn't care about urban planning and you just cared about what maximizes revenue and profit, your city would quickly look like Las Vegas. Right? And I don't mean to pick on Las Vegas, but the notion is that there are bright lights everywhere, there are signs blaring past, there's sort of prostitution, gambling — there's whatever works at creating the most sensory experience — without, say, zoning laws.

[00:05:35]

That is: is there a residential area? Is there a commercial area? Are there public parks? When you talk about building relationships, I think about the times that I've met you in New York, in Fort Greene, where you can just sit on a park bench and you can see someone next to you — an old woman feeding the birds who might come from a different religious background — or, you know, New York is such a diverse city and you can really see that diversity.

[00:05:55]

And there's nothing equivalent to sitting on a park bench and seeing someone feeding the birds on Facebook or on Tinder. Tinder is another place where there are stranger-to-stranger interactions, but there's a notion that Tinder is structuring stranger-to-stranger relationships in a particular way — as sort of a set of playing cards of faces that you can either like or dislike. So what you're really saying is there's a way that our cities can script certain social behaviors, script certain norms, create certain audiences that allow for some kinds of behaviors and not others.

[00:06:25]

I mean, a park without those park benches is different than a park with those park benches is different than a park with a fountain versus without a fountain versus a park with a piano at the center of it on a sunny day where people can sit down and play the piano for others. Yeah. Versus one without.

[00:06:39]

Part of what this metaphor has led us to is this notion that, like, we've got to stop just focusing on a few big tech platforms and start looking at what is the whole neighborhood that needs to be built. And so what the New Public Festival is trying to do is kind of highlight some of the people who are building those other parts of the neighborhood. Right? The parks of the Internet. The libraries — we'll have the Internet Archive involved as a great

[00:07:03]

use of public digital infrastructure. But I believe that until we have all of those pieces working together, we're not going to be able to build the digital social fabric we need in order to, like, survive as a civilization. And people are doing it. And that's part of what we're going to showcase over those three days.

[00:07:26]

Since the release of The Social Dilemma, one of the more encouraging trends I've seen is — there hadn't been a lot of people building alternative sorts of social products, things like MeWe or Clubhouse or, recently, Telepath, and there actually seem to be more of them and more attention on them, because I think there's a bigger consumer demand for different ways for us to publicly interact online that won't reproduce some of these problems. I mean, this is one kind of social space structured in a particular way, and it's very different than Clubhouse, which is, for those who don't know, a kind of moderated audio chat room, almost.

[00:07:57]

I think of them kind of like live radio broadcasting for you and your friends on topics. And you can kind of have moderation and bring someone into the room, and someone can raise their hand — we've got a caller calling in from New Jersey, Adoni from New Jersey, come on in, what do you have to say? And there are all sorts of experiments in how do we do social spaces differently to avoid the mistakes of the past — teen mental health issues, addiction, isolation, conspiracy thinking, alienation, the breakdown of truth and trust.

[00:08:25]

But one of the big things that might be worth kind of elevating here is we've never had a global public square before. And I wonder sometimes, is it possible to have such a thing? Are there even design constraints for a healthy digital global public square, or do we really need to do it more locally? And I think that's one of the differences between, say, Reddit, which has smaller channels moderated by individual humans who are thinking about what's good for that community, as opposed to the kind of Facebook model, or the Twitter model, where

[00:08:56]

there's one big global public square, and we build these massive watchtowers to shoot down, with these little laser weapons, all the hate speech we don't want or whatever. And that doesn't seem like a sustainable solution. So how do you think about the global aspect versus the local?

[00:09:09]

Yeah, well, I mean, for what it's worth — and this is said with all respect for the many really thoughtful and good people who are working inside of the Facebooks and Twitters and Googles — but the more I sit with it, the more impossible it seems to me the task is of coming up with an algorithm, or a set of algorithmic variations, that can host human life in two hundred countries. That just seems on its face extremely difficult or impossible. And I think some of that is why local governance and local identity has always been part of how we've managed.

[00:09:52]

Even when you build a federalized country or an empire, there is some room for people to have some agency and control over what their local space looks like and express who they are. That's a structural problem and a governance problem, and a problem that runs right against, I think, the business model, which focuses on scale as a primary objective. I think what we need to build will tend to be smaller and more discrete, because it's really impossible to hold norms and to feel a sense of ownership over something as big and diffuse as, like, the entire world's space.

[00:10:34]

And I think Reddit in that sense is kind of a really interesting federalized model, because the speech norms depend on which subreddit you're in, and you can jump into a lot of subreddits, but it's like traveling to a country: you have to follow the rules of that country now. And I'm not suggesting that it is without its problems, but I think it's a really interesting model. Reddit is really structured around these subreddits, these kind of verticals that are heavily moderated.

[00:11:03]

The rules are usually on the right-hand bar when you come into a space, and you can set whatever rules you want. And if it's a place where you can only post cats standing up, then if you post a cat sitting down, too bad, it's not going to be seen. But those aren't rules that Reddit, the platform, is setting; they're rules that moderators are coming up with together. And so there's a lot of room for experimentation and control over what do we want this space to be, who is welcome here, and how do we build that — again, if you sort of flip back and forth between how does this work in physical life and how does it work in digital life.
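A minimal sketch of that per-community rules idea — the community names and rule checks below are invented for illustration, not Reddit's actual implementation:

```python
from typing import Callable, Dict, List

# Each community defines its own rule checks; enforcement is local,
# not a single global content policy.
Rule = Callable[[str], bool]
community_rules: Dict[str, List[Rule]] = {
    "cats_standing_up": [lambda text: "standing" in text.lower()],
    "front_porch_style": [lambda text: len(text) < 500],
}

def can_post(community: str, text: str) -> bool:
    # A post is shown only if it passes every rule of that community,
    # so the same text can be welcome in one space and rejected in another.
    return all(rule(text) for rule in community_rules.get(community, []))

print(can_post("cats_standing_up", "My cat standing proudly"))  # True
print(can_post("cats_standing_up", "My cat sitting down"))      # False
```

The point of the sketch is just that the rule set, like a subreddit's sidebar, belongs to that community's moderators rather than to the platform as a whole.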

[00:11:39]

You know, there are a whole bunch of people whose job it is to mediate spaces.

[00:11:43]

So if you think about, like, a library — librarians aren't just checking out books. They're doing this very delicate balancing act of all the different constituencies that have to come together. And that's part of what makes libraries this incredibly socially generative place. But it's a skill, and it's something that people need to be paid for.

[00:12:02]

And right now, we've sort of imagined that everything can be done with these volunteer moderators, who do amazing work, but it's like we've basically taken this huge chunk of human work and pretended that we can do it all with volunteers. And I think it's not surprising, in that sense, that things start falling apart, because the people whose job it is to glue things together literally aren't there, or aren't being given the tools that they need. Yeah, I hear you saying that there are specific skills that people need to have to do that moderation.

[00:12:34]

Well, part of it — you use these examples of Reddit and the kind of skills it takes to be a moderator, a peacemaker, a mediator, pulling two people aside and being like, hey, let's actually work this out. How does this work in, say, Wikipedia, where there is actual conflict about a page, and there's sort of a social community there and a governance system in which a small community of mostly white, mostly male editors is famously making editorial decisions on behalf of billions of people?

[00:12:59]

And Wikipedia has a kind of federated model where you have different countries and languages — at least different languages, I believe — of the kind of world's encyclopedia. And so you have different-sized communities with different levels of moderation skill or sensibility. And somehow that's a kind of human-driven process. And as I understand it, in the early days of Wikipedia there was sort of an IRC channel, and people were back-channeling stuff and kind of hanging out in chat rooms — that kind of early feel of the Internet, of just meeting people and hanging out.

[00:13:25]

Actually, in 2006, I went to the Wikimedia conference and got to meet some of the early Wikipedians. And, you know, they're just such fascinating people — the kind of people who, if you went to their house, would have libraries full of books on subjects, and who tend to be, you know, the kind of people you would see in a library, or a college professor or something, the kind of people who are just by themselves working on those things.

[00:13:45]

And somehow they developed these relationships with each other. And that's very different than, say, how Facebook was built, which is one of the things I heard you bringing up was scale, that they actually literally had the imperative to grow as fast as possible. So it's almost like saying, hey, you know, that public square in the middle of the city, let's just keep expanding and expanding it forever and obliterating every other aspect of the city because of a growth imperative and a profit imperative.

[00:14:06]

And then meanwhile, we'll only pay for so many police officers or moderators of the public square. So when people put up soapboxes that really shouldn't be up there, we're only going to pay so many people to try to look for people with soapboxes. Suddenly the square is filled with QAnon soapboxes and everyone's shouting as loud as they can, and there's polarization everywhere — so that structure isn't working. And then there's — I heard you also say there's a governance question.

[00:14:27]

So how do you design that space? What are the right affordances? How big should the square be? Should there be soapboxes? Should there be soapboxes only at certain times?

[00:14:35]

There's a fourth-dimensional thing here, too, which is maybe on Saturdays or Sundays we allow for soapbox talks or something like that, but not every day. Right? Or work hours are different than non-work hours. So there's a design question, and then there's the governance question. So if there's a problem in my public square — say I'm online and I'm in the Philippines; we had Maria Ressa, the Filipino journalist, on — how does someone in a country adjudicate:

[00:14:57]

hey, we don't like the way that this norm is set on Facebook for our country, our content policy? And right now — whether it's the Facebook trust and safety team or the Twitter trust and safety team or the TikTok trust and safety team — you have to appeal to these opaque authorities that, I believe, that both you and I believe, are doing the best they can, but have created a kind of unmanageable task for themselves.

[00:15:18]

I mean, I think this is — it's an impossible setup to be, like, a global, private, venture-backed tech company trying to host the global public conversation. I just think we'll look back at this era and feel some sadness and amusement that we tried this model, because what is a library if you optimize it to be a venture-backed institution? It's Amazon, but it's not a library anymore. And that's because a bunch of the really important community work that libraries do — helping people find access to services, coming into contact with each other —

[00:15:55]

all of that is super not revenue-driving and super not engagement-driving. Sometimes part of what a public institution does is serve people who may be having challenges, who may not be able to get access anywhere else. And those are the first people that you cut if you're trying to build a sleeker, faster, more up-and-to-the-right growth model. This current paradigm in tech — it's a setup for, you know, Twitter and Facebook, and it's also a setup for all of the users.

[00:16:23]

It is obviously not possible. And so then that opens up this question of, well, what can it look like? And, you know, you referenced Wikipedia. When you start to really think about Google and Wikipedia as a kind of model, I think it's really interesting, because there are things that Google does do extraordinarily well as a private company and as a public service. But coming to the definition of truth for a given topic — we wouldn't trust Google if it said that this was the authoritative page on Joe Biden or Donald Trump. Like, who's making that editorial call?

[00:16:58]

And it's only because Wikipedia has this model, which builds its own legitimacy, that it makes for a good search result. So to me, what we need to start thinking about is how do we build those kinds of symbiotic relationships in social space. And I'll give you an example — and they'll be at the festival. There's an organization in Vermont called Front Porch Forum, and Front Porch Forum is — it's arguably social media, it's arguably like a fancy email newsletter for each town.

[00:17:29]

But it's basically a heavily moderated conversation space for every locality in Vermont. Two-thirds of people in Vermont use it. It's been growing even while Facebook's been growing. And why? Because you can only post once a day and it's all very carefully moderated, and so the quality of the conversation is good. And if you want to have a flame war, you have to be willing to do it for, like, two weeks. You have to stay mad for a really long time, which is hard — stay mad and engaged and able to follow the norms that have been set. And there are actually people who will send your email back to you if you don't follow them.
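Here's a toy sketch of those two mechanics — the one-post-per-day limit and the human moderation queue. It's purely illustrative, not Front Porch Forum's actual system:

```python
from datetime import date

last_post_date: dict[str, date] = {}          # member -> date of last accepted post
moderation_queue: list[tuple[str, str]] = []  # posts waiting for a human reviewer

def submit(member: str, text: str, today: date) -> str:
    # Throttle: each member gets one post per day, which slows flame wars down.
    if last_post_date.get(member) == today:
        return "rejected: one post per day"
    last_post_date[member] = today
    # Nothing is published directly; a person reviews it against the norms first.
    moderation_queue.append((member, text))
    return "queued for a human moderator"

print(submit("alice", "Lost dog near the town green", date(2021, 1, 12)))
print(submit("alice", "Also, free firewood!", date(2021, 1, 12)))  # has to wait a day
```

The friction is the feature: arguing becomes slow and effortful, while ordinary neighborly posts are barely affected.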

[00:18:05]

Could they change some of those rules and drive engagement way up? Totally, because arguing creates lots of engagement, but it doesn't create a sense of healthy community conversation. So I think the question is — it doesn't necessarily have to be an either-or. I think it's a both-and: how do we build more spaces that can do that kind of work? And then there are also going to be the TikToks and the Instagrams and the Facebooks that are doing some of the things that they do well.

[00:18:35]

But let's not pretend that we can do all of those things within one kind of institution.

[00:18:40]

Mm hmm. Yeah. In The Social Dilemma, Bailey Richardson, who was early at Instagram and is a whistleblower featured in the film, talks about how the Internet used to be this wild and wacky place in the 90s and early 2000s, when I think you and I first kind of got into it, or maybe far before then. I remember being on newsletter lists where you just had these great conversations. I was learning to program, and I was on these early Mac programming newsletters that no one was on, and people would help me and mentor me.

[00:19:08]

I had mentor relationships with people that I met there. And then gradually, over time, the kind of wackiness and the creativity and the animated GIFs and anything-you-want — you know, the MySpace sort of generation of these things — turned into this increasingly commodified, unified, single-value, single-skin, single-aesthetic strip mall.

[00:19:28]

And so it's kind of a strip-mall-ization — they're sort of generic big-box stores, and we can fill them with anything we want. And I think that's one of the things I hear you bringing up: how that desire for scale also forcibly creates commodification, because you can't have that level of diversity and have it unilaterally scaled by one entity. But maybe you could go back and talk more about what you were studying — urban spaces as inspiration for virtual spaces. Could you give us some examples of the diversity of urban spaces and the scripts that we have for strangers interacting?

[00:19:59]

Sure, yeah.

[00:20:00]

So there's this big conversation in urban planning about the design of the physical environment versus what they call programming. And programming in physical places is literally, like: is there a band, is there a farmer's market, when do we create what kinds of activity? It's not requiring anyone to do anything, but it's creating these hubs and these attractors that help shape how people are using a space. And there's such a difference between a beautiful big bandshell where you haven't really figured out how to populate it with musicians, and a bunch of musicians on a subway platform who enliven the whole space and bring people together in this joyous way and this sense of connection.

[00:20:41]

When you think about that metaphor in digital life, one of the things that we lack in a lot of these spaces is programming. What programming happens, happens because someone is saying, hey, look at me, or someone's sort of leading a charge. But often they're selling something. They're not designing it with the goal of: how do we bring all of the different constituencies of this community together into proximity in ways where they see each other having fun? I mean, this is, I think, one of the misunderstandings that we have about how to heal the rips in our social fabric — that it's like, if only we could talk better about politics, then that would be the solution.

[00:21:18]

And I actually think, again, when you look at the civic spaces we have, they're not made for, like, talking with strangers about politics. That's a tiny little chunk of what we do. It's important — we do it at town halls, there are protests, and those are important as ways of representing public intent. But a lot of it is being together, seeing each other, and just learning to feel OK with each other in this very non-charged way — that sense of the familiar stranger.

[00:21:47]

It really changes how you feel about what behaviors are OK. You feel like, OK, I don't totally know this person, but maybe they would have my back if I get harassed or if something goes wrong, and you're maybe a little more willing to talk. And so there are all these dynamics that start to kick in when you feel like, OK, I'm actually in this community that is intentionally thinking about how to welcome everybody and intentionally thinking about how to build a sense of relationship between everybody.

[00:22:14]

And so that's just one example of where I think this metaphor is powerful. But I think the other piece is that so much of the spaces that we have digitally are envisioned and designed by a particular set of people, who are, you know, college-educated American white dudes like you and me. And that's a real limiting factor in what kinds of spaces we have, because even if we're really trying our best to be the most empathetic and the most thoughtful about other people's experiences, we're going to tend to design for ourselves.

[00:22:48]

And so how do we open it up? If we're not trying to solve for one platform to rule them all, but we're trying to solve for a whole ecosystem of institutions that help serve a community's needs, it opens up a lot more space for lots of different folks to be creating and designing.

[00:23:06]

The point you brought up earlier is so incredibly important — that our physical public spaces are not all designed so that we can just have the maximum number of political conversations. Right? Like, you know, when I go to Fort Greene or Union Square in New York City or Golden Gate Park in San Francisco, it's not just, hey, here's more park space where everyone's sitting down in circles yelling at each other about politics. But when I come online on Twitter or Facebook,

[00:23:29]

it's almost unavoidable that that activity has not just primacy but is sort of a self-reinforcing feedback loop. The equivalent would be some kind of physical space where, the more people have their first conversation about politics, it immediately creates this strange attractor that accelerates and amplifies and grows and then pulls in more people. Then suddenly signs start showing up everywhere, inviting people who are not in the park to come into the park to have a political conversation.

[00:23:52]

And it's sort of like an amoeba that just keeps feeding upon itself, like this hungry ghost that just wants more political conversation. A lot of people at Facebook and Twitter are trying to figure out how do we depolarize people or create unifying, harmonizing conversation. And you have this great insight in saying maybe that shouldn't be the goal at all. Maybe it should be: let's go do something together. Let's go play with our kids together in a park. Let's go clean up the litter in the subway together.

[00:24:15]

Let's go build a space for homeless people. Let's go create something together. Let's play music together. Let's sing together in a choir. These are activities where our politics aren't even really a part of what matters in the human relationship building. And I think that's just a really interesting thing. Like, what would social networks look like if they weren't even about that — if they even put a kind of tax, sort of, on political conversation, because that's not really the place where we would have it?

[00:24:43]

I don't know — I think it's interesting. I mean, I think it's: how do you prioritize some other things first? And spaces for dissent, spaces for people to come together and realize that the oppression they've felt is shared — that's part of what we use public space for, too, and that's part of how power gets rebalanced and how inequality gets addressed. So that's important, too. But I guess the point is, it's not the only function and it's not happening all the time.

[00:25:16]

And I think you need a lot of the relationship building for each moment of, like, hacking through the hard, knotty, thorny issues as a society. And so we've got a lot of centrifugal forces pulling everybody apart, and not a lot that's, like, the places where we're all coming together and finding some sense of common community. And if you think about how we plan public spaces in cities, a lot of it's basketball courts and baseball fields and soccer fields — and again, that's important.

[00:25:46]

That's not a game. Yeah, right. That's not just a nice-to-have — that's where community happens. But I think in particular it's: how do you create spaces that everyone's welcome in and invested in and feels like they're a part of? And that's really hard to do well and conscientiously, especially when you're trying to grow as fast as you can.

[00:26:09]

We learned these lessons in the cities that didn't work in the seventies and then started working by the nineties.

[00:26:14]

What are those lessons that we need to apply here? Yeah, well, saying that there's a lot to learn from the history of cities and urban planning and public space is not to say that humans have nailed these problems or done them perfectly, now or previously — these aren't all totally solved problems. But you do see some of the same tropes come up again and again. One is Jane Jacobs, who is one of the early pioneers of kind of modern urban thinking. What she was reacting to was Robert Moses, who was trying to optimize the whole city for the automobile and for the white middle class, and ended up destroying whole neighborhoods and segregating New York City in this really extraordinary and destructive way.

[00:26:59]

And so there's a lesson there, too, which is that the Moses approach was this kind of top-down technocratic approach that did not give communities a lot of agency, involvement, autonomy or respect. And, you know, we're seeing, I think, some of those same patterns play out again. It's one of the challenges of kind of monolithic technocratic design. We don't need to repeat that whole cycle.

[00:27:27]

Yeah, I think you're referring to the work of Jane Jacobs' famous book, The Death and Life of Great American Cities, and the history of the different ideas we've had about what makes a good city work. I think she also highlights Le Corbusier, the French architect, who idealized what — I have my notes here — is called total city planning and "big is beautiful," where he had no patience for the opposite: for the physical environments that centuries of urban living had created in this sort of slow evolutionary process, all the complexity of those local town networks and structures. It's like, this is the mess that we are in now;

[00:28:02]

there is no solution to be found here. And his new vision was modeled after and inspired by the authority of the machine — of the steamship, the airplane, the automobile, the factory — and simple, repetitive lines that erased the horror of this sort of evolutionary complexity of these old cities. It just kind of accrued and accrued, I think, like software, where you just build layers upon layers upon layers and it's like, oh my God, can't we just start over?

[00:28:25]

And so both Le Corbusier and Robert Moses in New York kind of epitomized this high-modernism aesthetic, where we're going to top-down decide and plan the entire city, versus the approach of Jane Jacobs, who, especially in the 60s, really admired and was noting what are the patterns that make the most livable cities the most livable. I think she famously found Greenwich Village to be the epitome of the most livable city — where you have strangers who may not even know each other,

[00:28:55]

but there's a sense of safety, and well-lit streets, and there are stoops and people hang out on the stoops, and there's the notion of eyes on the street. So instead of having to have police everywhere, you have this notion that there's sort of an ambient level of: you would be seen if crime occurred — you'd be seen by someone on the street, and we're all looking out for each other — which gives this sort of felt sense of safety and protection.

[00:29:16]

You take any kind of Twitter hate fight now, and there's this sense that everyone can be yelling at maximum volume and throwing whatever they want at each other. And these end up as very different structures and forms. Anyway, I think this is the kind of history you're referencing — like, what are the lessons that we can pull out of some of these different structures?

[00:29:35]

Yeah, and I think one notable and recurring theme is that with those movements and the people who led them, it was kind of like a singular auteur vision, usually by a dude, about here's how life should be for everyone. And I think that leaves a lot of things out and a lot of people out, and also just doesn't give a lot of room for the wisdom that exists on the street about how life might be best lived.

[00:30:05]

And so that's why, I think — I mean, at the end of the day, one of the things that public design does, or should do when it's successful, is that you do have some sense of the community weighing in and helping to design and think about what they want it to be. And that's an incredibly constructive and powerful thing, when you do that and it's not "here's this nice white space that we built for you, go play." It's: what are we going to do together and how should we build it together?

[00:30:36]

Yeah, totally. And I think one of the things I really loved about your most recent TED talk about this is the notion that for a while it looked hopeless. I think you used the example of New York City in the 60s or 70s — I think it was the 70s — where they had the highest crime rates and it was the dirtiest it had ever been, and the pollution was awful. And a lot of people could come to the false conclusion that, I guess, this is just what happens to cities: cities are dirty, polluted, unsafe, graffiti everywhere.

[00:31:04]

And then I think the point of your work here is that we actually ended up learning, through lots of experimentation, patterns that actually mattered. You could have big cities with eight-million-plus people in them while not having them devolve into this chaos. I mean, a good example of this is when there's very little litter to start with and people are pretty good about keeping it that way.

[00:31:25]

We need to start with a good set of parameters that doesn't encourage people to pollute or litter with information that they don't know to be true.

[00:31:33]

Yeah, the notion that we need to be able to form norms — which is, I think, what you're getting at — and have them noticed. You know, spaces where there's some sense of: here's what we do here, here's how we behave. Absolutely. And I guess I think that starts even a level down, though. It starts with: I know you, and you're a good guy, and you're around. And so, like, if you say some crazy thing, I'm not just going to haul off and punch you, because we have some sense of relationship and some sense of investing in that relationship.

[00:32:06]

And I'm maybe going to, like, pause and try to understand where you're coming from. Right.

[00:32:10]

I think one of the things that's gone wrong in social media is the lack of forgiveness and mistakes being career ending. Right.

[00:32:18]

The notion of — not really just cancel culture, but the notion that if you say the wrong thing once, there's no room for trying something out. You know, you think of a child — and I'm talking about adults in politics too — but if you think about a teenager who's going to try out being more playful, or a trickster, or experimenting with a different identity, they make one wrong move and suddenly it goes viral, and their entire school basically is ready to pounce on them and say, you know, you can't do that.

[00:32:40]

And it makes this kind of culture of risk aversion among children, who can't try on different identities. And I think you can extrapolate that up to our global political moment, where people have views they want to work out and talk about, but everything feels so charged and radioactive. One of my favorite blog posts in the realm of thinking about how to deal with situations like this — Nick Punt, who I think is an alumnus of the Stanford Graduate School of Business, has written a post about what if Twitter were redesigned for when we make mistakes: how to sort of say and acknowledge and publicly apologize for a mistake, but in a way that's rewarded with growth, not further vitriol.

[00:33:17]

Because I think the current system is: you make a mistake — and I think we've actually talked about this before — and it's very hard to publicly say that and then feel like the anonymous mob, which is incentivized to create a black-and-white caricature, will offer that forgiveness. And he actually games out what would happen if there were, instead of a tweet button, an "I made a mistake" button, and also how that can be weaponized. And, you know, it's not all good.

[00:33:40]

You can read it — and people can use it ironically and say something that's not offensive and then say, I made a mistake. But I like that kind of thinking, because I think what we need in our public spaces and democratic spaces are spaces that respect our full humanness, which means experimentation, growth, trying things on. It's not, as you said, just as simple as information; it's privileging what it means to strengthen our relationships, so that when we make mistakes, people trust that there's a good person underneath.

[00:34:05]

And that's what we're seeing in each other. Even when we might disagree with someone's shared statements. Totally.

[00:34:12]

And the good news, again, is that there are lots of exciting solutions to that problem. We're seeing, with the decentralized web, with different groups of creators that are starting to create their own spaces — we're starting to be able to look beyond this one totalizing vision, and we're starting to see what the future of social communication can be. And so how do we lift up all of the folks who are doing that really important work?

[00:34:37]

I wanted to give one quick example of another one that you mentioned, modifying social norms.

[00:34:41]

One of my favorite early examples of this, for quality sensemaking, was a Reddit channel called Change My View — which became an independent company called ChangeAView, and then Ceasefire. And it's a young founder, 19 years old at the time, I think Scottish, who built this whole system where the norms are:

[00:34:58]

you get more points on that Reddit channel — like, you get ranked up as a top user; they're called delta points — the more you change other people's minds. So the way that the system works is, when you post, you'd say: I would like someone to change my mind about climate change and economic growth, because I see a fundamental tension there — we can't solve climate change without also dealing with economic growth. It's not about saying, this is my opinion and you're all wrong and I want to see more and more self-reinforcing news that makes me feel I'm right.
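To make that mechanic concrete, here's a minimal sketch of how delta-style reputation might be tallied — illustrative only, with made-up usernames, not r/changemyview's actual code:

```python
from collections import Counter

deltas = Counter()  # replier -> number of minds changed

def award_delta(original_poster: str, replier: str, mind_changed: bool) -> None:
    # Reputation flows toward persuasion, not toward volume or outrage:
    # a delta is only awarded when the original poster says their view changed.
    if mind_changed and replier != original_poster:
        deltas[replier] += 1

award_delta("op", "thoughtful_reply", mind_changed=True)
award_delta("op", "snarky_reply", mind_changed=False)
print(deltas.most_common())  # [('thoughtful_reply', 1)]
```

The leaderboard, in other words, ranks people by how often they changed someone's mind rather than by how much engagement they generated.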

[00:35:24]

The whole premise, the core interaction type, the social norm, is: I want people to change my mind. And then literally the rewards flow in the direction of people whose long-term reputation is that they were the top-ranked answer for changing people's minds on various topics. And that's a small example, I think, of changing the rules of the game — the social script — within a Reddit channel. So you mentioned that there are healthier spaces that have already emerged online, but do they add up to a digital equivalent of Greenwich Village?

[00:35:53]

I mean, what do we need to get to something like that?

[00:35:57]

Well, I think certainly we all have examples of pockets or communities — maybe it's a subreddit or a really well-run Facebook group — you know, there are these little slivers of healthy community.

[00:36:10]

And part of what Talia and I have really been focused on is, one, what are the qualities that those share so that we can start to really think about, like evaluating which platforms are doing a better or worse job of building those kinds of spaces. And we'll be releasing some of our research on that at the festival. But then how do you start to build more of them? Part of that is about kind of building the tools and the technical infrastructure.

[00:36:35]

But part of it also really is about building the human labor force, and recognizing it and supporting it. You know, you need to build the building for the library, but it doesn't work very well if you don't have people who work there who are experts. And I think we've imagined in digital life that we can abstract away all the experts in how to help people get along — the social workers, the librarians, the editors and producers who help make sense of information —

[00:37:04]

and we'll just do that with volunteers and it'll be fine. And I think it's not fine.

[00:37:08]

And so how do we rebuild those institutions and bring them into this conversation? That takes money. It takes resources. And some of that, I think, probably should come from companies like Facebook and Google that have displaced a whole bunch of social infrastructure through their economics — and that's now part of their margin. Maybe we need to take a bit of that and use it to fund the functions that are missing — you know, not to tax them to death, but just to, like, reset the equilibrium of how we build a healthy civil society.

[00:37:46]

Totally.

[00:37:46]

I mean, it reminds me of other extractive industries — energy. We were giving this example before: you know, whether it's Con Edison in New York or PG&E in California, when you have a business model in which you make more money the more the toxic thing happens — which is that you waste lots of energy. So, like, it's fine if you use a small amount of energy and you pay PG&E or Con Edison for the amount of energy you use. But if they literally make more money the more energy I waste — leave all the showers on, leave all the lights on — you know, that's a wasteful business model.

[00:38:14]

But the way that they handled that is by saying: past certain seasonal limits on energy, they don't actually pocket that money — it goes into a regeneration fund, to fund the transfer from existing fossil fuel infrastructure for energy into renewable infrastructure. In other words, as people are paying for their energy, and they pay more the more they use — which disincentivizes them from using more — that money doesn't just make PG&E more profitable. It gets put into this collective fund to build renewable energy infrastructure for the whole.

[00:38:41]

And you can imagine, if you transfer that logic to something like a Facebook or a Google — to the extent there is advertising, and obviously the business model is one of the core issues here — let's imagine that after 30 minutes a day of usage, all monetization gets put into a public fund that revitalizes journalism, revitalizes research for universities and your group and others that are doing this sort of sociology of how do we make new digital public spaces, because that should be funded through the pollution-level extraction of the tech companies.
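Back-of-the-envelope, the split being imagined looks something like this — the 30-minute threshold comes from the conversation, but the per-minute revenue figure is invented purely for illustration:

```python
def split_revenue(minutes_used: float, revenue_per_minute: float,
                  free_threshold: float = 30.0) -> tuple[float, float]:
    # Revenue from the first `free_threshold` minutes stays with the company;
    # everything earned past that point is diverted to a public fund.
    company_share = min(minutes_used, free_threshold) * revenue_per_minute
    public_fund = max(minutes_used - free_threshold, 0.0) * revenue_per_minute
    return company_share, public_fund

# e.g. 75 minutes of daily use at a hypothetical $0.002 of ad revenue per minute
company, fund = split_revenue(75, 0.002)
print(f"company keeps ${company:.3f}, public fund gets ${fund:.3f}")
```

The structure mirrors the utility decoupling just described: the marginal dollar from overuse no longer improves the company's margin.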

[00:39:13]

So, like, what are the big questions at the New Public Festival that you think are important for people to answer? I mean, this is a huge area of work and research, and you're putting together this three-day conference on it. What are some of the big questions you've outlined that you want everyone to participate in helping to answer?

[00:39:31]

So we're so excited about it, because part of what we're trying to do — and it'll probably be, like, some awesome things and some crazy experiments that don't totally work — but we're really trying to explore, in the design of the festival, how do we build something that's not your normal broadcast-a-thon, but something that's a little bit more participatory, building a virtual space of our own. And we'll actually have kind of a park map where the festival happens, that helps us orient in space and engage together.

[00:40:00]

And we're really trying to explore how do we get outside of the Zoom box. We're working with a bunch of artists to think about all sorts of different ways that people can relate, give feedback, engage with each other — using their mouse cursors, using colors, using different forms of conversation. There will be parts of the virtual park where we can all kind of mark up an area together and look at what each other are thinking.

[00:40:28]

So we're trying to build a sense that we're actually all thinking together, rather than there being a few people who have the answers and everyone else in the audience applauding them. One of the things I'd love to see come out of your work — and maybe we collaborate on this — is a kind of design patterns library: little example screenshots, videos, little demos, a little mixture of here's what New York did, here's what Chicago did.

[00:40:55]

Here's this little tiny social network you've never heard of that has this kind of little interaction style — so there's this sense of possibility, documenting and cataloguing all the different shades and colors that are on our palette when we make social spaces for addressing these different problems. Coming out of this space, how do we create this catalog of just inspiration, so that there's a lasting and visual sense of what is possible and you don't have to just attend the three-day conference to see it?

[00:41:21]

And that's one of the things that we'll be doing, actually. So you nailed it. But in terms of the questions: we're basically starting with hearing from some really extraordinary urban planners and designers and folks who have been really thoughtful about designing for communities. What does public space do? What separates healthy spaces from unhealthy spaces, both in the physical world and online? And I will be sharing some of the research that we've been doing, looking at what are the kinds of qualities that a lot of healthy public spaces share.

[00:41:54]

We're going to talk about safety and security, and that's obviously a very hot and complicated and interesting topic right now. You know, it's critical for people to feel safe in order to speak, but that can also be a hindrance to really having the conversation, or a way of locking people out. And so we've got this really rich set of folks who are going to address that. And then we're doing this World Café where we've got a bunch of amazing thinkers, including you, who are going to kind of offer these provocations about what the future of public life online could look like.

[00:42:32]

And we'll all be able to kind of explore that together. We're going to do a showcase of what the digital future might look like or feel like, and then really start thinking ahead 10, 20 years — like, what might it be like if we really can change the paradigm — and we've got some brilliant political scientists and designers and futurists who are going to kind of set that conversation in motion. And so the hope is it's not a conference where you'll learn, like, the neat design trick for extending engagement or whatever, but one that offers some real inspiration and grist for people who are thinking about how do we build this differently, and also some community and connection.

[00:43:15]

Because what's exciting — and you see this, too — is that we know this is a civilizational challenge, but there are so many people who are putting their hands up and saying, I'm ready, I want to be part of solving this thing and contributing and lifting up solutions. And so hopefully this can be a place where folks like that meet each other, get inspired and start building something. Totally.

[00:43:37]

I think we've seen that now more than ever, especially just in the inbox of messages we see from people after watching The Social Dilemma: everyone is hungry for solutions. They want to say, what are the alternatives? I want to switch my kids to a different platform. This is actually one of the big issues: after seeing The Social Dilemma, all these high schools and colleges and K-through-12 schools — their kids are all on, whether it's TikTok or Instagram, and they don't have another alternative to connect with each other on.

[00:44:04]

And imagine — there could easily be an alternative that's built more by child psychologists who are actually even thinking about where it is not appropriate to have social media at all. Just to make sure we're saying that: I think it's actually developmentally dangerous at some fundamental, core level. But at some age, what are the kinds of spaces that work well? You know, when I was growing up, in my elementary school, I remember an education software company coming in and doing some experiments with us, and they had child psychologists who were testing these spaces that they created for us, these little games — I think they were testing with our school whether they were actually appropriate for the kids.

[00:44:35]

And you don't see that kind of care, that kind of Sesame Street, values-driven approach — we're not doing Sesame Street to maximize engagement for children. In fact, it's the opposite. Elmo gets up and says, now you can do a little dance, which is sort of this trick to get the kids to stand up out of their sofa chair. It's a stopping cue, if you like, instead of the infinitely scrolling feed of just watching Sesame Street.

[00:44:54]

They literally say, now let's get up and do a little dance with Elmo. And that's the sort of cue to create a space and a break, so that now maybe you have a conscious choice — your body's literally in a new state in space, where the choice to kind of leave and do something else is available. I'm just excited about the diversity of people and thinkers who are coming together. And in my mind, I can already sort of see it, like, in a dream world.

[00:45:14]

It's like Audrey Tang, Digital Minister of Taiwan, creates the political — you know, sort of Athens auditorium for political discussions about how do we govern ourselves.

[00:45:23]

There are just so many aspects of our social space — I think what you're trying to show is that there's a fertile ground of infinite alternative possibility, and our imagination is the thing we have to reclaim, because we've been living inside of these commodified spaces for so long that we've forgotten how to reimagine, dream big and think about what's really possible. So we've ceded the role of, like, really having ambition and imagination for the future to tech entrepreneurs. Right? And I say that with respect — a lot of those visions are amazing.

[00:45:56]

But there's a similar tradition that isn't given as much credit, I think, of ambition for new kinds of public infrastructure. When groups of people in the Midwest in 1910 had this crazy idea of, like, let's make secondary education — high school — free for everyone, that spread rapidly across the whole country and empowered the whole mid-century economic engine of America. That was a public intervention, an incredible invention, and kind of breathtaking in how quickly it was adopted.

[00:46:33]

But, you know, it was a different type of imagination. And public parks are another example. Libraries are another example. And it feels like right now — that's part of what we're trying to say — it's really time to think on that scale. Like, what are the institutions that this generation is going to leave behind that do for us what those institutions did in terms of moving the country forward, and that do it in an equitable way?

[00:46:59]

That's the project that we're hoping to kind of be a small part of. If you're interested in coming to the New Public Festival, you can join us — everyone's welcome. It's open to the public and it's free to attend. The participatory tier is going to fill up pretty quickly, but everybody's welcome to join the really great livestream as well. And again, you can register on the New Public Festival site. Eli, I wish you the best of luck with the New Public Festival, and I will look forward to participating myself.

[00:47:33]

Thank you so much for coming today.

[00:47:35]

Thanks so much for having me on, and for everything you're doing, Tristan. Your Undivided Attention is produced by the Center for Humane Technology. Our executive producer is Dan Kedmey, and our associate producer is Natalie Jones. Noor Al-Samarrai helped with the fact checking. Original music and sound design by Ryan and Hays Holladay. And a special thanks to the whole Center for Humane Technology team for making this podcast possible. Our very special thanks to the generous supporters of our work at the Center for Humane Technology, including the Omidyar Network, the Gerald Schwartz and Heather Riesman Foundation, the Patrick J.

[00:48:16]

McGovern Foundation, the Evolve Foundation, Craig Newmark Philanthropies and the Knight Foundation, among many others. Huge thanks from all of us.