[00:00:03]

Let me leave you with TED Talks Daily. Today's talk is crucial for these divisive, toxic times that so many of us wish we weren't part of.

[00:00:12]

Activist and former CIA officer Yaël Eisenstat dug into the extreme polarization we find ourselves in and the way social media platforms profit from turning us against each other. In her TED 2020 talk, she raises the alarm, but she also offers a way out of the toxicity so we can turn toward healing our societies and bridging differences instead.

[00:00:37]

Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so, just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home.

[00:01:11]

As a former CIA officer and diplomat who spent years working on counter-extremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary.

[00:01:24]

And so I started digging in and I started speaking out, which eventually led to my being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us, and to talk about how to reclaim our public square. I was a foreign service officer in Kenya just a few years after the September 11th attacks, and I led what some call hearts and minds campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging.

[00:02:00]

I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists. And while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults. And in some cases, we even worked together on areas of mutual interest.

[00:02:20]

The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want. So what I see happening online today is especially heartbreaking, and a much harder problem to tackle. We're being manipulated by the current information ecosystem, which is entrenching so many of us so far into absolutism that compromise has become a dirty word.

[00:02:52]

Right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible. And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own.

[00:03:28]

So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election. So I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this. When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides.

[00:04:08]

Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media. Right now, the people who are sucked down these rabbit holes of social media outrage often seem far harder to break out of their ideological mindsets than those vulnerable communities I worked with ever were.

[00:04:39]

So when Facebook called me in 2018 and offered me this role heading up its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all. But when offered the opportunity to help steer the ship in a better direction, I had to at least try. I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in election interference efforts, which was Russia's tactic ahead of 2016.

[00:05:12]

So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it. Now, I still do believe in the power of the Internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies, as currently constructed, are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions.

[00:05:46]

In a business where optimizing engagement and user growth are the two most important metrics for success, there's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something and question their own assumptions before engaging. The unfortunate reality is that lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.

[00:06:29]

And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent Wall Street Journal article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money. The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process.

[00:07:17]

We're then bombarded with information confirming our views, reinforcing our biases and making us feel like we belong to something.

[00:07:26]

These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of persuading their behavior. Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day there, my title and job description were changed, and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected.

[00:07:59]

And so I lasted just shy of six months. But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization.

[00:08:32]

And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society, and agreeing to alter the entire product and profit model. So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.

[00:09:17]

And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theories, the hate groups, the purveyors of disinformation and, yes, in some cases, even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed.

[00:09:51]

And they could do all of this without becoming what they call the arbiters of truth. But they have made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law, because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit Internet companies.

[00:10:30]

Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking? I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy, but not the way it's happening right now. And it bears emphasizing: I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society.

[00:11:06]

It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening. You see, I want these companies held accountable not for whether an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.

[00:11:59]

I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and, hopefully, inspire more people to demand this accountability. My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority.

[00:12:34]

We need a whole-of-society approach to fix this. And my message to the leaders of my former employer, Facebook, is this: right now, people are using your tools exactly as they were designed, to sow hatred, division and distrust, and you are not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe. But that doesn't make any of this OK.

[00:13:07]

And it's only getting worse as we head into our election and, even more concerning, face our biggest potential crisis yet if the results aren't trusted and if violence breaks out. So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy.

[00:13:43]

You own your decisions, but you can no longer say that you couldn't have seen it coming. Thank you. PRX.