Transcript
[00:00:00]

This is Hidden Brain, I'm Shankar Vedantam. In the popular imagination, leaders are visionaries. They're single-minded, like Morpheus in The Matrix films. They know exactly how to rally the troops when catastrophe looms.

[00:00:16]

[Morpheus speaking] Tonight, let us make them remember. This is Zion. And we are not afraid.

[00:00:28]

Or take The Wolf of Wall Street in which Leonardo DiCaprio plays a ruthless stock trader. At one point he gives a motivational speech to his sales team.

[00:00:38]

Are you behind on your credit card bills? Good. Pick up the phone and start dialing. Is your landlord ready to evict you? Good. Pick up the phone and start dialing. I want you to deal with your problems by becoming rich.

[00:00:55]

In Hollywood's imagination, leaders often have messianic vision and unshakable conviction.

[00:01:02]

I can't accomplish a god damn thing of any human meaning or worth until we cure ourselves of slavery and end this pestilential war.

[00:01:16]

Especially in the United States, we celebrate certainty and look down on those who express doubt, caution and hesitation.

[00:01:29]

This week on Hidden Brain, we examine the psychological origins and the unforeseen consequences of certainty. And we'll explore the magic that happens when we replace certitude with curiosity.

[00:02:00]

Adam Grant is an organizational psychologist at the Wharton School. He's the author of Think Again: The Power of Knowing What You Don't Know. He's interested in the question of obstinacy. Why do so many of us find it difficult to question our own beliefs and challenge our own views? Adam Grant, welcome to Hidden Brain.

[00:02:20]

Thank you, Shankar. Such a treat to be back here.

[00:02:23]

On the face of it, if I had known you as a kid, I might not have predicted that you would grow up to write a book about the virtues of humility. As a kid, you say you earned the nickname Mr. Facts.

[00:02:35]

You were very knowledgeable, but it sounds like you yourself had trouble admitting that you were ever wrong.

[00:02:41]

I think I was a pretty annoying kid. It must have been really unpleasant for my friends, too, because I spent so many hours trading baseball cards.

[00:02:51]

And, you know, I'd say, well, this Mark McGwire rookie is not worth what you think it is because he only had a .282 batting average in 1989. And I'd just rattle off these random statistics. And I think most of the time my friends could care less. And over a couple of years,

[00:03:11]

I learned the hard way that this was not serving me well if I cared about people enjoying interacting with me. And also if I cared about my own learning, the moment that comes to mind is I must have been in seventh grade.

[00:03:25]

I think I was 12. My friend Khan was on the phone with me. It was a commercial during Seinfeld and we got into an argument. I don't remember what it was about. And I just refused to give in, even though he had really good proof and eventually he hung up on me and I called him back and I said, did the power go out?

[00:03:45]

Why did the line drop?

[00:03:46]

And he said, I won't talk to you until you admit you're wrong.

[00:03:52]

Yeah. And we had a habit of talking every day, so it was a big moment. And by that point it was clear that he was right. But I was still having a hard time admitting it, and that bothered me.

[00:04:06]

Adam no longer has a big problem when it comes to admitting he's wrong, but he does still suffer from a related problem. He finds it hard to challenge other people when they do or say something wrong.

Well, I think some of this stems from my personality. I'm a pretty agreeable person. Agreeableness is one of the major dimensions of personality around the world. We think of agreeable people as warm, friendly, polite, welcoming, Canadian. Disagreeable people tend to be more critical, skeptical and challenging.

[00:04:43]

And they're also statistically over-represented among engineers and lawyers.

[00:04:48]

And as long as I can remember, I've been agreeable. And it's weird because on the one hand, I hated admitting I was wrong and I was extremely stubborn.

[00:04:59]

But on the other hand, I really liked harmony and I wanted to get along with other people.

[00:05:04]

And I can think of multiple situations where I've been sitting in an Uber or Lyft and the air conditioning is just blasting, and it's just too uncomfortable to ask the driver to turn it down because it's not my car.

[00:05:18]

So I'll just sit there with my teeth chattering until the ride is over.

[00:05:26]

And being agreeable seems like a good thing, but you also point out in the book that it can come with a downside.

[00:05:32]

Yeah, I think like everything else in life, it has tradeoffs. So on the one hand, agreeable people create a lot of harmony. They tend to get along with other people. They're constantly encouraging. But if you look at the data on leadership effectiveness, one of the things you see is highly agreeable people tend to be worse at leading organizations and teams than people who are somewhere in the middle of that spectrum.

[00:05:56]

Hmm.

[00:05:57]

And the risk is that sometimes they're too nice, they're too polite.

[00:06:00]

They say yes to everything and they don't challenge people enough.

[00:06:03]

And I've felt that. We see that in negotiations too, where agreeable people are really prone to what's called agreement bias.

[00:06:11]

Cleverly named, where you come to the table, somebody offers you a terrible deal, but you hate the idea of saying no.

[00:06:19]

And so you say yes to something that's not in your best interests.

[00:06:23]

Being agreeable and wanting to be seen as agreeable are only two of the drivers behind our reluctance to second guess ourselves and second guess others. Another source of the problem has to do with a popular notion about the risks of changing one's mind. If you're a student who takes multiple choice tests, or if you've watched TV quiz shows like Who Wants to Be a Millionaire where you're given four possible answers to each question and have to pick the right one, you have likely experienced the first instinct fallacy.

[00:06:55]

It's one of the most interesting findings in the psychology of being a student. It dates back to an experience we all had growing up.

[00:07:04]

I remember my mom telling me if you're unsure of an answer on a test, go with your gut. Go with your first instinct.

[00:07:12]

And yet, if you look at the research on which is better, going with your gut or second-guessing your first instinct, the vast majority of students who reject their gut actually improve their scores on average.

[00:07:26]

Hmm.

[00:07:27]

And so there's a fallacy that your first thoughts are your best thoughts. A lot of times, intuition is just a subconscious pattern recognition. And the patterns that you're recognizing from the past may not be relevant to the problem you're solving right now in the present.

[00:07:42]

Mm hmm.

[00:07:43]

And so you don't want to trust your gut. You want to test your gut. And even when you tell people about this evidence, they are still reluctant to rethink their first answer.

[00:07:53]

I wonder if some of this has to do with the role of regret.

[00:07:55]

If I go with my first instinct, the first answer, and I'm wrong about it, I'm going to experience some regret. But if I come up with an answer and then I switch to the wrong answer, and it turned out my first answer was actually correct, presumably I would experience greater regret, because I had the sense, you know, that I actually had the right answer and I foolishly went and changed my mind.

[00:08:15]

That is such an eloquent summary of what a number of psychologists have argued and found: that the pain of having had the right answer and then undone it is much greater than the pain of sticking with what you thought was the right answer, even though it was wrong.

[00:08:35]

So stubbornness and obstinacy and the first instinct fallacy affect not just our personal lives, they also affect things on a much bigger scale. Can you tell me the story of the meteoric rise and fall of BlackBerry, Adam?

[00:08:49]

Sure. The beloved BlackBerry.

[00:08:51]

I still miss my BlackBerry, the keyboard in particular. I wish the iPhone had that too. Yes, me too.

[00:08:58]

I'm so glad I'm not alone. Maybe one day.

[00:09:02]

So the BlackBerry obviously was a complete rethinking of the way that we communicate. And the BlackBerry starts out as a two way pager. And this brilliant scientist, engineer Mike Lazaridis, figures out that they could actually use this device for work emails.

[00:09:23]

So they launch it. It skyrockets in popularity. I think we can both remember a time when basically everyone you knew had a BlackBerry, and they just dominated the market. And then BlackBerry fell apart, because Mike and his colleagues were unwilling to rethink the very things that had made BlackBerry great.

[00:09:46]

It's this stuff right here.

[00:09:49]

They all have these keyboards that are there, whether you need them or not to be there.

[00:09:54]

But what we're going to do is get rid of all these buttons and just make a giant screen, a giant screen.

[00:10:02]

They could not wrap their minds around the idea that people would want a touch screen.

[00:10:08]

And as late as 2008, 2009, 2010, 2011, when the iPhone has exploded in popularity, they're still having arguments, with Mike, as co-founder and CEO, holding up his BlackBerry with its keyboard and saying, well, you know, I get this. And then holding up the iPhone,

[00:10:29]

looking at it and saying, I don't get this.

[00:10:32]

And they just got locked into this set of assumptions that what people wanted out of a BlackBerry was a device for basically work email, as opposed to essentially a computer in your pocket for home entertainment.

[00:10:44]

Right.

[00:10:44]

And they really miss that opportunity to think again.

[00:10:52]

Our reluctance to think again can have even bigger stakes. In the 1980s, NASA downplayed a brewing problem in the space shuttle Challenger. Since the shuttle had completed many missions, officials assumed it was safe. But in January 1986, the shuttle exploded moments after liftoff, killing the seven astronauts on board.

[00:11:18]

We have a report from the flight dynamics officer that the vehicle has exploded. Flight director confirms that. We are looking at checking with the recovery forces to see what can be done at this point.

[00:11:28]

Or, take the U.S. war in Iraq, where President George W. Bush and his colleagues failed to rethink their views after their initial rosy expectations of the war.

[00:11:41]

We will, in fact, be greeted as liberators

[00:11:43]

failed to materialize. Adam cites another tragedy, the Vietnam War. In the early 1960s, diplomat George Ball warned President John F. Kennedy and then President Lyndon Johnson about the risks of entering and escalating the war. He foresaw that if the United States went forward with the war, leaders would later have trouble rethinking their position.

[00:12:10]

It's chilling. Basically, he writes this at the early-phase decision of whether we should even enter the Vietnam War or not. He's an undersecretary at the time.

[00:12:20]

It was a difficult time because as time went on, I found myself more and more isolated because I was the only one urging this kind of a policy.

[00:12:30]

And he basically says, look, what I'm afraid is going to happen is that we're going to send some initial troops to war and people are going to die. And then we've sent our own people to die. That can't have been for nothing. And so then we're going to have to send more people and invest more resources. And we're going to get trapped in this escalation of commitment to a war that didn't need to be fought in the first place. And my understanding is most historians and political scientists now would tell you that that's exactly what happened.

[00:13:06]

When we come back, we look at why rethinking is psychologically painful and how to help ourselves and others overcome this pain. You're listening to Hidden Brain. I'm Shankar Vedantam. This is Hidden Brain, I'm Shankar Vedantam. Psychologist Adam Grant is the author of Think Again, a book about the virtues of rethinking our positions. We've seen how not being open to rethinking our beliefs can have major consequences in our personal, professional and political lives.

[00:13:47]

Adam, I want to talk about some of the drivers of obstinacy in our lives. I know that you're a fan of the TV show Seinfeld. And there's a famous scene which features a restaurant owner who is called the Soup Nazi.

[00:13:59]

Nothing for you.

[00:14:01]

He makes great soup, but he cannot tolerate the slightest criticism or deviation from the script. I want to play you a short clip where the character Elaine visits the Soup Nazi.

[00:14:12]

One mulligatawny. And what is that right there? That lima bean?

[00:14:18]

Yes.

[00:14:19]

Never been a big fan. You know what? Has anyone ever told you you look exactly like Al Pacino? You know, Scent of a Woman. Hoo-ah! Hoo-ah!

[00:14:35]

Very good, very good. You know something? No soup for you. You come back one year.

[00:14:43]

So the Soup Nazi illustrates something that you talk about in the book, the difference between relationship conflict and task conflict.

[00:14:50]

What is this difference?

I'm suddenly craving some mulligatawny, but we'll get back to that. Let's talk about task and relationship conflict.

[00:15:09]

Most of us, especially those of us who are agreeable, when we think about conflict, we are thinking about Relationship conflict. That's the personal, emotional, I think you're a terrible person. And my life would be better if I never had to interact with you.

[00:15:16]

kind of clash that a lot of us run into. There's another kind of conflict, though, that an organizational psychologist named Karen Jehn and her colleagues have studied: task conflict. It's the idea of debating different opinions and perspectives. It's potentially constructive because it's actually about trying to get to the truth. It's not personal. It's not emotional.

[00:15:41]

We're not trying to beat up the other person. We're not feeling like we're being attacked.

[00:15:48]

We're trying to hash out or sort out different views through what might be a feisty conversation. But it's intellectual. And I think one of the biggest problems the Soup Nazi had is that he could not have a task conflict without it becoming a relationship conflict.

[00:16:07]

The moment that you object, the moment you don't follow his rules, he takes it very personally and bans you from his soup oasis.

[00:16:19]

Is that how you interpret it as well?

[00:16:21]

That's exactly how I interpreted it. I mean, Elaine has many problems as a character, of course, but the fact that she's basically saying she's not a fan of lima beans, that is not necessarily a reflection on, you know, the Soup Nazi character. It just means that there's a customer who doesn't like something that you made. It's not a big deal. But he can't take a criticism or a disagreement as just saying, all right, someone has a different opinion than I do.

[00:16:41]

He has to take everything personally.

And I mean, you've surveyed teams of workers in Silicon Valley about conflict and examined which groups perform well and which ones don't. And you found that the way they handle task conflict and relationship conflict is often what's behind successful teams and unsuccessful teams. Can you explain to me what you found?

[00:17:03]

Yeah, I think the mistake that a lot of people make is they assume that less conflict is better. That if you want to build a successful collaboration or a great team, then you want to minimize the amount of tension you have. But as some researchers have argued, based on a lot of evidence, the absence of conflict is not harmony, it's apathy.

[00:17:26]

If you're in a group where people never disagree, the only way that could really happen is if people don't care enough to speak their minds.

[00:17:33]

And so in order to get to wise decisions, creative solutions, we need to hear a variety of perspectives.

[00:17:42]

We need diversity of thought. And task conflict is one of the ways that we get there, by saying, you know what, I think we actually don't agree on what the vision for our company should be, or what our strategy should be, or how to design this product.

[00:17:57]

And so let's hash that out. And I don't think we do that enough. So in the study that I did, I tracked team performance over a number of months, and I surveyed people in teams on how often they were having relationship conflict as well as task conflict.

[00:18:13]

Even if they agreed on nothing else, they agreed on what kind of conflict they were having and how much of it.

[00:18:19]

And it turned out in the failed groups, they tended to have a lot more relationship conflicts than task conflicts, especially early on, they were so busy disliking each other that they didn't really have substantive debates until about halfway through the life cycle of their project.

[00:18:36]

And by then it was almost too late to change course, whereas in the high performing groups, they started out with very little relationship conflict and plenty of task conflict, saying, look, before we design a product, we really want to get all the ideas on the table about how we might do it or what it might be for.

[00:18:54]

And then once they sorted those out, they were able to really focus and align around what their common mission was.

[00:19:01]

And they were able then over time to say, OK, as different issues crop up that we disagree on, we're going to have another debate and we're going to hear each other's views again. And that ultimately gave them a better shot at working on something that was really promising.

[00:19:16]

So I hear you say sort of two things about relationship conflicts that are pernicious. One is that in the presence of relationship conflict, people shut down and don't voice the concerns that they have that could be useful for the performance of the group.

[00:19:28]

But the second thing is that when task conflict concerns are raised in a group that has relationship conflicts, those concerns are likely to be interpreted through the lens of relationship conflict. In other words, someone raises an issue with something that the group is doing, and people behave like the Soup Nazi. They react and take things personally.

[00:19:47]

Yeah, that is a very clear and elegant summary of what I think the research points toward. When a disagreement becomes personal, everything that gets raised by the other person is interpreted in the most negative light possible. And then I think the other problem is that people sometimes don't even hear the substance of the idea, because they're so invested in defending their ego or in proving the other person wrong.

[00:20:17]

There's a related idea to this distinction between task conflict and relationship conflict that you explore in your book. Adam, you say that one reason it's hard to admit we are wrong is that we sometimes confuse our beliefs with our values.

[00:20:30]

What do you mean by those terms?

OK, so when I think about a belief, I would say that's something that you take as true. A value is something you think is important. And yeah, I think a lot of us make the mistake of taking our beliefs and opinions and making them our identity. And since I spend a lot of time studying the workplace, I really enjoy thinking about how dangerous the world would be if people in the professions that we rely on every day did that.

[00:21:04]

So let's say you were a doctor, I don't know, half a century ago or more.

[00:21:09]

I would not want to go to you if your identity was professional lobotomist.

[00:21:15]

Right. If you're somebody who carries this belief that the right way to treat anxiety or other kinds of problems is to just remove your frontal lobe or part of it, that's probably not going to go well.

[00:21:27]

And so I want to see the doctor who's open about beliefs, who says, well, I want to learn from the best evidence about how to treat this problem.

[00:21:35]

Mm hmm.

[00:21:36]

But who's committed to you and has conviction around the value of protecting and promoting health and safety? Mm hmm. And you could do the same thing with all kinds of other jobs.

[00:21:46]

I definitely would not want my community policed by an officer who sees herself as a stop and frisker.

[00:21:54]

You know, and there are examples of leaders who basically model what it's like to have task conflict without relationship conflict. I was thinking of something that President Obama said some years ago when he invited someone he disagreed with to play a prominent role in his administration.

[00:22:10]

We're not going to agree on every single issue, but what we have to do is to be able to create an atmosphere where we can disagree without being disagreeable and then focus on those things that we hold in common as Americans.

[00:22:24]

To disagree without being disagreeable.

[00:22:27]

I think many of us forget this lesson, and we think that if someone else is wrong, our job is just to correct them. How we correct them is unimportant.

[00:22:35]

Yeah, I think that's such a common mistake in communication. We think it's the message that matters. But so often whether somebody is willing to hear a message depends on who's saying it, why it's being said and how it's being delivered.

[00:22:52]

And I cannot tell you, Shankar, the number of times that I have rejected useful criticism because I didn't trust the person who was giving it to me.

[00:23:04]

Mm hmm.

[00:23:04]

Or they delivered it in a way that I found disrespectful or offensive.

[00:23:09]

Right.

[00:23:09]

And that was a missed opportunity for both of us.

[00:23:16]

Not all of us listen to useful feedback even when it's presented clearly and without rancor. That's because we confuse challenges to our views with threats to our ego.

[00:23:28]

There's a term that I love for this, which comes out of psychology. It's originally Tony Greenwald's term: the totalitarian ego.

[00:23:38]

And the idea is that all of us have an inner dictator policing our thoughts.

[00:23:43]

And the dictator's job is to keep out threatening information, much like Kim Jong Un would control the press in North Korea.

[00:23:53]

Mm hmm.

[00:23:54]

And when your core beliefs are attacked, the inner dictator comes in and rescues you with mental armor. It activates confirmation bias, where you only see what you expected to see all along, and triggers desirability bias, where you only see what you wanted to see all along.

[00:24:14]

And you can feel like you're not under threat after all.

[00:24:22]

You can see the totalitarian ego at work in a study conducted some years ago by researchers in Australia. They asked volunteers to think of a time when they did something wrong and apologized for it, and to also think about a time when they did something wrong and did not apologize for it. Researcher Tyler Okimoto explains what they found.

[00:24:41]

When you refuse to apologize it actually makes you feel more empowered. That power and control seems to translate into greater feelings of self-worth.

[00:24:50]

And in some ways, that sounds like the inner dictator. When we apologize, in some ways we are disarming ourselves. And when we refuse to apologize, in some ways we are mounting a form of emotional self-defense.

[00:25:04]

Yeah, sadly, staying attached to our wrong convictions makes us feel strong. And psychologists have also found for decades that the act of resisting influence only further fortifies our convictions, because we basically get inoculated against future attacks. We have all of our defenses ready, and we end up sealing our beliefs in an ever more impenetrable fortress.

[00:25:36]

We've talked about a number of different drivers of obstinacy and stubbornness, our inability to question our own beliefs.

[00:25:43]

I want to talk for a moment about the role of identity and stereotypes. Can you talk about how these things, our loyalties to different groups, our membership in different tribes, can keep us in some ways from challenging, you know, deeply held beliefs or cherished beliefs, and in some ways keep us from rethinking our views?

[00:26:03]

Yeah.

[00:26:03]

So I have a brilliant colleague, Phil Tetlock, who wrote a paper about how almost every decision you've ever made, almost every opinion you've ever formed, is influenced by your relationship to the people around you and by the groups that you're part of and the identities that you hold about who you are in the social world.

[00:26:23]

And what Phil observed is we often spend time thinking like preachers, prosecutors and politicians.

[00:26:32]

Preaching is basically defending a set of sacred beliefs and saying, look, I found the truth. My job is to proselytize.

[00:26:40]

It doesn't make any difference if the churches today have stopped preaching on Hell, if the preachers don't preach on Hell, it is still a place that you must deal with one day.

[00:26:51]

Prosecuting is the reverse.

[00:26:54]

It's saying, OK, my job is to prove you wrong and win my case with the best argument.

[00:27:00]

For this man to skirt financial and moral responsibility because he found a scuzzy lawyer and a scuzzier shrink to pronounce him disabled, for this man to waltz into a court and get an order saying that this woman was never married when she led an exemplary married life. How dare you live?

[00:27:17]

And any time we're part of a group that has strong beliefs, it's pretty unlikely that we're going to rethink any opinions or decisions, as we get into preacher or prosecutor mode, because we already know.

[00:27:29]

I already know I'm right and you're wrong. We're a little more flexible when we shift into politician mode.

[00:27:36]

Look at this, what is it, Lucania, the no B.S. V.P. damn right they are.

[00:27:41]

I mean, I lied and everything, but it sounded true, at least.

[00:27:45]

Because when you're thinking like a politician, what you're trying to do is get the approval of an audience that you care about.

[00:27:53]

And so you might be campaigning and lobbying. And sometimes that means adjusting and flexing at least what you say you believe in order to fit in and win them over. The problem is that we're doing it because we want to prove our allegiance to a tribe, not because we're trying to get closer to the truth.

[00:28:21]

Coming up, strategies to help us rethink our own views and how to help others reconsider their cherished opinions. You're listening to Hidden Brain. I'm Shankar Vedantam. This is Hidden Brain. I'm Shankar Vedantam. University of Pennsylvania psychologist Adam Grant has written a book about the power of searching for the flaws in our own beliefs and arguments, of challenging deeply held views. Many of us have trouble doing this, and the echo chambers that surround us usually don't reward us for being reflective and adaptable.

[00:29:09]

But history shows that if we check our personal egos, we can make important discoveries.

[00:29:15]

Adam, you tell the story of Orville and Wilbur Wright, the brothers who invented the first successful airplane.

[00:29:22]

Can you describe their relationship to me and how it bears on the conversation we're having?

Of all the moments in history that I would love to witness, I think watching the Wright brothers argue would be pretty high on my list. So if you look at the history of what the Wright brothers created together, it seemed like they were constantly in sync. They created their own printing press together. They ran their own bicycle shop. They made their own bikes together.

[00:29:48]

They launched a newspaper together. And of course, we all know they invented the first successful airplane together.

[00:29:55]

And I always assumed that they were just lucky to have such harmony.

[00:30:00]

But if you read any of the biographies that have been written about them, if you read their own letters and personal communications, if you read the stories and the anecdotes from people who knew them well, it was very clear that arguing was their default mode, and it was almost the family business.

[00:30:21]

And what I think is fascinating about the Wright brothers is they mastered the ability to have productive task conflicts without it spilling into relationship conflict.

[00:30:30]

It was typical for them when they were trying to invent their airplane to argue for weeks about questions like how do you design a propeller?

[00:30:39]

And they would sometimes even shout for hours back and forth.

[00:30:43]

And at one point, their sister threatened to leave the house because she just couldn't take it anymore.

[00:30:48]

But they seem to get a kick out of it. They called it scrapping and they said, look, the whole point of an argument is it helps both people see more clearly if you do it well.

[00:31:00]

They never saw an argument as personal. Their mechanic used a phrase that I think about almost every day. He said, I don't think they really got mad, but they sure got awfully hot.

[00:31:14]

And that, to me, captures the passion, the energy, the feistiness that goes into, you know, duking it out over a set of ideas that's really important to you, but not leaving that interaction angry.

[00:31:27]

Hmm. Not all of us have the benefit of having, you know, a sibling, real or metaphorical, who can play this role for us. We may not have a partner who can help us rethink our views.

[00:31:38]

But all of us are often part of organizations and part of teams, or we can be part of organizations and teams that in some ways can play the same role for us. You tell the story of Steve Jobs, the co-founder of Apple, obviously a brilliant visionary, but he was also famously stubborn.

[00:31:56]

He was also prone to the first instinct fallacy and came very close to not inventing the iPhone. Can you tell me what happened and how Steve's mind got changed? Sure.

[00:32:07]

When you think about your network, we all have a support network. That's usually the highly agreeable people who we know are going to have our back and, you know, really lift us up or pick us up when we're down.

[00:32:21]

I think what we overlook is that we also need a challenge network, which is a group of people that we trust to question us, to point out the holes in our thinking, the flaws in our logic, the ways that our decisions might be leading us astray from our goals.

[00:32:38]

And it's not clear to me that Steve Jobs did this intentionally, but he was very lucky to be surrounded by a group of people who played that role of a challenge network for him.

[00:32:49]

What I do all day is meet with teams of people and work on ideas and solve problems to make new products, to make new marketing programs, whatever it is. And are people willing to tell you you're wrong? Oh, yeah. I mean, other than snarky journalists, I mean people? Oh yeah, no, we have wonderful arguments. And do you win them all? Oh no, I wish I did.

[00:33:16]

And he was dead set against making a phone. He complained for years about how smartphones were for the pocket protector crowd. And Apple makes cool products. We don't want to touch that.

[00:33:30]

He could rant for hours at a time about how, you know, everybody was beholden to the cell phone carriers and they didn't know how to make an elegant product. And sometimes he would even throw his own phone against the wall and shatter it because he was so frustrated with how bad the technology was.

[00:33:48]

Luckily, I think, for Apple and for anybody who's a fan of the iPhone or the iPad, he surrounded himself with brilliant engineers and designers who knew how to get him to think again. You have to be run by ideas, not hierarchy; the best ideas have to win. And a lot of the things that they did as part of his challenge network are things that we've seen people do every day. They would plant seeds.

[00:34:19]

They would say, hey, I hear Microsoft is talking about making a phone. How ugly do you think that's going to be? And if we ever made one of those, what would that look like?

[00:34:32]

They would ask questions like, you know, hey, we did the iPod. We've already put 20,000 songs in your pocket.

[00:34:39]

What if we put everything in your pocket? And what they were doing was they were activating his curiosity.

[00:34:46]

If you told him he was wrong, he would immediately go into prosecutor mode and tear your argument apart.

[00:34:53]

If you told him about your idea, he would preach about his idea.

[00:34:58]

But if you could ask a question that intrigued him and led him to realize that he didn't know some things, he might then go out and try to discover them or give you the green light to go and discover them.

[00:35:11]

And those kinds of conversations finally got him to reverse course and make a phone.

[00:35:20]

You know, lots of us in some ways make the same mistake that Steve Jobs made.

[00:35:25]

We sort of respond, as you say, like prosecutors, or we respond defensively, like preachers, if our views are challenged.

[00:35:32]

You've done some exploration into how we can change the views of the people around us, and some of that you just spoke about a second ago. Some of the techniques in some ways seem to come at the issue sideways rather than head on.

[00:35:47]

And this is, again, somewhat odd because I understand that you sometimes have been called a logic bully.

[00:35:54]

Can you tell me how you got that nickname Adam and what you have discovered about the effectiveness of logic in changing people's minds?

[00:36:01]

I certainly can. I had a former student named Jamie come to me for some career advice, and it was clear in the first minute or so of our conversation that she was already locked into the plan that she had made. And I was worried that she might be making a decision that she would regret.

[00:36:18]

And so I just told her, you know, here are all the reasons why I think you're making a potentially big mistake. Well, she listened patiently for, I think, two or three minutes, and then she said, you're a logic bully. She said that I overwhelmed her with rational arguments and data, and she didn't agree, but she didn't feel like she could fight back. I think what I learned from that experience, Shankar, is this:

[00:36:47]

You can't really bully someone into changing their mind. And even if you could, you're probably just getting lip service, where they're telling you what they think you want to hear to get you to shut up, right, because they're tired of being bullied.

[00:37:03]

And I think that the questions we ask people, the humility we express in saying, you know what, there are lots of things that I'm not sure about here.

[00:37:12]

The curiosity we show in trying to understand more about their own views and their motivation to change their thinking. That's where real thought happens.

[00:37:23]

That's where people who might have closed their minds say maybe, maybe I'll open it this time, maybe I will reconsider.

[00:37:33]

You've looked at people who are successful negotiators and successful debaters. What are some of the techniques they use that the rest of us might not think to do? You know, right off the bat, in terms of engaging with an opponent, for example, what are some of the dos and don'ts?

[00:37:47]

OK, so there's a classic study by Neil Rackham and colleagues of expert versus average negotiators, where they compare what their habits are. And there are a few things that differentiate the experts from the average negotiators. One is they spend a lot more time, both in their planning and in their actual negotiations, thinking about common ground and talking about common ground, saying we want to build areas of consensus before we find out where we're opposed.

[00:38:12]

Mm hmm. They also asked a lot more questions.

[00:38:15]

They'd say, OK, here are two or three possible proposals. What are your reactions to these? What do you like, what do you dislike, and what are your thoughts? And that allowed them both to learn more and, again, to signal more flexibility as well.

[00:38:27]

I mean, it's so interesting, especially in the political sphere, but certainly not limited to the political sphere.

[00:38:32]

We often think of trying to change someone's opinion with the metaphor of a tug of war: the harder I pull, the more I can get you off balance, the more likely I am to win. And the model that you're suggesting here is a very different model, a model where you're asking a lot of questions, where you're seeking common ground, where you're willing to make concessions, where you're open to figuring out how you yourself might be wrong.

[00:38:56]

That's a very different model and a very different metaphor than a tug of war.

[00:39:01]

It is. This is not the metaphor I would have ever come up with.

[00:39:05]

But there are some psychologists who have said we should think about disagreements less as wars and more as dances. And I can't dance at all.

[00:39:18]

I'm so bad at dancing that my wife signed us up for dancing lessons before our wedding and after the second one she gave up. I don't have any rhythm, so I wouldn't have thought of dancing.

[00:39:31]

But what I like about the dance metaphor is, you know, that in a dance your job is to get in sync with your partner.

[00:39:38]

And that means if you've both shown up to the dance with an idea about what steps you're going to take, you can't lead all the time and expect your partner to do all of the adjusting.

[00:39:50]

You actually have to be willing to step back and let your partner lead from time to time. And that's what expert negotiators seem to do, it's what great debaters seem to do, and I think it's what all of us could do more when we have polarized conversations.

[00:40:07]

You also apply this insight in education. When you think about what teachers are trying to do, in some ways the whole point of education is to get people to rethink their views, to think more carefully about their positions, to learn. You describe the story of a public school teacher named Ron Berger who exemplifies this idea. What does Ron do to help people think about what they're doing in new and interesting ways?

[00:40:35]

So one of the things that Ron does is with his first graders, he'll ask them to draw a house, but instead of drawing a house, he says, I want you to draw four different versions of a house.

[00:40:47]

And he's teaching them that just as you wouldn't expect an artist to frame their first draft, that the initial work they do is open to being reinterpreted and can evolve and improve over time.

[00:41:00]

And he sets up a whole challenge network in his classroom where after you do your first draft, they do what's called a critique session and all the other students are invited to tell you the things that they think are effective and how you can revise and refine it.

[00:41:14]

Who else would add something? Attack? What would you say about the angle?

[00:41:18]

Because not to be mean about the angles, just not exact.

[00:41:23]

So, OK, come on up here and show me what you would ask him to do slightly differently.

[00:41:29]

And it's not about you, right? It's about the work. Everybody is there to try to help you improve.

[00:41:33]

It is good that he is so good.

[00:41:38]

And by the end of going through that exercise, what a lot of students will say is they go home and, you know, you've got a first grader who's coloring and they insist to their parents that they want to do six or seven drafts, because they really want to keep rethinking the way that they've drawn somebody's eyes or, you know, the colors that they happen to use in a picture. And what a great way to learn that just because you had a first thought about how to create an idea or how to answer a question, that does not mean that should be our final thought.

[00:42:19]

So it's interesting. We've looked at lots of different, you know, variations of this idea of the ways in which task conflict sort of spills over into relationship conflict. So very often, when we are resistant to change, it's not because of the specific ideas themselves. It's because of the context in which those ideas are located: how the idea is presented to us, what the idea means to us, whether our identities are tied up with the old idea versus the new idea.

[00:42:41]

You know, our general stereotypes. It's all of this larger context that in some ways holds us, almost as if in concrete, to our pre-existing views. And one of the things that I see you pointing out repeatedly is that in order to get people out of the cycle where they are interpreting everything through the lens of relationship conflict and instability, you actually have to start by emphasizing relationship stability. In the absence of that, task conflict can quickly become interpreted as relationship conflict.

[00:43:12]

Can you talk to me a moment, Adam, about the very important idea of psychological safety, and how psychological safety is connected with getting teams to think more creatively, more innovatively, and also engage in productive task conflict? Psychological safety, as Amy Edmondson and other researchers have defined it, is the belief that you can take a risk without being punished or penalized.

[00:43:41]

It's really founded on a culture of trust and respect where people will say, look, if I want to point out a problem or I have a concern that I might want to raise, or if I think someone needs to rethink a decision they've made or a judgment that they've landed at, that I can bring that up without having to worry that they're going to bite my head off. Mm hmm.

[00:44:03]

And we know that psychological safety is one of the foundations of building a learning culture and making it easy for people to rethink things. Because we've seen this in studies of hospitals, for example, that when teams have psychological safety, they actually admit the errors that they've made, and then everyone else can learn from them and rethink their routines and practices. Whereas if they lack psychological safety, people are motivated to hide their mistakes, and then they repeat them and no one else ends up rethinking the way that they're operating either.

[00:44:32]

Mm hmm. And so I think there's great value in creating psychological safety. I'd say it's easier said than done for a lot of people. Amy Edmondson is quick to point out that psychological safety is not about being nice or having low standards. We actually need psychological safety with accountability. We can have high expectations for people, but also give them the freedom and permission to rethink some of even what we might have called best practices.

[00:45:02]

I have to say, though, Adam, when I look out at the culture of conversation on social media, in our political sphere, on cable TV, I don't see an environment that is rich in psychological safety. I don't see an environment where you're rewarded for being nuanced and rethinking your views and admitting when you're wrong. In fact, if anything, the incentives are lined up exactly in the opposite direction.

[00:45:27]

Sadly, I think that's become the norm. I do think, though, there are some steps we can take to have more thoughtful conversations with the strangers that we love to preach at and prosecute. We can be aware of what psychologists call solution aversion. The idea behind solution aversion is that if you propose a way to fix a problem and people don't like your solution, they often reject not only the solution, but also the problem in the first place.

[00:45:54]

Hmm. So let's say with climate change, for example, if you say, well, we need a whole bunch of companies to reduce their emissions and you're talking to somebody who's a staunch free market conservative, they're not necessarily going to like that idea. And so their motivation then is to deny the existence of the climate problem in the first place. And I think we should be really cautious about jumping to solutions. We would be better off saying, hey, I'm aware that there are some problems when it comes to climate change.

[00:46:25]

And I would love to hear your ideas about the different possible ways that we could solve them. We shouldn't spend all this time talking about why my solution is right or why your view that climate change isn't an issue is wrong. Instead, I should say, well, given your views about what we should do on climate policy, how would your proposed solutions work and how would you implement them? And when you ask those questions, something really intriguing happens.

[00:46:58]

Psychologists call it the illusion of explanatory depth. And it's the idea that we think we understand complex systems much better than we actually do. And the best way to make us a little bit more intellectually humble, curious, nuanced, more doubting, less dogmatic is to ask us to explain those very systems and their impact.

[00:47:20]

And so if you try to walk me through all of the effects of a greenhouse gas emissions program, you quickly realize there are a lot of things you don't know.

[00:47:31]

And empirically, that tends to lead you to moderate your views. You become less extreme, you become more open minded, and we have a more civil dialogue.

[00:47:40]

Hmm. So I want to take a moment and channel Adam Grant in the next question I'm asking Adam Grant, and try to rethink some of the things that we have said in the conversation today. You know, we all know the iPhone was a hit, so we can work backwards and deduce that Steve Jobs was right to listen to his engineers. But there's also some risk in this kind of backward induction. Sometimes it might be possible that inflexibility is actually the right answer.

[00:48:10]

You know, some of history's greatest leaders have been relatively inflexible. Think about Winston Churchill facing down Adolf Hitler, or think of people like Mahatma Gandhi, very singularly focused in terms of what they were doing, very unwilling to reconsider the rightness of their views.

[00:48:28]

Now, clearly, you know, inflexibility is also associated with some of the worst leaders in history. But are there times when, in fact, second-guessing can lead to dissension, to dithering, maybe even to defeat?

[00:48:42]

I think those are great examples. And I think you're right. What's the saying? Every great truth has an opposite truth. Yeah. So flexibility is a virtue. So is persistence. And I think that the art, much more than the science, is figuring out when to stay the course and when to shift gears. I don't think that being open to rethinking means you always have to change your mind. So if you're Churchill, the goal of beating Hitler, I definitely don't want you to rethink that.

[00:49:16]

I do want you, though, to be open to different strategies, right? Because I might find out that their strategy has changed, and so mine needs to evolve, too. And I think that what that means is we can be pretty steadfast in our principles and our overarching goals, and very flexible in finding the right ways to achieve those goals or live those principles. Hmm.

[00:49:38]

You know, as I was reading your book, I was thinking about this proverb. I don't remember where I read this, but it said that if you want to walk fast, you should walk alone.

[00:49:48]

But if you want to walk far, you should walk together.

[00:49:52]

And I'm wondering if it's possible that second-guessing and rethinking might, in fact, have short-term costs but long-term benefits?

[00:50:00]

I would say that's an astute observation. I think rethinking doesn't have to be slow, but it often slows us down. I worry that we spend too much of our time listening to people who think fast and shallow, and not enough time paying attention to people who think slow and deep. And I might even go further and say I think we need to become people, sometimes, who think slow and deep, which is, I guess, a reinforcement of so much of what Danny Kahneman has spent his career studying.

[00:50:35]

But I think that there are also ways to accelerate our rethinking, because it would be very tempting to say, OK, you know, in a lot of situations, what we need to do is just slow down. We need to pause. We need to give ourselves a chance to come up with second and third opinions. And, yeah, I think slowing down helps with that.

[00:50:58]

But there's no reason why we can't in the moment say, OK, I'm about to make a decision. I have a clear plan about what I want my career to be, or I have an idea about what city I want to live in. And let me take five minutes and just think about all the reasons why that plan might be wrong, not just the reasons it might be right. Let me reach out to somebody in my challenge network and ask them, can you see some holes in my reasoning?

[00:51:22]

Is there any way that I might regret this decision?

[00:51:25]

And that could be a quick reconsideration process.

[00:51:36]

Ironically, it's something we've all been forced to do in the past year by a pandemic, which upended so many convictions. I cannot tell you, Shankar, how many leaders told me they would never let their people work remotely and are now questioning whether they should even have a physical office. I think it's unfortunate that we only did that rethinking when we were forced to. And I think we could take the initiative to do it more deliberately, more proactively, as opposed to waiting until we have no other choice.

[00:52:11]

Adam Grant is the author of Think Again: The Power of Knowing What You Don't Know. Adam, thank you for joining me today on Hidden Brain. Thank you, Shankar. This was such a delight. Hidden Brain is produced by Hidden Brain Media. Stitcher Media is our exclusive advertising sales partner. Our production team includes Bridget McCarthy, Laura Kwerel, Kristin Wong, Ryan Katz, Autumn Barnes and Andrew Chadwick. Tara Boyle is our executive producer. I'm Hidden Brain's executive editor.

[00:52:52]

Our unsung hero this week is Natalie Mualla. Natalie heads up business development at our advertising partner, Stitcher Media.

[00:53:01]

She played a crucial role in visualizing how Hidden Brain Media and Stitcher could work together. In recent weeks, Natalie has taken our partnership further. She's given me useful leads for philosophers who could be guests on Hidden Brain. I've always found Natalie to be direct, kind and helpful. And that is the best kind of collaborator. Thank you, Natalie. Many listeners have asked us over the years how they can support our show. We have a way to do that.

[00:53:35]

Please go to hiddenbrain.org and click on the support button. We appreciate your help. I'm Shankar Vedantam. See you next week.