Transcript
[00:00:00]

This is Hidden Brain. I'm Shankar Vedantam. When you look at the partisan divisions in the United States and other countries, you see something curious.

[00:00:10]

There's a bias, a double standard. It's always there. It pervades our society.

[00:00:15]

It's not just that each side accuses the other of bias.

[00:00:18]

Right wing media isn't doing journalism. It's doing fan fiction.

[00:00:23]

The left can insult people. The left can make outrageous statements. Nothing happens. It's different on the right.

[00:00:30]

They accuse one another of the same kinds of bias.

[00:00:34]

All the staff members of the New York Times, Washington Post, CNN, MSNBC, they're like cornered, rabid rats. They've been selling us lies for so many years.

[00:00:47]

The Americans who listen to Fox News and conservative talk radio are being lied to and manipulated every day.

[00:00:56]

Each side says the other is blind to facts, blind to reason.

[00:01:00]

Look, we're taking on a Republican Party that has rejected science where the vast majority of Republican congressmen and senators do not even accept the reality of climate change, let alone that ...

[00:01:12]

This is a metaphor, really, for the left's entire program, which is built entirely on denying reality. They deny the reality of illegal immigration. They deny the reality of terrorism. They deny the reality of biological gender.

[00:01:24]

You see the same thing in many other conflicts around the world. Each side accuses the other of inflexibility and ideological blindness.

[00:01:36]

Now, there are certainly situations where one side is right and the other is wrong, one side is biased and the other is not, or at least less so. Our focus today, though, is not on specific controversies. Rather, we want to explore the psychological mechanisms that prompt us to judge our own behavior very differently than the behavior of other people.

[00:02:06]

This week on Hidden Brain, the double standard inside our heads.

[00:02:23]

On a daily basis, all of us evaluate others, we think about the claims of people who want to sell us something, we gauge the ideas of colleagues, we assess friends and family. We also regularly look into our own hearts and minds. We evaluate ourselves.

[00:02:40]

At Princeton University, psychologist Emily Pronin has studied why our minds come to very different conclusions about ourselves and others.

[00:02:49]

Emily Pronin, welcome to Hidden Brain.

[00:02:52]

Thank you, Shankar.

[00:02:53]

So a few years ago, Emily, you conducted an experiment where you brought volunteers into a lab and you told them about a range of different biases-- biases like the halo effect where, you know, you see someone who's very beautiful and you assume this person must also be very intelligent, or a bias like confirmation bias where we go looking for information that supports our pre-existing views. And you did something very interesting.

[00:03:18]

You asked the volunteers whether they thought that they would fall prey to these biases. What did they tell you?

[00:03:27]

We had students in a class so that they kind of all knew each other from being in the class together. And what we did is we described each bias just in a few sentences. We didn't use the word bias. We didn't want to make it sound like a negative thing so that people would say that's bad.

[00:03:40]

I don't do that. We just described it in neutral terms. Sometimes people do this, do you do this? And what we found is that people said, "Oh, Gee! Other people do do that, you know, that's so great, you put that into words like that, I see that all the time. You know, someone's really attractive and then they think that person is great on every dimension. But me.

[00:04:00]

I know I don't really do that." So what happened was people recognized the bias as something that people do and they attributed it to other people, but they thought that they did it quite a bit less.

[00:04:13]

And the same thing happens in so many different domains. If you ask me, do you evaluate the news fairly? Are you a good judge of policy? I'll tell you, of course I am. But I can see lots of biases in the people around me. Emily, you call this the bias blind spot.

[00:04:31]

What do you mean by the term?

[00:04:34]

The reason why I came to call it a bias blind spot is that a blind spot refers to a situation where you can see something sort of all around you except in one place.

[00:04:43]

Mm hmm.

[00:04:43]

And so the blind spot is for seeing the bias in yourself, because it turns out that people could readily recognize these biases all around them.

[00:04:53]

Let's look at some specific domains where the bias blind spot affects us. When it comes to ethics, we're all quick to see conflicts of interest in other people, but slow to see it when it comes to ourselves.

[00:05:05]

It's such a beautiful example. So, you know, doctors and gifts from the pharmaceutical industry. People have studied this, and doctors will say, "I'm not influenced by gifts." And oftentimes the gifts are small, right? Like you have a Pfizer pen or a pad of paper.

[00:05:22]

Sometimes the gifts are rather large, like, we'd love to have you come and give a talk on your research, you know, in the Caribbean, and we'll fly you over there in a private jet to give your talk. And I credit the medical industry because I think they have really worked on trying to root this out, because they recognized it as a problem. So there are no longer free lunches for residents every day sponsored by various drug companies, as far as I understand.

[00:05:45]

But the point being, the doctors said that they were not influenced by these gifts, but that other doctors were. So it's a perfect example of a conflict of interest not being recognized in self, but seen in others.

[00:05:58]

The bias blind spot also affects how we think we are affected by marketing versus how we think others are affected by it. I want to play you a clip from an ad that I recently came across.

[00:06:08]

GLH means great looking hair. Just spray GLH on, and it instantly covers your bald spot, leaving you with great looking hair. GLH is not a paint or a cover up. It's an amazing powder that clings to the tiniest hairs on your head. Order GLH now for only.....

[00:06:25]

Now, that was an ad for spray on hair.

[00:06:28]

Now, I don't think I'm influenced by advertising, whether that's commercial advertising or political advertising, but I think other people are quite vulnerable to such persuasion.

[00:06:39]

Yeah, there's a phenomenon called the third person effect whereby people think that persuasive attempts have more of an impact on other people than themselves. So they say commercials, you know, political ads, you know those things. I'm sort of immune to them. They don't influence me, whereas people recognize it's a whole industry, you know, it's influencing other people. But we see others as more susceptible to these influences than ourselves.

[00:07:01]

Yeah.

[00:07:02]

How does this work in politics when we evaluate our political opponents? How does this bias sort of play out in our evaluations, both of people on our side and people on the other side?

[00:07:12]

Yeah, so that's a great question. And obviously we've all been thinking about it a lot recently. So I think there are a lot of things going on. One is, what do I believe are the roots of my political opinions and political beliefs? And people will swear that the roots of their political beliefs are just in a rational analysis of the issue. Right. So I take the positions I do because those are the correct positions. If you analyze the issues, if you analyze the state of the country, if you think about what's best for the nation, these are the correct positions.

[00:07:42]

But they don't view that as being the root of the positions of those on the other side. The other side is influenced by ideology, by self-interest, by prejudice, whatever it is.

[00:07:54]

You know, I was thinking about a study that came out some time ago, during the Obama presidency. Gas prices were really high, and people were asking, you know, how much is the president responsible for high gas prices or low gas prices? And what was interesting is that the same group had asked the question of citizens during the presidency of George W. Bush, when gas prices were also high. And what's fascinating, and perhaps unsurprising, is that when gas prices are high and there's a Republican in the White House, most Republicans think the president has very little control over gas prices and therefore should not be blamed for them.

[00:08:29]

And Democrats think the president has a lot of control over gas prices and should be blamed for them. And the tables are exactly turned when you have a Democrat in the White House. So in both cases, people, in a very self-interested way, see the data that they have and interpret it in a way that aligns with their political beliefs. And of course, this is just one example. There must be hundreds of examples like this. That's right.

[00:08:51]

And what's amazing is it's motivated reasoning. And both of those words are important. Right. So it's motivated. I'm seeing things in a way that's consistent, right, with my motives, my prior beliefs.

[00:09:02]

But it's also reasoning, because they're not just saying, well, I'm just going to believe whatever makes my side look better, done. Right? That would be just pure motivation.

[00:09:12]

There's reasoning going on. So people are actually stopping to think, OK, well, what are the factors that influence gas prices, and who might be responsible for it, and what's going on on the global political stage? And so if you reason it out, these things are so complex that you can find reasons for almost anything. Mm hmm. Or at least for one of the two sides of the issue.

[00:09:32]

Anyway, there is another curious dimension of the bias blind spot. When we come up with positions on various issues, we're keenly aware of the nuances and subtleties of our opinions, but we don't extend the same respect to the views of our opponents. We very rarely say the views of the people who disagree with me are thoughtful and nuanced, right?

[00:09:55]

I think that's right. I think that there is more of a tendency to sort of stereotype and caricaturize

[00:10:01]

others and to recognize the nuance and complexity in our own views. And unfortunately, the political realm, I think, makes that even more likely, because people can't really express their own ambivalences and nuances. Right. Because that's seen as sort of giving in to the other side. So people do tend to portray themselves as more clear, and perhaps even more extreme in that respect as a result.

[00:10:29]

We've known for a long time that our evaluations of ourselves are very different than our evaluations of others. The Bible asks why we notice a speck of dust in our brother's eye, but ignore the beam sticking out of our own eye. When we come back, the psychological quirk that produces the radically different judgments we make of ourselves and others. This is Hidden Brain, I'm Shankar Vedantam. All of us find it remarkably easy to identify bias among other people, especially our opponents and all of us find it maddeningly difficult to spot biases in ourselves.

[00:11:11]

Psychologist Emily Pronin has spent years studying this discrepancy in our perceptions, and she has found that much of it comes down to the different yardsticks we use in judging ourselves and others.

[00:11:24]

Emily, I'm not sure if you have watched the television show Veep, but on the show, there's a character named Jonah Ryan who decides to run for president. He's been advised that it's a bad look to be single. So he gets together with a woman who happens to be the daughter of a man his mom used to be married to. So she is his stepsister.

[00:11:43]

In an interview, Jonah prefers to think of his fiancee as only his former stepsister.

[00:11:49]

So what would you say to someone who might ask, how can they marry their step siblings?

[00:11:55]

I'm not her brother, nor have I ever been her brother.

[00:11:58]

Right. And the only time anyone could ever say that would be for that one year.

[00:12:04]

I mean, it's exactly what Woody Allen did. And nobody thinks he's weird. I mean, everybody just hates him because Antz wasn't as good as A Bug's Life. Exactly.

[00:12:11]

Yes. So this is obviously a comedy show, Emily. But I'm wondering if you can start by explaining, when it comes to our judgments of other people, what is the yardstick that we use to evaluate whether they are biased?

[00:12:23]

The yardstick that we use is, in one word, behavior. Their actions.

[00:12:30]

The process is, it's almost like if you imagine a fork in the road and it just goes two different ways. There are just two different paths here. There's the path that we use for self-judgment and there's the path that we use for judging others. And in my view, the path that we use for judging others is we look at their actions; the path that we use for judging ourselves is we look inwards. And when I say look inwards, I mean we look to things like our thoughts, feelings, intentions, motives.

[00:12:58]

So if the question is, did I marry my brother, you know, there's an action, there's a behavior. Right? I did it or I didn't. And that's how other people will judge it. But in judging myself, I might look much more to my motives and my intentions. Am I someone who would intend to marry their brother? No, that's weird. I would never intend to do that.

[00:13:17]

So I guess I didn't do it. And so when we are interacting with other people, what we see is them, their actions. We see their expressions. When we experience ourselves, we don't really see that. Instead, what we perceive is what's inside our heads. That's the information that we're flooded with, the information that we can't escape: our thoughts and feelings and intentions. So that's what we give so much weight to.

[00:13:48]

So one of the things that jumps out at me from what you're saying, Emily, is that our introspection, our access to our own thoughts and feelings, these are with us all the time.

[00:13:57]

So it's almost as if we don't actually have to ask ourselves the question, how do I evaluate myself? We automatically go to looking inward, to our thoughts and feelings. When it comes to our evaluations of other people, in some ways we don't have access to their thoughts and feelings. Those are hidden from us. And so we use what we have. And on the surface, this happens without any sort of conscious awareness that it's happening. Right. So I don't realize I'm using one yardstick to evaluate your behavior and a different yardstick to evaluate mine.

[00:14:26]

Right. I don't think that we really think about that explicitly. What we use in any judgment, psychologists can tell you, is the information that's salient. Psychologists like that term, salient. The information that's available to us, whatever is fresh in our brains, is the information we use. And so it just so happens that for the self, the information that's fresh in our brains all the time is that stuff that we perceive to be in our brains, right, our thoughts and feelings, all that stuff that's sort of just constantly there and that we're sort of constantly aware of.

[00:15:02]

It's not just that we use different yardsticks in evaluating ourselves and others, each of those yardsticks is flawed and flawed in a different way. When it comes to evaluating our own behavior through introspection, we imagine that we can see all our motives and intentions, that they are accessible to us. But it turns out that is not the case.

[00:15:25]

There's a bunch of stuff that goes on in our brain that we're not aware of, right? We're not aware of the sources of our beliefs. We're not aware of, you know, if I go to the ice cream shop and I choose the chocolate ice cream over the vanilla, I am aware that that was my choice. But I'm not aware why that was my choice. That's happening in the brain without my having access to it. But we sometimes forget that.

[00:15:48]

So we think that we can look inwards and find out everything. Right?

[00:15:53]

So we forget, for example, that a lot of prejudice and stereotyping happens unconsciously. And that means I can't look inwards to find it. I'm not necessarily going to have those racist intentions. That's something that people talk about a lot now.

[00:16:10]

So it's not the case that just because we have access to all this information in our heads, that means it's always going to be sufficient for making whatever judgment we need to make.

[00:16:22]

Hmm. I remember speaking some years ago with the researcher Michael Tesler. He ran an interesting experiment with Republicans and Democrats. And this was back when the country was debating the Affordable Care Act, or Obamacare. And what Michael Tesler did was he presented volunteers with details of the Affordable Care Act.

[00:16:39]

But he told some of them that the plan had been put forward by President Bill Clinton, a Democratic president who was white. And he told other volunteers the plan was from Barack Obama, a Democratic president who was black. So same plan, same details, both put forward by a Democratic president, except that one president was white and one was black.

[00:16:58]

And what he found was that both liberals and conservatives were subtly biased by their feelings about the racial identity of the president.

[00:17:07]

White racial liberals become more supportive of a policy when it's framed as Barack Obama's than when it's framed as Bill Clinton's. But white racial conservatives become less supportive of that policy.

[00:17:20]

And Emily, I feel like this is speaking to what you just said. If you ask liberals and conservatives, how are you evaluating this policy, they will dive into the details and say, here's why I like the policy or here's why I don't like the policy.

[00:17:31]

And neither will say my affinity or my aversion to someone from a different race might be shaping my view on something like the Affordable Care Act.

[00:17:41]

Yeah, I think that's exactly right. And it sounds like a great study. And it's not that the subjects were lying. They were saying what they believed to be the case. They assumed, incorrectly, that if the race of the president had impacted their judgment, they would know it.

[00:17:58]

Now, an outsider might be able to notice this pattern much more quickly, because they wouldn't be relying on their intentions. They'd just be looking at what the person did. Sometimes when we look at behavior, things can be a little bit easier to see.

[00:18:12]

Yeah. Some years ago, Emily, you came up with a theory of why our introspections are unreliable. You called it the introspection illusion.

[00:18:21]

What is the introspection illusion? Well, we have access to our introspections. This is sort of what it means to be a conscious person: you know your thoughts and your feelings and your motives and your intentions, and they're there all the time in your head. But we have some illusions about what that can do for us. So we think that it gives us sort of supreme self-knowledge, that we can know all sorts of things about ourselves because we have access to this information.

[00:18:53]

We also think that our behavior is less important than knowing what's inside of our heads. In the case of ourselves, it's our intentions that are so important to know.

[00:19:08]

You know, I was speaking some years ago with Mahzarin Banaji, the psychologist at Harvard, and she said something really interesting to me. She said, you know, if you have a problem with your heart, you might go to a cardiologist to get it checked out. And when the cardiologist says, here's what's wrong with your heart, you're inclined to believe her, because you think the cardiologist knows more than you do about your heart. You don't tell the cardiologist, it's my heart, therefore I must be the expert on my heart because it belongs to me.

[00:19:33]

But Mahzarin Banaji was saying the same thing doesn't happen with our minds. It's very hard when an expert comes along and says, let me explain to you how your mind works, because at some level all of us feel like we are experts in our own minds. And that's partly, I think, connected to what you're calling the introspection illusion. Our mental worlds are so rich, and we spend so much time in them, that it feels in some ways like we understand how they work.

[00:19:58]

And in some ways that could be an illusion.

[00:20:00]

That's right. And look, it's embedded in the history of our own field. You know, in the very early days of psychology, when people wanted to understand the mind, they had people come into the laboratory, sit in a room, and introspect. And they said, this is how we're going to learn how the mind works. And then, you know, there was a huge backlash. The behaviorists came along and they said, this is nonsense. We cannot learn how the mind works by asking people to report to us what's going on in their minds.

[00:20:23]

So they said, we're getting rid of all of that. So we're just going to focus on behavior because that is observable.

[00:20:29]

And then, you know, we had sort of a third wave of cognitive psychology, where we realized that there were objective strategies, empirical methods, that we could use to study the mind that did not rely on people telling us what was going on in their own minds.

[00:20:43]

Mm hmm. And I think part of this also rests on the idea that if everything that happens in our minds were actually accessible to conscious introspection, we might, if we were very honest and very diligent, be able to look inside our minds and see everything. But in fact, if much of our minds is operating outside of our conscious awareness, then what our minds are doing is simply not accessible to us through introspection.

[00:21:07]

Right, exactly. You brought up the halo effect when we were talking earlier. And the famous experiment on the halo effect comes from Tim Wilson and Richard Nisbett, back in 1977.

[00:21:19]

And they had people watch a video in which a professor with a, quote unquote, foreign accent (I'm not sure what his accent was) was talking. And then they asked subjects in the experiment to evaluate this professor. Half of the subjects had seen the professor in the video acting cold, not very likable, and the other half had seen him acting warm, very likable.

[00:21:41]

And afterwards they asked the subjects what they thought of him. What people said in the unlikable professor condition, when they were asked about his accent, was that they didn't like his accent. In the likable condition, they did like his accent.

[00:21:57]

So what happened was the likeability of the professor, which was manipulated by the experimenters, influenced how much people thought the accent was likable, but they didn't realize that this had happened at all.

[00:22:08]

They had no access to what had influenced their perception of the accent. Now, we knew as experimenters, right, because we saw that those who got the unlikable professor thought it was a bad accent and those who got the likable professor thought it was a good accent. So the experimenters could say, gee, we know how they came to this conclusion, but the subjects didn't know that.

[00:22:25]

The subjects just looked inwards and said, that's a bunch of hooey. Why would I evaluate how likable someone's accent is based on how nice they were? That makes no sense. They had no awareness of having done that. It occurred unconsciously, and they denied it.

[00:22:37]

And you can see the same thing played out on a much larger scale when it comes to politics. For example, we've just been through a presidential election. And, of course, you know, 90 percent of Republicans voted for the Republican candidate, and 90 percent of Democrats or more voted for the Democratic candidate.

[00:22:53]

But if you ask any one of those people, why did you vote for the candidate, they will give you a whole bunch of very nuanced reasons why they thought this candidate was better than the other candidate, whereas somebody who was, you know, a neutral observer from Mars might come along and say, well, people are just simply voting for whoever their party's candidate is. It doesn't really make a difference who that candidate is. And in some ways, I feel like that's an extension of what you were just describing.

[00:23:15]

There might be an underlying reason we are doing something, but once we do it, we in some ways go searching for explanations after our actions are over, to justify how it is we arrived at those actions in the first place.

[00:23:27]

That's right. We think we're being rational, that we're choosing the candidate based on rational decision making and rational analysis. But really what we're doing is rationalizing. Actually, there are other factors that have determined which candidate we prefer. And then, after the fact, we rationalize it by coming up with what seem like rational reasons.

[00:23:48]

And it feels like our evaluations of ourselves and others, you know, are shaped by these dual forces.

[00:23:53]

On the one hand, we ascribe greater weight to our own introspections than maybe we should, but on the other, we discount the introspections of other people. So in other words, I don't think that my political opponents have actually thought very carefully about how they've chosen their course of action. I can sort of dismiss them as being easily led, as being sheep. Even as we overvalue our own introspections, we undervalue the internal thought processes of other people.

[00:24:19]

Yes. And we've even done experiments where we say, look, maybe the reason why people undervalue others' thought processes is that they just don't have access to them. Right. You know, you have such rich access to what's going on in your head.

[00:24:31]

So we will give people an entire think-aloud protocol, meaning that before the subject made their decision, they thought aloud into a tape recorder. They just dumped all their thoughts, and we'll give that to another subject to hear, and they still disregard it. So we actually did a study with political beliefs, and this was with Jonah Berger and Sarah Molouki.

[00:24:54]

We made up various California propositions. One was about increasing the maximum cargo size at the Port of Los Angeles. And then we told people their party's position on them, so, the Democrats support this, or the Republicans support this, and we asked them to choose their position, to vote, essentially. And we asked them, were they influenced by their party's position, and the subjects said, no, I wasn't influenced by that.

[00:25:20]

I just evaluated the issue. Before they said what position they would take, we had them list all their thoughts. They dumped out all of their thoughts on the issue, and then we gave that to another person to also evaluate and the other person didn't care about any of that. So people said, I went with my thoughts, I evaluated the issue. But the outsider said, oh, gee, I don't need to see all those thoughts. That's not relevant.

[00:25:44]

You're a Democrat, you went with the Democratic position. Done, simple.

[00:25:47]

One of the other ideas connected to your work on how we overvalue the things happening in our own minds and pay less attention to the things happening in other people's minds is a phenomenon called naive realism. Can you talk about what that is and how it connects to your work?

[00:26:03]

So much of what we're talking about today, I think, is really rooted in the basic functioning of our brains. It's just how we are designed. Right. So, for example, we have eyes in our head and those lead us to see other people's behaviors. But the eyes don't look inwards. And there's just sort of basic brain architecture that determines so much.

[00:26:22]

And naive realism has to do with this. It has to do with the idea that there are some basic and inescapable beliefs. And one is that I believe that I see the world as it is, in objective reality. And as a result, I think that others will see the world the same as I do, and that when others don't, I have to explain it. And the way that I tend to explain it is either by saying that they don't understand, I need to educate them, or, failing that, saying there's something wrong with them.

[00:26:49]

Either they're stupid or they're biased.

[00:26:56]

You can see naive realism at work in everyday interactions. Take, for example, something that the comedian George Carlin observed.

[00:27:03]

Have you ever noticed when you're driving that anyone who's driving slower than you is an idiot and anyone driving faster than you is a maniac?

[00:27:15]

Yes, I love that quote. It reminds me of one time I was in the kitchen preparing some food, and my seven-year-old was in the playroom with my father, his grandfather, and they were looking at different cars in a magazine. And my son kept preferring the big SUVs, you know, and my father preferred the little boxy type cars.

[00:27:35]

And at some point, my father said to him, I know it's a matter of taste, but your taste is stupid.

[00:27:47]

That's a great story. That's a wonderful story. And I feel it speaks to something that I think is really important. You know, parents and teachers are constantly trying to teach this lesson: don't jump to conclusions, slow down, don't assume you know what's happening in someone else's head. And yet it's so hard to remember to practice these lessons, right? I mean, as a parent and as a teacher yourself, do you sometimes go, you know, I'm doing the exact same thing I tell my kids not to do?

[00:28:14]

Yes, because these things are so automatic and so natural. These are tendencies that we have to override in ourselves; we can't eliminate them. Right. Because the tendency to think you see the world as it is, in objective reality, and therefore, if you like the race car better than the SUV, that it truly is better, that tendency is sort of inescapable. And when children are young, they don't even realize that there is a distinction between their perception and reality.

[00:28:43]

As we get older, we come to recognize. Right. Oh, wait a second. That's a matter of taste, right?

[00:28:49]

At some level, we come to realize, oh, no, no, there are different perspectives, and it's a matter of taste. But that initial belief that we have from childhood, that my perception is reality, doesn't really go away. And so it does actually feel like the car we prefer is the better one.

[00:29:10]

There are lots of implications that stem from what Emily calls this basic architecture of the brain. Here is one that should be familiar to all of us. If I'm late for a meeting, my mind is chock full of all the reasons I'm late. Traffic was terrible. I had a childcare crisis. And so on. But if someone else is late for a meeting, I don't have access to all that stuff happening inside their head. It's easy for me to think of them as just being irresponsible or careless.

[00:29:37]

Psychologists call this the fundamental attribution error. Another implication of this work has to do with the phenomenon of magical thinking. Magical thinking involves the idea that our thoughts could somehow influence the world around us. So, for example, if I think ill thoughts about you, can that give you a headache? Or if I think positive thoughts about my favorite player on the team, will that help them score a goal? And we found that, in fact, this was the case.

[00:30:09]

So, for example, we had people think evil thoughts about someone else in the experiment. We said we're interested in whether you could place a hex on the person and then they stuck pins in a voodoo doll. And then the other person reported a headache because they worked for us and we told them to. And what happened was if you were told to think ill about the other person before putting those pins in, you were told, just take a minute and think of something terrible.

[00:30:33]

Just think of the worst thing you can imagine happening to this person. And then you stuck the pins in the doll. Then you felt like you caused the headache, and you felt bad. And we found the same thing with basketball. Before a big university basketball game, we had people think about the different players and think about how each one of them could contribute to the game and how they would help their team score well. And then we asked them after the game how much impact they felt their thoughts had had on the score of the game.

[00:31:00]

So they thought that they'd impacted the game when they had thought about the players doing well. And the way it's related is that it again involves putting too much weight on what's going on inside our heads, because we're basically saying that what's going on inside my head could give someone else a headache, or that what's going on inside my head when I sit in the stands at a basketball game could influence the players' score, how many baskets they shot.

[00:31:26]

I mean, if it's a critical moment in the game, I would feel terrible getting up to leave the room and get some popcorn. I feel like I can't let my team down. How could I do that to them?

[00:31:38]

The introspection illusion, the bias blind spot and naive realism have profound consequences in our daily lives. They do more than shape our thinking at basketball games. They shape life-and-death decisions and choices to go to war. That's when we come back.

[00:32:11]

This is Hidden Brain. I'm Shankar Vedantam. Psychologist Emily Pronin has found we judge ourselves very differently from the way we judge others. This is because we use different yardsticks for those two things. We evaluate others based on their behavior, but we evaluate our own actions using introspection. And it turns out introspection is not a reliable guide to understanding our own minds.

[00:32:35]

Emily, I want to talk about some of the implications of this work and the ways in which it plays out in the real world. And I want to start with an example of something that can seem trivial, but that produces widespread conflict across the United States. I'm going to let Whoopi Goldberg explain.

[00:32:51]

A survey found that one of the most common arguments this time of year in households across America is what temperature to set the thermostat to.

[00:33:02]

And everybody relates. What? I mean, like, oh, yeah. There's a lot of stuff we should be talking about, because it's on the list. But this interests me because I feel like everybody deals with this.

[00:33:18]

I want to draw attention to the fact that this is a topic which, when you bring it up, almost everyone has an opinion about, and the opinion is often heated. Tell me how it connects to the conversation we're having about how we think about our minds, other people's minds and the judgments we arrive at.

[00:33:34]

I just love the idea of the thermostat wars. It's just so real. And I think it goes back to that quote.

[00:33:41]

I know it's a matter of taste, but your taste is stupid because essentially, if I think that the temperature should be set to 73 and you think it should be set to 68, I do realize that this is a matter of taste, right? That there's no right answer here.

[00:33:56]

But at another level, I actually think that the temperature that I want it to be at is the correct one. And it's like that George Carlin quote, right.

[00:34:05]

If you want it to be hotter than me, you're a little soft and ridiculous, you know, and you're wasting a lot of energy. If you want it to be colder than me, you've got to be kidding. Do you really need to be that ascetic and suffer like that? We could turn up the heat a little bit more. Right. So we think it's a matter of taste, but we also think we're right.

[00:34:21]

And I've done some research with Nate Cheek and Shane Blackman, where we showed this with paintings. People say, oh, paintings, that's a matter of taste, until someone disagrees with them about which are the nice paintings and which are the bad paintings.

[00:34:36]

And then all of a sudden they say that that person is wrong.

[00:34:40]

Tell me a little bit more about the study. I'm fascinated by the examples that you used and what you found.

[00:34:46]

So Shane collected some images of paintings from art history books. These were paintings by famous artists, they were in major museums, and they ranged from abstract to portraits. And we would show them to a subject and ask them to rate which ones they thought were truly great and which ones they thought were overrated. And then, as a cover story, we showed them the ratings of another subject who had supposedly done the same task.

[00:35:15]

But that subject totally disagreed with them. So if I thought a painting was great, they thought it was overrated, and vice versa. And then they had to evaluate this other person. And although they said that opinions on art were a matter of taste, when they saw this person who disagreed with them, they actually thought that the person was wrong and had been swayed by improper influences, because otherwise the person surely would have agreed with them.

[00:35:40]

And we've done studies with chefs. Kobi Szarkowski, my student, did a study with chefs, and chefs showed this exact phenomenon. Right? There's an objectively correct amount that the meat should be cooked, how long the pasta should be cooked, you know, how much it should be salted. And those who do it the other way are wrong.

[00:36:05]

I want to point to something that you said earlier that I think might connect with this, which is that in some ways, when we think about our own subjective conclusions, when we think about a painting or how long pasta should be cooked, we're actually not thinking of this as being subjective. It genuinely feels as if we have amassed a whole bunch of objective data and arrived at this conclusion that in some ways feels subjective. So, you know, we might say, yes, my taste in art and music is subjective, but it actually feels like it's not, that it's actually objective.

[00:36:34]

And part of it is that when things come to us through the senses, they come so quickly that we do not feel the operation of the mind being involved. I know that I prefer the chocolate ice cream to the mint chip ice cream, but if you ask me why, I simply don't have access to that. And so it doesn't feel like there have been all these intervening processes that could have biased it. So when the piece of food hits my mouth and I think that it has too much salt, I don't have access to any brain processes that are influencing that judgment.

[00:37:06]

It's just, yeah, that's too salty. And it's an immediate feeling.

[00:37:10]

And because it is so immediate, it's hard to imagine that it could have been biased by anything.

[00:37:16]

I'm wondering if this is connected in some ways, Emily, to other work that you have done that looks at how we perform during interviews, but also how we judge other people during interviews.

[00:37:26]

So we all think that we can sit before someone for half an hour, talk to them and get a pretty good sense of whether this person is a good fit for a job.

[00:37:33]

But if someone were to come along and say, oh, we can talk to you for half an hour and figure out if you are a good fit for the job, we say that's clearly inadequate because I'm so much more complex than anything that can be ascertained in 30 minutes.

[00:37:48]

Right. So there's a term, the interview illusion, which I did not coin. And it was about this idea that it's an illusion that you can tell so much from an interview. If I want to know whether you're going to be a good bricklayer, it's probably a lot more valuable for me to watch you lay bricks and for me to ask your five prior bosses how well you laid bricks than for me to sit down and interview you about how good of a bricklayer you are.

[00:38:09]

And yet people love interviews. Even psychologists, we do job interviews. Why do we do that? We could just ask the people who've worked with the person, we could ask their advisers to write letters, and we could read their work. But we do interviews as well, and we put a lot of weight on them. And in the work that I did, we actually had people come into the laboratory in pairs. These were students who'd never met each other, and they talked to each other for half an hour.

[00:38:34]

And we found that at the end of the half hour, they felt like they'd really come to know the other person, but that the other person had not really come to know them. So you can only get a small understanding, a small glimpse, of who I am from that conversation. But I've got the whole you. And part of that has to do with the fact that I know all the stuff about me that you didn't find out from that conversation.

[00:38:54]

I'm aware of all the stuff I didn't say, all the stuff I said that maybe was misleading about who I really am. I've got all that, but I don't have all that about you.

[00:39:08]

You can see how these biases might play out in the context not just of interpersonal conflict, but geopolitical conflict, if you think you see the world accurately and I don't. If you try to set me straight and find you can't change my views, what are you to conclude? The simplest explanation is that I can't be trusted. There's no point trying to understand me or reason with me or negotiate with me, because I must be either stupid or evil.

[00:39:36]

It's not just people's actions that influence how we want to respond to them, it's also our beliefs about what those actions stem from.

[00:39:43]

And if we believe that individuals are biased, that their mental processes are biased, then we don't believe that it makes sense to try to reason with them.

[00:39:57]

Is there any evidence that teaching people about the ways in which our minds work, that it actually changes the way they can actually perceive the conflict and perhaps respond differently?

[00:40:06]

When people learn about these different biases, they're initially very optimistic that what we need to do is educate people about the biases. So if I just tell people, like my students, here are the different biases that people engage in, that should solve the problem. They'll say, gee, I didn't know about all those biases, and the idea is, now that I know about them, I won't do them. But as you know, that's not how it works, because what happens is they say, gee, I didn't have words for all those biases, but now that you've told me the words, you've given me a great vocabulary for describing what all the people around me keep doing.

[00:40:38]

So that doesn't work.

[00:40:42]

But what Matthew Kugler and I tried was instead to educate people about the importance of unconscious processes. And we taught people about how a lot of our judgments are rooted in things we don't have access to, so that a lot of things are automatic and a lot of things are biased. And so we tried to educate them about essentially the introspection illusion, the illusion that we could have access to all these things and the fact that instead much of it is occurring automatically and is biased.

[00:41:12]

And then we asked people to complete our usual bias blind spot measure where they read about various biases and then people no longer showed a bias blind spot. So once they understood about the operation of the unconscious and how these things happen automatically, they no longer claimed to be less biased than others. So then they said, gee, maybe I am biased, maybe looking inwards and not seeing bias is not the best way to conclude whether I'm biased or not.

[00:41:43]

Can you give me a concrete example of a time when you used your own research to change how you thought about something important or to change your own behavior?

[00:41:52]

I don't know if I can give you a single important example, but I think that as a parent, I find it happening with me all the time. You know, I'm talking to my kids about someone in our lives who's done something that has, you know, irritated us in some way. You know, somebody canceled on a plan that we had or said something that was insensitive. And I find myself, you know, doing that thing, right, where I'm about to jump to the fundamental attribution error.

[00:42:20]

And I'm about to say, you know, gee, that was, you know, mean or inconsiderate or lazy or whatever. And then I've got my kids there with me.

[00:42:29]

Oh, this is not what I want to teach my children. And so I say, wait a second. You know, I know it might seem like the person was being inconsiderate, but maybe they were having a really hard day.

[00:42:39]

I try to sort of teach them and to remind myself to think about people's circumstances instead of jumping right away to that dispositional attribution.

[00:42:51]

You know, Emily, as I'm thinking about your work, I'm realizing that many miscommunications might happen because our thoughts seem so clear to us. But we do a terrible job communicating those thoughts to others. Things in our minds seem so clear and loom so large to us that we somehow assume they must be clear to others as well.

[00:43:10]

Yeah, it's interesting. I think of the example of romantic breakups. People want to be kind and they want to be considerate and they want to do it nicely, not always. And the other person is left totally confused and says, oh, you know, I think we're just taking a break for a few days, or, you know, we just hit a rough patch. And the one person thinks they have successfully broken up and ended the relationship. But we forget that our intention, what we're intending to do, is to break up, to close the door, but to do it in a very kind and considerate way. And what the other person thinks is that you've sent a bunch of mixed messages and you're leaving the door open.

[00:43:47]

And so this is just one of so many examples where we don't recognize our lack of transparency and what we haven't communicated, because it's so obvious to us.

[00:43:56]

I don't know if you're a fan of the show Parks and Recreation, but there was an incident on Parks and Rec along almost exactly the same lines, where one party is breaking up with another. But they break up so politely and so kindly that the other party thinks, great, we've had a wonderful chat, the relationship is now on a higher level than it was before. And one party thinks they've broken up, while the other party thinks, wow, we really are in a good place now.

[00:44:19]

So you're leaving soon? Back to Indianapolis briefly and then on to a town called Snarling Indiana for several months.

[00:44:28]

Never heard of it.

[00:44:29]

It's quite small. The cows outnumber the people. Four to one.

[00:44:32]

And then after Chris moves, Ann tracks him down, storms into his house and accuses him of cheating on her.

[00:44:38]

God, I'm so sorry, honey. I'm so embarrassed. I was scared that you were cheating on me and..

[00:44:47]

No, I'm not cheating on you. But I'm also not dating you. We broke up last week.

[00:44:57]

Yeah, I'm laughing, but it's actually sad. I mean, it actually causes a lot of suffering in reality. Yeah.

[00:45:01]

We've just been through a really bruising political year, Emily. And the country as a whole has been very divided. And, you know, a lot of people are really asking, is it possible for us to come together as a nation after, you know, a very bitter political fight?

[00:45:16]

I'm wondering if you were to give advice to the nation based on the work that you have done. What would that advice look and sound like?

[00:45:23]

I mean, I think one thing I would say is this: you judge yourself by all your positive intentions, right, and your good feelings. If your intentions are that you want the country to be in a better place, that you want people to thrive, don't assume that others' intentions are different from your own.

[00:45:50]

And if you put a lot of weight on your own intentions, then others' intentions deserve just that same amount of weight. So we owe some charity in judging others' behavior, by giving some weight to their intentions. And we should not assume that their intentions are so different from our own. And if we can start with that, start with the charity of trying to find the positive intentions in others, then maybe there is some hope. But it's so hard to do, especially when things are so divided and feel so divided.

[00:46:32]

Psychologist Emily Pronin teaches at Princeton University. Emily, thanks for joining me today on Hidden Brain. Thank you so much for having me. Hidden Brain is produced by Hidden Brain Media. Midroll Media is our exclusive advertising sales partner. Our production team includes Bridget McCarthy, Kristin Wong, Laura Kwerel, Ryan Katz, Autumn Barnes and Andrew Chadwick. Tara Boyle is our executive producer. I'm Hidden Brain's executive editor. Our unsung hero today is Max Linowitz. Max is an onboarding manager who works for the company that handles payroll for Hidden Brain Media.

[00:47:16]

Max worked closely with us as we were launching our company, and he patiently answered our many questions about how to pay employees and track things like vacation and sick time. His good cheer made a busy time feel less stressful.

[00:47:30]

Thank you, Max, for helping us find our feet. For more Hidden Brain, you can follow us on Facebook and Twitter, or go to hiddenbrain.org. If you liked today's show and you like our program, please be sure to share it with a friend. And if they don't know how to subscribe to a podcast, please show them. I'm Shankar Vedantam. See you next week.