[00:00:14]

Rationally Speaking is a presentation of New York City Skeptics, dedicated to promoting critical thinking, skeptical inquiry, and science education. For more information, please visit us at NYCSkeptics.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Julia Galef, and I'm in the office of today's guest, Professor Don Moore. Don is a professor at the University of California at Berkeley's Haas School of Business.

[00:00:52]

He's a professor of management of organizations, and one of the things he's best known for, and the way I first heard of him, is as the co-author of the excellent textbook Judgment in Managerial Decision Making with Max Bazerman. It's a great, very comprehensive intro to decision making and heuristics and biases and how they affect actual real-world decision-making. One of Don's areas of focus is overconfidence, and that's what we're going to be talking about in today's episode.

[00:01:25]

Don, welcome to Rationally Speaking. Thanks, I'm delighted to be with you. So, how confident are you that you are an expert on overconfidence? Only moderately. Excellent. Well, I strive toward good calibration in all of my confidence judgments. That's quite a motto. So before we dive into the meat of the episode: I think most people colloquially are used to thinking of overconfidence as basically thinking too highly of yourself, like having an overly high opinion of your skill or intelligence or charm or whatnot, or possibly overconfidence means being too confident that some endeavor of yours will succeed.

[00:02:10]

Is that what you and your subfield mean by the term? Or do you have some more precise academic meaning? You are correct in guessing that, like many academics, I strive toward precision in my definitions, and the colloquial definition sometimes gets in the way. People use the term confidence in all sorts of ways, many of which are not really amenable to an assessment of overconfidence. OK. In order to accuse someone of being overconfident, I need to be able to show exactly what it is they believe and exactly how that deviates from reality, or from what they would be justified in believing reality is.

[00:02:58]

That is an interesting distinction. The latter is more important, what they would be justified in thinking reality is, because there are circumstances (we might get into this, but this is already starting to split hairs a bit) in which rational people can believe false things. So selective information exposure can lead Bayesians to inaccurate beliefs under some circumstances. And there are some attempts to account for the evidence on overconfidence by relying on just such explanations. OK, well, we don't have to go quite down that route.

[00:03:36]

Sorry, I always want to be the one who splits hairs the most. Finally someone starts to one-up me. It sounds like a challenge. Yeah, I'll take it. One of the major contributions of my research to the study of overconfidence is to clarify the different approaches to it. The term overconfidence has been used in lots of different ways by lots of different scholars, and I try to clarify those different approaches. So one is overestimation: thinking that you're better than you are.

[00:04:08]

A second is overplacement: thinking that you place higher relative to others, in some percentile ranking, for instance, than you actually do. And the third is overprecision: believing that your knowledge is more accurate, more precise, more truthful than it actually is. Got it. So you could be overprecise in your belief that you're actually not very skilled. Yup. Every parent has seen their child make that error, saying: no, I can't. I shouldn't try it.

[00:04:39]

I'll fail. No way I'll succeed. Right. And you as a parent say: no, there's a chance that you could do it, and there's a chance you might love it. It's worth the risk, try it out. Right. Or there are some parents who say: you will definitely succeed, I know it. I tend to be well calibrated in the confidence judgments I transmit to my children. I would expect no less. So what are the relationships between those types of overconfidence?

[00:05:04]

Actually, let me guess before you tell me. OK, I don't actually know, but it would seem to me that overestimation and overplacement would both be related, because the better you think you are in absolute terms, the higher you think you should rank relative to other people. But it's not obvious to me that overprecision would be related to those two things, because of the examples we were just talking about. That's exactly right. Yes, overprecision is weakly related to the other two.

[00:05:29]

OK. And in any given task, overestimation and overplacement are strongly correlated. My research also examines some of the quirkier features of their relationship with each other, including the fact that across tasks, overestimation and overplacement are negatively correlated with one another.

[00:05:51]

Oh: on hard tasks, people overestimate their performance but think they're worse than others, and on easy tasks, people underestimate their performance but think they're better than others. Interesting. OK, I have a guess as to an example of where that comes up; I'm going to save it for later in the show. Before we conclude this disambiguation portion of the podcast, I want to ask about optimism, which I am using to mean thinking that some project of yours has a greater chance of success than you are justified in thinking it does.

[00:06:27]

How does that fit into that three-way taxonomy? It is an excellent question, and optimism has been studied a great deal. Perhaps the most famous scholars of optimism are Charles Carver and Mike Scheier, who have a scale that assesses the personality trait of optimism, and their usage of the term is actually not that far from the colloquial usage, where to be optimistic is just to believe that good things are going to happen. Optimism is distinctively about a forecast for the future and whether you think good things or bad things are going to happen to you.

[00:07:14]

So something that might colloquially count as optimism but wouldn't fall into that definition would be, say: I think people are basically good at heart. Yeah. Or: the world is better than it used to be. OK, yeah. Because in my attempt to quantify and assess overconfidence, there isn't an outcome there. People are good at heart: what does that mean? Right, right. So I guess you could try to extrapolate, but that would be your own extrapolation.

[00:07:43]

Right. OK. And interestingly, this trait of optimism seems very weakly related to actual specific measures of overconfidence. So when I asked Mike Scheier why his optimistic personality trait didn't correlate with any of my measures of overconfidence, he said: oh, I wouldn't expect it to. He wouldn't expect it to? Yeah. My reaction was, wait, what? I mean, if it doesn't correlate with any specific beliefs... Right. And I think it's hard to reconcile those in any sort of coherent or rational framework of beliefs.

[00:08:30]

But I have since had to concede that there is a real psychological phenomenology where you can have this free-floating positive expectation that doesn't commit you to any specific delusional beliefs. Yeah, this must... go ahead. I was just going to point out the irony of me studying overconfidence. My mother finds it deliciously ironic, because she regards me as a hopeless optimist, but at the same time mercilessly specific and self-critical when it comes to specific beliefs. So when she thinks of you as an optimist, she thinks of a generally sunny, enthusiastic disposition.

[00:09:28]

Right, one that doesn't necessarily cash out in concrete predictions. But when it comes to figuring out how to feel about something, at least in my mom's assessment, I do a really good job of putting my outcomes in life in the best possible light and feeling grateful for what I have, rather than wishing that I had more or that I had something else. And how to feel about your life seems to me to be quite a separate matter from assessing how likely it is you'll succeed, or whether your investments will increase in value, or whatever.

[00:10:07]

Right, right. In fact, I can think of two different people who would have the same probability assessment of whether they will get the things they want in life, or how likely their startup is to succeed, but who feel very differently about that probability assessment. So that is an important distinction to make. As one other side note, I could never tell, when someone self-labels as an optimist, whether they mean "I genuinely expect that things are going well, or are going to go well," or whether, as it sometimes really sounds, they mean "I see things as better than they actually are," which is a weirdly doublethink way to live.

[00:10:51]

And honestly, the only real interpretation of that, I think, can be as a sort of general affect, or a sort of flag that you're waving, as opposed to an actual belief, because as a belief it's incoherent to say "I think things are better than they are." You know, we have attempted to study these beliefs. Really? Cool. And you're right that a rational, coherent, integrated set of beliefs cannot hold these different pieces that seem to be at odds.

[00:11:21]

Our data, to my great consternation and perpetual bafflement, do seem to reflect ordinary people being comfortable holding both of these beliefs. And sometimes, when pressed, they will say things like: yes, I prefer optimism. As a rule, Americans enthusiastically endorse optimism, and they will even say: yes, I prefer to have more positive beliefs than an objective analysis would prescribe. But when pressed ("so you think self-delusion is the right way to live?"), they will say: no, no, no, I want to be optimistic and accurate.

[00:12:06]

Yeah, that's my reaction. And when you say, how is that possible, people will say things like: well, I want to be optimistic because it increases the chance that good things happen to me; my optimistic beliefs make themselves come true. When people say that their optimism makes them more likely to succeed, I have two responses. One is that I want to understand the magical processes that could actually lead to such a result.

[00:12:41]

And the second is: well, if it's actually going to be that good, then is it optimistic to believe that it's going to be that good? So, for instance, if I believe that I'm going to survive until tomorrow... Yep. ...then I believe that the better outcome is more likely. Is that optimism? Well, I might be on their side in this particular case, because self-fulfilling prophecies are just weird things, right?

[00:13:12]

Like, if they didn't call themselves optimistic, then in fact, according to their model of the world, they would not do as well. And therefore it's just one of those things where it turns into whatever you say it is. I think many people do have that model of the world, in which optimistic beliefs are more likely to result in positive outcomes. Some of that looks like magical thinking, like The Secret. The law of attraction.

[00:13:39]

Exactly. Oh, that drives me crazy. We've covered that on the show, you will not be shocked to learn. And there is a slightly more sensible version of those beliefs, wherein people believe that their own faith in a positive future might motivate them to take action that would produce positive outcomes.

[00:14:08]

Some of those beliefs are more plausible than others. Like: my positive outlook leads to enthusiastic, outgoing behavior that elicits more positive reactions from others. Right. Some of it is a little less convincing, along the lines of: well, if I believe I'm going to do well on the test, then that will lead me to study more. And people generally don't have a sensible response to the concern: well, what about circumstances in which persuading yourself that you're going to do fine undermines your motivation to prepare for undesirable outcomes?

[00:14:46]

Right. People don't have a response to that, and that's always what I want to ask about. When I do, they say, oh, yeah, well... That is a response, I guess. So I'm glad we're talking about this, because this comes up on the podcast again and again, most recently, perhaps, with Maria Konnikova. Yes, we've had her on the show twice, and she was touting the benefits of self-delusion for exactly this reason: that it affects your performance and therefore leads to better outcomes.

[00:15:20]

And Maria was touting self-delusion on the grounds that it would help. Yeah. Oh, I'm so disappointed. Well, you should check out the context to make sure that I'm not misrepresenting her. But there's a lot to unpack here. For context, I will just say that I used to use overconfidence as an example of a cognitive bias. When people would ask me what I do, I would talk about this organization: we run workshops and we try to help people notice and overcome cognitive biases.

[00:15:52]

And they would want an example, and I used to reach for overconfidence just because it was such a nice, clear example of a bias, so relevant to decision making, and there is actually research on how to overcome it. I've stopped using it as an example because I got so much pushback from people who just didn't think that there was such a thing as overconfidence. They didn't think there was such a thing as too much confidence; basically, the more confidence you have, the better,

[00:16:18]

and to be more successful, our goal should just be to find ways to boost our confidence, irrespective of its mapping to reality. I encounter those objections with some regularity myself. It may not be wholly irrelevant that we are both based in the Bay Area, land of moonshots and unicorns. To a bunch of those people I say: so you want to be more overconfident about everything? You should believe that your investments will go up infinitely tomorrow.

[00:16:50]

You should believe that you are immortal and can drive as fast or as recklessly as you want without subjecting yourself to any risk.

[00:16:57]

Well, I never tried the snarky response. I always tried the very gentle response, and it didn't work. Maybe a nicer person than I am would be more successful. So, one manifestation of this mindset that I think I've seen recently, and I want to see if you agree with this, is growth mindset. I don't want to talk about the specific thing that Carol Dweck, the researcher who coined this term, means by growth mindset, but rather the colloquial understanding of it, which is: instead of thinking to yourself, well, I'm not a good public speaker,

[00:17:34]

I guess I'm just not the kind of person who can do public speaking, and giving up, you should instead be thinking to yourself: I'm not a good public speaker yet, but it is a learnable skill, and I'm someone who can learn skills. And by adopting this growth mindset instead of a fixed mindset, you can actually improve at the skill. And I have to admit, I wonder whether encouraging people to have a growth mindset

[00:18:00]

as much as possible is actually encouraging overconfidence, because it seems to me to be an empirical question whether people can improve. I mean, to some extent, I suppose people can always improve. No, I don't know; there might be some skills that are just, you know, limited by your IQ or limited by something else. And what would you say to people who practice their skills at telekinesis? Oh, I hadn't even thought of that example.

[00:18:25]

Yeah, and they will fail. Right, right. So, only real skills. Yes, exactly. OK, let's restrict the question just to real skills for now, because I think that makes it a more interesting question. I agree. There is surely some fact of the matter about how difficult it would be for you to improve at a skill. Is growth mindset actually always a good thing, or is it only a good thing if you can actually improve pretty easily with practice?

[00:18:53]

I'm going to go with the latter. I think you should have well-calibrated, accurate beliefs about how much you can improve, how much effort it will take to accomplish it, and what the rewards are at the end. So, I loved your interview with Bob Frank, and I was so disappointed to hear him touting self-delusion as a recommendation for improvement, and so proud of you that you pushed back on him on that. Right, that was the more recent instance of this recurring debate on my show, after the Maria Konnikova one.

[00:19:24]

Yeah. My 11-year-old son loves basketball, is obsessed with basketball. I am so proud of his achievements and how much he has invested in improving. He's a pretty good player and he really enjoys it. And when he talks about his career in the NBA, I say: the odds of you playing in the NBA are vanishingly small. I think it's great you love basketball, and it's great that you work so hard to get better. It will provide a lifetime of pleasure and reward.

[00:20:00]

But you should study hard in school, because you are not going to play in the NBA. Yeah. I have been playing with a model recently where... let me back up. One way that my thinking about biases and heuristics has evolved over the last few years is that I used to think of them as relatively modular: we have a bunch of biases, and if you can improve any one particular bias, you're better off. Now I have a more interconnected model of biases, where, let's say, you have bias A and bias B.

[00:20:32]

If you just improve bias A, you might make yourself worse off, even though if you could improve bias A and bias B together, fix both of them, you would end up better off than you were when you had both biases. And I wonder whether overconfidence is kind of like bias A in my model. So I think we definitely have overconfidence as a species, but we also have other related biases, like narrow framing or loss aversion.

[00:21:06]

Or maybe a better way to describe this other related bias is that I think humans find it hard to motivate ourselves to act just based on an expected value calculation. So if you have two options, and option one is you could have five hundred dollars for sure, and option two is you have a one-in-ten shot at ten thousand dollars, option two actually has the higher expected value.

[00:21:31]

And it's especially a better choice if you're going to have the choice presented to you repeatedly over time; you should totally go with option two all the time. But it's just not that motivating to choose something that has a low probability of paying off, even if it's a high payout. And so if we just fix overconfidence (or optimism, I suppose, is what I'm pointing at) and we give people the ability to see clearly what their odds actually are, to see "oh, I have only a 10 percent shot at getting this reward," then maybe they actually won't take the bet, even though it's the best thing to do, because they can't abide the idea of doing something with only a ten percent chance of winning.
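The arithmetic in that example is worth pinning down. Here is a minimal sketch; the dollar amounts are the ones from the conversation, and the repeated-play simulation is purely illustrative:

```python
import random

def expected_value(outcomes):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

option_one = [(1.0, 500)]               # $500 for sure
option_two = [(0.1, 10_000), (0.9, 0)]  # one-in-ten shot at $10,000

print(expected_value(option_one))  # 500.0
print(expected_value(option_two))  # 1000.0, double option one's EV

# If the choice repeats, option two's average payoff converges on its EV,
# even though nine plays out of ten pay nothing at all.
rng = random.Random(0)
plays = 10_000
average = sum(10_000 if rng.random() < 0.1 else 0 for _ in range(plays)) / plays
print(average)  # settles near 1000 per play
```

The psychological sting described above is that the zero-payoff plays dominate any single trial, even though the long-run average clearly favors option two.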

[00:22:08]

So I'll stop there.

[00:22:10]

What do you think of this model? There are a number of attempts to account for the persistence of overconfidence in human judgment with explanations exactly like the one you've offered, where overconfidence is somehow the antidote to some other human frailty or weakness or fear.

[00:22:33]

I have a problem with recommending overconfidence as a treatment that way, or endorsing it as rational. Like, yeah, you want a machine that has this known flaw, because sometimes it helps cover up another known flaw? Yeah. Well, that's problematic for a few reasons. First of all, if you can fix the first flaw, then you wouldn't need the second. And it's hard to forecast all the ways in which flaw two, this being overconfidence, could get you into trouble in ways that flaw one isn't involved in.

[00:23:12]

So it's a little like you're writing code and you realize there's a bug, because you do a calculation and get an answer that's five too low, and instead of figuring out what the bug was, you just add five to everything at the end. Yes, and you'd really like to go back and fix the original bug. So, do people always have well-calibrated beliefs, and do they respond sensibly to uncertainty? No. But do I recommend overconfidence as a general treatment or correction for those errors?
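The coding analogy can be made literal. This is a toy sketch, with invented function names and numbers, of patching the symptom versus fixing the underlying off-by-one bug:

```python
def total_buggy(items):
    """Off-by-one bug: accidentally drops the last item."""
    return sum(items[:-1])

def total_patched(items):
    # Observed one answer that came out 5 too low, so just add 5 at the end.
    return total_buggy(items) + 5

def total_fixed(items):
    """Fix the actual bug instead of compensating for it."""
    return sum(items)

print(total_patched([10, 20, 5]))   # 35, matches the true total only by luck
print(total_patched([10, 20, 30]))  # 35, but the true total is 60
print(total_fixed([10, 20, 30]))    # 60
```

The compensating constant happens to work on the input that exposed the bug and silently fails everywhere else, which is the shape of the objection to using one bias to paper over another.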

[00:23:46]

Again, I can't recommend this as a general strategy for a rational individual. Yeah. I mean, I think the reason I'm interested in this model is not so much as a way to decide whether it's rational or irrational to be overconfident, because I think it's irrational, but more as a kind of Chesterton's fence type exercise. I don't know if you know that one. So there's this parable, introduced by the essayist G.K. Chesterton, where he imagines someone coming across a fence in the middle of a road somewhere and saying: well, I don't see why this fence is here.

[00:24:24]

Let's just take it down. And Chesterton says to this hypothetical man: no, go off and figure out why that fence was put there in the first place, and only once you understand why it's there will I maybe let you take it down, but not before. And this parable has been invoked whenever some sort of high-minded, well-intentioned reformer says: I don't see why we have such-and-such social conventions, I don't see why cities ended up organized the way they are.

[00:24:54]

This seems clearly inefficient or suboptimal; let's tear it down and build something new that makes sense. And the point I take from Chesterton's fence is not that we should never change anything. In this case, my conclusion is certainly not that the set of biases our brains evolved happens to be the most optimal arrangement of a brain; that would seem very coincidental to me, especially given that we don't live in the environment that we evolved in.

[00:25:23]

But the point is more that in our quest to improve the way we think, we want to notice what hidden benefits there were in the old system, so that we can make sure we don't lose them in the process of improving. I love that challenge. It is wonderfully inspiring and encouraging to any scholar who's interested in asking one more level of why. Why do we observe so much overconfidence? Accounting for its origins, psychologically and practically, is of enormous interest.

[00:26:02]

If, however, the challenge you've issued to me is to explain what in the ancestral environment could have favored the evolution of beings that are overconfident, then we get into some of the problems associated with all evolutionary psychology, namely the shortage in the fossil record of evidence on the situations, cultures, and environments in which humans evolved. We can speculate endlessly about ancestral environments that could possibly have favored overconfidence, and we will never have an answer. Sure. There was a distinction that you made, in one of your papers in this cluster of topics, between optimism in the period of deciding what to do, like deciding whether to take a risk, versus optimism in the implementation phase.

[00:27:10]

Can you talk a little bit about why the diagnosis might be different in those two cases? Yeah, this is a question that I confront with my students in the classroom when talking about the role of confidence in the execution of leadership, because everyone sees the important role that leaders play in instilling confidence in others. In any large organization, a leader has a role in marshalling resources, encouraging people to march in the same direction, to commit themselves to a cause, and confidence that the leader has picked the right direction is enormously useful in the service of that goal.

[00:28:02]

But that doesn't mean you want to recommend that the leader be delusional when assessing the risks associated with different courses of action, whether that be challenging the status quo in law enforcement, or deciding whether to enter some new market or found a new company. As the leader, you have got to assess the risks with clear-eyed insight before you commit the organization to taking some course of action. Once you've decided that's the way to go, you'd really like everybody marching to the beat of the same drummer in order to maximize the organization's chance of success. So you're explicitly not endorsing self-deception as a leader, but are you maybe endorsing other-deception?

[00:28:54]

That is a profoundly troubling question, which I must confront with every class when I bring this topic up, and I'll say no to the second, too, because a leader who attempts to lead by fooling his or her followers is practicing a profoundly unsustainable form of leadership. Unmasking their lies is an easy way to discredit them as a leader. So I think what I'm encouraging is more akin to strategic self-presentation, along the lines of the strategic choice one might make of an optimistic outlook that chooses the more favorable way to feel about a given set of facts.

[00:29:54]

This is sort of what we were talking about earlier in the conversation: having a kind of enthusiastic, all-in attitude independently of what your probability estimates are about success. Yes, I like that. Yeah, I've been looking for solutions in the space of ways to purchase outward confidence without spending epistemic hygiene. I guess that's exactly the tradeoff in confronting this question.

[00:30:20]

Research that I am doing with colleagues here, Cameron Anderson, and with Liz Tenney, who was a postdoc here and is now a professor at the University of Utah, looks at issues of confidence expression and the circumstances under which expressing confidence can help increase the faith that others have in you, or potentially get you into trouble if you express overconfidence. The expressions of confidence that are most likely to get people into trouble are specific ones, things like: I am 100 percent sure that we can accomplish this task successfully. Or, early in the Obama administration:

[00:31:04]

We will get unemployment down to five percent within a year. That sort of specific claim is falsifiable and can get you into trouble.

[00:31:14]

There are other ways of expressing confidence that are less likely to get a leader into trouble, because they aren't attached to specific claims. Cool. I look forward to reading that, because this is also something I confront a lot. One of the objections people have to addressing their overconfidence is that they worry it'll cost them socially. And I think it's a reasonable worry. I'm curious whether you think that overconfidence on the part of, say, entrepreneurs or angel investors might actually be a positive externality: that maybe it is in fact bad for the individual who's deciding to throw five years or a million dollars into some project that doesn't have much of a chance of success,

[00:32:03]

and if they were correctly calibrated, they would do something maybe slightly less glamorous but much more likely to succeed. But all of that collective overconfidence benefits the rest of us, assuming that the average new startup is positive for the world, which doesn't have to mean all of them are positive, just that every now and then you get a Google or an Uber. I guess I'm revealing myself as someone who thinks Uber is positive, which is a whole other episode.

[00:32:30]

But, you know, if all the entrepreneurs and angel investors were perfectly calibrated, we would just never have any of these ventures, or certainly we'd have fewer of them. I would agree wholeheartedly that that sort of confidence has a positive externality: as a nation and as a world, we have benefited enormously from inventors, entrepreneurs, and dreamers who take bold and irresponsible risks. They have created movements and companies that have benefited us all, and the US economy is so much more dynamic thanks to that sort of courage.

[00:33:13]

So do you feel like your work is good for society, if you're encouraging business school students in the Bay Area to be less overconfident? If it means they're less likely to throw away their life savings, to ruin their lives, to subject themselves to hardship, disappointment, and frustration, then yes, I can feel good about the advice I'm offering to my individual students. I also do so in the full knowledge that there will always be a ready supply of eager entrepreneurs who are delusional about their chances of success and ready to fill the schedule of any VC willing to talk to them.

[00:33:55]

Yeah, I don't disagree with that. Now that we're back on the subject of entrepreneurial overconfidence, which we sort of hinted at earlier in the conversation: do you want to delve into this seeming paradox in the relationship between overestimation and overplacement between tasks? Sure. Here we are getting into some of the hairsplitting that you invited me to explore earlier.

[00:34:20]

The explanation for the negative correlation between overestimation and overplacement across tasks, I think, can best be summarized by the simple and uncontroversial notion that people's knowledge of themselves (their performance, their ability, and their potential) is imperfect. So people will make errors. When the task is hard enough that everyone gets everything wrong, or everybody fails, then anyone who makes an error estimating performance can only overestimate it. When the task is so easy, or everyone is so capable, that everyone gets everything right or everyone succeeds, they can only underestimate it.

[00:35:02]

OK, so that's kind of like a regression to the mean effect.

[00:35:06]

Yeah, except it's more like regression to your prior, where you've gotten some signal of your performance, but it's noisy, so your prior combines with the signal, and as a result what you get is regressive estimates. On very hard tasks people overestimate; on very easy tasks people underestimate. That's been known for decades; it's the hard-easy effect in estimations of performance. It appears to be superficially contradicted by people's beliefs about placement, because on hard tests people think that they're worse than others, even though they have just overestimated their own performance.
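That "regression to your prior" account can be sketched numerically. Assuming, purely for illustration, a common prior of 50 percent, equal weighting of prior and signal, and a noisy signal of one's true score, the shrinkage produces exactly the hard-easy pattern:

```python
import random

def estimate(true_score, prior=0.5, weight=0.5, noise=0.1, rng=random):
    """Combine a noisy performance signal with the prior (simple shrinkage)."""
    signal = true_score + rng.uniform(-noise, noise)
    return weight * prior + (1 - weight) * signal

rng = random.Random(42)
n = 1_000
hard_true, easy_true = 0.2, 0.8  # most people score low on hard, high on easy

hard_est = sum(estimate(hard_true, rng=rng) for _ in range(n)) / n
easy_est = sum(estimate(easy_true, rng=rng) for _ in range(n)) / n

print(hard_est)  # about 0.35: above the true 0.2, overestimation on hard tasks
print(easy_est)  # about 0.65: below the true 0.8, underestimation on easy tasks
```

Any prior between the two true scores yields the same qualitative result: estimates are pulled toward the middle, so extreme tasks generate opposite-signed errors.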

[00:35:47]

And the way that happens is a natural consequence of the fact that even though people's knowledge of themselves is imperfect, their knowledge of others is even more imperfect. So, for instance, if I ask a group of my students to give themselves a percentile rank relative to others in the room on juggling ability, on average we will see that the mean is below the fiftieth percentile: on average, people think they're worse jugglers than others. They know they're not very good, but they think, maybe there are some jugglers in the room, and it's unlikely that everyone here is worse than me.

[00:36:21]

And so they wind up overestimating others' performance on this very hard task, and believing that they're worse than others when they're not. There are many other instances in life where people feel capable and think: oh, I'm pretty good at this, I know how to drive a car and I haven't had an accident in the last week, so maybe I'm better than others. And they wind up overplacing themselves. But the tasks that entrepreneurs gravitate to, which are especially difficult, like getting a successful company off the ground (I guess I don't know exactly what I'm comparing this to, it's sort of an undefined reference class), are those the kind of tasks that would lead to this sort of paradox?

[00:37:12]

Well, we do see that there is substantial variation by industry. So in industries where there are a lot of people who think, yeah, I know how to do that, most notably running a restaurant, you see much higher rates of entry, intense competition, and then much higher rates of failure as a result.

[00:37:35]

The other issue in entrepreneurship has to do with self-selection. So if you want to ask whether potential market entrants are biased in their beliefs, whether they are on average overconfident, that is a question that must necessarily include all potential entrants, everyone who could have gotten in and started a company. Well, there are an awful lot of people out there who look at the statistics and at the challenges associated with entrepreneurial entry and think, oh man, no way. I'm going to stick to my day job.

[00:38:08]

So you've got this huge selection effect where, in a nation of 300 million people, how many choose to get in and enter? You should expect an adverse selection problem where those who choose to enter are those who are most delusionally overconfident.

[00:38:26]

And it ought not then to be a surprise.

[00:38:29]

If we look at the lottery winners at the end, if we look at Bill Gates and ask, was he delusionally overconfident about his chances of success? Well, those who ultimately succeed are sampled from those who chose to get in, and they're all delusionally overconfident. So the fact that you're left with a set of people who are delusionally overconfident isn't all that informative. If you ask, well, wouldn't you like to be like Larry Ellison or Bill Gates?

[00:39:02]

Well, yeah, if we condition on their ultimate success. But if we condition on the set of people who had those delusional, optimistic beliefs and ask, well, was their attempt at entrepreneurship a positive expected value bet, the evidence is much less clear. Right. I think I read also in one of your papers that fully a third of entrepreneurs, when asked about their chances of success, will say: my chances are 100 percent. Yeah, this is the famous Cooper, Woo, and Dunkelberg result.

[00:39:36]

Yeah. I mean, I can only explain that as a sort of social signaling, or like steeling themselves, like trying to cultivate optimism instead of trying to take their best guess. Otherwise my mind just can't accept it, given the grim statistics on entrepreneurial failure. It is those people who have convinced themselves their chances are so great that they're going to beat the odds. That's who's left to run the risk of entry.
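The selection effect in this exchange can be made concrete with a toy simulation: everyone has the same true chance of success, beliefs about that chance are noisy, and only the most confident choose to enter. All of the numbers here are invented for illustration:

```python
import random

random.seed(1)

N = 300_000       # potential entrants in the population
TRUE_P = 0.10     # everyone's actual chance of success

entrants, winners = [], []
for _ in range(N):
    # each person's believed chance = true chance plus personal noise/bias
    belief = min(1.0, max(0.0, random.gauss(TRUE_P, 0.15)))
    if belief > 0.5:  # only the very confident quit the day job and enter
        entrants.append(belief)
        if random.random() < TRUE_P:  # success is luck, same odds for all
            winners.append(belief)

avg_entrant_belief = sum(entrants) / len(entrants)
avg_winner_belief = sum(winners) / len(winners)

print(f"{len(entrants)} of {N:,} people enter; "
      f"mean belief {avg_entrant_belief:.2f} vs true chance {TRUE_P}")
print(f"successes' mean belief {avg_winner_belief:.2f}: the winners were "
      f"just as overconfident as the failures")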

[00:40:05]

Yeah, we're almost out of time. But I want to take the last few minutes to invite you. If there are any open questions that you're particularly interested in regarding overconfidence or optimism could be something you have already studied or something that you want to study. What question would you like to see answered? Wow. Well, my research has really zeroed in recently on Overprovision, the third type of overconfidence, which we haven't talked about much. That is the excessive faith that you know the truth.

[00:40:35]

And it is by many measures the most robust form of overconfidence. It is rare that you find exceptions where people are under confident about the accuracy of their beliefs. And I am fascinated as to why. One answer that I find intriguing comes from a book which might actually be my rationally speaking pick.

[00:41:03]

You know how this works. Great. So what's the book? The book is called Being Wrong. OK, Great. It is by a brilliant and insightful journalist named Kathryn Schulz, and it is all about the ways in which we are too sure of ourselves are overconfident and they're wrong. And she offers the following beguiling explanation for how it is that we can consistently overestimate the accuracy of our knowledge. She says that we get used to being right about everything all the time, and her explanation is as follows.

[00:41:43]

It's really simple and leads to a profound and surprising conclusion. So stick with me. Most people believe what they believe because they believe it to be true with you so far.

[00:41:56]

So we hold the beliefs that we hold because we think that they're accurate about the world. As soon as we find out that something that we used to believe is not true in that instance. We go from believing anything to not believing it. So there's never a point for more than half a second at which we believe something that is that we know to be false. Yes, we know. Yeah. So she asked, what does it feel like to be wrong?

[00:42:24]

And her insightful answer is, it feels like being right, because in the instant that you realized you were wrong, then you abandoned that false belief and you think was nice, silly to believe that before now my beliefs are more correct than they were before. Now, what I believe is right amazing. I've noticed myself doing this on slightly lesser scales where I boast about how I'm I'm a really good judge of I don't know who has been abroad or who has some other particular demographic characteristic.

[00:42:59]

And if someone asked me for examples, I'll list off a bunch of examples as sort of proof that I'm good at this. And then when pressed, I'll realize, wait, I never actually confirmed that any of those examples I was right about, they just felt right to me. And so my brain stored them as confirmation of my detection skill. It does seem to be a tenacious prop.. You are. So I think this is a good point out of need to wrap up.

[00:43:26]

So let's move on at this point to the rationally speaking.

[00:43:45]

Welcome back. Every episode on rationally speaking, we invite our guest to introduce the pick of the episode that's a book or website or something that has influenced his or her thinking in some way. So, Don, what is your pick for today's episode? Well, I already mentioned Being Wrong by Kathryn Schultz. Its charming subtitle is Adventures Within the Margin of Error. That has a great subtitle. I'm envious that she thought of it first. I'm envious of her writing.

[00:44:12]

In many ways. She's brilliant and clever and inspiring. It is a close sibling and a nice complement to the other book that I would also highlight as a potential pick. And that is Bright cited by Barbara Ehrenreich in which she, in her own angry, cynical, wonderful way, takes on the American love of optimism. And in discussing her own cancer diagnosis and the advice she got from so many people that she had to stay positive confronts the perversities of an optimistic outlook and courageously entertains the possibility of rational, good calibration in one's judgments and beliefs.

[00:45:04]

I can only I can I can just imagine Barbara Ehrenreich face when people give her a cheerful and disconnected from reality pep talk about cancer. That's the vivid image. So would you say that everybody got blindsided? I guess that's the public being blindsided by bright side thinking. Would you say that it conflicts it all with your research? Is it or is it just a a sort of vivid concretization of what you discovered through through research? She writes more articulately and inspiringly than I can about conclusions that I think are deeply compatible with my research that note the dangers of self-delusion and the brilliant, joyful, inspiring and empowering consequences of well calibrated, irrational beliefs.

[00:46:00]

Excellent, perfect note on which to end, rationally speaking episode. Don, it's been a pleasure having you on the show. Thank you so much for doing this. Pleasure has been mine. This concludes another episode of Rationally Speaking. Join us next time for more explorations on the borderlands between reason and nonsense.

[00:46:22]

The rationally speaking podcast is presented by New York City skeptics for program notes, links, and to get involved in an online conversation about this and other episodes, please visit rationally speaking podcast Dog. This podcast is produced by Benny Pollack and recorded in the heart of Greenwich Village, New York. Our theme, Truth by Todd Rundgren, is used by permission. Thank you for listening.