[00:00:14]

Rationally Speaking is a presentation of New York City Skeptics, dedicated to promoting critical thinking, skeptical inquiry and science education. For more information, please visit us at nycskeptics.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Massimo Pigliucci, and with me, as always, is my co-host, Julia Galef. Julia, what are we going to talk about today? Massimo, our topic today is intuition.

[00:00:51]

Yes, we're going to talk about what intuition really is, what people mean when they talk about intuition, where our intuitions come from, whether intuition and reason are opposed, that is, whether they're alternative methods of drawing conclusions about the world or whether they're compatible in some way, and when intuition might be useful, if it can be useful, and when it can be particularly misleading.

[00:01:16]

So the word intuition comes from the Latin, as a lot of words do, and it means something like knowledge from within.

[00:01:25]

So which is, I think, actually a pretty good description of how most people think about intuition, that there's something inside us that tells us what to do.

[00:01:34]

Right. So I think that's a great umbrella definition, but it still leaves open the question about that knowledge from within: where did that knowledge come from? Right?

[00:01:42]

I mean, so the word is sometimes used to mean knowledge that you've sort of built up over time through practice and experience, which, you know, has all been sort of stored in some way in your brain, but which you don't have like conscious, deliberative access to.

[00:01:59]

So you might have an intuition about a situation being dangerous, even though you aren't consciously thinking, here are the reasons it's dangerous; you've just learned patterns and rules. Or, you know, an expert chess player might have an intuition that a certain move will be good.

[00:02:16]

And that's just something he's developed through years and years, thousands and thousands of games.

[00:02:20]

He's learned that certain patterns on the board tend to reward certain kinds of moves. So that kind of intuition actually can be quite reliable, because it's based on actual data, as long as it's being synthesized properly.

[00:02:34]

In a non-biased way. And that kind of intuition is domain specific.

[00:02:36]

That is domain specific. Yes.

[00:02:38]

And actually, you know, looking back, the first example I gave, of feeling like a situation might be dangerous without being able to articulate exactly why, that's a bit of a misleading example, I realize, because it sort of blurs into the second category of intuition, which is intuitions that come from some sort of innate or evolved, you know, predisposition. That's right, instincts and things. Yeah.

[00:03:01]

Which might be relevant in the current modern context at hand.

[00:03:05]

And it might not. You know, we might have an instinct that people who are attractive are more likely to be telling the truth.

[00:03:13]

But, you know, I don't know how that ever came about, whether even in evolutionary time that was ever true. Yeah.

[00:03:21]

So it's hard to look back and see what the roots were of things like this, because, you know, in the ancestral environment we weren't trying to convince each other of things the way we do now. So somehow our brains are trying to extrapolate the way they work from how they were evolved to work, and somehow that has resulted in us trusting attractive people more.

[00:03:39]

But anyway, the third main category of intuition that I've noticed, or a way of referring to intuition that I've noticed people use, is as a sort of supernatural sixth sense. Right. So, yes, it comes from within you, but the question of where that knowledge within you came from is just left as this sort of mysterious question mark. You know, it's just in there and you just know, and there's not any attempt made at an explanation of how you just know. Even if you're not consciously figuring something out, you would think there's still some reason that you're having the intuition you do, but that tends to be left intentionally blank.

[00:04:13]

All right.

[00:04:14]

So one thing that I think we should clear up right at the beginning is that there is actually research on intuition. So we're not just using our intuitions to talk about intuition; there has actually been quite a bit of research in cognitive science in the last several years.

[00:04:28]

And in fact, most of it, well, I shouldn't say most of it, but a good chunk of it actually does deal with things like chess playing, because that's clearly an activity that you can do deliberately, through explicit, rational thinking. But it's also clearly an area where, as you said, the more expert you become, the more you actually shift to sort of intuitive responses, heuristics that you developed by practice.

[00:04:56]

Yeah, I think they've done some experiments where they had expert chess players play something like 10 games in parallel.

[00:05:04]

So clearly they didn't have the cognitive resources to be devoted to each particular game, yet they still did really well.

[00:05:10]

The only explanation being that they were using these intuitive heuristics for what were good moves. An even more telling experiment about chess is the one in which chess players were first put in front of an actual situation on the chess board, meaning a situation that can actually occur in a game, and asked what to do. And they, of course, got it right most of the time, because they were, you know, familiar, if not with the specific situations, then with similar ones; they had developed sort of a sense of what would work under similar circumstances.

[00:05:42]

When you say a situation that could occur, you mean like an arrangement of pieces that you could end up arriving at if you started with the normal beginning and like made moves.

[00:05:50]

Exactly.

[00:05:50]

And then they were put in front of a situation that was actually completely random.

[00:05:55]

It was not actually a likely situation; it could not possibly occur under the normal conditions of a game. In which case their intuitions about what to do, even a chess master's intuitions about what to do, were just as good as any novice's. That's striking.

[00:06:10]

Yes, it was pretty striking. Now, those experiments, and others in other domains of expertise, such as, for instance, being a nurse or teaching mathematics (there is some research that has been done in these other domains of expertise), all seem to point out that there is certainly such a thing as intuition that is domain specific, but there isn't any such thing as being generally intuitive.

[00:06:38]

So when somebody says to you, I'm an intuitive person, that is nonsense. There's no such thing as an intuitive person.

[00:06:45]

Although the one caveat that one can possibly make about that is that there is, presumably, such a thing as being intuitive about social situations.

[00:06:55]

So when somebody says I'm an intuitive person, if they actually mean I have a good sense of social situations and how to react to or navigate them, then it's possible, because that is a domain of expertise, in some sense.

[00:07:11]

But being intuitive across the board makes no sense.

[00:07:13]

Yeah, I think that's probably one of the main misinterpretations of Blink by Malcolm Gladwell, the popular science book that was published a few years back. You know, I skimmed it.

[00:07:25]

I don't remember the particulars of the book, but I know that, at least based on the evidence that he gives in that book, the thesis, whether he makes it explicit or not, is that intuition can lead to better results if you, you know, have a lot of expertise in that particular domain.

[00:07:40]

Right.

[00:07:41]

But the way that it's generally quoted when people talk about it, you know, colloquially, is that relying on your intuition, you know, on sort of snap judgments, in general will lead to better outcomes than, you know, deliberative reasoning.

[00:07:53]

Yeah, and that is actually, for the most part, not the case. And in fact, again, there's quite an interesting literature about it.

[00:08:00]

Now, one of the things I wanted to point out is that apparently the first modern psychologist to think along the lines of intuition and rational decision making as two different but interacting processes was none other than the father of modern psychology, William James. So he actually distinguished basically two modes of cognition. Intuition, according to James, works in an associative manner. It feels effortless, so it's fast, and, you know, it works without the subject essentially realizing what's going on.

[00:08:37]

On the other hand, rational thinking is analytical.

[00:08:40]

It does require effort and it's much slower, but it tends to be more accurate. So it's easy to come up with examples of situations where, because you do not have enough time, you're under time constraints or other kinds of constraints, you actually do need to rely on intuition.

[00:09:00]

But it's also easy to come up with examples where intuition actually fails. One of the most common areas where intuitions fail is probability theory.

[00:09:09]

Right, right. Base rate neglect, I think, is one of my favorite examples.

[00:09:18]

The common example is that there's a test, say, for breast cancer. And breast cancer, you know, occurs in one out of every thousand women in the relevant population, and the test has a false positive rate of, say, five percent. So the question presented to subjects is: let's say a person tests positive for breast cancer. What is the chance that that person actually has the disease, knowing nothing else about the person's symptoms and signs and so on?

[00:09:46]

And so people know that the test is occasionally wrong.

[00:09:50]

So they say, well, you know, with an error rate of five percent, maybe there's, you know, like a 90 percent chance or an 85 percent chance that the person actually has breast cancer. The correct answer is two percent, so they're wildly off.

[00:10:03]

And the problem there is that people are not taking into account the base rate of breast cancer in the population. Breast cancer is so rare in the population, one in a thousand, that even if you get a positive result on this test, it's still much more likely that the test was wrong than that you actually have breast cancer. And so, yeah, this is a systematic bias that we have: we neglect the base rate and we overweight the recent evidence, or, you know, the case-specific evidence.
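A minimal sketch of that calculation via Bayes' theorem, using the numbers from the example above; the test's sensitivity is an assumption, since the classic version of the problem treats the test as catching essentially every true case:

```python
# Base rate neglect: P(cancer | positive test) via Bayes' theorem.
# Numbers from the example above; sensitivity is an assumption --
# the classic version of the problem treats the test as catching
# essentially every true case.
prevalence = 1 / 1000        # 1 in 1,000 women in the population
false_positive_rate = 0.05   # 5% of healthy women test positive anyway
sensitivity = 1.0            # assumed: every true case tests positive

p_positive = (prevalence * sensitivity
              + (1 - prevalence) * false_positive_rate)
p_cancer_given_positive = prevalence * sensitivity / p_positive

print(f"P(cancer | positive) = {p_cancer_given_positive:.1%}")
# Prints about 2.0%: the base rate is so low that the vast majority
# of positive results are false positives.
```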

[00:10:30]

And that's a great example of what you were saying a few minutes ago. Probability estimates in a fairly complex situation are clearly an area where you wouldn't expect evolution to have given us much of an instinct, because we hardly ever had to deal with complex situations like that.

[00:10:48]

Oh, and I should also qualify what I just said: the problem is traditionally presented with percentages, you know, what percentage of the population has the disease, what percent of the time is the test wrong.

[00:10:57]

But when you rephrase the question in terms of frequencies, as you say, you know, one in a thousand people, or five times in a hundred, that sort of thing, then people are much more able to get the right answer. Which totally fits with the explanation that we're used to thinking in, you know, numbers of things; we're not used to thinking in terms of percentages. Right, our intuitions have evolved that way.

[00:11:18]

And I should also mention that even medical students and staff and doctors tend to get this question wrong at a much higher rate than you would hope.

[00:11:26]

Yeah, so probability theory is complex enough that it actually does require quite a bit of training. And that explains why most people don't have correct intuitions about probability.

[00:11:37]

Now, I want to go back for a minute to James' distinction between the two modes of operating and processing information in the brain, which I'm sure is a simplification, because the brain probably operates in many more than just two basic categories.

[00:11:56]

But those are distinct enough and interesting enough for our purposes. Now, we've talked in the past often about the fact that, of course, these days you can't talk about X without talking about the neurobiology of X.

[00:12:09]

And in this case, it's intuition, and there's no exception. Now, the research is interesting for a variety of reasons.

[00:12:15]

Just one bit of information I want to point out, because it does help make sense of what most people think of intuitions, and why so many people are so strongly attached to their intuitions.

[00:12:25]

So, you know, the rational decision making involves, of course, the prefrontal cortex.

[00:12:29]

That's where, you know, our conscious thinking, our rational thinking, happens.

[00:12:35]

As we said earlier, it's accurate, but it's slow; it's time consuming and effort consuming. Now, if you look at the areas that, according to neurobiologists, tend to be more involved with intuition, on the other hand, these include the amygdala, the basal ganglia, the nucleus accumbens, the lateral temporal cortex and the ventromedial prefrontal cortex.

[00:12:58]

Now, an important one, or one of the important ones among those that I just mentioned, is the amygdala. Why? Because that's also the seat of strong emotional reactions.

[00:13:09]

So apparently one of the reasons we have this gut feeling about our intuitions, why we're so sure that our intuitions are correct, is because they are reinforced emotionally. They're connected to emotion through the amygdala.

[00:13:20]

So we really rely a lot on intuitions.

[00:13:23]

The gut feeling we have is actually very strong; there's an emotional attachment, which does not necessarily originate with, you know, rational deliberation.

[00:13:35]

So the neurobiology there actually helps us to understand the process better.

[00:13:42]

I just wanted to go back to something you said a bit earlier about when deliberative reasoning can fail.

[00:13:50]

I think you talked a bit about how deliberative reasoning can fail when you don't have a lot of time.

[00:13:56]

But there are actually a couple of other interesting cases where deliberative reasoning fails.

[00:14:02]

Well, first, it can fail when there are just too many factors for you to consider.

[00:14:05]

And so you just get overwhelmed, and in your deliberative reasoning attempt you end up picking in a more random or arbitrary way than you would if you just used intuition, where you would intuitively focus on a few of the most salient or, you know, emotionally weighted aspects of the situation.

[00:14:24]

You would just decide based on those. And that's not as good as being completely deliberative and taking everything into account, but it's still better than like throwing up your hands and kind of picking randomly, which is what people tend to do when they get overwhelmed by all of the different factors.

[00:14:36]

But then there's the other way that deliberative reasoning can fail people.

[00:14:39]

There have been a few recent studies about this where people were asked to choose between, I think they tried a number of comparisons, like choose between cars to purchase, choose between insurance packages, a few other comparisons, usually of the what-do-you-want-to-purchase sort. And some subjects were asked to make the decision just intuitively, you know, go with your gut, and some subjects were asked to write out pros and cons, or to look at these tables of, you know, what the mileage of the car is, what the costs of the different cars are, and to add weights to the different factors and come up with a total score for each car and make your decision that way.
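A minimal sketch of that kind of weighted-scoring procedure; the cars, attributes, scores and weights below are hypothetical, made up purely for illustration:

```python
# Weighted-sum scoring, as in the deliberative condition described
# above. The cars, attribute scores (0-10), and weights below are
# hypothetical, purely for illustration.
cars = {
    "Car A": {"mileage": 8, "cost": 5, "comfort": 6},
    "Car B": {"mileage": 6, "cost": 7, "comfort": 9},
}
weights = {"mileage": 0.5, "cost": 0.3, "comfort": 0.2}

def total_score(attributes):
    # Multiply each attribute score by its weight and sum them up.
    return sum(weights[name] * score for name, score in attributes.items())

for car, attrs in cars.items():
    print(f"{car}: {total_score(attrs):.2f}")
best = max(cars, key=lambda car: total_score(cars[car]))
print("Deliberative choice:", best)  # Car B (6.90 vs. 6.70)
```

Notice that a hard-to-quantify factor like simply liking the car has no row in such a table, which is exactly the omission discussed next.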

[00:15:15]

And for the most part, the people in that deliberative reasoning group tended to be less happy with the choice that they eventually made.

[00:15:25]

And the interpretation was that people, when asked to

[00:15:29]

pay attention to these tables, that is, when asked to use deliberative reasoning, pay attention to quantifiable things like the mileage of the car and the cost and so on, but they don't know how to quantify things like their subjective liking of the car. So they just leave that out of their calculation. But that's obviously a really important factor in whether you're going to end up happy with your choice. And just because you can't quantify it doesn't mean it's not rational.

[00:15:54]

That's typical.

[00:15:56]

A typical fallacy that I hear people make. Clearly, if it affects your happiness, it is rational to take it into account.

[00:16:02]

Absolutely. Absolutely. You might not be able to quantify it. Yeah.

[00:16:05]

And of course, yeah, it is impossible to precisely quantify it. But even acknowledging that it's important, and trying roughly to quantify how much you like it, would go a long way towards making these kinds of calculations more effective.

[00:16:17]

I want to mention research that I encountered that I found very interesting, again going back to this interaction between intuitive processing and sort of the rational, deliberative processing of information. So this is by Adam Alter of Princeton, who had collaborators at Harvard and the University of Chicago. This group published a number of papers over the last few years on these issues.

[00:16:41]

So one of the things they did was to show that there are certain situations where people switch from intuitive to explicit analysis of a given problem, and they investigated under what conditions this happens.

[00:16:54]

And they found that usually people switch from intuitive to explicit analysis when they have something personal at stake in the outcome, like if they're likely to lose money on a bad decision or something; then they trust their intuition less and they're more careful, more deliberate about it. If, on the other hand, people are under time pressure, or they experience what is called cognitive load, meaning that they're busy doing a bunch of other things at the same time, then they will rely on intuition, because it's faster.

[00:17:28]

And so it's a heuristic. Now, Alter and his colleagues have also investigated the effect of what they call disfluency, and I found this fascinating. So fluency, or disfluency, is a measure of how comfortable you are with the information that you're receiving. And by comfortable...?

[00:17:45]

Yes. Well, comfortable, no, no, no,

[00:17:47]

as in how easily you can process that information.

[00:17:50]

So, for instance, they argue that, from a neurological perspective, disfluency triggers the anterior cingulate cortex, which activates the prefrontal cortex, where much of our analytical thinking is done. So the more disfluent the information is, the more likely you are to engage rational, deliberate decision making. Now, how do you make people disfluent about things? Well, in one case, what they did was simply give people the same problem to solve,

[00:18:19]

but in one version they wrote it in a clearly legible typeface, and in the other version they wrote it in a font that was a bit more difficult to read. And they controlled for the fact that it wasn't just the slowing down that caused the difference, because you can manipulate the time that people spend on the problem independently.

[00:18:38]

And so it was just the fact that people had to make more of an effort to read the more difficult font that switched them from intuitive to deliberative decision making.

[00:18:49]

I think I've heard about this in the context of moral decision making, that people are more likely to make intuitive choices. I think Joshua alluded to this research.

[00:18:56]

And the amusing thing about one of these studies is that, you know, you can cause yourself disfluency in a very easy way.

[00:19:05]

So if you want to force yourself to switch from intuitive to deliberative decision making, apparently the only thing you need to do is to furrow your brows.

[00:19:16]

Oh, really? Yes. So that it mimics sort of the typical...

[00:19:21]

I'm a thinker now. Yeah, I'm a thinker. The typical expression associated with thinking. Well, that's enough to trigger the areas of the brain that are involved in deliberative thinking. And so you actually are a thinker, as it turns out, if you just act like a thinker.

[00:19:35]

Well, I thought that was fascinating, and kind of pathetic. It is kind of pathetic.

[00:19:40]

But, you know, next time you want to put on your thinking cap... I have a thinking cap. It's in my office at Lehman College.

[00:19:46]

Sometimes I can't tell when you're being serious. No, really, I do have one.

[00:19:55]

It's basically an inverted lampshade that somebody previously in the office labeled the thinking cap.

[00:20:07]

I don't use it, but it would make a good photo for the website. We need a photo.

[00:20:15]

So, sorry. So you have a thinking cap, and I derailed you just now.

[00:20:20]

So the point of these experiments is that it's remarkably easy to manipulate, even in a self-induced way, the balance

[00:20:29]

between intuition and deliberate thinking: depending on what's at stake, on whether you have cognitive load from other tasks, or even simply on adopting the attitude of a thinker, you can switch things around.

[00:20:44]

I think it's, you know, a useful hint.

[00:20:47]

I bet that could be really useful information for, say, charities when they're trying to persuade people that a problem is important based on statistics. You know, people's inclination usually is to be more persuaded by a moving story than by, look, these tens of thousands of people are suffering, or these tens of thousands of creatures are dying, and so on and so forth. So maybe the statistics would have more impact on people, or maybe people would be more willing to see the potential benefit of their donated dollars, if the charity, you know, printed their letter in a really hard-to-read font, something like that.

[00:21:23]

That's right. I talked earlier about how deliberative reasoning can go awry, but there are a few other examples of how intuitive reasoning can go awry that we haven't touched on yet that I think are interesting.

[00:21:35]

I mean, there are these evolutionarily programmed cases that we mentioned, where the current situation is not like the situations for which our intuitions were programmed. And then there are sort of these probabilistic reasoning errors we have.

[00:21:51]

But there's also, in general, I think, a good rule to follow: intuition tends to be less reliable in cases where the problem you're considering is not something that you have had experience with in the past. It's sort of an unprecedented problem.

[00:22:08]

Like, say you're trying to decide how likely artificial intelligence is to lead to, you know, some sort of catastrophe.

[00:22:18]

You might rely on your intuition, but you have to consider the fact that your intuition in this case just has no similar problems that it could have been shaped on; nothing like, you know, artificial intelligence or any other kind of existential risk.

[00:22:34]

We don't have any examples like that. So it's hard to think that we could rely on our intuition to tell us what's going to happen or what we should do.

[00:22:41]

And similarly, in problems like this, you can often notice other factors that have contributed to your intuition that shouldn't have, epistemically. Again, to take the example of artificial intelligence: if you're using your intuition to consider how dangerous it might be, you have to think about the fact that your intuition is probably shaped by fiction, because you've read a lot of stories about robot apocalypses and, you know, an AI explosion that took over the world.

[00:23:07]

There are not very many stories about AI going in a nice, boring, pleasant way.

[00:23:10]

So I guess the takeaway here is that being able to recognize where your intuition comes from can help you decide whether it's a good guide in a particular problem.

[00:23:18]

Yeah, that is a very good point. I want to come back also to this idea of expertise, because as it turns out, there is research that shows how intuition and deliberative thinking interact over time to build expertise. Simplifying things a little bit, there are, broadly speaking, three phases to becoming an expert. And actually, the interesting thing that I found in the literature on expertise is that these phases are remarkably similar, almost regardless of the field of expertise you're talking about.

[00:23:51]

So the three phases that I'm about to discuss in a minute apply whether you want to become a chess master or a professional tennis player or anything in between. So this has both intellectual as well as sort of practical applications. All right.

[00:24:06]

So the first phase is when the beginner focuses her attention simply on understanding what it is that needs to be done, the task at hand, right, and basically on mostly not making mistakes.

[00:24:20]

Think of the first time you start trying to learn to drive a car, for instance.

[00:24:26]

You need to pay conscious attention to a lot of different things, and particularly to avoiding mistakes, especially mistakes that could cost somebody's life, or your car at least. Now, in the second phase, this kind of conscious attention to the basic tasks shifts toward a more intuitive level.

[00:24:48]

So the individual performs things quite automatically.

[00:24:51]

Again, think about driving a car.

[00:24:54]

Your body and your mind sort of memorize and internalize the basic movements, so you can actually drive a car now while you're listening to the radio and talking to somebody else, hopefully not on the phone and not texting, but nonetheless, you can do a lot of other things. And you're much more relaxed, because a lot of what goes on has actually been shifted automatically to the unconscious. Now, here's the problem.

[00:25:19]

So the good news is that, whatever the field is, it usually takes a fairly short period of time to go from the first to the second phase.

[00:25:27]

The bad news is that it takes a lot

[00:25:29]

more time and effort to get from the second to the third phase; in this case, in the example of driving, that's if you want to become a NASCAR driver, a Formula One driver or something like that, or a chess master.

[00:25:40]

What happens is that the initial improvement is aided by switching control from conscious thought to intuition, until the task becomes automatic and faster. But then further improvement actually requires mindful attention focused on the areas where you still make mistakes, and it takes a lot of conscious effort to correct those mistakes. This is, in fact, often referred to as deliberate practice. So this is the situation where you have, say, a soccer player or a tennis player who is good enough.

[00:26:11]

They're actually pretty good.

[00:26:12]

They have a good intuition about the game and everything, but they still make mistakes, in the sense that, you know, they need to correct the form of certain details in what they do.

[00:26:20]

And that becomes very difficult, because your brain has automated a lot of that stuff.

[00:26:24]

And now it's up to you to sort of go back into the subconscious and fish out the stuff that you're not doing well, and focus your attention exactly on what's been automated, until you understand what the mistake is and correct it.

[00:26:39]

And then re-automate it. And the bad news, as I said, is that this is a lot of hard work and it takes a lot of time.

[00:26:46]

Yeah, I think that's actually really useful information, because I personally have often found myself in situations where I'm like, oh, I'd like to become better at...

[00:26:58]

Like what? Let's say, I don't know, small talk. Let's say I'd like to become better at small talk.

[00:27:01]

Oh yeah, I want that too. Let me know how it goes.

[00:27:05]

Well, I was just going to say that I tend to gravitate towards the assumption that I can just do a lot of it and I will gradually become better. But I've always sort of had the sneaking suspicion that I might have to actually work harder at becoming better at it, instead of just doing it a lot and hoping that it just gradually, you know, improves itself in my brain. Yeah.

[00:27:29]

So it sounds like that's what you're saying. Unfortunately, it turns out you need mindful attention, apparently. Now, this last phase, of moving from somebody who is simply good at doing a certain thing to actually becoming an expert, as I said, is remarkably similar across fields, and it is measured in the tens of thousands of hours, which basically means roughly about ten years to become an expert in a particular field.

[00:27:55]

And there can actually be more than one level of expertise. So, for instance, in the case of chess players, which, as we said earlier, is one of the cases where most studies have been done, it takes about ten years to become a professional player, but then it takes another ten years or so to become a master-level player.

[00:28:18]

And by the way, that has very little to do with sort of native ability. You know, some people think, oh, well, but some geniuses are really good, and all that. Well, as it turns out, even the geniuses need to do mindful practice to actually become masters, even if they have a talent for something. First of all, a lot of the time talent is actually simply being precocious about something; it's not that you're necessarily much better than other people, it's that you get there much earlier in life.

[00:28:45]

But even when there is native talent, if you want to move from talent to actual mastery and expertise, you still need to go through this mindful attention phase, which unfortunately is pretty long. There doesn't seem to be a shortcut for that.

[00:28:59]

I just wanted to point out that I've noticed there are a lot of objections to the idea of rationality from sort of the general population.

[00:29:08]

But most interesting to me are the objections to rationality from, like, smart, educated, empirically minded, presumably pretty rational people, who will say things like, well, you know, you don't want to be too rational.

[00:29:23]

And so I gave this talk at Skepticon last fall trying to explore where these ideas come from. It was called "The Straw

[00:29:32]

Vulcan." I was talking about this sort of caricature of rationality that you see in movies and TV, like the Vulcans.

[00:29:41]

Yeah. I'm going, just now, in the last few weeks, through the entire Star Trek original series.

[00:29:47]

Oh, I know. Yeah, I had video clips from the original series in my talk, to illustrate the different misconceptions about rationality.

[00:29:52]

I didn't realize how, in the first season especially... well, things changed later in the second and third seasons. The original Star Trek only lasted three seasons, although those were long seasons, worth twenty-five episodes per season. But in the first season, McCoy is really nasty to Spock. Yeah, he really is.

[00:30:09]

A little later on they develop this more playful thing, but initially the guy's really on his case. It's like, wow. Yeah.

[00:30:15]

It's like the writers decided, we need a foil, you know, and they just went with that a hundred and twenty percent. All right.

[00:30:21]

So, what about this? All right. So I just wanted to say that one of the reasons that I've perceived generally rational people

[00:30:29]

to believe things like "it's possible to be too rational" is that they're conflating the rational/irrational distinction with the intuitive/deliberative distinction. I was about to say system one and system two, and then I remembered we haven't used those terms yet, I guess. But we can do that. Yeah. That's actually a common way that psychologists, or cognitive psychologists, talk about intuition and deliberative reasoning: as system one and system two. They're just kind of boring names.

[00:30:55]

I don't use them very much. But that's like type one and type two errors in statistics. So boring. And not only boring...

[00:31:03]

But these, type one, type two, type three...

[00:31:05]

Yeah, you can never remember which is which exactly; the names give you no clue to remember what they refer to.

[00:31:11]

But yes, system one is intuition, system two is deliberative reasoning. And so my point in the talk, when I was discussing, you know, intuition versus deliberative reasoning, is that both systems, as you and I have discussed in this episode, Massimo, have their strengths and weaknesses. And rationality is about trying to find the truest path to an accurate picture of reality, or about trying to optimize your decision making to, you know, best achieve your goals, whatever they are.

[00:31:41]

And so you don't rely on system two blindly, of course. You decide, based on the particular context, which method is going to be the one most likely to get you most reliably to the truth, or most reliably to achieving your goals. And so if you're in a situation where you're really pressed for time, or where you can't consider all the relevant factors, then the rational thing to do is to use your intuition. Or if you know that you're a chess master with a lot of experience, you use your intuition.

[00:32:05]

That's the most rational thing to do. So I think that when people say it's possible to be too rational, they're thinking it's possible to be too deliberative, too analytical, in situations where that's not actually the best strategy. That's one way you can put it, or...

[00:32:18]

Yeah, exactly.

[00:32:18]

And so I agree with that point; I just think that they're misusing the word rational there. I think that's a very interesting distinction. I have one more piece of research that I want to bring up, because I find it interesting. This gets a little closer to the question of expertise, as opposed to the question of intuition. As we said, the two are actually very much intertwined, because, you know, intuition is a crucial part of becoming an expert.

[00:32:43]

But there is some research that shows how people become experts, and what that means is that you develop structured knowledge about certain situations, which at some point becomes intuitive.

[00:32:58]

And I found this particular example interesting as well as sort of amusing. Two researchers named Cindy Hmelo-Silver and Merav Green Pfeffer have investigated the difference between superficial and structural knowledge.

[00:33:12]

In other words, between the novice and sort of the expert, in the particular case of people's understanding of aquaria, of, you know, aquariums.

[00:33:22]

Yeah, that's right. So, fish tanks. Yeah.

[00:33:25]

Oh, you were just using the fancy word. Anyway.

[00:33:30]

So what they did was compare four groups of people and their understanding of fish tanks: children, naive adults...

[00:33:39]

These are adults that have no particular interest in the subject matter. Naive with respect to this particular case, that is, not in general.

[00:33:46]

Exactly.

[00:33:48]

And then two different types of experts: one group were biologists with an interest in ecology, and the other group were hobbyists who keep and, you know, build aquariums.

[00:34:00]

Hobbyists, yes. All right. So the unsurprising part of the research is, of course, that the children and naive adults displayed a very simple understanding of the workings of an aquarium.

[00:34:11]

They had no structural knowledge of how aquaria work.

[00:34:15]

The experts, on the other hand, of course appreciated the systemic functioning of an aquarium, and they could describe multiple causal pathways, effectively a closed ecosystem, etc. But there were differences.

[00:34:29]

The biologists explained things, and therefore conceptualized things, in terms of the science of ecosystems, you know, at a very abstract, theoretical level. They thought of aquaria as a microcosm of nature, of a natural ecosystem.

[00:34:44]

The hobbyists, on the other hand, were also experts, and they came to conclusions similar to the biologists'. But their structural knowledge was built around a mental model of practical issues, dealing with filtering systems, feeding systems, you know, anything that plays a direct role in actually keeping an aquarium good looking and clean.

[00:35:07]

So the idea is that not only is the combination of intuition and deliberative thinking the long-term path to expertise, but there can be more than one type of structured knowledge, and therefore more than one type of expertise, even in the same domain. Which means, I suspect, that the biologists are actually going to have different intuitions from the hobbyists about aquaria in certain areas, because their knowledge, being structured differently and based on different kinds of intuitions, may under certain conditions lead them to make different decisions, or to formulate a different understanding of the complex system that they're interested in.

[00:35:52]

But one is still going to be more right than the other. Oh, yeah, exactly; that's the problem. Right. Right, right.

[00:35:59]

Oh, while we're sharing fun research related to intuition, I have just a couple of my favorite examples of intuitions gone awry that I'd like to share.

[00:36:10]

One of them is a problem that has become sort of a staple on various deliberative reasoning tests, tests measuring not people's ability to do deliberative reasoning, but their inclination to do deliberative reasoning. So, questions to which you will get the wrong answer if you don't use deliberative reasoning; they're easy once you do, but you know you'll get them wrong if you don't.

[00:36:31]

So this question was given to, among other subjects, a class at Princeton.

[00:36:38]

The question is: a bat and a ball together add up to a dollar and ten cents; that's how much they cost together. The bat costs a dollar more than the ball. How much does the ball cost? Hmm. I'm going to wait a moment for our listeners to think about it, OK?

[00:36:56]

OK. So the answer which 50 percent of the class at Princeton gave, which is the intuitive, system one answer, is that the ball costs 10 cents, because you look at a dollar ten, and you look at a dollar, and you take away the dollar and you get 10 cents.

[00:37:12]

So essentially what you're doing with that process is you're not really thinking about the problem. You're just sort of fishing around, like, well, you know, what do problems like this generally involve?

[00:37:21]

Well, generally, you take one thing away from another, so I'll do that.

[00:37:25]

Of course, that's not actually the right answer, because if the ball were 10 cents and the bat were the dollar, then the bat would cost 90 cents more than the ball, not a dollar more than the ball. The right answer is five cents.
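A minimal sketch making that algebra explicit:

```python
# Bat-and-ball: ball + bat = 1.10 and bat = ball + 1.00.
# Substituting: ball + (ball + 1.00) = 1.10, so 2 * ball = 0.10.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05

# The intuitive answer (ball = 0.10, bat = 1.00) sums to $1.10 too,
# but the bat is then only $0.90 more than the ball, not $1.00 more.
```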

[00:37:35]

So it just shows, you know, how quickly people, even smart, well-educated people, reach for the system

[00:37:42]

one answer, even on a test of reasoning, which presumably these people knew they were being tested on.

[00:37:49]

And then the other example of this intuitive reasoning, which I find so amusing and kind of endearing, is a classic social psychology experiment in which researchers sent someone to wait in line at a copy machine, and then that person asked the person ahead of them, excuse me, do you mind if I cut in line? And maybe about 40 or 50 percent of them agreed to let the person cut in line ahead of them.

[00:38:14]

But when the experimenters redid the study and had the person ask, instead of "Can I cut in front of you?", "Can I cut in front of you, because I need to make copies?", then the assent rate went up to like ninety-five percent. Because now it sounds like they have a reason. But, I mean, of course they need to make copies; that's the only reason anyone would have to cut in line at a copy machine.

[00:38:36]

But because the request was phrased in terms of giving a reason, our system one reasoning kicks in and we go, oh, they have a reason. So, sure, you have a reason.

[00:38:45]

You know, it's when I hear these kinds of things that I'm reminded of P.T. Barnum's saying that there's a sucker born every minute. And apparently most of us actually are in that category, at least under some circumstances.

[00:38:57]

Yeah, that's true.

[00:39:01]

We are just about running out of time and we have a couple of good picks for this episode.

[00:39:05]

So we will wrap up this section of the podcast and move on to the Rationally Speaking picks.

[00:39:27]

Welcome back. Every episode, Julia and I pick a couple of our favorite books, movies, websites or whatever tickles our rational fancy. Let's start as usual with Julia's pick.

[00:39:36]

Thanks, Massimo. For my pick, I'm continuing my trend of diverging from the standard book/website/movie/article model.

[00:39:46]

My pick is an infographic this week.

[00:39:50]

So basically, this is from the website Information is Beautiful, which is a great source of really elegant and clever and interesting ways of depicting important data.

[00:40:02]

So this one is called "Snake Oil? Scientific evidence for popular health supplements." And I thought it fit really well into our wheelhouse, for at least a couple of reasons.

[00:40:14]

First, because the field of health and nutrition is just such a difficult field to pick through and find reliable information about. I find it personally very daunting.

[00:40:24]

I tend to just throw up my hands and let, you know, other people I trust, who have more time on their hands and more patience than me, go through it.

[00:40:32]

But so, this infographic displays various nutritional supplements, ranging from, you know, things like fish oil to antioxidants to minerals like iron, just anything that you could take as a supplement to your diet.

[00:40:50]

And it graphs them on this sort of constantly changing bubble chart, in which the bubbles that sort of float up to the top of the chart are the ones with the most reliable evidence for them. I'm looking at it now; it's really beautiful. Beautiful.

[00:41:08]

Yeah. And you can sort of see, when you load the page, the bubbles sort of form and gradually nudge each other aside until they settle into place, and their hierarchy relative to each other depends on the current state of the evidence. I believe it's updated regularly.

[00:41:26]

So yeah, it's just a really clear, intuitive way of seeing what the top supplements are. And then it's also just a cool infographic, because it conveys so much information in a really clear and non-confusing way.

[00:41:36]

So, each bubble is a particular supplement, like garlic or green tea or vitamin D, and the size of the bubble conveys how popular it is in terms of Google hits.

[00:41:48]

So the big bubbles represent things that people care about. And then it's also a nice study in economy of information display.

[00:41:57]

So it's, of course, important to know what condition the supplement is potentially useful for, because something might be really useful for heart disease but not at all useful for Alzheimer's.

[00:42:09]

So when you scroll your mouse over a bubble, it'll tell you what that particular supplement is being considered successful or unsuccessful with regard to.

[00:42:21]

And yeah, and then there's the colour of the bubble. There are a few bubbles that are small and kind of far down, but they're colored orange, so that you can see that the research is really accelerating in that area.

[00:42:31]

So it's one to watch. So this has sort of become my favorite source of information about diet and health supplements. Interesting, both functionally and just, like, aesthetically.

[00:42:42]

Very good. So my pick, on the other hand, is a book which I have not finished.

[00:42:48]

So arguably I probably shouldn't really recommend it, because I've only read about the first two thirds of the book so far. But I find it interesting as an idea.

[00:42:56]

It's well written and it certainly is thought provoking. The book is Zombie Economics: How Dead Ideas Still Walk Among Us, and it is by the Australian economist John Quiggin. And the basic idea is that there are a number of ideas in economics that are dead and should be buried, because they've been shown not to work, and yet they just, like zombies, come back to life, presumably, you know, in spite of the evidence, and presumably because there are sort of political or ideological interests behind them, and so on and so forth.

[00:43:33]

So the book goes through five major ideas. I found, for instance, the chapter about the Great Moderation, which is the very first chapter, particularly interesting. So the Great Moderation is this idea that the markets, at some point in history, were tamed, that these oscillations with bubbles and bursts had gone away, and that this time they were gone for sure, forever, because we had figured out how to tame the market.

[00:44:05]

Apparently this idea, which is called the Great Moderation, has actually been proposed several times.

[00:44:09]

And sure enough, several times it has been shown to be wrong, because then a bubble actually occurred and then it burst.

[00:44:15]

And the last incarnation of this Great Moderation idea was popular until 2007. And of course, in 2008, we had the worldwide collapse of the economy. So clearly that was not a particularly good idea. Anyway, the author, you know, with humor and with a good amount of information and references, goes through these ideas. And in the process, the book teaches you both about, you know, the basics of economics, which I would think is of general interest to people, but also about economics as a profession.

[00:44:46]

And it gets at, if you will, the sociology of economics, and also, to some extent, the epistemology of economics. You know, a lot of people, frankly including myself, have the impression that economics is, you know, either entirely theoretical, in which case it's basically pure mathematics, which is fine; but then, when it comes to actual applications, economists don't necessarily know what the heck they're talking about.

[00:45:09]

Now, as it turns out, that's not quite fair.

[00:45:11]

They do know what they're talking about, except that occasionally, or perhaps more than occasionally, there are other forces that come into the field and sort of help the field temporarily forget the lessons it has actually learned, and the lessons need to be relearned all over again.

[00:45:29]

Of course, what makes this somewhat tragic is that usually the lesson is learned on our backs, as when there is a collapse, either at the national level or at the worldwide level. Anyway, it's Zombie Economics: How Dead Ideas Still Walk Among Us.

[00:45:42]

Oh, it sounds really interesting. And upon hearing your explanation, it's obvious that that is what I should have expected the book to be about.

[00:45:51]

But I will admit, when I first heard the title Zombie Economics, I thought it was going to be about how zombies respond to incentives, like, if the price of brains goes up by 10 cents, do zombies consume fewer brains? Well, you would have to write that one.

[00:46:07]

All right. We are all out of time. So this concludes another episode of Rationally Speaking. Join us next time for more explorations on the borderlands between reason and nonsense.

[00:46:23]

The Rationally Speaking podcast is presented by New York City Skeptics. For program notes, links, and to get involved in an online conversation about this and other episodes, please visit rationallyspeakingpodcast.org. This podcast is produced by Benny Pollak and recorded in the heart of Greenwich Village, New York. Our theme, Truth, by Todd Rundgren, is used by permission. Thank you for listening.