[00:00:08]

Welcome to the Knowledge Project. I'm your host, Shane Parrish, curator behind the Farnam Street blog, an intellectual hub of interestingness that covers topics like human misjudgment, decision making, strategy, and philosophy.

[00:00:22]

The Knowledge Project allows me to interview amazing people from around the world to deconstruct why they're good at what they do. It's more conversation than prescription. On this episode, I have Julia Galef. She's the president and co-founder of the Center for Applied Rationality, a nonprofit organization based in Berkeley, California, devoted to developing, testing, and training people in strategies for reasoning and decision making. She also hosts the Rationally Speaking podcast, a biweekly show featuring conversations about science and philosophy.

[00:00:54]

On this episode, we talk about a host of fascinating subjects, including rationality, of course, changing minds (our own and others'), filtering information, and a ton more. This was a fascinating conversation and one you don't want to miss. I hope you enjoy it as much as I did. Before I get started, here's a quick word from our sponsor. This podcast is supported by Slack, a messaging app bringing all your team's communication into one place so you can spend less time answering emails and attending meetings and spend more time being productive.

[00:01:31]

Visit Slack to create your team and get one hundred dollars in credits that you can use if you decide to switch to a paid plan.

[00:01:38]

You're the co-founder of the Center for Applied Rationality. Where did the interest in rationality come from?

[00:01:46]

Well, people ask me that sometimes and I don't have a great answer for it. I've sort of been interested in it since before I knew the word rationality; I've been interested in what that word refers to for as long as I can remember, basically. One of my guesses as to the origins of my interest is my parents. They're both very intellectually curious, smart for sure, but also intellectually curious people who like to inquire about the world and who encouraged me and my brother to be curious about the world.

[00:02:17]

And more than that, they were also unusually good at changing their minds when presented with a good argument or when they encountered new facts that should change their mind. So I just have these vivid memories from when I was, I couldn't have been more than six or seven, I think, and I would have these arguments with my parents the way that kids do, about rules or what was fair or what counted as good behavior, etc. And sometimes, not usually, but some of the time, my parents would come back later and say, you know, Julia, we thought about it and we think you're right.

[00:02:58]

We think this was unfair, or whatever. And I just remember feeling so, well, grateful, for one, that they had listened to me and taken me seriously. But I also felt admiration for them, because they didn't have to change their minds. No one was going to hold them accountable; I couldn't do anything about it. But they clearly wanted to, like, if I was right, they wanted to realize that I was right. So that kind of stuck with me.

[00:03:27]

And I got pretty interested in good arguments. What counts as good evidence? I ended up majoring in statistics in college, which I was interested in because it asks questions like: how can we know things with any confidence about the world? How much evidence is enough to change someone's position on something, or should be enough? And through it all, I carried this memory of my parents as sort of an ideal in the back of my mind in terms of how to react when you encounter new evidence or good arguments.

[00:03:58]

Did your parents do anything else to kind of encourage that in you? Or when you changed your mind, were they responding positively to that, or...?

[00:04:07]

I think so. That's less of a vivid memory in my mind. But if I had to guess, I would say yes. I mean, my dad's, well, both my parents, but my dad has more of a background in science. My mom's background is in math and statistics, actually. So my dad was the one I would more often ask questions like, why is the sky blue, that kind of question, questions about the moon. And he rarely just gave me the answer.

[00:04:35]

He would more often sort of lead me through a reasoning process in kind of a Socratic type way. And usually I couldn't actually reason my way from first principles as a seven year old to figuring out why the sky is blue. But I could sort of come up with some guesses, and maybe I could sort of notice, oh, maybe that guess doesn't make sense, etc. So when I finally got the answer, it was sort of more satisfying because I had tried. You know, when you try to solve a puzzle and then it tells you the answer, it's much more satisfying and it sticks with you more than if they just tell you the puzzle and then tell you the answer straight away.

[00:05:08]

It was like that. So I think that was part of it. Well, the last thing I can think of in that space is, I don't know if I'm joking about this, but my dad, well, the facts are true. I don't know, I can't tell if I'm serious about them contributing to my interest in rationality. But the facts are that my dad has kind of a trickster spirit at the core of him. And sometimes, especially if he didn't know the answer to a question, he would just make up an answer.

[00:05:37]

If you've ever read Calvin and Hobbes, Calvin's dad is like the example, the poster child, for this. So, why, how does the radio play music? Well, little people inside the radio that have been shrunk down and hired to play whenever you turn on the switch, that sort of thing. And he would say it with this perfectly straight face, this very serious, professorly tone of voice. And eventually he would let me realize he was fibbing, but it was much more satisfying if I could figure it out myself, if I could learn to notice those little bells in my head of...

[00:06:15]

Wait a minute, that doesn't make any sense, before he gave it up. Your dad sounds like he'd be awesome at a dinner party.

[00:06:25]

Awesome is one way to put it. So, yeah, I mean, it does make a neat story about how that helped me be skeptical. But I don't know, it's really hard to... it's really hard to draw causal connections. So now you're the co-founder of the Center for Applied Rationality. What does it do? What are you guys trying to accomplish?

[00:06:44]

So our mission is to develop strategies to improve on the default human processes for reasoning and making decisions, and to educate people in those improved reasoning and decision making strategies, particularly with an eye toward having a positive impact on the world. So the timing of founding the Center for Applied Rationality, which we call CFAR, is basically, we had noticed that the science on human rationality, or more precisely human irrationality, had kind of hit this peak over the last few decades.

[00:07:26]

If you've heard of Daniel Kahneman, who wrote the best seller Thinking, Fast and Slow, he won a Nobel Prize for his work on human irrationality and inspired a lot more work on similar topics. The academic research had just sort of been building up for the last 50 years, kind of culminating in Daniel Kahneman's Nobel Prize. And then the public interest in the topic had been growing over the past 10 years or so with books like Thinking, Fast and Slow.

[00:07:57]

But before that, Dan Ariely's Predictably Irrational, Nudge, lots of other books on behavioral economics and cognitive science. And the thing was, there was all this research on how the brain is irrational, what kinds of systematic reasoning and decision making errors humans make, leading them to systematically get the wrong answer to questions or systematically make decisions that they end up regretting, that kind of thing. And there was tons of public interest in this topic, but there wasn't yet a lot of research on, OK, what do we do about this?

[00:08:32]

There was a little research examining interventions to try to overcome some of these cognitive biases, but not a lot. And most of it was pretty shallow research. And I don't mean that disparagingly. I just mean the interventions consisted of things like: the experimenters would tell the treatment group about the bias and then see if they committed the bias on a quiz or something. And that's not the kind of thing that I would expect to really change these ingrained habits of thought and behavior.

[00:09:04]

I would expect that to change ingrained habits of thought and behavior, you need longer term practice on real life issues, not on toy problems in a lab.

[00:09:17]

I definitely want to get into how we go about changing some of those processes. But maybe before we continue, do we have a common understanding of what rationality means?

[00:09:28]

How many hours do you have? It's hard. There is a sort of simple definition you could give that doesn't capture that much, and then there are increasingly complex definitions you can give that sacrifice parsimony for depth.

[00:09:44]

Is there more than one type of rationality? Yeah. So sometimes we talk about two types of rationality. On the one hand, we have epistemic rationality. And sorry, these aren't just my terms; these are terms in cognitive science and philosophy, right? So epistemic rationality is about using processes, processes for reasoning or processing information, that systematically get you closer to an accurate model of how the world works. And we can never be one hundred percent confident and get the perfect, perfectly correct answer about everything, about how the world works, because we just have limited information and time, and there's a lot of uncertainty.

[00:10:24]

But some reasoning processes are just more reliable than others. So, I mean, to take a dumb example, making stuff up or believing what a random person on the street tells you will be a less reliable process for having an accurate model of the world than synthesizing the opinions of top experts on a topic or looking at randomized controlled trials or something like that. And that's kind of an oversimplified description, but I'm just trying to sketch out what the spectrum of epistemic rationality looks like.

[00:10:55]

Should I go into the second type of rationality? Please. So the second type is instrumental rationality, and that's about making choices that, given the best information you have at your disposal, are most likely to achieve your goals. And I know the word goals kind of has associations for most people with career goals and being productive and, I don't know, maybe getting an award or getting something published. But in this context, it could really mean anything, anything you want or value.

[00:11:29]

It could mean having friends. It could mean being happy. It could mean helping the world, whatever it is you care about. Instrumental rationality is making choices that you can systematically expect are most likely to achieve those goals as efficiently as possible.

[00:11:46]

Is there only one optimal path for doing that, or is it more broad than that?

[00:11:51]

Well, it's an interesting theoretical question. I suppose if you had access to perfect information about the world, there might be one optimal choice that was slightly better than all the other choices in terms of its probability of success. But this is all very abstract. In practice, it's not helpful to think about there being one very best choice that you can know for certain. But with some careful reasoning and using some good heuristics or rules of thumb,

[00:12:22]

you can sort of rule out some obviously bad choices and try to make an educated guess among the set of plausible best choices. Do you think that's a good strategy?

[00:12:35]

If you don't know what's likely to lead to success, but you do know what's likely to lead to failures, just eliminating the stupidity because it's low-hanging fruit?

[00:12:46]

I think, I mean, so often in my work at CFAR, I guess I didn't get to fully finish explaining what we do. But the short rest of the explanation is that, in addition to doing some of our own research on these topics, we run workshops where we run training sessions, teaching people some of the more promising techniques that we've been developing to help overcome some of these biases and make better decisions. And so often I find myself in a situation where someone is at some crossroads.

[00:13:18]

They're at their job and they don't really like it. And they're not motivated; they're not really doing good work at all. But they don't really know what they should be doing instead, and they're having a lot of trouble figuring out what the best choice is. And so usually my answer in that case is: I don't know what the best choice is. I don't think you have any way to know what the best choice is. But clearly, the thing you're doing now is not the best choice.

[00:13:41]

If the best choice is to stay at your job, then it means staying at your job and finding a way to make it actually fulfilling and to make yourself actually succeed at it. The best choice is not doing a half-assed job at your current job. So we can kind of eliminate your current strategy. That's maybe a simple example, or it sounds silly, but that is in fact the way a lot of us make our decisions by default.

[00:14:06]

So in your view of rationality, then, what's the role of intuition?

[00:14:11]

Well, so I know a common conception of rationality in the public, in the media, is that rationality means dismissing or suppressing your intuition. There's often this dichotomy set up of reason and rationality on the one hand, and then intuition and emotion on the other hand. And this sort of perceived dichotomy gives rise to characters in movies or TV shows that are sort of explicitly the rational one or the logical one. And what that means in practice for the characters' actions is that they're the one who goes around pooh-poohing other people's thoughts and feelings and looking down their nose at people for having fun or falling in love or finding something beautiful.

[00:15:10]

And that is not what we, or cognitive scientists, deem rationality. The actual scientific model, which I think is also pretty common sense when you think about it, is that our brains, our minds, are divided. We do have a more intuitive side of our mind, which cognitive scientists call System 1. It's kind of a dull name, I apologize for that. And then we also have this more recently evolved part of our mind that allows us to do logical reasoning, weigh abstract trade-offs against each other, do math, long term planning, that kind of thing.

[00:15:50]

Very roughly speaking, it's our prefrontal cortex, and the kind of reasoning it does is what cognitive scientists call System 2. And System 1 is indispensable. There's no way we could actually survive as a species or as individuals if we ignored the output of our intuitive System 1. Right? So if I were to throw a pen at your head, and, well, we're on opposite sides of the continent right now, I think, so you're safe.

[00:16:20]

But if we meet in person some day and I throw a pen at your head and you were only able to choose what to do using your System 2, it would be absurd. Your System 2 would have to sort of reason through all the possible ways to react. It might say to itself, OK, well, first, let's calculate the trajectory of the pen approaching my head and its velocity, and I can estimate, OK, it'll probably hit my forehead in half a second.

[00:16:48]

What are my choices? I could stay still. I could try to dodge. Maybe I could try to catch the pen. Let's weigh the pros and cons of each option. And it would quickly be a moot point, right? The pen would have already hit your head. And this is sort of the drawback of your System 2: it's sort of slow and laborious and effortful, although it can do things your System 1 can't do.

[00:17:10]

Your System 1 is very fast, and it's very good at processing lots of information that you've sort of picked up, often unconsciously, over the course of your life. So, forget about pens flying at heads, let's take a real example. If you're in a social situation, you can sort of pick up things, you can pick up social cues to varying extents. You can detect whether someone is frustrated with you, or bored, or flirting with you, or threatening you, and you usually can't fully articulate what it is about their tone of voice or their facial expressions that's giving you that impression.

[00:17:48]

But you're still pretty confident it's happening. And the reason you can't articulate it is that it's your System 1, your intuition, that's doing all of that processing, and it's doing it in this very black-box kind of way. But this is something it has had practice with over the many years of interacting in social situations. And so it's learned over time to be able to do this well and to just spit out a quick answer about how does this person feel about me.

[00:18:11]

So that's a long-winded way to say that System 1 intuition is indispensable. The catch is just that it's often fallible. And so rationality is less about ignoring or suppressing System 1 intuition and more about understanding the respective strengths and weaknesses of the two systems and learning how to get them to communicate with each other, so to speak. Get your System 2 to listen to your System 1, and get your System 1 to listen to your System 2 and sort of update your emotional or intuitive impressions if they seem flawed in a situation.

[00:18:47]

You mentioned art and love and passion and all of these other things that give us some sort of emotional response. What's the connection to rationality?

[00:18:58]

Is there some connection between art and rationality? Yeah...

[00:19:02]

Is there one? I mean, you brought it up in a way that made me think that there was one.

[00:19:06]

Well, you mean in my example of the supposedly logical character in movies, can it be rational to appreciate art, or can it be... I mean, yeah.

[00:19:16]

Oh, by the way, before I answer your question, I realized I forgot to give the name of that character, the name for that trope. That archetype is the straw Vulcan. It's a play on the expression "straw man," the expression to straw man someone's argument, right? So it just means to present this weak caricature of what your opponent is arguing and then knock that down, because it's easier to knock down. And so the straw Vulcan is a weak caricature of rationality.

[00:19:43]

It's not actual rationality, but it's easier to make fun of and knock down than real rationality. So I just wanted to get that in. So, right, in answer to your question about art and rationality, I mean, I'd say the only obvious connection to me... I kind of think of art as being orthogonal to, independent of, rationality. The kinds of things that the artist is trying to do are mostly not trying to get the right answer to a question or achieve a goal.

[00:20:12]

It's more of an expressive process. But in terms of appreciating art, I'd say for many people, maybe for most people, some form of art is very important for their goals. And the kinds of goals I'm thinking of are indeed the kinds of goals that people usually don't think of when I say the word goals, but they're still very important goals: the goal of feeling connected to other humans on the planet, the goal of feeling a sense of meaning or finding a sense of meaning, the goal of pleasure. Lots of art is about pleasure.

[00:20:48]

So I'm glad you asked the question, actually, because this is exactly the kind of thing that I think people neglect when they think about optimizing for their goals. And these are, in fact, very important goals that people miss when they don't have them. But somehow they don't seem to come to mind when people think about, OK, now I'm going to try to be rational.

[00:21:06]

You're interested in changing minds. How do we go about doing that?

[00:21:10]

Are you talking about changing other people's minds or changing your own mind? Oh, let's start with our own mind and then explore how we change other people's minds.

[00:21:20]

So the order of questions you asked is the order in which I'm interested in them. It's difficult, because usually for other people the order is reversed. When I want to talk about, OK, here's how to be better at changing your own mind, people often interrupt me. They're like, no, no, no, I want to change other people's minds. How do I convince my coworkers, my partner, that they're wrong?

[00:21:44]

Yes, if only they saw the world through my eyes.

[00:21:47]

Yeah, that's less interesting to me, but I'll also give my best answer to it. So to change your own mind, I mean, the first step, I would say, is just believing that that's a desirable thing to do, which many or most people don't really believe. They either would explicitly say it's a bad thing to do, because changing your mind makes you kind of wishy-washy or weak or stupid, or they would explicitly claim it's a good thing to do but don't really believe it on a sort of gut level.

[00:22:19]

And so they don't have the motivation to do it, and therefore they're not going to do it. So step one is actually believing: yes, I believe on a gut level that there are probably a bunch of things I'm wrong about that I am not yet aware of, just because that's true of everyone. And also that whatever it is I'm wrong about, I would like to know about it, because I'll probably be able to make better decisions for myself and for other people.

[00:22:45]

I'll probably be able to avoid hurting other people if I have a more accurate model of the world. And so the implication of that is: I would like to change my mind when I encounter new evidence. In fact, I want to seek out new evidence that might cause me to change my mind, not just sort of passively accept it when it comes.

[00:23:01]

How do we get to a point where we do that? Like, how do you take somebody who's naturally closed-minded and develop that over time? I would imagine it's probably not a light switch, this open-mindedness to the point where you can give up your beliefs or cherished thoughts about something.

[00:23:19]

Yeah, well, I mean, even the way you phrased that question, how do you take someone who's closed-minded, is already kind of... we're all closed-minded in some ways, and we're blind to certain things.

[00:23:34]

So myself, I'm blind to a lot.

[00:23:36]

Yeah, I thought you meant closed minded in the sense of not wanting to change their mind. And that's what I mean. Right.

[00:23:42]

Like, you could show any amount of evidence, and if it's something that we hold dear, that we have an emotional connection to, we're less likely to change our mind. That's an extreme example. But I mean, there are many things in organizations or our workplaces where no amount of evidence or rationality... I mean, the proverbial story would kind of be, you present evidence on cigarette smoking, and then one person in the meeting comes up and says their grandmother has been smoking for ninety-nine years and doesn't have cancer, and it just gets dismissed once it comes to the table. Right? To me, how do you get to a point where people themselves are motivated, I guess in some way, to change their minds or be more open-minded?

[00:24:28]

Right.

[00:24:29]

So I guess I have this mental model, it's sort of a two-layer mental model, where the bottom, more fundamental layer is just abstractly but sincerely wanting to be able to change your mind. And that doesn't mean that in any given situation, like if you read an article saying that caffeine, well, that's a bad example, because nutrition data is not very reliable. But let's say you read a reliable article, reliable studies, showing that caffeine was really bad for you, but you cherish your morning cup of coffee or your three cups of coffee a day.

[00:25:05]

That is going to be hard to hear, and the natural reaction is to find a reason to dismiss the study. And that's often going to happen even if you have that fundamental layer in place of believing in general that you would like to be the kind of person who changes your mind. The distinction I was trying to make with my response to the closed-minded phrase is that I think a lot of people just don't have that fundamental bottom layer; they don't even want to be able to change their mind in particular situations.

[00:25:30]

I think it's like... I'm pretty pessimistic about the ability to get it to work in a particular situation. So I tend to focus on the bottom layer first. And to get that in place, again, I don't think it's the kind of thing you can do to someone else. So we have people go through an admissions process to come to our workshops, just to make sure it's going to be a good fit for them, because it's kind of an investment of time and money.

[00:25:56]

We want to make sure it's a good fit. And so one of the questions we ask them in the admissions interview is, why are you interested in coming? What are you hoping to get out of it? And occasionally we get someone who says, well, I want to learn how to explain rationality to the people around me so that they realize how irrational they are. And those people we generally turn away, saying we don't think it's a good fit for them, because the motivation really has to be: I want to improve my own reasoning and notice my own blind spots.

[00:26:26]

So my guess as to the things that cause someone to not be in that group of other-focused people: my guess is that it's partly social, like the influence of the people around you, because humans are primates and social creatures, and the things that we're motivated to try to acquire or be tend to be very influenced by the culture in which we live. So if your parents valued and rewarded changing your mind, or if the people in your social circle value and reward changing your mind, I think that makes a huge difference.

[00:27:06]

And that isn't just sort of abstract a priori reasoning; I see that pattern just looking around at the people I know. The other thing that I think can sometimes work, even if you don't have the general motivation to change your mind, is if there's a particular goal that you need to achieve. Like, your startup is going to fail if you can't look at the data in a clear-eyed way and make the best decision you can about whether to pivot or not, that kind of thing.

[00:27:33]

So sometimes these kinds of immediate, high-stakes motivations can really make you want to see the world as close to the way it really is as possible, and not see sort of a wishful-thinking version of the world that you've created in your mind. It doesn't always work; sometimes we still kind of try to distort or deny our perception of the world, but having that motivation can help. Did I answer your question? I can't remember what your question was.

[00:28:00]

Yeah, like that.

[00:28:01]

I mean, how do you go about changing your own mind, and then how do we go about changing the minds of others?

[00:28:06]

Oh yeah. I guess I didn't quite answer the how-do-we-change-our-own-minds question. I just answered the question, how do you become the kind of person who could go about changing their mind. Right.

[00:28:16]

Yeah. So I mean, that was a parenthesis around that. So if we can go back to, like, how do we go about changing our own minds? The first step you had mentioned, and I derailed you after that, was that we need to be open-minded about processing new information. And then I think you had more to that answer.

[00:28:34]

Yeah. So the problem that I found when I read the preexisting advice in, I don't know, skeptic blogs, do you know the skeptic movement? Maybe I should just explain that very briefly. The skeptic movement, or you could call it a community, I don't know, is a group of people who are promoting scientific and critical reasoning and who go around sort of testing and sometimes debunking things that turn out to be pseudoscience or paranormal or magical claims. So anyway, I used to read a lot of skeptic blogs and listen to some skeptic podcasts.

[00:29:21]

And I think skeptics are pretty good at giving advice about the importance of changing your mind, but it tends to be at a very high, abstract level, like "be open-minded." That's good advice, it's important, but it's not clear how to concretely implement it. It's sort of like telling someone to eat healthy. That's unlikely to actually change their behavior, because their dietary choices are made up of all of these little moment-to-moment choices about what to eat, and how much, and when to stop eating.

[00:29:56]

And just having this abstract commandment "eat healthy" in the back of their mind usually doesn't translate into changing their actions on a moment-to-moment basis. So a lot of what CFAR ended up doing was just taking this sort of abstract advice, like "be open-minded," and translating it into these very concrete, almost algorithms that start with a cue, with a trigger. So a cue might be: I read an article that I disagree with. That's a trigger. And then the action that I can take to try to manifest the principle of open-mindedness is to look for evidence that actually agrees with the article.

[00:30:42]

So this is sort of a counter to our general tendency to only look for evidence that supports what we believe. So in this sort of trigger action plan, as we call these things, what we're trying to get ourselves to do is take the trigger of a moment where my natural response would be to try to find reasons to reject an article, and instead install the habit of looking for reasons not to reject the article. And that doesn't mean I'm usually going to end up thinking the article is correct, but I will be giving it a much fairer shake than I would if I didn't have this trigger action plan installed.

[00:31:16]

So that's just one example. But it's hopefully representative of the kind of concrete, habit based approach that we take to helping people change their minds.

[00:31:26]

So that's really interesting. As you mentioned that, it seems like the cost of information processing would go up. And in a world where we're consuming so much more information, if you had a trigger action plan where you were evaluating each piece of information that you're reading and seeking reasons for not disagreeing or agreeing or whatever it happens to be, you would be spending more time on that and have more of an investment in it.

[00:31:54]

Do you think that... I guess, how do we process information in a world where the cost of doing that rationally becomes so high that maybe it outweighs the benefits? I don't know.

[00:32:08]

I mean, there is a cost. You're absolutely right. There's a reason System 1 and System 2 are called, respectively, fast and slow thinking systems.

[00:32:17]

We can't consume everything slow, can we, in today's age?

[00:32:21]

You know, you can't, but there's no reason to let the perfect be the enemy of the good, right? Of course.

[00:32:27]

And the other thing I would say is that the whole goal with habits is that they become automatic over time. So there are a lot of things that I find myself automatically doing now. Like, when someone I dislike says something, I now sort of automatically, well, actually not all the time, but I sort of have an intuitive sense for when this would be a useful thing to do, I will automatically imagine that someone I like said the exact same thing.

[00:32:56]

And I notice: would my reaction be different? In other words, was I unfairly dismissing this person's argument because I don't like them? Just blocking them out of your mind.

[00:33:08]

Right. And that was something I started trying to do intentionally, and now it happens automatically. And to be fair, there are still some extra steps happening in my mind that I wouldn't be spending the effort on if I didn't do this at all. But at least now the process happens kind of quickly and automatically. So that is the goal: to try to get some of these explicit processes to eventually lodge themselves in your System 1, in your intuitive thinking.

[00:33:37]

But I agree, there's a bit of an investment there. It's not quite as effortless and easy as just using your default thinking systems.

[00:33:46]

Are there any other tips you would offer in terms of how we process information that help us become more rational as we're consuming it?

[00:33:54]

Oh, man, that's the big question. Well, I guess one thing we haven't really touched on is being attuned to your emotional reactions, which is a big thing that I don't think was really on my radar so much before I started CFAR. Your emotions are often in the background of your thinking. So you become defensive, and you don't even really notice that you're defensive. You just start having thoughts like, oh, this person is out to get me, or, oh, this isn't fair.

[00:34:28]

Or you start looking for things to criticize about them instead of trying to evaluate their criticism of you. And you're actually getting defensive, your body's tensing up or your shoulders are sort of turning in. You can't see my shoulders; I'm doing it now. But you just don't notice that that emotion is influencing the way that you're processing the person's argument. Right? And so a fair amount of what we teach ends up being about just being much more self-aware of those emotional reactions.

[00:35:00]

And they don't always have to be defensiveness or aggressiveness or anything like that. They can often be sort of a subtle anxiety or concern in the background when your mind starts to go to a topic and then flinches away from it, because it might turn out, if I start going down that road, that I have to conclude, shoot, I shouldn't have entered the program in the first place, or, I really am going to have to break up with my partner.

[00:35:23]

And these are unpleasant to think about, and so we automatically flinch away from them. But if you become more attuned to your emotional reactions, they can be important clues as to what blind spots you're creating for yourself and where they are.

[00:35:37]

It's interesting. I mean, yeah, I never pause and reflect on my emotional state when I'm consuming information. I just assume that I process it all the same way.

[00:35:47]

Yeah, I think there's a lot of interesting variation in the texture of your reactions that you can start to notice when you look for them, and different people have different ways that work well for them to develop this kind of self-awareness. One of my co-founders at CFAR has a background in Aikido; he's been doing it and teaching it for like two decades now. And he does a lot of meditation, and he's just very embodied, I guess is the right word, in a way that I'm totally not.

[00:36:15]

I really live in my head. And so Val, that's his name, is just really good at helping people detect the physical, just what's going on with their body. Like, notice the tension, notice that they're leaning forward with sort of aggression, or they're leaning back with anxiety, or something like that. I just am not good at noticing these things in my body; I have other ways of noticing my emotions, but they're more cognitive and less physical.

[00:36:47]

So I think there's a lot of variation in what works for different people. But things like meditation, I guess, and to some extent martial arts, I think really do help a lot of people with this stuff. And that was an interesting discovery for me, because I had never considered that there might be a link, or some overlap, between meditation or martial arts and rationality.

[00:37:07]

That's interesting. Have you started doing martial arts? No, I'm really lazy.

[00:37:14]

So to kind of come back to it, how do we change other people's minds then?

[00:37:19]

Yeah, so there are different strategies to take depending on how intellectually honest you want to be. Right? So, you know, a lot of the research that's come out, the books that I've been referring to on irrationality, describe all of these hidden forces that affect our decisions that we're not even aware of, like how tall was the person making the claim, or was I holding a hot cup of coffee when the person made a request of me, that sort of thing, all these insidious aspects of our psychology that we're not even aware of. There's a real opportunity there for people to exploit those things and use them to change other people's minds without actually making any good arguments or presenting any evidence.

[00:38:08]

And in fact, there's a good book about this I would recommend called Influence. Robert Cialdini is a psychologist, I can't remember his university now. The book has been out for a few decades; it's been a bestseller for a long time. But basically what he did is he went undercover at various companies who were in the business of persuading people, of influencing people. So I think that included a telemarketer, some door-to-door sales, this was back in the 70s or 80s when that was more of a thing, some, excuse me, not lobbyists, but activists who were trying to get people to sign petitions, that kind of thing.

[00:38:49]

And in these industries there's a real profit incentive to try to find new and better ways of persuading people to do things. And they've developed all these techniques, but they're not going to publish them, because they're a competitive advantage for them, of course. So he basically did all this research for years and then wrote a book classifying the different kinds of techniques that these companies use to persuade people. And it's a very clear and compelling and easy-to-read book, but very information dense.

[00:39:20]

So there are five categories, I'm going to forget all of them, but they include things like scarcity: trying to make people feel like something is a rare opportunity. Social proof: trying to make me feel like other sort of high-status people are doing this or believing this. Reciprocity: when you give someone something, like the Hare Krishnas giving people flowers at the airport, they're much more likely to feel, on some level, not consciously, like they owe you something.

[00:39:51]

And so they're more likely to agree to a request or change their mind about something. So these are all kind of effective but insidious ways of changing people's minds. I personally am uncomfortable using methods of changing people's minds that don't ground out in sort of a good argument or logic. So changing people's minds with facts is harder than changing their minds with a smile or a gift or a nice haircut.

[00:40:22]

So you're like a die-hard rationalist then, in the sense that if I present a good argument, you should therefore adopt it? No, I definitely don't expect... I wouldn't agree with that sentence if "should" means I expect people to adopt it.

[00:40:38]

That's in fact one of the pillars of what I consider straw Vulcanism to be. Because if you look at the quintessential straw Vulcan, Spock himself, he is supposed to be the logical, rational one. But he keeps making miscalculations because he expects other people to behave rationally, and they don't.

[00:40:57]

And he should know this, because he's lived among humans for ages. And the fact that he still expects people to behave rationally is frankly quite irrational of him. So I don't expect people to change their minds based on facts and evidence, because I know that's not a thing. And I don't know that I would endorse the moral "should" either. I don't think people are sort of morally wrong for not changing their minds in response to good arguments. It's just an unfortunate fact of the world and how our brains work.

[00:41:25]

What I was trying to say was more like, I personally feel kind of morally uncomfortable with changing people's minds without good arguments and evidence. So it's kind of a constraint on my ability to change people's minds that I force myself to work within, essentially. And that doesn't mean I expect to succeed. It just means I don't want to succeed if I'm not following that constraint. Does that make sense?

[00:41:52]

Yeah, totally. So one thing that struck me as you were talking about the avenues we pursue to change minds, I just had in the back of my mind the whole climate change thing, or even the cigarette companies who were trying to discredit, I guess, the growing evidence as it was coming in. And it seemed like their playbook was more about creating uncertainty. How effective is that when you're trying to change the mind of a group, or at least create some sort of uncertainty?

[00:42:28]

How would you go about doing that?

[00:42:32]

You're asking me to put on my evil hat. Is that what you want me to do?

[00:42:36]

Well, that presumes that you didn't have it on to begin with, so I don't want to make any assumptions. Create uncertainty.

[00:42:43]

I mean, well, I think it's quite easy to present what looks like a very compelling case for whatever you want to convince people of, using studies, using quotes from experts, using even randomized controlled studies, which are allegedly the gold standard in scientific research. The reason this is possible, and the reason this is all such a problem, is that there's so much scientific research being done that it's quite possible to end up with a few studies that seem to support the claim that, I don't know, cigarettes aren't linked to cancer.

[00:43:31]

I actually haven't looked that up to see if there are studies showing that. But if you do enough studies, some of them, just by chance and often by experimenter bias, will turn out to show whatever you want. And so you can easily, selectively present the studies that support what you're claiming. And so even if you have a pretty skeptical, well-educated audience who knows, OK, I shouldn't believe things without a study,

[00:43:57]

you can still kind of pull one over on them. And I think we do this to ourselves inadvertently all the time. You know, we kind of suspect that something is true, like we suspect that the raw food diet is good for health. And so we Google around and we find a few papers that seem to show that. Like, I found it, scientific evidence! But, you know, if you were Googling for the opposite, you could probably also have found a lot more studies showing no effect on health.

[00:44:23]

So I don't know, I haven't actually looked at the studies on the raw food diet, but that's just an example. That's interesting.

[00:44:29]

And so what is your take on how you would refute that in the context of climate change, where we have, you know, possibly people creating doubt or uncertainty? Or we can use cigarettes to be more tangible, where I think the evidence is now overwhelming that there's a definite linkage to cancer.

[00:44:46]

Yeah.

[00:44:47]

How would you have combated that if you were the government trying to kind of crack down and improve this? And then you have these companies that are well-heeled and well-resourced to fight, trying to create some sort of uncertainty, maybe possibly not to combat what people believe, but to delay, you know, what seems like an inevitability to everybody else.

[00:45:11]

It's so hard. I mean, I might relax my constraints a little bit. What would I do? I don't know. It's really hard. I don't have a good answer to this. No problem. I mean, that's a good answer.

[00:45:29]

Yeah. No, I really think it's really hard, and I'm not confident that there is a good way to do it well while adhering perfectly to the standards of just using the best arguments and evidence. I think you probably have to also get charismatic spokespeople, and you probably also have to buy lots of ad time, to really use the principle of familiarity, where if people hear something a lot, they're more likely to believe it's true. And it's not...

[00:46:02]

I don't feel great about that. But, you know, as long as you're staying relatively close to the integrity end of the spectrum of persuasion, it might just be necessary to try to get the right message out there. That's a better answer than I would have given. Before we get going,

[00:46:19]

are there any books that you've read that have had an incredibly meaningful impact on your life? Or what has had the most kind of impact, or changed your direction or your thoughts, to a profound degree?

[00:46:30]

So the set of books that has had an impact on me is a little different from the set of books I'd recommend to other people. The particular path that I followed, and the particular books that happened to be influential to me given the place I was at, are somewhat idiosyncratic. But one book that really influenced me in college was by a philosopher named A.J. Ayer. It's called Language, Truth and Logic. And this is kind of, I was going to say it's a philosophy book for people who are kind of skeptical of philosophy.

[00:47:06]

But I also would want people who aren't skeptical of philosophy to read it, because I think it would be good for them. It's this short, clear, down-to-earth, almost manifesto about the importance of clear thinking in philosophy, which would be a sort of vague way to put it. And it really had an influence on the way I think. I think I'm more likely now to do things like, if I believe something, let's say I believe so-and-so is unfair or biased, or let's say I believe rationality works, or whatever belief I have, I'm more likely to ask myself to make myself cash out that belief.

[00:47:49]

In terms of a concrete prediction: what do I expect to see differently in the world because this claim is true? So forcing myself to get really concrete, to the point where my belief could potentially be disproven by evidence, or at least made less likely by evidence. And this, I think, came out of reading Language, Truth and Logic. He was basically annoyed at the way philosophers would debate these kinds of empty, meaningless questions.

[00:48:16]

He was pointing out that a lot of these questions could not be cashed out in terms of concrete predictions. You can't refute them, or, yeah, they're sort of too empty.

[00:48:26]

There's this expression not even wrong.

[00:48:29]

Like, I'm not calling your paper wrong.

[00:48:33]

It's not even wrong. It doesn't even make enough sense to be wrong.

[00:48:36]

And so a lot of people think that Ayer kind of went a little too far in his condemnation of philosophy, and I would agree. But I think it's still a very useful, sort of bracing splash of ice water in your face, and a good sort of habit to have in the back of your mind when you're thinking about things and evaluating claims. And it's an easy read, as I said, fairly short. I think you can just get it online.

[00:49:01]

So check that out. Yeah, that's one. I would also, so obviously the work about human irrationality has been very influential to me. It's hard to pick a single book about that. I've already named Daniel Kahneman's book Thinking, Fast and Slow, which is dense.

[00:49:20]

Oh yeah.

[00:49:21]

I've already kind of answered your question, but I'll give you a couple more briefly. Oh, the other book about rationality that I would recommend is a new book by Phil Tetlock. It's called Superforecasting. And the thing that I said towards the beginning of the podcast, that the motivation for founding CFAR was that there really wasn't much research at all on how we overcome these biases, well, Superforecasting is an exception to the rule. Phil Tetlock has been doing amazing work studying interventions to try to make people better at making predictions about things, better at evaluating arguments, that kind of thing.

[00:49:58]

And this book explains what he did and gives some really pretty impressive results showing the effects of these interventions. Basically, Phil Tetlock's team of forecasters outperformed, by a huge margin, all the other teams of forecasters and experts in politics and economics who were trying to make predictions about world events. And that's because Phil's team was using these techniques. So it's a pretty inspiring story, plus being full of good data and examples. And then I guess the last thing I want to recommend, it didn't really change my thinking just because I was already kind of immersed in the subject.

[00:50:41]

But it's the kind of thing that would have changed my thinking and really inspired me if I'd read it a little earlier. It's a book by a friend of mine, Will MacAskill. It's called Doing Good Better, and it's basically applied philosophy. The premise of the book is that the way that people instinctively, kind of by default, try to help the world is really ineffective and inefficient. And it's driven by these kinds of System 1, sort of intuitive, heuristics that aren't always that epistemically rational.

[00:51:18]

So, for example, people are more likely to give to a charity if they see a picture of a starving child on the cover of the brochure. And that's very understandable; it really taps into our emotions and motivates us. But that often has no connection to whether the charity is actually effective at helping people and saving lives and improving quality of life. So effective altruism is, I mean, simply put, just the idea that for whatever level of money or time or effort or sacrifice you want to put into trying to help the world,

[00:51:50]

and we all have different levels that we're comfortable with, but for whatever level, there are vastly better and worse ways to do it. There are literally orders of magnitude of difference in the effectiveness of different charities trying to do the same thing. So just by kind of stepping back from your automatic emotional reactions and asking yourself, OK, what does the evidence show, you can save one hundred times more lives with the same amount of money that you're donating to charities over your lifetime, if you choose your charities well.

[00:52:20]

That's sort of the tip of the iceberg. But I like it. It's inspiring. It's sort of a great example of rationality being used to make the world a better place.

[00:52:31]

That's an awesome way to think about this as we head into the end. Listen, I want to thank you so much for your time today. And this conversation was fascinating. Thanks so much for having me on.

[00:52:45]

Hey, guys, this is Shane again, just a few more things before we wrap up. You can find show notes at farnamstreetblog.com/podcast. That's F-A-R-N-A-M-S-T-R-E-E-T blog dot com slash podcast. You can also find information there on how to get a transcript.

[00:53:05]

And if you'd like to receive a weekly email from me filled with all sorts of brain food, go to farnamstreetblog.com/newsletter. This is all the good stuff I've found on the Web that week that I've read and shared with close friends, books I'm reading, and so much more.

[00:53:18]

Thank you for listening.