[00:00:00]

Today's episode of Rationally Speaking is sponsored by GiveWell. GiveWell takes a data-driven approach to identifying charities where your donation can make a big impact. GiveWell spends thousands of hours every year vetting and analyzing nonprofits so that it can produce a list of charity recommendations that are backed by rigorous evidence. The list is free and available to everyone online. The New York Times has referred to GiveWell as, quote, "the spreadsheet method of giving." GiveWell's recommendations are for donors who are interested in having a high altruistic return on investment in their giving.

[00:00:30]

Its current recommended charities fight malaria, treat intestinal parasites, provide vitamin supplements and give cash to very poor people. Check them out at GiveWell.org.

[00:00:53]

Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Julia Galef, and my guest today is John Nerst.

[00:01:03]

John works as a data scientist based in Sweden. He blogs at everythingstudies.com. And the reason he caught my eye is that John has essentially invented a new field called erisology, which doesn't quite exist yet. But I think it absolutely could and should. It's the study of disagreement, and that's what we're going to talk about today. John, welcome to Rationally Speaking.

[00:01:28]

Thank you so much for inviting me. It's a pleasure. So, John, I said you work as a data scientist, but I got the sense from reading your website that your background is in philosophy. You've published philosophy papers. What was your actual degree in? I have an engineering degree. It confuses people a little bit because I did write one work in philosophy, but it's not published in any journal or anything like that. I don't know if maybe you don't have the equivalent in the United States, but at a certain level, when you've studied a subject for a certain time, you're supposed to write your first original work.

[00:02:09]

And that doesn't that doesn't have to be publishable. But you have to write, you know, an original work.

[00:02:15]

And I have an engineering education that's rather unusual, because I have studied a lot of topics and, you know, fields that engineers typically don't study. You hear a lot of complaints from some people these days that engineers don't learn enough humanities to become well-rounded people and, you know, understand people very well. Well, when people make those complaints, they're sort of asking for more of me to be created, basically.

[00:02:49]

But is there more structure to the degree than that? Is it, like, you know, a formal program you all did?

[00:02:54]

Very much so, yeah. It was started about 20 years ago now. I started it when it was only six years old. It was a reaction to a perceived need for more well-rounded engineers, so they took a regular engineering education, five years long, and they took out everything that wasn't exactly needed. They took out, you know, all the math and technology that was more than was needed to be called an engineer.

[00:03:26]

You know, so that you would still qualify as an engineer. Yeah, but all the rest, all the empty space, they filled up with courses in history and philosophy and economics and economic geography and economic history and business and all that sort of thing.

[00:03:40]

And we had to read a lot of research, humanities research about technology and its impact on society and what it means and how how it comes about, how scientific knowledge works, how it is produced and all that.

[00:03:57]

So yeah, I picked it because I found it fascinating. It was a little bit of a risk because it was entirely new. Right. I had no idea what my employment prospects were, you know, coming up with this degree.

[00:04:14]

But I've done fairly well for myself.

[00:04:17]

Man, I would be so curious to see how differently people with this degree, compared to a normal engineering degree, perform at jobs and, you know, differences in how their employers see them or, you know, what they've achieved in 15 years or something like that, although I imagine it'll be hopelessly corrupted by selection bias.

[00:04:35]

So I don't know how much we can learn from it, but yeah, like everything. Cool. Well, I guess with that background, I'm less surprised than I was that you sort of invented an interdisciplinary field. Why don't you tell us, first off, just what the word erisology means and where it came from.

[00:04:56]

Well, it's a new field, it's a field of study that I want to exist, and all self-respecting fields of study need to have a Greek name, of course.

[00:05:04]

And Eris is the goddess of discord, the Greek goddess of discord, who created or started one big disagreement that eventually led to the Trojan War. So that's a pretty good name for me. It's great. Yeah.

[00:05:21]

And it goes well with the with the suffix ology.

[00:05:24]

So good on the naming.

[00:05:28]

Thanks. I don't remember exactly how I came up with it, but yeah. And as you said before, it is the study of disagreement, and I wouldn't exactly say that it's a new field, because a lot of people do research and write things that are relevant to it. Yes. You know, of course there's moral foundations theory, and there's a lot of political science about what different ideologies people have. And there's a lot of philosophy about how arguments work.

[00:05:55]

Right?

[00:05:55]

Well, it seems interdisciplinary in the same way that, like, the field of decision theory or decision science is interdisciplinary, in that, like, you have, you know, psychologists working on the descriptive side of it, like here's how people actually make decisions. Then you have computer scientists or philosophers or economists reasoning about how people, quote unquote, should make decisions, for some meaning of "should."

[00:06:16]

So I got the sense that erisology was kind of interdisciplinary in a similar way. Yeah, very much so.

[00:06:23]

That's what I'm thinking. Behavioral economics is a good example of a similar field that, yeah, brings together economics and psychology and all those things.

[00:06:33]

You know, erisology, when I'm saying it, I'm thinking of something that takes material or insights from many different fields, including philosophy, of course, and anthropology, where you can study how differently people see things in different cultures, and psychology, of course, how people work, and economics, which studies signaling behavior, which is also an important factor.

[00:07:04]

And, of course, cognitive science, which studies how concepts work in our heads, because I think that's very important. How we represent information in our heads is relevant for how we interpret information out there in the world and how other people interpret that information differently, which is a huge thing for disagreements.

[00:07:26]

So, yeah, there is material from plenty of fields that you can bring together into one. I think erisology as a concept should exist as a center of gravity for all the insights that are relevant to understanding disagreement and how it works.

[00:07:43]

And when I say disagreement, I'm perhaps mostly thinking of online disagreement and what happens when people are fighting online, which happens all the time. I didn't realize that you were envisioning that as the center.

[00:07:58]

I think I do, because, well, the vast majority of disagreement that I come across is online. Because I don't really get into fights with people in real life, and I don't really see people fighting about things, I mean, in real life that much.

[00:08:14]

But if you go into a comment thread on Reddit or, you know, go onto Twitter at any time, or read a forum or anything, people will be disagreeing with each other and they will be misinterpreting each other and they will be misrepresenting each other and all that. And I've been reading comment threads and forum discussions online for probably 20 years.

[00:08:39]

And I haven't actually been counting, but I'm thinking I might be up to that famous 10,000-hour mark.

[00:08:45]

So I'm an expert in online disagreement and Twitter threads.

[00:08:50]

I'm picturing your eyes looking hollow and vacant after your 20 years of reading Twitter disagreements, or the equivalent before Twitter. Twitter disagreement is the latest evolution, really the most powerful stuff. For most of the Internet's young life,

[00:09:11]

it's been forum discussions. That has been, you know, the standard format of disagreement. And that's a little bit less virulent than on Twitter, a little less chaotic.

[00:09:23]

So have you developed any theories over the course of your 10,000 hours or so about which formats, you know, whether Reddit threads or Twitter or Tumblr or something,

[00:09:34]

Facebook, which formats online are more or less conducive to good disagreements?

[00:09:40]

Yeah, I think as I was saying, I think Twitter is probably the worst of everything.

[00:09:45]

How come? And not just why do you think that, but why would that be the case?

[00:09:49]

Oh, because you know that as well as me, that it's terrible. But, no comment.

[00:09:57]

I mean, what features of Twitter do you think are causing it to be worse? I think the reason it's worse than forums is that forums have a set context. You know, in a forum, there are regular people, the forum might have a theme, there's a certain etiquette in the forum, so we have a better understanding of what the context is in a forum. The thing with Twitter is that there is almost no separation into different contexts.

[00:10:29]

You can just see something at any time, something pops into your field of vision

[00:10:37]

that comes from, what I would call in idea space, a very, very distant context. You know, people who believe very, very different things than you.

[00:10:47]

And they might be, in theory, you know, in concept space, very, very far away. But on Twitter, that sort of distance doesn't exist at all, so all the walls between different contexts are broken down.

[00:11:00]

So you can see this fragment of alien thought that really, really might annoy you.

[00:11:08]

The interesting thing about that theory, or that factor you've pointed to, is that it would be present even if everyone was perfectly good-natured and, like, emotionally charitable and not, you know, looking for outrage or looking to criticize or, like, assuming the worst about people. So it's almost a pessimistic model, in that, like, even if everyone were angels, we would still have this cognitive problem of, like, misinterpreting or misunderstanding what people mean because of lack of context.

[00:11:46]

And if we were perfect angels, it would play out differently than it does because we would probably be confused more than we are angry. Hmm. That's a good way to put it.

[00:11:57]

And I think in most cases when we are angry, we probably could be confused instead. I mean, we probably should be confused more than we should be angry.

[00:12:05]

There's this interesting thing people do, and when I say interesting, it's not really, it's annoying, where they say they're confused, or they say, like, I really don't understand, you know, why you think such and such. But they aren't actually expressing confusion. They're expressing, it's like performative confusion, where when they say "I don't understand," they mean, like, I think it's terrible that you think that.

[00:12:32]

And like, I would never. Yeah. So far from how I think. Yeah.

[00:12:36]

Yeah. That's the type of rhetoric, of the style: look, isn't it funny how those people do that over there. Well, it's interesting that all the people who are saying this are also saying that. Or we're amused, it's amusing, there's a lot of performative amusement as well online. Um, what do you think are people's biggest misconceptions about disagreement, either descriptively, like what's actually happening when a disagreement occurs, or normatively, like about how one should approach a disagreement?

[00:13:09]

People might have, in your view, mistaken beliefs about the best way to approach a disagreement?

[00:13:16]

Well, the second one is trickier, because people want different things out of the process of disagreement. Like, you don't necessarily want to do something.

[00:13:26]

You know, when you're disagreeing with somebody, you don't necessarily want to do something that I would consider to be constructive, like effectively communicating an idea or engaging in a mutual process of evaluating an idea or a set of ideas together, which would be, you know, ideal function of a public debate. But often you just want to you know, you want to slam the enemy. You want to get a good zinger in there or you want to impress your friends, or maybe you just want to you just want to vent.

[00:13:58]

You're having a bad day or, you know, you want to build bonds with other people who are watching the disagreement and you want to show that you're on their side.

[00:14:09]

So I think we're making a mistake if we're assuming that everyone wants to disagree in a way that results in effective evaluation of ideas.

[00:14:19]

I'm a little more interested in, you know, when there are two people who at least consciously think they are trying to disagree about ideas, what do you think they do wrong?

[00:14:31]

Well, one thing that people tend to do wrong, I think, is to assume that a disagreement means that one of the parties is wrong. You know, someone is right and somebody is wrong, and that's what we're trying to find out. Hmm. I think that's rarely the case.

[00:14:45]

I mean, of course, people disagree about things that have right and wrong answers, like math theorems or what's the capital of Spain, and these questions have real answers.

[00:14:58]

But those aren't really interesting disagreements. They don't cause any sort of chaos. They don't erode the public sphere or damage public debate or anything like that.

[00:15:11]

They're very simple. We don't need an elaborate theoretical construction to deal with those in most cases. But often we're dealing with a disagreement where people are disagreeing because they have each adopted very low-resolution beliefs, something very abstract and general.

[00:15:31]

If somebody believes the capitalist class is exploiting the workers, and the other person thinks we must let entrepreneurs create wealth for all of us, or something like that. Those are very, very abstract kinds of beliefs.

[00:15:50]

They don't really get proved or disproved, because none of them map onto reality in any simple way, in a straightforward way.

[00:15:59]

Yeah, they are more stories than they are beliefs. And I think many of the things that people disagree about in the most complicated ways are beliefs of this kind: very low resolution, very abstracted, more story-like than fact-like. So that's a big misunderstanding, I think. And with beliefs like that, they are not true or false. They are typically kind of true or kind of valid.

[00:16:25]

I mean, true or false doesn't even apply to them.

[00:16:27]

Exactly. Yeah. So we need to understand that proving that you yourself are correct doesn't mean that the other person is wrong, and vice versa. There's this cartoon that sometimes gets shared on Facebook of two people pointing at a symbol scratched into the ground. And one person looking at it from one angle says it's a six. And the other person looking at it from the other angle says, no, it's a nine. And the point of the cartoon was, you know, they can both be right.

[00:16:55]

I'm sure it was said more casually in the cartoon. But then there was an updated version of that cartoon that got shared more widely, at least in my corner of Facebook, that added another caption saying, like, no, actually, they both can't be right, because, you know, there's a truth of the matter about what the original artist intended when he drew that symbol. Like, which way was he standing when he drew it? Was he standing such that it was a six, or a nine? Like, don't try to create false agreement when there actually, you know, is a truth about who's right and who's wrong.

[00:17:24]

That's actually quite interesting. I haven't heard that before. It's interesting because it does this thing that's kind of problematic with analogies. It takes two features that don't really transfer to the thing the analogy is trying to represent. Yeah.

[00:17:42]

For instance, nobody created reality and meant something with it. And they were still disagreeing about some pretty basic fact. Now, the real world is much, much more complicated than that. And we're not trying to find one single fact, like is it a six or a nine, but we're trying to compress reality and represent it in a much smaller piece of information than, you know, actual reality. And we're doing it in different ways.

[00:18:08]

And we're often trying to discuss which one of these different compressions of reality is more valid. And that's an extremely hard question.

[00:18:18]

So I completely agree with you. And I think it's a good and, like, underappreciated point, that disagreements are often about these kinds of low-resolution abstract narratives, like capitalism exploits workers, or, you know, it's important to let entrepreneurs create wealth for us, and that it almost doesn't even make sense to talk about whether they're, like, true or false, because they're not detailed enough to really, you know, have truth value. Right. Right. So I agree with that.

[00:18:46]

It's a great point. But on my more optimistic days, I think that you could, if you actually had the time and good-faith willingness to put in the effort, sort of hammer out specifically, like, well, when I say that capitalism exploits the workers, here are the sort of more specific empirical claims that I'm making, and/or moral claims that I'm making. We can sort of factor out the two from each other and we can, you know, talk about the different components of my belief, which maybe I've never really consciously formulated before.

[00:19:21]

But they're kind of in there in the background, causing me to feel like the statement capitalism exploits workers is true. And if we really put in the effort, we could sort of figure out which parts of those views we agree and disagree about. And, you know, maybe some of it will boil down to empirical questions that we can't really answer definitively and we have different intuitions about. And maybe we'll end up having, you know, some moral disagreements.

[00:19:43]

But like, we could do that if we really tried. It's just that when we talk online, we rarely do. You know, how much do you agree with that optimistic view?

[00:19:53]

I think we definitely could, if we really, you know, try to hammer down the details as you said. What's interesting to me is that, well, I'm interested in how this all feels from the inside, how it feels in our heads. What do you mean? Let's describe it this way.

[00:20:13]

I mean, if particle physics is smashing particles together until they break, so you see what they're made of, disagreement is a way to smash minds together until they break and see, you know, what they're made of. So part of the reason I'm interested in disagreement is because it tells us things about how minds work.

[00:20:33]

Hmm. And yes, we could we could, you know, break down our high level, low resolution beliefs into more specific beliefs and debate them.

[00:20:44]

Absolutely, we could and we should do that. But I don't think that is how our beliefs feel in our heads.

[00:20:55]

I don't think that's the way we have beliefs always.

[00:20:58]

And you said that you don't believe that anybody necessarily has thought about their beliefs that way; they haven't thought about them in greater detail. And that's interesting to me, that we tend to keep our beliefs in our heads in not very specified forms. Yeah. And that is the level on which they differ. I mean, we can have beliefs that, you know, if you try to specify them in great detail, might even look the same.

[00:21:31]

But if we abstract them in different ways and hang slightly different connotations on them, they seem like they are different.

[00:21:39]

And we disagree about them. But we don't necessarily disagree. I mean, I had this friend when I was a student. He was a gender studies student. And we often discussed the large-scale patterns in society between, you know, the sexes.

[00:21:55]

And the more we spoke... I mean, I'm not a great fan of that whole theoretical construction, and he was, and we discussed it. And the more we were talking about this, the more we realized that when we pointed to individual facts about almost anything, we didn't really disagree that much at all. Hmm.

[00:22:13]

We believed, not exactly, but almost, that the same things were true. But only in the abstract, when we sort of took all these individual beliefs and turned them into high-level beliefs, did they look very different. And I think that happens a lot. Yeah, so this feels like it might be a way to describe what I would have gone on to call the pessimistic view, what I believe about disagreement on my pessimistic days, which is that you could sort of hammer out disagreements on those specific components about, you know, capitalism and the economy and wages and so on.

[00:22:52]

And even if you could hammer out that disagreement, it still wouldn't feel like you had actually resolved anything.

[00:22:56]

Oh, you mean do you think it wouldn't resolve anything or.

[00:23:00]

Well, it still wouldn't feel like you had done any useful work on the original disagreement that you cared about, which was about capitalism. I think I'm saying what you were just saying, that, like, you can talk about these specific components and the specific facts and still feel like there exists an important disagreement between the two of you that you don't know how to adjudicate.

[00:23:25]

And I don't know, maybe some people wouldn't call that pessimistic.

[00:23:27]

I feel an urge to be able to get to the bottom of these things. And that feels like a pessimistic state of affairs to me.

[00:23:37]

But, yeah, I mean, do you think the take away, assuming that's the case, that, like your experience with your gender studies friend is common? Do you think the takeaway is that you both just have, like, different emotional associations with these concepts? And that's why the disagreement still feels like it persists despite agreement on the facts? Or do you think it's there is a real disagreement there that you just can't quite get a handle on?

[00:24:07]

That's a hard question. I mean, there are factors that cause you to generalize patterns in different ways: I think your own experiences, whatever inborn temperament we may have for certain cognitive styles, or what other theoretical frameworks you've learned before, and of course your own, you know, emotional reactions to things.

[00:24:37]

What I think one should do, and what I try to do, is learn how to look at things in different ways. There's this model here, and it generalizes reality in this way, compresses reality in this way. And there's this other model that focuses on getting these very different features right. And when you're trying to get different features, to describe them accurately, you're going to use a different set of rules and abstract in a different way.

[00:25:06]

And there are many different kinds of belief systems that capture different parts of reality or the human experience. You know, some belief systems capture some things well and not others, and for other belief systems it's the other way around. So that's why you need to collect so many of them. You really should not have just one.

[00:25:32]

I think it reminds me of this quote, I think it was Robin Hanson that said it: philosophy is mostly useful as a defense against other philosophers. Where I mean, other philosophies or other philosophy. That suggests that the ideal situation is to never study philosophy, and then you won't need a defense.

[00:25:53]

Oh, no, no. There's this other quote also. I think it's about economics originally.

[00:25:58]

But I mean, it also applies to philosophy, which is that if you think you don't have beliefs about economics, you just have the beliefs of some dead economist.

[00:26:10]

Yeah, that's that's a paraphrase. But yeah, everybody has philosophical beliefs.

[00:26:15]

And if you don't understand, you know, the nature of philosophical beliefs or learn about other philosophical beliefs, you're not going to know what they are and you're going to be their prisoner more or less.

[00:26:30]

And that's why I want things like anthropology to be part of learning about erisology, because you learn how to

[00:26:38]

think about things in a very different way, because different cultures think about things in different ways.

[00:26:44]

Do you have any examples of new frameworks, new ways of looking at things, that you sort of consciously adopted, that ended up being valuable to you in ways you didn't expect? Consciously adopted?

[00:26:55]

Oh, I just mean, like, you didn't start out with it, you had to, like, seek it out and try it on. You know, between 10 and 15 years ago, I was really hostile to what people carelessly call postmodernism, this idea that there's no definite knowledge and there's no definite meaning to words or anything like that. I learned about it from people who were criticizing it and, you know, pretty upset about it, as many of us do.

[00:27:22]

And of course, I didn't I didn't like it either.

[00:27:25]

But it was a part of my education to learn how to read texts by people who were of this persuasion, who had this sort of attitude to life or to science and knowledge and all that. And it was annoying at times, because people had very different assumptions about what was important, what was interesting. And they did not acknowledge that there was a conflict here, that, OK, I'm making these certain assumptions and I'm ignoring this other thing here. Often people just don't say anything like that, and you just read it and you're supposed to just follow them along on the little journey they're going on.

[00:28:08]

And it's very frustrating to read something when you don't have the same background assumptions or preoccupations as the author. You just want to start arguing against them, like, every other sentence: no, that doesn't follow from that. Why do you care about that? That was not the takeaway from that last paragraph. You know, it's exhausting.

[00:28:26]

What's an example of an author in this camp? Well, quite a lot of... it's a carelessly used word.

[00:28:35]

But, yeah, we were reading texts by people like Andrew Pickering and Evelyn Fox Keller and Sharon Traweek. They're not super famous outside of the science and technology studies, you know, field.

[00:28:55]

So they typically described things like particle physics as an ideology rather than, you know, knowledge about reality.

[00:29:06]

So, yeah, it's very annoying to read somebody who does that when they don't even recognize that this is a very particular perspective. And my professor, he picked out people who

[00:29:23]

wrote from a perspective that is called methodological relativism, which means that when we describe historically why a particular idea became dominant in a scientific field, like why do people believe in relativity, or why did they believe in the germ theory of disease or whatever, we're not supposed to make any sort of reference to the fact that it was correct.

[00:29:50]

It was supposed to be described as purely a social process: who convinced whom, what sort of thing is convincing, you know, and for what reason and all that. And as far as I'm hearing, you're unhappy right now.

[00:30:04]

Yeah, it's just so annoying to read about. But what did you get out of it?

[00:30:09]

There's got to be a "but" coming here. Yeah, there is a "but" coming. I was annoyed by this because I took it as an attack on science and objectivity and, you know, everything one holds dear.

[00:30:26]

But there is something in this that is correct, and something about it that gives you important insights. And I have adopted and understood some of the philosophy called postmodernism, which I think is largely correct, namely that we don't have the absolute kind of knowledge that the earlier modernists, like the logical positivists and all that, believed you could get through a systematic study of science, that you could get certainty.

[00:30:56]

And I think that the criticism that postmodernism, as a philosophical movement, made of those modernist projects was largely correct, because they failed.

[00:31:12]

And the definitions of words, as I've talked about many, many times,

[00:31:20]

they are slippery. There are no objectively correct definitions of words in a metaphysical sense, in the way that philosophers have seemingly believed ever since Plato or before.

[00:31:36]

And did the logical positivists also say that, in a much clearer and more straightforward way? Maybe they did.

[00:31:43]

I'm not an expert. A different way to ask my question is: so, as you were saying, the quote-unquote postmodernists that you read were speaking from within this whole, like, worldview, and they weren't trying or able to sort of step outside of it and say, look, here are the assumptions we're making, here's the framework we're using, it's one possible framework, etc. And that framework was, like, kind of frustrating for someone for whom that wasn't a natural way to think.

[00:32:15]

But, like, as you're communicating what you got out of it to me, I'm just wondering, could you have gotten it much more easily from someone, you know, who thinks the way you do, who just says, like, hey, here's the value of science and reason and truth and objectivity, and here are the exceptions to the rule.

[00:32:36]

And you could have gotten that same insight without all the frustrating detours into postmodernism.

[00:32:40]

Oh, absolutely. That's absolutely true. And I mean, that's part of what I want to do.

[00:32:50]

And I've been Googling things like postmodernism for materialists. Oh, interesting.

[00:32:56]

And I was arguing with my professor that I found it really frustrating that these deep thinkers, these writers, did not put their theories on a sort of solid metaphysical ground. They didn't explain how it related to physical reality and all that. You know, how do you get this in a physical universe? And he said, well, you know, it's not so important here in history; we're more interested in studying power than studying metaphysics.

[00:33:27]

And that's that's something I didn't like. I'm more of a philosopher than a historian that way.

[00:33:33]

So you want to be a translator? You want to be a guide between worlds, that is? Yes. Yeah. That's something we need, because that sort of material doesn't exist, not as much as it should. Because there are certain shortcomings in how science works. I mean, it could work better: the knowledge production, knowledge distribution, and idea evaluation and all that, and the choices scientists make when they study certain things, especially in the humanities or the social sciences, less so in the physical sciences.

[00:34:10]

But still there, too.

[00:34:12]

They make certain choices, and some theories are adopted and others are forgotten or rejected. And as I said before, many, many scientific ideas are not so extremely detailed that they can be considered completely true or completely false, because people make generalizations in the scientific world as well. And all these phenomena can be studied as sociological phenomena. That is true. That is a valid approach.

[00:34:45]

I think so. But in order to absorb those insights better, you need to know that people aren't trying to tear down science and replace it with revelation or personal intuition or whatever it is. It's really important to understand that these are corrections to earlier overexuberance. I wrote in one of my pieces, I think, that we should understand the philosophical arguments called postmodernism as a reaction to the overpromise of the earlier modernists. But I was born in 1983, and that's four years after the book that's called The Postmodern Condition was published.

[00:35:37]

So I have grown up in, quote unquote, the postmodern era. So the arguments that they were, you know, trying to correct against, I don't even know them. I've barely heard them. I have not grown up in a world where that was assumed, so that it had to be criticized.

[00:35:59]

Yeah. You know, as you've been talking, and also as I read your blog, I've been trying to think about just, like, good principles to have in mind that might help prevent some of the more frustrating failure modes of disagreements. And it seems like one theme that's emerging is, like, understand what the other person is arguing against or what they feel needs correcting. OK,

[00:36:27]

so to give you an example: just as I was walking to the studio to tape this episode, I was on my phone on Twitter, and I was having this friendly disagreement with Russ Roberts, who does EconTalk. It started out as a disagreement over whether it would be useful to run a study, like just a survey, a long-term survey of people who were unsure about whether they wanted to have kids, and then look, you know, 20 years later, at the people who had kids and the people who didn't, you know, ask them about their life satisfaction, whether they regret their choice.

[00:37:04]

Oh, and then I also said, like, ahead of time, you should ask them all a bunch of questions, like, do you enjoy playing with kids? Like, are you satisfied with your life now? Do you feel enthusiastic about having kids? Like, what are your main reasons for hesitation? And then 20 years later, you could you could look at sort of what factors tend to predict people being happy with their ultimate choice.

[00:37:24]

And anyway, so Russ objected to this saying, like, you know, you can't learn anything from data.

[00:37:33]

You need to just, like, take the leap. Or, you know, I think he even said, like, reading fiction, like Jane Austen, would be more useful than running a study like this, which seemed completely absurd to me.

[00:37:45]

You're friends with this person? Yeah. No, no.

[00:37:49]

I, you know, love following him on Twitter, and it's a great podcast. So, where I was going with this, and I do think we made progress in our disagreement:

[00:37:58]

the ultimate result of the conversation was we actually agree, like, quite significantly that, yes, collecting information about how people feel about how their choices turned out, that is useful.

[00:38:13]

But Russ was much more concerned than I was about people like overweighting such evidence, especially if it's called like scientific evidence or, you know, the results of a study.

[00:38:26]

And he was sort of more concerned than I was about people, like, failing to mentally correct for things like, I don't know, selection bias or confirmation bias or all the things that can make a study, you know, less than perfect evidence.

[00:38:45]

And so it's possible he, like, overstated his position because he was reacting against what he thought my position was, which was that, you know, you run a study and now you know the answer to, like, whether you should have kids or not, which was never my position, but, like, maybe it is a lot of people's position. And so, yeah, maybe a takeaway from that, and from some of the examples you've been giving, is: when someone's making an argument, it's, like, hard to interpret that argument without knowing what it's an argument against.

[00:39:16]

Yeah, precisely. I think that's a great example of precisely the thing.

[00:39:24]

Uh, yeah. The thing is, people will assume all kinds of things that you're not actually saying, because we cannot communicate our full position when we're trying to say something, and you need the other person to just fill in the blanks. What's important in this little discussion that you were talking about, and the same thing I see all the time, is that we have these certain assumptions about what everybody else believes. You know, everyone in society thinks that, you know, experience is the only thing that counts and data doesn't.

[00:40:03]

You know, everyone just ignores scientific studies. And the other person says, no, everybody just takes them far too seriously, right?

[00:40:11]

Yeah, exactly. Yeah.

[00:40:12]

And what you believe is the case depends, of course, a lot on what you see around you, what environment you're in, you know, online or in real life or whatever.

[00:40:25]

And these things tend not to be.

[00:40:29]

They tend not to be explicit. We tend not to say, oh, everybody thinks this, so therefore I'm arguing this. This is unstated. This is implicit, usually.

[00:40:41]

So there are many moving parts here, because people have different ideas of what the background assumptions are. Yeah. And then what relationship the other person has to

[00:40:56]

the background assumption. Well, you know, we might have different ideas about what society thinks, and we have different ideas about what the other person thinks and how they relate to what society thinks, and why they got that image.

[00:41:09]

They may have gotten their image by misinterpreting people. Mm hmm. You know, I might go around and misinterpret what everyone says, so I think everybody believes something that they don't believe. So there are so many moving parts in this. Yeah. Is this related to the concept you wrote a post about, called zebras, where, like, the figure-ground relationship... It wasn't called zebras.

[00:41:31]

It was called "The Signal and the Corrective." But it was about correctives like this. I use the example of a zebra because you can call a zebra either a white horse with black stripes or a black horse with white stripes. And it's sort of silly, because it's the same thing. But we can have more complicated ideas, like, yes, we should trust science, but it also has these imperfections, or someone else can think, well, science is mostly useless and we should rely on intuition and, you know, our own experiences.

[00:42:02]

But I guess it can be useful sometimes. Yeah.

[00:42:04]

If they're trying to make some sort of practical decision about, should we trust this particular study and do this particular thing, they might, you know, come down very close to each other. They might agree, because they're both moderates in a way. They both agree that both perspectives have some value. And most people are like that. I mean, I don't think most people are these ideological zealots that just believe one thing and one thing only.

[00:42:33]

I think those are overrepresented among the people who shout the most on social media. I don't think most people are like that. Yeah, so it depends on which... I think it matters a lot which order you believe things in. Order in terms of, like, chronological, or no? No, I mean that something is the basic thing, and then there is the other thing that corrects it in the opposite direction, right?

[00:42:59]

Yeah. Yeah.

[00:43:00]

I mean, do you throw a ball far away and then it rolls back, or do you throw a ball a little way and then it continues to roll forward? The thing about having a signal and a corrective, and "the signal and the corrective" is a play on signal-to-noise, by the way, is that when we have a signal and a corrective, you tend to first be concerned that your signal is respected. I mean, your basic belief is something that needs to be respected.

[00:43:32]

Once you know that the other person isn't threatening that, I mean, you're not against science or you're not against, you know, personal experience fundamentally, then I will be prepared to, you know, show you that I kind of sort of agree with you, in that my very basic first-order approximation is not fully correct.

[00:43:53]

Right. Yeah. But it requires that the fundamental belief is acknowledged as legitimate first.

[00:44:01]

Yeah, I think that's actually a perfect description of what I think was happening with me and Russ, in that I really needed him to acknowledge that doing a survey, you know, doing the kind of survey I described, was better than zero evidence. Like, that seemed like a very fundamental point that I needed him to acknowledge. And he did acknowledge that. But his sticking point was he needed me to recognize that people overweight data and data is not conclusive.

[00:44:30]

And there are a ton of ways that it can, like, go wrong or not be relevant to your decision, which, of course, I believe. It's just more important to me that we first acknowledge that data is more than zero evidence. Anyway.

[00:44:41]

Yeah, I think I think our our social sensors play a big part here. We want to know if this person is an enemy or not.

[00:44:50]

I think of him as very aligned with me. But I guess as soon as he said a thing about survey data that seemed wrong to me, it felt like he was on the opposite side of some very important argument.

[00:45:02]

But yeah, anyway, because it changes the context of the conversation. I mean, if we're on the same team, then, you know, OK, we can talk about this a little bit more relaxed. Yeah. But if you're on, you know, the fundamentally opposite team to me, then I need to be on my guard, I need to defend myself, I can't give an inch. A topic that sounds like it might be related to this,

[00:45:23]

but either way I wanted to make sure to bring up at some point in this conversation, is decoupling. This is something I wish were more widely known, because it seems really relevant to understanding disagreements. And you've written a fair bit about it. Can you explain what decoupling is and how it relates to having good arguments? Yeah, it's a concept from psychology, and I think I got it from a psychologist called Keith Stanovich, who's done a lot of research into rationality and thinking. I'm not sure, though, that he would agree with my elaboration on this concept, because I've been using it fairly liberally and I've sort of developed my own idea about what it means.

[00:46:04]

But in the original version, I think he means that in order to think in a sort of abstract, hypothetical way, like in logic, you need to abstract away and, you know, get rid of all the real-life context that might help you understand a question. Like, if you're given a hypothetical scenario, like "if you robbed a bank," and you object, "but I would never rob a bank, I'm a moral person" or whatever...

[00:46:32]

Right, then you don't understand what a hypothetical question is, you don't understand what that kind of question does.

[00:46:39]

He talks about cognitive decoupling as removing all the possibly relevant context from a question and just thinking about it given the stated rules of the problem, where, you know, everything that is relevant is present. It's like a mathematical problem: everything that's relevant is stated, and you only use the information that is in the problem, right? And this is something that people are unequally good at; some people do it easily and other people don't. Yes, and that's what he calls cognitive decoupling.

[00:47:15]

I use this particular model to analyze the fight between Sam Harris and Ezra Klein about a year ago. Mm hmm.

[00:47:26]

Was it a year ago now? It was about a year ago, yes.

[00:47:32]

Yeah. I assume not everybody who's listening here knows about this story, but Sam Harris had the political scientist Charles Murray on his podcast after Murray had been, I don't know if he was assaulted, but something like that, when he was going to give a talk at a university. Yeah, and he's controversial because, among many other things, he believes and has said, as far as I understand it, that there's probably a genetic component to the difference in IQ scores between black and white Americans, which is very controversial, obviously.

[00:48:13]

And Harris had him on this podcast to support him when he had been, you know, through this incident at the university.

[00:48:21]

And Ezra Klein wrote, I don't know if it was an article right away, but he wrote a lot of things criticizing him, and his publication Vox, where he was an editor, published another article that was fairly critical of them.

[00:48:40]

According to Harris, it was really beyond the pale to to write what they what they had written.

[00:48:47]

Anyway, they they had this fight and they were on this podcast together when they were talking, talking about this.

[00:48:54]

And I used this idea of cognitive decoupling to describe the different ideals with which they approached the central question. Because what they were disagreeing about, it seemed to me, was: when you ask this scientific question, what are the causes of this gap in IQ that they were discussing, what factors do we bring in, what factors are relevant for, you know, examining this question and discussing it and trying to figure out what the truth is?

[00:49:35]

I don't know if you listened to the podcast where they're talking. They spoke for two hours, argued more or less, and a lot of it, as far as I remember, I listened to it twice, but it was a year ago now, was simply disagreeing about what things are relevant or not.

[00:49:51]

Was it disagreeing about what things are relevant to the empirical question about the IQ gap, or was it disagreeing about sort of morally which things I could imagine, you know, someone saying like, it's irrelevant whether there is or isn't an IQ gap. That's not even the right question to ask. The right question to ask is should we be talking about this? And I think the answer is no, for reasons X, Y and Z, that's different from saying like there is no IQ gap because reasons X, Y and Z.

[00:50:18]

I think asking that question in the first place is sort of an exercise in decoupling, because you're separating questions.

[00:50:26]

I can't get out of my decoupling mind.

[00:50:29]

I guess, you know, it's very interesting in terms of how you analyze this question, either from a scientific perspective, you know, is this true or not?

[00:50:39]

You know, or from this social perspective, you see what what sort of role has this issue or this belief played in history?

[00:50:49]

What consequences does it have, what implications does it have? What are the reasons people have believed it in the past? That's the sort of thing that Ezra Klein discussed as very highly relevant to why it was pushed in the first place.

[00:51:03]

And Harris was disagreeing about that.

[00:51:05]

But I mean, I feel like I also care about the history of this discussion. Yes, but I.

[00:51:12]

I assume you think it's a separate question and you try to treat it as a separate question.

[00:51:17]

I do. Yeah. Well, then I guess my I hope this question doesn't sound vain or, you know, self-congratulatory, but.

[00:51:25]

But then isn't decoupling just, like, strictly better than not decoupling? Because you can still talk about both questions, you're just talking about them separately. Or is that just a very decoupling-ish thing to say? I assume it is.

[00:51:42]

I think that if I were to say what Ezra Klein would say, it's that people use "no, that is not relevant" as an excuse to not have to talk about uncomfortable things. And I think he believed that both Harris and Murray said what they said partly because of their own beliefs and their own identities and their own personal experiences, and they didn't appreciate the importance and the impact this issue would have on other people. So I would say that I think he would argue that you cannot separate this, because in real life they are not separate, and you can't just wish away consequences and you cannot wish away historical factors, because they're there, and discussing things as if those things didn't matter

[00:52:43]

would be irresponsible.

[00:52:47]

Mm hmm. OK, I think that would be the argument. Now, by temperament I'm also very decoupling. I want to separate things: well, this isn't exactly the same as that, so, you know, these are two separate things. I do that all the time. But I do my best to understand why somebody would think that that's a cop-out, that it's a way to get away from difficult things.

[00:53:15]

OK, here's another proposal. Maybe the ideal is to be good at decoupling in your own head. And if you are good at decoupling in your own head, and you decide that, like, thinking about, you know, what it implies to people, or the harm that it can cause to even have the discussion about the factual question, is great enough that it's not worth having the discussion, then you just say that. Like, yeah, you know, this discussion is really not worth the harm it's going to cause.

[00:53:47]

That's my position. So then, you know, you don't actually have the two conversations. You just, like, recognize that there are two conversations and you, you know, decide not to have one, or something.

[00:53:58]

We run into a kind of rhetorical difficulty here, because if you're being upfront with the fact that you think that, well, it's irresponsible to have this discussion in the first place... You sound sort of unhappy, like with postmodernism.

[00:54:15]

Yeah. Yeah.

[00:54:16]

No, but if you say that, if you're being clear about the fact that you think that something should not be said because it's irresponsible, you're kind of admitting that it's not obvious that it's false.

[00:54:35]

I understand what you mean.

[00:54:36]

Yeah. If you say that we shouldn't discuss this because it's dangerous, you open yourself up to, oh, but, you know, it might be true.

[00:54:44]

You wouldn't say that if you thought it was false. No. Yeah, that's the thing. You wouldn't focus on that unless you believed that it might be true. So you can't really make that explicit. That's a big theme from that particular discussion, that being open about, you know, such reasons...

[00:55:01]

It doesn't work.

[00:55:02]

Yeah, I know.

[00:55:04]

I've been thinking about that particular thing a lot. And it's almost impossible to discuss, you know, quote-unquote dangerous ideas in a way, yeah, because my approach to anything is to just, you know, pick it apart and make everything as explicit as possible. And you can't do that, yeah, with a thing like that. Huh. All right, well, I'm sorry I'm making you depressed.

[00:55:33]

Yeah, I'm going to have to... I've pulled out enough hair in the last hour, I have to stop or I'll go bald.

[00:55:40]

So I really don't think that this is all terrible and that the discourse can't work.

[00:55:47]

But we need to know why. Yeah. OK, one last question I wanted to ask before we start wrapping up is: suppose that some generous funder offered you a million dollars to fund research into, you know, erisology topics, and asked you what question or questions you want to study. What would you pick?

[00:56:14]

To do your research, your original research?

[00:56:21]

You don't have to have like a research plan, but like, what questions are you curious about that you wish you understood better or that you wish you had more data on or something?

[00:56:29]

I've been very particular about not calling it research into a science of disagreement, because I'm unsure how useful scientific methods are here. Interesting. Because, as I spoke to in our earlier discussion, I think it's mostly a matter of integrating information and comparing it and, you know, analyzing it in a sort of philosophical way.

[00:56:48]

And philosophers don't really do research in an empirical way.

[00:56:53]

And I very, very much respect empiricism and the nitty-grittiness of it, but I'm not very good at it.

[00:57:00]

I'm a castles-in-the-sky kind of guy, so I haven't been thinking, oh, I want to do this experiment, maybe with a few exceptions.

[00:57:10]

I'm very often very disappointed in how surveys are done and the sort of questions that they have, because, as I said before.

[00:57:21]

Our beliefs are often very low-res, and so the questions are often extremely low-res as well. It's often the case that I'm thinking, what does this even mean?

[00:57:31]

Yeah, so you can answer the questions, you know... If the abstractions they make match your own abstractions, you can answer them pretty well. But if they make abstractions that just cut across your own and don't match them, then they just don't make any sense to you, so you answer "somewhat" or whatever. I would like to work on ways to improve how surveys work, you know, to write questions in such a way that you make it possible for people with different sorts of internal abstractions to answer them.

[00:58:15]

And maybe write: oh, this question does not match the structure of my head, it doesn't fit into anything where I can produce a yes-or-no answer. Huh. Interesting. Yeah, you can say whether you agree or disagree with an issue, with a question or statement or anything like that, and sometimes, when you do, like, political questions, you can also answer: I think this is very important,

[00:58:43]

or this is not important. That's also a very big dimension that people have been ignoring.

[00:58:48]

Yeah, but I miss an option that says this question does not make sense to me.

[00:58:54]

Hmm. Yeah, it would be interesting to see how much the distribution of responses changed when that option is added. It might be a lot.

[00:59:03]

Like, I know that often on surveys, even just about factual questions, when you add the option, like, "I don't know," a ton of people answer that, and it makes you realize, like, maybe the results we got when we didn't have that option were kind of meaningless.

[00:59:19]

Yeah. Yeah, they are. I usually say that "it depends" is the answer to almost everything. Yeah, I was annoyed by that because, like, at my old workplace we did a lot of surveys and we wrote reports based on what came out. And I often complained about the questions, and my boss smiled and told me, well, you're not the target audience.

[00:59:40]

And, well, maybe I'd like to work on, you know, research methods to develop surveys that would have me as the target audience, you know.

[00:59:50]

I would also be interested in those surveys, like they would...

[00:59:54]

I nitpick survey questions often enough that I feel like I might qualify as being in that target audience, because...

[01:00:01]

Yeah, because, you know, a lot of the time we define people based on whether they answer yes or no to certain questions. But I think we should also define people in terms of what questions make sense to them, right?

[01:00:14]

Yeah. Yeah. Well, John, before I let you go: at the end of each episode, I like to ask my guest to nominate a book or article or some other resource. And I think for you, the question I'd like to ask is, is there a book or other resource that you think is either, like, a good exploration of some erisology-related question, or, you know, makes a contribution in some way to the new field of erisology?

[01:00:45]

Yeah, yeah, there is. There are many books that are important, but one in particular is one that, you know, almost ignited my love for picking apart very complicated controversies into tiny, tiny parts and looking at them from a hundred perspectives.

[01:01:06]

And there is a book called Defenders of the Truth: The Sociobiology Debate, by a sociologist of science, I think he or she might be a historian of science, anyway, called Ullica Segerstråle. Actually, I think it's a she.

[01:01:20]

And I think she's at the University of Illinois or something like that. But she's a sociologist of science, and she wrote this book analyzing in excruciating detail the controversy around E.O. Wilson's Sociobiology in the 70s. Sociobiology was this pioneering work that looked at animal behavior as biological adaptations. And it had parts, like a starting chapter and a closing chapter, that were about humans and how our behavior also could be seen as adaptations.

[01:01:58]

There was this massive controversy around this, I mean, of course, this is a controversial topic still, but then it was also massively, massively controversial and there were protests and there were academic criticisms.

[01:02:15]

All of that, lots of people involved. And she describes in detail the different beliefs about science, about society, about, you know, the nature of truth and the responsibilities of researchers and everything like that, that caused these people in the controversy, mostly the academics, to disagree so much. I mean, I read it 10 years ago, but I should really read it again, because I think it was fantastic. And it's really not that well known.

[01:02:51]

I've never seen anyone mention it. Yeah, that's fantastic.

[01:02:56]

I'm so excited to read that, at the very least for my tortured soul after, ah,

[01:03:05]

many threads about the slipperiness of disagreements and postmodernism.

[01:03:13]

Oh yeah. That sounds great. I'm going to go download it right away.

[01:03:17]

So we'll link to it. Remind me again the title of the book? Defenders of the Truth.

[01:03:24]

Great. And it has a colon: The Sociobiology Debate. So it's a long title. OK, great.

[01:03:30]

We'll link to Defenders of the Truth, as well as to your excellent website, everythingstudies.com, which has a bunch of posts on erisology as well as some other interesting topics. Highly recommended. And I also think everyone should follow you on Twitter.

[01:03:46]

So your Twitter handle is every-t-study? Everytstudy?

[01:03:54]

I didn't know how I would have to say it when I decided on it. It's basically "everything studies," but that's too long, so it's just "everystudy" with a T in the middle. Got it.

[01:04:03]

OK. Hopefully they can remember that. Yeah. Well, John, thank you so much for coming on the show. This was really enjoyable and enlightening for me.

[01:04:13]

You know, I'm sorry I caused so much anguish, but it was actually quite a good time.

[01:04:18]

I'll restore myself gradually over the course of the day. Yeah, well, till next time.

[01:04:25]

This concludes another episode of rationally speaking. Join us next time for more explorations on the borderlands between reason and nonsense.