Rationally Speaking is a presentation of New York City Skeptics, dedicated to promoting critical thinking, skeptical inquiry and science education. For more information, please visit us at nycskeptics.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Julia Galef, and with me today is our guest, Jesse Richardson. Jesse is an award-winning creative director from Australia. He is the man behind several very successful campaigns about skeptical thinking and rationality that have gone viral.
Basically, the way he describes his life's quest now is that he wants to use his advertising powers for good instead of evil. Jesse is the man behind what is probably, although correct me if I'm wrong, Jesse, his most well known and widely shared campaign, called Your Logical Fallacy Is, which is a catchy and attractively presented catalogue of different logical fallacies that's been shared and liked by hundreds of thousands of people on Facebook. So today we're going to talk about logical fallacies in general, the role that they play in discourse, and how we as skeptics should be talking about logical fallacies.
And also, more broadly, this question of how to spread principles of skepticism widely, and what some techniques are for doing that. So without further ado, welcome to the podcast, Jesse, it's great to have you. Thank you. Really lovely to be here.
So maybe to start off, you could just talk about what motivated you. Let's focus on Your Logical Fallacy Is for the moment: what motivated you to start that project?
Well, essentially, I wanted to use my design and communication skills for good instead of evil, as you mentioned, and to popularize critical thinking and make it more accessible. And I focused on fallacies as a kind of proof of concept for that line of thinking, specifically because they're something that I think everyone intuitively gets, even if they're not academic. I mean, kids kind of get the idea that something's not right about the logic of something, but they don't necessarily have a name to put on it. Being able to present a clear and coherent example and idea of what that gap in logic is can be quite empowering to somebody, and that light bulb moment sort of happens.
And that's a kind of intellectual gateway drug into other forms of critical thinking. And how have you seen people using Your Logical Fallacy Is so far? Oh, in a myriad of different ways, I suppose. Well, the website itself is set up such that you can direct someone to yourlogicalfallacyis.com/strawman, say, as a kind of exposition of the incoherence of their argument.
Supposedly. Allegedly. So it's used in a lot of social media contexts, obviously, on Facebook and Twitter and other forums, to hopefully somewhat quickly and succinctly apprise someone of the flaws in their logic and enlighten them as to how they might better construct their arguments.
Do you want to back up and just lay out the definition of a logical fallacy? I mean, as you said, I think people do have an intuitive grasp of it, but it can be helpful to sort of pin down exactly what counts.
Yeah, absolutely. So essentially a fallacy is a flaw in reasoning; there's an error in logic somewhere. And fallacies fall into a number of different categories, and there's not a whole lot of consensus on what exactly all those categories are. But the two biggest buckets are formal and informal fallacies. A formal fallacy is essentially when there is a flaw in the logical structure of an argument itself, whereas informal fallacies are fallacious due to their premises or misleading justification structures.
So the best and easiest way to think about it is that a formal fallacy is like a mathematical mistake in logic, whereas an informal fallacy is more like a rhetorical or argumentative mistake in logic.
Do you have any easy-to-recall examples offhand of formal and informal fallacies?
Sure. So a formal fallacy that's quite common is the gambler's fallacy. The gambler's fallacy is when people presume, oftentimes playing roulette or other games of chance, that if there's been a run of, say, three reds in a row, it's more likely black will come up the next time. But of course, the last roulette wheel spin has no physical bearing on what the ball does the next time, so black has just as much statistical chance of coming up as it did previously.
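As an editorial aside, the independence Jesse describes is easy to check with a quick simulation. This is illustrative only, not from the episode; it ignores the green zero, so red and black are each assumed to be 50/50.

```python
import random

def black_rate_after_red_streak(trials=100_000, streak=3):
    """Simulate red/black roulette spins and measure how often black
    comes up immediately after a run of `streak` reds."""
    blacks_after_streak = 0
    opportunities = 0
    run_of_reds = 0
    for _ in range(trials):
        color = random.choice(["red", "black"])
        if run_of_reds >= streak:
            # This spin follows a streak of reds; record what happened.
            opportunities += 1
            if color == "black":
                blacks_after_streak += 1
        run_of_reds = run_of_reds + 1 if color == "red" else 0
    return blacks_after_streak / opportunities

# The result hovers around 0.5: the streak has no bearing on the next spin.
print(black_rate_after_red_streak())
```

Despite the intuition that black is "due," the measured rate after a streak stays at roughly one half.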
So that's a good example of a formal fallacy, because statistically, mathematically, the deductive reasoning there is just incorrect. Whereas an informal fallacy might be something like a straw man argument, where you misrepresent somebody's argument as a kind of rhetorical trick, to make it seem like they're saying something they're not. Oh, interesting.
I actually would have classified the gambler's fallacy as informal, and I would have thought of a formal fallacy as something like: P implies Q; Q; therefore P.
Exactly, affirming the consequent. Yeah, exactly. So that's another example of one. And this is actually a good example of how there is some disagreement as to whether the gambler's fallacy is a formal or informal fallacy. I would classify it as a formal fallacy, because it's a mistake that's deductive rather than inductive. You could also argue that there's induction in terms of presuming probability, but to my mind there's a clear statistical reality there aside from any inference.
So it's a good example of just why there's disagreement within philosophical circles as to what counts as different categories of fallacies.
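Forms like the one Julia mentions can be checked mechanically by enumerating truth values: a form is invalid exactly when some assignment makes all the premises true and the conclusion false. This is an editorial illustration, not from the episode; the helper names (`implies`, `is_valid`) are my own.

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material implication: 'a implies b' is false only when a is true and b is false."""
    return (not a) or b

def is_valid(premises, conclusion) -> bool:
    """A two-variable argument form is valid iff no truth assignment
    makes every premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # counterexample found
    return True

# Affirming the consequent (P implies Q; Q; therefore P) is invalid:
print(is_valid([lambda p, q: implies(p, q), lambda p, q: q],
               lambda p, q: p))  # False

# Modus ponens (P implies Q; P; therefore Q) is valid:
print(is_valid([lambda p, q: implies(p, q), lambda p, q: p],
               lambda p, q: q))  # True
```

The counterexample for affirming the consequent is P false, Q true: both premises hold, yet the conclusion fails.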
Yeah, my personality is such that I tend to love these projects of categorization and making taxonomies of things. Absolutely. And I'm going to try to explain why I like it, although I'm not sure these reasons are really my real motivation; it's just sort of my post hoc attempt to explain what's motivating me. But I think what's happening is something like this: categorizing things makes the concepts stickier, meaning it makes them easier to remember and use.
And it also helps me understand the relationships between the things. So, having these buckets of formal and informal fallacies, or of inductive and deductive fallacies, or, I don't know, probabilistic ones... what other categories of fallacies would you have?
Well, there are those kinds of subcategories as well, within the informal and formal. There are, for example, fallacies of relevance, such as red herrings and so forth; causal reasoning fallacies, such as false cause; generalization fallacies, such as composition, division, and hasty generalization; and there's ambiguity. And there's actually a really fantastic resource, the Fallacy Files, at fallacyfiles.org. You mentioned that before; I think it might have been one of my very, very first Rationally Speaking picks, years ago now.
Yeah. They've got a taxonomy of all the different fallacies, and they've sort of nested them into various groups underneath each other. Seeing that in a visual context is quite satisfying and clarifying, I think, for understanding the relationships between the different groups. Yeah.
Yeah. So what I was going to say is, even though, as you were saying, there's some ambiguity in how to classify the various fallacies, I think just the act of classifying them and naming them the way you do on the poster at Your Logical Fallacy Is is quite helpful in making the concepts sort of sticky and understandable, which is, I would hypothesize, a big factor behind the success of that campaign.
Yeah. It's interesting insofar as I think when we're talking about things like fallacies, it can get quite complicated and can seem quite heady. But as I was saying before, there's a strange kind of dissonance there, because it also feels so tangible and intuitive that there's something wrong. So I think if we can distill those messages to be as clear as possible, that can be quite helpful in terms of not only gaining clarity, but also promoting better thinking more generally.
So do you have any favorite fallacies or least favorite, I suppose, depending on how you look at it?
Yeah, I suppose probably the fallacy fallacy is one of my favorite fallacies to mention, only because it exposes a common mistake a lot of people make with regard to fallacies: presuming that if someone has committed a fallacy, their argument is therefore wrong, and their point is wrong, and everything they've ever said is probably wrong as well, in the extreme version of the fallacy fallacy. Exactly right.
And their parents are wrong, and they're just factually wrong about everything.
So what that does is expose the fact that logical coherence doesn't have any bearing on truth value. You can argue for something that is entirely true using fallacious reasoning and terrible arguments, which is painful to watch if you happen to agree with the person that's arguing. Or, on the flip side, you can be arguing with perfect logical coherence for something that is an entirely false conclusion. So the coherence of an argument itself is what the fallacies deal with.
The truth value is an entirely different proposition that goes into argumentation more generally.
Yeah, and it's really striking to see. The conflation of the soundness of a logical argument with the truth of its conclusion is such a natural cognitive urge: to pronounce something sound if the conclusion is actually true in real life. Absolutely.
And vice versa as well. If you see someone argue for something fallaciously, it's very natural and intuitive, I suppose, for us to then presume that their argument is going to be unsound, and everything else that follows from that. And there's certainly a heuristic at play there, in that there probably is some amount of correlation, in the sense that people arguing for factually incorrect positions, such as anti-vaccine advocates or climate deniers, are the ones in whom the employment of fallacies is ever so much more readily brought out, like machine gun fire.
And so we intuitively deduce that it's not just correlation but perhaps causation there, when that may or may not be the case. That's right.
Yeah. And in fact, this ability to distinguish the logical validity of an argument from the truth of its conclusion is one of the things that psychologists use as a metric to test people's abstract thinking abilities. Like, if someone can look at the argument... well, this is going to be a bad example, because I just made it up, but: if Dr. Bob is always right, and Dr. Bob claims that climate change is a hoax, then climate change is a hoax. Or, I guess, more properly it would be: if Dr. Bob says something, that implies it's definitely correct.
And Dr. Bob says climate change is a hoax; therefore climate change is a hoax. That argument is logically valid. However, the premises are not actually true, and therefore the conclusion isn't established. But it does take a certain level of abstract thinking ability to be able to say that, and not to say, well, you know, that argument is flawed.
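To make the validity versus soundness distinction concrete, here is a short editorial sketch (my framing, not from the episode): the Dr. Bob argument is an instance of modus ponens, so it is valid, but soundness additionally requires the premises to actually be true.

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    # Material implication: false only when a is true and b is false.
    return (not a) or b

# Premise 1: if Dr. Bob says X, then X is true   (says -> x_true)
# Premise 2: Dr. Bob says X                      (says)
# Conclusion: X is true                          (x_true)

# Validity check: look for an assignment where both premises are true
# but the conclusion is false.
counterexamples = [
    (says, x_true)
    for says, x_true in product([True, False], repeat=2)
    if implies(says, x_true) and says and not x_true
]
print(counterexamples)  # [] -- no counterexample, so the form (modus ponens) is valid

# Soundness = validity + all premises actually true. Premise 1
# ("whatever Dr. Bob says is correct") is false in reality, so the
# argument is valid but unsound, and establishes nothing.
premise_1_actually_true = False
sound = (not counterexamples) and premise_1_actually_true
print(sound)  # False
```

This is the abstraction the psychologists' test probes: recognizing that the form checks out even while rejecting the argument for its false premise.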
Yeah, our intuition, I think, is to presume that coherence is correlated with reality. So it's certainly something that we learn through practice; especially as people become more versed in critical thinking, that abstract reasoning ability becomes more natural, in terms of questioning logic at that kind of granular level of detail.
So we've already started to touch on one of the main things that I wanted to discuss with you, which is that I've developed a kind of hesitation about pointing out logical fallacies ever since I started getting involved with the skeptic movement. And I think there are a number of dimensions to that. One of them is related to what you said about the fallacy fallacy: that pointing out that someone committed a fallacy does not invalidate the rest of their argument, or other arguments that they make.
And sometimes people act as if that's the case. There's a great quote that I found in a book, which was another one of my Rationally Speaking picks, called Historians' Fallacies, by David Hackett Fischer, a historian by trade. He says that all great historical and philosophical arguments have probably been fallacious in some respect, and if an argument were a single chain, then if one link failed, the chain would fail. But actually, most historians' arguments are not single chains.
They're rather like a kind of chain mail, which can fail in some place but still retain its shape and function for the most part. Which I think is a vivid way to picture why a single fallacy doesn't mean you get to dismiss everything someone says. Absolutely.
But there's this other way that I think pointing out fallacies can be not very helpful, which is that often the things people call fallacies aren't, in fact, fallacies. And there are a number of ways this can happen, but often it takes the form of people saying, when they point out a fallacy, something of the form: well, X doesn't prove Y. Like pointing out the appeal to authority:
well, just because this expert said it doesn't make it true. Or pointing out an ad hominem fallacy: well, just because this person is a crook doesn't mean they're wrong about this, et cetera, et cetera. Yeah, sure. And the thing is, if the world worked on pure logical principles, then yes, it would be very useful to point out that X doesn't prove Y. But in reality, we're almost never looking for logical proof, because that's sort of impossible.
In the real world, what we're looking for is evidence that's at least moderately strong for Y. And so often it is the case that, you know, if someone has relevant expertise and they claim something strongly, then that's pretty good evidence for what they're claiming. Or if someone has done a bunch of things in the past that indicate they're not that big on sticking to the truth, then that's pretty good evidence that we shouldn't trust what they say, depending on what it's about.
But like. Yeah, you know what I mean.
I totally get what you mean. And to my mind, attempting to have a constructive conversation with somebody is a very different situation to arguing with a rabid ideologue who is attempting to propagate and peddle misinformation, perhaps even quite dangerous misinformation, about, say, not listening to doctors' advice and taking natural remedies instead, these sorts of things. Or advertising or political or media-related things that use fallacies in a way to distort and manipulate and misconstrue things in order to affect an outcome for a particular agenda.
I think in that instance, pointing out fallacious reasoning and the trickery involved is particularly helpful and relevant, and it should be called out. However, when you are attempting to understand somebody else's point of view, and perhaps come to a reasoned conclusion, just shooting off at them, "you made this fallacy, therefore you're wrong," isn't necessarily going to help further the conversation. So I think it's a matter of being discerning about when and where to apply that. To my mind, one of the most important things about learning the fallacies is that you start to see them when you're looking at other people's arguments, and through identifying them in other people's behaviors, that in turn helps you identify the fallacious reasoning that we're all subject to within our own minds, and to stop trusting our brains quite so much, which is, I think, a very important thing for everyone to learn.
So I totally get what you're saying about the sort of misuse and overuse of pointing out fallacies. I think in some contexts that can be not as helpful as in others, but in other contexts, I think it can be vitally important that we call out dodgy logic, especially when it can cause harm. Well, yeah.
I mean, I agree with everything you said, basically. But I just want to be specific that the thing I'm complaining about now, and I'm not complaining about you, but the practice that I've seen happen, is not about pointing out genuine fallacies when that's not the most constructive thing to do. It's pointing out fallacies that aren't actually genuine fallacies. Or, I suppose, a more subtle variant of this that happens
a fair amount, which is failing to give a charitable interpretation of what people are saying. So often, if you take people literally, which I think for whatever reason skeptics and rationalists are disproportionately inclined to do, people are saying things that are fallacious. So often people will phrase things as if they're saying that X proves Y: that so-and-so is an expert.
Therefore he is correct about this. But actually, if you were to be charitable to them, what they almost certainly mean is that the person's expertise provides strong evidence for their view. Absolutely. And, speaking of enjoying taxonomy, I've been trying to create this taxonomy of ways to be charitable to people's arguments. Mm hmm. Because you appreciate a nice, catchy handle for a phenomenon, and you might have already heard of this one, but one of my favorites is the steel man.
Does that ring a bell? I haven't heard it, no.
Please enlighten me as to exactly what it means. Oh, good, I love getting to tell people about the steel man. So, you know what a straw man is. I think that's on your poster, but for the benefit of our listeners, if they don't know: it's bringing up, or arguing against, a weakened sort of caricature of what someone is claiming, which is, you know, dumber than what they're actually claiming; misrepresenting their argument. Yeah. And misrepresenting it in a way that's easier for you to knock down, the way
a straw man is easy to knock down. And so the steel man, by contrast, is the opposite of that. It's taking someone's argument, which may not be the most well-constructed argument, because humans aren't great arguers by nature, and sort of fixing it for them, and saying, well, you know, here's a stronger variant of this argument, or, to make this argument stronger we could add this assumption, and then dealing with that stronger, and therefore more interesting and more worth discussing, version of the argument.
So I've been thinking about a taxonomy of ways to do that. And I think one of the ways is just to assume that people are inadvertently exaggerating the strength of their claim, like saying "X, therefore Y" instead of what they really mean, which is "X gives some evidence for Y." And another example that I see happen in these discussions is something like... so, communication involves all of these unstated premises or assumptions, which is just inevitable, because we can never actually lay out all of the premises behind what we're saying.
Like, if I tell you, oh, you should go to the store, the premises I'm not stating are things like: I'm assuming that you want milk for your cereal tomorrow, and I'm assuming that the store has milk, et cetera, et cetera. That's just implicit; that's just understood in the way we communicate. I don't have to say it all the time. And so sometimes I think that results in kind of sloppy arguments, behind which are actually pretty decent arguments.
And a recent example of this, which you might appreciate: I saw this quote from Gwyneth Paltrow, who, I don't know if you've followed any of her activity in the last year or so.
And... yeah, right. So she was most famous for being an actress, and now she's sort of this lifestyle, you know.
Yeah, lifestyle guru and merchandiser or something. And she's really big into natural things and sort of spirituality. And she had this quote about people saying you should wear sunscreen. She expressed skepticism and said, you know, I don't see how the sun could be bad for you, because it's natural. And people kind of understandably jumped all over this; you know, science popularizers and skeptics jumped all over this.
And the thing is, I think she's wrong, but I didn't like the way that people claimed she was wrong. The arguments tended to be things like, well, you know, here, have some arsenic, that's natural too. Yeah, sure. Do you think that nothing natural can be bad for you? And so the literal form of the argument she made, which is "if it's natural, it must not be bad for you," is fallacious.
Yes. But I think there was this unstated assumption behind what she said, which is that she didn't really mean anything natural is good for you. She meant that for things that are natural, that humans have historically always been exposed to, and we as a species are still around and still basically healthy, it's harder to see how those things can be bad for you. I don't think she would have argued with the strychnine or arsenic example.
So, interestingly enough, it's a straw man of sorts, right? It's sort of a fallacy in a way. And I completely agree with the approach of trying to take a charitable sort of understanding of your opponents, and your "opponent" is, after all, just a human being.
I find myself using the phrase "your opponents" as well, even though I don't like to think that way. But it is sort of the natural way to characterize how we tend to see people within our system of thought.
It is an adversarial context, you know, especially when we're talking about debates and so forth. But to go back to your point, I think the appeal to nature fallacy is a good example of that. A lot of the time, people don't intend to commit fallacies, in the sense that the adversarial framing in itself presumes there's intentional manipulation. There certainly is intentional manipulation, committing fallacies deliberately, often from the media and from politicians and from advertisers and from people who perhaps have a financial agenda.
But when you're talking with somebody who's earnestly expressing an opinion, a lot of the time the fallacious logic that they're employing is inadvertent. And it often is a heuristic of something that has some amount of validity and truth value. I.e., a lot of people fall into the appeal to nature fallacy because, of course, a lot of the things that we've evolved alongside are very adapted and relevant to the ecosystem in which we find ourselves.
And so there is an amount of relevance and truth there, as to why someone might conflate those things. And obviously, I get what you're saying: in its logically coherent form, to presume that because something is natural it's therefore good is obviously wrong. But if we're to be charitable to someone's point of view, we can say, OK, well, I see what you're saying: this is an evolved context in which we find ourselves, so how do we understand that?
So, to be clear, I think that even in the charitable version of her argument, the steel man version, she's still wrong, but for a far more subtle, less obvious reason. Right. A simple way to say it would be just that the way in which the sun is unhealthy for us, in that it gives us skin cancer, is not a thing that tends to reduce our genetic fitness by very much, because by the time the sun has had the opportunity to give us skin cancer, we've basically reproduced as much as we're going to.
Right, so it's not going to be that much of a selection effect. Well, I mean, the thing to bear in mind as well is that, obviously, people with white skin evolved white skin because they were in areas that had a lot less sunlight, rather than living around the equator. So there are a lot of complicating factors there. People with white skin aren't as adapted to being in as much sun, and yet some of them live in Brisbane, in Australia.
And I can speak to that pretty readily, being half Dutch and half Irish or so.
But that's a subtler mistake. And I want people to criticize her for the actual mistake, and not for the dumb straw man mistake.
Yeah. And then you get into a situation where it's sort of a back and forth of attempting to take things down, instead of actually focusing on the point at issue. So I think it's like anything; it's how you use it that's important. Calling out dodgy logic when it's potentially harmful is, I think, really important. Trying to get to the core of what's actually at issue is obviously something to be aware of as well.
But I totally get what you're saying, in terms of it being more constructive and effective, a lot of the time, to try and understand someone's point of view and listen to the intent of what they're saying, rather than to try and smack them down immediately by telling them they're wrong because of the fallacious logic involved. Pointing it out obviously can be helpful, and depending on the context and the objective you have, I suppose, that determines what the approach should be in terms of efficacy.
You had mentioned something to me before the podcast about how, over the course of this project, your focus had started to shift from the structure of the fallacies themselves to the psychology that motivates the fallacies. Is this what you were talking about, or were you thinking of something else?
Yeah, no, I mean, that's kind of relevant. And I suppose that's, to me, one of the most interesting aspects of the fallacies, and why I think it's really important for us to become aware of them generally: what's going on there in terms of the exposition of our own psychology, as to why we commit these fallacies seemingly quite intuitively, you might say. Because I think underneath most of the fallacies that we can find, there's actually a mechanism whereby we are not metacognitive; we're not thinking about our own thinking.
We are not aware of the machinations of our subconscious mind attempting to justify a belief that we have that we don't want to let go of. And to me, it's a really fundamental shift in thinking when you start to become aware of your own thinking, and you start to become aware that your own brain lies to you, that you maybe shouldn't trust your brain, and that maybe sometimes you should approach your own thinking with a measure of doubt, and actually analyse it, and take a step back from yourself to go, hang on.
Why am I shifting goalposts now that someone's exposed a flaw in my thinking, and changing the premises of my argument? Is it because I'm holding on to a belief, or is it because there's actually some value to this and I've just argued it poorly? It could be either of those things. But when we become aware of fallacious reasoning, both in our own thinking and in others', it can be quite elucidating in terms of being able to expose that psychology that underlies things subconsciously.
Because as I said before, I think a lot of people commit fallacies without any malicious intent to manipulate anybody. It's, moreover, a defensive reaction from their own psychology to protect the beliefs that they have.
I genuinely wonder, though, whether being given these lists of fallacies or of cognitive biases does, on net, help people recognise the flaws in their own thinking. I can certainly see a plausible story for how it would, and I have examples, in fact, of noticing it. For example, I was talking before about how having categories and handles for concepts makes them stickier; I think having names for fallacies does that, and I've seen that benefit myself.
So, you know, having the catchy phrases and images, like cherry picking or No True Scotsman, I really feel like I notice myself committing these things more because I have names for them. So that is quite helpful on the one hand. Absolutely. On the other hand, I've seen many concrete examples of people who... well, for example, I'm thinking of one friend who learned all about cognitive biases, and now it's hard to have a disagreement with her, because anything that you say that disagrees with her, she will say, oh, well, but you're just biased, because... and then she has some reason for why you can't have an objective position on this issue, because it goes against your interest for whatever reason.
And it's really just become this get-out-of-evidence-free card that she gets to wield whenever she wants. And it's an example of... yeah. No, totally.
And I think it's like, to a man with a hammer, everything looks like a nail. Yeah. And there's that kind of thing going on there. And there's a continuum, I think, of how instructive, how helpful, how elucidating learning biases, fallacies, critical thinking, and argumentation more generally is to people, both across different people, but also over time as well. So I think that an introduction to critical thinking is obviously not the end point for a lot of people, and perhaps being enamored with fallacies or cognitive biases, and having your mind be kind of semi-fixated on that, over time, that's probably going to taper off.
But the inculcation of that into one's own mind, being aware at sort of a meta level of cognition, in oneself and externally as well, is a powerful analytical tool. And the subtle ways in which becoming cognizant of these sorts of things affects various other aspects of our lives, everything from diet to health care, to who we vote for, to all these sorts of things, from an individual level to a societal level, is obviously extremely complex.
My position on that would be that there are, I think, certainly potential negatives to having that kind of knee-jerk reaction, especially with the fallacy fallacy, and not engaging any further. I think being charitable to whomever you're talking with, and trying to understand their point of view, is important. And with various other forms of cognitive biases and whatnot, there can be an initial flush of interest. But that initial flush of interest, for, say, maybe a 14-year-old who's grown up in a context where they've had nothing but dogma given to them for their entire life, can be quite a pivotal moment.
Whereas for someone who perhaps is less inclined to have their mind altered in such a way, perhaps it's less of a significant shift. But the net effect, I think, is a positive one, in terms of understanding and furthering a more progressive and enlightened world. Hopefully. I'm not totally sure that's true.
But it could be true. Yes, it's true.
I signed I signed probably about maybe 70 percent upwards probabilities for that there.
So along these lines, there have been some criticisms of the Your Logical Fallacy Is poster among the largely positive reception. And the criticisms tend to be around the issues that we've been discussing: that it oversimplifies what actually counts as a fallacy, or that it encourages people to wield these things as weapons, to attack arguments for things they don't like, et cetera. And so in our remaining five or ten minutes,
I wanted to talk about this trade-off between getting it really, really accurate in your communication and making something that goes viral, because I do think there's a trade-off there.
There is. And I mean, I've ruminated on this quite a bit because, as you say, there's a trade-off there.
Yeah, yeah, yeah. Just in terms of the title, there's a lot inherent to the website that is kind of adversarial, you know, "your logical fallacy is," and I've seen it used in places in ways that weren't the intent. It was meant to be more tongue in cheek, and you're not meant to be using this as a one-shot thing; it should be part of a broader thing. But by the same token, what was attractive about doing it this way to me is that I saw this as a way that it could get social media traction and popularize critical thinking more broadly.
And now it has gone viral, with over five million unique views, and it's up in thousands of schools all around the world because the poster is Creative Commons, and it's also doing a lot of good in terms of getting critical thinking messages out there. So on balance, I think it's worthwhile. It's not that there are no negatives, and I think those are kind of valid criticisms in a way. And also how it's employed matters as well. So, using it as just a throwaway thing to try and smack down someone's argument, when they were earnestly and genuinely attempting to engage, is a very different thing to if someone's saying, you know, don't take chemotherapy, take this very special water
because it's natural. Calling that out as a potential fallacy, I've no problem with someone being quite blunt about it, you know, that that's fallacious logic. So there's a continuum there. Right. And sort of to that point, I think that we as a sceptic community tend to make things quite complicated, and there's an echo chamber effect, when what we should really be doing, what I think is the most important project we should be undertaking, is spreading rationality and critical thought to the broader community, popularizing it amongst people who don't already identify as sceptics.
And to do that, there is some trade-off between the simplification and the communication modes that we might employ to do it. What we're trained in, in advertising, is how to distill things to a really simple and engaging message that has relevance to the target market, in the case of marketing, but that applies more broadly to human psychology. And that equation, from a consequentialist point of view, is to me a very clear reason to sacrifice some amount of nuance for the sake of spreading critical thinking, rationality and scepticism to a broader community.
Yeah, you know, someone critiquing the critics of your poster made a good point, which is that the people criticizing the poster are, in fact, committing a kind of fallacy themselves: the Nirvana fallacy. Have you heard of this one? Yes. Right. So for our listeners, the Nirvana fallacy is basically arguing as if some endeavor is bad or shouldn't have been done because it's not perfect, when the actual question is, you know, is it better than what other things could have been done with those resources?
Or, I mean, I imagine there are other reasonable questions you could ask, but saying it's not perfect, therefore it should not have been done, is not really a great argument. Yeah, and just to further elucidate the fallacy, this bothers me every time people make an argument against gun control that takes the form: well, come on, if someone really wants to get a gun, they still can. I'm sure that's true.
But that doesn't actually address the question of whether gun control would reduce deaths from gun violence.
Well, I mean, it demonstrably does, because Australia was a very clear example: more stringent gun control had a very unequivocal effect on shootings. But I totally take your point. It can be quite frustrating when those kinds of conversations occur.
And I'm realizing what a dumb move it was for me to introduce the topic of evidence about gun control at a time when I don't have time to go into the political arguments about it, so I'm going to regretfully leave that thread unfollowed. But yeah, along these lines, maybe the last question I want to ask you is what you've learned about making rationality- and skepticism-related memes go viral, aside from the general advice to keep things simple.
You've done more campaigns than just Your Logical Fallacy Is. Is there anything you've learned across those campaigns?
Yeah, I mean, beyond just rationality, there are some principles of marketing, communications and effective communication more generally, which are employed by the advertising industry and so forth, and there are some quite relevant learnings there, I think, for anyone who wants to communicate. There's a lot of money spent on research to sell us crap that we don't need, and that can actually be used for more noble and worthwhile purposes. So I could bang on about that for hours.
But just as a takeaway, the holy trinity of marketing communications is simplicity, engagement and relevance. So: trying to distill messages to their simplest form; trying to make your messages engaging, in terms of whether there's actually a hook in what you're putting forward, or whether it's a kind of self-indulgent waffling about something you're interested in that someone else might not be; and intrinsic to that, being aware of your audience, being aware of how receptive they're going to be and why they might be receptive to what you're talking about, and adapting to that context. Which is an interesting thing more generally in terms of critical thought: being less insular in our thinking and actually considering things more broadly and going, well,
what's the efficacy that I'm looking for here? What's the outcome I'm looking for? Who is the audience that I'm talking to? And actually considering strategically what we're trying to do in terms of the outcome, rather than just being on the autopilot of, you know, people should listen to me because I'm right, or whatever other mechanisms are going on.
We're just about out of time, but I will close with a nice illustration of the difference between pure critical thinking and what I might be tempted to call rationality. This comes from my colleague and friend Kenzi; I can't take credit for it. I never had a great answer for that question, what's the difference between rationality and critical thinking, and then I saw someone ask her, and she came up with this on the spot.
So she said: well, imagine that you have two people who are trying to decide where to go for dinner. And so they research all the restaurants in the area, and they read reviews of the different restaurants, and they weigh the potential bias of the reviewers, and they come up with pro-and-con lists, et cetera, et cetera. And by the time they've narrowed it down to the finalists, it's midnight and all the restaurants are closed. These people are exercising critical thinking, in looking for biases and evidence, et cetera.
But they're not exercising rationality, because they're not actually accomplishing anything that they want to get done. That's fantastic. It reminds me of that
saying, you know: knowledge is knowing that a tomato is a fruit, and wisdom is not putting it in a milkshake.
Right? Yeah. All right. Well, we're just about out of time; this seems like a good place to wrap up. So we'll move on now to the Rationally Speaking pick.
Welcome back. Every episode, we invite our guest to introduce the Rationally Speaking pick of the episode, and I ask our guests to choose a book or other work, or even an organization, that has influenced their thinking substantially over the course of their career, changed their mind, or shifted their focus in some way. So with that, Jesse, what's your pick for the episode?
Actually, I'm going to pick a countryman of mine, Tim Minchin, who has been not only influential but entertaining, and is just a wonderful human being more generally. And what I thought might be an interesting kind of experiment is if we attempted to use the army of rational sceptics listening to Rationally Speaking to help popularize rationality by trying to help a particular work of his go more viral than it already has. So one of my favorite things in the whole world is a YouTube clip called Storm, by Tim Minchin, which I'm sure you're aware of, Julia.
And I was thinking that if we all shared that on our social media simultaneously, as we can with the podcast, since everyone listens to it over the course of a week or so, that might help to start some waves of rationality that would perhaps be quite effective in terms of spreading critical thinking and skepticism to a broader community, because I think that using humor and well-constructed pieces of animation and so forth is a really, really effective way of getting the message out beyond our own community.
So the ask, I suppose, is for everyone to go to YouTube, type in "Tim Minchin Storm," and share that video on social media, and we'll see what effect we can have in terms of upping the view count over the coming weeks.
You know, this is actually maybe an unintentionally appropriate pick for this episode, because Storm was somewhat influential in my arc as a skeptic as well, but in kind of an interesting way, in that I did share it. I came across it and was like, oh, this is so great, and I shared it. And I got some pushback from some very smart, educated friends of mine who felt that it was a little bit snarky, or a little obnoxious.
Right. And so it kind of caused me to step back and examine the messaging, just as we've been talking about in this episode. But I'm not sure that it was the wrong thing to do to share it, because, as you say, it's a very catchy and funny clip that is totally, you know, viral-worthy, and it has gone viral and has the potential to go even more viral. And I'm not at all sure that the net effect is bad
of sharing Storm. So it's kind of an encapsulation of my confusion around this trade-off between nuance and accuracy on the one hand and popularity and overall effect on the other. Yeah.
I suppose it comes down to a kind of virtue-ethics versus consequentialist point of view, doesn't it? And to my mind, it's worthwhile. And I mean, it's a judgment call, obviously, to some extent as well. I find a lot of the things that Tim Minchin does in particular just so endearingly human and funny and awesome that the kind of, I suppose, snottiness in it, to me, is more playful than offensive.
And I mean, certainly that was my first reaction to it as well. Like, I either didn't notice the snarkiness, or it felt sort of playful, or unimportant in the broad scheme of the point of the piece, or something like that. But it was taking the piss.
Well said. All right. Well, thanks so much for joining us, Jesse. It's been a pleasure having you on the show.
Lovely. Thanks so much.
This concludes another episode of Rationally Speaking. Join us next time for more explorations on the borderlands between reason and nonsense.
The Rationally Speaking podcast is presented by New York City Skeptics. For program notes, links, and to get involved in an online conversation about this and other episodes, please visit rationallyspeakingpodcast.org. This podcast is produced by Benny Pollak and recorded in the heart of Greenwich Village, New York. Our theme, Truth, by Todd Rundgren, is used by permission. Thank you for listening.