[00:00:14]

Rationally Speaking is a presentation of New York City Skeptics, dedicated to promoting critical thinking, skeptical inquiry and science education. For more information, please visit us at nycskeptics.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Massimo Pigliucci, and with me, as always, is my co-host, Julia Galef. Julia, what are we going to talk about today?

[00:00:47]

Massimo, today we're going to kick off our discussion by talking about a rather shocking legal decision in Italy a few weeks ago. I think I've heard of the place.

[00:00:57]

Yeah, rings a bell. So, you know, feel free to jump in and correct my pronunciation here, as I'm sure you will need to.

[00:01:05]

So in 2009, there was a pretty severe earthquake in a city in Italy called L'Aquila. There were several hundred casualties. And just recently, six Italian scientists and a former government official were sentenced to six years in prison each, for failing to appropriately warn the public about the severity of the coming earthquake.

[00:01:33]

So this legal decision has sent shockwaves through much of the world, but especially the scientific community, because of the implication that failure to predict very uncertain natural disasters can be held against scientists legally.

[00:01:53]

And so it's sparked a lot of debate about to what extent it's reasonable to expect scientists to be able to predict uncertain things like earthquakes and you know, what the legal ramifications should be.

[00:02:06]

So we're going to talk about this case, but then expand it into a broader discussion of what scientists' responsibility is, or should be, to the public when it comes to risky, uncertain scenarios. Right.

[00:02:19]

Well, one of the people that was convicted was Bernardo De Bernardinis, who was the former vice president of the Civil Protection Agency; I actually knew him. Enzo Boschi was another one of the people convicted in this case: he is a geophysicist by training, he was at some point, I think, the head of the National Geophysical Institute, and I had also met him very briefly. I don't know either of them well, but I know their reputations, and they are reputed to be very good scientists.

[00:02:52]

And, you know, typically that means that they know what they're talking about.

[00:02:56]

But the case is interesting for a variety of reasons. I think, first of all, it may be the first time in recent history that something like this happens, that scientists are being held responsible

[00:03:10]

for what they said in public, essentially, and we can talk about what exactly they've been held responsible for. So, for instance, our collaborator at the Rationally Speaking blog, Ian Pollock, wrote a recent blog entry with his take on this whole thing. And he pointed out, for instance, that the American Association for the Advancement of Science did write a letter to the Italian president on behalf of the scientists that were on trial.

[00:03:46]

And the letter apparently claimed that the basis for the indictment was that the scientists failed to alert the population to the impending disaster. That's not quite right. What actually happened was that the scientists, De Bernardinis in particular, were on national television. He was asked, after there had been some tremors in the area of L'Aquila, which is in central-eastern Italy, at essentially the same latitude as Rome

[00:04:13]

but on the other side of the Apennines. There had been some tremors, and that is an area known for being prone to earthquakes.

[00:04:22]

So it's not like this was a big surprise to anybody. But, of course, rarely is there an actual big one as devastating as the one that hit in 2009. Now, what De Bernardinis said, when prompted by a question from a journalist, was that the tremors, in his opinion, were not likely to be indicative of a major one to come.

[00:04:44]

In fact, if anything, they could help, because they could discharge the seismic tension that had accumulated in the rock strata below L'Aquila, and therefore it wasn't likely that a major earthquake would break out.

[00:05:06]

Now, on top of that, prompted by a journalist's quip, De Bernardinis basically said

[00:05:12]

yeah, the local inhabitants should just relax and open a bottle of wine. Which, of course, was an offhand joke, but it turned out to be a tragic one, in that a lot of people apparently stayed instead of evacuating the area. And that's why, at least in part, over 300 people died. So it's not exactly that the scientists are being held responsible for not predicting an earthquake, which anybody who knows anything about earthquakes will tell you is in fact impossible: you cannot predict an earthquake as an individual event, only as a matter of frequency in certain areas.

[00:05:44]

Yes, we know perfectly well which areas of the world are more or less prone to earthquakes. But predicting an individual event is something we essentially cannot do, at least not at the moment. So that isn't the accusation. The accusation, basically, was of downplaying the risks.

[00:05:59]

And so the whole case hinges on whether the scientists downplayed the risks when they should have known better. Although, by downplaying the risks here, do you mean making the empirical claim that a major quake was very unlikely, or urging people not to take precautions?

[00:06:19]

That's a good question. So, you know, I think that what De Bernardinis actually said was that it was very unlikely that this was a harbinger of a major quake.

[00:06:29]

He actually even said in the interview that, yes, occasionally these things do happen before a major quake, but that in fact we think it isn't likely. And technically that is correct. I mean, as our colleague Ian points out, you can look at this issue from a Bayesian perspective. Right. So there is a difference between the probability of a tremor given a large quake, and the probability of a large quake given a tremor.

[00:06:54]

Mm hmm. Right. So the probability of a tremor before the quake, given that afterwards there was a large quake, is very high. But the probability of a large quake happening, given that you have observed tremors, is actually very low. You know how low it is: it's between one and three percent. And of course, the latter is the probability that is relevant, and the one that De Bernardinis was talking about.

[00:07:14]

So technically, he was in fact correct: given the tremors, the probability of a major quake is, in fact, low.
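
To make that asymmetry concrete, here is a minimal sketch in Python with invented, illustrative numbers (they are not the actual L'Aquila seismic statistics); it only shows how the probability of a tremor given a quake can be high while the probability of a quake given a tremor stays in the low single digits, as discussed above.

```python
# Minimal Bayes illustration with made-up numbers (not real seismic data).
p_quake = 0.001              # prior: P(major quake in the coming period), rare
p_tremor_given_quake = 0.9   # P(tremors precede it | major quake): high
p_tremor = 0.05              # P(tremors of this kind occur at all)

# Bayes' theorem: P(quake | tremor) = P(tremor | quake) * P(quake) / P(tremor)
p_quake_given_tremor = p_tremor_given_quake * p_quake / p_tremor

print(f"P(tremor | quake) = {p_tremor_given_quake:.0%}")   # 90%
print(f"P(quake | tremor) = {p_quake_given_tremor:.1%}")   # 1.8%, in the 1-3% range
```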

[00:07:21]

Now, there's a difference, of course, between saying that, which nobody, I think, could fault you for, because that's what the science tells you,

[00:07:30]

and then going the further step and saying that, therefore, people should not evacuate, that they can stay there, that they're likely to be OK.

[00:07:40]

That is actual policy advice, essentially, which De Bernardinis was giving, because in fact he had been, as I said, essentially the former vice head of the civil protection agency.

[00:07:53]

And so he was doing that in that capacity.

[00:07:57]

And, you know, he had been playing that role in the past. But there is, in fact, an interesting difference between what the scientists can safely say and what the practical policy that follows from that advice has to be. The two are not exactly tightly interconnected. You want your policy, of course, to be informed by the best science, but the best science doesn't necessarily dictate a particular policy. You could decide that,

[00:08:24]

well, even if the chances of a major earthquake are low, still we're going to evacuate, because we think the risk is too high compared to the possible damage, or whatever. There are other issues that come in beyond just the risk of the earthquake itself. If the earthquake actually does occur, what are the chances that many people are going to die, or how many buildings are going to collapse? And if it doesn't occur, how much money is going to be wasted in the evacuation process, and time, and that sort of thing?

[00:08:54]

So that's a risk assessment issue that goes beyond the simple risk of the earthquake itself.
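
As a toy illustration of that point, here is a small sketch; every number in it is invented purely for the example, and the value placed on a life is a placeholder for whatever a real risk assessment would use. It only shows that the evacuation decision depends on more inputs than the quake probability alone.

```python
# Toy expected-cost comparison for an evacuation decision (all numbers invented).
p_quake = 0.02                   # probability of a major quake (low)
deaths_if_stay = 300             # expected deaths if no evacuation and the quake hits
deaths_if_evacuate = 2           # expected deaths caused by the evacuation itself
evacuation_cost = 50e6           # direct economic cost of evacuating, in euros
value_per_life = 5e6             # placeholder conversion between lives and money

expected_cost_stay = p_quake * deaths_if_stay * value_per_life
expected_cost_evacuate = deaths_if_evacuate * value_per_life + evacuation_cost

print(f"Expected cost if people stay:   {expected_cost_stay / 1e6:.0f}M")    # 30M
print(f"Expected cost of an evacuation: {expected_cost_evacuate / 1e6:.0f}M")  # 60M
# The comparison can flip with different infrastructure, costs, or valuations,
# which is why the science alone does not dictate the policy.
```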

[00:09:02]

This confusion over inverse probabilities, or conditional probabilities, reminds me of the O.J. Simpson case, when Harvard law professor Alan Dershowitz argued that among men who beat their wives, only a very small percentage, like a tenth of one percent, go on to murder them.

[00:09:21]

But in a letter to Nature, the statistician I. J. Good pointed out that that's not actually the relevant probability.

[00:09:29]

The relevant probability is conditioned on husbands who batter their wives and whose wives are then murdered. And that probability is actually about one half: in half of such cases, when battered wives are murdered, the husbands are the murderers. So, yes, exactly.
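
For listeners who want the structure of Good's correction spelled out, here is a rough sketch; the two per-year figures are round placeholders chosen only to reproduce the roughly fifty-fifty split mentioned above, not the numbers from his actual letter.

```python
# Sketch of I. J. Good's point about conditioning on both facts (placeholder numbers).
p_murdered_by_batterer = 0.001   # P(battered wife murdered by her batterer, per year)
p_murdered_by_other = 0.001      # P(battered wife murdered by someone else, per year)

# The jury knows two things: she was battered AND she was murdered.
# The relevant probability conditions on both:
p_batterer_did_it = p_murdered_by_batterer / (p_murdered_by_batterer + p_murdered_by_other)

print(f"P(murder by batterer | battering) = {p_murdered_by_batterer:.1%}")       # tiny
print(f"P(batterer did it | battering and murder) = {p_batterer_did_it:.0%}")    # about 50%
```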

[00:09:48]

That kind of confusion is actually very common. But you would think that that is the kind of technical issue that could and should be sorted out in a court of law.

[00:10:02]

Right.

[00:10:03]

I've become so disillusioned over time about the degree to which probability and logic are taken into account in court cases. That's true.

[00:10:14]

Now, this court case, by the way, my understanding is, was decided by a judge, not a jury, which is actually likely a better situation, because juries are notoriously even more unreliable.

[00:10:25]

Oh, my God. One of my friends, who is an economist, was on jury duty a few years ago. Lord knows how an economist remained on the jury, because usually they get weeded out early on. But it was a medical negligence case, actually analogous in some ways to this case that we're discussing. And the question was whether the doctors should have run a certain test that would have revealed that the patient had this disease, and then they could have saved him.

[00:10:50]

So was it negligent for them to not run the test?

[00:10:52]

It was kind of a rare condition.

[00:10:53]

So it was, you know, pretty arguable that the doctors had no good reason to think this was likely enough that they should run the test. And to my friend, that was pretty obvious from the testimony. But he had such a difficult time in the jury room, because when he would make that argument, other jurors would say things like, literally verbatim, "You don't understand.

[00:11:15]

Somebody died here." That's right, that was the argument. Right. Yeah. Actually, if I may cite Ian, our colleague at Rationally Speaking: he does a good job of going through what he calls the huge cognitive problems on the part of the audience, which in this case would have been the jury, although of course we cannot assume that the judge himself wasn't subject to the same kind of cognitive biases.

[00:11:42]

But he lists several of them, which are very relevant to these kinds of cases.

[00:11:48]

One is neglect of probability: you know, "Don't give me the odds. Is it safe or not? I want a simple answer, yes or no." Well, I can't give you a simple answer; I have to give you odds. But most people don't think in terms of odds, or have difficulty thinking in terms of odds. Just look at how much money is made by the casinos as a result, for instance.

[00:12:06]

Well, ironically, I think the understanding that people can't think in terms of odds, or will interpret whatever odds you give them as a yes or no answer and then hold you liable, blame you, if something happens that you said was unlikely, makes people like doctors unwilling to give any probabilities at all. This is what I've found. It's been really frustrating, actually.

[00:12:31]

You know, when a doctor tells me there's a chance of something, like a chance of a complication from a procedure, or a chance of transmitting an infection, and so on, I usually try to press them to tell me how much of a chance there is.

[00:12:44]

And they say something to the effect of, well, it varies or I can't say and I have to just like draw the probability out of them, like more than 50 percent, like less than one percent.

[00:12:53]

And they still won't say. Which is one of the interesting complications of this L'Aquila earthquake case that we need to consider a little further: what is the chilling effect going to be, especially on expert advice and testimony?

[00:13:08]

But I think I derailed you, I'm sorry. Well, I wanted to hear what the other biases were; I didn't read his blog post.

[00:13:14]

And I also came up with a list of biases, so I'm curious how they compare. Yeah. So the second one he mentions is denial of personal responsibility, as in, "Nobody informed me of the risks of kayaking," or something like that. Well, you know, certain things carry certain risks, and it's up to you, perhaps, to inform yourself as an intelligent user of things. The third one is the bad guy

[00:13:38]

bias: "My son died on an operating table and someone needs to be held responsible." There's somebody out there right now saying, "You don't understand this!" Well, yes.

[00:13:47]

And that's very unfortunate. But, you know, that doesn't mean that there is an automatic answer where the blame is going to go.

[00:13:53]

The fourth one that he lists is hindsight, or outcome bias, and moral luck. Yes. Oh, I can talk about that.

[00:13:59]

So: how can you say it was the right call based on what you knew, when 40 families are grieving? Well, again, it's unfortunate that 40 families are grieving. But it may still have been the right call at the time, given what you knew at that particular time. Of course, with hindsight it turned out to be the wrong call, but that's not the way you want to evaluate it.

[00:14:20]

I mean, it turned out to have been incorrect, but it might have been probabilistically still correct. That's right.

[00:14:25]

So there's a difference between making a judgment call at the moment when you have the information available, before you see the consequences, and then after the events have unfolded. I mean, obviously after the fact, hindsight is 20/20, and you know whether the call turned out to be right or not. But the question usually in play here is whether the call was correct at the time, meaning: based on the information that the agents had at that time, was that the best way to call the situation?

[00:14:53]

Yeah, there. I'm sorry. I know you have more to talk about. OK, well, go ahead.

[00:14:59]

OK, I just wanted to comment on hindsight bias, because I just love the demonstrations of hindsight bias. They're so striking. There are basically two ways to demonstrate hindsight bias.

[00:15:08]

The standard way in cognitive science studies is to give scenarios to two statistically equivalent groups of subjects. One group of subjects is told what actually happened and asked, how obvious do you think it should have been that this was going to happen?

[00:15:28]

And then the second group is just, you know, told all the facts

[00:15:31]

that were available before the event, and asked: what do you think is going to happen? And people in the first group invariably say, oh, well, this is obvious.

[00:15:38]

Like, of course, people should have known. And then the second group, you know, a statistically equivalent group of people, could not actually predict what was going to happen. So in one experiment, based on an actual legal case, people were asked to estimate the probability of flood damage caused by the blockage of a drawbridge in a city. And the control group was told only the background information which the city knew when it decided not to hire a bridge watcher.

[00:16:02]

Then the experimental group was given that information, plus the fact that the flood actually occurred. And 76 percent of the control group concluded that a flood was so unlikely that no precautions were needed.

[00:16:15]

But of the experimental group, who were told that the flood actually did occur, fifty-seven percent said that the flood was so likely that failure to take precautions was legally negligent.

[00:16:24]

Yeah, exactly. Well, one of the two additional biases that Ian pointed out was moral grandstanding: "No risk to our children is acceptable."

[00:16:33]

Well, that sounds very nice, but in fact it's baloney. We always live with risk, no matter what we do. When you get into a car, there is a risk. There is a risk associated with almost any activity, even with not doing anything: there's a risk of something falling on your head while you're standing still.

[00:16:52]

This plays very strongly, psychologically, especially when it comes to children, because of course the death of young people, of young children, always has a high psychological impact. But still, to say that any risk is unacceptable makes no sense. That's the whole issue with risk: you have to decide what risk is acceptable, given the situation, the circumstances, the resources, and so forth.

[00:17:23]

But that risk is never going to go to zero. Yeah.

[00:17:26]

And I think your point that there's no way to completely play it safe, that even if you stay home something can still fall on your head, is really relevant here, because by warning the public you can cause a panic. You'd have a lot more people traveling and trying to leave town. And I haven't done the calculation, but I think it wouldn't be unreasonable to expect a rise, probabilistically, in casualties just as a result of panic.

[00:17:53]

Right. And in this particular case, you know, we've just seen a similar situation with the recent flooding in the New York, New Jersey and Long Island area because of Hurricane Sandy. Even though the authorities in several of those areas had actually issued mandatory evacuation orders, a lot of people stayed. Yeah, they made the choice of, well, I don't care, I don't trust the authorities, or whatever.

[00:18:17]

"I think I can weather the storm," whatever it is. Then somebody dies. And the question is: even if, from the point of view of the authorities or the government, you have taken the correct steps, you said, OK, you need to evacuate, some people are going to stay behind anyway. So it's not clear, for instance, what percentage of those 309 people who died actually died as the result of a mistake, assuming there was a mistake in the scientists' evaluations, because some of those people would have stayed behind anyway.

[00:18:46]

And we don't know what that percentage was. Depending on how much the population trusts the authorities in general, and science in general, and so on and so forth, that percentage may be very small or very large; it's really hard to tell. And of course, when you're talking about assigning blame, that is part of the deal, because the question there is how many people actually died because of the allegedly mistaken advice.

[00:19:10]

One more from Ian, and that is crying wolf: "In fact, the last two evacuations were false alarms; I'm not going anywhere."

[00:19:19]

That's what I was just talking about, people not evacuating and things like that. Right. But that's a good one, I didn't come up with that. Yeah.

[00:19:26]

That's also in play, because human psychology is what it is. And so some people say, well, I've heard this before, and you know, we survived the hurricane last time, so we're going to do just fine this time. Except that, of course, this time the situation is completely different, or relevantly different, so that certain damages, or the risk, are going to be higher.

[00:19:48]

Oh, I had forgotten to say, when we were talking about hindsight bias, that this case reminded me a little bit of what happened after 9/11, when some people pointed to pieces of intelligence we had received before 9/11 happened, which suggested the possibility of a terrorist attack like this. And they said, how could the government not have acted? That was so negligent. And, you know, it's understandable that they would react that way.

[00:20:10]

But there's so many other pieces of intelligence about things that never actually came to play or, you know, came to pass. And so it's much harder to decide, given especially how strong hindsight bias is, how credible we should have thought that intelligence was before 9/11.

[00:20:24]

That's right. Ian also brought up exactly that example; and I believe you that you didn't read his blog post. But yeah, that's another important point, which brings up, again, the idea that you need to estimate whether a particular call was reasonable at the time, not with hindsight, because with hindsight you know what happened, and you have a hell of a lot more information than the person or people who were making the decision at that time.

[00:20:56]

And of course, it's difficult to put yourself back into the mind of people at the time before the event happened and see exactly what kind of information they had access to, how they process that information and so on.

[00:21:09]

So that, I think, is the major issue.

[00:21:11]

I think it may be the major culprit here for the misunderstanding between what is a fair way to establish blame and what is an unfair way of establishing blame.

[00:21:23]

Now, that said, let's look at the issue for a second from the other perspective, and let's ask seriously whether, in fact, the scientists in question do bear some blame. I think they do bear a degree of blame.

[00:21:37]

Although I think that six years in prison, for whatever blame they had, for whatever they can be blamed for, far exceeds whatever degree of blame we're talking about here. I think that's insane.

[00:21:49]

Just as a side note, they are appealing, right? Yes, they are. And according to the law, nothing goes into effect until the appeals process is exhausted, and there are two levels of appeal beyond the first-degree conviction. Do you have a sense of how long that takes?

[00:22:05]

I'm just curious. That varies significantly, depending on whether the proceedings are put on a fast track or not. But let me just give you an example. The former prime minister of Italy, Berlusconi, has been convicted several times of several crimes and has never done a single day of jail, because the statute of limitations ran out before the appeals process finished, or something like that happened. So he basically never got to the end of the process.

[00:22:39]

So, yeah, it can take years. So I don't think we're talking about any actual danger of these people going to jail any time soon. But they were also barred from ever holding public office again, and since most senior Italian scientists are government employees, that presumably means these people losing their positions. So there are serious consequences.

[00:23:04]

Now, as I said, regardless of how one comes down on the actual blame, six years seems outrageous.

[00:23:15]

But I think the more interesting question, as far as we're concerned, is whether there was in fact any blame at all.

[00:23:26]

I think that if there is any blame, it is in how De Bernardinis and colleagues communicated the information to the journalists, and therefore to the public. Because, you know, it's one thing to say, OK, technically this is the situation:

[00:23:44]

here is my best estimate for the probability of an earthquake, and so on and so forth. But that's not the way they put it. Number one, they didn't explain, for instance, the difference between those two probabilities from the Bayesian perspective we were talking about. Now, you don't typically expect scientists, when they talk to the public or to journalists, to bring in Bayesian analysis as a way to communicate, because that's not a particularly effective way of communicating with the public.

[00:24:11]

But it is possible that the mistake, if there was a mistake here, was in communicating things in an ineffective way, in a way that was not clear, which then opened them up to blame once all the tapes of the interviews were played back. Now, that's a distinct question, of course, from asking: well, had they actually explained things correctly, would the public have understood?

[00:24:35]

Yeah, right. It sounds like you're pointing out something they could have done more strategically for their own benefit, but not necessarily something that would lead to a better outcome overall for the public.

[00:24:48]

Well, not just that; we don't know, right?

[00:24:51]

Yes and no. So, yes, at the very minimum, if they had actually communicated in a clearer, technically more correct way, they probably would have been more likely to save their asses from a legal perspective.

[00:25:07]

Right.

[00:25:08]

That's true. But the broader question is: to what extent can the scientists themselves be blamed for a misunderstanding, on the part of the public or of the journalists, of something that they actually said clearly and technically correctly? Right. I mean, after all, yes, you do have to be conscious of who your audience is, and therefore talk to your audience appropriately, and all that sort of stuff. But once you do that, it's really out of your hands at that point.

[00:25:39]

You know what?

[00:25:40]

What the audience understands depends on their own kind of biases, their own background knowledge in the area, their own trust in science and authorities and so on and so forth over which you have absolutely no control.

[00:25:53]

Yeah. Although, I agree up to a certain point. There is a spectrum, so the threshold is fuzzy, but at a certain point, if you have enough confidence about how the public, how your audience, is going to interpret what you're saying, then it's hard for you to just throw up your hands and say, well, you know, I said the correct thing.

[00:26:11]

Like, I run into this problem all the time when I talk to the public about rationality. When I use the word rationality, I intend it to have the same sense it does when it's used formally in cognitive science, which, I don't want to define it right now, but essentially it's different from the way the public thinks of rationality.

[00:26:32]

So just as one example, when I use rationality, it refers to some coordination of your intuitive and your analytical decision-making systems, whereas the public, as I've learned, thinks of rationality as being just about the analytical decision-making system.

[00:26:49]

And, you know, I could use the word in the way I mean it, but I just know that ninety-nine percent of the time that's going to lead to huge confusion, because they're thinking of it in a different way. So the onus really is on me, given that I know how I'm likely to be understood, to explain what I mean. And that's a good point.

[00:27:06]

I mean, I have a similar experience whenever I hear people talking about logic. "Oh, this is the logical thing to do." What people normally mean is that it's the reasonable thing to do.

[00:27:17]

Right. It's just not the technical meaning of "logical"; I mean, it's not like they're making a deductive argument here. And the same goes for "rational," too.

[00:27:25]

I think to most people, when they say "that's not rational," what they mean is "I disagree with you."

[00:27:31]

Well, that's right. That's right. And that is even worse. So, you're correct; in that case there is an additional degree of confusion. So what we're doing, basically, is unpacking the possibilities of blame here. Assuming that the scientists in question said something that is technically correct, which, from what I understand, they did, then they cannot be blamed for that. But then there is the question of how they communicated that information to the public.

[00:28:00]

And then there is the further question that you raised:

[00:28:03]

given what they knew, because these are in fact people who are used to talking to the public on a regular basis, especially De Bernardinis, about how the public was going to take that information, was that, even though it was perhaps technically correct, the best way to present that information?

[00:28:21]

So there, I think, there may be some degree of blame. Sorry, but it still seems like, to the extent they screwed up, it was in failing to cover their own asses, as you said. Is there any way you think, for the public's sake, they should have phrased things differently?

[00:28:38]

Well, for the public's sake, as a scientist you want to explain things in a way that covers all the bases, right?

[00:28:45]

That is, number one, correct; number two, understandable to a generally well-educated person; but also tailored, explained in a way that is actually tailored to the particular audience, because you cannot assume that all audiences are equally knowledgeable or equally well versed in a particular area. So you need to tailor your message to the particular audience. Now, most of the time, if you don't do that as a scientist, there's no consequence. Somebody goes on TV and explains, say, string theory in a way that he thinks is understandable to the public.

[00:29:23]

And nobody gets any idea what the string theory is about. Well, so what? Your life is probably not going to change whether you understand or don't understand string theory.

[00:29:32]

On the other hand, if you're talking about an actionable piece of scientific information, as in this case, you know, should I or should I not leave the building because there is or is not likely to be an earthquake, then that's a practical decision that clearly carries a higher degree of responsibility in terms of explaining things in as understandable a way as possible. So I think that if there is any blame here, it's a question of: did the scientists in question explain things as well as could have been done, given the particular audience they were talking to? Now, that said, again, in terms of

[00:30:12]

sort of the broader consequences: remember the distinction we made at the beginning between the scientists giving the best information possible from a scientific perspective, and the person, as a civil servant in the case of De Bernardinis, giving actual practical advice.

[00:30:30]

The two are not exactly the same, because there are other factors in the decision of whether or not to evacuate an area; the likelihood of an earthquake is not the only variable.

[00:30:43]

There are other variables. Fine, it's likely there's going to be a major earthquake; what about the infrastructure? Is the infrastructure there likely to sustain that kind of hit or not?

[00:30:56]

Yeah, it's interesting you mention this, because I was thinking about how the assignment of blame to the scientists in this case called up some of my favorite philosophical debates about what we can sensibly mean by a cause. Because it might seem very intuitive that when you strike a match and a fire is lit, the striking of the match caused the fire.

[00:31:19]

But there are other things that, if they weren't there, the fire would not have been lit, like if there was no oxygen. You know, even if you did strike the match with no oxygen, the fire wouldn't have been lit. So does that mean the oxygen also caused the fire? What about the fact that if the match was wet, the fire wouldn't have been lit? Does that mean that the lack of water caused the fire?

[00:31:39]

And so, you know, if the earthquake were the result of a giant meteor hitting the earth, that that's the kind of discrete event that we usually refer to as a cause, as opposed to these, like, background conditions, like oxygen and dryness that we don't usually refer to as causes.

[00:31:58]

And so, you know, if the earthquake were caused by some discrete event like that, that's more clearly the sort of thing we call a cause. But the scientists' failure to give a specific warning to the public seems, I don't know, a little more akin to government officials' failure to build stronger buildings. So in what sense are the scientists more a cause of the casualties of the earthquake than the government officials who failed to build stronger buildings?

[00:32:28]

That's right. And the other thing is: whose decision was it to advise the public? Because, and I'd have to double-check this, my understanding is that De Bernardinis was the former vice president of the civil protection agency in question. So presumably there was somebody else who was actually in charge of giving an evacuation order. Oh, that's a good question.

[00:32:48]

I don't think any of the articles addressed that. So the question is, who actually was in charge? Because this happened during an interview on television. So, you know, fine, that certainly has consequences. I mean, I don't want to say that scientists who go on television giving advice shouldn't be held responsible if that advice turns out to cause death or destruction. But on the other hand, here we're talking about a European country like Italy, and many Western European countries are very structured from that perspective.

[00:33:17]

I mean, there is a series of agencies that do make decisions about things like evacuations and first responders and that sort of stuff, just like we saw recently, again, in New York and New Jersey. So whose responsibility actually was it? I mean, are we now saying that the people who made the decision not to call for an evacuation did it because they listened to a TV show where a scientist was saying something?

[00:33:45]

Is that where the problem was? I mean, yeah, exactly. Is that the proper way to go about it? Shouldn't the people who made the decision not to call for an evacuation have actually contacted the scientists in question directly, engaging the scientists in an official capacity, as opposed to just listening to an interview? So there are all sorts of other possibilities there that complicate this thing significantly.

[00:34:09]

And, you know, I haven't read the proceedings of the trial; I'm sure that some of this did come up. But you would think that the more things like this come up, the less fairly you can put the blame squarely on the shoulders of the scientist who did the interview on television. The more of these issues you bring up, the more the responsibility gets sort of diffused.

[00:34:33]

But again, the public does want a culprit, right? And so somebody has to pay, as Ian put it in his blog.

[00:34:42]

Now, should we talk for a second, more generally, about the kind of chilling effect that this kind of case can have, not only on scientific advice to the public, but on expert advice in general?

[00:34:59]

Yeah, I was thinking about how there's kind of a trade-off here. So, yes, I totally agree, it has this chilling effect on science and on expert advice. And maybe it's not immediately obvious, but I definitely think that

[00:35:18]

the result of that chilling effect is not just negative for the scientists and experts, it's negative for the public. There's a trade-off between wanting the best outcome overall, wanting to minimize deaths or harm overall, versus wanting to reduce the degree to which you feel culpable for deaths or harm.

[00:35:38]

And so, you know, suppose you really want to never feel responsible or culpable for causing harm. Let's say you're a food manufacturer. You want to make sure you're never liable for someone who's allergic to nuts eating one of your foods and having an allergic reaction. And let's say you don't manufacture any nuts or anything with nuts in your factory, but who knows, maybe one of your factory workers will be eating peanuts one day and will touch something.

[00:36:07]

There's some tiny chance. Right. So you as a food manufacturer, there aren't that many people with, you know, deadly peanut allergies out there.

[00:36:14]

So it's probably in your best interest to just put a warning on everything you make, saying there's a chance that, you know, this might have traces of nuts on it.

[00:36:22]

And so, you know, if everyone thinks that way, if all the food manufacturers think that way and they just choose to cover their asses rather than try to give a realistic estimate of how unlikely it is that their food is contaminated with nuts, then you essentially wash out your ability to provide any information about risk to the public because you're just covering your ass.

[00:36:40]

So now, you know, people have to sort of take a stab in the dark because they have no estimate, no information about the probabilities.

[00:36:48]

And so now more people actually end up eating foods that were much more likely to be contaminated with nuts, because those are indistinguishable from the ones that were extremely unlikely to be contaminated with nuts.

[00:36:59]

And ironically, coming from a European country, in particular from Italy, I always found it very strange that it is American culture that tends to be litigious in that way. You know, people sue others for all sorts of reasons, and as a result what you get are these bizarre warnings, for instance, on your toaster: don't bring it into the shower.

[00:37:23]

I mean, nobody in his right mind would bring a toaster in the shower, but evidently somebody did.

[00:37:28]

And something happened, and the manufacturer now has to come up with this lengthy list of absurd warnings, because otherwise there are going to be legal consequences. So the interesting thing here is that we definitely want to get away as far as possible from that sort of situation, which until this case was not typical of countries like Italy. And this case may actually open up a trend in that direction, where people, not just scientists but anybody who makes anything that could potentially be harmful in any way, are going to say, well, OK, "there is a risk of death": put that on

[00:38:06]

everything, it doesn't matter what it is. And now your assets are covered legally, but we all have no information about what you are saying. The trade-off there is that essentially the public loses, because you don't have any actionable information left anymore, like your example earlier of the doctor who wouldn't tell you what the chances are of certain consequences, who just tells you that there is some chance. It's like my grandmother cooking pasta and telling me that you need to put in a pinch of salt: a pinch is not a unit of measure.

[00:38:38]

So tell me how many milligrams, and

[00:38:41]

I'll be fine; but "a pinch" is just a little too vague. So that, I think, is what the long-term potential consequence of this kind of case is going to be, and it's not going to be good for anybody. The scientists are going to be extra careful, and the public is going to lose valuable information from experts. And we already don't live in a society, certainly not in American society, where scientific expertise is exactly well received anyway, with, you know, climate change and evolution as a couple of cases.

[00:39:12]

So the idea that this is going to get even worse because of these kinds of cases, and that that situation is spreading to European countries as well, is certainly not a good prospect to look forward to. Mm hmm.

[00:39:28]

You know, I think a slightly less controversial, I'm sorry, slightly more controversial, case occurred to me when I was thinking about the earthquake issue.

[00:39:37]

You were going to say we need to finish on a more controversial note? Yeah, yeah.

[00:39:41]

Maybe this would actually be a good subject for a future podcast. But this trade-off, between wanting to reduce overall harm and wanting to not be responsible for harm, comes up, I think, in the FDA's decisions about whether to approve new and not thoroughly tested drugs. The FDA does not want to be responsible for approving a drug and having people take it and die because it turned out to be harmful.

[00:40:07]

But by delaying the release of the drug so that more trials can be conducted, plenty of people are dying who could have taken the medicine and survived. And I haven't done these calculations myself, and haven't looked into them a lot, but I've read about repeated calculations suggesting that the FDA's unwillingness to be liable for deaths due to drugs actually causes more deaths, from drugs that aren't released in time. Right?

[00:40:34]

Yeah, but that is the kind of thing that is hard to sell to the public, right? Yes. And it comes down to not necessarily what is the most rational decision from a scientific perspective, but what is the most rational decision from the perspective of how the public is going to react to things. By the way, before we end, and again, this could be the topic of an entirely different episode, in the article on Rationally Speaking, Ian mentioned the term "moral luck." That term, actually, and I don't know whether he intended it this way or not, is the title of a very famous paper by Thomas Nagel about ethics.

[00:41:11]

Yeah, I've read that. You know how most of the time we grossly overestimate our ability to actually control the outcomes of our actions,

[00:41:21]

right, and therefore to be morally responsible for them. The typical case, just briefly, is: we all agree that it's a horrible thing if somebody starts drinking and then drives and hits a child that is crossing the street, or something like that.

[00:41:36]

Right. And of course that person is culpable: his license is going to be suspended, he's going to go on trial for manslaughter, and so on and so forth. Yes. But plenty of people drink a little more than they should.

[00:41:51]

They get into the car, they drive, and they cross the intersection at a time when nobody happens to be right there.

[00:41:59]

And then, at worst, they get teased by their friends. Exactly.

[00:42:04]

And the idea is that the two situations are identical except for luck. So, in order to be ethically consistent, either we should blame the people in both situations to the same degree, or we should not blame the people in either situation, if the only difference is something that has actually nothing to do with the behavior of the person, but with external circumstances. But perhaps that's a whole different topic.

[00:42:29]

One last thing before we move on: one of the only pieces of fiction that ever really changed my mind about an issue was exactly on this issue of moral luck. It was a play called An Inspector Calls, in which a mysterious inspector shows up at a wealthy family's dinner party and tells them that he's investigating the death of a young woman who used to work for them. And over the course of the play, it's revealed how each of the family members contributed to the downfall of this woman in some way.

[00:43:00]

The father fired her from his factory, the son in the family got her pregnant, and each of them had some role that they played in her death. And so by the end of the investigation, the family members are all feeling incredibly guilty and, you know, wretched.

[00:43:14]

And and then it's revealed that the young woman didn't actually die.

[00:43:18]

And two of the family members are like, oh, we feel fine then, and they go back to carousing.

[00:43:24]

And the other two still feel shaken, like they should still feel guilty for what they did. And so, yeah, this was fiction, but it functioned as a really nice philosophical thought experiment that got me thinking about moral luck. OK, now we really are out of time. So let's wrap up this section of the podcast and move on to the Rationally Speaking picks.

[00:43:49]

If you're a fan of the Rationally Speaking podcast, I highly encourage you to get your tickets sooner rather than later for the 2013 Northeast Conference on Science and Skepticism, in New York, New York, from April 5th through 7th of 2013. You can get your tickets at necss.org. That's N as in Nancy, E, C, S, S, dot org. Massimo and I will be there recording a podcast live, along with a great lineup of other speakers. Necss.org.

[00:44:29]

Welcome back. Every episode, Julia and I pick a couple of our favorite books, movies, websites, or whatever tickles our rational fancy. Let's start, as usual, with Julia's pick.

[00:44:39]

Thanks, Massimo. My pick is a book called Anti-Intellectualism in American Life. It's a 1964 Pulitzer Prize winner. Oh, yes. I'm actually just in the middle of it now, but I'm enjoying it so much that I just had to pick it as my pick.

[00:44:55]

Basically, I've been interested for a while in what the roots are of the anti-intellectualism that I see around me in American society. And, you know, I give a talk on the phenomenon of the "Straw Vulcan," which is this caricature of rationality and empiricism as being opposed to emotion, and opposed to beauty and love and all of these other things.

[00:45:25]

And and I trace the roots back to the romantic poets and how they set up a dichotomy between science and reason on the one hand, and love and beauty and passion on the other hand.

[00:45:37]

But this book, by Richard Hofstadter, has totally expanded my understanding of where anti-intellectualism comes from.

[00:45:44]

So, you know, he's an excellent historian, and he traces the roots back to a bunch of themes in the development of American society: religion, and the tension between faith and reason in American society and religion in particular, and also democracy, and how the populism of American culture and of democracy is seen as being at odds with a scientific authority telling you what to believe.

[00:46:16]

Right. And I highly recommend it.

[00:46:18]

Yeah, it's a great book. Well, my pick is an article by Julian Baggini, who is the editor of The Philosophers' Magazine. This was published in the Daily Mail, actually, in the U.K. The article is called "Ten of the Greatest Philosophical Principles."

[00:46:34]

And it goes very quickly, with examples, through ten of the major ideas in philosophy. This includes John Stuart Mill's harm principle, Leibniz's principle of sufficient reason, Aristotle's idea about going for the mean between extremes, Popper's falsification principle, and so on. There are several, but my favorite one, particularly for the way it is illustrated, is Occam's Razor, of course, by William of Ockham, roughly 1287 to 1347.

[00:47:04]

And Baggini explains what the razor is about. And then there is a picture of the cast of the show Sex and the City. Uh huh.

[00:47:13]

Which shows an application of Occam's Razor. The Occam's Razor principle has found its way even into Sex and the City: if a man is sending a woman mixed messages, the simple answer is he's just not that into her.

[00:47:28]

So kudos to Baggini for that application. That's my pick.

[00:47:35]

Excellent. And I love that it's in list form, too. I think more philosophical papers should be written in that format.

[00:47:40]

You know, "ten examples of something." Yeah, exactly. If the tabloids, you know, the women's magazines, have learned it, then I think philosophers can pick it up too. Excellent. I will check it out. This concludes another episode of Rationally Speaking. Join us next time for more explorations on the borderlands between reason and nonsense.

[00:48:08]

The Rationally Speaking podcast is presented by New York City Skeptics. For program notes, links, and to get involved in an online conversation about this and other episodes, please visit rationallyspeakingpodcast.org. This podcast is produced by Benny Pollack and recorded in the heart of Greenwich Village, New York. Our theme, "Truth," by Todd Rundgren, is used by permission. Thank you for listening.