[00:00:14]

Rationally Speaking is a presentation of New York City Skeptics, dedicated to promoting critical thinking, skeptical inquiry, and science education. For more information, please visit us at NYCSkeptics.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Massimo Pigliucci, and with me, as always, is my co-host, Julia Galef. Julia, what are we going to talk about today? Massimo, today we have a guest on the show who I'm very excited to introduce.

[00:00:52]

His name is Samuel Arbesman. He's an applied mathematician and network scientist who's currently a senior scholar at the Ewing Marion Kauffman Foundation and a fellow at the Institute for Quantitative Social Science at Harvard University. He just came out with a book called The Half-Life of Facts: Why Everything We Know Has an Expiration Date. Welcome, Sam.

[00:01:12]

Thank you so much. Sam, my first question to you is: so, spinach is not good for me? It is good for you.

[00:01:22]

OK, good. Thank you.

[00:01:24]

So don't worry. But it's not magically good for you. I keep wondering why I don't look like Popeye.

[00:01:31]

So I wouldn't read too much into how good spinach is. I discuss in my book this story about how some people measured the amount of iron in spinach and thought it was really, really healthy. And then it turned out that was due to a typo, and spinach was actually no more healthy than any other green vegetable. But that error propagated for a long time. The story was discussed in the early 1980s in a British medical journal.

[00:02:01]

So I discussed this in my book. It turns out, after my book went to press, I found out that this story of how an error spread far, far longer than it should have is itself an error.

[00:02:12]

So it's a meta-error. I've been tracking it down and talking to someone who helped figure this out. One guy tried to debunk this a couple of years ago. And yeah, it sounds like there might have been some mistaken measurement. It was probably due to confusing iron and iron oxide, or maybe some erroneous measurements or some experimental error. It wasn't really due to a typo, which sounds even better than it should have been.

[00:02:35]

Yeah. And so it sounds like it wasn't due to that, and it was also relatively quickly fixed. But the error of the story has also spread for about three decades. So it's kind of interesting to see how error spreads on top of error, in terms of how we all try to understand what's true and what's not. Sam, I have to confess, I think I'm at least partially responsible for the propagation of misinformation, because it frequently happens that I hear a story that is just so entertaining and delightful, and part of me wonders, maybe I should check up on the veracity of the story.

[00:03:10]

And then another, louder part of me is like, nah, if I find out it's not true then I can't tell it, so I'm just not going to check.

[00:03:19]

This is a really natural tendency of ours. We love great stories, but we especially love great stories that conform to what we think we know, because they help us to further the kind of perspective that we want. And of course, the world is not always that clean, but it's a lot easier to just say, oh, whether or not it's true, it's a great story.

[00:03:42]

But you can sort of have it both ways. So I'll give you an example that happened to me several years ago. At some point I was preparing a series of public lectures, and I came across this quote by the mathematician Laplace, who allegedly, notice the use of "allegedly," explained his theory about the origin of the solar system to Napoleon. And at the end of the explanation, Napoleon asked Laplace,

[00:04:15]

what about God? And Laplace allegedly responded, I don't need that hypothesis anymore. And I thought, oh, this is great, and it fits very well in my talk. So I used it.

[00:04:25]

And then, of course, eventually I used it three or four times.

[00:04:28]

And then eventually somebody said, you know, that quote is actually apocryphal. And so I checked, and it turned out there were doubts about the quote. So after that, I kept using the quote anyway, but saying, you know, this is an apocryphal quote attributed to Laplace, but of course it should be true; it fits very well.

[00:04:46]

And then somebody said, no, actually, it's not apocryphal, it's real, and it's the story about it being apocryphal that is actually incorrect. So what I do these days,

[00:04:55]

since I'm still not sure which one it is, out of respect for my listeners, is just say, well, I think it's apocryphal, but it's very good and it makes the point anyway.

[00:05:10]

Yeah, no, I understand. There was a quote that I wanted in my book about measurement. So there's a quote by Lord Kelvin that's, I guess, been enshrined on the Social Science Research Building at the University of Chicago, where he says, when you cannot measure, your knowledge is meager and unsatisfactory. Which is a really great quote, but there are multiple different versions of it. And so I wasn't really sure which one it was until I actually checked.

[00:05:34]

So in order to figure it out, I had to have someone walk over to the building at the University of Chicago and read it off of the wall, because otherwise I had no way of knowing; there are so many versions out there. Couldn't you Google Map it, with 3D mapping or something like that?

[00:05:49]

But it was hard to read it on the picture.

[00:05:52]

Yeah, maybe. Maybe at this point we should back up and have you explain exactly what you mean by half-life with regard to facts. Oh, sure.

[00:06:00]

Yeah. So we all know intuitively that various aspects of the knowledge and information that we have in our heads change over time. So what we think is nutritious or not nutritious, like whether we should eat carbs or fatty foods, these things change. Facts about science, such as what dinosaurs looked like: they used to be these hulking monsters, and now they look sort of like fearsome chickens. All those kinds of facts, and even facts about technological change, facts about the state of the world.

[00:06:26]

All these things are changing around us. And for a lot of people, this can be overwhelming, and so a lot of people just throw their hands up and say, oh, change is constant, that's fine. But it turns out that there's often an order and a regularity to how knowledge grows and changes. And so my book aims to explore and understand how knowledge changes, and I use the half-life as sort of an analogy.

[00:06:50]

So with radioactive materials, if you have a single atom of uranium, for example, we know exactly how it's going to decay and break down and release a certain amount of energy, but we don't know when it's going to decay. It could decay in the next fraction of a second, or we might have to wait millions and millions of years. But things change when we go from a single atom to many, many atoms. Suddenly we can actually chart curves of decay, and encapsulate them in a single number, in this case the half-life: how long it takes for half of the atoms to decay.
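(To make the analogy concrete, here is a minimal Python sketch, not from the book: each simulated atom decays unpredictably, but the aggregate follows a smooth exponential with a well-defined half-life. The per-step decay probability is an arbitrary illustrative value.)

```python
import random

def simulate_decay(n_atoms=50_000, p_decay=0.05, steps=60):
    """Count how many 'atoms' survive each time step."""
    alive = n_atoms
    survivors = [alive]
    for _ in range(steps):
        # Each surviving atom independently decays with probability p_decay.
        alive = sum(1 for _ in range(alive) if random.random() > p_decay)
        survivors.append(alive)
    return survivors

survivors = simulate_decay()
# The observed half-life: the first step at which half the atoms are gone.
half_life = next(t for t, n in enumerate(survivors) if n <= survivors[0] / 2)
print(f"Observed half-life: ~{half_life} steps")
# Analytic value for comparison: ln(2) / -ln(1 - 0.05) is about 13.5 steps.
```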

[00:07:19]

And you can't predict which specific atoms are going to be in that half, but we know the overall shape. And the analogy is this idea that the same thing is true with knowledge: even though we don't know which specific discoveries are going to occur, or which fact is going to be overturned when, if you actually have a large enough collection of bits of information or pieces of knowledge, knowledge change is far from random in the aggregate. And so I show how knowledge grows, how it obeys various regularities like exponential curves of growth; how it decays, so you can actually measure how knowledge decays over time; how it spreads from

[00:07:52]

person to person; how each of us deals with changing knowledge; how measurement improves; all these things, and how they proceed according to rules. The idea is that there are rules, and if you can understand the rules, then you won't be as surprised by all the change around us. And how do you measure the half-life of knowledge in a given field? So there are lots of different ways. A really cool way, which doesn't really scale but is a very interesting way to understand it, is to actually just give it to a team of experts and say, which of these things are true?

[00:08:23]

So, for example, in two fields in medicine, a team of scientists looked at hepatitis and cirrhosis, both related to diseases of the liver. This team took a whole bunch of papers from a span of 50 years and gave them to a panel of experts, asking which of these papers are true and which ones have been overturned or are just obsolete. And from that, they could create a curve of decay,

[00:08:48]

and see how, as papers become older, as they increase in age, the likelihood that they're still true decays over time. Of course, there are other ways to measure decay as well, like the half-life. You can do it indirectly, and this is actually the way a lot of this measurement occurs: by looking at citations. When a paper is cited by other papers, when it's referred to by other papers, it's considered part of the living scientific literature.

[00:09:15]

And when it's no longer cited, it's often assumed to be no longer relevant. There are many reasons why it might not be relevant, but a really good proxy is this idea that it's no longer considered important for the scientific literature. And so you can actually see how long it takes in a field for papers to begin to receive half of the citations they used to. You can look at the rates at which citations decay, and compare one field to another to see how fields differ.
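(A hypothetical sketch of the indirect, citation-based approach just described; the citation counts here are made up for illustration. Fitting an exponential to how citations fall off with paper age gives a field's citation half-life.)

```python
import numpy as np

age_years = np.arange(1, 11)  # years since publication
citations = np.array([100, 81, 64, 52, 41, 34, 27, 22, 18, 14])  # illustrative counts

# Exponential decay c(t) = c0 * exp(-lam * t) is linear in log space:
# ln(c) = ln(c0) - lam * t, so a linear fit on log counts recovers lam.
slope, intercept = np.polyfit(age_years, np.log(citations), 1)
half_life = np.log(2) / -slope
print(f"Estimated citation half-life: {half_life:.1f} years")
```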

[00:09:40]

So one of the things that emerges from these kinds of analyses is that different information, different fields, decay at different rates, which I guess is not surprising.

[00:09:52]

But are there patterns that we should be aware of? So, for instance, what about physical science versus social science, or science as a whole versus mathematics, or something like that?

[00:10:05]

So, once again, it really depends on how you're measuring, and it also depends on what you're looking at: papers, textbooks, things like that. If I recall correctly, my sense is that the social sciences move a little bit more quickly than the natural sciences. And intuitively, in mathematics, when you prove something, unless you eventually find an error in it, that truth is going to stand the test of time. It may not be part of the living mathematical literature;

[00:10:31]

it might be that someone proved something that's sort of a dead end, but it will stand the test of time. On the other hand, take the social sciences compared to, let's say, physics. In physics, it's very easy to tease out the signal. If you're trying to trace the parabolic arc a ball makes when it goes through the air, you don't have to worry about that changing. On the other hand, when you're measuring things with people, whether you're trying to understand how behavior works, or in medicine, how people respond to treatments, there's a lot of noise.

[00:11:06]

And so when systems are noisier, there are going to be results you think are true that are simply due to chance and not actually real results. And so then you're going to have more turnover. The idea is that as we improve our measurement, we will often overturn things we thought were true. And this seems to be happening at a higher rate in the social sciences, mainly because the systems are just noisier.

[00:11:31]

Right.

[00:11:32]

It seems like there are actually two separate problems here, and you're picking up on one of them, which is that what's actually true about the world changes. So in the social sciences, maybe it was actually true one hundred years ago that people would behave a certain way if you put them in a certain situation, but now it's just no longer true. And then the second problem is that we were wrong about what we thought was true. So the actual truth hasn't changed, but our measurement of it was initially wrong and is now more accurate.

[00:12:08]

So most of what I focus on is the latter: this idea that we're trying to approach a true understanding of the world, but along the way, things that we think are true, these facts, are going to be overturned. Things more in the first case, about how people interact being different, are more often seen in aspects of technological change, where, for example, what is the fastest speed that someone can go?

[00:12:33]

And that, of course, is going to change as technology improves, as does the ease with which we can communicate. On the other hand, there are certain things about social interactions that don't really change. Maybe we have better communication techniques and better transportation techniques, but when it comes to something like the number of people you can effectively interact with, it seems that even in the face of technological or social change, these kinds of things don't actually change that much.

[00:13:00]

We just get better at measuring them, and maybe reduce the amount of error, and in so doing maybe overturn these more perennial truths, rather than the ones that are changing over time.

[00:13:09]

So one thing that I find interesting about thinking about the decay of information, or the truthiness of facts, if you will, is this: you've been describing a lot of these examples as being well approximated by a sort of exponential decay, which of course has different rates, as we said, depending on the specific set of facts or the specific field. But there are certain things that don't actually decay.

[00:13:38]

You mentioned earlier mathematical truths. I mean, the Pythagorean theorem is going to be true, presumably, forever. Once we figure it out, that's it; it's not going anywhere. So there must be some of this that actually bottoms out and stops changing over time. Would you agree? Oh, yeah.

[00:13:54]

Yeah. And so maybe the exponential does not conform to everything. And I think, ideally, if science as a process is successful, we are getting better and better at understanding the world, and so many of these things will stand the test of time. And so there is a quote that I mention in my book, and I think it's a great quote for trying to understand this: even though things are being overturned, it doesn't therefore mean that everything is unknowable.

[00:14:16]

So the quote is from Isaac Asimov. He says that when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.

[00:14:34]

Yes. And we now know that the Earth is actually an oblate spheroid, sort of like a flattened sphere. But the idea is that even though each view of the world gets overturned, we're reducing our error with each successive view of how the earth is shaped. And so we're actually getting a better and better understanding. And you can actually look at the amount of error in each view of the world.

[00:14:55]

As Asimov discusses, you can look at the amount of curvature per mile that's assumed in each model. And I think that's the right way to think about this: yes, it's true that things are being overturned, and that can be a little surprising, but at the same time, this doesn't mean everything is unknowable. We are getting a better and better understanding of the world. We're certainly not done yet; this is the reason we do science.
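(To make the shrinking error concrete, here is a back-of-the-envelope worked example in the spirit of Asimov's essay "The Relativity of Wrong"; the arithmetic is ours, using Earth's mean radius R of roughly 3959 miles. Over a one-mile stretch, a sphere of that radius drops below a flat tangent line by about

$$\delta \approx \frac{d^2}{2R} = \frac{(1\,\text{mile})^2}{2 \times 3959\,\text{miles}} \approx 1.26 \times 10^{-4}\,\text{miles} \approx 8\,\text{inches}.$$

So the flat-earth model errs by about 8 inches of curvature per mile, while the spherical model differs from the oblate spheroid by only a small fraction of an inch per mile: each successive model shrinks the error rather than merely replacing one wrong view with another.)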

[00:15:17]

Right, because there's lots of stuff we don't know. But it doesn't mean, therefore, that we don't know anything.

[00:15:21]

This is what used to bother me, way before Jonah Lehrer got tarred and feathered for the various plagiarism scandals. Before that, I was troubled by his interpretation of this process, which he wrote about in several publications, including The New Yorker, and which he referred to as the decline effect: these reversals in our body of scientific knowledge. And his sort of takeaway from this phenomenon was essentially, well, we just can't know anything. He concluded his article about the decline effect in The New Yorker by saying, when the experiments are done, we still have to choose what to believe.

[00:16:00]

Yeah, and the decline effect, it's really not that mysterious. When he wrote about it, he kind of made it seem fairly mysterious and worrisome. Actually, it's one of these things where, as I was saying before, when you're in a field that's noisier, you're going to run certain experiments and think that the effect is larger than it actually is, or that it's there when it's not actually there.

[00:16:22]

But then the more science you do, the better sense you get of the world. So it's not that the more you do, the more you'll constantly flip back and forth between competing views of the world; you're going to get a better understanding of how the world really is, and you don't have to worry about that kind of thing. And so, yeah, when you have only a few people working on a problem, maybe you'll have some spurious results.

[00:16:43]

But as more people work on the problems, and try to actually reproduce the experiments, we'll get a better understanding of what's really there. And so we don't need to worry. The decline effect, in certain cases, is a real effect, but it's not anything nefarious or concerning; this is simply how science works. And now people are realizing that reproducibility is actually very, very important in science.

[00:17:07]

And if we can properly incentivize scientists to reproduce results that they think are important, then a lot of this will end up being resolved, which is great. Right.

[00:17:18]

So, one more comment about the shape of these curves, and this idea that eventually, at some point, in some cases already, they may be flattening out.

[00:17:29]

There's an analogy there between the kind of data that you have collected and, let's say, the mathematical models that describe the growth, in that case instead of the decay, of biological populations. Biological populations tend to grow exponentially initially, if they have enough resources available, and especially if they have few competitors. But then eventually, of course, nothing can grow indefinitely, and therefore the curve flattens out and becomes what is called the logistic curve; the last part

[00:17:59]

becomes flat. We may be looking at something like that in the case of facts from different disciplines. For instance, one of the things that you mention at some point is that the transition between pre-modern medicine and modern, more evidence-based, science-based medicine has been associated with a sort of slowing down of these decay effects, which you would expect.

[00:18:23]

Right, because you go from something that is protoscientific, essentially, to something that is certainly not as rigorous as fundamental physics, but is better than what was there before. And so a lot of the information or facts that we arrive at have a longer shelf life, or in fact eventually, at some point, enter the kind of facts that we feel confident enough about to say, yeah, this is pretty much a truth that is not going to change.
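(A minimal sketch of the two curve shapes being contrasted here, with arbitrary illustrative parameters: exponential growth has no ceiling, while the logistic curve flattens out as it approaches a carrying capacity.)

```python
import math

K, N0, r = 1000.0, 10.0, 0.5  # carrying capacity, initial size, growth rate

def exponential(t):
    return N0 * math.exp(r * t)  # unbounded growth

def logistic(t):
    return K / (1 + ((K - N0) / N0) * math.exp(-r * t))  # saturating growth

for t in range(0, 25, 4):
    print(f"t={t:2d}  exponential={exponential(t):>10.0f}  logistic={logistic(t):>6.0f}")
# The two curves track each other early on; the logistic then flattens
# out near K while the exponential keeps growing without bound.
```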

[00:18:49]

Yeah, and I entirely agree. And I discuss the logistic as another of these important kinds of curves. The other thing is, I think we often have to distinguish between knowledge at sort of the core of science and knowledge at the frontier. Often, when we read about all the interesting scientific results in the newspaper, that's the science of the frontier. It's really exciting, but of course we don't yet know if it's going to stand the test of time.

[00:19:13]

And so there's a lot of noise there. The science at the core, though, changes far less. That's the stuff that's in the textbooks, and we have a pretty good sense it's going to stand the test of time. Some of it doesn't always, but that stuff changes at a far, far slower rate, and we can be less concerned about its veracity.

[00:19:33]

Do we have any reason to think that the half-life of facts in a given field remains at all constant over time? Like, should we... I would assume that if, say, biology tackles more difficult problems over time, then maybe the half-life would go down; or, as measurement methods get better, or people start focusing more on replicating studies, the half-life would go up.

[00:20:05]

Yeah, how much constancy is there in the half-life over time?

[00:20:08]

So that's a good question, and I'm not really sure we have good data for that. And I think you can often make a really good argument in either direction.

[00:20:15]

And so that's the trouble. Yeah. So in those kinds of cases, we need to actually study it and try to find out more. The truth is, I don't really know.

[00:20:23]

Now, you talk at some point about what you call fact phase transitions. What are those?

[00:20:29]

So part of what I talk about in terms of facts and scientific knowledge is stuff that changes fairly slowly. The number of elements in the periodic table, or certain aspects of a larger theory, change somewhat slowly. Phase transitions, on the other hand, come from the world of physics, and the simplest one is the phase transition of water going from a liquid to a solid.

[00:20:59]

As you slowly change the temperature, instead of the water just becoming colder and colder, eventually it goes from being liquid water to becoming something entirely different, in this case ice. It becomes totally frozen. And it turns out there are similar kinds of things in how knowledge changes. So, for example, large biological theories: Darwin's theory of evolution was a fairly big factual change, or maybe "factual change" is not the word, but it was a very big change in terms of how we understood the world and how we thought about it. Or, in terms of facts about the state of the world: going from humans never having walked on the moon to actually stepping foot on the moon in nineteen sixty-nine.

[00:21:47]

This is a big change. It turns out, though, there are still ways of understanding the regularities behind how these relatively sudden changes in the way we view the world occur, by doing a similar kind of thing as with temperature. There, some underlying parameter, in that case temperature, allowed for the abrupt change, and oftentimes there are similar kinds of things in how knowledge changes. So the case of going from humans never having walked on the moon to humans having walked on the moon is a pretty big change in our facts about the world.

[00:22:19]

It turns out that if you look at the maximum speeds that humans have been able to move at, in terms of transportation speeds, it's a fairly smooth curve, and it had been fairly smooth for several decades.

[00:22:36]

And you could actually predict at what point the speeds would be high enough that you could actually get to the moon. And so oftentimes there are ways of understanding the underlying parameters of knowledge change, parameters whose smooth change can actually yield these more abrupt changes: everything from walking on the moon to other knowledge changes. I actually discuss how to possibly predict when we might discover the first potentially Earth-like, potentially habitable planet, which would be a fairly big change in the way we... Let's go back to speed there. When are we getting warp drive?

[00:23:11]

So it turns out you can actually predict the curve out to something like interstellar transportation. I don't remember the exact time period; I think it's actually fairly soon. So I think the curve might be breaking down.

[00:23:26]

Well, that actually brings up an interesting question about the whole approach. As fascinating as it is, we're still talking essentially about gathering a large number of data points and then interpolating, first, to determine what the shape of the curve is.

[00:23:44]

The data are not always available. There's an art to it, and some of it's more easily done retrospectively rather than prospectively. Some of these things can be done prospectively; I've actually written about predicting certain things within astronomy. But it's tough, and from field to field it's definitely more doable or less doable. So we have to recognize the limits of what we're able to do. In some cases we can make guesses and make conjectures, but we can't actually nail down any sort of hard numbers.

[00:24:14]

Right.

[00:24:14]

I mean, it's an interesting exercise. But the point that a statistician would make, of course, is that with interpolation you're on safe ground; when you start extrapolating beyond the data range that you started with, you're OK only as long as you can make reasonable guesses that the shape of the curve is unaltered,

[00:24:31]

that it stays the same. But of course, in most cases, that's exactly what you want to know. Can I really extrapolate, let's say, the speeds of going from one place to another into the next three centuries or millennia? Maybe I can extrapolate them into the next 50 years. But the further you go from the current range of the data, the more the uncertainty grows.
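(A toy illustration of the statistician's point, not from the episode: fit a trend to a short window of synthetic data and watch the uncertainty band widen as you move past the observed range.)

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(10.0)                                      # ten observed "years"
y = 2.0 * x + 5.0 + rng.normal(scale=1.0, size=x.size)   # noisy linear trend

# Least-squares fit, with the covariance of the fitted coefficients.
coeffs, cov = np.polyfit(x, y, 1, cov=True)

def prediction_se(x_new):
    """Standard error of the fitted line at x_new."""
    g = np.array([x_new, 1.0])  # gradient of a*x + b with respect to (a, b)
    return float(np.sqrt(g @ cov @ g))

for x_new in (5, 10, 20, 50):
    y_hat = np.polyval(coeffs, x_new)
    print(f"x={x_new:3d}  prediction={y_hat:6.1f}  +/- {2 * prediction_se(x_new):5.1f}")
# The band is narrow inside the observed range (x <= 9) and grows roughly
# linearly with distance once you extrapolate beyond it.
```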

[00:24:55]

Certainly. And I think this is one of the reasons why pundits love to make predictions that will come true or not after they're dead, right? Because there's no accountability.

[00:25:04]

And there are websites, actually, that try to keep experts and pundits accountable for their predictions, though oftentimes people won't actually pay attention to those. I think we need more of that. But you're right, oftentimes when it goes farther out in time, these kinds of predictions are really guesswork with some math as a wrapper, as opposed to something highly quantitative. We try, but it's not always doable.

[00:25:31]

So it seems like the key is understanding why the half-life is what it is, right? Why information follows the relationship that it does. And your book did go into some plausible-sounding reasons for the exponential curves that we see: about knowledge building on past knowledge, hence the exponential relationship, and about rates of population growth and the speed of information propagation and so on. Which of those relationships do you think are the main underlying causes of the half-life phenomena we see?

[00:26:11]

So certainly, in terms of knowledge growth, population growth and cumulative knowledge are important. In terms of the half-life and the overturning, it's often due to how we actually do science. It has to do with improvements in measurement, or with how we decide that papers are publishable and the science is reputable. In this case, it has to do with statistics and probability. The idea is that if you're testing something and you want to see if some relationship between A and B is really true, you compare it to whether you would,

[00:26:44]

based on the experiment, see a relationship between A and B that is actually due to chance. In this case, that's the null hypothesis: the null hypothesis is that there is, in fact, no relationship between A and B, and what you're detecting might simply be spurious. The idea is that if you can say that five percent or less of the time, and there are different cutoffs in different sciences, I would only detect this spuriously, then the rest of the time it would actually really be there.

[00:27:15]

Then people will publish. The problem, though, is, I think it was John Maynard Smith who said, that statistics is the science that lets you do 20 experiments a year and publish one of them in Nature. And he's alluding to this five percent, this one in 20: even if the results are not there, if you do enough experiments, you're going to find things that might simply be spurious but are publishable. Right.
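(A quick back-of-the-envelope in Python, our illustration of the one-in-20 point: with a 5% false-positive threshold, running many tests of true null hypotheses all but guarantees some "publishable" flukes.)

```python
alpha = 0.05  # conventional significance cutoff

for n in (1, 20, 100):  # number of independent tests of true null hypotheses
    p_at_least_one = 1 - (1 - alpha) ** n
    expected = n * alpha
    print(f"n={n:3d}: P(at least one false positive) = {p_at_least_one:.2f}, "
          f"expected false positives = {expected:.1f}")
# At n = 20, P is about 0.64 and the expectation is exactly one: Maynard
# Smith's one spurious Nature paper a year, on average.
```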

[00:27:39]

And so, as we get better and better measurement techniques, and better ways of detecting and teasing apart the difference between what is really there and what is statistically spurious, I think we're getting a better sense of the world around us. And so oftentimes the overturning of these facts lies in rooting out these spurious results and getting better measurement. Improvements in measurement, and the growth in the number of scientists testing things, can often explain the overturning of facts rather than simply the growth in knowledge.

[00:28:14]

So there's an issue that I want to bring up from a more philosophical perspective, which is that a lot of what we're discussing depends on concepts such as facts, which we've been talking about for a while, and new discoveries. You know, you can quantify the number of discoveries per field and so on over a period of time.

[00:28:34]

But of course it's not entirely obvious, at least in all contexts, what counts as a discovery or what counts as a fact. I mean, one of the basic intuitions that philosophers of science developed over the last several decades is that facts themselves are not independent of theories. For instance, you mentioned the theory of evolution earlier on. And of course, that, in some sense, is not a fact.

[00:28:59]

That's right, it's a theory. But of course, being a scientific theory, it depends on facts and so on.

[00:29:04]

So it's almost like a bundle of facts. Yeah, right.

[00:29:08]

So the question is, and the only reason I brought up the broad philosophical perspective is to get to a very specific question: when you actually do this kind of scientometrics work, how do you select what counts as a data point? I'm sure there are situations where you can ask, well, is that really a discovery, or really a new fact, or is it just a slight modification of a previously known thing?

[00:29:38]

How do you go about doing that?

[00:29:40]

So by and large, within scientometrics, it's very hard to distinguish those two. Exactly as you're saying, it's hard to tease apart. And so what you do is you just look at papers and say, OK, a paper is some sort of distinct contribution to science. And of course there are many papers where there might be, depending on how you define it, multiple discoveries within a single paper, and there are a lot of papers that really don't have any discoveries at all,

[00:30:03]

simply people publishing things for the sake of publishing. Right. And so it's not always the best proxy, but it's a decent one for understanding a contribution to science. That being said, there are fields where you can more easily quantify discovery: for example, the discovery of asteroids, or the discovery of species, or even the discovery of chemical elements. And those are actually three areas that I discuss in my book a little bit.

[00:30:37]

Those are nice because you have a fairly clear and discrete definition of what the discovery is. We know what it means to have a new element in the periodic table or a new species discovered. Now, of course, there are many different ways of understanding when those things were discovered and what constitutes discovery, so it's still fuzzy. But you can still say, OK, what are the properties of this new mammal that I found, this new creature?

[00:31:07]

And so you can begin to actually give a shape to what a discovery is. That being said, these are often the discoveries that are a lot easier to quantify, as opposed to, say, a discovery that there's some sort of correlation between the presence of a gene and a disease, or something like that. What does that even mean? That's a discovery that's harder to define. And of course, there are many other discoveries that are even harder to pin down.

[00:31:34]

So oftentimes, people doing this kind of work will ignore some of those details, and rather than looking at the discoveries themselves, which are sometimes more easily defined and sometimes more difficult to define, they say:

[00:31:45]

let's simply look at papers and assume that they are contributions to science, that discoveries are related to them somehow, and not worry about the details.

[00:31:54]

Right. Now, the category of facts that you're most interested in is what you call mesofacts. What are these mesofacts?

[00:32:07]

So when we think about knowledge changing, there are facts that we learn that change really rapidly. When we deal with, like, what the weather is going to be tomorrow, or where the stock market closed yesterday, these facts change really, really quickly, and we are adapted to that and know that we need to look them up on a daily basis. On the other extreme, we have facts that change really, really slowly, or effectively never, like how many continents there are on the earth or how many fingers there are on a human hand.

[00:32:37]

You learn these facts once and you're good; you don't worry about them changing. But in between, there are a lot of facts that change on the order of years or decades, on the order of a human lifetime. So everything from nutritional facts, to how we take care of babies, to facts about the state of the world, like how many billions of people there are on the planet, or information about technology. All of these kinds of facts change slowly, but rapidly enough that in our own lifetimes things change a lot.

[00:33:05]

And so these are what I call mesofacts, because they are facts that change on the meso, or middle, timescale. And the problem with these facts is that we often learn them the same way we learn the facts that change really slowly or effectively never. When we're young, we are information generalists; we learn lots of stuff. As we grow older, we specialize; we learn more and more about less and less. And if

[00:33:28]

a lot of these bits of information are not in our area of expertise, we don't realize that a lot of this stuff is changing and that we have an out-of-date fact in our heads, until our kid comes home and says, guess what? Dinosaurs were warm-blooded and looked like birds. And then we're really surprised that all this stuff has changed. And so mesofacts are a really big, well, when I say "problem," they're certainly an important component of how we deal with knowledge.

[00:33:54]

And if we can become better adapted to these facts, I think we can better handle how knowledge changes. This has happened to me in several fields, at several times, I should say.

[00:34:03]

One example that comes up often in my mind: I developed, in the last several years, a new interest in history, which is a subject that absolutely did not interest me at all when I was in high school.

[00:34:17]

But, of course, I learned most of my basic historical facts when I was in high school in Italy.

[00:34:24]

You know, most of that kind of learning is done in high school; at the college level, you're already specializing in whatever your area is, which in my case was biology. So most of the history I learned was actually in the five years of high school. And I found myself, over the years, over and over, reading a much more recent book on a particular historical subject and realizing that, in fact, my mesofacts had evolved.

[00:34:48]

They had changed. And my reaction always started the wrong way. The first time I read it, it was: no, this can't be right, this is not what I learned when I was in school. But then, of course, yes, it is right, or it is a better understanding of what right is, because it reflects better scholarship, more recent research, and so on and so forth. But the first psychological reaction was always one of resistance to the new notion.

[00:35:09]

Oh yeah, it can be very surprising. I mean, when I was researching the book, I read some stuff and learned all about how, in fact, the brontosaurus is not really a dinosaur; it had to do with a misclassification, putting the wrong skull on another dinosaur's body. And in fact the brontosaurus, which I grew up loving as a dinosaur, is not real. It's actually the apatosaurus, which looks a little bit different. And that was a little surprising.

[00:35:34]

I certainly had a number of years to adapt to it, so it wasn't like I just learned it while I was writing the book.

[00:35:41]

But yeah, a lot of these things... and let me not even get started on Pluto and the planets in the solar system, because then I get really upset.

[00:35:49]

But anyway. So yeah, those mesofacts that you've just been discussing were the sort of thing I was thinking of earlier in the conversation, when I talked about these two separate problems that cause shorter half-lives: one being us turning out to have been wrong about how we thought the world works, and the other being the world actually changing the way it works. Facts about the population of the world, or the rate of cell phone coverage around the world, would fall into that latter category.

[00:36:23]

And that's sort of why I thought the social sciences, or softer sciences, get such a bad rap. Even if they didn't have the problem of noisiness in measurement and the difficulty of figuring things out, I would think they'd still have shorter half-lives, because the nature of the modern world changes so fast. Yeah, yeah.

[00:36:43]

That could definitely be true. And one of the reasons I actually discuss both those different types of facts, although I certainly focus a little bit more on scientific knowledge change than on other things like technological change, is because it turns out that oftentimes the regularities behind them are similar, and they are often more tightly connected than we might have realized. So, for example, technology changes certain things about social behaviors. In addition, technological change obeys regularities, but technology can also change what we can know.

[00:37:16]

So science and technological change are often connected more tightly than one might have realized. There are curves of how the number of chemical elements in the periodic table has increased, but those curves have gone hand in hand, at least in part, with improvements in tools and technologies, such as the increased power of our particle accelerators. So there are a lot of different types of facts that are changing, and they're often tightly coupled; they all obey interesting regularities, and they all contribute to mesofactual change and the half-life and overturning of knowledge.

[00:37:53]

So is there a way we should be thinking differently about the world, or I guess behaving differently, given that we have some understanding of the regularities of change in our body of knowledge? Like, is the takeaway here that in fields like nutrition, with a short half-life of facts, maybe you shouldn't make any major life changes when a new fact comes out? Or is there another takeaway? I certainly think that's a good one.

[00:38:19]

And I think one field that seems to be handling this idea of changing knowledge better than others is medicine, mainly because they have to; there are lives on the line. When you go to medical school, medical students are told that within a few years, a lot of what they learn is going to become obsolete. And so, therefore, they have to constantly make sure that they're up to date. And there is continuing medical education.

[00:38:48]

There are websites that have the most recent information, curated by experts. And I think if we can internalize a certain amount of that message, recognizing the different regularities in how knowledge changes, but also just recognizing that since things are changing at different rates, and oftentimes fairly rapid rates, we have to make sure we have the most recent knowledge. Whether that means looking things up more often rather than relying on half-remembered bits of information, or simply talking to experts in a field to make sure you have the most up-to-date information,

[00:39:18]

I think all these kinds of things are important. And I would say, rely more often than not on the assumption that fields change more rapidly than you might realize.

[00:39:27]

And it seems like another obvious application would be in the practice of science education, the way in which we teach science. Because, again, at the college level, and I'm assuming things are even worse at the high school level, the typical introductory course in science, especially in certain sciences like biology, which are factually heavy, so to speak, as opposed to, say, physics, which is more conceptually oriented: introductory courses really are based on these huge volumes of facts that the students are supposed to essentially memorize, with very little emphasis on things like scientific methodology and the critical analysis of concepts.

[00:40:12]

That sort of stuff. It seems like instead, what you would want to do is precisely the opposite. That is, yes, you certainly need to know a certain basic number of facts about, let's say, biology, because otherwise you can't even think about the biological world. But the emphasis should be on the uncertainty, on the broad concepts, on the critical evaluation of methods, and then on the actual facts. Since they will change, you can look them up.

[00:40:38]

And every year, or every few months, you can look them up and you'll get somewhat different results. So even science education probably could be affected by this kind of perspective.

[00:40:48]

Oh yeah, I entirely agree. I think we need to shift away from memorization, which I've never really been a fan of, towards more of an understanding of the science: understanding how the facts that we see in the textbook actually came about. Once we have a better understanding of that, we'll be better prepared for understanding how they might be overturned, or how they change, or how they improve, coupled with having a better understanding of how to critically evaluate the literature, which I think is very rarely taught.

[00:41:19]

I think these kinds of critical skills are really important, because much of what we learn when we're in school is going to become obsolete. And if we have the tools to continue learning, and to make this a continuing education rather than all of this education being a youthful indiscretion, I think we will be much better prepared.

[00:41:40]

Well, we're just about out of time for this part of the podcast, so let's move on now to the Rationally Speaking picks.

[00:41:54]

I'd like to take this moment to remind our listeners that if you're a fan of the Rationally Speaking podcast, you'll definitely enjoy this year's Northeast Conference on Science and Skepticism, which will be held in New York, New York the weekend of April 5th through 7th, 2013. Go to necss.org now to get your tickets; they're on sale. In addition to Massimo and me, you'll also find a lineup of great speakers, including the SGU, Simon Singh, Michael Shermer, and our keynote speaker, physicist Leonard Mlodinow, author of The Drunkard's Walk.

[00:42:28]

NECSS.org. That's N-E-C-S-S dot org. Go get your tickets now.

[00:42:39]

Welcome back. Every episode, we pick a suggestion for our listeners that has tickled our rational fancy. This time we ask our guest, Samuel Arbesman, for his suggestion. Sam? So my pick is a book by Michael Mauboussin. He's an investment strategist, and he's also a friend of mine. His book, which came out at the end of last year, is called The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing. Michael explores how we can actually understand the difference between skill and luck in a lot of different fields.

[00:43:14]

He explores how to quantifiably measure this, and also shows that many times, when we think we're being skillful or being lucky, it actually turns out to be the opposite. He shows us how to really tease this apart, and he uses a lot of examples from sports, because we just have so much data there. So those are a lot of fun. But he also shows how it can practically affect everything from how we deal with grades when our kids bring home ones that are not as good, in terms of reversion to the mean, and things like that.

[00:43:42]

So it's a lot of fun. It's a really cool book, and it helps you understand success and skill and luck.

[00:43:47]

That sounds fantastic. Well, Sam, thanks again for being a guest on Rationally Speaking; it's been a fascinating discussion. Join us next time for more explorations on the borderlands between reason and nonsense. The Rationally Speaking podcast is presented by New York City Skeptics. For program notes, links, and to get involved in an online conversation about this and other episodes, please visit rationallyspeakingpodcast.org. This podcast is produced by Benny Pollak and recorded in the heart of Greenwich Village, New York.

[00:44:28]

Our theme, "Truth," by Todd Rundgren, is used by permission. Thank you for listening.