
Everyone, just before we start our show, we have something to ask you. In order to continue providing coverage of the U.S. economy and other events throughout the country in this extraordinary year, and to continue keeping up with how workers and other regular folks are doing, we would so appreciate it if you would donate to your local NPR member station at donate.npr.org/indicator. It would go a long way towards continuing to be able to provide the kind of journalism that you expect from us.


Thank you so much and happy holidays, NPR.


Hey, everyone, Stacey and Cardiff here. This is The Indicator from Planet Money. As we come to the end of the year, with covid infection rates and death rates having climbed so much, it's worth remembering that there were warnings out there about the world's vulnerability to a viral threat: warnings from the World Health Organization and the World Bank, for example, in a huge study last year, warnings from research scholars and long magazine articles, even from Bill Gates in a famous TED talk in 2015.


If anything kills over 10 million people in the next few decades, it's most likely to be a highly infectious virus rather than a war.


Plus, there have been numerous close calls in the last two decades, when epidemics in some part of the world looked like they might become pandemics: SARS, MERS, Ebola. And yet, even as coronavirus started spreading, people and governments throughout the world were unprepared. There was not enough protective equipment. They were slow to produce enough tests for the virus. They took a long time to implement social distancing.


And a big part of the problem is that when it comes to preparing for a big disaster like this one, we humans are fighting an uphill battle against a powerful force, our own brains. The mind is full of psychological biases that can lead us not to take seriously enough these warnings of impending doom.


Tim Harford wrote about this in an article for the Financial Times magazine, and we spoke with him about it earlier this year. And we wanted to rerun the story because we've been thinking about some of his observations ever since we heard them. For example, Tim talked about something called normalcy bias.


Normalcy bias is sometimes called negative panic. It's just this idea that when things are falling apart all around you, when there's a total disaster, people very often don't seem to realize how bad things are. They're often surprisingly calm, even when they should actually be taking very prompt action, as in a pandemic.


The number of infected people grows exponentially so fast that it can be hard to understand how it's spreading at the rate it is. But normalcy bias might prevent us from seeing just how bad things are getting until we are overwhelmed by the problem. There's also something called optimism bias, which is the tendency to think that even if a disaster is happening to, you know, everyone around me, I myself won't be affected. So I don't need to take precautions.
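As an aside, the arithmetic behind that exponential growth is easy to sketch. This is a minimal illustration with made-up numbers, not epidemiological data from the show: it assumes a hypothetical 100 starting cases and a doubling time of three days, just to show how quickly the count runs away.

```python
# Minimal sketch of exponential case growth, using made-up numbers:
# 100 initial cases, doubling every 3 days (both are illustrative assumptions).
INITIAL_CASES = 100
DOUBLING_TIME_DAYS = 3

def cases_after(days, initial=INITIAL_CASES, doubling=DOUBLING_TIME_DAYS):
    """Return the case count after `days`, doubling every `doubling` days."""
    return initial * 2 ** (days / doubling)

for day in (0, 9, 30):
    print(day, round(cases_after(day)))
# Day 0:  100 cases
# Day 9:  800 cases -- still looks manageable
# Day 30: 102,400 cases -- the same doubling rule, a month later
```

The point of the sketch is that the early numbers look deceptively flat, which is exactly the window where normalcy bias does its damage.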


For example, in a study of what people expected as Hurricane Sandy was coming in to hit the coast of New Jersey, people actually expected that it would be worse than the meteorologists did, but they thought that they themselves would be fine. They didn't really need to worry.


So that's optimism bias. And there is also the herd instinct, which Tim says just reflects that we are all very social creatures. We take our cues from what other people do.


Yeah, there was just this general sense: there's this coronavirus thing, it seems pretty bad, it looks pretty bad in Italy. Maybe I should do something. Maybe I should do something, but nobody else is really doing anything. And then there's that moment where you're like, oh, I need to get toilet paper, I need to get food, I need to get masks, I need to sell my shares. It's too late. Everyone's doing the same thing at the same time.


It's the herd instinct.


So these psychological biases can help explain why the world did not start preparing itself weeks or months earlier than it did, once coronavirus started spreading within China, which is where it originated. But the biases do not fully explain why so many countries throughout the world ignored the warnings of the past and did not have a better plan in place for the next pandemic, even before the coronavirus started. And a big reason actually has to do with the ways we make economic decisions. More from Tim Harford after the break.


This message comes from NPR sponsor Molekule, maker of the Molekule air purifier. The Molekule air purifier uses patented light-activated PECO technology, designed with a clean look to fit seamlessly into your home and built to cover a variety of room sizes. Each home device is backed by a 100 percent refundable 30-day home trial. Save up to one hundred and fifty dollars at molekule.com with promo code NPR. That's Molekule, spelled with a K.

In good times, one of the reasons that the economy grows every year is that businesses and workers are always trying to become more productive, more efficient. And for the most part, it's great that they do that.


Becoming more efficient year after year makes it possible for our wages to go up and for new goods and services to be invented. It makes our lives better. It makes the world a richer place.


But that consistent push to be more efficient also makes it hard for a business to justify spending money on preparing for an event that may never happen like a pandemic. That's according to economist Tim Harford.


So if you think about a supermarket, a supermarket could say, hey, I'm going to have a supply chain backup. I'm going to have a load of drivers who can deliver online groceries. And I'm just going to pay them to sit around with their vans, just in case, just in case there's a pandemic, because then I'll really reap the benefits, because suddenly I'll have all this extra capacity. But that supermarket is going to be wiped out in the market pre-pandemic.


Yeah, and a similar rationale can also apply to policymakers and to their plans for a pandemic, because policymakers answer to the public. And given that there are people who struggle to feed their families right now, people who need health care now, bridges that need rebuilding now, well, it's just hard for a politician to say, hang on a minute, we're not going to spend money on those things. Instead, we're going to spend money on making masks and ventilators that we might never need.


And we're also going to spend money on facilities that can develop tests and vaccines for a pandemic that may never come.


It's a really hard sell. In fact, one politician actually did try this back in the mid-2000s.


Arnold Schwarzenegger, when he was the governor of California, spent a lot of money on this big fleet of mobile hospitals. There were going to be tens of thousands of beds, there were going to be respirators, a big stockpile of masks. And the idea was, you have an earthquake, you have wildfires, or in particular you have a pandemic. But of course, about three or four years later, his successor, Jerry Brown, cut the funding for the scheme.


No one complained at the time, and the stockpile is now nowhere to be found.


The financial crisis hit in 2008 and the state just needed money for other things. And that is why the lack of preparation for future emergencies can be such a hard problem to solve. There are always immediate needs in the economy that we can actually see, and so it's just tough to justify spending money on, you know, a theoretical pandemic that you can't see and might never happen.


Yes, so we have all this evidence that people and governments are bad at preparing for a pandemic. We don't listen to warnings and we don't seem to learn much from our past experiences. But Tim argues that the situation is not totally hopeless. He says you can look at how quickly and how forcefully central banks and governments across the world have acted to prevent a bigger economic collapse in response to the coronavirus.


And that seems to have happened because they learned one of the lessons of the great financial crisis, which is basically that you have to throw a lot of money at a problem as quickly as possible. But that's on the economic side, says Tim.


But the people in public health, they don't have that same direct parallel. So it's not as easy for them to look back at a recent catastrophic pandemic, you know, not SARS, but the flu of 1957 or the flu of 1918. They don't have that recent memory and those recent lessons.


Because coronavirus has been a catastrophic crisis for so much of the world, maybe the world is more likely to learn the lessons it needs to, so that it can prepare for the next pandemic, which could be much worse.


Or maybe it's like the financial crisis, which we thought was the big crisis of our life, and it led us to prepare in all kinds of ways. And then the next crisis was something totally different. It was a health crisis. So maybe the crisis after this one is going to be artificial intelligence running riot, or maybe it'll be something to do with climate, the environment. So there's always a danger of fighting the last war. I guess what I'm saying is it's hard to be prepared for everything.




You know, Stacey, I myself am not worried. Like, I think things will just stay pretty normal. And even if not, I'm optimistic that I personally won't be affected, because I look around and, you know, I don't see the rest of the herd freaking out. And so it begins again.


We'll have a link to Tim's piece in the Financial Times magazine over at npr.org/money, along with a link to Tim's own podcast, Cautionary Tales, which is, topically enough, a podcast about how we make mistakes and learn from them. This episode of The Indicator was produced by Darius Rafieyan and fact-checked by Brittany Cronin. Our editor is Paddy Hirsch, and The Indicator is a production of NPR.