[00:00:01]

My guest today, Tim Harford, has an uncanny gift for making economics interesting. His first book, The Undercover Economist, came out over 15 years ago, and it's still the first book I recommend to anyone who asks me for advice about what to read. He's wildly popular in the United Kingdom, where he has a weekly column in the Financial Times and hosts a popular BBC radio show that investigates the accuracy of statistical claims in the news.

[00:00:28]

Outside the U.K., he's not as well known as he should be, but that might change with his latest book, The Data Detective: Ten Easy Rules to Make Sense of Statistics.

[00:00:38]

Welcome to People I Mostly Admire, with Steve Levitt.

[00:00:45]

I first met Tim Harford right before Freakonomics was published. I was literally the first person he ever interviewed. He wasn't even a journalist.

[00:00:53]

He was working at the World Bank, hoping to start a second career. Usually I would turn down interview requests like that without a second thought, but something made me think it would be worthwhile. And indeed, he turned out to be the second most interesting, clever journalist I've ever interacted with, after Steve Dubner, of course.

[00:01:16]

Tim Harford, in the past it's always been you interviewing me. Today, I exact my revenge, and I hope you're ready.

[00:01:23]

I'm as ready as I'll ever be. Steve, it's great to join you on the show. We've all heard of Florence Nightingale, the lady with the lamp who became famous during the Crimean War as a pioneer in nursing. But in your new book, The Data Detective, you tell a side of Florence Nightingale I had never heard of.

[00:01:40]

Would you mind telling it to the listeners? Florence Nightingale. She's famous as a nurse. She was the first female member of the Royal Statistical Society. She's celebrated as a statistician in nerd land. And Steve, I thought you were a citizen of nerd land, so I thought you would have known this, but perhaps not.

[00:01:57]

I guess I'm not an official member of nerd land yet. So as well as being this remarkable nurse and remarkable pioneer in nursing and the education of nurses, she was big into pie charts and she was big into data. And the story I tell is of the most famous data visualization she produced, something called the rose diagram. She's charting the soldiers dying in the Crimean War, which is this grim and pointless war in Crimea that the British got involved with.

[00:02:29]

Where did she even get the data? She visited Crimea, but she was in Istanbul, which is where the hospitals were.

[00:02:35]

Part of the data she gathered herself. Most of it she put together very carefully afterwards. But the message that she wanted to bring out in the data was pretty simple, which is: a whole lot of soldiers died, and they died of communicable diseases such as dysentery and cholera. And that particularly happened in the first half of the war. Then, halfway through the war, the British medical system really cleaned up its act, literally: they found a dead horse in the drinking water.

[00:03:09]

They found that the latrines were leaking into the drinking water supply. It was just appalling. She got a commission to come over from the UK to help her clean up these hospitals.

[00:03:20]

And then after that, the soldiers stopped dying of communicable diseases. The rose diagram is basically two blue spirals.

[00:03:28]

So these two spirals are showing what happened over the course of two years. One of them is large and growing, and the other is much smaller and shrinking. And they are showing deaths from communicable diseases. The story she was telling was: it turns out it's a really good idea to clean your hospitals and to wash your hands and to take really good care of hygiene. What makes it such a powerful story is you've got this sense of before and after: look at the catastrophe in the first year, and then we changed everything, and look at what happened, how we fixed the problem and people stopped dying.

[00:04:05]

It's a remarkably modern way of thinking. I always thought of her as being saintly, but now I will elevate her to whatever is right above saint.

[00:04:12]

You could criticise her, though. It's not that this data visualization is misleading. All the data's there, it's all accurate.

[00:04:19]

But the way that she framed the chart made it look like there's absolutely no possible doubt that that was what was going on. There is something that makes me slightly uncomfortable when I see any argument presented in a particular way that really slants our view as to the truth.

[00:04:40]

There were just more casualties in the first half of the war. So there were more people arriving in the hospitals, so maybe it's not a surprise that more people died.

[00:04:48]

She, I think, was completely scrupulous in the way she used the data. And I think ex post she was completely right. Hygiene standards really matter.

[00:04:58]

But you could argue that a more modern neutral presentation of the same data would have left open questions as to what was really going on. You've got to remember, all this is happening before germ theory, before Louis Pasteur, before we really knew what was causing these diseases.

[00:05:16]

Well, it's also true. I think most people don't understand how recent statistics is.

[00:05:21]

I mean, math goes back forever, before the Greeks, but our understanding of statistics is remarkably modern. In the 1850s, statistics was just a mess. And what is so interesting about the history of it is just how bad everyone was, and how the first people who tried got it all wrong. It's not the way that science is usually written, where a genius like Einstein or Darwin comes in and just tells you the right answer and everything changes.

[00:05:50]

It's hard. It is properly hard.

[00:05:52]

And Nightingale, I should say, she wasn't a professional statistician. She was very smart and she had a lot of good quality mathematical training. She worked very carefully with some of the leading statisticians of the time. But she was in this very interesting position of being very influential, in the sense that she knew a lot of powerful people. She was literally the most famous person in Victorian Britain, except for Queen Victoria. Yet at the same time, she's a woman in a man's world, trying to challenge the British medical establishment and the British military establishment.

[00:06:24]

She's telling them they're doing it all wrong. She was basically saying we need massive public health reform because the hospitals are breeding grounds for disease.

[00:06:33]

And the chief medical officer at the time, a guy called John Simon, was saying, well, you know, yeah, disease is bad, but there's nothing we can do about it. And she was saying, no, we can do something about it. We just need to clean everything up. We need much better sanitation, much better public health measures. She basically won the argument with her graphs and the British parliament changed the laws, medical practice changed, public health standards improved and life expectancy jumped.

[00:07:03]

She really changed standards of health in the UK. It was remarkable.

[00:07:14]

I think there's a deep problem with modern nonfiction, and that is the idea that the book needs to have a theme, that it can't just be a bunch of good stories, that it has to be something more.

[00:07:24]

I would say your new book, The Data Detective, is one of the most wonderful collections of stories that I have read in a long time.

[00:07:33]

I read it in two days. It's fascinating. But I think if I asked you what your book was about, you'd say this:

[00:07:38]

It's a book that's been written to help people better evaluate the truth or falsehood of data-based arguments through the use of ten rules of thumb. That's probably how you'd describe the book, right?

[00:07:49]

Yeah, yeah. I think that's about right.

[00:07:52]

But I think that's a total ruse. You use that as an excuse to tell your wonderful stories, because with a podcast or a column, you can just tell great stories. But if you write a book, you have to do something more. Do you think I'm being unfair?

[00:08:04]

Yeah, I'm immediately trying to remember exactly what that really nice thing you said was, before you said "but," so I can write it down and stick it on the cover of the paperback.

[00:08:13]

Steve Levitt says this is the greatest collection of stories ever.

[00:08:17]

Look, I'm a fan of stories. I like reading good storytelling. I love the challenge of telling a good story because it's really hard to tell a good story while also being truthful and rigorous and telling people everything they need to know. But stories always simplify. They always leave certain things out. And that, I think, poses a challenge. What do we do about that with nonfiction? I'm not too worried myself. I try to be like Florence Nightingale, make sure the facts are unimpeachable, make sure everything's absolutely correct, and then tell your cool story.

[00:08:52]

But even if I live up to that standard, people might not remember the cautionary details. They might not remember that little wrinkle, all that complexity. What you write as a writer and what people take away, what they recall as readers, are not always the same thing, that's for sure.

[00:09:07]

And I will say when I read books that are popularizing social science, there's typically some material in there I know well enough to evaluate whether people are being truthful. I will be honest with you that most of the popularizers of social science, I think they're fundamentally dishonest in the way that they take the ideas of academics and warp them to tell stories that they would like to tell. I've never seen you do that. I was going to ask you, actually, whether that was something that is really a goal of yours.

[00:09:40]

And it sounds like the answer is absolutely yes.

[00:09:42]

You hold truth to be very important. Yeah, I'd like to tell people the truth. And I'm conscious that, of course, you always make mistakes. I get things wrong. And when I get things wrong, I publish corrections where I can. But I'd like to get it right. Actually, just before we began this interview, I was contacted by the world's leading scholar of Florence Nightingale because she had read something I wrote and she really liked it and she wanted to see the whole book.

[00:10:07]

I sent her the chapter. But of course, I'm now terrified, because I know she's going to find something that's wrong.

[00:10:13]

But at the same time, it's like, well, I tried my best. I hope it's right. And if there's a mistake, well, I'm going to find out, because she'll tell me. So, yeah, it's important to me. What is the point of being interested in the world? What is the point of being interested in the data, in evidence, in ideas, if you then go out and misrepresent them? But I often find that actually mistakes are instructive.

[00:10:40]

I think admitting that you've made a mistake and discussing why you made it is a public duty, but it's also kind of fun. This one is completely trivial.

[00:10:49]

But on the radio program I present for the BBC, More or Less, which is about maths and statistics, we had a piece about a pop song by Kate Bush, which has her singing nearly 100 digits of the decimal expansion of pi, because she's kind of a geeky person and it's quite beautiful.

[00:11:06]

So she's just singing these numbers. And there's a mistake in it: probably because a producer just spliced together some tape, there's a number missing. And so we had this fun item about this. And then I said, well, the decimal expansion of pi is infinite, so I guess eventually whatever string of numbers she sang does in fact appear in the decimal expansion of pi. And it turns out that's true only

[00:11:38]

if a certain fact about pi is true, and that certain fact about pi no one has ever been able to prove. So we were able to get a mathematician on and get really deep into number theory, and we were just in there having fun with numbers. It never would have happened if I hadn't made this mistake. And there were zero stakes there. Nobody cares about that mistake.

[00:11:59]

But I think it's always better to own up and to try to learn and to try to teach others, rather than just sweep it under the carpet.

[00:12:09]

From a social perspective, of course, that's true. But from a private perspective, I have found the mistakes I've made to be incredibly painful, embarrassing, and time consuming.

[00:12:21]

A lot of people don't like me, so when I make mistakes, it gives people remarkable fodder to come and attack me. So what I really have tried to do is to stop making mistakes. When I hire a research assistant, I say the only thing that matters to me is that if you ever find a mistake, you tell me that you found the mistake. And I don't care if it's the last day of your job and every single thing that you worked on will be a waste for me.

[00:12:48]

It's a cost-benefit thing: I now avoid mistakes with every ounce of my body.

[00:12:53]

Yeah, I think it's important, but it is tricky. And the conversation that you have with your research assistants, I wonder how many people have that kind of conversation? I think a lot of organizations, whether it's just a small relationship between an academic and a research assistant or whether it's a big hierarchy, there are incentives to just bury bad news.

[00:13:19]

So in your new book, you have these rules of thumb for being a consumer of statistics, and I think they all make a ton of sense: things like don't let your wishful thinking blind you, or try to put any specific set of arguments into a broader context, or consider your personal experience with things as you evaluate whether a data-based claim makes sense.

[00:13:45]

But there were two things I wished you had done in the book that you didn't do. The first was to acknowledge just how difficult it really is for a layperson, or someone who is very expert about something but not expert about the exact topic at hand,

[00:14:05]

to be very insightful about whether a claim is true. I'm really struck by how difficult it is for me to evaluate the truth in economic debates that are just a few steps away from what I know. And I think it's a helpful starting point to begin with the view that, look, I probably can't figure out whether something I'm reading in the paper is right or wrong. It's a different stance, because I think somehow we're trained, especially as social scientists, to believe that we can get to the truth.

[00:14:32]

But if everyone could just start by saying, look, how in the world do I know whether some argument made by a physicist is right or wrong? I don't. I can't.

[00:14:42]

And so I should use a lot more caution as I approach the work.

[00:14:46]

Well, before you tell me the other thing that you wished I'd done, let me respond to that. You're right. But I really wanted to encourage people to have a little bit more confidence in their critical judgment and their ability to ask smart questions in evaluating the statistical arguments that get made.

[00:15:04]

The reason I wanted to do that was partly because I think there's a lot of negative messaging around. There's a lot of people saying you can't understand. It's super complicated. There's no way you'll ever be able to know.

[00:15:15]

So just give up, and do whatever it is that the newspaper columnist you follow tells you to think, or the political leader that you like, whatever he or she tells you to think, just follow that. I wanted to push back against that and say, no, you can ask smart questions and you can derive some insight. And the reason that I believe that is possible is because a lot of these questions are actually not that hard.

[00:15:37]

And you're right, Steve, I can't critically evaluate your abortion and crime work. It's too complicated for me.

[00:15:43]

I've heard you describe it. I've heard you explain why you think it makes sense and that makes sense to me.

[00:15:48]

But if someone were to ask me for an independent evaluation of whether I think you've made a mistake or whether everything's solid, I would have to say, I don't know. I don't have the technical expertise. So some of this stuff is too hard, but a lot of it's not that hard. Let me give you a specific example. The health secretary of the UK said over the summer, if everybody in the country who was overweight lost five pounds, then the British National Health Service would save 150 million pounds over five years.

[00:16:20]

That's about 250 million dollars. And loads of people emailed me and said, hey, you're the data guy, Tim.

[00:16:26]

How does he know this? What's the evidence base for this? My answer was, hang on. We don't need to go into an evaluation of the evidence base for this claim. We just need to understand what the claim is.

[00:16:38]

The population of the UK is about 70 million. So he's basically saying that if people lost weight, we'd save a couple of pounds per person. Actually, if I remember rightly, it was 100 million pounds, so it was a pound fifty per person, and it was over five years.

[00:16:54]

So now he's talking about thirty pence per person per year, which is about 50 cents. So what he's saying is, if everybody who's overweight lost weight, then the UK's health care system would save 50 cents per person per year. At which point you go, what does it really matter? It's just a distraction. I mean, my nine-year-old son could do the mathematics required to solve that problem.
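For anyone who wants to retrace that back-of-envelope arithmetic, here is a minimal sketch in Python using the rounded figures Harford quotes on air (the population, savings, and currency conversions are his rough approximations, not official statistics):

```python
# Back-of-envelope check of the health secretary's claim,
# using the rounded figures quoted in the conversation.
savings_gbp = 100_000_000   # claimed NHS saving: 100 million pounds
population = 70_000_000     # rough UK population
years = 5                   # the saving accrues over five years

per_person = savings_gbp / population   # about 1.43 pounds ("a pound fifty")
per_year = per_person / years           # about 0.29 pounds, roughly 30 pence

print(f"{per_person:.2f} pounds per person in total")
print(f"{per_year:.2f} pounds per person per year")
```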

[00:17:17]

You don't even need a calculator. And there are a lot of claims that get made where you don't need to go very far before you can say, yeah, that makes sense.

[00:17:25]

That really helps me understand. Or, oh, this is complete nonsense, it's obviously wrong by a factor of a thousand, and I can see that clearly.

[00:17:32]

What I hear you saying there is that there are two ways in which arguments can go wrong. One is that the facts can be off, and the other is that the interpretation made, given a set of facts, can be wrong. I think that if we divide problems into those two pieces, everything becomes much simpler. If people don't agree on facts, then we should go and evaluate the facts and figure out what the facts are. If they don't agree on the interpretation, then I think that is a much easier problem for the human brain to tackle than the problem of: how do I take storytelling and

[00:18:14]

facts and everything all mixed together and try to parse out what's important? And I think that's actually in the background in your book, really lurking in your book and your own thinking. Yeah, I think that's right.

[00:18:25]

It comes to the foreground in the conclusion where I talk about the illusion of explanatory depth, which I love.

[00:18:32]

But the illusion of explanatory depth is basically this: if you ask people, how well do you understand how a zipper works on a scale of zero to seven, most people will say, yeah, six, I understand it pretty well. And then you say, OK, great, here's a pen and paper; use diagrams, bullet points, whatever, just explain to me exactly how it does work. And it turns out, actually, they don't really know how it works.

[00:18:55]

The illusion of explanatory depth says just asking people to lay out the facts may help them to understand that maybe they don't actually know the facts, maybe they don't understand the thing that they're arguing about. It turns out that if you use a similar tactic for, say, policy choices, asking people to just explain how a cap-and-trade system would work, then people who are willing to die in a ditch over whether cap and trade is a good response to climate change or not

[00:19:24]

turn out not to really know how it works. When you ask them to explain it, they start to realize, I don't completely understand this. Maybe I should moderate my political views. Maybe I shouldn't be so critical of people who disagree. So this process of laying out the facts, which I think is worthwhile in and of itself,

[00:19:43]

has this bonus: it actually gets people to reflect and be a bit more humble about the limits of their own knowledge.

[00:19:56]

The other piece that I think is really important for laypeople understanding data, which I didn't see you cover in the book and which I want to mention, is thinking hard about the incentives of the people who are putting forth the argument, and being suspicious of any argument in which the incentives are such that the creators of the argument could benefit in any way. Yeah. I'll give you an example. In the academic literature on guns, I have never seen an academic paper where, from the name of the author and knowing what the author has written before, I didn't already know the answer that the author would find.

[00:20:35]

There just aren't cases where people who are, you know, pro-gun suddenly look at a data set and say, oh my God, this particular question I just asked leads me to believe that guns could be bad in this setting. And so I'm very skeptical of that literature. And my rule of thumb is, the more ignorant I am of a particular topic, the bigger the weight I put on simply looking at the incentives of the providers of the information and judging the reality based on that.

[00:21:04]

I think there's a lot of wisdom in that. The reason that I didn't do that is not because I don't agree, because I do agree. It's because I feel that people have received that message over and over again. I think people are constantly being told to be suspicious of the motives of people who are telling them things. And I think we may have gone too far, because although it's true, I think it's bred a lot of cynicism. A lot of people worry that we'll believe anything.

[00:21:34]

And actually what I worry about is that we believe nothing at all, that we're completely skeptical of everything. And we just think, well, they're all lying to us, it's all fake news. So that's what was very much on my mind in writing the book. And actually, it's interesting, because Freakonomics is a book that doesn't make that mistake. Freakonomics, right from the start, is a book that says, hey, let me tell you something really interesting about the world using this data.

[00:21:56]

But most books about data actually don't do that. Most of the books about data that I've got on my shelf are written by eminent economists and statisticians explaining all the different ways in which data can be used to lie to you. And of course, it's a really engaging way to talk about data. But there is this worry that I have that people hear that message over and over again.

[00:22:18]

And in the end it becomes an excuse to just go, I can't believe any of these people. I don't trust any of the experts. I'll just believe whatever my gut tells me, or whatever I feel should be true.

[00:22:28]

And I'm not going to look at any evidence because you can't believe any of it.

[00:22:36]

You're listening to People I Mostly Admire with Steve Levitt and his conversation with economist and author Tim Harford. After this short break, we'll return to talk about gun data and Tim's honorary title bestowed by the British monarchy. I love the exchange we had where I said to Tim, your book is full of wonderful stories, and then went on to say all the things I didn't like about it. And his response was not to be defensive, but instead to say, I need to write down all the nice things you said before the word "but" and put it on the cover of my paperback.

[00:23:18]

What a great reaction. And I suspect my partial quote really will end up on the cover of his book. In the second half of the interview, I'll try to get him to retell one of my all-time favorite stories of his, and also get him to reveal his secrets for telling great stories about data and economics. Steve, I wanted to ask you about gun data, actually. So one of the points that I make in the book is we shouldn't take the data for granted, because it's easy to have this mental model,

[00:23:47]

certainly as an outsider, that the data is just something that exists in spreadsheets. You can just download it from the Internet and you crunch it, and then once you've crunched it, out come insights.

[00:23:58]

And actually, data has to be gathered; it doesn't just accumulate by accident. We should be aware that there are certain bits of data that could be gathered and just aren't. That's what I wanted to ask you about guns, because, forgive me, I'm a Brit.

[00:24:11]

So I do not understand the American debate about guns, but my understanding is that there's a lot of political interest in the US as to what data on guns can and cannot be gathered. Certain questions can't even really be asked, because it's against the law to even collect the data. Have I got the wrong end of the stick, or have I understood?

[00:24:30]

I think you're exactly right. The National Rifle Association has been extremely successful at limiting the collection of data around guns, and that has really hamstrung the academic research into it. In fact, one of the most clever papers ever done on guns was done by my good friend Mark Duggan. He was simply trying to figure out how he could determine how many guns were in different places. He had an incredibly clever idea: to go to a different data source, which is magazine data.

[00:25:03]

There was enormously carefully collected data on magazine circulation, because that's how advertising payments are determined. And so he used purchases of handgun magazines as a proxy for purchases of handguns. What was very difficult and clever about the paper is he actually showed that, over time, the changes in the number of guns correlated very, very highly with his measures of magazine subscriptions. And so he used that as a proxy and actually was able to say interesting things about guns. Things that can't be measured,

[00:25:34]

it's very difficult to regulate or control them. I think the NRA has understood that for a long time, and they've been very, very effective at making sure that guns can't be measured. If you think about the economy, imagine that we couldn't measure income or we couldn't measure GDP. That's kind of the equivalent when it comes to guns. We just don't know how many guns there are in different places and how that changes over time. And so it's really hard to study the problem, and certainly extremely hard to get at causality.

[00:26:04]

But of course, there was a time where we couldn't measure incomes and GDP wasn't even defined. So no one was gathering GDP data. There was a point where we didn't have data on inflation and prices. And one of the points I'm making in the book is at a certain point, people said, we need the data on this. We actually need to devote some time and attention, expertise and money to getting these numbers because they help us see things about the world.

[00:26:28]

And so, to anybody who starts talking about lies, damned lies, and statistics, and demeaning statistics by saying, oh, they're always used in a misleading way:

[00:26:38]

you know, the NRA understands how important statistics are, because they really, really don't want them to exist, and they're quite effective at ensuring that they don't exist. I think that proves the point as much as looking at all the data that has been gathered: look at the data that people are trying to prevent from being gathered.

[00:26:58]

One of my favourite stories is from your book Messy, about creativity.

[00:27:02]

It's about the jazz pianist Keith Jarrett. Could you tell it?

[00:27:07]

The story begins in 1975, when this German teenager called Vera Brandes walks out onto the stage of the Cologne Opera House, bursting with excitement, because a few hours later Keith Jarrett is going to be on that stage improvising. He's a great jazz musician. He's going to be sitting at this piano, and he's going to be just playing whatever comes into his head. And all this has come about because she is the youngest jazz promoter in Germany. She's 17 years old.

[00:27:39]

She just loves jazz. And she's managed to score this amazing coup of getting Jarrett into the opera house to play this late-night concert. When Jarrett actually comes on the stage to check out the piano, it immediately becomes clear that something has gone wrong. There's been a mix-up. They've brought out a rehearsal model. The keys are sticky, the pedals don't work. It's too small. It sounds tinny. It is a bad piano.

[00:28:04]

And Jarrett says, well, I'm not going to play. But it turns out there's no way of getting a replacement piano on the stage in time; it's not possible. The tickets can't be refunded because of the way the concert's been set up. This teenage kid is about to be ripped apart by the fourteen hundred people who show up for a concert when there's no concert. And so Jarrett takes pity on her. And although he's a real perfectionist, although he likes things exactly the way he likes them, although he feels the piano is completely unplayable, he just thinks, I've got to do it, because I've got to help this girl out.

[00:28:37]

And so, a few hours later, he walks out on stage, sits down at this piano that he knows is unplayable, and begins to play. And instead of the musical catastrophe that he expects, it's a masterpiece. The concert was recorded, supposedly to provide documentary evidence of what a musical catastrophe sounds like.

[00:29:01]

But in fact, once it was remixed, it sounded great. Many people think it's his best work. It's easily his most successful work. The concert was released as The Köln Concert, the best-selling jazz piano album in history.

[00:29:15]

And it only got played because Jarrett felt he'd been backed into a corner and he couldn't let this girl down. He thought, this is terrible: it's a bad piano, it's got to be a bad concert. But he was, of course, forced to play in a different way and to improvise in a different way. He stuck to the middle of the keyboard, which made it sound very soothing and ambient, because the upper register sounded terrible on such a small instrument.

[00:29:39]

The piano was also quiet, so he was pounding down on the keys to try to create more volume. So there's this weird tension: he was playing this kind of nice, ambient sort of music, but he was really hammering it hard and playing with a lot of energy. And there's just something about that that worked really well.

[00:29:56]

It's how I begin my book Messy, the book that's really all about how disruption and challenges and weird, ambiguous stuff that messes us around

[00:30:09]

can actually lead to a problem-solving response. I love that story.

[00:30:14]

But what makes it reverberate in my head is the fact that I don't know what conclusions to take away from it. One is, well, it's really good to be a nice guy, because Keith Jarrett did this person a favor and it ended up paying off for him. Or maybe the idea is that real geniuses are able to overcome adversity. Or that if you put up artificial obstacles to success, then that leads to unexpectedly good outcomes, so we should be in the business of putting a lot of obstacles

[00:30:45]

in the way. I don't know what your takeaway is.

[00:30:47]

I think when you're doing the work of the nonfiction writer, you're trying to make a particular argument for a particular view of the world.

[00:30:53]

So if I was writing a book that was all about the power of altruism, that would be a cool story about someone who did someone else a favor.

[00:31:00]

But actually, I think it's a better example of how disruption produces this creative response. The way I back the story up is to say, OK, let me tell you a completely different story. It's actually a piece of research about a strike that shut down half the London Underground for 48 hours. When researchers looked at the data, they found that tens of thousands of commuters had changed their route because of the closures. And then, at the end of the 48 hours, they never changed back.

[00:31:29]

So they had discovered a better way to get to work, and all it took was this perturbation to the system. I would tell you the single best example of that that I've observed in my entire life is COVID-19. COVID-19 disrupted all sorts of things, like the idea that you could work from home effectively. And I think that in many dimensions we will never go back to the things we were doing before. But no one would ever have had the nerve to experiment to the degree that COVID-19 forced us to change our behaviour and to learn what worked and didn't work.

[00:32:05]

Yeah, I think that's absolutely right. And I think there are things that we won't go back to, not because we can't go back, but simply because we learned we could have done it that way the whole time. And why didn't we?

[00:32:21]

I've been calling you Tim, but I understand the royal family has bestowed an honorary title on you. What is it called, an OBE? It's a sort of mini-knighthood.

[00:32:31]

Of course, Britain being Britain, it was all tied up with the royal family. I have a letter from the Queen, I went to the palace and all of these exciting things, and I met Prince Charles. But fundamentally, the British government has decided to say thank you.

[00:32:47]

What is Prince Charles like? Well, I met him for about 30 seconds. It's quite an interesting operation, because there are like 100 people in the room and some of them are very famous, but for most people, he doesn't know who you are. And so as you approach, your name has been called out, and as you walk, somebody is whispering in his ear, presumably saying, this guy has a radio show.

[00:33:11]

I mean, he's on his feet for an hour and a half, just shaking someone's hand and then the next person's and then the next person's. He said, well, we're giving you this award to encourage you to keep going. And that was oddly moving. It's a very British thing to say: oh, just keep it up, carry on.

[00:33:29]

And yes, this is the British spirit. Just do some more. Don't stop. So you have this OBE, but it seems like your honors started piling up early.

[00:33:39]

I read someplace that you were the world champion of schools persuasive speaking. Is that true? And what does that even mean? And where did you find that out?

[00:33:48]

Yes, I was the World Schools persuasive speaking champion. You know, there were teams from Cyprus. The Australians didn't come, and the Australians are supposed to be really good. But the Americans came and the Canadians came. The Canadians are really good.

[00:34:02]

So I'm really interested in the subject of persuasion. I suspect that you not only were the world champion at schools persuasive speaking, but you also have insights for everyday people about how to make an argument persuasive.

[00:34:20]

Well, 1992 was a long time ago, but OK, here's how I think about it. First of all, I want to get my own head straight. I am worried enough about making a mistake myself before I get remotely interested in persuading anybody else. A lot of people have responded to The Data Detective by saying, oh, I've got this friend who's a total idiot.

[00:34:45]

I get into these arguments on Twitter with these idiots and how can I persuade these idiots not to be idiots?

[00:34:50]

And I always say, just start with yourself. If you can make sure that you're not an idiot, you've done so well.

[00:34:56]

It's such a difficult thing. Don't worry about anybody else. So I've turned away from persuasion in recent years. But OK, with my own head straight, if I were going to persuade, I would go for a memorable story.

[00:35:09]

Stories are not so threatening. You're not attacking anybody. You're giving them something they'll immediately find interesting, and they'll follow the story along and they'll be curious, and it starts to open their minds. So if you're talking to people in terms of stories, you're lowering the instinctive psychological defenses that basically say, this guy is challenging my sacred beliefs and I'm the champion of all that's right. Stories, I think, get people into a more open-minded frame of mind.

[00:35:39]

I guess that's what I do in my books. But I don't think of it as persuasion. I think of it as hopefully giving people something that they find interesting and engaging. But if people are interested and engaged, you at least then have a chance of persuading them. If they're not interested and they're not engaged, you're going to get nowhere.

[00:35:56]

I agree 100 percent that roughly the only thing that ever persuades anyone is a good story. And the other thing I've come to believe, I think you probably would agree with me, is that almost all good stories share a lot of commonalities. So good stories are almost always about people.

[00:36:14]

There are heroes and anti-heroes in the stories. There's some kind of a conflict. There's some kind of rising tension, which is then resolved in some unexpected way. Would you agree with that assertion?

[00:36:27]

Yeah, there are various theories about how stories work, but yes, I think that's right.

[00:36:30]

And not to be confused with anecdotes. Very often people talk about stories, but actually what's going on is, I gave you a little example. And that's fine, a nice example is fine, but it's not really a story.

[00:36:40]

Yeah, stories have a beginning, a middle, and an end.

[00:36:43]

So what's interesting is I used to teach a lecture in my course on data to the undergrads where I talked about storytelling with data. And it always left me feeling a little bit strange because it wasn't very powerful.

[00:36:57]

And then one day I just sat back and I thought about it, and I thought, wait a second: if a good story has all of the elements we just talked about, analysis of data never leads to a good story. There's almost never a person involved that you can identify. There's almost never any intrigue or uncertainty. There's not a twist at the end. And I completely redid the lecture. Now the way I start the lecture is by telling a great story, and everyone laughs.

[00:37:22]

I think it's a great story, and I talk about what made it a great story. And then I say, let's take some examples with data and see how we would turn them into stories. And it becomes really clear to everyone that you cannot tell great stories with data.

[00:37:33]

And I really came to the conclusion that when it comes to data, you should just completely abandon the idea of telling stories; you should use data and just explain the truth.

[00:37:45]

Yeah, you're weaving together these two things. You've got the truth, as evidenced by the data, and then you've got some story that people are going to remember. And the people who are really good at this will just weave the two together so you can hardly tell them apart.

[00:38:01]

But they are different. And Florence Nightingale, it turns out, was great at this. But the story and the data aren't the same. You're absolutely right.

[00:38:17]

It really strikes me, given that you're such an amazing storyteller, that the right domain for you to be producing your ideas is in a podcast, not in books.

[00:38:30]

Well, the same thought had occurred to Pushkin, the podcasting company set up by Malcolm Gladwell and Jacob Weisberg. And so they asked me, about a year and a half ago, did I want to tell stories in a podcast? And I thought, yeah, I guess I kind of do. The podcast is called Cautionary Tales.

[00:38:51]

They're stories about things going wrong, and what is the social science behind that particular fiasco, tragedy, or hilarious mishap? Some of them are funny. Some of them are really not funny. But what's the lesson? What do the statistics tell us? What does the economics tell us, or the psychology? The new season of Cautionary Tales, season two, is coming. I'm very excited about it, because Helena Bonham Carter is playing Florence Nightingale and Jeffrey Wright is playing Martin Luther King.

[00:39:21]

I have written a script for Jeffrey Wright and for Helena Bonham Carter.

[00:39:25]

It's very exciting. Do you have advice for people who maybe aren't natural storytellers? I suppose the fundamental thing is that the ideal story has a protagonist. It's got somebody who is moving through the story, who is taking actions, and things are happening to them, and they're doing things in response. And that's the heart of a good story. If you don't know who the protagonist is in your story, then maybe keep thinking about the story.

[00:39:53]

I have to say, having given this advice, there are lots of stories I tell that don't have a clear protagonist, because I'm so interested in a particular academic idea or a particular course of events. And it's not always easy to follow this advice. But get yourself a protagonist, get yourself a central actor. I guess the other key piece of advice is that it's really nice if what you do at the beginning of telling the story foreshadows the end in a way that's not obvious, so that when you get to the end, you go, oh wow, I see how all of that fits together, but it only seems obvious with hindsight, rather than being a foregone conclusion.

[00:40:28]

Of course, you can cheat when you're writing a story because you know how it's going to end.

[00:40:32]

You can go back and you can tweak the beginning, so you can make it seem like a certain amount of magic. So those are the two pieces of advice I'd give. And the third one, of course, is: read good stories and think about why they're good.

[00:40:45]

It's interesting to me that Tim Harford simultaneously writes books and does podcasts.

[00:40:52]

For Dubner and me, it became crystal clear very early on in the life of the Freakonomics Radio podcast that podcasts were just a much more effective way for us to communicate our ideas. You probably noticed that we stopped writing books, and we don't have any plans to go back. Admittedly, though, one advantage books have is a long shelf life. People return to books over and over in a way I suspect they won't with podcasts. Perhaps for that reason, Dubner and I will one day regret our decision.

[00:41:19]

But right now we're having way too much fun podcasting to worry about it. People I Mostly Admire is part of the Freakonomics Radio Network and is produced by Freakonomics Radio and Stitcher. Morgan Levey is our producer and Dan Dzula is the engineer. Our staff also includes Alison Craiglow, Mark McClusky, Greg Rippin, and Emma Tyrrell.

[00:41:52]

All of the music you heard on the show was composed by Luis Guerra. To listen ad-free, subscribe to Stitcher Premium. We can be reached at pima@freakonomics.com. That's P-I-M-A at freakonomics dot com.

[00:42:07]

Thanks for listening. Can you imagine Mother Teresa knee-deep in data and statistics? Yeah, Mother Teresa with a calculator. Stitcher.