[00:00:00]

Today's episode of Rationally Speaking is sponsored by GiveWell, a nonprofit dedicated to finding outstanding charities and publishing their full analysis to help donors decide where to give. They do rigorous research to quantify how much good a given charity does: how many lives does it save, or how much does it reduce poverty, per dollar donated? You can read all about their research, or just check out their short list of top recommended evidence-based charities to maximize the amount of good that your donations can do.

[00:00:25]

It's free and available to everyone online. Check them out at GiveWell.org. Welcome to Rationally Speaking, the podcast where we explore the borderlands between reason and nonsense. I'm your host, Julia Galef, and I'm here with today's guest, Seth Stephens-Davidowitz. Seth trained as an economist, doing his Ph.D. in economics at Harvard. He worked for a while as a data scientist at Google, is a contributing op-ed writer for The New York Times, and very recently published a book titled Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are.

[00:01:14]

Seth, welcome to Rationally Speaking.

[00:01:16]

Thanks so much for having me, Julia.

[00:01:17]

One thing that really attracted me to your book, Seth, is, well, first of all, I just love clever experimental design and clever ways of tricking the world into giving up information to us, and your book falls squarely into that category. I have this whole folder of examples of clever experiments and clever studies on my computer that I happily added your work to. And more particularly, a big update that has been happening for me in the last few years is that we just can't trust people to honestly and accurately report how they're feeling, what they believe, and why they do the things they do.

[00:02:00]

If we want accurate answers to those questions, we kind of have to get cleverer and infer those answers from clever and original sources of data, like, for example, people's Google searches, which is where your research comes in. So my first question for you is just: how, and why, did you first start looking at sources of data like people's Google searches? What was interesting about that kind of data?

[00:02:31]

Yeah, so it started when I was in my Ph.D. program, and I was kind of a little lost and burnt out. I didn't really have a dissertation topic.

[00:02:42]

And then I found Google searches, and I just became kind of obsessed, because I suspected that people would tell Google things that they might not tell anyone else. You could kind of see what people really thought about various issues, or maybe get a more accurate view of people than by asking them. And so I just became obsessed with it, and I started doing this research on racism. That was kind of the first thing I was studying.

[00:03:12]

I was just shocked at how different the results were when you looked at Google searches compared to surveys, and Google searches seemed to be, in my opinion, more accurate. So that kind of started me down this whole path, which I've been following for five years or so.

[00:03:27]

Could you say a little bit more about why we can't trust self-reports, people's answers to survey questions about themselves?

[00:03:35]

So one issue is people lie to themselves a lot, or they forget things they did. They might be searching for racist jokes and doing a lot of bad things to African-Americans, but they don't like to think of themselves as racist. So that's one problem. And the second problem is that people have been shown to shade the truth in their answers in the direction of things that make them look good, for whatever reason.

[00:04:07]

Nobody knows exactly why. Maybe it's just a habit: people lie consistently in everyday life, always trying to make themselves look better, and that habit carries over to a survey.

[00:04:21]

Yeah. Even though they're anonymous on the survey. Right. I mean, I assume most of these surveys are not collecting the person's name and address and everything like that.

[00:04:29]

Yeah, they're anonymous, but it still sometimes feels a little weird to people. So even if it's anonymous, they still lie. And I think one big difference between surveys and Google is that a survey can never give you an incentive to tell the truth. It might not give you an incentive to lie, but there's no incentive to tell the truth. So people will just shade their answers in the direction of what will make them look good.

[00:04:55]

But Google, you have an incentive to get the information you need, right?

[00:05:00]

Right. Good point.

[00:05:02]

So if you're gay and you live in a place where it's hard to be gay, you don't have an incentive to tell a survey you're gay, but you do have an incentive to search for gay porn. That would be kind of a classic example. Or if you're deciding whether to vote in an election, you don't have an incentive to tell surveys whether or not you're going to vote.

[00:05:26]

But you do have an incentive to search Google for voting information or polling places if you're actually going to vote. Right.

[00:05:35]

You mentioned racism as one of the first topics you started investigating. And I can imagine it's a topic where looking at people's Google searches would add a special amount of value, relative to the standard social science methodology of asking people about their beliefs, because racism is this kind of socially charged, socially disapproved-of attitude. Or at least it was. At least...

[00:06:05]

You know, that's why I hesitated. To our listeners: we're recording this episode the week of the Nazi march, or rally, in Charlottesville. And so, going back over Seth's book, all the stuff about revealing America's latent or hidden racism was especially salient to me. So what jumped out at you as surprising from looking at the Google search data, compared to common wisdom or what other social science research had shown?

[00:06:43]

And actually, sorry, before you answer that, I suppose I should ask you: where did you get the Google search data? Like, I can't just go online and look at Google search data, can I? Yeah, you can.

[00:06:53]

So there's a tool called Google Trends where you can type in any search term, or any category of searches, and you can see where it's searched most frequently and when it's searched most frequently. Great. And it's complete?

[00:07:14]

The statistics are across the whole country, and they're not weighted by where you live or anything like that?

[00:07:20]

Yeah, it's the percent of total Google searches, which is, I think, the right way to do the analysis. Some things are a little weird, though. They make it kind of hard to understand sometimes, and they have a really high privacy threshold, which I figured out how to get around for a while, but now they block that.
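The normalization Seth mentions, reporting a term as a share of a region's total Google searches rather than as a raw count, is easy to sketch. A toy example with invented region names and counts:

```python
# Normalizing term counts as a share of each region's total searches,
# the "percent of total Google searches" idea. All numbers are invented.
regions = {
    # region: (searches for the term, total searches of any kind)
    "Region A": (1_200, 4_000_000),
    "Region B": (300, 500_000),
}

shares = {name: term / total for name, (term, total) in regions.items()}
for name, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {share:.4%} of all searches")
```

Region B ranks higher despite fewer raw searches, which is the point of the normalization: a big region shouldn't dominate just because it searches more overall.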

[00:07:47]

So it's kind of the Stephens-Davidowitz rule at Google.

[00:07:51]

I think, actually, I'm pretty sure that's the case. That's nice.

[00:07:56]

And then there are a couple of things from when I worked at Google. All my data is public, but at Google you can find better ways to find the ideas and then confirm them publicly. But in general, you can learn a lot from Google Trends.

[00:08:12]

OK, so back to my question about what surprised you about racism in America from this data.

[00:08:19]

OK, well, a lot of the things that surprised me when I was starting the research wouldn't surprise me now. But when I was starting, people said we lived in a post-racial society... What year did you start doing this research?

[00:08:34]

I think it was 2011 when I started. Obama had been elected, and everyone was saying, you know, we'd moved beyond a lot of the really nasty racism in our country's history. And when I started the research, I was just shocked by how frequently Americans were searching for the N-word. Not, like, the phrase "the N-word": the actual word, not in quotes.

[00:08:59]

Yeah, yeah. When I first saw how frequent it was: at the time I was looking, it was searched as frequently as "migraine" and "economist" and "Lakers" and "Daily Show." So not a rare search. And I thought, oh, rap lyrics, that's what's going on. But rap lyrics use the version that ends in -a. So it was basically jokes mocking and humiliating

[00:09:24]

African-Americans. That was the big theme of it. And then the other thing that was surprising was the location of the searches. I would have thought that racism would be predominantly concentrated in the Deep South, in Mississippi and Louisiana and Alabama, if you think of our country's history. And these were definitely among the places with the highest search volumes, but also among the highest were upstate New York and western Pennsylvania, eastern Ohio, industrial Michigan. The real divide was not so much north versus south, but east versus west.

[00:09:54]

And then it started predicting clear behavior. You see that Barack Obama, compared to other Democratic candidates, did far worse in places that made a lot of racist searches. It was a really, really strong predictor of where Obama underperformed in the 2008 and 2012 elections.

[00:10:15]

Interesting. And have you looked at that same correlation with how Trump performed, relative to Republican candidates in the past? Did that correlate with searches? Yeah.

[00:10:27]

So Nate Cohn, a data journalist at The New York Times, asked for my data, and he had data on Republican primary support, and he found that, basically, racist searches were the strongest predictor he could find of Trump support in the Republican primary. And Nate Silver found the same thing. It's a little harder to do the comparison to other Republicans in a general election, because it's compared to the previous election, where Obama was black. So there's a lot going on there.

[00:10:55]

But I think it's pretty clear that racism drove a lot of his support in the Republican primary.

[00:11:02]

And how confident can we be that that isn't just a result of, you know, areas that have declining industries, like in the Rust Belt? People there are disillusioned with the current economic regime, and those areas also happen to be racist, but that's not why they support Trump or dislike Democrats.

[00:11:26]

Yeah, so basically Nate Cohn and Nate Silver started controlling for all these other variables, and it was still the racism that was predicting Trump's support.

[00:11:38]

So even after controlling for things like average income or unemployment? Yeah, after controlling for demographics or economics or exposure to trade or anything else, a big predictor is racism. Right.
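The check described here, asking whether racist search volume still predicts vote share after holding income, unemployment, and other factors fixed, is a regression with controls. A minimal sketch on simulated data (all numbers are invented; this is not the actual analysis Cohn or Silver ran):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical number of media markets

racism = rng.normal(size=n)        # racist-search rate (standardized)
income = rng.normal(size=n)        # control: average income
unemployment = rng.normal(size=n)  # control: unemployment rate

# Simulate an outcome that depends on racism AND the controls
vote_share = (0.5 * racism + 0.3 * income - 0.2 * unemployment
              + rng.normal(scale=0.1, size=n))

# OLS with controls: regress the outcome on racism plus the controls
X = np.column_stack([np.ones(n), racism, income, unemployment])
coef, *_ = np.linalg.lstsq(X, vote_share, rcond=None)
print("coefficient on racism, controls included:", coef[1])
```

Because the controls sit in the design matrix, the coefficient on `racism` is its predictive contribution after accounting for them; here it recovers roughly the 0.5 used to simulate the data.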

[00:11:54]

How confident do you think we can be that we're interpreting people's Google searches correctly? You briefly mentioned considering other explanations for why people might have been searching for the N-word, like maybe it's rap lyrics, and you were able to pretty confidently rule that out because of the different spellings. But there are a bunch of really interesting findings that you report in your book, and I kept trying to ask myself: could there be other explanations for what people were searching for, other than the obvious, straightforward one?

[00:12:26]

Just to give an example, I don't think this is in your book, but for the sake of illustration: if someone searches for the phrase "symptoms of depression," it's kind of unclear whether that's them revealing to Google that they think they might be depressed, which might give us a window into actual rates of depression beyond the rate of people who actually seek out help for depression. Or a different possibility is that people have friends who are depressed, and they're worried about their friends and wondering if they should try to help them.

[00:12:59]

I don't know how often that comes up, but we could be jumping to a conclusion about why people are searching for those terms.

[00:13:08]

I think you never know why a particular individual makes a search, but in aggregate, it tells you a lot about patterns. I mean, I definitely made a lot of racist searches when I was writing my book.

[00:13:19]

I write about that myself. Typical. I mean, yeah, I've done that myself.

[00:13:23]

Like, I like to think I'm not racist. But yeah, it would be different if you got the racist search data back and the top places were, like, Cambridge, Massachusetts, and Princeton, New Jersey. You'd be like, wait, that's just professors doing research or something. But when it comes back as West Virginia and Louisiana and Pennsylvania and Michigan, I think it's a little more reliable.

[00:13:45]

And I think when we do get ground truth, you see over and over again that the Google search data correlates with real-world outcomes. So if you look at the parts of the country that search for "God" the most, it's almost perfectly correlated with religious belief: it's the Bible Belt and other areas with high rates of religious faith. And I think what's interesting about that is it doesn't mean that everybody who makes a search with "God" believes in God. You could search "God, prove that God does not exist."

[00:14:17]

Actually, I think one of the top searches with "God" is "God of War," a video game, which would stick out in the data. But that's like five or six percent of searches, and that's kind of swamped by all the other reasons you search for "God": you're looking for God quotes, or Church of God, or whatever. And so it correlates so strongly, I think. With health, once again, when we actually have ground truth, when we have CDC data, the Google searches correlate very strongly with the actual health conditions.
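The ground-truth check Seth describes, correlating regional search rates with an external measure like survey-reported religiosity or CDC health data, is just a correlation across regions. A toy sketch, with all numbers invented for illustration:

```python
import numpy as np

# Hypothetical per-state figures (invented numbers, for illustration only)
god_search_share = np.array([2.1, 3.8, 1.2, 4.0, 2.9, 1.5, 3.3, 0.9])  # % of searches
survey_religiosity = np.array([40, 68, 25, 72, 55, 30, 60, 20])        # % "very religious"

# Pearson correlation between the search proxy and the ground-truth measure
r = np.corrcoef(god_search_share, survey_religiosity)[0, 1]
print("correlation:", round(r, 2))
```

A strong correlation across regions is what justifies treating the search rate as a proxy, even though any individual search can have other motives.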

[00:14:52]

Also, if your friend has depression and they live near you, then maybe the search was still telling us something right, at the regional level. Although I guess it would still undermine efforts to find correlations, like between people who search for fashion magazines and then search for symptoms of depression or something. We couldn't assume it's the same person.

[00:15:09]

Yes, at the individual level that would be problematic, but at the community level that would still frequently work. Right.

[00:15:15]

So, to make sure I understand: the two main defenses you're giving of being able to confidently interpret Google search terms are, first, that when we are able to check the prevalence of a search term against some objective measure, it tends to show that the search term indeed represents what we thought it represented. And second, that the patterns we find in these search terms match what we already expected, like searches...

[00:15:44]

Oh, well, then it wouldn't be interesting if you totally already expected it. That's kind of what I was going to say. Yeah, but it's not that it's as expected. It's that there's not another explanation. Like, if you saw health-condition searches and they always correlated with, say, hospitals, you might think, oh, all the people making health-condition searches are doctors. That would change how you think of the research.

[00:16:08]

Or if racist searches were all in college towns, you could say, oh, it's professors. But that doesn't happen. So that kind of gives you some confidence. And then, with the racism thing, the fact that it correlates with where Obama did worse is, I think, another proof that it means something.

[00:16:26]

So, one example you actually do talk about in the book is comparing the rates of people searching for the phrase "I regret having children" versus "I regret not having children." And if I'm remembering correctly, it's much more common for people to search "I regret having children" than "I regret not having children," even controlling for how many people have children. Oh, interesting.

[00:16:51]

I actually hadn't read or noticed that, so that's good to know. But, so, again, I don't know how common we should expect this to be. I don't have children, but I'm pretty sure I have searched for phrases like "I regret having children," just because I was curious. I wanted to look at examples of other people who had made the choice about whether or not to have children

[00:17:15]

and were talking about whether they regretted it. And I suppose you could say, well, I wouldn't expect lots of people to be doing that. But I don't know, it's not obvious to me that you shouldn't expect that.

[00:17:26]

Well, for that one, what I thought was interesting is the questions people don't ask. People don't ask, "Will I regret having children?", which may be the way you'd phrase it if you were trying to decide whether to have children. But they do say afterwards that they regret having children. So I don't make too much of that one, because it's kind of an extreme statement and not that many people make it.

[00:17:56]

But I think it is just kind of interesting, and it fits the idea that a lot of people do tell Google things that they might not tell other people.

[00:18:02]

Yeah, it is also interesting, and you point this out in the book, how many people use Google as kind of a confessional. So they're not searching for phrases like "symptoms of depression." They're searching for things like "I am depressed."

[00:18:15]

Which, I agree, is harder to interpret as something other than a statement about the person. Yeah, it's kind of weird. I'm not totally sure what to make of it. I think maybe because people tell Google things that they don't tell other people, they're just in a habit of kind of confiding in Google. They just start typing sentences. But, yeah, "I regret having children" is such a weird thing to type into Google.

[00:18:39]

But does anyone type in, "Dear Google, I regret having children"?

[00:18:45]

No. People do search for "Google" a lot, though, which is kind of weird.

[00:18:48]

I bet I've done that, though. I think that's probably just, you know, muscle memory. Like, I'm not paying attention, and I've forgotten that I'm actually on Google, and I'm typing as if I'm typing something in the URL window, or, I don't know. Yeah.

[00:19:04]

I mean, what do you actually think you would type in if you were trying to decide about having children? Do you think you'd type "I regret having children," versus, like, "do people regret having children" or "are children a good decision"? Yeah, I don't think I'd do a full sentence. You're basically saying it wouldn't start with "I." Right. But yeah, I agree, the natural, straightforward explanation definitely feels more common-sense.

[00:19:35]

I just keep getting burned by psychology, social psychology, where the common-sense assumption about what's happening just doesn't get borne out by the data. And so I've been trying to cultivate this extra layer of wariness about assuming that my interpretation must be correct just because it sounds right. Yeah, yeah.

[00:19:56]

Also, Google search data can be as good as we want it to be. How so? Because you could ultimately follow individuals over time. I just have aggregate anonymous data, but you could know people's Internet behavior over time, and then you'd know if they actually had children. OK, when you say "you could"...

[00:20:16]

You mean Google, in theory. I mean that data exists; it's just not made available.

[00:20:21]

Right. With depression, you could probably figure out whether someone's actually making a search about themselves, if you had all that information on their online activity. Sometimes people search "panic attacks" at, like, 3:00 a.m., and at that point I think you're pretty sure that person's actually having a panic attack. Even if not everybody searching for "panic attack" is having a panic attack, there definitely can be clues, if you dug deeper into the data, that could tell you with more confidence what's actually causing a search.

[00:20:56]

So you talked about a kind of validation of the Google search results by comparing them against aggregate outcomes, for example, comparing search terms for a disease against rates of the disease. Is there any way to validate Google search terms as a metric at the individual level? Like, it seems pretty intuitive that people who search for the N-word online, or for N-word jokes, are more likely to actually be racist.

[00:21:24]

But is there any way that we could, in the future? I understand your research is pretty new, but some way that we could theoretically check whether people who search for those things are more likely to behave in a racist way, or more likely to judge resumes differently if the candidate is black versus white, even if it's the same resume, things like that? Does that seem worth doing? Not at the individual level yet, although maybe someone will. At this point you'd have to get people to volunteer to give you their search data and then do some experiments based on that.

[00:21:59]

I guess you could do that. But people already are comparing the aggregate data on racism to various offline behaviors. The Obama voting pattern was one. Then there's the Trump support. And recently, people have found that places with larger black-white wage gaps have higher racism, and that seems to survive a lot of controls. So it does seem like, at least in aggregate, this is predicting something pretty important.

[00:22:26]

Yeah. One interesting thing that you can do with this kind of data, which you might have mentioned briefly but we haven't really talked about yet, is the temporal component. You can look at how searches change in response to certain events, or over time in certain areas. And one cool example you talk about in the book is the effect of Obama's speech after a terrorist attack. Well, I could explain it, but why don't you talk a little bit about Obama's speech and what you were able to learn from people's searches during and after it?

[00:22:59]

Yeah, so this was with Evan Soltas, who I say is a scholar at Princeton, but he's actually, like, a junior. He's a total prodigy. This may be the first time you hear his name, but you're going to be hearing it in the future.

[00:23:14]

Good to know. That name hasn't come up before.

[00:23:18]

We were doing this research on Islamophobia. And not just Islamophobia, but, like, rage against Muslims. You kind of see there are a lot of maniacs who make searches like "I hate Muslims" or "kill Muslims." But it's such a weird thing to type into Google, just "I hate Muslims." Like, what do you expect Google to give you?

[00:23:44]

Yeah, I think it goes with this idea that when people are emotionally charged, they make these kinds of statements where it's not clear what they're hoping to get from Google. And again, these guys, guys and gals, I guess, are not exactly totally sane.

[00:24:05]

So they're kind of just expressing some sort of rage. In that moment, I guess, if not mentally ill.

[00:24:13]

Yeah. And you actually see that. You also see these searches are highest at, like, three a.m., which kind of gives you a sense of what these people are maybe like: they're not sleeping. But then, even though they are these weird searches, like, what does that even mean, they actually predict very strongly hate crimes against Muslims. Where these searches are high, there tend to be more hate crimes committed against Muslims.

[00:24:40]

But then we were analyzing these searches after the San Bernardino attack, and after the attack there was an explosion of these searches. The top search with the word "Muslim" was "kill Muslims." Wow. And a few days afterwards, Obama gave this speech where he was kind of trying to calm down this mob. I think a lot of people realized that something had gotten out of control in Americans' attitudes toward Muslims. And he gave this speech that was nationally televised and got a lot of attention.

[00:25:13]

And it was kind of a beautiful speech, kind of classic Obama. It got rave reviews from all the serious sources, from The New York Times, the L.A. Times, and Newsweek, saying how beautiful the speech was. And he talked about how it's our responsibility as Americans to not give in to this fear, to not judge people based on their religion. So Evan and I were actually working on a New York Times column on this topic during that week when Obama gave the speech.

[00:25:41]

And we were like, oh, let's see if this helped calm things down. Our guess was that it probably would have, because we thought it was a great speech and everyone else seemed to think it was a great speech. So we went to the data, and we asked: was there a big drop in these really nasty searches about Muslims during and after Obama's speech? And you see that not only did the searches not drop, they didn't stay the same.

[00:26:03]

They basically shot up, and stayed up for a reasonably long period afterwards. More searches for "kill Muslims" and "I hate Muslims" and "no Syrian refugees" and "Muslims are evil." So it seemed like everything Obama was doing was actually backfiring.

[00:26:19]

But then at the end of the speech, Obama gives this one line that seemed to have a different effect, where he said that we have to remember that Muslim Americans are our friends and neighbors, they're our sports heroes, and they're the men and women who will die for our country. And in basically the 30 seconds after he said this, you see for the first time that the top descriptor of Muslims on Google was not "Muslim terrorists" or "Muslim extremists" or "Muslim refugees."

[00:26:49]

It was "Muslim athletes," followed by "Muslim soldiers." And they kept the top two spots for many days afterwards. What Evan and I suggested in our Times piece was that, basically, if you want to change the minds of an angry mob, you don't want to lecture them about things they've been told a thousand times and tell them what they should do and what their responsibility is. Maybe instead you provoke curiosity, change how they think about this group that's causing them so much rage.

[00:27:25]

And then we published this in The New York Times. I think it's not crazy, when you write a New York Times column, to think that people in high places read it, perhaps in Obama's administration, because a couple of weeks later Obama gave another speech, in a Baltimore mosque. Again, it got a lot of attention; again, it was on national TV. But this time he basically stopped with all the lectures and the sermon and doubled down on the curiosity strategy, where he talked about how Muslim Americans built the skyscrapers of Chicago and how Thomas Jefferson had a copy of the Koran.

[00:27:57]

And then you see, after this speech, most of these anti-Muslim searches actually dropped. Obviously, I'm not going to say that from two speeches we learned how to end hatred. But I think it is suggestive that we could use some of this data to turn something seemingly as unscientific as how to calm an angry mob into a real science.
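The before-and-after comparison around a speech can be sketched as a simple event study on a search-rate time series. Everything below is simulated (hypothetical hourly rates, with the "event" at a fixed hour); the real analysis used actual search volumes:

```python
import numpy as np

# Hypothetical hourly rate of a nasty search (per million searches),
# invented numbers for illustration; the "speech" happens at hour 48.
rng = np.random.default_rng(1)
before = rng.poisson(lam=5, size=48)  # baseline rate before the event
after = rng.poisson(lam=9, size=48)   # elevated rate after the event

# Simple event-study comparison: mean rate before vs. after the event
change = after.mean() - before.mean()
print(f"before={before.mean():.1f}/M, after={after.mean():.1f}/M, change={change:+.1f}")
```

A real analysis would also check that the jump lines up with the event's timestamp rather than a pre-existing trend, but the core comparison is this before/after split.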

[00:28:25]

That is really interesting. Is there any way to tell, from the data that we have, which of two stories is right? One is that people who hadn't previously been angry at Muslims were made angry by Obama's speech, the first part of his first speech anyway, and they started searching for "kill Muslims." Or you could tell the story that the same people who were angry at Muslims are angry at them still.

[00:28:53]

And Obama's speech didn't help; he just reminded them that they dislike Muslims, so they're doing more searches. So is there any way to tell whether it's more searches from the same people, the same IP addresses, or searches across a broader range of people? In theory, yes, because that data exists, just not in the data that's publicly available now. You could look at areas, which we haven't done. Since there is some very significant variation in how frequently these searches are made in different areas, you could ask: is the increase in areas that didn't previously have these attitudes, versus areas that already had these attitudes in big numbers? But you couldn't, at least for now, do it at the individual level.

[00:29:36]

I'm kind of hoping that, because of the power of this data, there will be more support for giving out some of this data at the individual level while protecting people's anonymity, because I do think it is really, really powerful, to tell the truth.

[00:29:56]

You're secretly hoping that some hacker is just going to release a ton of search data that you can then use for research, right?

[00:30:03]

I am. And partly also because it'll be embarrassing for everybody else, but it wouldn't be embarrassing for me, because I could credibly say that every search was for research purposes.

[00:30:12]

That's so true. Oh, man, I should totally write a book about weird sexual preferences so I'm covered for life.

[00:30:22]

Yeah, well, actually, I have used that kind of data a little bit in my research, and I talk about it at one point in the book. AOL released their data, anonymized, to researchers, and it was a huge disaster, which is why...

[00:30:38]

Because it could be de-anonymized, right?

[00:30:40]

Yeah. They just didn't think twice; some mid-level employee just gave it to researchers, like, "Here, this sounds interesting, right?" And then someone would search, like, their address and their name, and then, like, "herpes symptoms" or something, and people were figuring all this out. So now they're, I think rightfully, very cautious.

[00:31:03]

But I think there are ways to do this that would protect people's anonymity but still help with the research.

[00:31:11]

And maybe Google could hire some really good hackers to try to de-anonymize the data, and only when the hackers failed would they release it to researchers. Oh, but they'd have to kill the hackers afterwards.

[00:31:24]

I guess they do that with a lot of their products. That's how they test their products.

[00:31:30]

I was going to say — oh, just one last thing on people's reactions to Obama's speech, and people's reactions to different strategies to get them to be less racist or less hateful. The thing that motivated my question about whether it's the same people or a broader base of people doing racist searches is this worry, or intuition, this hunch: people tend to look at the most extreme members of society, and what happens with them, to determine what the right outreach strategy or the right rhetorical approach is.

[00:32:14]

But it might just be the case that the strategy that works best for most of the population works worst for the most extreme group of people. So, for example, it might be that the majority of the population is shamed into trying not to be racist by speeches like this — not necessarily because they like Obama and want to live up to his ideals, but because they think Obama represents society's ideal and they don't want to be judged a bad person by society.

[00:32:49]

And then there's the small minority of people who react really strongly against that. They're maybe the people who visit 4chan, and they're going to react really negatively and do the opposite of what prestigious societal leaders tell them to do. And we just can't have it both ways. We have to choose whether we want to try to de-radicalize the extremists or to shame the majority of society into not being racist, that kind of thing.

[00:33:22]

That's the way it might be. I mean, I think, for the hate crime thing — with what was going on after the San Bernardino attacks, and what Muslim Americans have been experiencing after terrorist attacks — the concern probably was more with the extreme fringe members, the people who tend to shoot Muslims or attack mosques and terrify Muslims. So I think that was more the goal of Obama's speech at that point.

[00:33:49]

But, yeah, I don't know. It's certainly possible. I think in general that's right — that is something we probably don't do enough of in research. We usually like to test whether something is effective or not effective on average, and clearly things that are effective for one group can backfire for another group.

[00:34:10]

Right. Do you think it's good on net for it to become common knowledge just how frequently people search for these things that are generally considered shameful or secret — like secret prejudices, or secret fears about their body, things like that?

[00:34:35]

I don't know, honestly. I think with the secret body stuff — the secret insecurities — it's generally good, because when people learn that they're not alone in their insecurity... I talked a lot about men's bodily insecurity and their focus on man boobs or whatever, which is kind of amusing, but it's actually a serious issue.

[00:34:59]

And right, there are a lot of serious issues I talk about in the book where I think making people aware of how common these fears are will make them feel less alone. But you could say the racism thing is like: why should I feel so bad about my racism when there are all these other racists out there? So...

[00:35:22]

Yeah, so that — well, I was being a little tricksy by lumping those two examples together as if they were examples of the same thing. But I feel exactly the same way about people's insecurities or weird sexual peccadilloes or whatever. It seems good for it to be known that, you know, you feel weird but you're not actually that weird.

[00:35:41]

But when it comes to racism, or other socially damaging attitudes, I'm really torn. Because on the one hand — even the sexual stuff, it's not clear. We have this idea that there's something good about a large number of people having a trait: if five percent of men are gay, then men shouldn't be embarrassed about being gay. But if you have a sexual preference that one in a hundred thousand people have, I don't think that necessarily means you should be embarrassed of it.

[00:36:11]

And that's definitely something this data would also reveal — that things you maybe thought were more common aren't. Oh, I see.

[00:36:16]

It could go the other way, right? That's right. Yeah. Everyone always assumes that everything I research, I'm talking about myself. But everything I discovered was not me at all — almost all my insecurities, it turns out, are just totally weird and not that common.

[00:36:32]

Oh wow. Well, so about the racism one — the way I've been thinking about it recently, even before reading your book, just looking at Trump and the way the alt-right has become so much more mainstream than it was five years ago: it doesn't seem to me necessarily that the country is becoming more racist, but it does seem that Trump, and the whole discourse around Trump, is creating common knowledge about the racism of this country.

[00:37:07]

And I've just been really torn about whether that's good or bad. On the one hand, you could tell a story where it's good: at least now our country knows what we're up against. Maybe now people will take us seriously when we say it's important to fight racism, because the problem has become more explicit. But on the other hand, you could tell a story where it's bad. OK, let's say racism is really common.

[00:37:33]

A lot of people, maybe even a majority, are actually racist and search for N-word jokes online. They're still not necessarily going to feel emboldened to push publicly, as a group, for racist policies if they don't realize that a lot of other people are also racist. And even if each individual person knew that racism was common, they still wouldn't be emboldened unless they knew that other people also knew it was common.

[00:38:04]

It's when you get to this third stage of common knowledge — where each individual racist person knows that everyone else knows that racism is common — that's the spark that makes those individual people feel they'll be totally fine and protected if they start pushing these attitudes in public. That feels like the stage we're getting to with Trump. And it also feels like the kind of thing that — apologies — research like yours could help create, in a sense.

[00:38:35]

I mean, I'm going to be honest: I'm just really interested in things, and I don't think about, like, the third level of common knowledge.

[00:38:42]

No, I feel you, man. And I'm not pushing for censorship at all. I know.

[00:38:47]

I also think — yeah, it does get hard once you start having those questions in your mind. It becomes really hard to even do research. But I do think anti-Semitism is another one where it's not even that anti-Semites don't know there are other anti-Semites out there. I've done a lot of research on Stormfront.

[00:39:16]

Yeah. And a lot of this white nationalist movement — these tend to be younger people, more frequently men, who are just kind of unhappy with their lives and looking for something to latch on to. And I think literally just hearing the word "Stormfront," or hearing a phrase like "Jews create problems in society," can allow that to fill the vacuum instead of something else.

[00:39:46]

So I think that is why.

[00:39:47]

Oh, I see. Because you're finding that the risk is not just the thing I was describing — people who are already racist feeling emboldened to be public about it or push for racist policies publicly. You're saying people who are at a tipping point, who could go either way: if these views are more common or mainstream, or they're more likely to encounter them, that could tip them into being racist when they otherwise wouldn't?

[00:40:11]

Well, yeah. I think, with anti-Semitism in the United States — at least two years ago, it just wasn't on people's minds. I don't think many people were hearing these conspiracy theories, and now they are more. And you actually do see it in the Google search data: Steve Bannon gets in the news, and you see people in Montana searching for Steve Bannon, and then they're searching for white nationalism, Stormfront, and then they're searching for "Jews are evil."

[00:40:34]

Oh, my God. It's like, whoa, there they go. Yeah.

[00:40:38]

I think it just wasn't on their minds at all, and now, by being told about it, it is. So yeah, this is obviously all incredibly explosive and dangerous — what changes cause people to become racist and anti-Semitic? There aren't necessarily easy answers. I do think we can use this data to research and understand better what causes it. But maybe I should save my research for academic journals that nobody reads, and just write really boring, really dry articles.

[00:41:13]

Yeah. Just the last thing on that point. You're probably more familiar with these studies than I am, but I know there's a whole body of research in, I think, behavioral economics, studying how social proof works as an influence tactic. And what they find is that it often backfires. When we try to urge people to stop some behavior, the way we intuitively phrase it is — well, take a simple example.

[00:41:42]

"Everyone's been failing to rinse their dishes off before they put them in the dishwasher. Please stop that, because then the dishwasher can't clean them." And you might think that such a statement would cause people to rinse their dishes off before they put them in the dishwasher. But actually it backfires, because the part people focus on is: oh, everyone's doing this.

[00:42:04]

Everyone's not rinsing their dishes off. Well, then I feel like it's OK for me not to do that — I might as well not rinse, because apparently this is a more common thing than I realized, so I don't have to feel so guilty about it like I used to. Yeah.

[00:42:18]

So anyway, it's really interesting.

[00:42:21]

I think we definitely can use this data to understand all these effects more — like social proof and backfiring. But yeah, as far as making the best ideas available to the masses, it's obviously complicated what helps and what hurts.

[00:42:38]

Yeah. Anyway, I'll leave you alone on this topic. I'm sympathetic — it's a tricky question, and I don't favor censorship, so I'm not telling you to stop doing your research. Let's instead talk about political polarization. This was another really interesting part for me, because I've read a lot about the filter bubble theory: that the Internet and social media are making us all more politically polarized because, for one thing, there are just so many different sources we could choose to consume on the Internet.

[00:43:10]

And so, because of confirmation bias and motivated reasoning, we have more opportunity to seek out the sources that agree with us and validate our pre-existing beliefs than we used to. And then secondly, tech companies like Google or Facebook are making the problem worse by customizing our search results or our Facebook news feeds with an eye to results that we're going to like, click on, and read — and that's going to be the stuff that supports what we already believe.

[00:43:40]

So they're making the confirmation bias — the filter bubble problem — even worse. You talk about some research in your book that contradicts this theory. How does that go?

[00:43:51]

Yes, but that's not my research. It's Jesse Shapiro and Matt Gentzkow. They've basically been studying exposure to different political views offline and online: how frequently do you watch news with different viewpoints, how frequently do you have friends with different viewpoints, work colleagues with different viewpoints, and how frequently are you exposed to different viewpoints online? And they find that, contrary to this conventional wisdom, you're actually more likely to be exposed to different viewpoints online than offline.

[00:44:32]

There are a couple of reasons for this. One is that despite this idea that there's a long tail — all these random websites where you can find whatever information you want — the vast majority of people get most of their news from only a few sources: Yahoo News, AOL News still maybe, a couple of others. Yahoo News is one of the big ones? I think that's another thing that shows I'm in a bubble, because I had no idea people used Yahoo News.

[00:44:58]

Yeah, all this research on the Internet just tells me how much of a bubble I'm in, on all dimensions. Yeah. So there are, like, four main sites that everyone goes to. And the other reason is that social media — which is considered the biggest cause of this filter bubble, the idea that people just talk to their friends — is not quite as extreme as people think. On Facebook, at least; it might be different on Twitter, which they haven't studied yet.

[00:45:34]

But on Facebook, you tend to associate with your weak ties. You're not just friends on Facebook with the people you hang out with offline — you're also friends with people you haven't talked to in years, high school acquaintances you haven't talked to in twenty years. And these people are much more likely to have different viewpoints than you do. And I think one of the big points of the research is that we experience filter bubbles offline: a huge number of the people we come across in our offline lives share our political views.

[00:46:15]

Our friends and family share our political views; our colleagues share our political views.

[00:46:21]

And that's been going up over the years, hasn't it?

[00:46:23]

Yeah, it has. Our neighbors share our political views. So even though it's true that online we tend to associate more with people who share our political views than with people who don't, it's not more extreme than the phenomenon offline. And they also found recently that the biggest increase in political polarization has been among elderly people, who are the least likely to use social media and the least likely to use the Internet frequently.

[00:46:51]

So it doesn't seem to be caused by the Internet.

[00:46:54]

That is definitely interesting, and it throws a wrench in my common-sense model of the world. I almost wonder if we need a two-factor model: use of social media on the one hand, and consumption of, say, talk radio on the other. And maybe the consumption of talk radio explains the polarization of older people — I don't know, I'm just spitballing here. But it's not obvious to me that we should just go with the fact that older people use social media less and are more polarized.

[00:47:32]

Assuming that is true, it doesn't definitively refute the theory that social media — all else equal, controlling for talk radio — could be making things worse. Yeah.

[00:47:44]

Yeah, yeah. There's a lot going on. But I think the other point — that in general there's huge polarization offline — I mean, yeah, that is true, and it's getting worse.

[00:47:57]

And so it's got to be part of the explanation. Yeah. But OK, here I'm not talking about research, just my own anecdotal observations and impressions. One thing social media seems to be worsening is — so, yes, in real life I'm sometimes, not super often, but sometimes going to encounter people who have different political opinions from me. But our interactions are probably going to be kind of polite and friendly, because in general people in real life are polite and friendly to each other.

[00:48:31]

But online, people's interactions with other people with differing opinions are much less likely to be polite and friendly, because the Internet emboldens us not to be — we don't really feel like those other people are people. And so it seems quite plausible to me that there's something about the anonymity of the Internet, and those weak ties saying things we disagree with, that could be making polarization worse — making people feel antagonized by, and antagonistic towards, people with differing views in a way they didn't previously.

[00:49:08]

Does that contradict the data? No, I haven't seen data on it. Although, well, Facebook won't be anonymous. It would be more anonymous if you just randomly came across someone you're never going to see again.

[00:49:20]

Yeah, OK. So anonymous isn't quite the right word. But still, I think even on Facebook, with people whose names you can see, people whose identities you know, something about the Internet medium makes people more likely to be rude, I've found. That's possible, I don't know. I mean, I've definitely been in in-person situations that have been pretty rude.

[00:49:46]

I remember in high school, which was the last time I was around a lot of people with opposing political views — yeah, for me, it was pretty wild and rude and obnoxious, and there was screaming.

[00:49:58]

Maybe that's my bubble, I don't know — politeness.

[00:50:00]

But yeah, I think it definitely does seem like the political conversation on Facebook is unusually hostile. But I don't know. Well, one other thing I wanted to make sure I ask you before we have to wrap up is a problem with research of any kind, really, but especially with big data: the problem of data mining. Right? You report all of these examples of fascinating correlations, and often there's a very plausible story for the correlation.

[00:50:42]

And it's hard to come up with other plausible stories. But still, in the back of my mind as I'm reading these findings is: how many other correlations did Seth, or the researchers, look for? If you find, for example — this is just a made-up example — that people who read fashion magazines online are also especially likely to search for the phrase "I'm fat"...

[00:51:10]

That could be presented as evidence that reading fashion magazines makes people self-conscious or insecure. But it could also be the case that the researchers searched for a lot of other correlations — like the phrase "I'm overweight," or "I'm ugly," or "my hair is ugly," something like that — and maybe none of those correlations were there. But you find one for the phrase "I'm fat" and present...

[00:51:35]

...that as evidence of this effect. That's always a problem in research, but it seems especially acute here, because there are so many different things you could look for in this data, and in big data in general. Is this a problem you try to mitigate? Do you preregister your studies in any way, or keep track of the correlations you search for, anything like that?

[00:51:57]

I definitely try to — I agree, it's a huge problem. So the things I presented in the book are things that I thought survived that concern. Like, I talked about how the top question women have about their husbands — if you take the phrase "is my husband," the top way to complete it is "is my husband gay." Well, it would be a problem if, when they phrase it other ways — like maybe when they're asking whether he's depressed, they search "signs husband is depressed" — that turned out to be much more common than "signs husband is gay."

[00:52:28]

But if you actually add together all the different variants — you know, it's not exactly the numbers I put, eight times more likely; maybe if you put them all together it's 7.86 times more likely — it's not like that one didn't hold up. Or, like, I talked about how the top search for "my husband wants" in India is "my husband wants me to breastfeed him."

[00:52:49]

Yeah, that one really threw me for a loop. Yeah, it really throws people for a loop, and it was pretty wild, especially since nobody talks about it — nobody acknowledges it even after this research came out. But that one could be, you know — is that just one way to phrase it? But if you do all kinds of variants, it's the same thing. Like, if you look at "tips on breastfeeding"...

[00:53:12]

In pretty much all countries in the world, not surprisingly, like ninety-nine percent of searches on breastfeeding tips are tips to breastfeed a child. But in India, they're about equally split between tips to breastfeed a child and tips to breastfeed the husband. Oh, my God, I don't like that. So, like, I definitely like to think that the things behind my research are sorted through. And the other thing Google does is they have categories now.

[00:53:37]

So if you're looking up anxiety, they'll categorize a whole bunch of anxiety-related searches — "anxiety symptoms," "anxiety health," "anxiety what do I do," "anxious," whatever — and they'll put them all together in a basket of anxiety.
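A toy sketch of that aggregation idea in Python — summing all phrasings of a concept before comparing, so a single lucky phrasing can't drive the result. The phrase lists and counts here are invented for illustration, not actual search volumes.

```python
# Hypothetical variant lists for two concepts. In real work these would be
# curated phrase sets (or a category provided by the search engine itself).
variants = {
    "gay": ["is my husband gay", "signs husband is gay", "husband gay"],
    "depressed": ["is my husband depressed", "signs husband is depressed"],
}

# Invented per-phrase search counts.
counts = {
    "is my husband gay": 100,
    "signs husband is gay": 20,
    "husband gay": 30,
    "is my husband depressed": 10,
    "signs husband is depressed": 15,
}

# Sum every variant of each concept into one "basket" before comparing.
totals = {
    concept: sum(counts.get(phrase, 0) for phrase in phrases)
    for concept, phrases in variants.items()
}

ratio = totals["gay"] / totals["depressed"]
print(totals)           # {'gay': 150, 'depressed': 25}
print(f"{ratio:.1f}x")  # 6.0x
```

Comparing baskets rather than single phrasings is the same defense against cherry-picking Seth describes: the headline ratio changes a little, but the finding either survives the aggregation or it doesn't.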

[00:53:55]

I think that helps. Though it sounds terrifying — a basket of anxiety.

[00:54:00]

It is terrifying. But it does help with the cherry-picking, I think, because they've done all the work behind the scenes to make one big category. But it's definitely an issue. I think in general you want things that work out of sample and hold up for other researchers. So, yeah, definitely — when you're reading something that's a first pass, that hasn't been reproduced a hundred times...

[00:54:29]

I think some skepticism is definitely warranted.

[00:54:32]

Hey, maybe you could make your search history available — the search terms — so that people could see what you searched for and what different combinations you tried, and so on and so forth.

[00:54:43]

Yeah, and they'll see what a liar and cherry-picker I am.

[00:54:47]

And you could just say, well, you know, everybody lies — it's right there in the title. I told you so. All right, well, we're just about out of time. So before we close, Seth, I want to invite you to give the Rationally Speaking pick of the episode: a book or blog or website or something that has influenced your thinking in some way. What's your pick for this episode?

[00:55:08]

Well, actually, I had another point about this pick, because, as you know, I'm all about meta points. An hour, an hour and a half before our conversation, you asked me to tell you about something that's improved my thinking, and I looked through my Kindle. And I realized that all the things at the top, the things I've been reading recently that have hugely influenced me, are all books I'm extremely embarrassed about and would never announce to the world I own.

[00:55:37]

Which kind of goes to the point of Everybody Lies, right? So I had to scroll through all these cheesy self-help things.

[00:55:43]

So we should really all share our Kindles and our Netflix watching history and everything, right?

[00:55:50]

I think that might be more useful than asking for a pick, which inevitably gets me to find something that maybe people haven't heard of, something that makes me sound like I know something other people don't — intellectual or whatever.

[00:56:03]

So I'll just have to ask my guests on the air to open up their Kindle and, like, randomly pick a book and tell me what's on it. Anyway, sorry — what was your filtered, you know, respectable pick? I'll go with Steven Pinker's The Better Angels of Our Nature:

[00:56:21]

Why Violence Has Declined. I know everybody knows this book, but the reason I'm recommending it is that I hadn't actually read it — I just knew about it. I knew the argument, and I didn't really see anybody push back against it. So I thought, OK: the media exaggerates, the media loves to talk about individual crimes, they make us think we have this huge crime problem, but really violence goes down over time.

[00:56:45]

I just thought that two-sentence summary was enough, all I needed to know. But recently I read the whole book, and I was kind of blown away by how thoughtful and how smart it is. The whole idea of the civilizing process — that people have an inherent tendency to just follow their impulses, and you basically have to stop people from doing that in all dimensions —

[00:57:16]

that's kind of how we got violence down. And one of the reasons — I don't know if I totally believe it — one of the reasons Pinker thinks crime rose in the 1960s was that the hippies kind of rejected this whole idea that you have to not follow your instincts. So it's a very smart and very interesting book, worth the 700-page read, even if you know the two-sentence summary.

[00:57:41]

So you updated about human nature and civilization from the book, I imagine. Did you also update about your ability to understand books based on their summaries in the media — or is it just Pinker?

[00:57:54]

Is he an exception? In general, I think — you know, I've kind of read all his books recently, and I've always had the response that they're a lot better than I realized, because at first I either got the summary or skimmed them myself. Then more recently, I've read them more seriously and been impressed by them. But I don't know if that's a general principle.

[00:58:19]

I mean, one general principle I've been noticing is how often people will claim "so-and-so wrote such-and-such in his book," and they'll say something offensive — but the actual book makes a much more nuanced point that explicitly disavows the offensive interpretation. And it becomes common knowledge because very few people actually read the book; they just repeat what they heard other people say was in the book. Anyway — yeah, that's a common phenomenon.

[00:58:45]

Anyway, excellent. We'll link to The Better Angels of Our Nature, as well as to your own book, Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are. Thanks so much for joining us. It's been a pleasure.

[00:59:00]

Yeah. Thanks so much. This concludes another episode of Rationally Speaking. Join us next time for more explorations on the borderlands between reason and nonsense.