Transcribe your podcast
[00:00:00]

When you need your bank, Capital One is right in the palm of your hand, so you can check your balance, deposit checks, pay bills and transfer money from your phone with a top-rated app. And when you're done banking, put it back in your pocket. A banking experience built around you and your life. This is banking reimagined. Get started online any time. What's in your wallet? Capital One, N.A., member FDIC.

[00:00:27]

Hi, this is Melanne Verveer, and this is Kim Mazzarelli, and we're co-hosts of Seneca's Conversations on Power and Purpose, brought to you by the Seneca Women Podcast Network and iHeartRadio. We're launching a brand-new season of this podcast, which brings you fascinating conversations with leaders like two-time gold medalist, author and activist Abby Wambach and actor, producer and entrepreneur Justin Baldoni, among many others. Listen to Seneca's Conversations on Power and Purpose on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.

[00:01:02]

Welcome to Stuff You Should Know, a production of iHeartRadio's HowStuffWorks. Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. "Chuck" Bryant over there, and we've got Jerry around here somewhere. This is Stuff You Should Know, off to a great start. She's in her 80s. She is.

[00:01:28]

She's got this remote thing going on. Yeah, it's like the COVID special.

[00:01:35]

That's right. And this has been one I've been wanting to do since 2016. You know, it seems like the fire kind of went down on it, and now the fire is back up again in election season. I thought no better time than to talk election polling and this weird sort of black magic, which is really not black magic at all. And now, see, the polling wasn't even really that off. No. In 2016? No, it was great.

[00:02:07]

There was a furious, we'll talk about it in a sec, there was a furious reaction by the media that left polling and pollsters out to dry, saying, like, you're terrible, your whole craft is useless, you lied to us. So the pollsters went back after election night in 2016, which, by the way, was a bit of a surprise to everybody involved. I think so.

[00:02:31]

Including the president? Well, yeah. When the pollsters went back and looked at their stuff, they said, wait a minute, no, this is all fine.

[00:02:36]

It was you guys media, you screwed up. You don't know what polling is or what it does or how to talk about it, most importantly.

[00:02:44]

Yeah. And then you, public, you have no idea what's going on. You just see some percentages and immediately jump to some conclusions, and this is way off. So it's in part that the media was misrepresenting it, some polls weren't very good, and then the public in general just needs to be a bit more educated on statistics to understand what they're hearing. And that's what we're here for, because I took statistics three times in college, the same course, at Georgia.

[00:03:11]

I took one of those classes. Intro to Statistics, right? Yep. Boy, I hated that class. Finally, the third time, I walked up to the professor on the last day and was like, please. And she bumped my D up to a C, and I never looked back.

[00:03:26]

So say you have a one in four chance, and you're like, but what does that mean? Right, what is four?

[00:03:33]

But so if I can understand this after doing some research, then anybody can understand at least the gist of it, enough to understand polling and not be taken in by bad representations of what poll results are. Yeah.

[00:03:45]

So if you remember, in 2016 there were pollsters saying, or, I'm sorry, I'm going to say that wrong over and over again, you had media saying that Hillary Clinton was going to win in a landslide. She's got an 85 percent chance to win, some said as high as 95. She's going to win the popular vote by three percentage points. All the battleground states in the Midwest, she's going to win those narrowly. And it did not work out that way.

[00:04:14]

And like you said, there was a furor over how could everyone be this wrong with the polling. And there's a man named Nate Silver, who everyone probably knows at this point. Yeah. Who has made his name as a data specialist and runs the FiveThirtyEight blog, and said, you know what, polling is flawed. And that's probably the first thing that everyone should understand: all polling is a little bit flawed. State polling is definitely a little more flawed than the national polling.

[00:04:43]

But here's the deal, everybody. These polls from 2016 were not only not so far off, but historically, dating back to 1972, they actually performed a little better than in a lot of elections.

[00:04:57]

Yeah, and the state polling, while worse than average, wasn't that far off from the average error rate. So what do you want? So there's a lot of stuff. Like we said, there was a lot of post-mortem done on the 2016 polls and what was gotten wrong and what was gotten right, and we'll talk about that later. But the point is that overall it wasn't that far off. And so the idea isn't that the polls failed, or that there's something inherently flawed with polling, or that there's even something inherently wrong with the media.

[00:05:30]

Like, I want to go on record here, especially in this climate: the media is not our enemy. Any healthy democracy needs a vital, robust, independent media, as free from bias and as objective and devoted to reality and good and justice as is possible. But there's also such a thing as a 24-hour news cycle, and you've got to fill that, and that's given rise to opinion news and pundits, in basically trying to capture as much market share as possible, which is definitely the wrong track for media in general.

[00:06:03]

But I just want to go on record, while we're going to be kind of beating the media up a little bit, that does not mean that the media is inherently flawed, or even seeks to kill you and your family and your family dog. So Silver goes back, and a bunch of people go back, and look at history and kind of what went wrong here in 2016 as far as the polling goes. He says, you know, we went back over the past 12 presidential cycles since 1972, and the average polling error was 4.1 points.

[00:06:34]

He said in 2016, the national polling error was 3.1. So technically, by a full point, it was better, he said. We predicted that she would win the popular vote by three percentage points. She actually did win the popular vote by two percentage points. The state polls were the real difference maker. They actually did underperform, at a 5.2 error rate. And that doesn't sound like that much. I think the overall error rate for state polls since 1972 was 4.8.

[00:07:05]

So 4.8 to 5.2 doesn't sound like much. But if you're talking about that percentage of error in just a handful of swing states, right, that can make something look like a landslide even though you lose the popular vote. That's exactly what happened, right? That's exactly what happened. Because you've got to remember, Trump didn't win the popular vote. He won the Electoral College, and it came down to those swing states.

[00:07:26]

But the fact that they were off just by 0.4 points from the average error rate goes to show you just how close that race actually was, which, again, is the opposite of how it was being broadcast throughout the election. It was supposed to be a landslide, like Hillary Clinton might as well just be taking measurements for curtains in the Oval Office right now, like it was just that set. So it was presented one way, when in reality, if you really looked at the polls and the polling results, if you looked at them with a sober face, it was a much closer race than it appeared or than it was being broadcast.

[00:08:07]

I haven't had a sober face since that night.

[00:08:11]

So we should talk about the margin of error in polling. Any time you see a poll, they talk about the margin of error. It's usually plus or minus three or four, and that is on each side, for each candidate's number. In other words, it could be a potential seven- to eight-point swing and still be within that margin of error. So when Trump is winning states by a 0.2 percent margin, or a 0.5 or a 0.7 percent margin, that's well, well, well within the margin of error.

[00:08:45]

Right, right.

[00:08:45]

So that margin of error, by the way, is just built in. We'll talk about it a little more in a bit, but there's just no way around it. To get around any margin of error, you would have to literally go through and interview every single voter in America and then compile their data perfectly, without any miskeys or anything like that. And it's just impossible. So everyone accepts that any poll is going to have a margin of error, but you want to keep it within plus or minus three points.
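That built-in margin comes straight out of sampling math. Here's a minimal sketch, assuming a simple random sample and the standard 95 percent confidence level (the numbers are illustrative, not any particular pollster's method):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a proportion estimated from a
    simple random sample of n respondents; z=1.96 corresponds to the
    standard 95 percent confidence level."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical ~1,000-person poll lands right around the familiar
# plus-or-minus three points:
print(f"+/- {margin_of_error(1000) * 100:.1f} points")  # +/- 3.1 points
```

Note how slowly the margin shrinks as the sample grows: doubling the sample to 2,000 only gets you down to about plus or minus 2.2 points, which is why tightening a poll is so expensive.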

[00:09:19]

Yeah. So, a little history of polling. We've always been pretty spellbound by polls in this country. We put a lot of stock in polls, especially in the presidential race. The word straw poll, if you've ever heard that, comes from the idea that you hold up a piece of straw to see which way the wind is blowing. So a straw poll is kind of like, here's how things stand today, this is the way the wind is blowing today on this matter.

[00:09:46]

Yeah.

[00:09:46]

And they used to be kind of informal. They used to take them, like, on train cars, as journalists would ask the people they were on the train with who they were going to vote for, nothing formal or anything. But it does kind of reveal how longstanding our fascination with polls really is.

[00:10:02]

Yeah, it got pretty serious in the 1930s, specifically the 1936 election, where Literary Digest, which was a pretty big magazine at the time, polled its subscribers. And it's just kind of funny even seeing this in print: they predicted a landslide win for Republican Alf Landon over FDR. So if you've never heard of Alf Landon, you know why, because Alf Landon did not beat FDR. And the magazine's editor said, you know what, we didn't even think about the fact that we just polled our subscribers, and they're wealthy people, or at least wealthier on average, and they're probably going to vote Republican.

[00:10:45]

So Alf Landon was their man, right?

[00:10:48]

So if you go out, even today, and just interview Republicans and say, hey, who are you going to vote for, and then take those results and apply them to the entire population of the nation, you've got a flawed poll, and that's what Literary Digest did. But in doing so, they pointed out a real design flaw that now is just one of the first basic things that anybody conducting a poll gets rid of. That's right.

[00:11:14]

Gallup came on the scene, they galloped onto the scene, sorry. And they were one of the first big polling companies to say, all right, we've got to get this right. We've got to get a representation of all of America here, so we're going to send our people door to door. We're going to go to every zip code in America. And they did that from 1935 to 1984 and got basically within about three percentage points, doing a pretty good job. But it was really expensive.

[00:11:41]

So in the 80s, in the mid-80s, they switched to calling people on the telephone, which, I mean, that's still today.

[00:11:49]

That is the gold standard: for a human being to dial up another human being and ask them some questions, and we'll talk a little more about it. But what Gallup does and what Pew does, and what a few others do, is called random sampling or probability sampling, which is where you basically leave it to chance, so that any registered voter in America has an equal chance of receiving a phone call from Gallup or from Pew and being asked these questions.

[00:12:34]

And it worked pretty well for a while when they moved from in-person over to the phone, because they were still asking people questions and they could still get their answers, and harass them, which is a big thing with this type of sampling, as we'll see. The problem is, when people started to use caller ID, they stopped picking up the phone as much, and so the response rate went down dramatically.

[00:13:02]

Yeah. So they would call people using random digit dialing, which is a computer system where you fed in an area code and then the first three digits, and then it randomly dialed the last four. So you've got a pretty good start there on the random sampling. But even then, they said, you know what, women answer the phone more than men. So to truly randomize it, whoever picks up the phone, we have to then follow up and say, we want to talk to the person in the house who's had the most recent birthday, further randomizing it.
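The random digit dialing scheme they're describing is easy to picture in code. A toy sketch, where the area code and exchange are made-up examples, not anything a real pollster uses:

```python
import random

def random_digit_dial(area_code, exchange, rng=random):
    """Hold the area code and exchange fixed and randomize the last
    four digits, as in the random digit dialing described above."""
    return f"({area_code}) {exchange}-{rng.randrange(10000):04d}"

# Every number in the (404) 555 block has an equal chance of coming up:
print(random_digit_dial("404", "555"))
```

The follow-up "most recent birthday" question then randomizes a second time, within the household, so the sample isn't skewed toward whoever tends to pick up.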

[00:13:32]

I got kind of a laugh about this because I don't know that I've ever, literally ever seen my father pick up a telephone in his life, or at least growing up for the first 18 years of my life. I don't think I ever saw him answer the phone.

[00:13:46]

Is it all ham radio? No, he wasn't into that, but just literally not one time. He would just let it ring if no one was around, if my mom wasn't around to answer it. And granted, it was usually never for him, no one ever called to talk to him. But sure, I picked up on that, and my friends used to get really frustrated, back before texting, that I would just never answer my phone. And I always just thought it was an option.

[00:14:09]

Like when the phone rings, it doesn't mean you're obligated. It just means now you have an option. You can answer it or not.

[00:14:13]

Well, technically, that's true. I mean, like, it depends. No, you don't have to answer the phone, but it depends on, you know, who in your life could possibly be calling you. I didn't think it was rude or anything. I just thought it was literally like, you know, I'm going to hedge my bets here that one of my friends isn't stuck on the side of the road. All right. They can leave a message.

[00:14:34]

And if they are, I'll go get them.

[00:14:36]

So what you're talking about, Chuck, is what's called nonresponse, and that's factored into the response rate, which, with phone polling from 1980 until the 1990s, was manageable. I think the response rate peaked at 36 percent in 1997, which was good, not bad. Now it's down to like nine percent, because, like I said, people have caller ID, and if some unknown number is calling, you typically don't answer. And that actually affects things, because there is a certain kind of person who answers the phone no matter what.

[00:15:11]

And they are not like every single American, and that actually factors into the kind of poll you're conducting. Plus, also, you want a certain number of responses. I think out of a sample, you want a minimum of 800 survey responses. And back in the day, you got a 36 percent response rate, meaning 36 percent of those people you called would answer the phone, go through all of the questions, answer them fully and complete the survey.

[00:15:41]

Since it's down to nine percent, you went from having to call between 2,000 and 2,500 people to up to 9,000 people now, just to get 800 surveys completed. And that made the whole thing a lot more expensive. On the one hand, because it was expensive, it meant that there were fewer and fewer companies that conduct these polls, which meant that the polls you were seeing were more and more legitimate. But on the other hand, it also usually decreased sample size a little bit, because,

[00:16:09]

As Gallup pointed out, like, you can kind of fiddle with the numbers a little bit with a smaller response rate and a smaller sample size.
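The cost jump they're describing is simple arithmetic; a quick sketch using the figures from the conversation:

```python
def calls_needed(completed_surveys, response_rate):
    """How many numbers you have to dial to hit a survey quota,
    assuming the response rate applies uniformly across calls."""
    return round(completed_surveys / response_rate)

# 800 completed surveys at the 1997 peak rate vs. roughly today's rate:
print(calls_needed(800, 0.36))  # 2222 calls
print(calls_needed(800, 0.09))  # 8889 calls
```

Same quota, four times the dialing, which is where the expense, and the temptation to cut corners, comes from.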

[00:16:17]

Yeah, and it also led to robocalls, because of expense, because of people not answering their phone as much. And those systems, I mean, I love how Dave put it, he said they range from okay to terrible in how well they work, robocalls, online polls and these other new techniques. But I think we should take a break and then talk about what I found to be the very interesting way that they further randomized this thing from this point forward.

[00:16:44]

Right, after this.

[00:17:00]

Hi, I'm Nick Quah. We are having a moment. Everybody has a podcast now, right? Every celebrity, everybody you knew in college, every family member at least once. There are literally hundreds of thousands of podcasts out there. Yeah, it's a bit of a mess. So I figured, what the heck, what's one more? My new show, Servant of Pod, will give you the most interesting and important stories in podcasting.

[00:17:38]

We'll talk to producers, entertainers and journalists. We'll talk to bigwigs, and we'll talk to independent creators. Servant of Pod will give you a sense of what's happening in the growing world of podcasts and, more importantly, why you should care.

[00:17:52]

Listen to Servant of Pod on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.

[00:18:00]

Over the years, host Aaron Mahnke and the team behind Lore, Unobscured and Cabinet of Curiosities have scoured the globe to bring you tales from the past with a hint of darkness, from superstitions and folklore to the curious and the bizarre. But now it's time to bring that journey home, because while America's history books are filled with people, places and events that sit on lofty pedestals, there's a whole other world of American history that waits for us in the shadows: tales of unlikely heroes, world-changing tragedies and legends that are unique to the American spirit, stories that we call American Shadows.

[00:18:36]

Each episode is handcrafted by the Grim & Mild team and narrated by me, Lauren Vogelbaum, and while we might be traveling some dark and lonely roads, you're also bound to learn a thing or two along the way. Get ready for a tour of American history unlike any other. Get ready for American Shadows. Catch new episodes of American Shadows every other Thursday. Listen on Apple Podcasts, the iHeartRadio app or wherever you get your podcasts.

[00:19:16]

All right, so we've already talked about the fact that they've randomly called someone and then they take one further step on that call by saying, let me speak with whoever had the most recent birthday, even if it's, I guess, your three year old. Right. And one other thing.

[00:19:30]

I kind of made mention of it, but I have to interject, dude: like, harassing people. If you've been picked by this computer, if your phone number has been picked, they're going to keep calling you and calling you. And that is because, as a person who doesn't normally participate in phone surveys, you're a specific kind of person that can't be left out of the population, because you represent a large number of people and they want your opinion. So part of this phone standard of calling people is to call them over and over again, to basically harass them into participating, to get their answers for the survey, because it's as important, if not more important sometimes, than the people who are like, oh yeah, I'd love to answer this phone survey. Two totally different kinds of people.

[00:20:16]

Yeah, absolutely. And I was totally kidding, by the way, to the listener, when I said they will speak to a three-year-old. They ask for the most recent birthday of someone of voting age, obviously. All right. So then you've got a pretty decent random sampling to begin with, and then you have to start the process of weighting, which comes in a lot of different forms. If you want an example of a really good political poll, it's going to be paid for by a neutral source.

[00:20:43]

It's not going to be like, you know, a CNN poll or a Fox News poll or a super PAC or anything like that. You're going to have a random sample of the public, which we just talked about. You're going to be dialing cell phones and landlines these days, that's a big one. Also, they'll ask you if you have a cell phone and a landline, and if you say, yes, I have both, they're going to adjust your response based on the fact that you had a higher chance of being selected, because you have two numbers that the computer could have picked.
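That adjustment for people with two numbers is a classic selection-probability correction. A minimal sketch, with the caveat that real dual-frame surveys adjust for more than just line count:

```python
def selection_weight(num_phone_lines):
    """Down-weight respondents reachable through more numbers, since
    the random dialer had more chances to land on them."""
    return 1.0 / num_phone_lines

print(selection_weight(1))  # 1.0: cell only, or landline only
print(selection_weight(2))  # 0.5: both, so each response counts half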

[00:21:13]

Right. And another thing is, like you mentioned, they're going to keep calling you. The best ones still use live interviewers. And then this last one is really important: they're going to try and improve the accuracy of the results by weighting the responses to match real-world demographics. Age, race, your income level, your education level, all of that stuff is factored in.

[00:21:39]

And all this stuff is weighted out because, well, we'll talk about it, but, you know, there are many different kinds of Americans.

[00:21:48]

And if you want a really good sampling of different kinds of Americans, you're going to, like you said, have to fiddle with the numbers to make it a true representative population.

[00:21:57]

Right. Because even if you get it exactly right demographically and weighted, like you said, and we'll talk about that some more in a second, you still have that margin of error. And again, that's that, you know, 52 percent plus or minus three points. And that means it could be 55 percent or it could be 49 percent, they don't know, but most of the time the correct answer is somewhere in there.

[00:22:23]

That's what that means with that margin of error. And the reason that that's built in is because it is basically impossible to perfectly represent the larger population through random sampling. You're just not going to pick everybody correctly just by the fact that it's random and it's a sample.

[00:22:43]

Yeah, and that's important because, like, that's why you hear so much hay being made over a double digit lead in a poll which Biden had sort of semi recently. I know it's gotten a lot tighter since then, but, you know, when Biden was up, I think like 10 percentage points, people were flipping out because, you know, like we said, it's a plus or minus four for each candidate. So that's a total of eight.

[00:23:05]

And so basically, the press started screaming like he's outside of the margin of error or everybody, like nothing can beat him. Right. Right. But now things are back within that margin.

[00:23:17]

I saw on PBS NewsHour, where they interview Mark Shields and David Brooks. Brooks is a New York Times columnist, and I think Mark Shields is an independent columnist. And one of them actually said, and this was in July, America has clearly made up its mind on who's going to be the next president.

[00:23:37]

This is July. Why did you not learn anything from 2016? I couldn't believe those words. I know. So matter-of-factly.

[00:23:47]

Yeah, it's irresponsible. And there have been studies about this, too, that have suggested that words like that, that polls that say 99 percent chance of winning, that this kind of stuff actually has a negative impact on the leader, because it makes people think, well, I don't need to go out and vote, everybody else is going to go vote, and the turnout might be lower than otherwise. There's also people who, well, there's people who dispute that.

[00:24:12]

They say, yes, it makes sense intuitively and anecdotally. But we've yet to actually see genuine data that says clearly that this has this effect, but it's something that's still being studied right now, whether it actually does or not.

[00:24:26]

Well, and I also saw an article the other day about the, quote unquote, silent majority, and that another reason those polls were so wrong back then, and they're saying are probably wrong now, is because, they say, there's a substantial bloc of voters who very privately and secretly vote for Trump.

[00:24:44]

Yeah, the term for them among pollsters is shy Trump voters. They won't admit that they're going to vote for Trump, but they're going to vote for Trump, and that affects polls. I saw that that's actually not been proven to actually exist, but I think it was Pew. There's a really great Pew article, if this stuff is speaking to you at all, go check out Pew's key things to know about election polling in the US. And it has a bunch of great links that you should follow in there.

[00:25:14]

And there's also, as a sideline, their surveys and polling page, which has a guide for journalists to polling. But I found out you don't actually have to be a journalist to read it online. So if you want to go check those out, they have some great breakdowns of some of the stuff we're talking about, but also about how to read polls and what to trust and look for in general.

[00:25:36]

And a little-known fact: Pew was actually originally called Pupu until 1976, when Star Wars came out, and they were like, we've got to change your name now, guys.

[00:25:46]

Yeah, can't do it, man. It is dead. Romney today with you, huh?

[00:25:51]

So back to the weighting thing. And by the way, we should mention that Gallup said if they wanted to increase that sample size and actually get the margin of error down to like plus or minus two, they could do that, but that would be like a literal one hundred percent increase in the cost. So, like, everyone just please live with plus or minus three or four points. Yeah.

[00:26:12]

And everybody generally does. And Dave has this really good example. Dave Roos helped us out with this, and he said the margin of error is best understood where there's a jar of 500 red and 500 blue marbles, and if you pick out 100 of them, you might pick out 50 of each one time. Wait, 500?

[00:26:33]

What did you say, 500 marbles? Oh, no, I'm sorry, 1,000 marbles. I've lost my mind.

[00:26:40]

Yes, there's a thousand marbles, OK? And 500 are red and 500 are blue. Your task, Chuck, is to pick out 100. So you go to the trouble of picking out 100: 50 red, 50 blue. And I say, do it again. And this time it's 47 and 53.

[00:26:54]

And you keep saying again, again, right, smacking your riding crop on the desk that you're sitting at. And I do it 100 times. Gets super turned on. Yeah, I do it 100 times because Dear Leader told me to. Right. And at the end you get a little bell curve, and basically a plus or minus four.

[00:27:14]

Right. So, yeah, almost all of them, this is what's called a 95 percent confidence interval, almost all of them are going to fall in that bell curve. There's going to be some outliers. There's going to be that one time where it was just absolutely insane: you actually picked one hundred red marbles randomly, blindfolded, from this jar. That's so insignificant statistically, it's just such an outlier. But almost all of them are going to be in there. So when you're polling a large group of people like American voters, and 95 percent of the results are falling within a couple of percentage points on either side of this middle, you can pretty much feel confident about that.
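Dave's marble jar is easy to simulate. A sketch of the repeated drawing (the exact counts depend on the random seed, but the bell-curve shape doesn't):

```python
import random

def draw_reds(jar, sample_size=100, trials=1000, seed=42):
    """Repeatedly draw a random sample without replacement and
    count how many red marbles come out each time."""
    rng = random.Random(seed)
    return [sum(1 for m in rng.sample(jar, sample_size) if m == "red")
            for _ in range(trials)]

jar = ["red"] * 500 + ["blue"] * 500   # 1,000 marbles, half and half
counts = draw_reds(jar)

# The counts pile up in a bell curve around 50; the overwhelming
# majority of draws land within about ten marbles of the true split.
within = sum(1 for c in counts if 40 <= c <= 60) / len(counts)
print(f"{within:.0%} of draws found 40-60 red marbles")
```

Run it and you see the point directly: no single draw of 100 marbles is guaranteed to match the jar, but the spread of many draws is predictable, and that spread is the margin of error.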

[00:27:52]

And that is the basis of election polling, of political polling, of all polling, really: they have this built-in margin that they know exists, but everybody can live with it. The problem is when you're hovering around that 50 percent mark and you're talking about a two-party system. Yeah, one of them has like 51 percent and the other one has 49 percent, but there's a plus or minus of like two points.

[00:28:13]

That means flip a coin, America. It means we have no idea. And some people would say, well, why even do polling? Because what you're showing there is not who's going to win. That's not the point of polling. The point of polling is to take a snapshot of how America, or whoever you're polling, is feeling at that moment, about who to elect, about what laws to pass, about religion, about the Cleveland Indians. It doesn't matter. Right.

[00:28:40]

Like, that's what a poll does. But you can pervert polls into making them talk a different language and say, hey, look at this percentage. You take these polls, you convert them into something else, and now you have something like a 95 percent chance this person is going to win. Go shout that, Wolf Blitzer. And Wolf Blitzer goes and shouts it as loud as he can.

[00:29:01]

So we need to talk a little bit more about weighting. I mentioned earlier that there's other things they do to sort of tip the scale, and that sounds like a bad term, so I guess I shouldn't say it that way. But things they do to make it equitable and a true representation of the American population. For instance, African-American voters make up 12 percent of voters. So if they did a poll and in the end they only got six percent of respondents that were African-American, then they just double it.

[00:29:28]

Basically, if the respondents were overwhelmingly Caucasian, they would weight that down to their true representative number, which is about, I think, 66 percent of the electorate is white.

[00:29:42]

And if 80 percent of the people that respond are white, then they're going to kick that down. And again, this is just adjusting the poll to the proper weight, so you have a really legitimate snapshot. And, you know, if it sounds crazy that they are using a thousand people's responses and drawing that out to the size of the voting population of America, it is. But if you're a statistician, it isn't, you know. I mean, it reliably works, as long as you present it with plus or minus this margin of error.
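The up- and down-weighting they're describing can be sketched in one function. The shares below are just the illustrative figures from the conversation (the "other" bucket is made up to fill the remainder), and real polls adjust over many variables at once, often via raking:

```python
def demographic_weights(sample_shares, population_shares):
    """Weight each group so the sample's mix matches known population
    shares: weight = population share / sample share."""
    return {group: population_shares[group] / sample_shares[group]
            for group in population_shares}

weights = demographic_weights(
    sample_shares={"black": 0.06, "white": 0.80, "other": 0.14},
    population_shares={"black": 0.12, "white": 0.66, "other": 0.22},
)
print(weights)  # black responses count double, white responses scaled down
```

Each respondent's answers then get multiplied by their group's weight before the percentages are tallied, which is exactly the "just double it" and "weight that down" move from the discussion.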

[00:30:19]

To just an average Joe on the street, it does seem crazy, like, they ask a thousand people and we're supposed to extrapolate from that? And statisticians, who are numbers wonks and data wonks, would say, yeah, that's exactly what that means.

[00:30:32]

Shut up. That's really all you need. But it really is a testimony to the power of those statistics and that data and the analysis of them. Yeah, weighting's really important. It goes far beyond just, like, age and political party. I think Gallup uses eight different variables. The New York Times Siena College poll uses 10, and they include things like marital status and home ownership. Pew uses 12 variables. They ask things like, do you have home Internet access, do you volunteer or engage in volunteerism?

[00:31:05]

And all of these things have been shown to be associated with how you vote. So, like, if you're a white woman aged 65 to 75 who volunteers twice a month and lives in the suburbs, you're a very specific person; there's a group of people out there who vote a certain way, and you represent all those people. So they'll weight the results based on these additional questions. They don't just ask you, are you going to vote for Trump or Biden?

[00:31:35]

And there's also built into that question a really important point. Are you going to vote?

[00:31:40]

Yeah, that's a huge thing we haven't talked about. It's one thing to poll registered voters, but here in America, somehow, presidential elections still only get about 60 percent turnout, man, which is shameful, shameful and crazy. But that's another podcast. So most of the really, really good polls drill down, and to get a real good representation of what might actually happen, they try to drill down to whether or not you're most likely to actually vote.

[00:32:11]

Right. Because who cares what your opinion is if you're not going to vote? And I mean, they generally take your word for it that you're telling the truth, you know. Yeah, but sure, they do have, like... and I think Pew.

[00:32:24]

Yeah. Pew has nine questions that they basically use to establish that you are planning on voting, like you're actually going to vote. You're not full of hot air, you know.

[00:32:33]

Yeah. I don't know what those questions are, but I imagine they have to do with, do you know where your polling place is, do you have transportation, stuff like that?

[00:32:40]

Oh, I was thinking they were going to be like, are you really, really going to vote? Like, question three. They just kept adding "really." I hate you. So you've got these people who've been called, and they have answered these questions, and they have participated in this survey whether they wanted to or not. And when they've finally done it, built into this poll is that understandable margin of error that just comes from the fact that it's a randomized sample.

[00:33:12]

Right. But what Pew and any other legitimate polling group will point out is that the margin of error is actually greater than that. The margin of error for the average poll, according to Pew, is something more like six points right now, not three or four.

[00:33:29]

It's actually six.

[00:33:30]

And the reason why is that built on top of that margin of error, the one that's automatically part of the poll just by virtue of it being a randomized sample, are things like the person accidentally typing in the wrong key, human error. Those kinds of things add up. Or the question isn't worded clearly enough that anybody who hears it knows the intent and knows what their answer is, so there's some sort of miscommunication involved. There are also things they can't control for, like people who have pseudo-opinions, who don't want to sound dumb.

[00:34:05]

So they just answer yes or no based on something they really don't care about either way. And because they don't actually have an opinion, that actually weights things the wrong way. So when you add all these things up, you have these additional errors. Yeah.

[00:34:23]

That leads to a bias overall in the poll, which can affect the outcome. But again, the companies that have the money to conduct these genuinely big gold-standard polls know enough to know how to control for those as much as possible.

[00:34:43]

But still, what Pew says is, if you're listening to a poll and somebody says plus or minus three points, you should probably go ahead and double that in your mind. Double it in your mind, double your pleasure, double your fun, double your margin of error.
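The sampling margin of error they keep citing comes from a standard formula. A back-of-envelope sketch, assuming a simple random sample and 95 percent confidence (which real polls only approximate):

```python
import math

# 95 percent margin of error for a simple random sample of size n,
# where p is the observed proportion; the worst case is p = 0.5.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1000)
print(round(moe * 100, 1))      # about 3.1 points for a 1,000-person poll
print(round(2 * moe * 100, 1))  # about 6.2 after Pew's double-it rule of thumb
```

That's why a thousand respondents is the standard poll size: it buys roughly a plus-or-minus three-point sampling error, and doubling it to account for everything the formula doesn't capture lands right around the six points Pew describes.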

[00:35:02]

So let's take a break and we're going to come back and talk about what exactly they think went wrong with those state polls right after this.

[00:35:22]

Hi, I'm Pete Buttigieg. Maybe you know me as Mayor Pete. In my new podcast, I'll be talking to people from every field whose ideas and actions will shape an era that is about to begin.

[00:35:32]

We can take this time and use it in a way to bring people together.

[00:35:36]

When people protest in a country, that means they still love it enough that they still believe change is possible.

[00:35:41]

I have hope that we are actually going to figure out how to allow people to be free hearted, free thinkers.

[00:35:47]

Listen to the deciding decade on the I Heart radio app, Apple podcasts, or wherever you get your podcasts.

[00:35:55]

Paper Ghosts is a true crime podcast that investigates the search for the person responsible for the abductions of four missing girls in neighboring New England towns. For more than 50 years, each case has remained unsolved.

[00:36:09]

Every day is like being lost in limbo. I pray every day that we find Lisa so we can go on. It wasn't until this past year that things took an unexpected turn. Breakthrough answers to decades-old questions and witnesses finally ready to talk. I don't think I can describe what he's wearing. It's only a mile away. Jesus, Mary and Josephine. I hope that's brave for many of you know what I think it is? Listen to Paper Ghosts on the iHeartRadio app, Apple podcasts, or wherever you get your podcasts.

[00:36:54]

Money starts with Joshua. All right, so I think it's generally acknowledged that in 2016... and again, I want to say the polling was off, but apparently the polling wasn't off; the way it was reported on was off. But what really happened in 2016, what was off, was the state polling. And like you said, they've gone back and obsessed over these polls since then, you know, because they were already statistical wonks.

[00:37:34]

But when something like this happens, they really get worked up into a dander and get to the bottom of it.

[00:37:40]

I mean, people were calling for the end of polling, said it was a failed profession.

[00:37:45]

Basically, it was like, I'm getting rich off this, man. Yeah, we can't end polling, Jimmy. Pew was like, stop, stop talking like that.

[00:37:55]

So what happened in 2016, they think, is that a lot of non-college-educated white people came out in big, big numbers for Donald Trump. And that was sort of a new... not a new factor, because they had always talked about college education, but new in how outsized a factor it was. It had never been that outsized. And all these state pollsters, they didn't weight for it. They didn't adjust their polls to reflect the fact that college-educated people are more likely to respond to these surveys.

[00:38:32]

So their polls were just off. Yeah.

[00:38:35]

And they knew that college-educated people were more likely to respond to the surveys. That wasn't news to them. What caught them sleeping was that they had not picked up on the fact that this group of people, non-college-educated white voters, were going to go to the polls in numbers like never before and that they were going to vote for Trump. They did not pick up on that. That was brand new. Like, that didn't exist before. Trump basically brought out a new electorate that helped get him elected, especially in battleground states like Wisconsin and Michigan and Indiana, although I think in the end he was a shoo-in because of Pence.

[00:39:17]

But this group of voters that did not exist, or this line between college-educated and non-college-educated white voters, that partisan gulf, hadn't existed before Election Day. The pollsters didn't pick up on it. And so they didn't weight those responses, because they had never had to weight responses based on college education before.

[00:39:38]

Yeah. So suburbs, exurbs and especially the rural vote counted like it had never counted before, which is obviously why you see what's going on right now, like a very hard push by the Trump campaign to get these same people out again. And the way that they do that... that's the nicest way I can put it, genuinely.

[00:40:01]

So, yeah. So the idea is that it was already kind of a close race, a closer race than was being broadcast, and these huge electoral battleground states that got flipped were basically the reason that Trump was able to take the Electoral College. But the idea is that these voters kind of came out of nowhere and voted for Trump, and there were some other things that happened that the pollsters didn't anticipate. One, the undecided voters, people who said, I'm legitimately undecided at this point, a week before the election, from what I read, they broke hard in favor of Trump on Election Day.

[00:40:48]

When they made their decision, they voted for Trump. That hadn't been predicted. That was another big one. And then one of the other things, too, is that the polls were just doing what polls do, which is sometimes they're right, sometimes they're wrong. But polls had gotten so good in the aughts that people came to be overconfident in their ability to predict and pick winners, and the 2016 race reminded us, like, polling is not perfect.

[00:41:17]

Let's stop pretending it is.

[00:41:20]

Yeah. And a lot of it has to do with, like we've been kind of harping on, the way the media presents it. And then a lot of it has to do with how we're conditioned to look at things like underdogs, and it's different in politics. And I remember when these aggregators, especially FiveThirtyEight, had these predictive models, and they started talking about the fact that... and I think The Washington Post even wrote a good comparison to sports.

[00:41:49]

And, you know, if someone is a real big underdog going into, like, a Super Bowl or a World Series and they end up winning, people don't get angry and go after the people who said they had a 15 or 20 percent chance of winning. They just say, wow, what a story, the underdog won, right? But there are so few presidential elections, you know, one every four years, that it's the same thing, but people just look at it differently. Like, Trump was an underdog that supposedly had like a 15 to 30 percent chance of winning.

[00:42:24]

Some people said one percent.

[00:42:26]

Yeah, well, that's ridiculous. But a 30 percent chance of winning is a real shot at winning, for sure. Yes. The way it's framed, it just doesn't seem that way in politics. No. And so that's one thing. But another thing is that we shouldn't even be talking about presidential elections in terms of a 15 percent chance of winning or a 99 percent chance of winning. That is not how we should present it, and that's not how we used to present it.

[00:42:48]

We used to present it saying, like, this poll found that Clinton was going to lead Trump 52 percent to 48 percent, or something like that, plus or minus two points. And that would have shown you, like, OK, well, this is a really close race, way closer than I think. And that's the information. The problem is that you can take that same statistic, 52 percent plus or minus a four-point margin of error...

[00:43:17]

If you convert that to a normal distribution, you come up with an 84 percent probability of a win. That's the problem. The data being produced by these polls is being converted in ways that it shouldn't be. And then that's what the media jumps on, that's what the public laps up, because that is the horse-race statistic: an 84 percent chance of winning, a 15 percent chance of winning. That's what we think about.

[00:43:46]

That's what we look at. And so rather than realizing that actually this is a close race, 52 percent plus or minus four points, we see 84 percent chance of winning.

[00:43:56]

And that's a foregone conclusion that that person is going to win. Yes, that ultimately is where the media and the public are culpable for this.
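The conversion being objected to here can be reproduced directly. A sketch under the assumptions the hosts describe: a two-candidate race, a 52 percent polled share, a plus-or-minus four-point margin of error at 95 percent confidence, and a normal distribution centered on the polled share.

```python
import math

# Turn a polled vote share plus its margin of error into a "win probability"
# by treating the share as normally distributed around the poll result.
def win_probability(share, moe, threshold=0.50, z95=1.96):
    se = moe / z95                     # standard error implied by the 95% margin of error
    z = (share - threshold) / se       # how many standard errors above 50 percent
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF

print(round(win_probability(0.52, 0.04), 2))  # roughly 0.84
```

A two-point lead inside a four-point margin of error, which is a genuinely close race, comes out of this conversion as an "84 percent chance of winning" headline. That gap between the input and the output is exactly the framing problem being described.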

[00:44:04]

Yeah, I don't think they were meant to be extrapolated like that to begin with. They weren't. And, you know, polls are valuable, but I haven't looked at any polls, partially because of the way 2016 went down. And in fact, for the past week, I've taken a complete Internet news and social media break. And it's been pretty great, actually, because it's so liberating. Yeah. I mean, I literally haven't looked at a single news thing. I very sadly found out that Chadwick Boseman passed away, like, three days afterward.

[00:44:39]

Oh, wow. Like, that's how dark I've gone. I'm not looking at the Internet unless it's something that brings me joy, which is to say, you know, old Led Zeppelin and Van Halen YouTube videos. I was looking up classic Mad magazine covers of the 80s.

[00:44:53]

Yeah, that's all I've been doing. If it doesn't bring me joy on the Internet, I'm not doing it. That's good. Um, you know, I've got to break that soon, because I do think you should be active and involved and in the know. Yeah, but taking a break fairly regularly is definitely mentally healthy.

[00:45:10]

But that aside, I'm not looking at any polls, and I don't care what any poll says.

[00:45:16]

We'll see. So I was thinking very similar stuff, too. Like, what's the point of a poll? I don't know. OK, well, I finally found it. If you look that up on Google, there's just very little on it. But I found somebody who explained it pretty well, I thought. Polls aren't meant to tell you who's going to win. They're not forecasting models. Like I said before, they're meant to be like a snapshot of how whoever you're polling feels at the moment.

[00:45:44]

Right.

[00:45:45]

And in doing that, because you are sampling American people, and these are typically independent news organizations who are carrying out these polls, you get to tell everybody else how America's feeling, rather than the leaders saying, I'll decide how you're feeling, I'll decide what you want and what you need and what you think is important. Polls prevent that from happening by telling the rest of the people, hey, this is how everybody else is feeling right now, too.

[00:46:16]

And in some ways it is kind of sheeple-ish, where, you know, the idea is like, oh, is that supposed to sway my opinion, that everybody's going to vote for this person and not for that person? That should have no bearing or impact on your vote. And it feels like that's how polls are used sometimes. But if you step back and see that they're actually kind of an important part of sharing what other people are thinking, rather than being told what we're thinking or, you know, what to think, then they actually are pretty legitimate in that sense.

[00:46:46]

Yeah, well, you know, I say take your polls and sit on it.

[00:46:51]

Well, one more thing. We cannot talk about polling and not talk about Internet polling real quick. This is a completely different style of polling than has ever been done before. Rather than a randomized sample, you actually just say, hey, you want to take this survey? And people click it. So it's called opting in, opt-in surveying. Yeah, and very specific kinds of people take surveys on purpose on the Internet. And because they're new...

[00:47:16]

They're really now figuring out how to weight these things, or not, and how to use them, because they can produce legitimate data. But it depends on who's conducting the poll, whether they know what they're doing, that kind of stuff. But just like everything else, moving things online has democratized polling, and so anybody can conduct a poll now and basically enter the news cycle. Kid Rock almost became a senator in Michigan for a second there.

[00:47:45]

But so on the one hand, it's good, but we're also in a big period of disruption as far as polling is concerned. So for you, the polling consumer, either go like Chuck and just stop listening to polls altogether, or look for things like transparency. Do you recognize the company or the name that produced the poll? Are they sharing their data, like exactly how the questions were worded, what their population size was, how they weighted, all this stuff?

[00:48:11]

If all that stuff is included, you can probably trust the poll. And then beyond that, just remember what you're looking at. This isn't a predictor of who's going to win. It was a snapshot, for a very brief moment, of a very specific sample of America, just to show how people would vote right then. And it was right then, too. This is not Election Day we're talking about.

[00:48:33]

Yeah. And, you know, I want to be clear, I'm not pooh-poohing polls. They're valid and useful, but I just don't care to look at them right now. I understand. Yeah, that's my jam.

[00:48:43]

Well, you got anything else about polls? Nothing else about polls. Well, if you want to know about polls, start looking around and go check out Pufnstuf stuff and sideline stuff and all that stuff.

[00:48:55]

And since I said stuff three times, here comes Rumpelstiltskin or Candy Man.

[00:49:05]

So this is from Kelly Price. And Kelly says this: Hi, guys. I'm writing today not only to confess my unending love for Stuff You Should Know, but also to share a link to some black-owned bookstores. It would be so cool if all of your listeners purchased your book... she should just say, period, comma, from a black-owned bookstore. Couldn't agree more, by the way. Yeah. A couple of podcasters that I listen to while I wait for Stuff You Should Know have books out and coming out soon.

[00:49:33]

And they encourage their listeners to support black-owned businesses through the purchase of their book. It's a win-win. I don't know why it's taken me so long to think to write this to you guys. I blame it on corona madness. But last but not least, I'll say I love The End of the World with Josh Clark and Movie Crush as well. Any chance to hear you guys talk is a chance worth taking. When we get a COVID vaccine and you guys can do your live shows again, please come to Nashville.

[00:49:58]

Oh, yeah, for sure. I think we'd planned on Nashville. Yeah, Nashville got scuttled by COVID. We can try to come next time around. Now, we might not ever be able to come.

[00:50:07]

No, no. It's super close to Atlanta. I'd lose my mind if I get to see you guys here. All the best, Kelly Price. And so Kelly sent a link to a handy website that lists black-owned bookstores near you. I made a little URL shortener to make it easier on everyone. Oh, let's have it. So you can go to bit.ly slash sysk BLM and find black-owned bookstores near you to purchase Stuff You Should Know: An Incomplete Compendium of Mostly Interesting Things.

[00:50:37]

At the very least, we like to encourage people to go to IndieBound.org and support indie bookstores. I don't know if there is an actual black-owned indie bookstore website, but I would imagine most of the black-owned bookstores are indie bookstores.

[00:50:53]

Uh, yeah, probably so. Check it out: bit.ly slash sysk BLM. Go out and buy our book, everybody. You're going to love it. It's really great.

[00:51:04]

Um, and thanks. Thanks for that, Chuck. Thanks for setting us up for that, too, Kelly. Much appreciated. We'll see you in Nashville. I guess Kelly will be the one, like she said, losing her mind in the crowd. If you want to lose your mind on us via email, we love that kind of thing. You can send it off to stuffpodcast@iheartradio.com. Stuff You Should Know is a production of iHeartRadio's HowStuffWorks. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple podcasts, or wherever you listen to your favorite shows.

[00:52:39]

Hi, I'm Kristen Holmes. I've covered campaigns, Capitol Hill, the White House and everything Washington for CNN. But nothing tops the importance of this upcoming election, and my job is to help you make sense of it all. Welcome to Election 101. For the next 10 weeks, we'll figure out the electoral process together. I'll talk to experts, historians and some of you. We'll address the safety of mail-in voting, inform you of deadlines and make sure you know all your options.

[00:53:08]

You'll learn why voter registration is different from state to state and even from person to person.

[00:53:14]

I'll help you figure out how to watch the debates a little more closely and how to get a better read on what the candidates really stand for. Yes, this election year is different, and this is a different kind of podcast. Election 101 was created to help you learn how to make the most of your vote this November. Listen to Election 101 every Wednesday on the iHeartRadio app, Apple podcasts, or wherever you get your podcasts.