[00:00:00]

We don't want to talk about this tonight, but I'm just curious before we begin: twenty twenty four, twenty twenty eight, do the Democratic Iowa caucuses stay number one?

[00:00:08]

We'll see. I think there's the strongest argument there's been so far to change. But those relationships are forged by eight.

[00:00:27]

Hello and welcome to the FiveThirtyEight Politics podcast. I'm Galen Druke.

[00:00:34]

In the wake of the twenty twenty election, as you all know, there's been plenty of discussion about the accuracy and usefulness of polling nationally. The polling error in twenty twenty was roughly four points, which is about the average polling error dating back to nineteen seventy two. At the same time, the polls were significantly off in some states, like in the Midwest and in Florida, where we've repeatedly seen larger than average polling errors recently. And of course, those misses have now underestimated Republicans in two presidential contests in a row.

[00:01:05]

So numerically speaking, contrary to the conventional wisdom, polls are not becoming dramatically less accurate on the national level. But there were errors and trends worth discussing.

[00:01:16]

And I'm curious to hear from pollsters about how they're viewing their own work and whether they see a need to try to address any problems. So today we're going to hear from two pollsters, both of whom conduct A-plus-rated polls, according to FiveThirtyEight's pollster ratings. First, J. Ann Selzer, who conducts the Iowa Poll for the Des Moines Register. Her poll showing Trump leading Biden in Iowa by seven points in late October was an outlier compared to the average, which showed a dead heat in Iowa.

[00:01:45]

But as with a number of Ann Selzer outlier polls in the past, her poll proved correct.

[00:01:52]

President Trump ended up winning Iowa by about eight points. Next, we'll hear from Patrick Murray, the founding director of Monmouth University Polling Institute. He ran into many of the same challenges in the Midwest and in Florida that other pollsters did.

[00:02:08]

As you'll hear, they have some similar and some pretty different views about the challenges pollsters face and what to do about them.

[00:02:16]

And let's begin with Ann Selzer, the president of Selzer and Company. Welcome back to the show and thank you for joining us. It's nice to be here.

[00:02:24]

So I'm curious, as we begin this conversation: what do you make of the national narrative, writ large, about the accuracy of polling in the wake of twenty twenty?

[00:02:34]

Oh, there's so much to say about it. But I think I would lead off by saying that polls have become almost perceived as a public utility, that the public is owed accurate and competently conducted polls. And so there's an expectation that is very, very difficult for the massive number of polling organizations that are out there. I think you rate over three hundred and thirty polling organizations that are out there trying to do it in all sorts of different ways.

[00:03:04]

But it leads to an expectation, and therefore a violation of that expectation carries more hurt with it than if you had no expectation at all. Right.

[00:03:16]

And so are you concerned that there are certain polls or that polling writ large is becoming less accurate?

[00:03:23]

Oh, my gosh, yes. Polling itself has become incrementally, and then sort of with some major leaps, more difficult since the beginning, when it all started. When the Des Moines Register poll got started, people would go door to door across the state, and they had a little map: you go to this intersection, and then flip a coin to see what direction you go. So it was all random sample, but it was going door to door, and every door was a possible interview.

[00:03:54]

So you missed nobody. Telephone polling started in the late 70s at the Des Moines Register. And at that time, if you can imagine it, almost every household, ninety eight percent of households, had a phone. And for each household there was one number. And if you knew somebody's phone number, you pretty much knew where they lived because of the way the transmission lines were set up. We had phone books with everybody's name and their address and their phone number all tied together. And if you didn't have a listed phone number, that's where random digit dialing came in; it was invented so that you could hit even unlisted phone numbers. Those were the glory days.

[00:04:34]

And every innovation that's happened with telephone technology after that has made polling more and more difficult.

[00:04:41]

So would you say that reaching a random sample of people because of the technology with phones and people not answering unknown numbers and things like that is the biggest challenge facing polling today?

[00:04:55]

The biggest challenge is the portability of cell phone numbers, and being able to account for people who live in a state with an area code on their phone that does not match the area codes assigned to the state. So the complication is that if you move to Iowa and you bring your New York City phone number, it's very difficult for me to find you. It can be done, but it can only be done with some things that mess up the perfect way we would like to do it, where every household has an equal chance.

[00:05:34]

You either have a more than equal chance, because of smart cell samples that will overlay on a cell phone signal, or you have a less than equal chance, because we'll just exclude any non-Iowa area codes from our sample altogether. What about response rates?

[00:05:49]

I've heard, in the aftermath of twenty twenty, a lot of concern about response rates falling off a cliff. I've seen some data that cites, you know, earlier on in the eighties, one in five people called would respond to a poll. Now it's more like one in a hundred in some cases. Have you experienced that? And do you think that is even a greater concern than the portability of cell phones, or is this something that you think that pollsters will and should be able to get around?

[00:06:18]

Here is the secret ugliness of the polling industry. Are you ready for this? I'm ready, give it to me.

[00:06:26]

We rely on the kindness of strangers. That is, when their phone rings, that they will answer it; that when they find out what the call is about, they do not hang up; and that they will stay and complete the interview all the way to the end, because that's the only thing we count as a completed interview. And they do that for no money. This is uncompensated time. And I think, given everything else that has happened with telephone technology, with telemarketing and with these random robocalls and nuisance calls, plus all of the campaigns phoning into people, the traffic in polling as well as advertising and outreach and get-out-the-vote calls, the traffic is a huge burden on the average person.

[00:07:19]

So it just makes them less likely to do it. It's amazing that we're still able to get accurate polling at all, given that the odds are really stacked against us.

[00:07:30]

Another theory that I've heard floated gets at this, actually, which is that over the past five years, President Trump in particular has done some polling bashing and of course, questioned the credibility of the media.

[00:07:43]

And so perhaps when Trump supporters hear a pollster on the other end of the line, they may be less interested in talking to that pollster than a Democrat who has a higher level of trust in the media, for example. Now, this is not a shy Trump voter theory in the sense that Trump voters would lie to a pollster, but simply that there are lower response rates amongst people who are more inclined to vote for Trump.

[00:08:05]

Do you think there's any credibility to that theory?

[00:08:08]

There's potential credibility. I'm not sure that the node at the bottom of that decision tree is Trump voter. It could be that it's rural. It could be less than a college education. It could be other things that are linked to it that explain it better. But there's a lot of talk about the shy Trump voter. And the thing I keep thinking of is, you remember back to 2016, then-nominee Trump liked nothing better than to parade around his good poll numbers.

[00:08:40]

So I don't know why the people who would be supporting him in twenty twenty would deny him good poll numbers by opting out of participating in the poll. We've talked about some of the challenges here.

[00:08:53]

Do you have a sense, looking at the polling across the Midwest, in places like Iowa, where the polling was off significantly? Not your poll, of course. But also in Wisconsin, where the polling was showing a ten point plus lead for Biden, which ended up being a half a point lead in the end. We saw polls off significantly in Maine, in Florida. Do you have a sense of what is going on here from where you sit, having had an accurate polling result?

[00:09:22]

When you look at the field and you look at your colleagues in polling, what are they doing wrong?

[00:09:29]

Well, there are a couple of things. I'd like to split this up into a couple of different ways of going at it. One of my pet theories is about the nature of the two campaigns, the Democratic campaign and the Republican campaign. The Democratic campaign was really focused on getting people to sign up for early voting and absentee ballots. And the surge of that campaign, I think, happened perhaps as long as two weeks before Election Day, which is when the Republican surge might have really kicked in, which is: get people out on Election Day.

[00:10:06]

And so it could be that there were people who weren't all that confident that they were going to vote a week before the election. But then came that Republican surge. And so their neighbors were getting them involved and figuring out a plan for how they were going to get to the voting place. And that kind of difference would have meant somebody didn't appear to be a likely voter who later, a few days later, a week later, did turn out to be a likely voter.

[00:10:35]

So that's one angle to think about it.

[00:10:38]

And that, of course, is something that pollsters can't necessarily control. So that's an environmental issue. That wouldn't mean that there's something wrong with the way that people are conducting their surveys. Right.

[00:10:49]

And pollsters have to stop polling at some point. You can't poll on Election Day, which might actually have improved the accuracy of some of those polls that contributed to that perception in Wisconsin. That's always a potential. But a second way to think about this, and this is the nerdy part of this conversation, well, one of the nerdy parts, is that there are decisions pollsters make. There's a chain of decisions that pollsters make.

[00:11:17]

How are you finding your sample frame? Are you including landlines? There are some pollsters who only do cell phones. So if you're doing cell phones, how are you taking into account people who have area codes from outside of the area? Are you doing random digit dialing, or are you working from a list? Are you working from a voter list, where maybe for 50, 60 percent you have phone numbers? What is your sample frame? And each of these decisions either includes people you shouldn't include or, more likely, excludes people who shouldn't be excluded.

[00:11:56]

And that's just deciding on your sample frame. If you decide that this is a poll of registered voters, or that the first step in qualifying them as a likely voter is whether they're registered, you will be excluding people in states where there's same day registration. So they're not already registered, but they might be planning definitely to vote, and there are polls that would exclude them. Then how much farther do you go in terms of requiring people to be interested in the campaign and following the campaign, and whether they voted before, things that go into likely voter models, which can exclude people who shouldn't be excluded or include people who perhaps shouldn't be included?

[00:12:39]

I'll pause there.

[00:12:40]

But the same sorts of things go into how you weight the data literally every step of the way.

[00:12:47]

Yeah. So let's take it step by step and talk about your process since it worked for you this time. And of course, it's worked for you in the past. How do you go about figuring out what that sample frame is going to be and then discerning what a likely voter is?

[00:13:02]

This is going to be the simplest part of this conversation, because I tend to like elegant approaches rather than overcomplicating something and trying to outthink the electorate. So we do random digit dialing with landline phones and cell phones. We put a quota on what proportion we want to be cell phones. We have our phone bank do some household selection, but really kind of pay attention to getting the right balance of sex. And that's it. When they make first contact with the respondent, we ask them if they will definitely vote, probably vote, might or might not vote, probably not vote.

[00:13:44]

We define a likely voter as only those people who say they will definitely vote. If you tell us anything else, we're going to capture your age, your sex, your education level (in case we decide that we want to use that in weighting) and the county that you live in, so that we now have a general population sample. And we can weight to that. And from that, we extract those who said they would definitely vote, so that if older people are more likely to say they will definitely vote, they will show up in larger proportions in our likely voter subsample than they did in our general population sample.
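
To make that extraction step concrete, here is a minimal sketch in Python. It assumes the general population sample has already been weighted to population targets, and it assumes columns named vote_intent, age_group, and weight; the names and workflow are illustrative, not the Iowa Poll's actual code.

```python
import pandas as pd

def likely_voters(weighted_sample: pd.DataFrame) -> pd.DataFrame:
    """Keep only respondents who say they will *definitely* vote. Their
    general-population weights carry over unchanged, so any shift in the
    subsample's makeup comes from the respondents, not from a turnout model."""
    return weighted_sample[weighted_sample["vote_intent"] == "definitely"]

def weighted_share(df: pd.DataFrame, col: str) -> pd.Series:
    """Weighted composition of a sample on one variable (e.g. age_group)."""
    return df.groupby(col)["weight"].sum() / df["weight"].sum()

# Illustrative usage: if older respondents are more likely to say "definitely,"
# they make up a larger share of the likely-voter subsample than of the
# general-population sample, and this comparison will show it.
# lv = likely_voters(weighted_sample)
# print(weighted_share(weighted_sample, "age_group"))
# print(weighted_share(lv, "age_group"))
```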

[00:14:27]

So when you have that original data, one issue that comes up is that if response rates are super low, then the kinds of people who are liable to respond to pollsters and take the time to actually complete the survey may be different than, say, the ninety nine other people who said, no, I don't want to talk to the pollster. There may be traits about them that would make them more inclined to vote a certain way. So even if they are, for example, a white, non college educated rural voter, they might be a white, non college educated rural voter who's actually more inclined to vote a certain way than other people who fit those exact same demographic profiles, simply because of the self-selecting process of taking a poll.

[00:15:09]

Is that something that you have encountered? Do you think that's a challenge in polling, or is that overcomplicating things?

[00:15:16]

Well, I don't think that is a challenge we can do much about.

[00:15:19]

We may be fortunate that when we call, the interviewers say they're calling for the Iowa Poll, but the words Iowa Poll don't come up on their caller ID. So they would have had to answer the phone in the first place. I heard the idea that the response rate for all polls got a little bit of a boost with stay in place quarantine rules, that people are more likely to be at home, more available to participate in a poll that's coming in during the day or in the evening.

[00:15:52]

But Democrats are more likely to be observing the stay at home orders than Republicans, who sort of flouted them. And so perhaps this is a one-off of polling at this particular time. It's interesting to me, I don't know.

[00:16:08]

Yeah. So once you have this data, then, of course, you have to weight it, because as FiveThirtyEight Politics podcast listeners know, when you call a bunch of people, you're more inclined to get an older sample of the population, perhaps a whiter, more educated sample of the population. It's become very common now for pollsters to weight by education. Are there any other indicators that you use for weighting that other pollsters might not?

[00:16:34]

In fact, I find weighting by education to be dodgy. And here's why. We don't really have a way of gathering education information by county.

[00:16:46]

And of course, there's differential education rates by county. We only get that with the decennial census. And this is the year twenty twenty when those data are the most out of date. So when you have variables that sort of collide, when you add another variable to it, we know that education by race is not uniform. You can't presume the same level of education by race. And certainly when you add in geography at a more micro level, it just is very difficult to say that you've got it.

[00:17:22]

Education is also not fixed. And that's one of the reasons why, except for the decennial census, they don't report out education for eighteen to twenty four year olds because every year that could be changing for a large group in that age cohort. So it's not as precise, even though male female is less precise than maybe it used to be. But education is the least precise of the variables that we could use. We can get good estimates for how many people live in a county.

[00:17:52]

We can get good estimates for the ages, and age by sex; we can get all of that. But education is mushy. And I had found in working with it in the past that using it could throw off other things that I felt more certain about. And when we decided not to use education as a variable, it made our poll better.

[00:18:14]

So the poll that you did showing Trump leading by seven points at the end of October, you did not weight by education? That's correct. What were the things that you weighted by?

[00:18:25]

We weight by age and sex, and which congressional district in Iowa, we have just four. So that gives us a regional breakout.
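
For readers who want to see what weighting to those margins can look like in practice, here is a rough sketch of raking (iterative proportional fitting) to separate targets for age, sex, and congressional district. The interview does not say which algorithm Selzer and Company uses, so treat this as one common way to hit those margins, with made-up target shares, not as her firm's procedure.

```python
import pandas as pd

def rake(df: pd.DataFrame, margins: dict, n_iter: int = 25) -> pd.Series:
    """Iterative proportional fitting: `margins` maps a column name to
    {category: population share}. Returns one weight per respondent whose
    weighted marginals approximately match each set of targets."""
    w = pd.Series(1.0, index=df.index)
    for _ in range(n_iter):
        for col, target in margins.items():
            current = w.groupby(df[col]).sum() / w.sum()       # weighted sample shares
            adjust = {cat: target[cat] / current[cat] for cat in target}
            w = w * df[col].map(adjust)                         # scale each category
    return w * len(df) / w.sum()                                # normalize to mean 1.0

# Illustrative targets (not real census figures):
# margins = {
#     "age_group": {"18-34": 0.27, "35-54": 0.32, "55+": 0.41},
#     "sex": {"F": 0.51, "M": 0.49},
#     "district": {"IA-1": 0.25, "IA-2": 0.25, "IA-3": 0.26, "IA-4": 0.24},
# }
# poll["weight"] = rake(poll, margins)
```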

[00:18:34]

So we talked about conducting the surveys, gathering the data, weighting. Now, the final part here is the turnout model, deciding what exactly the electorate will look like. And we've talked about this in the past on this podcast, so I know your philosophy on this, which is hands off. But can you spell that out a little bit for our listeners? Yeah.

[00:18:56]

When I hear pollsters talk about deciding what the size and shape of the electorate is going to be, I just tend to be in awe of the thought that they think they could do that. So my approach is to let my data reveal to me what the size and the shape of the electorate will look like. The only way you could decide, I think, is if you're looking backwards at what's happened historically. And anybody who didn't think twenty twenty was going to break the norms for what has happened historically, I find that hard to fathom.

[00:19:36]

They wouldn't be expecting there to be change. So what do you do? You sort of roll the dice and decide, well, it's going to be more this and it's going to be more that. I don't know how you would make that decision. So I call their approach polling backward, and I call my approach polling forward. That is, I set things up so that if there is going to be a change in the proportion of college educated or the proportion of women or the proportion of suburban voters, my data will show it to me.

[00:20:09]

So essentially, is it fair to say that you don't have a turnout model that you use? You just kind of judge your polls based off of what the individual voters tell you?

[00:20:18]

I don't judge; I let the individual voters tell me. And I don't know if it's worth retelling the story that sort of made me confident in that approach. Go for it.

[00:20:30]

I know it's Obama, two thousand eight the Iowa caucuses, right? That's correct.

[00:20:35]

So in two thousand eight, our final poll, which was New Year's Day because of the early caucus that year, said that of the people attending the Democratic caucus, 60 percent of them, that would be their first caucus. And there was outrage in the campaign world, the non-Obama campaigns, all because we also showed a sizable lead for Barack Obama. I don't believe this. This is a crazy assumption that she's made. No one would think that it's going to be 60 percent.

[00:21:07]

Historically, it has been 20 percent, maybe 30 percent, but no one would put in their model 60 percent. So she's made outrageous assumptions. To which my answer was: I assumed nothing. My data showed me that of the people who were planning to go to caucus, nearly six in ten, and the entrance poll proved it out, that would be their first caucus. So the lesson from that is that those that were building models were assuming that all you needed to know about that caucus were the people who had attended before.

[00:21:43]

Oh, well, let's look at registered Democrats as well, because maybe more of them will show up. Those assumptions about what this future electorate would look like blinded them from seeing the Obama wave that was coming.

[00:21:57]

When you look back at your experience in twenty twenty, of course, the September results for your poll were significantly different from the October results. You had a poll at the end of September showing Trump and Biden roughly tied and Theresa Greenfield leading in the Senate race. Do you think that there was a significant change in opinion, seven points basically towards Trump, in that one month? Or do you think that you ran into some of the challenges in that previous poll that maybe other pollsters in the Midwest ran into?

[00:22:28]

I think it has less to do with that. This is just my experience with people and political thinking generally. It's not like there's a group of people out there flipping back and forth: I don't know what to do, I don't know what to do. I think it is more: I think I'll vote. No, I don't care about voting. Oh, I think I'd better vote. I don't know. So the expansion and contraction of the electorate, I think, earlier on is just hard to have some precision about, in terms of who actually is going to vote or not vote.

[00:22:58]

And so earlier in September, you would have been reaching a bunch of people who said, I don't think I'm going to vote. They wouldn't have made it through your likely voter filter, essentially, whereas at the end of October they might have; they could have said they were probably going to vote.

[00:23:12]

But we don't count that. So it's just that the electorate is a living, breathing organism, with people moving in and out. Stepping back away from the nitty gritty that we've delved into here:

[00:23:27]

What are your biggest maybe concerns about practices in the industry that you think can be improved going forward from your own experiences and from watching other pollsters handle this election?

[00:23:40]

My biggest concern, and I don't have direct ability to do much about it, is in sampling. And what we want is to have a sample frame of people who represent, you know, this ultimate electorate, such that each and every person in the electorate has an equal opportunity to be contacted by our poll. And I think, as we talked about earlier, that ability has eroded and eroded and eroded. And I am hoping, wishing, and if I could influence I'd be happy to, that the sampling firms who take a look at these questions will think long and hard about their ability to deliver to us telephone samples that get us closer back to the gold standard, where every voter has an equal chance to be contacted by our poll.

[00:24:35]

I know that firms are iterating with different methods for contacting people, using the Internet, for example, and using text messages, even using mailing envelopes and so on to try to interact with all different kinds of people in the electorate. Are there ways that you have iterated over the years and what do you think of these new ideas? Do you think, you know, try anything, see what works, or do you think that things are getting out of hand at all?

[00:25:02]

Well, I think people get excited by the idea of what they could do, and that pulls them away from what they should do. So the ability to email people: well, where do you get the list of people that you're going to email? There's no central repository, as there used to be for phone numbers, of all of the email addresses that are out there. And you might have two or three and I might have four or five. So there's no good way of doing any sort of recruitment by email. You could send something through the mail, because there is a repository of addresses that we can have access to.

[00:25:42]

The problem there becomes one of time sensitivity. And in my work, we want to finish polling as close to the election as we can, so that it can be turned around. It has led, and continues to lead, people to want to form a panel of respondents that they can rely on to answer an email and then go back and answer a poll repeatedly over time. I'm not a big fan of it, because that panel, again, is going to change in shape and size as people drop out and you recruit.

[00:26:16]

And that's another sort of living, breathing organism. And I worry about a thing called the instrument effect, that is, the fact of measurement changes what is being measured. So if you're recruited into a panel and now you know, from time to time, you're going to be asked your opinion, how does that change your everyday life? Does that change your exposure to political messaging? Does it change the way you talk about politics with other people?

[00:26:42]

Do you change your mind? You know, I think that potentially has an effect and would make that group in the panel different from the electorate at large.

[00:26:53]

How about in your own practice? Do you feel like you're constantly iterating or have you been able to stick to the bread and butter that's worked so far?

[00:27:02]

I believe I'm using the same methodology I used in 2008. And keep in mind, you're asking me about whether I make changes when I have been consistently accurate. So I haven't faced an emergency where there's something wrong in my method that I need to rethink. I worry about how long this method lasts.

[00:27:25]

I worry all the time about it because everything has gotten much more difficult.

[00:27:31]

Do you think that you have some advantages in polling Iowa, a relatively homogeneous state, or, as you said, being a known quantity? With the Iowa Poll, perhaps there's more trust in the Des Moines Register and the Iowa Poll than there might be in an outside or national firm. Do you think that there are certain advantages that are specific to your position in polling?

[00:27:53]

And after the election, there was the thought that every state needed its own Selzer and Company, that it needed an indigenous polling firm, because it must be that I just know Iowa so well. But I explained my method to you. There isn't anything that I add that I wouldn't do in other states. And in fact, I've polled in other states and used the same method there.

[00:28:16]

I think the trick, and it gets a little more complicated in other states, is being sure that the urban-rural mix is protected, so that if you're polling in Michigan, you're not letting Detroit and its suburbs so dominate that you don't end up being accurate outstate, in that case. So we've polled elsewhere using the same methodology. I don't think there's anything indigenous to Iowa that makes it different.

[00:28:45]

This isn't what our conversation has centered on, it's mostly been about polling itself, but there are a lot of questions in the aftermath of the twenty twenty election about what motivated voters to behave the way that they did. Why was turnout so high? Why did some people change their minds in certain geographies? There's something of a realignment of the inner suburbs, it seems like. Have you been able, through your polling, to make some judgments about why voters behaved the way that they did?

[00:29:17]

What motivated them in twenty twenty? I don't think I have an answer to that question. I think there certainly was more activity in twenty eighteen and that's going to lead to some reaction on the Democratic side to say, hey, we were successful, so let's keep at it. And there'd be a reaction on the Republican side to, oh, well, we can't take anything for granted.

[00:29:43]

I think what a lot of the media world, and maybe some of the pollsters, don't have a good enough handle on is why voters support Trump. And we've done a lot of focus groups with voters in the last few years for one of my clients, in order to really understand where there might be common ground and other kinds of things. And we recruited, for example, last year, I think, or perhaps the year before, people who had voted for Trump but who were less than certain to say they would vote to re-elect him.

[00:30:19]

So now they all have that in common at the table, and once they learn that, there's kind of a sigh of relief that they're not going to have to fight about it.

[00:30:27]

And when you start pressing on, what do you like, what do you like that's happened? They'll tick off a dozen things that they like that President Trump did. And then you say, well, is there another Republican that you think you could support? They've got nothing. They have nothing. There's no name that they can come up with. Well, so what about voting for a Democrat? Oh, no. And so this idea that there were people who were flipping: there might have been a share.

[00:30:58]

And I just don't know how big it is, of people who voted for President Trump who kind of gave him a shot, since politics is so ugly that at least that would be something different. And it could be that there was regret among those. I don't think it's a big segment of voters, but it could have been enough in places like Wisconsin and Michigan and Pennsylvania.

[00:31:22]

What are the main things that the Trump supporters in the focus groups that you mentioned will tick off? They like the wall, they like standing up to China, they liked the tariffs, they like the restrictions on immigration.

[00:31:39]

They were not uncomfortable with reducing refugee admissions into the country.

[00:31:45]

Interesting stuff. One thing that I'm curious about is how the coronavirus shaped this election. Of course, Iowa was experiencing a significant degree of case surges around the time of the election. And it looked like from the polling, at least nationally, Americans were pretty unhappy about the president's handling of the coronavirus crisis. I know from looking at some polling in Iowa that that seemed to be the case there as well.

[00:32:11]

Was that issue polling just wrong? Were people more satisfied with the circumstances of the pandemic than they were letting on? How did the electorate and polling interact with the pandemic?

[00:32:24]

Well, I don't know that I have a good answer to that.

[00:32:30]

The observation I will make is there was such a blitz of people putting information in my mailbox about how to request an absentee ballot. I had dozens of these things arrive. So, you know, the message I got is it might not be safe to vote on Election Day, there might be a long line of people. And I prefer to vote on Election Day, I love voting on Election Day, but I voted early by absentee ballot. And then I went by my voting place and there was no line there.

[00:33:04]

There was more traffic in and out as I kind of sat and watched for a while. And so it could be that there were people who decided to vote that day, people who had decided, well, I missed the cutoff for an absentee ballot, and then they drive by their polling place and everything looks to be business as normal, no long lines, in and out to get that done. So that's just one observation. I think the other, and you'll want to talk to a political pundit for this, is that the messaging from the Democratic Party about the coronavirus, and really perhaps other Trump issues as well.

[00:33:46]

Again, to my observation, it seemed to be: we hold some truths to be self-evident. Look at that. Look at what's happening. With the assumption that the person they're saying this to goes, oh, yes, that's terrible. And the person they're saying it to, if they're a Trump supporter, is going, yeah, yeah, yeah, I shouldn't have to wear a mask, this is my freedom. There's such a disconnect about how people were relating to that. I think we'll be studying the effect of the coronavirus on this election for decades.

[00:34:18]

Yeah, no, that's a good point. And we'll definitely look further into it on this podcast. Because another thing that we're contending with here, beyond trying to let people know that the polling error, while it was significant in some states, on average nationally did not show some kind of crisis in the industry, we're also contending with the fact that the twenty eighteen polling was super accurate. And so what happened in twenty twenty? As, you know, we've learned from this discussion,

[00:34:45]

We have ideas. We have theories.

[00:34:46]

We're going to learn more about it as pollsters continue to confront the challenge of getting a random sample and all sorts of things. To end things here: despite the fact that you have done some great, very accurate polling, and that we at FiveThirtyEight disagree with the suggestion that polling should be shunned from the national conversation, there are people out there who are questioning the usefulness of polling and suggesting that it be used less in media, or that it not be used at all, that horserace polling be ignored.

[00:35:20]

What do you see as your role in society as a pollster to respond to some of those suggestions?

[00:35:26]

My role in society. Again, this is sort of positioning polling as some sort of public service, public utility. The public is not my client. My client is the Des Moines Register and Mediacom. And these are media outlets, and they want polling to help tell the story. And imagine a world without polls that tell you who is getting traction and who's not getting traction. Imagine the twenty sixteen election if there were no polls saying that Donald Trump had a shot, because the way that campaign was covered was more from that point of view of, look at that.

[00:36:08]

Nobody would want that, right? That's my observation. That's my take. And so the surprise of a Trump victory, when you didn't have any polling to suggest that that was coming, I think that would be a much different kind of violation of expectations. So I think my job is obviously to serve my clients, and to do that, I believe that the best news I can give them as a pollster is my best shot at the truth. So when I think about my method, I'm always trying to think about: does this get me closer to the truth, or does this sort of make me veer off into something that might sound more complicated or difficult, but is less the direction to go?

[00:37:00]

And for those of us who are not pollsters, my final question relates to what it's like to publish a poll that runs counter to conventional wisdom, because you've done this plenty of times. I remember the weekend before the twenty twenty election, there were a lot of people saying, you know what?

[00:37:19]

Ann Selzer is a great pollster, and it's only proof that she's such a good pollster that she is publishing this outlier, because it's probably not true, but she's publishing it anyway. And good pollsters publish outliers. You got a lot of heat in 2008 running up to the caucuses because you showed Obama winning the caucuses. So, you know, this isn't your first rodeo doing something very publicly that counters the conventional wisdom. How do you contend with that? I mean, maybe this is like a life lesson: how do you feel, and how do you deal with the psychology of being out there in public, countering what everyone else's data is showing and what everyone else thinks?

[00:37:57]

Yeah, it's not my first rodeo, and I will tell you, it never gets any more comfortable to be there. I call it, you know, spending my time in the hot shot corral, because everybody's saying this is terrible, she's awful.

[00:38:12]

This is going to be the end of her at long last, whatever it is that they're saying. And I know that that's going to be the reaction when we see the data. We know that's going to be the reaction. But it's a few days, a few days of discomfort. And the little talk I have with myself is, look, we'll see what happens. I'll either be golden or a goat, hopefully golden. But, you know, if I'm the goat, OK, I think I can survive it and move forward in some way.

[00:38:43]

And there are lessons learned. But it's uncomfortable.

[00:38:47]

And I do have to have that little almost Stuart Smalley kind of chat with myself that I'll be OK no matter what.

[00:38:55]

Thank you so much for taking the time today to speak with me and talk to our listeners about some complicated stuff. And hopefully they learned a lot. I learned a lot as well. So thank you. My pleasure. J. Ann Selzer is the president of the polling firm Selzer and Company, which conducts the Des Moines Register's Iowa Poll.

[00:39:14]

Next, we're going to hear another perspective about the state of polling and its challenges. But first, today's podcast is brought to you by Nutrafol. 80 million men and women in the US experience thinning hair.

[00:39:26]

Yet it's still not really openly talked about, which can make going through it feel scary or stressful.

[00:39:31]

And that just adds to the problem in a time when self care is more important than ever. Every day is an opportunity to skip damaging styling tools and chemicals and focus on better hair growth from within.

[00:39:42]

Visit nutrafol.com and take their hair wellness quiz for customized product recommendations that put the power to grow thicker, stronger hair in your hands. When you subscribe, you'll receive monthly deliveries so you never miss a dose. Shipping is free and you can pause or cancel any time. In clinical studies, Nutrafol users saw thicker, stronger hair growth with less shedding in three to six months. You can grow thicker, healthier hair by going to nutrafol.com and using promo code five three eight to get 20 percent off their best offer anywhere: 20 percent off at nutrafol.com, spelled N-U-T-R-A-F-O-L dot com, promo code five three eight, for hair as strong as you are.

[00:40:25]

Today's podcast is brought to you by LightStream. The holidays are here, and this year, give yourself the gift of extra money in your pocket by paying off your credit card balances and saving with a credit card consolidation loan from LightStream. Roll your high-interest credit card payments into just one payment at a lower fixed rate. LightStream credit card consolidation loans have rates as low as five point nine five percent APR with autopay and excellent credit. Get a loan from five thousand dollars to one hundred thousand dollars, and there are absolutely no fees.

[00:40:54]

Listeners can save even more with an additional interest rate discount. The only way to get that discount is to go to lightstream.com/538. That's L-I-G-H-T-S-T-R-E-A-M dot com slash five three eight. Subject to credit approval. Rates range from five point nine five percent APR to nineteen point nine nine percent APR and include a point five percent autopay discount. Lowest rate requires excellent credit. Terms and conditions apply, and offers are subject to change without notice. Visit lightstream.com/538 for more information.

[00:41:31]

Patrick Murray is the founding director of the Monmouth University Polling Institute, which conducted polling in the 2020 election both nationally and in six battleground states around the country.

[00:41:42]

Welcome, Patrick. Good to be with you. Well, thanks for being here.

[00:41:46]

You did a lot of both national and state level polling in twenty twenty. And we know the big picture looking at the averages: in some states polls were pretty good; in others they were not good at all. So looking at how you did at Monmouth, can you tell me your assessment of your state and national level polling in twenty twenty?

[00:42:08]

Yeah, I think we had a pretty good bead on things until the last few weeks. And it's kind of interesting, because, you know, we did not see a lot of major movement in the polls throughout this entire campaign, but that might actually have been masking something that was going on at the very end in terms of turnout. You know, one of the things that polls have problems with is predicting an election with a likely voter model.

[00:42:37]

And I don't like to predict anything. So if you're talking about two thirds of the eligible electorate showing up to vote, going from one hundred thirty five million in 2016 to 160 million in 2020, that extra twenty five million, if it's skewed in some way or another, could throw off the polling results if people aren't talking to us. So, I mean, the long way of answering the question is: we were off, just like everybody else was off.

[00:43:01]

I'm looking at two particular demographic groups regionally that seem to be important. One is the Latino vote and the various aspects of the Latino vote, because it's not monolithic, as we know. When the results were coming in from Florida county by county on election night, I was looking at the results and saying, not far from where I would expect them, given what my polls said, not far from where I'd expect them. And then suddenly Miami-Dade came in and I said, oh, crap, that is not what we expected.

[00:43:30]

So what was going on there? And then when I looked at what was going on in the Midwest and our polls in Iowa and in Pennsylvania, the question seems to be the same thing that we hit four years ago: we might be getting things right in the suburbs and the urban areas, but we're missing the rural areas. And in Iowa, that's a big deal. In Pennsylvania, it's about a third of the electorate. But if you get that wrong and you're underestimating Trump's strength there, then that throws off your numbers by a few points.

[00:44:03]

And I think that's probably what happened, particularly when I compare it to Arizona and Georgia, where we were also off as well, but not off by the same magnitude as we were in Florida and in Pennsylvania. So, you know, the question is: is this a systemic error? Is this something that polling can fix, or is this something that's in the political culture, that maybe this is a blind spot for polling, something polling cannot be a tool to measure accurately?

[00:44:34]

And that's something that we have to actually consider very seriously.

[00:44:37]

So you think there's the possibility that, if it's simply difficult to reach a particular kind of voter, there's not really any weighting or adjusting of how you reach out to voters that could be done? You might just have to accept that pollsters won't be able to reach some people.

[00:44:56]

I think that's possible. And you might come up with some tools that are extracurricular to what polling is actually designed to do, to help kind of mediate that a bit. And I know some folks have been trying out things such as, you know, questions about your social circle and so forth.

[00:45:15]

But when we get the final results, and by final results I mean when we actually get the voter histories for twenty twenty and see who in our likely voter model actually voted and who didn't, and who among those who refused to talk to us (where we kept records of who they were) voted and who didn't, and we try to figure out whether the so-called shy Trump vote is a significant part of this miss in 2020, as it was not in 2016, that's going to be something that has to raise questions about the entire aspect of what polling can actually do and how accurate we can be.

[00:45:58]

And by shy Trump voter, you mean people who support Trump who just didn't want to talk to pollsters, not people who actively lied to pollsters about their preferences?

[00:46:06]

Right, because I still don't think that we see a lot of evidence of lying. We ask a lot of, you know, follow-up questions to see if people are consistent, or if their vote choice doesn't match other things that they say in the poll. And I still don't think we're getting lying. I think we're getting this idea that people just are unwilling to talk to us. And I think there are two different types of Trump voters, if this turns out to be the case.

[00:46:29]

There is the shy Trump voter who's truly a shy Trump voter, somebody who's not overly enthusiastic about Trump, is not thrilled about his behavior, does not talk about their support for Donald Trump in their everyday life, you know, at work. I heard about this in a lot of the qualitative work that I did, one-on-one interviews and focus groups at the very beginning of this election cycle, over a year and a half ago. And what I was hearing is that more and more people were saying that they knew friends or co-workers who were not talking about Donald Trump or their support for Trump, more so than even before the 2016 election, because now that he's president, it's become less acceptable in their social circle, in their workplace, whatever it happens to be, to talk about Donald Trump.

[00:47:11]

So that's one type of shy Trump voter. Then the other Trump voter that we might be missing is the anti-institutional Trump voter that Trump has been cultivating. They follow Donald Trump saying don't trust the media, don't trust institutions, polling is part of that whole structure, therefore don't talk to them at all and don't tell them what you're thinking. So I think those two groups, if they're sizable enough when we get our final results, are what threw us off.

[00:47:36]

And that's a real question about what polling can actually do to overcome that.

[00:47:41]

On the question of the Trump vote, the biggest polling errors were in places that largely supported Trump, right? The polling error in red states, states that ultimately went for Trump, was like twice the size of the polling error in blue states. It's not that those people would be ashamed of their political beliefs, because they're surrounded by people who share their political beliefs. So what do you actually think is going on? And, you know, we look at elections abroad where there are nationalist populist figures, and we don't see larger than average polling errors there or any kind of trend where they're systemically underestimated in the polling.

[00:48:17]

Do you feel confident that it is a shy Trump voter effect, or is it just one possibility amongst many?

[00:48:24]

Well, I think we also have a higher level of underlying distrust in government and what government can do. When you ask questions about whether your government should get involved in this or whether government should back off, and you look at international polling or cross-national polling, the United States generally comes up higher on that number. And we've had a populist leader who's played into that underlying level of opinion among some people, and that has actually fostered it. So I think that might be one of the reasons why we are different than some others in terms of underestimating the support for populism here, because the groups of people who support populists are much more inclined to distrust the system, distrust institutions, to start with.

[00:49:15]

It's not just a matter of populism. It's a matter of distrust of government and all the institutions that are related to it. So that's the media, and that's polling. So that's my working hypothesis on that, about why it's different. And then, why would it be different in the redder areas? That's why, not even just in the states that you mentioned, but even in Pennsylvania, as I said, I think we had a bigger miss in the most red areas of Pennsylvania than we did in the suburbs and the urban areas, because that message is reinforced.

[00:49:47]

If you live in a population where the vast majority of voters support that particular point of view, that you can't trust government, that you need to back away from government, you're hearing that message over and over again. Then you're more likely to get a polling miss in those particular areas.

[00:50:03]

But not that those people would be ashamed of their political beliefs, right?

[00:50:07]

So that's why I said there are two different types of shy Trump voters. There's the true shy Trump voter, who probably lives in suburban areas. And then there is the anti-establishment Trump voter who lives in those heavily red areas, who won't take part in polling. And that's the one that's probably more problematic than that other shy Trump voter, that suburban shy Trump vote. I would really have bet on that group disappearing, in terms of creating a polling problem, without Trump being part of the picture.

[00:50:38]

My concern is: has Trump permanently changed how people view government and institutions and polling in those deep red areas, where those people will now continue to excuse themselves from participation in normal political discourse?

[00:50:56]

Yeah. I mean, it seems like a lot of the conventional wisdom is converging around this idea of the latter, quote unquote, shy Trump voter that you mentioned. The former, the one that's in the suburbs but doesn't want to share their opinions: do we have good evidence that that person exists, or exists to a greater extent than someone who lives in a historically Republican suburb who plans on voting for Biden but doesn't want to talk to their Republican friends about it? You know, we don't have evidence that that exists in any great numbers.

[00:51:25]

And in fact, it might be the same type of shy Trump vote that we saw in 2016, which is, in a sense, saying: yes, in 2016 there were shy Trump voters out there, there just weren't enough of them to affect the polls, to throw the polls off.

[00:51:37]

And so I think that the true shy Trump voters in the suburban areas, the ones that don't want to talk about it, exist, but probably not in numbers enough that, if they were isolated, they would impact the polls.

[00:51:47]

I think the ones that we're really worried about are these very vocal Trump voters, those who have basically absented themselves from normal political discourse in these heavily red areas.

[00:51:59]

So do you take the position that polling is potentially just broken?

[00:52:04]

No, I don't think polling is broken. I think our political culture may be broken, and polling is reflecting that, to be quite frank, because polling can only do so much. Polling has always had a certain amount of error attached to it. And when we get into incredibly volatile situations, polling is less accurate. If we get into a situation where we don't know who's going to vote, polling is going to be less accurate. If we get to a situation that's caused by people saying, I won't participate in public life because, you know, I am anti-establishment, and we have a significant number of people who are willing to do that.

[00:52:43]

That's a reflection of ill health for the republic. And polling is going to be off because of that. But it's not because polling is broken; it's because the political culture itself is broken. That's what I'm actually more worried about. I'm not worried about polling being broken. I'm worried that polling has revealed some deeper problems that we have as a political society. And if we just focus on the election aspect as the only piece of it that matters, then we probably are missing a much bigger lesson that we need to learn from what's going on here right now.

[00:53:22]

Understanding your concerns about the broader political culture: you are a pollster, so what do you plan on doing going forward? What kind of iterating are you thinking of in order to try to reach people who may be more difficult to contact, you know, to make polling more accurate?

[00:53:38]

Yeah, I mean, we have to figure out who they are and whether it's like a certain demographic group and whether we're going to have to adjust for that.

[00:53:44]

That's why I said I'm looking at what seems to be that rural voter or industrial voter in the Midwest, outside of urban and suburban areas; that might be key. We're also looking at the problems that we've always had with surveying Latino voters, which just seemed to be exacerbated in this election. And again, there seems to be a Trump effect around that. So, first off, doing polling that's simply asking questions about: what do you think the state of the republic is right now?

[00:54:17]

You know, what's the state of our democracy? What do you think, what do you value? To go and get a sense of how much we have moved off of accepted norms, and then figure out to what extent that impacts polling and what we need to do to adjust to that. So I'm actually right now looking at a bigger picture than just the election polling itself, because, as I said, I think that what happened in the twenty twenty election just revealed a lot of things that we've been talking about all along, about the partisanship that we've been measuring throwing off other measures in polling that are, in the long run, much more important.

[00:54:50]

So you're just asking different questions, not necessarily trying to engage voters in a different way?

[00:54:55]

Yeah, right now I'm not sure exactly what, until I know who we're missing. And as I said, we need to be able to validate voter histories for our polling samples, in terms of who we talked to and who refused to talk to us, before we get a better sense of what we need to do to adjust for those groups. I mean, that's something that we're definitely going to do if that turns out to be a significant factor.

[00:55:17]

As we said, you know, I took a look at this in twenty sixteen and found that that wasn't the problem.

[00:55:23]

The problem was that people changed their minds at the last minute, and we didn't have education weighting correct. Those things were fixable. I do believe, or suspect at least, that we're going to find that we are missing a certain segment of voters, enough that it's going to throw the polls off by two or three percentage points, when we do an election that involves a very, very divisive, populist, authoritarian type of figure. And we're going to have to account for that.

[00:55:51]

The question is, does that disappear all on its own when you don't have that type of figure, you know, creating the lightning rod effect that Donald Trump seems to have? That's a question we're going to have to ask. And we won't be able to answer that one until we get to twenty twenty two.

[00:56:08]

And how would you adjust for that if it does bear out?

[00:56:12]

I think we're going to have to look at other things besides standard demographics. For example, a number of online panels, like Pew's online panel, use volunteerism as something that is related to nonresponse for their panel. And we might find that there's something more than demographics that's related to partisan-based nonresponse that we're going to have to adjust for, and I don't know what that factor is yet.

[00:56:41]

I think that's going to take a lot of examination on the part of pollsters to figure out if there's something else besides our standard demographics that we can use to adjust for the potential that a certain political or partisan skew exists in our non-response. That can be fixed with a weighting factor based on something that's not a typical demographic.

[00:57:03]

And how do you deal broadly with the issue of nonresponse rates? Because all kinds of people are not talking to pollsters. And so one of the arguments has been that it used to be maybe one out of five people that you called would respond to the poll. Now it's closer to one out of a hundred. And so all of a sudden that one person out of a hundred is no longer representative of the broader public, because they specifically are the kind of person to respond to a poll.

[00:57:31]

How do you deal with those issues more broadly, whether it is specifically a rural, Trump-supporting portion of the population or other sections of the population? Yeah, up till the past four years, I would say that didn't pose a problem in terms of creating nonresponse bias.

[00:57:52]

And there are a number of studies, Pew's being one of the leading ones, that suggested that as response rates were declining, it wasn't affecting the accuracy of polling. I think that has definitely changed over the last four years. Now, your question is, what are we doing about it?

[00:58:08]

I am looking for other ways to contact people.

[00:58:11]

So I actually have been experimenting with online email polling, emailing directly to voters where we have email addresses, and there's a whole range of problems with that, like making sure that the email actually matches the voter that you are looking for.

[00:58:26]

Also using SMS text polling, doing the poll via text. I'm finding that those are problematic as well, but we are getting different types of people who are willing to respond to different things. In the end, though, the telephone call still seems to be the best starting point for getting a more probabilistic and representative sample. But again, as you mentioned, there are still problems because, yes, the people we reach are significantly more likely to be the ones who want to talk to us.

[00:59:05]

And how are we getting the people who don't want to talk to us, if they are, in fact, skewed in a certain way politically, as they haven't been before? That's been the problem. And I guess the answer is, I don't have an answer.

[00:59:21]

We have been testing out a bunch of different methodologies, and none of them so far has proved fruitful in this environment, as I said. I'm worried that it disappears on its own in twenty twenty two and we say, aha, the polls fixed what was wrong, when we actually didn't fix anything that was wrong. It's just that the culture itself changed.

[00:59:43]

And do you have any ideas of other things you'd like to try apart from what you just mentioned? Yeah, as I said, we need different ways to try to get people to participate, and I think we need to find that non-demographic weighting factor that can best capture the nonresponse bias that seems to exist out there.

[01:00:07]

Because if we go back to a situation where the group of nonrespondents looks like the group of respondents and there's not a political skew there, then that weighting factor shouldn't matter at all. Then we should say, oh, we don't have to weight by this, because it's coming out exactly as we'd expect. It's only when it's off that we know, oh, then we're getting a nonresponse bias. And it's about identifying that factor, whether it's volunteerism itself, reports of volunteerism, or whether we can use their political and charitable donations. I don't know what these factors are going to be, but that's kind of the magic bullet that I think we're looking for right now.

[01:00:47]

One suggestion is that this actually has to do with the pandemic and that Democrats were sheltering in place, staying home, working from home, particularly people who are part of the knowledge economy, while Republicans may well have lived in states that didn't have strict restrictions, but also may not have been following them to the same degree. And, you know, pollsters did say that they saw an increase in response rates during the shutdowns.

[01:01:14]

I mean, could that be the one-off issue here, or do you think it goes deeper than the pandemic? That could be if you're doing random digit dial polling. But if you're pulling off a voter list where you know their partisan history beforehand, then that's not an issue. You are getting the right proportion of Democrats and Republicans and nonpartisans or independents. It's just, are you getting the right mix within those groups, even if you're selecting proportionally within those groups by region?

[01:01:49]

Because we know that people who live in different areas of partisan concentration behave differently: a Republican in a heavily Republican area versus a Republican where Republicans are in the minority, or where it's a much more competitive area. We know that they are different.

[01:02:04]

But even still, doing that, it seems that we're still missing something. So I don't think that in and of itself is the answer for what we see as missing in these twenty twenty state polls.
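
For context on what "getting the right proportion" from a voter file looks like in practice, here is a minimal Python sketch of proportional stratified sampling by party within region. The file layout, strata, and sizes are assumptions for illustration only; as noted above, this step alone does not solve nonresponse within each stratum.

```python
# Minimal sketch: proportional stratified sampling from a voter file,
# stratifying by party registration within region. The file layout and
# strata sizes here are illustrative assumptions, not real data.

import random

def stratified_sample(voter_file, total_n, seed=0):
    """Draw a sample whose party-by-region mix matches the file's mix."""
    rng = random.Random(seed)
    strata = {}
    for rec in voter_file:
        strata.setdefault((rec["region"], rec["party"]), []).append(rec)

    sample = []
    for members in strata.values():
        # Allocate interviews in proportion to each stratum's share of the file.
        k = round(total_n * len(members) / len(voter_file))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

# Tiny assumed voter file.
voter_file = (
    [{"region": "north", "party": "R"}] * 300
    + [{"region": "north", "party": "D"}] * 200
    + [{"region": "south", "party": "R"}] * 150
    + [{"region": "south", "party": "D"}] * 350
)

sample = stratified_sample(voter_file, total_n=100)
print(len(sample), "records drawn")
```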

[01:02:15]

Why don't you do random digit dialing? We do it for our national samples, just because we're getting a general sense of what the electorate looks like. But as I said, vote history has been significantly important for us in terms of our voter models in the past, particularly when we get to low-turnout elections, such as off-year elections and midterm elections.

[01:02:43]

So the only way to do that accurately is to pull off a voter list where we actually have that information, rather than asking people to self-report how often they vote, which we know they overreport because of social desirability.

[01:02:57]

So that's why we use list samples for our election polling, whereas we use RDD for most of our other general public policy polling.

[01:03:08]

You mentioned that part of the problem that you saw in your own polling was that you were OK for much of the election, but that you ultimately got the turnout model wrong. In speaking with Ann Selzer, who conducts the Iowa poll, she essentially said, I don't use a turnout model. You know, I just ask voters, do you plan on voting? And if they say definitely, then she accepts that at face value and that's about it. Is there any wisdom to that method of polling?

[01:03:35]

What is the use of creating a turnout model?

[01:03:39]

Yeah, I think Ann Selzer's approach is interesting, and I think it might show that there are different ways to look at voters depending on the political culture, whether it's Latino voters and different groups of Latinos, or whether it's different states.

[01:03:54]

So for the six states that I polled, I actually went back before you and I got on to talk, and I changed it. I threw out the likely voter model that I was using and just used Ann Selzer's approach.

[01:04:05]

The question that we ask: are you going to vote? Are you sure you're going to vote? And of the six states, in five the result didn't change from the likely voter model. The only one where I had a significant shift was the poll that I did in the middle of October in Iowa; it was a shift of five points, from a small lead for Joe Biden to a small lead for Donald Trump. So I think she's right. And I think that applies to Iowa.

[01:04:29]

But it doesn't apply to, say, New Jersey, a state I've been polling for 25 years since I started, which doesn't have that same kind of political culture. There you do need to use some type of vote history model to get a more accurate read of who's going to show up to vote and why.

[01:04:45]

So what's specific about a state like New Jersey or Florida that you need to do that? They don't have the kind of participatory culture, so you are less likely to get an honest answer from folks when you ask that first question up front. They have every intent to do so, but not really, and you look at their past voting history and say this is not a definite voter by any stretch of the imagination. Whereas in Iowa, I think it's more likely that that coalesces.

[01:05:19]

Their answers to those questions coalesce much more with their willingness to go out and vote, or whatever they have to do. I think it has to do with more densely populated states, more culturally diverse electorates. You have to throw all that into the mix and decide which model is going to work best in which area.
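
As a rough illustration of the two likely-voter approaches being contrasted here, the following Python sketch applies a self-report screen and a vote-history screen to the same toy records. The field names, thresholds, and records are hypothetical, not either pollster's actual model.

```python
# Minimal sketch: two ways to define "likely voters".
# All records, field names, and thresholds are illustrative assumptions.

from typing import Dict, List

voters: List[Dict] = [
    {"self_report": "definitely", "past_votes": 4, "choice": "A"},
    {"self_report": "definitely", "past_votes": 0, "choice": "B"},
    {"self_report": "probably",   "past_votes": 3, "choice": "A"},
    {"self_report": "unlikely",   "past_votes": 4, "choice": "B"},
]

def self_report_screen(sample):
    """Selzer-style: take 'definitely will vote' at face value."""
    return [v for v in sample if v["self_report"] == "definitely"]

def vote_history_screen(sample, min_past_votes=2):
    """History-based: require a minimum number of past votes on the file."""
    return [v for v in sample if v["past_votes"] >= min_past_votes]

def share(sample, candidate):
    return sum(1 for v in sample if v["choice"] == candidate) / len(sample)

for name, screen in [("self-report", self_report_screen),
                     ("vote history", vote_history_screen)]:
    likely = screen(voters)
    print(f"{name}: n={len(likely)}, candidate A share={share(likely, 'A'):.2f}")
```

The two screens can admit different people, and therefore produce different toplines, which is the practical stake of the choice discussed here.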

[01:05:42]

Do you feel like the conversation about polling right now nationally is questioning its usefulness in covering politics, or its usefulness altogether? How do you feel about it? At 538 we've already recorded podcasts where we've been pretty open that we think polling is still important and that people shouldn't be freaking out about polling errors, because they should be expected to happen. Where are you on that?

[01:06:07]

Yeah, I mean, look, obviously, this is what I do for a living, so I'm not going to try to sell my entire industry down the river here.

[01:06:14]

But, you know, we had some errors that were maybe slightly larger than normal because they were all on one side, but not much larger than normal.

[01:06:25]

We need to do a better job, and the media needs to do a better job, of understanding the level of uncertainty that is involved in polling and what polling can and can't do. So, for example, we go out there and say support for this policy is sixty five percent, and the real number is sixty one percent, or the real number is sixty nine percent. Most people would just brush that off.

[01:06:53]

That doesn't matter.

[01:06:54]

You're telling me about two thirds, somewhere close to two thirds, support this. That's great.

[01:06:59]

But when we get to an election where we say somebody is ahead by five points and they only win by one point, then suddenly polling is off, you can't use polling. Well, maybe your expectations for polling an election should be the same as your expectations for what polling can do in its everyday use: measuring support for different policies, understanding people's needs for different things in their lives, which issues are most important at a certain point in time.

[01:07:35]

That's where we need to do a better job. One, we always need to try to improve polling and make it as accurate as possible, and we've been successful in doing that over the years.

[01:07:47]

But it's never going to be as close as people want it to be, particularly when we're talking about national elections right now, which continue to be razor thin in terms of polling numbers. I mean, a four point or five point win is considered a big win. But in polling terms, that's a margin of error win. And that's what we consistently have. We don't have times now where we're going to have a president of the United States winning by ten points; that's gone.

[01:08:17]

So that simply means that polling might not be as precise as you want it to be to measure this type of environment.
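
For a sense of why a four or five point lead can be "a margin of error win," here is a minimal Python sketch using the standard normal-approximation formula for a proportion's 95 percent margin of error, with an assumed sample size of 800. The doubling of the margin for the lead holds roughly when nearly all respondents pick one of the two candidates.

```python
# Minimal sketch: 95 percent margin of error under the normal approximation,
# for a single proportion and (roughly) for a two-candidate lead.
# The sample size and proportion are assumed for illustration.

from math import sqrt

def moe_proportion(p: float, n: int, z: float = 1.96) -> float:
    """Margin of error for an estimated proportion p from n interviews."""
    return z * sqrt(p * (1 - p) / n)

n = 800
p = 0.52  # one candidate's share in a roughly even two-way race

moe = moe_proportion(p, n)
# When the two candidates' shares sum to nearly one, the margin on the
# lead (p_A - p_B) is about twice the margin on a single proportion.
lead_moe = 2 * moe

print(f"single proportion: +/- {moe * 100:.1f} points")
print(f"lead between two candidates: +/- {lead_moe * 100:.1f} points")
```

With 800 interviews, the margin on the lead comes out to roughly plus or minus seven points, so a four or five point lead sits comfortably inside it.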

[01:08:24]

You mentioned the potential inaccuracies in issue polling. Do you worry that some of the issue polling may be off to the point where parties embrace policies that they think are popular but aren't actually?

[01:08:40]

No, actually, I don't think so. Right now, as we're looking at it, the polling is off by three or four points at most. So let's say you have a poll that comes out and Donald Trump's job approval rating is 42 percent. But in reality, if we could interview everybody in the country, it would be 46 percent.

[01:08:59]

And that's a consistent, systematic error, not just a random error that changes with each poll; systematically, that would be what happens in each poll.

[01:09:09]

That doesn't change much at all in terms of what I think the public would do or what our political leaders would do. What does actually worry me, though, is why a certain group of the public, four or five percent, is consistently just bowing out of participating in polls. Is it just polls? Probably not. So what are they saying? Is it because they distrust the media? Is it because they distrust government?

[01:09:38]

And if you get to that level, then what? Then polling is telling us an answer to a question that we might not want to ask, which is.

[01:09:50]

Do we have enough public faith in the institutions for those institutions to remain healthy through this? That's what I'm concerned about in the larger picture of what happened in twenty twenty. It kind of revealed that that might be something we're missing in measuring, more so than a four or five point miss on an election outcome.

[01:10:11]

Wrapping things up here. What lessons do you hope that we all take away from the 2020 election in regards to polling and how we use it?

[01:10:23]

Yeah, I mean, I'm not overly optimistic, because I think the people who are screaming the loudest that they'll never trust another poll again are the ones who are online first thing in the morning looking for the next poll. So those folks you've got to throw out.

[01:10:40]

But I think we're so fixated on the election itself and election polling that we may be missing a bigger issue that polling is revealing in terms of a change in our entire political culture, in the way we view ourselves, in the way we talk about politics and talk about things that are not political but have now become political, because everything is now highly charged, partisan, and tribal. All these things that we've been looking at and talking about may have actually affected polling a lot more than we realize.

[01:11:21]

And I'm worried that just focusing on the election itself might point us in the wrong direction, particularly if the election polls in twenty twenty two are right. Then we might have missed this underlying thing that will continue to exist, something else going on in the body politic that could raise its head again, and we would miss it again if we didn't address that deeper problem.

[01:11:51]

All right. Thank you so much for sharing your time with me today. I really appreciate it.

[01:11:55]

Oh, it's my pleasure, Galen. Look forward to listening to it. Patrick Murray is the founding director of the Monmouth University Polling Institute.

[01:12:04]

Well, that does it for today's podcast. Two different perspectives on polling and the challenges of polling in twenty twenty. Hopefully you learned something. I certainly did, but that's it for now. We'll see you again soon. My name is Galen Druke. Tony Chow is in the virtual control room. Claire Bidigare-Curtis is on audio editing. You can get in touch by emailing us at podcast at five thirty eight dot com. You can also, of course, tweet at us with any questions or comments.

[01:12:29]

If you're a fan of the show, leave us a rating or review in the Apple podcast store or tell someone about us. Thanks for listening and we'll see you soon.