Transcribe your podcast
[00:00:06]

Hello and welcome to the 538 Politics podcast.

[00:00:10]

I'm Galen Druke. I'm Nate Silver. And this is Model Talk.

[00:00:17]

We are recording this on Thursday. But if you're hearing this, it means that it's at least Friday and that we have officially launched our Senate forecast model. So I encourage everyone to go check it out on FiveThirtyEight.com. But we're going to discuss it here. And the topline numbers are: Republicans have a forty two percent chance of maintaining control of the Senate and Democrats have a fifty eight percent chance of winning control of the chamber. That's according to our Deluxe model, which we'll get into. The single likeliest outcome, according to the forecast, is a 50-50 split in the chamber.

[00:00:54]

And, of course, the vice president breaks a tie in the Senate. Nathaniel Silver, the first question I have for you is: have you pressed record on your microphone? On my microphone or on my computer?

[00:01:06]

You know, my computer, which it's connected to, has been recording for upwards of two minutes now.

[00:01:12]

All right. All right. Without incident. Yeah. You can't fail me twice like that. That's got to be the only time this election season.

[00:01:20]

What happens when you don't record it? So Tony records a backup Skype audio file, which is pretty poor quality, and which also requires us to go through and cut out everyone else's voice from your audio in that file. OK, so it's more work and the audio quality suffers, but we didn't have to retape the entire podcast, in case anyone was wondering about what happened on Monday.

[00:01:46]

Well, that's good. Yeah. OK, now that we have cleared that up, how would you describe the state of the race for the Senate, now that you've completed the forecast and can see the output? What's the state of the race?

[00:02:00]

The state of the race is that Democrats are in a better position than they probably thought they would be a year ago. There are an awful lot of competitive seats, but by no means is it a done deal. There are basically no seats where you would guarantee a Democratic pickup. There is one seat in particular, in Alabama, where they're likely to lose ground, or lose.

[00:02:25]

Doug Jones probably will not be re-elected running against a non Roy Moore candidate. So that means Democrats might need to pick up four or five seats, depending on whether they win the presidency, which is a pretty tall order.

[00:02:37]

There's also a kind of contrast between the polls and other indicators of the race. So if you look at our forecast, there's actually a return from twenty eighteen: there are three different versions, called Lite, Classic and Deluxe.

[00:02:53]

And "Deluxe" makes me hungry.

[00:02:55]

We're taping this just before lunchtime; I could use a burger. The versions layer different information on top of one another. Lite, as much as possible, is based on polls. Classic adds in fundamentals, factors like fundraising, for example. And then Deluxe also adds in expert projections. So based on the polls alone, Democrats have a sixty nine percent chance, a pretty nice chance, of winning the Senate in the Lite version. In Classic that goes down to sixty four, and then in Deluxe it goes down to fifty eight.
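
To make the layering concrete, here is a toy sketch, not FiveThirtyEight's actual code, of how successive ingredients could be blended into one probability. The weights and the component probabilities are invented for illustration only.

```python
# Illustrative sketch only -- not FiveThirtyEight's actual model.
# Each version layers one more ingredient on top of the last:
# Lite ~ polls; Classic ~ polls + fundamentals; Deluxe ~ + expert ratings.

def blend(components):
    """Weighted average of (estimate, weight) pairs."""
    total = sum(w for _, w in components)
    return sum(est * w for est, w in components) / total

# Hypothetical win probabilities for the Senate from each ingredient.
polls_only   = 0.69   # what the polling averages alone imply
fundamentals = 0.55   # partisan lean, fundraising, incumbency, etc.
experts      = 0.45   # Cook/Sabato-style ratings, mapped to a probability

lite    = blend([(polls_only, 1.0)])
classic = blend([(polls_only, 1.0), (fundamentals, 0.6)])
deluxe  = blend([(polls_only, 1.0), (fundamentals, 0.6), (experts, 0.5)])

print(lite, classic, deluxe)
```

With these made-up weights the blend happens to land near the 69/64/58 figures quoted above, but that is an illustration of the layering idea, not how the real model combines its inputs.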

[00:03:28]

So part of what's happening here is that, I mean, Democrats' polls in a lot of these Senate races look pretty good, right? There was a poll earlier this week that had Lindsey Graham tied in South Carolina with Jaime Harrison. And that's an interesting poll for Democrats, right? The model is essentially saying that, yeah, that's a great poll for Democrats, but based on other things, like the partisanship of the state or fundraising or whatever else, we don't really trust that Lindsey Graham is actually going to lose that race. There's a chance.

[00:03:59]

Right. But like, I think the Deluxe forecast has him with like a 15 percent chance of losing, which is not trivial. It's higher than a lot of people thought. But, you know, Democrats are competing against a lot of incumbents in a lot of red states and states that Trump will probably win.

[00:04:15]

And so it's not an easy lift necessarily, even though incumbents aren't nearly as advantaged as they once were. I mean, look, you have some pretty easy pickups for Democrats right now, like Colorado and Arizona, probably the two easiest.

[00:04:31]

You know, Mark Kelly has been ahead in almost every poll, although McSally has tightened the margins a little bit in some polls. Colorado is a very purple state now.

[00:04:42]

Maine is not as easy a pickup.

[00:04:47]

I mean, Susan Collins is kind of an institution there. She has been popular for a long time. Her popularity has gone way down. But an incumbent who won by whatever it was, twenty five or thirty points last time, I think, should not be written off.

[00:05:00]

Right. But still, it's a state where Gideon's polling has been strong, and Biden has been strong in Maine as well. One question I have to ask, though, is that in 2018, we saw that the partisanship of the state, whether or not the state voted for Trump in the 2016 election, essentially overrode any thoughts of an incumbent advantage. We saw that in a lot of states. Right.

[00:05:24]

Ultimately, even if there was a Democratic incumbent, like in North Dakota or Indiana, the Republican challenger ultimately prevailed. When we look at states like North Carolina, maybe a bit more purple, but Iowa or Montana, some of the states that Democrats could be relying on in order to win the Senate, those are fundamentally red states. Is there a reason that we should think things will go differently in twenty twenty from the way that they did in twenty eighteen in terms of state partisanship really overriding everything?

[00:05:52]

No, I mean, it probably will, right? I mean, there is some degree of ticket splitting. But if we look at the Senate map, right, then OK: Democrats are favored to win the presidential race in Colorado, Maine and Arizona. Those are places where Biden's at least 60 percent to win. Biden has almost no chance of winning Alabama. So take those three, right, and Democrats are at a net two. Now, North Carolina, Biden is favored, but that's much closer, for example.

[00:06:23]

But he is favored there. So that would make it net three, which would just barely be enough, right? But then there are two races in Georgia, where we have Trump a little bit ahead, but very close. There's a race in Iowa with Trump a little bit ahead, but very close. Texas, Trump a little bit ahead, but very close, right? So, you know, I mean, this is kind of a situation where a really narrow Biden win.

[00:06:46]

Let's say that Biden wins Pennsylvania.

[00:06:51]

By the way, there aren't a lot of Senate races in the Midwest this year; Democrats are playing defense in them anyway, right? Let's say Biden has a good but not great night, and that consists of winning Pennsylvania, Michigan, Wisconsin and Arizona, all states that Clinton did not win in 2016. That's a comfortable-ish Biden margin. It might translate to like a four or five point win in the popular vote for Biden. But let's say he loses Florida, North Carolina, Iowa, Texas and Georgia.

[00:07:21]

And if every Senate race follows the presidential result, then Democrats actually don't win the Senate. It's 51 to 49 for Republicans; they're one seat short. And so, you know, the Senate is a heavier lift for Democrats than the presidency.

[00:07:35]

It's not crazy to think that the races behave in a quirky way and somehow Trump wins and Democrats take the Senate. It's not, like, crazy crazy. But that's less likely than the other way around.

[00:07:48]

Where does it look like the races are most likely to diverge in terms of the presidency and the Senate? I think, for example, looking at Texas, right, Biden has a much better chance there than the Democratic challenger, MJ Hegar. Then again in Maine, we already discussed that Susan Collins has a better chance of winning Maine than Trump does. Yeah. Are those the two main ticket-splitting contests? We see maybe Montana as well. Are there other ones?

[00:08:13]

Yeah. I mean, you know, Democrats would say, hey, look, Montana, we have the current governor there. He's very popular. He's a moderate. Our other senator is a Democrat, Jon Tester, and he won re-election last time.

[00:08:26]

So, you know, it might not be crazy. In North Carolina, the Democrat, Cal Cunningham, has generally polled a bit more strongly than Biden and has had clear leads. The Republican incumbent there, Thom Tillis, has a fairly low approval rating. He has not raised a lot of money.

[00:08:40]

He barely won last time. He might just kind of be like a classic kind of weak incumbent.

[00:08:45]

Then you get some more exotic things. I think with Lindsey Graham, I actually would probably be a buyer of our 15 percent for Jaime Harrison.

[00:08:54]

It's interesting, because our kind of model, as always, sees Lindsey Graham as a fairly strong incumbent.

[00:09:00]

But I think, like, kind of being seen as kind of a moderate squish and then being really MAGA. You know, for someone who covers politics, I'm never sure if it's pronounced MAY-ga or MAH-ga, or hashtag MAGA. I think it depends on your regional accent.

[00:09:14]

As a Midwesterner, I think you'd say MAH-ga. I'm not sure that Lindsey Graham has made anybody in that state happy, right? But there are.

[00:09:22]

Yeah, I mean, look, there are all these races that are quirky, and sometimes quirks do happen, right? Democrats had a very good midterm in twenty eighteen.

[00:09:31]

However, Bill Nelson lost in Florida, still a very consequential loss.

[00:09:36]

He was an incumbent in an environment that was Democrat plus eight or plus nine nationally. And yes, Florida has been frustrating for Democrats, but still, that was kind of a surprise. Tester and Joe Manchin held on when Claire McCaskill and all the other people did not. Right.

[00:09:52]

And so Senate races can march to their own drummer a little bit. Also in Georgia, there are two elections, including a special election with Kelly Loeffler, among other Republicans, and several Democrats running. So there could be quirks on the map. And there are places where, you know, in Michigan, Republicans have a candidate, John James, who was pretty vigorous in twenty eighteen. He's running again against Gary Peters, who is not the most spectacular incumbent.

[00:10:18]

So there are quirks on either side here.

[00:10:21]

You mentioned that the national environment in twenty eighteen was Democrat plus eight or nine. How do we expect the national environment to compare with that this fall? And of course, it's worth pointing out to listeners who maybe haven't paid copious amounts of attention to the Senate map that the map just looks better for Democrats this year. They're more on the attack than on the defense, which is where they were in twenty eighteen. So it's not just a case of the environment mattering; it's also which seats are up.

[00:10:50]

But in general, in terms of the environment, what do we expect this year?

[00:10:54]

So Democrats are ahead by about six points on the generic congressional ballot.

[00:10:59]

So that's just asking which party you'd prefer to have control of Congress, or who you'd vote for in your district. That six point lead compares to the eight or nine point lead Democrats finished with in twenty eighteen. So the environment is not quite as good for them as it was.

[00:11:14]

Look, so part of it is that this is the echo year of 2014, when Republicans won everything that moved, basically, in the Senate. But still, remember that the Senate has a pretty strong built-in skew by design, where Wyoming has as many senators as California. That tends to overweight rural states relative to their share of the population.

[00:11:36]

And rural states tend to be more Republican. So there are all these targets that Democrats have in Georgia and Texas and Arizona, you know, South Carolina, right? With the exception maybe of Arizona, which is truly becoming more of a purple state, these are all red-leaning states. They are red-leaning states in an environment where you're ahead six on the generic ballot and Biden's ahead seven, right, where you can make them very competitive.

[00:12:01]

But they're not easy places to win for Democrats.

When it comes to liberal or progressive Democrats who think about this fall as their opportunity to pass a bunch of more progressive or liberal legislation.

[00:12:15]

Should they be looking at the Democrats' odds in the Senate as an indicator of whether or not they can pass that agenda, more than at Joe Biden's odds? So Joe Biden's odds of winning right now are 75 percent, but our Deluxe version shows that Democrats have about a 58 percent chance of winning the Senate. Are those the actual odds that progressive Democrats will be able to implement their agenda?

[00:12:39]

No, look, if you're a progressive Democrat, then you have two or three problems, right? One of which is Joe Manchin. Maybe Kyrsten Sinema, who in the Arizona tradition is often pretty maverick-y.

[00:12:55]

You know, if Doug Jones somehow wins reelection, I think he'll have to decide whether he's hoping to win re-election again in six years, or he's like, screw it, I'm just going to vote for the progressive agenda.

[00:13:06]

So less than that 58 percent is what you're saying. Like, they'd actually have to win a sizable majority, not just pull even.

[00:13:12]

I mean, everything matters at the margin. You know, I think, with 52 you can do more than with 50. With 54, you can do a lot more than with 52. Right. Once you get to fifty four or so, then you have a comfortable margin where Joe Manchin and Kyrsten Sinema can do their thing, and you can have one person object for weird reasons and someone else get covid and not be able to vote. Right.

[00:13:36]

And you still have 50. And Vice President Harris breaks a tie. Right.

[00:13:41]

And this is also assuming that they blow up the filibuster. But just to give listeners a sense, what are the chances that Democrats get fifty four seats or more in the Senate?

[00:13:51]

So in Deluxe, there's a 15 percent chance of having 54 or more. Fifty two or more gets up to like a, you know, thirty three percent chance or something. Right.

[00:14:03]

So if you can somehow get to 52 and keep everyone except for Manchin and Sinema in line, then you can do some things.

[00:14:12]

So again, the odds of that happening are more or less similar to the odds of Trump winning in 2016. So if you believe that things that have a 30 percent chance of happening can happen, then there you have it. But I want to dig into a little more of the nerdy aspect of this for a second. How much did the Senate forecast model change since 2018?

So, not really very much.

[00:14:36]

We did a couple of things, though, and these are going to be fairly technical.

[00:14:42]

One is the kind of fundamentals that we talked about before. We re-estimated those equations to account for changes as a result of partisanship. You know, basically now a state's partisan lean, or a district's, when we release the House model, matters even more. Factors that we call candidate quality, like, you know, experience, whether you had a scandal or not, and fundraising, those matter less than they once did, as a result of higher polarization and partisanship.

[00:15:11]

That's one change; it doesn't have huge effects, but some effects. Another change is we are estimating house effects a bit differently, where mostly we look at how a poll compares to others of the same state or district, and there's not as much crossover permitted between states, because there are some pollsters that might have a different view of the electorate, right? And they're very friendly to Democrats in some states, not so friendly in others. And just empirically, it works better if you kind of keep those house effects mostly contained within the state. Then there are a few changes as a result of COVID that kind of mirror changes in the presidential model.
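
The within-state house-effect idea could be sketched roughly like this. The pollsters and margins are hypothetical, and the real estimation is more sophisticated; this just shows the "no crossover between states" point.

```python
# Illustrative sketch only -- not FiveThirtyEight's actual method.
# A pollster's "house effect" is estimated against other polls of the
# SAME state, rather than pooled across all states.
from collections import defaultdict

# (state, pollster, Dem margin in points) -- hypothetical numbers
polls = [
    ("SC", "Pollster A", 1), ("SC", "Pollster B", -7), ("SC", "Pollster C", -6),
    ("ME", "Pollster A", 8), ("ME", "Pollster B", 4),
]

def house_effects(polls):
    by_state = defaultdict(list)
    for state, _, margin in polls:
        by_state[state].append(margin)
    state_avg = {s: sum(ms) / len(ms) for s, ms in by_state.items()}

    # Each pollster's effect = average deviation from its state's average,
    # computed separately per state (no crossover between states).
    devs = defaultdict(lambda: defaultdict(list))
    for state, pollster, margin in polls:
        devs[pollster][state].append(margin - state_avg[state])
    return {p: {s: sum(d) / len(d) for s, d in by.items()}
            for p, by in devs.items()}

fx = house_effects(polls)
# Pollster A leans Dem by +5 in SC but only +2 in ME -- kept separate.
```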

[00:15:50]

We assume there's a bit more uncertainty on Election Day with respect to turnout and the voting margin: specifically, about 20 percent more uncertainty than the default version of the model. But there aren't as many kind of complicated heuristics around covid as we use for the presidential model. So those are the main things. None of them are hugely consequential. It does make the model a teensy tiny bit more conservative. So maybe, you know, Democrats are at 58 when they would have been at 61 or something, but it's not really going to make that much difference.
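
As a rough illustration of why extra uncertainty makes the model more conservative: if the final margin is treated as normally distributed around the polling lead, widening the standard deviation by 20 percent pulls the win probability toward 50/50. The lead and sigma below are invented, not the model's actual parameters.

```python
# Illustrative sketch, not the actual model: more uncertainty means
# the same lead converts to a win probability closer to a coin flip.
import math

def win_prob(lead, sigma):
    """P(candidate wins) if the final margin ~ Normal(lead, sigma)."""
    return 0.5 * (1 + math.erf(lead / (sigma * math.sqrt(2))))

lead = 2.0                         # hypothetical polling lead, in points
print(win_prob(lead, 7.0))         # default uncertainty
print(win_prob(lead, 7.0 * 1.2))   # ~20% more uncertainty: closer to 0.5
```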

[00:16:24]

And when we look at the polling across the country in the Senate contests, how does it compare to past cycles? Are we getting a lot of Senate polling? More than twenty eighteen? Less? Is there less focus on the Senate because of the presidential race?

[00:16:39]

I mean, it's decently well covered. In general, a lot of pollsters learned their lesson. So ABC News, our parent company, and The Washington Post, right, they hardly ever did state polls last cycle. They're doing like several waves of state polls this year. Our friends at The Upshot and CNN are doing waves of state polls. You know, there are these Kaiser Family Foundation state polls. I mean, definitely, right, there are the presidential states where you get the double dip.

[00:17:04]

Right. You know, in North Carolina and Arizona. Then you have, you know, Minnesota, with the Democrat defending her seat. In those states you're getting a lot of double dipping. Then there are states where you have a competitive Senate race but not a competitive presidential race, you know, like South Carolina. At this point, Colorado is barely competitive in the presidential race. Montana is probably not competitive in the presidential race. I mean, Montana has been a little bit under-polled, but it's not terrible.

[00:17:33]

I mean, pollsters got the message that, hey, state polls have a lot more utility than national polls. And one reason for that is that in some of those states, you happen to have a competitive Senate race, too.

[00:17:44]

So I want to move on and ask some listener questions. But before we do that, just a general question. When is the House forecast coming out? And can you preview Republicans' chances of winning the House?

[00:17:59]

That model's about 98 percent done, so it's going to come out relatively soon. When we finally sign off on it, it's not going to say anything surprising. I mean, Republicans are, how to put it.

[00:18:12]

They're barely even putting up an effort to contest enough House seats to really make it competitive. It's a narrower playing field than in twenty eighteen, and they could win it. But that probably involves a case where, like, just everything is going to hell for Democrats, the polls are way off and it's a Democratic nightmare. In any scenario where Democrats win the Senate, they're very likely to retain the House as well, probably by a fairly large margin.

[00:18:42]

And in fact, Democrats might retain the House even in the event of a Trump presidency. Trump's odds are better by some margin than Republicans' odds in the House.

[00:18:53]

All right. Let's get to some listener questions.

[00:18:54]

But first, today's podcast is brought to you by DraftKings. There's no better place to get in on all the action than with DraftKings, the leader in one-day fantasy sports. DraftKings has millions of dollars in total prizes up for grabs. If you haven't tried DraftKings yet, head to the app store and download it now, because you don't want to miss this. Draft your lineup and feel the sweat like never before. Every run, every pass and every catch means more with DraftKings.

[00:19:19]

It's simple. Just pick your lineup, stay under the salary cap and see how your team stacks up against the competition. Nothing adds to the excitement of watching the game quite like having a shot at a million dollars in prizes. Download the DraftKings app now and use code five three eight. For a limited time, new users can get a free shot at millions of dollars in prizes. Enter code five three eight to get a free shot at millions of dollars in prizes with your first deposit.

[00:19:45]

Again, that's code five three eight, only at DraftKings. Minimum five dollar deposit required. Eligibility restrictions apply. See DraftKings.com for details.

[00:19:57]

Today's podcast is also brought to you by LightStream. If you want to save money this summer, why not start by paying less interest on your credit card balances? You can refinance with a credit card consolidation loan from LightStream. It's an easy way to save hundreds to thousands of dollars and lower your interest rate. LightStream offers fixed-rate credit card consolidation loans from five point nine five percent APR with AutoPay and excellent credit. That's lower than the average credit card interest rate of over 19 percent APR.

[00:20:24]

Get a loan from five thousand dollars to one hundred thousand dollars, and you can even get your money as soon as the day you apply. Listeners can save even more with an additional interest rate discount. The only way to get that discount is to go to LightStream.com slash five three eight. That's L-I-G-H-T-S-T-R-E-A-M dot com slash five three eight. Subject to credit approval. Rate includes a point five percent AutoPay discount. Terms and conditions apply, and offers are subject to change without notice.

[00:20:53]

Visit LightStream.com slash five three eight for more information.

[00:20:59]

We again got a lot of listener questions, so thanks to everyone for sending those in. Our first question comes from Noah, and I think this might be a question that you've gotten a lot. So here we go: Joe Biden's odds of winning have gone up, according to the forecast, even as his lead in the national polling hasn't really changed. Why is this? And then just to give an example of the numbers: at the end of August, Biden had a 67 percent chance of winning.

[00:21:25]

Now he has a seventy five percent chance of winning. Whereas at the end of August he had a seven point one point lead nationally, now he has a six point nine point lead. So not much of a difference in the national lead, but a noticeable difference in his odds of winning the presidency. So what's going on?

[00:21:41]

There's a main thing and there's a secondary thing. The main thing is that as you get closer to the election, then there's less uncertainty. As we kind of famously talked about. We think this is an election that comes with fairly high uncertainty for various reasons. But a lot of that is based on the notion that with all the crazy news and all the crazy economic data that, like things could just be very volatile. And instead the polls have been fairly steady.

[00:22:05]

So every day that goes by without Trump closing the race is a good day for Biden. I think our model, with the current standing of the polls, will get up to, you know, Biden being, I don't know the number exactly, probably 88-ish or 90 percent favorite on Election Day with his current lead in the polls. So the passage of time helps Biden. Keep in mind, too, our model has a prior that expects the race to tighten. So if the race does tighten a bit, that's in line with the model's expectations.

[00:22:35]

If it doesn't tighten, then Biden may actually be gaining relative to where the model thought the race would be.

[00:22:40]

That's the main thing.
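
That passage-of-time effect can be illustrated with a simple normal approximation. The base uncertainty and per-day drift numbers below are invented, not the model's actual parameters, but they show how a steady seven-point lead becomes a stronger favorite as the days tick away.

```python
# Illustrative sketch: the same lead implies a higher win probability
# as Election Day approaches, because there is less time for movement.
import math

def win_prob(lead, sigma):
    """P(candidate wins) if the final margin ~ Normal(lead, sigma)."""
    return 0.5 * (1 + math.erf(lead / (sigma * math.sqrt(2))))

def sigma_at(days_out, base=6.0, drift_per_day=1.2):
    # Hypothetical: variance grows linearly with time remaining,
    # so uncertainty grows with the square root of days left.
    return math.sqrt(base**2 + drift_per_day * days_out)

lead = 7.0  # roughly Biden's national lead in the discussion
for days in (60, 30, 0):
    print(days, win_prob(lead, sigma_at(days)))
```

With these made-up parameters, the favorite's probability climbs from the mid-70s two months out to the high 80s on Election Day with no change in the lead at all.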

[00:22:42]

And how much does it expect it to tighten by Election Day? Well, it's confusing, because it still kind of thinks Trump got a little bit of a convention bounce, which offsets movement in the other direction, and then it will tighten after that, right? But it actually doesn't expect that much at this point, like half a point or a point's worth of tightening.

[00:22:59]

The secondary thing is that national polls are not used that heavily by the model. People mistake national polls for our forecast of the popular vote, which is not true. Right.

[00:23:11]

National polls are just one way you can forecast the popular vote. The other way you can do it is to kind of add up all your state by state estimates. And in fact, that's closer to what the model does.

[00:23:24]

So national polls are influential on the model mainly because of what we call a trend line adjustment. Which is: let's say there's been no polling in, I don't know, Alaska for a month, and in that time Trump has gained two points nationally, more or less. What the model will do is take the result that we had a month ago in Alaska, add two points to Trump, and then, voila, you have an updated estimate of where that race currently stands in Alaska.

[00:23:53]

There are some more wrinkles that use similar states, et cetera.
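
A bare-bones version of that trend-line adjustment, with the Alaska numbers invented to match the example:

```python
# Illustrative sketch of a trend-line adjustment, not the real thing:
# carry a stale state poll forward using the national trend since then.

def adjust(stale_margin, national_then, national_now):
    """Shift an old state result by how much the national picture moved."""
    return stale_margin + (national_now - national_then)

# Hypothetical Alaska example from the discussion: the only poll is a
# month old, and Trump has gained two points nationally since then.
alaska_month_ago = -8.0   # Trump +8, expressed as a Dem margin of -8
adjusted = adjust(alaska_month_ago, national_then=9.0, national_now=7.0)
print(adjusted)           # Trump gains the same two points: -10.0
```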

[00:23:56]

But right now, we actually have quite a lot of recent state polling data. So we don't have to resort to using a poll from a month ago in Minnesota. And we have thirty seven thousand polls in Minnesota in the past week. And so therefore, national polls don't really have much influence on the forecast right now.

[00:24:16]

So are you saying that Biden has improved his standing in state polls as national polls have remained steady?

[00:24:23]

That's part of it. The national polls have probably tightened a bit, you know. I mean, Biden's lead is down to like six point seven or something as of this morning; it was eight and a half or nine at one point. That's tightening. Right.

[00:24:34]

If you look at the state by state polling averages, you wouldn't see that much tightening. You might see half a point to a point. But there are also states like Arizona and Minnesota where Trump is polling better than he was preconvention, or states like Wisconsin, where he's polling at least as well as he was preconvention. So it's not clear. The thing, too, is we have a higher quality set of data in the state polls this year.

[00:25:03]

A lot of the expensive, high-budget pollsters have gravitated toward state polls, which is good. A lot of weird Canadian online firms have been doing national polls and seem to be afraid to get into the states, and things like that.

[00:25:16]

And so it's higher quality data in the states, and that's some reason to be more skeptical of the notion that the race is tightening. But again, the race can tighten a bit, and that can be outweighed by the passage of time.

[00:25:28]

And just to clarify, what's the reason that we would see the race tighten nationally but not tighten in the battleground states?

[00:25:38]

I'm trying to get across that it's not clear the race is tightening nationally, right.

[00:25:42]

It might be tightening nationally, but that might just be bad polling, whereas we have better quality polling in the states that says it's stable, perhaps.

[00:25:49]

Right. But the national poll average is not our estimate of the popular vote. It's an average of national polls, which is one polling series, right? If you think of national polls.

[00:25:59]

Think of it as like one really big state. Does that make sense, right? It's a polling series of polls that are national, right? But it's not a summation of all the individual states necessarily. There can be a gap there, and there are cases where that gap exists. So in 2012, national polls kind of showed Barack Obama and Mitt Romney nearly tied. Maybe Obama led by a point or two, right? If you looked at state polls and then said, OK, here's the average in every state, let's weight by the turnout and add them together, Obama led by more like two or three points.

[00:26:31]

He wound up winning by four points. Right.

[00:26:32]

So when there's a gap between the state polls and the national polls, the state polls, if you do the work, can actually give you a better estimate of the national vote than the national polls themselves.

[00:26:44]

So that's kind of more complicated math there.
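
In miniature, that "weight by the turnout and add them together" math looks something like this. The three-state country and all the numbers are made up.

```python
# Illustrative sketch: turning state polling averages into a national
# estimate by weighting each state's margin by its expected turnout.

def national_margin(states):
    """states: list of (dem_margin_in_points, expected_votes)."""
    total = sum(votes for _, votes in states)
    return sum(margin * votes for margin, votes in states) / total

# Toy three-state "country" with invented numbers.
states = [
    (20.0, 10_000_000),   # a big blue state
    (-5.0,  6_000_000),   # a red-leaning state
    ( 1.0,  4_000_000),   # a swing state
]
print(national_margin(states))   # Dem margin of +8.7 overall
```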

[00:26:47]

Next question: does the model take into consideration how much of the population has already voted? So, for example, if the race were to tighten in the final week or so, but because of early votes some percentage of the population has already voted and won't be affected by that tightening, does the model try to account for that in any way? No.

[00:27:08]

I mean, look, intrinsically, the model gets more confident as you get closer to the election. Intrinsically, polls get better as you get closer to the election, or more accurate, I should say. I certainly think it's plausible that when you have more early voting, the polls therefore lock in a bit earlier. I would keep in mind that in 2016, though, we had a fairly dramatic late swing toward Trump in the polls that our model picked up. Maybe it didn't get quite enough of it, but it certainly picked up some big shifts after the Comey letter.

[00:27:42]

And there were a lot of theories about how the early mail voting numbers were good for Clinton in those states.

[00:27:46]

Those proved not to be very wise. So I don't know, especially in a year where you have so many people using a different voting method than they have before.

[00:27:56]

Pollsters may have trouble getting a handle on that, and they might be surprised by certain things, right? And so I think this is like, that's a change you might implement in the twenty twenty four model, if the polls are accurate this year, including in states where you have a bunch of mail voting for the first time. I would not

[00:28:13]

count on that per se to be something pollsters will have an easy time with this year, although they could. It's a plausible theory.

[00:28:20]

Our next question is from Jeffrey. He asks: What would the model output currently be if every state operated like Maine and Nebraska, awarding two electoral votes to the statewide winner and all other electoral votes based on congressional district voting? Would this system tend to favor Democratic or Republican candidates?

[00:28:39]

It would tend to favor Republicans, because the median congressional district is a little bit to the right of the median voter, because of kind of gerrymandering left over from 2010 and some clustering of Democrats in urban districts. Now, this has changed a bit, because a lot of exurban and suburban districts that were drawn to favor Republicans are now much more purple.

[00:29:03]

But would it favor Republicans overall or would it favor Republicans relative to the current Electoral College system that we have?

[00:29:13]

Oh, it might be pretty close. Like in 2012, Mitt Romney would have done much better with that map, for example. But between kind of the state-level map getting worse for Democrats and the districts getting better, it would be pretty close. But both would have kind of a built-in Republican bias. I mean, the main issue with that is, like, you know, you therefore make gerrymandering super important; it would have huge ramifications for the presidential race.

[00:29:43]

So personally, I think it's fine if Nebraska and Maine want to do it. Although I will say, Nebraska and Maine, there's a lot of code that I have to write every year because of you. Georgia, your runoffs? Georgia, I have to write a lot. Louisiana, God forbid, Louisiana, I have to write a lot of extra code every year because of you. States like you, Maine, with the ranked choice voting: stop it. You're already doing the congressional districts, but then you add ranked choice voting too? It takes a lot of code.

[00:30:11]

I mean, come on, guys. I like Maine, so I'll put up with Maine. If it was like Oregon or something that was making me do this, I mean, just forget it.

[00:30:21]

We got a bunch of questions about turnout: essentially, how do we consider pollsters' different turnout models if they publish multiple versions of a poll based on different turnout models? And also, this is a question we got a lot, to which the answer is no.

[00:30:36]

But do we do our own turnout model to try to, like, filter the polls through what we think the turnout will be? So the answer to that is no. But how do we consider polls that have different turnout models?

[00:30:48]

So we do actually technically project turnout, in the sense that we need to know how to forecast the popular vote. And we forecast the popular vote, which, again, doesn't really matter, but we do have a popular vote forecast if you care about it. And in an Electoral College forecast, you have to kind of know at least what the relative turnout is in each state. But we don't do those adjustments. We don't say, OK, we think turnout is going to be higher than the polls expect among this group.

[00:31:13]

Therefore, let's shift the polls toward Biden or toward Trump. We trust the pollsters themselves to do that. So we kind of project turnout and the vote based on the polls, but they're kind of orthogonal to one another, right? They don't really affect each other. What do we do when a pollster has multiple turnout models? Well, we always use anything they call likely voters over anything they call registered voters. In general, likely voter polls are more accurate.

[00:31:39]

You have some polls that have multiple versions of a likely voter model.

[00:31:44]

What we generally do is just average those together. So Monmouth, for example, today, the day we're recording this, had like a high turnout model and a low turnout model in Arizona. One showed a tie, one showed Biden plus two. We average those together, and you end up with Biden plus one, right? They also had a registered voter poll that had Biden plus four. We don't use that registered number at all. The one exception is sometimes a pollster will say, here is our main likely voter model and here are some alternative scenarios.

[00:32:16]

If a pollster does that, then we defer to them and say, OK, this is their preferred likely voter model. So, again, it's a little complicated, and we have kind of a patchwork of rules that have evolved over the years. But that's how we handle that particular issue.
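The rule described here, ignore registered-voter numbers and average a pollster's likely-voter versions, can be sketched as follows. This is a simplified illustration of the stated rule, not 538's actual code, and it omits the "preferred model" exception; the numbers mirror the Monmouth Arizona example above (tie, Biden +2, registered voters Biden +4).

```python
# Sketch of the rule: drop registered-voter (rv) versions and average
# the likely-voter (lv) versions of the same poll.

def combine_poll_versions(versions):
    """versions: list of (population, margin) tuples, margin = Biden minus Trump,
    in percentage points. Uses only 'lv' entries if any exist; averages them."""
    lv_margins = [m for pop, m in versions if pop == "lv"]
    if not lv_margins:  # fall back to registered voters if that's all there is
        lv_margins = [m for pop, m in versions if pop == "rv"]
    return sum(lv_margins) / len(lv_margins)

# High-turnout model: tie. Low-turnout model: Biden +2. RV: Biden +4 (ignored).
print(combine_poll_versions([("lv", 0.0), ("lv", 2.0), ("rv", 4.0)]))  # 1.0
```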

[00:32:30]

Next question is from Akash.

[00:32:32]

It essentially asks: does the margin of error that a poll comes with affect how much importance we place on it in our polling averages? So say the margin of error of one poll of Pennsylvania is five percent and another poll's is three percent. Does the one with three percent get more weight, or how do we account for that?

[00:32:55]

So the poll with the larger sample size gets more weight, and margin of error is more or less an indication of the sample size.

[00:33:03]

We don't use the stated margin of error per se, because some polls are more honest about their margin of error than others, and they're generally better polls that do that, right? So there's something called a design effect, which means that when you do a lot of weighting in a poll, it increases the margin of error.

[00:33:24]

Some pollsters will be honest about that and say, hey, we had to do a lot of weighting here to get enough Hispanic or enough working-class white voters, right? Therefore, the margin of error is actually five percent, not four percent. We don't want to punish that poll, because the other polls are doing weighting too. They're probably just not telling you about it. Right.

[00:33:40]

So we use sample size, is the short version, to figure out how much weight to put on polls in our averages.
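The design effect Nate mentions has a standard textbook form: the margin of error for a proportion scales with the square root of the design effect, and weighting makes a sample of n behave like a smaller "effective" sample of n divided by the design effect. The sketch below uses that standard formula with illustrative numbers; it is not 538's actual weighting formula.

```python
import math

# Standard 95% margin of error for a proportion near 50%, inflated by a
# design effect (deff) from weighting. Numbers are illustrative only.

def moe_95(n, deff=1.0, p=0.5):
    """95% margin of error in percentage points for sample size n."""
    return 100 * 1.96 * math.sqrt(deff * p * (1 - p) / n)

def effective_sample_size(n, deff):
    """Weighting with design effect deff makes n behave like n / deff."""
    return n / deff

print(round(moe_95(600), 1))            # 4.0 points with no design effect
print(round(moe_95(600, deff=1.5), 1))  # 4.9 points after heavy weighting
print(effective_sample_size(600, 1.5))  # 400.0
```

This is why punishing a poll for honestly reporting a five-point margin of error would be unfair: the weighting that inflates it is happening in the four-point polls too.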

[00:33:46]

All right. We got two more questions. The next one is pretty nasty and applies to people who have probably been closely following all of the polls that we have been adding to the model. The question is: What on earth is going on with the USC Dornsife tracking poll? Is it considered any differently by the model? And to give some context for people who have not been closely following the USC Dornsife tracking poll: recently, over the past couple of weeks, it's gone from a twelve-point Biden lead to a seven-point Biden lead, whereas we haven't seen quite the same tightening, per se, nationally.

[00:34:21]

What's up with that poll?

[00:34:23]

So there is a big flaw in the USC Dornsife tracking poll. Calling out the USC PR department here: you can send me all the nasty emails you want, but I think you did a not very thoughtful job designing this poll, USC. And I just discovered this this morning.

[00:34:41]

So the USC poll is what's called a panel survey, which means they reinterview the same people. I like panel surveys. They let you see movement, right? There's some concern that, OK, well, let's say you happen to get stuck with a really Biden-leaning or Trump-leaning panel, then you're stuck with that the whole year, which I agree is not ideal. But, like, it's nice to see whether the same voters are changing their minds.

[00:35:04]

So in 2016, how their poll worked is, they would report the results for the last seven days, and every day, one-seventh of their sample was asked, how do you feel today, right, or had a chance to respond. And so the whole panel was represented in the poll. This year, they are showing you what happened over the last seven days, but they're only interviewing people once every 14 days, which means at any given time, only half the panel is represented.

[00:35:32]

And it kind of oscillates back and forth between one half and the other half. And if you look at their poll, it's kind of like a sine wave pattern, where there seems to be a more Trump-leaning group, just by chance alone, right, and a more Biden-leaning group. And depending on where you are in that two-week cycle, that will affect whether Biden or Trump has relatively good numbers. So, like, I just discovered this this morning.

[00:36:00]

This is a pretty.

[00:36:03]

strange way to do a tracking poll. It's kind of the worst of both worlds, right?

[00:36:09]

So the movement that people are seeing is not necessarily based on news events or people changing their minds. It's the fact that the sample has completely switched.

[00:36:19]

It's like a merry-go-round, right? One side of the carousel is a little bit more Biden-y and one side's a little bit more Trumpy, right? And so the movement is kind of, it's not fake, but, like, the whole point of a panel survey is, like, you are surveying the same people over and over. Or there are some panels where you do a mix of older and newer people.

[00:36:42]

But it's kind of like in the uncanny valley, where, like, it's not a new random sample every time, but it's also not really letting you show movement. And in fact, like, the set of people that are in the poll this week are exactly the opposite of the people that were in the poll one week ago, but the same as the people that were in the poll two weeks ago. It just doesn't make a lot of sense. My advice, USC Dornsife, the way you're doing this, I mean, I don't know why you're doing it this way, but go ahead and show a 14-day interval.

[00:37:09]

I mean, that's at least kind of honest, where, like, you're showing the whole panel. Otherwise you're going to have these kind of phantom swings that are, you know, I don't know. I think it's a big problem. And it doesn't change how we treat it in the polling averages. We're not making, like, subjective adjustments based on our feelings about a poll. We don't treat panel surveys any differently. Maybe we should. But it does mean, I think, people should be very careful about making inferences about how the race is changing.

[00:37:34]

It's also a bit lagging, too, where actually they only survey you once every 14 days and then you have 14 more days to respond. So you may say, oh, there's a big shift toward Biden today, right? That actually reflects a change in a voter's view from twenty-seven days ago or something.

[00:37:49]

Right. And so it's just, it's a cool concept. But, like, they just made this change that makes it really hard to interpret. It makes it kind of useless, and there's going to probably be this, like, sine wave pattern where there is movement that doesn't really track with anything else.
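The oscillation described here can be reproduced with a toy simulation: two half-panels with slightly different leanings, each interviewed once per 14-day cycle, reported as a 7-day rolling average. The margins are made up; the point is that the reported number swings between the two halves even when no voter changes their mind.

```python
# Toy simulation of the half-panel design flaw. Half A of the panel leans
# less Biden-friendly (margin +6) than half B (+10), by chance alone.
# Days 0-6 of each 14-day cycle interview half A; days 7-13 interview half B.

def reported_margin(day, half_a_margin=6.0, half_b_margin=10.0):
    """Average margin over the 7-day reporting window ending on `day`."""
    window = range(day - 6, day + 1)
    daily = [half_a_margin if d % 14 < 7 else half_b_margin for d in window]
    return sum(daily) / len(daily)

# The reported number cycles with a ~14-day period between the two halves,
# even though every individual voter's answer is constant.
series = [reported_margin(d) for d in range(14, 42)]
print(min(series), max(series))  # 6.0 10.0
```

A 14-day reporting window would average over both halves and make those phantom swings disappear, which is the fix suggested above.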

[00:38:07]

All right. Final question, and it reads: My question for Nate is, if he absolutely had to place a bet on the forecast, does he lean one way or the other? If so, why? Would you buy or sell one candidate or the other based on the forecast model today? And I assume this is about the presidential forecast model. I will say what I've said before in writing and on the show, which is that the model came out with a lower probability for Biden than I expected intuitively.

[00:38:41]

And Biden's since gained ground in our forecast, right? So maybe there's not that disconnect anymore. Another thing I'll say is, you know, I think that all the incentives in twenty twenty are for people to overweight the last example and fight the last war, right? And so kind of all the incentives for pollsters, et cetera, are to be kind of quite careful. Certainly all the incentives for people that are formulating the conventional wisdom.

[00:39:10]

Right. You can't go wrong by being cautious about Trump's chances, right? Our headline in our kind of story when the model came out last month was "Don't count Trump out," right? In part because there are some other models that we think were way too bullish on Biden. So we feel like we have some actual non-hypocritical standing to say that, when, you know, there are some models that we think were very exuberant about Biden's chances.

[00:39:30]

So I think in some ways, like, the model makes assumptions that are a little bit on the conservative side. On the other hand, having Biden with a roughly three out of four chance feels like a pretty sensible position for a model to be in, given where the race is right now. And again, it has increased. It's gone from, you know, I guess it hasn't increased that much. It went from like 70 or 71 to like 75 or 76.

[00:40:01]

Right. So you see some shifts there as a result of kind of getting nearer the election. But, yeah, if I actually had to bet, I mean, people are dumb about this stuff, right? People assume I'd be some robot placing bets on my own forecast. I mean, no. I face a lot of professional risk; we all do at five thirty eight, in some ways.

[00:40:21]

Right. I mean, there are lots of kind of consequences to the way people perceive our forecasts versus the reality of them, and kind of whether candidates win and what margins they win by. And those consequences are probably larger than some stupid amount of money that most people are betting on PredictIt or whatever, right? I mean, people are like, oh, you know, if you really had skin in the game. First of all, I might make some bets if I were ethically and contractually allowed to do so.

[00:40:45]

Right. But the notion that, oh, I don't have skin in the game? I mean, I have, like, so much more skin in the game than anybody else. I actually make the forecast, right? You know, and your betting twenty bucks doesn't compare.

[00:40:55]

If people perceive our forecast to be wrong, that's pretty bad for us at five thirty eight. Right.

[00:41:02]

And again, there are all types of issues about what's going to be perceived to be wrong or right. I think people have become more educated about that. But I have all the incentives in the world to have the most accurate forecast that we can.

[00:41:13]

Right? Yeah. So, yeah. All right. Well, that's a good note to end on, so thank you.

[00:41:20]

Thank you, Galen. I get worked up in the Model Talk sometimes.

[00:41:23]

I know. I know. It's good. We like to see it. It's because you have skin in the game, like you said. Anyway, of course, we will have the opportunity to answer more listener questions in the future. So if you have any questions, send them our way, either on Twitter or to podcasts at five thirty eight dotcom. But for now, my name is Galen Druke. Tony Chow is in the virtual control room. You can get in touch by emailing us at podcasts at 538 dotcom.

[00:41:47]

You can also, of course, tweet at us with any questions or comments. If you're a fan of the show, leave us a rating or review in the Apple Podcasts store or tell someone about us. Also, make sure to subscribe to us on YouTube. If you are listening to this right now, you should know that you can also watch it if you so choose. But anyway, thanks for listening, and we'll see you soon.