Trusted Election Info: Register to Vote & Read a Poll Like a Pro

Need some help navigating election news? See and share voter registration deadlines by state, and learn from a pollster how to read and interpret polls.

This year at Guru, we’ve broadened our mandate to connect people who have questions with trusted answers. And because there’s a ton of honest confusion and misinformation surrounding the 2020 election, we’re here to help in the way we know best: by putting together resources from verified sources and subject matter experts to clarify things!


When we went digging, we found out that it was tough to find answers to simple questions like “What’s the latest I can register to vote in my state?” and “What is appropriate sampling for a poll, and why is everyone yelling about it?” So we decided to start at the top. We pulled together mail-in and online (where available) voter registration deadlines by state, and interviewed Jim Williams, a pollster with Public Policy Polling.

Find your state’s voter registration deadline

There’s a timeworn political saying: “The only poll that counts is the one taken on election day.” In 2016, almost half of eligible voters… didn’t vote. So let’s try and fix that together! Share this list of deadlines with a friend or colleague to help everyone exercise their hard-won right to vote.

How to read and understand political polls

“Unskew the polls.” “But what does 538 say?” “We can ignore that poll; it seems like an outlier.” What do you really know about political polls? There’s more nuance to them than you might think, and everything from the kind of day a person is having to the way the question is asked can influence the interpretation of and narrative around the numbers. Plus, it turns out, there are some pretty basic things — like the margin of error — that even seasoned pros tend to get wrong.

We sat down with Jim Williams from Public Policy Polling (PPP) to clear up the stuff that confuses casual observers, political junkies, and everyone in between.

What are the biggest misconceptions about what polls indicate?

Jim Williams: One of the biggest misconceptions is that if you look at a poll, it's going to tell you who will win the election. “Oh, Obama's up three? Well, then he's going to win.” And well, no, that's not what the poll says.

What the polls actually say is something like, “At this time in late August, 95 out of 100 times a poll conducted under this methodology is going to have a result that's within ±4 of these results.”

So it's a snapshot, but it's also not the end-all, be-all, and not just for that reason: there's the margin of error, and the fact that this is based on what 500 people said last Thursday and Friday.
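
(Editor's note: that "95 out of 100 times" framing is the textbook definition of a 95% confidence interval, and it's easy to check with a quick simulation. The sketch below assumes simple random sampling and a true support level we invented for illustration; real polls layer weighting on top of this.)

```python
import random

# Minimal sketch: simulate many independent polls of n = 500 respondents
# drawn from a population whose true support is 52% (an invented figure),
# and count how often the poll lands within the quoted margin of error.
TRUE_SUPPORT = 0.52
N = 500
MOE = 0.044  # 95% margin for n = 500: 1.96 * sqrt(0.25 / 500) ≈ 0.044
TRIALS = 10_000

random.seed(42)
hits = 0
for _ in range(TRIALS):
    share = sum(random.random() < TRUE_SUPPORT for _ in range(N)) / N
    if abs(share - TRUE_SUPPORT) <= MOE:
        hits += 1

print(f"{hits / TRIALS:.1%} of simulated polls fell within the margin")
# Prints roughly 95%, i.e., "95 out of 100 times."
```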

Right. With everything going on in their lives at that point.

Yeah. And we saw it a lot in 2016 with Trump and Hillary. Depending on what was going on in the news, Trump either looked worse or better. When really bad stories came out about Trump, the polls looked bad. But then when they kind of faded away, it went back to Trump looking pretty close to being tied.

And it's unfortunate for Hillary Clinton that the election came at a point in time where Trump was kind of in calm seas for himself. And meanwhile, afterward, a lot of people are like, "I can't believe it. The polls said all year that Hillary was winning."

And it's like, "Well, yes and no. What the polls showed was that it was close for most of the year."

[Chart: Error in national polls was historically low in 2016. Source: Pew Research Center]

So what is the appropriate sample for a local, state, or federal election? I guess what I'm really asking is: is there a specific percentage of the overall populace that you should look for?

No. It's more about getting to a certain threshold, such as 500 [poll completions]; that's what we try to get at PPP on most of our polls. At least 500. That's what I promise clients I'm going to get for them. That gets you a margin of error in the low 4s. And if someone were to ask me what the industry standard for the number of responses is, I'd say, "Oh, around 500."

That's what you’d aim for with a typical Democrat-versus-Republican horse race poll in Nebraska, or even a national or congressional poll. But there are also scenarios where it's okay to get fewer than 500. If you're trying to get a sample of African-Americans, for example, it's difficult and expensive to get 500 respondents. So maybe in a situation like that, you're more okay with 300, if you're doing an oversample. Or maybe you're in a really challenging environment, like trying to poll a Democratic primary for a state house race in North Carolina.

So you're like, “Well geez, only 9,000 people even vote in these things normally. How am I going to get 500?” And in that case, what we tell clients is, "Hey, we're going to do our best. We like to get at least 300 in this scenario. We're going to do the best we can for you. I can't promise you 500."

So if you're trying to poll left-handed people with brown eyes, maybe you don't want to pay to get 500. Maybe, in that case, 300 is okay. A margin of error of ±6.4 is fine. You can live with that. You don't need 4.1, because it's a harder-target situation.
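
(Editor's note: the figures Jim cites fall out of the standard margin-of-error formula, MOE = z · sqrt(p(1−p)/n), evaluated at the worst case p = 0.5 with z = 1.96 for 95% confidence. Below is a minimal sketch; the design_effect parameter is our own addition, since published margins often run a bit above the textbook value once the variance added by weighting is folded in, which may be why ±6.4 rather than the textbook ±5.7 gets quoted for 300 completes.)

```python
import math

def margin_of_error(n: int, z: float = 1.96, design_effect: float = 1.0) -> float:
    """Worst-case (p = 0.5) margin of error, in percentage points.

    design_effect is an optional multiplier for the variance added by
    weighting; 1.0 means a textbook simple random sample.
    """
    return 100 * z * math.sqrt(0.25 * design_effect / n)

for n in (300, 500, 1000):
    print(f"n = {n:>4}: ±{margin_of_error(n):.1f} points")
# n =  300: ±5.7 points
# n =  500: ±4.4 points   <- "a margin of error in the low 4s"
# n = 1000: ±3.1 points
```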


I know most people misunderstand what margin of error actually means. In fact, I did up until 2018!

So if a poll says Obama 49, Romney 47, and the margin of error is ±3, Obama could be anywhere from 46 to 52, and Romney could be anywhere from 44 to 50.

Reporters misconstrue that all the time, where they say things like, "Biden's leading by three, but it's inside the poll's margin of error." And I read that, and I'm like, "I know what you think you're saying, and you're not wrong, but you still don't really know what margin of error means when you write that."
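
(Editor's note: here's a quick worked version of the example above, using the hypothetical Obama 49 / Romney 47 numbers rather than any real poll. The ±3 applies to each candidate's share separately, and the margin on the gap between them is larger, close to double in a two-way race, which is why "leading by three inside the margin of error" is a muddier statement than it sounds.)

```python
# Hypothetical poll from the conversation above: Obama 49, Romney 47, MOE ±3.
obama, romney, moe = 49.0, 47.0, 3.0

print(f"Obama:  {obama - moe:.0f} to {obama + moe:.0f}")    # Obama:  46 to 52
print(f"Romney: {romney - moe:.0f} to {romney + moe:.0f}")  # Romney: 44 to 50

# The two shares move in opposite directions (a respondent who picks one
# candidate isn't picking the other), so the margin on the lead itself is
# roughly double the individual margin of error.
lead = obama - romney
print(f"Lead: {lead:.0f} ± {2 * moe:.0f} points")  # Lead: 2 ± 6 points
```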

I personally think that the margin of error is one of the most over-discussed and overrated metrics with regard to polls. Your poll is probably going to be around 500 completes. It doesn't really matter if your margin of error is 5.5 or 3.9; that doesn't even help you understand the poll that much. It's going to be 4, or 5, or 3.9. People harp on it too much.

I think it's probably a way for people to think they can trust or dismiss a poll.

Yeah. And we deal [with that] a lot; everybody does this, people trying to discredit polls. I used to work on political campaigns, and I very foolishly thought, in 2011, when I made the jump from working on campaigns and in state government to working in polling, that I was now going to be above the fray. Like, "I'm just dealing with numbers. I won't have to engage in as much of the spin or the fighting."

Not true at all, especially because right around that time was when polling aggregators took off and people started obsessing about polls a lot more. Twitter blew up, and all the media outlets got their own polling aggregators. People started trying to spin the polls and unskew the polls and all this. And I quickly realized, "Oh, this is just another battlefield for more politics." I suppose I was probably naive.

What impact does partisan weighting have? Or, why do people think that a sample that includes more Democrats is skewed?

We get this a lot, especially in North Carolina, because we're based out of North Carolina. And North Carolina is one of those states that people think of as a swing state or maybe even a pink state.

And that's pretty much true, but it's also an interesting state in the sense that if you look at the Secretary of State statistics for North Carolina, there are more registered Democrats here than registered Republicans by a lot. And that's partly because it's kind of a Southern state, and it's one of those interesting states where you've got a lot of people that registered as a Democrat in the '70s or the '80s or whatever, and they're still registered as a Democrat, but we know they're probably not voting in Democratic primaries anymore. Or maybe they think of themselves as a Democrat, but not a "Pelosi" Democrat.

But what that boils down to is that when you're creating your composition for your poll sample in North Carolina, you do want to have more Democrats in there. That's correct; it doesn't mean the poll's wrong.

But we get people who want to try and discredit our poll, emailing us and saying, "You guys are trying to put out this fake poll. Your poll has more Democrats in it. Everybody knows that North Carolina is not a blue state. You guys are cheating." It's not true. Kentucky is another example of that. It's even more pronounced.

Is that what's called ancestral Democrats?

Yeah — there's a ton of them in Kentucky, but as we all know, on the federal level, Kentucky is one of the reddest states now. But they'll still like the Democratic governor, and they'll still call themselves a Democrat in the poll. So if you don't know what you're talking about, or you want to be willfully obtuse, you could say, "Hey, I think this is suspicious. They have more Democrats here than Republicans. That doesn't seem right for Kentucky." But if you know better, you know that it is right.
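
(Editor's note: here's a minimal sketch of the adjustment being described: weighting raw responses so the sample's party-registration mix matches the state's registration statistics. Every number below is invented for illustration; none are real North Carolina figures.)

```python
# Invented raw poll sample, by party registration (respondent counts).
raw_counts = {"Democrat": 150, "Republican": 200, "Unaffiliated": 150}

# Invented statewide registration shares the pollster wants to match.
target_shares = {"Democrat": 0.39, "Republican": 0.30, "Unaffiliated": 0.31}

total = sum(raw_counts.values())

# Post-stratification weight for each group: the target share divided by
# the share that group actually makes up of the raw sample.
weights = {
    party: target_shares[party] / (count / total)
    for party, count in raw_counts.items()
}

for party, w in sorted(weights.items()):
    print(f"{party:<12} weight = {w:.2f}")
# Democrat     weight = 1.30   <- weighted up toward the registration rolls
# Republican   weight = 0.75
# Unaffiliated weight = 1.03
```

The point is that a sample with "more Democrats" after weighting isn't bias; it's the sample being corrected toward what the registration rolls actually say.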

Digging into crosstabs: what are they, and how can you understand them?

What a crosstab does is tell you what an individual subgroup said on any given question in a poll. So if you've got a question on your poll, and you're like, "Okay, well I want to know what only that specific group of people think," that’s when you go into the crosstabs.

It's a quick way to drill down and say, “Okay, I'm running for office. I know that this poll says I'm up by 1, but what's my gender gap? With women, I have a 15-point lead. With men, I have a 13-point deficit, but it looks like 20% of women are undecided and only 9% of men are undecided. So that makes me think I still have room to grow my standing with women, and I may be able to bring some undecided voters to my side.” That's what people use crosstabs for: to get a more specific understanding.
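
(Editor's note: if you've ever built a pivot table, you've essentially built a crosstab. Here's a minimal sketch with pandas, using invented respondent-level data; the percentages are for illustration only.)

```python
import pandas as pd

# Invented respondent-level answers for illustration.
df = pd.DataFrame({
    "gender": ["Woman", "Woman", "Man", "Man", "Woman", "Man", "Woman", "Man"],
    "choice": ["Candidate A", "Candidate A", "Candidate B", "Undecided",
               "Undecided", "Candidate B", "Candidate A", "Candidate A"],
})

# A crosstab: what each subgroup said on a question, as row percentages.
tab = pd.crosstab(df["gender"], df["choice"], normalize="index") * 100
print(tab.round(1))
# choice  Candidate A  Candidate B  Undecided
# gender
# Man            25.0         50.0       25.0
# Woman          75.0          0.0       25.0
```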


How can you identify whether a pollster is partisan or nonpartisan?

First of all, they'll usually identify it on their website, but most polling outfits that are not associated with a media group or a university or college are going to be partisan.

In other words, if they're a private company, most of the time they're going to be working for one side or the other. Just like somebody that makes political campaign commercials is probably going to work for Democrats or Republicans. Or somebody that does direct mail for campaigns is probably going to work for Democrats or Republicans.

And there are a lot of reasons why that makes sense. First and foremost, if I'm trying to hire a pollster, I want to know that they share my values. But I also want to know that they know how to connect with the voters who are voting in my Democratic primary.

And they're used to it, and they've done it before; they know what works and what doesn't. If I'm a Democrat, I don't want to hire a Republican. Can I trust them? But it’s an easy way to just try and discredit the poll if you want to.

We get this a lot as a Democratic calling outfit. We can't tell you how many times a state Republican party has put out a press release saying, "Well, this is just a Democrat-funded poll. It's not worth the paper it's printed on, because they only work for Democrats. So of course they're going to put out a poll that Democrats like." That's a cheap, easy way to try to discredit the poll.

Right, people will say, "Even Fox News says XYZ."

At the same time, whether it's Fox or us, if we only put out polls that make “our side” happy, we wouldn't stay in business for very long.

It does go to credibility. So speaking of partisan lean, how can the way a question is worded impact the response that is given?

A whole lot. It really matters how you ask a question. It's so easy to write a question in a way where adding just a word or two changes its whole dynamic. If you're trying to get a particular answer, you can do it.

I think I've even seen the cancel culture polls that were like, "Well, do you think cancel culture is a good thing or bad thing?" And it's like... what are you actually asking?

Yeah. Well, that's such a good example of something where it really matters how you characterize it. Because I think a lot of people — even now — would be like, "I'm not sure I understand what you mean by cancel culture."

I'll give you two examples of how you could phrase that question:

"Do you support or oppose someone's life and career being ruined due to one statement they make online?" A lot of people would say, "Well, no, that doesn't seem very fair."

Or we could say, "Don't you think people should bear responsibility for what they write on the internet, even if they claim they're joking?" A lot of people would say, "Well, yes, I do think so. I think people should bear responsibility for what they post online."

Last question: how often do people lie to pollsters?

I think it does happen sometimes. There's a debate about how much. It's hard to prove. A lot of people think of live call polls as the gold standard of polling, as opposed to doing it online, or with your touch-tone, or on text message, because you're actually talking to someone.

But there are studies that have shown that when people are actually talking to another human being, they will give a more socially desirable answer.

And that even extended to Trump last time; we saw this a little bit in our polls. We gathered a lot of our data using IVR [interactive voice response] polling, which is recorded. And we saw it even earlier in the decade, before gay marriage was legalized. We would poll on it to try to show that people were fine with it. But what we were finding on a lot of our polls is, we'd do it in Missouri or wherever, and we'd get the results back, and we'd say, "Huh. This is kind of disappointing. I'm looking at this other poll that someone else did in Missouri three months ago, and our numbers are a little bit worse. Why is that? That's frustrating." And then slowly we realized: oh, people feel more comfortable saying, "No, I'm not for gay marriage," if they don't have to say it out loud to another person.

This has been so helpful and I learned a ton. Thanks so much!

This interview has been condensed and edited for clarity.

