As the country heads into another fraught election cycle, you can easily get fooled if you don’t know how to interpret those polling results.
In the past week, most presidential election polls have shown Democratic candidate Joe Biden leading in his race against President Trump. Some pollsters found the margin to be thin: anywhere from 1 percentage point to 6 or 8. Others, crunching their numbers differently or crunching different numbers, have found overwhelming public preference for Biden, with leads of 17, 18, even 25 points. One poll found Biden to be trailing Trump by 1 point. Among election junkies, in newsrooms and on social media, each of these polling results makes for a likely conversation topic, headline, or tweet. Everyone likes to watch the horse race, and polling numbers are as close as anyone has to live running commentary on how their horse is doing.
Paying attention to poll results isn’t just a way to turn the democratic process into a sport. Polling results may also influence voter behavior. Some experts attribute the low voter turnout of the 1996 presidential election to polls showing Bill Clinton leading Bob Dole by a wide margin. Now, President Trump’s deficit in the polls has created some conversation about whether he should drop out of the race. Conversely, close polling numbers may stir excitement and motivate people to head to the voting booth. For something with such a firm grip on the news cycle and voter behavior, though, polls themselves often look mysterious and murky to the general public. The numbers arrive daily from the mist, and few people really question how they got there.
That vagueness, while understandable, is also easy to exploit for political gain. Not all polling data is of equal value and quality—far from it. Take polls reporting that Biden is leading Trump by 2 percentage points. “That’s meaningless,” says Jennifer Stromer-Galley, who researches digital politics and campaigning at Syracuse University. “If it’s a good poll, then it’s within the margin of error, which is usually 3 to 5 percentage points.”
Even polling results that are outside of the margin of error don’t necessarily mean what you’d think they mean. In 2016 almost every poll gave Hillary Clinton the lead over Trump, but it didn’t matter. Polls give a snapshot of popular opinion, and perhaps even a glimpse of the popular vote, but US elections are determined by the Electoral College, which can tip outcomes in one candidate’s favor thanks to a relatively small number of ballots. “The perception is that polls have become less accurate recently,” says Michael Traugott, who studies campaigns, elections, and survey methods at the University of Michigan. “But that’s only because the outcome of the popular vote doesn’t necessarily indicate the winner of the Electoral College.” Trump only outperformed his polling numbers by a few percentage points in most states, but that small uptick was all he needed.
Upsets like the one that happened in the 2016 election are the reason people like Stromer-Galley and Traugott often think about polling numbers as little more than sophisticated, data-rich clickbait. “At the end of the day, the news media uses public opinion polls to drive stories,” Stromer-Galley says. “They’re more a device used by journalists to capture attention around a story than a meaningful, newsworthy piece of information.” While that’s concerning, it’s not likely to change. So it’s important for people to be able to distinguish between good and bad polls themselves, and avoid getting swept up in the hype.
Unfortunately, it’s extremely hard for average schmoes to assess the quality of polling data. Researchers have conducted a variety of studies on how much people value source or methodology when it comes to how much they trust polls, but at the end of the day people put the most faith in one thing: the results. “People see polls they agree with as more credible,” says Gabriel Madson, who studies American political behavior at Duke University. “Both sides do this. It’s not unique to a certain population. Everyone is biased. It’s not great.” When you approach evaluating a poll for its rigor and credibility, it’s probably best to take your own prejudices as a given and push past them.
First rule: Never trust a poll done by a campaign. Ditto internet polls. “A convenience poll on a website asking who you think is going to win on election day isn’t a random sample,” Stromer-Galley says. High-quality public opinion polls tend to come from respected news media, universities, and national polling firms. “With polling you get what you pay for,” says Traugott. “We see larger errors in statewide polls conducted by smaller firms with less rigorous methods, often on the internet because it’s cheap.”
Other key factors to consider are sample size and sampling method. According to Stromer-Galley, 1,200 respondents is a good size for a national poll using a random sample of Americans. Also consider who the respondents are. Polls tend to sample either all adults, registered voters, or likely voters. "There is no standard measure of likelihood of voting," says Traugott. "It's the secret sauce of preelection polling." Because not everyone votes, assessing who is likely to vote is a pretty important part of a meaningful poll, but it's also educated guesswork, and biases (methodological and systemic) can creep in. A sample of likely voters, for example, typically skews in favor of Republican candidates, because your likelihood to vote depends on factors like age and socioeconomic status, and Republican voters tend to be older and wealthier. Sampling methods can also skew the data. If pollsters are calling landlines, they're undersampling Democrats, who tend to be younger and more likely to be minorities. A good poll will publish its methodology, the proportion of cell phones to landlines called, its margin of error, and its response rate. Bad polls have something to hide.
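That 1,200-respondent figure isn't arbitrary: for a simple random sample, the margin of error shrinks with the square root of the sample size, and 1,200 respondents lands you right around the commonly cited ±3 points. A minimal sketch of the standard formula, assuming a simple random sample (real polls apply weighting and design effects that widen the figure):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample.

    Uses the worst case p = 0.5. Real-world polls adjust for
    weighting and design effects, which push this number higher.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A national poll of 1,200 respondents:
moe = margin_of_error(1200)
print(f"±{moe * 100:.1f} points")  # prints "±2.8 points"
```

Note how the square root works against pollsters: quadrupling the sample to 4,800 respondents only halves the margin of error, which is one reason bigger samples stop being worth the cost fairly quickly.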
If that sounds like a lot of digging through fine print, it is. "I don't think we should expect people to be able to figure out if an individual poll is good or bad," Madson says. All of the experts WIRED spoke to had the same practical advice: Don't just look at one poll, look at a poll aggregator. A poll aggregator tracks individual polls and draws their results together, either lining them up for easy apples-to-apples comparison or averaging them. Their favorites are Nate Silver's FiveThirtyEight and RealClearPolitics. The idea is that seeing the polls as a group will give you a better understanding of what the trends and outliers in public sentiment are.
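The simplest form of that averaging can be sketched in a few lines. The poll numbers below are invented for illustration; real aggregators like FiveThirtyEight also weight each poll by sample size, recency, and the pollster's track record:

```python
# Hypothetical recent head-to-head polls: (pollster, Biden %, Trump %).
# These numbers are made up for illustration only.
polls = [
    ("Poll A", 50, 42),
    ("Poll B", 53, 41),
    ("Poll C", 48, 44),
    ("Poll D", 51, 43),
]

def average_margin(polls):
    """Naive aggregation: average the candidate's lead across all polls."""
    margins = [biden - trump for _, biden, trump in polls]
    return sum(margins) / len(margins)

print(f"Average lead: {average_margin(polls):+.1f} points")  # prints "Average lead: +8.0 points"
```

Averaging smooths out the quirks of any single pollster's methodology, which is exactly why the experts point readers toward aggregates rather than individual results.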
Ultimately, you probably shouldn't put all of your trust in polls. Response rates keep dropping because people won't pick up calls from unknown numbers, and the pace of the news cycle often leaves pollsters only a few days to collect their data before it becomes obsolete. Polls are impressions of sentiment at a particular moment, not prophecies. "The public should vote based on their understanding of a candidate's policy decisions and their character to lead, not public opinion polls," says Stromer-Galley. "Isn't that an 'Eat your broccoli' sort of position?" It is, but it's also about time American politics ate a vegetable.
All Rights Reserved for Emma Grey Ellis