How Reliable Are Election Polls?
Polls are constantly in the media spotlight during election season. Some voters follow the latest numbers religiously, while others are skeptical of most results. Suffice it to say that reliable polling is necessary not only to sustain a healthy political process, but to quell those doubts as well.
Why do critics dismiss polls as inaccurate?
There are several primary reasons people take issue with polls:
- Distrust of the Media: Perhaps the most pervasive issue is a general distrust of news sources across the country. A network may publish recent polling data, but if a large part of the public doesn’t trust the network, it won’t trust the numbers either. According to a Pew report published in August, nearly two-thirds of U.S. adults say “they do not feel particularly loyal to the outlets they get their news from.” Over 50% believe the outlets “don’t care about the people they report on.” If Americans put more faith in their news sources, odds are they would have an easier time believing the data those sources put out.
- Political Affiliation: There is a significant difference of opinion about poll accuracy along party lines. In general, Democrats tend to be more trusting of public opinion polls than Republicans. According to a 2017 study, almost 50% of Democrats have “a great deal” or “a good amount” of trust in polls, compared with 26% of Republicans. Meanwhile, about three-quarters of Republicans trust polls “not very much” or “not at all.” Given the pressure to remain within party lines, this divide is likely to continue.
- Suspicion of Lying Participants: There is always a chance that respondents are dishonest. This fear was heightened in 2016 by talk of “secret Trump voters” who lied out of embarrassment, doubt, or other reasons. However, the data on how many of his supporters actually did this is thin, so it is not an immediate worry for pollsters. Still, it has been cemented as a prominent flaw in the eyes of many voters.
How accurate were polls in 2016?
This is certainly the most talked-about reason to question the reliability of public opinion polls. The majority of mainstream pollsters predicted a Clinton victory, yet Trump won. Although this seems simple enough, the truth is more complicated.
- Popular Vote: Although the President has repeatedly disputed it, Hillary Clinton won the popular vote. Based on national polls, RealClearPolitics predicted a Clinton victory by 3.2 percentage points. She ended up winning by 2.1 points, a 1.1-point difference well within the expected margin of error for most elections. On the national level, the polls were clearly successful. However, many people tend to forget about the popular vote and focus solely on who ended up in the White House.
- State Polling: Polling of individual states was, unfortunately, much less precise in 2016. Battleground states that tipped the scales in Trump’s favor were originally predicted to support Clinton. The inaccuracies stemmed from several factors:
- Undecided Voters: Many voters who had answered polls as “undecided” went with Trump on Election Day. Pollsters can measure how many people are undecided, but they cannot predict which candidate those voters will ultimately choose. Some have argued that voters were, at the time, embarrassed to tell pollsters they supported Trump. However, this idea is largely disputed, since online polls produced numbers similar to polls taken in person.
- Likely Voters: “Likely voters” are respondents whom pollsters classify as likely to cast a ballot, even when their answers leave it uncertain whether they will actually vote at all. According to the 2016 polls, this group favored Clinton. However, not only did some of those voters stay home, but voter turnout as a whole was the lowest it had been since 1996.
- Educated Voters: Typically, voters with more formal education (college or above) are more likely to answer polls than those with less (high school or below). Clinton held far more support among higher-educated voters than Trump did, while lower-educated voters tended to support Trump. Thus, state polls that didn’t weight for education were easily skewed in Clinton’s favor; a simple weighting sketch follows this list.
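To make that education skew concrete, here is a minimal sketch of the kind of post-stratification weighting pollsters use. Every number below is hypothetical, chosen only to show how a raw sample that over-represents college-educated respondents can overstate one candidate’s support until it is reweighted to match the actual electorate:

```python
# Minimal sketch of post-stratification weighting by education.
# All shares and support figures are hypothetical, for illustration only.

# Raw poll: share of respondents and candidate support within each group
sample_share = {"college_or_above": 0.60, "high_school_or_below": 0.40}
support_clinton = {"college_or_above": 0.58, "high_school_or_below": 0.42}

# Actual share of each group in the electorate (hypothetical benchmark)
electorate_share = {"college_or_above": 0.45, "high_school_or_below": 0.55}

def weighted_support(support, weights):
    """Average candidate support using the given group weights."""
    return sum(support[group] * weights[group] for group in support)

unweighted = weighted_support(support_clinton, sample_share)
weighted = weighted_support(support_clinton, electorate_share)

print(f"Unweighted estimate: {unweighted:.1%}")          # 51.6%, overstated
print(f"Education-weighted estimate: {weighted:.1%}")    # 49.2%
```

In this toy example, simply reweighting the same responses to the electorate’s education mix moves the estimate by more than two points, which is roughly the kind of error that went uncorrected in several state polls.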
How have polls improved and what else can be done?
Since the 2016 election, pollsters have begun trying new strategies to avoid drastic inaccuracies. For the 2018 midterms, The New York Times started tracking its polls in real time, posting the data online for the public to see. They pitched the idea this way:
- You’ll see the poll results at the same time we do. You’ll see our exact assumptions about who will turn out, where we’re calling and whether someone is picking up. You’ll see what the results might have been had we made different choices.
This was done to make the polling process more transparent, in hopes of increasing voters’ trust.
After finding traditional exit polls increasingly problematic, the AP and Fox News stopped funding them. Instead, they spearheaded a new program called VoteCast, also introduced for the 2018 midterms. This time, the AP and Fox conducted interviews beginning eight days before the election and continuing through Election Day, then used that data to determine how various demographics voted. Ultimately, the 2018 VoteCast polls correctly predicted over 90% of the Senate and governor races.
In 2016, Pew released a study on “likely voter” polls and how they could be remodeled for greater accuracy. It found a need to focus on certain strategies that mainstream pollsters had underutilized:
- Probability: Estimating probability is a key part of the process: each participant can be assigned a likelihood of actually ending up voting. Introducing a cutoff on that likelihood would also save time, ensuring pollsters count only data they are relatively certain represents the electorate (see the sketch after this list). Ignoring probability opens the door to worthless data, especially in regions where voter turnout is lower than the national average.
- Targeted Samples: Surveys drawn solely from registered voters are often much more helpful for understanding an electorate, in part because registration files provide verifiable records of whether people have voted. Campaign researchers are often hired to seek out these specific demographics, but that targeted polling data is rarely released to the public, since the surveys are conducted by private organizations. If more mainstream pollsters used this strategy, they would likely improve the accuracy of their results.
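As a rough illustration of the probability idea, here is a minimal sketch of a likely-voter screen with a turnout cutoff. The respondents, their turnout probabilities, and the 0.5 threshold are all hypothetical assumptions, not figures from Pew’s study:

```python
# Minimal sketch of a "likely voter" screen with a probability cutoff.
# Respondent data and the 0.5 cutoff are hypothetical assumptions.

respondents = [
    # (candidate preference, estimated probability of voting)
    ("A", 0.95), ("B", 0.90), ("A", 0.40), ("B", 0.85),
    ("A", 0.20), ("B", 0.60), ("A", 0.75), ("B", 0.30),
]

CUTOFF = 0.5  # drop respondents judged unlikely to vote

def candidate_share(data, candidate):
    """Turnout-weighted support among respondents above the cutoff."""
    kept = [(c, p) for c, p in data if p >= CUTOFF]
    total_weight = sum(p for _, p in kept)
    cand_weight = sum(p for c, p in kept if c == candidate)
    return cand_weight / total_weight

for cand in ("A", "B"):
    print(f"Candidate {cand}: {candidate_share(respondents, cand):.1%}")
```

Dropping unlikely voters and weighting the rest by their turnout probability shifts the estimate noticeably in this toy sample, which is the basic logic behind treating “likely voters” probabilistically rather than counting every respondent equally.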
The Australian news site The Conversation published an article suggesting one straightforward approach: pay participants. The promise of money may convince more people to divulge their choices for an upcoming election, improving representation of their demographic. However, current funding for public news polls is minimal, and without significant evidence to back the idea up, there seems little chance that outlets will adopt it anytime soon.
The Rantt Rundown
Although voters have varying degrees of confidence in them, election polls remain relevant to the political environment. They are often distrusted not purely because of their inaccuracies, but because of people’s preconceived doubts. Analysts are still finding ways to improve an admittedly flawed system. That said, dismissing polls’ significance altogether is a mistake that ignores the hard work pollsters do to help inform the American public.