Americans aren’t known for their patience. A poll conducted a few years ago for a banking corporation found that 96% of Americans “will knowingly consume extremely hot food or drink that burns their mouth” and 63% do so frequently.
More than half of us hang up the phone after being on hold a minute or less, and more than half admitted to honking their horns the moment the light turns green, the poll, conducted in 2015 for Fifth Third Bancorp, found.
In other words, many if not most of us will knowingly create pain for ourselves and annoy others rather than wait even a few minutes for our coffee to cool off or traffic to start moving.
So it shouldn’t be a revelation that Americans want to know before Election Day who is going to win and we’re quick with recriminations when the polls we rely on to predict the outcome are “wrong.”
For the second presidential race in a row, news commentators and others were wringing their hands over “incorrect” polls the day after Election Day, as votes were still being counted.
It’s easy to point to examples where polls, on average, overestimated Joe Biden’s strength or underestimated President Trump’s. In Iowa, the average of polls leading up to the election showed Trump leading by 2 percentage points. He ended up winning by 8 points. Only the Des Moines Register/Mediacom poll was “right” in the sense that it showed Trump leading Biden by 7 points.
There are real differences in how polls are conducted. Many pollsters try to make their data conform to an expectation of the size and makeup of the electorate that is based on history. That can be effective — unless voters do something different.
Ann Selzer, the Register’s top-notch pollster, discussed the disparity during a recent “Iowa Press” interview:
“My approach to polling is to think about it as polling forward, that is I don’t want to get in the way of my data revealing to me what is happening. So I don’t want to make assumptions, I don’t want to make any judgments about what is the right outcome or the wrong outcome. There are a lot of other polling outfits that decide how they are going to weight their data by looking backwards to see, ‘what was I expecting’ and how would they have arrived at that except to look at past elections and sort of embed those assumptions into their data. I call that polling backward.”
Perhaps polling overall would be more accurate if more pollsters adopted Selzer’s philosophy. Unfortunately, quality polling is expensive and many media organizations are just as happy to publish the results of cheap surveys with shadowy methodology.
There are issues all pollsters have to contend with, including rapidly changing communication technologies and habits. Many pollsters eventually adjusted to the disappearance of landlines from American households, but now many younger voters won’t answer phone calls at all and can be reached only by text.
Observers have also pointed to the reluctance of Trump voters, and to some extent Republicans in general, to respond to polls or tell the truth about their preferences. It would stand to reason that voters who are fed a steady diet of criticism and claims of bias against mainstream media would also distrust polling. But the concern isn’t particularly well-documented. As Selzer pointed out, if people won’t answer polls, it’s impossible to know what they’re thinking.
Despite these and other issues, polling overall isn’t significantly less accurate than it has been in the past, Nate Silver of FiveThirtyEight says. Pre-election polls are typically several percentage points off the actual election results, and this year is likely no different, he said in a recent podcast.
“In national polls and in key swing states, the simple fact is, on average, that polls miss by 3 points, so if we’re 3.5 or 4, that’s pretty normal,” Silver said.
What continues to be out of whack is our expectations. We may know, intellectually, that polls are not predictions. Yet we can’t seem to help treating pre-election surveys as a glimpse into the future rather than a snapshot of the recent past.
We all have an interest in the future of polling, because public opinion research often drives policy. It’s important for pollsters and those who pay them to continue working to improve. We in the media need to do a better job of explaining how polls are conducted and their limitations.
But we shouldn’t let unrealistic expectations, media distrust or our impatience for election results undermine our best tool for learning why people vote the way they do. It’s ridiculous to keep getting burned because we can’t wait a few minutes for our pizza to cool off, but it’s insane to just quit eating because we expect all food to hurt us.