Do Polls Still Work If People Don't Answer Their Phones?

"The jury's out" on whether historically low response rates could make surveys less accurate.
It's getting harder to get Americans to answer their phones, and many of them no longer have landlines. But some data shows polls are still as accurate as they were before.
Simon Battensby via Getty Images

Pollsters face a growing obstacle in gathering Americans’ opinions: getting people to answer their calls. The proportion of people called who answer the survey -- in pollster jargon, the "response rate" -- has dropped dramatically over the last few decades as Americans have changed how they interact with the world.

The proliferation of text messages, emails and social media is relegating unsolicited phone calls mostly to the realm of telemarketing, so it’s no surprise that fewer people are inclined to talk on the phone with a stranger. Even worse for pollsters, landline telephones, which are quicker and cheaper to call, have been steadily disappearing over the last 20 years. Almost half of the American population is now solely reliant on mobile phones.

All of these factors have made polling less efficient than in the past. A 2012 report from the Pew Research Center, one of the nation's most reputable pollsters, showed a 9 percent response rate for its telephone surveys, down from nearly 40 percent in the late 1990s. The vast majority of pollsters don't even report their response rates.
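To see what that efficiency drop means in practice, here's a rough back-of-the-envelope sketch in Python, using the two response rates cited above and a hypothetical target of 1,000 completed interviews:

```python
# Rough illustration of how falling response rates inflate the number of
# dials needed to field a poll. The 1,000-interview target is hypothetical;
# the response rates are the Pew figures cited above.
target_completes = 1000

for label, response_rate in [("late 1990s", 0.40), ("2012", 0.09)]:
    calls_needed = target_completes / response_rate
    print(f"{label}: ~{calls_needed:,.0f} calls for {target_completes:,} interviews")
```

At a 40 percent response rate, 1,000 interviews take roughly 2,500 calls; at 9 percent, the same poll takes more than 11,000.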

Americans' increasing unwillingness to answer their phones makes conducting telephone polls trickier and more expensive. But whether it also makes surveys less accurate hinges on something called "nonresponse bias" -- whether the people who answer polls, as a group, hold different opinions from the ones who don't.

So far, that hasn't really been the case.

"If four out of five people hang up on you, it doesn't make any difference compared to the olden days when only one out of five hung up on you, insofar as the people who hang up are fairly random to the people who don't hang up," David Dutwin, the executive vice president and chief methodologist for the survey firm SSRS, said. "And as shocking as it may seem, the research really shows that that's more true than false."

He pointed to data, first published in The Washington Post, that shows little change in the accuracy of several major national surveys, despite plummeting response rates.

[Chart: accuracy of several major national surveys over time, despite falling response rates. Source: David Dutwin, via The Washington Post]

"[T]he mere existence of low response rates doesn’t tell us anything about whether or not nonresponse bias exists," Scott Keeter, the Pew Research Center's director of survey research, wrote last summer. "In fact, numerous studies, including our own, have found that the response rate in and of itself is not a good measure of survey quality, and that thus far, nonresponse bias is a manageable problem."

Keeter noted that some nonresponse bias does exist: less-educated people and people of color are less likely to participate in polls, and people who aren't politically active are less likely to answer political surveys. But, he concluded, "most of these biases can be corrected through demographic weighting of the sort that is nearly universally used by pollsters."
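Here's a minimal sketch of what that correction looks like, post-stratifying on a single variable (education) with hypothetical numbers; real polls weight on several demographics at once, with population shares drawn from sources like the census:

```python
# Post-stratification weighting on one hypothetical variable (education).
# Each respondent's weight is the group's population share divided by its
# share of the raw sample, so overrepresented groups count for less.
population_share = {"college": 0.35, "no_college": 0.65}
sample_share = {"college": 0.50, "no_college": 0.50}  # college overrepresented

weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical candidate support within each group.
support = {"college": 0.40, "no_college": 0.60}

unweighted = sum(sample_share[g] * support[g] for g in support)

total_weight = sum(sample_share[g] * weights[g] for g in weights)
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support) / total_weight

print(f"unweighted estimate: {unweighted:.3f}")  # 0.500
print(f"weighted estimate:   {weighted:.3f}")  # 0.530, the true population mix
```

The unweighted poll splits 50-50 because it over-samples college graduates; reweighting recovers the 53 percent support the full population holds in this toy example. Note that weighting corrects imbalances only along variables the pollster actually measures and weights on.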

At the same time, we don’t know why it’s not more of a problem. As Cliff Zukin, professor of political science and former president of the American Association for Public Opinion Research, wrote last June: “Strangely, for some reason that no one really understands, well-done probability samples seem to have retained their representative character despite the meager response rate.”

That means response rates remain a concern, and pollsters have attempted to tackle the problem in different ways. Some are turning to online methods, which face a different but related set of challenges in recruiting people to complete polls. Telephone polls, meanwhile, are relying increasingly on calls to cell phones.

“We’ve been watching response rates go down and down for years -- this is not a new challenge for the industry,” said Mollyann Brodie, executive director for public opinion and survey research at the Kaiser Family Foundation and current president of the American Association for Public Opinion Research. “The jury’s out on whether it’s related to survey error, though. We’re all paying close attention and doing what we can to turn it around.”

While the issue remains a looming problem for the polling industry, there's little evidence that it had much of an impact on the latest high-profile polling miss -- the Iowa Republican caucus. Instead, data suggests much of the blame likely rests on inaccurate assumptions about who would turn out to vote, and on a "seismic shift" in opinion during the final days of the campaign. While pollsters and forecasters missed the outcome, many warned in advance that the results would be especially hard to predict.

"[I]n almost every election for a long time, there's been polls that have missed at state levels in primaries. Primaries are always particularly challenging," Dutwin said. "Simply put, there's a lot of reasons why a poll can go wrong, and non-response is just not the major contributor."

Brodie also cautioned against drawing too many conclusions from the results in Iowa.

"That would be like trying to predict the NBA championship based on the first quarter of the first game of the season," she said. "We know what we usually know -- Iowa is hard. We don’t know anything else about how polling will go this election season.”
