2012 Polling Accuracy: Right Winner, But Different Trends

The Polls of 2012: Were They Really Right?

WASHINGTON -- The 2012 elections were widely regarded as a victory for pollsters and poll aggregators, and for good reason. Most of the polls at the end of the campaign, especially in the critical swing states, correctly forecast President Barack Obama as the winner.

But not everyone agreed. "We spent a whole bunch of time," Obama campaign manager Jim Messina said after the election, "figuring out that American polling is broken."

Now that the vote count is nearly complete -- with all but two states, New York and Hawaii, having certified their election results -- the full story appears to fall somewhere in between. While the public polls collectively predicted an Obama victory, they also understated Obama's margins, both nationally and in a half-dozen battleground states. And, perhaps more importantly, some polls told very different stories about how much voter preferences shifted over the final weeks of the campaign.

On Tuesday, the National Council on Public Polls released its biennial report on polling accuracy. "Generally speaking, the national and state polls this year did okay," NCPP president Evans Witt told The Huffington Post, though "they didn't come quite as close to matching the election results as they had, for example, in 2008."

The average error in final-week national poll estimates of each candidate's support, as calculated by NCPP, was slightly higher this year (1.4 percentage points) than in 2008 (0.9), 2004 (0.9) or 2000 (1.1), but lower than in 1996 (2.1), 1984 (2.4) or 1980 (3.1).

[Chart: NCPP average candidate error scores by election year]

At the state level, the story was a little different. Statewide surveys conducted over the final week showed slightly more error than national polls, as they usually do, but the average errors for 2012 (1.9) were barely distinguishable from the last two presidential elections in 2008 (1.8) and 2004 (1.7).

The NCPP scores quantify the size of polling errors regardless of direction. A more intuitive way to consider the accuracy of all polls is to calculate statistical bias (whether the poll averages tended to err in either Obama's or Mitt Romney's favor).
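To make that distinction concrete, here is a minimal Python sketch using invented poll figures (not the actual 2012 releases): it computes both an NCPP-style unsigned candidate error and a signed bias on the Obama-Romney margin, where a positive bias means the polls understated Obama's lead.

```python
# A sketch, not NCPP's actual code; the poll figures below are invented,
# not the real 2012 final-week numbers.

ACTUAL_OBAMA, ACTUAL_ROMNEY = 51.0, 47.3   # certified national vote shares

# Hypothetical final-week national polls: (Obama %, Romney %)
polls = [(49, 48), (48, 47), (50, 47), (47, 48), (50, 46)]

# NCPP-style candidate error: average absolute miss on each candidate's share
candidate_error = sum(
    (abs(o - ACTUAL_OBAMA) + abs(r - ACTUAL_ROMNEY)) / 2 for o, r in polls
) / len(polls)

# Signed bias on the margin: positive means the polls understated Obama's lead
actual_margin = ACTUAL_OBAMA - ACTUAL_ROMNEY
bias = sum(actual_margin - (o - r) for o, r in polls) / len(polls)

print(f"Average candidate error: {candidate_error:.1f} points")
print(f"Average signed bias on the margin: {bias:+.1f} points")
```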

With virtually all votes counted, Obama's margin of victory over Romney stands at more than 4.7 million votes, or 3.7 percentage points (51 percent to 47.3 percent). A simple average of the final-week national polls by each of 18 organizations had Obama leading, but by a smaller 1.4 percentage point margin. Moreover, most of the individual polls understated Obama's margin; just five organizations produced final polls showing Obama winning by 3 to 4 points.

[Table: final-week national polls by organization]

The average statistical bias in final-week national polls was the largest since 1996, when fewer than half as many polls were conducted in the final week (nine) as this year (18).

[Chart: average bias in final-week national polls since 1980]

Public polls in the swing states also tended to understate Obama's margins, but the pattern was less consistent. The errors tended to be bigger in the swing states that Obama won by margins of 5 percentage points or better. Compared with the Pollster tracking model, errors in the margin were 3 percentage points or higher in Colorado, New Hampshire, Nevada and Iowa, and 2 percentage points or greater in Wisconsin and Virginia.

[Chart: poll errors on the margin in the swing states]

The bigger issue, however, may have been the conflicting stories that polls told about trends over the final weeks of the campaign. "The problem with the polls this year," Witt explained, is "that there were divergent narratives about what was going on."

Three charts illustrate this issue. First, consider the vote preference numbers reported by three organizations that used live interviewers, dual samples of landline and mobile telephone numbers, and a Gallup-style "likely voter model" to determine the likely electorate.

The final surveys by Pew Research, ABC News/Washington Post and Gallup all showed a similar trend following the first debate. All three showed Obama trailing Romney narrowly in mid-October, and all three showed Obama gaining 3 to 4 percentage points over the final week of the campaign. Many observers have attributed this late trend to a "surge in positive coverage" of Obama's handling of Hurricane Sandy during the final week of the campaign.

[Chart: ABC/Washington Post, Gallup and Pew Research trends]

Gallup famously showed consistently worse results for Obama over the course of the campaign, while the final Pew Research and ABC/Post polls came within a single percentage point of a perfect forecast of the outcome. Despite the wide disparity in the level of support, however, the three pollsters reported a common trend in late October.

But another national pollster told a different story. The YouGov/Economist tracking poll, which selected respondents from an "opt-in" Internet panel using a methodology that attempts to match the demographics and other characteristics of the U.S. population, showed less of a decline for Obama after the first debate and no consistent trend across its final four polls.

[Chart: YouGov/Economist tracking poll trend]

Why might the YouGov poll show less variation? One big reason is that every respondent came from a panel of individuals who had agreed in advance to participate in surveys and, more importantly, had answered a party identification question in December 2011. That earlier data allowed YouGov to weight its weekly surveys by a fixed measure of party preference for each respondent -- the answer given a year earlier -- rather than by a new self-report of party preference, which can change (a practice that most pollsters shy away from).
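A rough sketch of that design choice, using invented targets, respondents and vote choices rather than YouGov's actual data or procedure: each panelist's weight is anchored to the party identification recorded in December 2011, so week-to-week changes in who reports which party cannot move the weighted topline.

```python
# Hypothetical sketch of weighting to a frozen party-ID measure; the targets,
# respondents and vote choices are invented and are not YouGov's data.
from collections import Counter

# Party ID as recorded for each panelist in December 2011 (fixed),
# plus the vote intention reported in the current week's survey
respondents = [
    {"party_2011": "Dem", "vote": "Obama"},
    {"party_2011": "Dem", "vote": "Obama"},
    {"party_2011": "Rep", "vote": "Romney"},
    {"party_2011": "Ind", "vote": "Romney"},
    {"party_2011": "Ind", "vote": "Obama"},
]

# Assumed population targets for party identification (shares of the electorate)
targets = {"Dem": 0.38, "Rep": 0.32, "Ind": 0.30}

# Each weight compares the target share to the sample share of the respondent's
# 2011 party answer, so current-week shifts in reported party ID cannot change it
n = len(respondents)
sample_counts = Counter(r["party_2011"] for r in respondents)
for r in respondents:
    r["weight"] = targets[r["party_2011"]] / (sample_counts[r["party_2011"]] / n)

# Weighted vote shares
total_weight = sum(r["weight"] for r in respondents)
for candidate in ("Obama", "Romney"):
    share = sum(r["weight"] for r in respondents if r["vote"] == candidate) / total_weight
    print(f"{candidate}: {share:.1%}")
```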

Finally, consider the polls conducted in swing states. Each dot in the chart represents the Obama-Romney margin in an individual poll. The spread of dots is quite wide, since Obama ran stronger in some states than in others, but when the polls are combined, a regression trend line shows virtually no movement after Obama's lead narrowed following the first debate. (The 11 states were Colorado, Florida, Iowa, Michigan, Nevada, New Hampshire, North Carolina, Ohio, Pennsylvania, Virginia and Wisconsin.)

[Chart: swing state poll margins with regression trend line]
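For readers curious how such a pooled trend line can be computed, here is an illustrative sketch with invented polls (not the Pollster model or the actual 2012 data): margins are demeaned within each state so state-to-state differences in Obama's strength drop out, and an ordinary least-squares slope then summarizes any common movement over time.

```python
# Illustrative sketch of a pooled trend line across state polls; the polls
# below are invented and are not the actual 2012 data or the Pollster model.
from collections import defaultdict

# (state, days before Election Day, Obama-minus-Romney margin in points)
polls = [
    ("OH", 20, 3), ("OH", 10, 2), ("OH", 3, 3),
    ("FL", 18, 0), ("FL", 9, -1), ("FL", 2, 0),
    ("CO", 15, 2), ("CO", 5, 1),
]

# Demean margins within each state so the fit captures common movement,
# not differences in how strongly Obama ran from state to state
by_state = defaultdict(list)
for state, _, margin in polls:
    by_state[state].append(margin)
state_mean = {s: sum(m) / len(m) for s, m in by_state.items()}

xs = [-days for _, days, _ in polls]   # time axis, increasing toward Election Day
ys = [margin - state_mean[state] for state, _, margin in polls]

# Ordinary least-squares slope: average change in the margin per day
x_bar, y_bar = sum(xs) / len(xs), sum(ys) / len(ys)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum((x - x_bar) ** 2 for x in xs)
print(f"Estimated common trend: {slope:+.2f} points per day")
```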

A more apples-to-apples analysis of public swing state polls conducted by the Guardian's Harry Enten reached the same conclusion. "In the 67 situations where the same pollster had a pre- and post-Sandy survey," Enten wrote, "only 25 resulted in Obama doing better after [Hurricane] Sandy. Only in Virginia did Obama pick up more than 1pt."

Why did state polls show so little variation? One possibility is that the national shift occurred mostly in non-battleground states. However, Enten also found minimal movement in northeastern states most affected by the hurricane.

A more likely explanation centers on the complex, multi-question "index/cutoff" likely voter models used by Gallup, Pew Research, ABC News/Washington Post and other national pollsters. Though their methods vary widely, state-level pollsters typically use simpler models and many place less emphasis on self-reported voter enthusiasm.
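As a rough illustration of how an index/cutoff screen works in principle -- the questions, scoring and 60 percent turnout assumption below are invented, not any pollster's actual model -- each respondent is scored on several engagement items, and only the top-scoring share of the sample, sized to an assumed turnout rate, counts as a likely voter.

```python
# Illustrative index/cutoff likely voter screen; the items, scoring and the
# 60 percent expected-turnout figure are assumptions, not any pollster's model.

def likely_voter_score(r):
    """Sum of yes/no engagement items (0-4); real models use more questions."""
    return (
        r["thought_given_to_election"]  # has given a lot of thought to the election
        + r["voted_in_2008"]            # reports voting in the previous election
        + r["knows_polling_place"]      # knows where to vote
        + r["enthusiastic"]             # describes themselves as enthusiastic
    )

def apply_cutoff(sample, expected_turnout=0.60):
    """Keep only the top-scoring respondents, sized to the assumed turnout rate."""
    ranked = sorted(sample, key=likely_voter_score, reverse=True)
    return ranked[: round(len(ranked) * expected_turnout)]

# Tiny invented sample of registered voters
sample = [
    {"thought_given_to_election": 1, "voted_in_2008": 1, "knows_polling_place": 1, "enthusiastic": 1},
    {"thought_given_to_election": 1, "voted_in_2008": 1, "knows_polling_place": 1, "enthusiastic": 0},
    {"thought_given_to_election": 0, "voted_in_2008": 1, "knows_polling_place": 0, "enthusiastic": 0},
    {"thought_given_to_election": 0, "voted_in_2008": 0, "knows_polling_place": 0, "enthusiastic": 0},
    {"thought_given_to_election": 1, "voted_in_2008": 0, "knows_polling_place": 1, "enthusiastic": 1},
]

likely_voters = apply_cutoff(sample)
print(f"{len(likely_voters)} of {len(sample)} respondents pass the likely voter screen")
```

Because self-reported enthusiasm feeds such an index, a swing in enthusiasm alone can push respondents across the cutoff and move the topline, which may help explain why these national models registered larger shifts than the simpler screens more common in state polls.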

Pollsters are just beginning to reexamine their 2012 data, but questions about the accuracy of self-reported enthusiasm as a predictor of turnout have intensified since Election Day. For example, an infographic produced by Resurgent Republic, a GOP-affiliated polling organization, shows that self-expressed enthusiasm by various demographic subgroups just before the election had virtually no relationship to turnout (as measured by the change in the size of each subgroup in exit polls from 2008 to 2012).

[Chart: Resurgent Republic, self-reported enthusiasm vs. change in turnout by subgroup]

According to the Resurgent Republic analysis, key subgroups that supported Obama, such as Hispanics, young voters and unmarried women, "outperformed their 2008 turnout levels, even though these cohorts exuded less enthusiasm to get to the polls than Governor Romney's core supporters."

The main purpose of media polls, Republican pollster David Winston argued at a recent post-election forum, is to "tell a story." Although they generally forecast the right winner, public polls told very different stories about horse race trends in the final weeks of the campaign. A true judgment of accuracy would take those differing stories into account.
