Not All Polls Were Wrong In Nevada

WASHINGTON -- In Nevada, polls predicted the wrong winner of this week's Senate election. Or did they? While public media polls in late October consistently gave a slight advantage to Republican Senate challenger Sharron Angle, the internal campaign polls gave Democrat Harry Reid the edge. Campaign pollsters on both sides attribute the difference to a combination of greater care in modeling the demographics of the electorate, more persistence in reaching all sampled voters, and the added value of registered voter lists.

With 99% of the precincts counted, the Associated Press reports that Reid defeated Angle by five percentage points (50% to 45%), but the public polls told a different story. In Nevada, we logged 15 publicly released surveys fielded in October, and all but two -- including all eight fielded in the last 20 days of the campaign -- gave Angle nominal advantages of between 1 and 4 percentage points. While no single poll's margin was large enough to attain statistical significance, the consistency of the results shows that Angle's advantage was not the product of sampling error alone. Our final "trend estimate" gave Angle a nearly three-point lead (48.8% to 46.0%) -- enough to classify the race as "lean Republican."
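To see why, consider a rough back-of-the-envelope check. The sketch below deliberately oversimplifies -- it treats the 15 polls as independent coin flips in a perfectly tied race -- but it shows how unlikely that much one-sided consistency is under sampling error alone:

```python
from math import comb

# If the race were truly tied, each poll's nominal leader would be
# roughly a coin flip. Probability that at least 13 of 15 independent
# polls would lean toward the same candidate by chance alone:
n, k = 15, 13
p = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(f"P(at least {k} of {n} polls agree) = {p:.4f}")  # ~0.0037
```

Roughly four chances in a thousand. What the calculation cannot rule out is a bias shared across the public polls -- which is precisely what the campaign pollsters go on to describe.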

But the internal polls sponsored by the campaigns were telling their clients a different story. The final tracking polls conducted for the Reid campaign showed Reid leading narrowly throughout the fall campaign, according to Reid pollster Mark Mellman. Their last tracking poll, conducted during the final week of October, showed Reid leading by five percentage points. "There was really no point," Mellman told me, "where Reid was actually behind in this race."

Gene Ulm, partner at the Republican firm Public Opinion Strategies, confirms that their surveys for the Angle campaign showed a similar pattern. "We were typically tied in the low to mid 40s -- which as a rule are not good for any incumbent," Ulm said, while several days of tracking showed them "down by single digits."

Campaign pollsters working in other states have similar stories to tell. One example is Democrat Jef Pollock, who polled for Senator-elect Joe Manchin in West Virginia. Pollock told me just before the election, and confirmed in a conversation yesterday, that all of his internal tracking polls over the course of the campaign, including four weeks of rolling-average daily tracking in October, never once showed John Raese with even a nominal lead over Manchin. Manchin ultimately defeated Raese by an 11-point margin (54% to 43%).

While the Manchin internal polls showed him leading consistently (albeit narrowly), six automated surveys conducted for Fox News by Rasmussen Reports and its Pulse Opinion Research subsidiary between late September and mid-October showed Raese leading by margins ranging between 2 and 7 percentage points. The Rasmussen surveys ultimately converged with other public polls and gave Manchin a narrow lead, but they told a very different story about where the campaign stood for much of the fall.

So why did the internal polls, including those conducted on both sides in Nevada, produce different results? I talked to the pollsters involved. Here are their explanations:

Modeling the demographics. "One of the things about our ability to get it right and the failure of the public polls in Nevada," explains Reid pollster Mellman, "was the very careful modeling we did of what the electorate was going to look like."

Bill McInturff, co-founder of Public Opinion Strategies, the firm that polled for Angle, agrees. Speaking with reporters yesterday at a breakfast hosted by the Christian Science Monitor, McInturff described a similar process of modeling the likely electorate to ensure that their samples had the appropriate distribution of voters by age, gender, race and region, so that those factors remained constant from one survey to the next.

Manchin pollster Pollock agrees that public pollsters do not place "enough focus on making sure that sample composition is rational," that they "are not holding enough things constant" from survey to survey. He says he saw "large fluctuations" in tabulations by age and other demographics in successive surveys that "are just not going to happen" when the composition of the electorate is held more constant.
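In practice, "holding things constant" means weighting: each wave of interviews is adjusted so its demographic mix matches a fixed model of the likely electorate. Here is a minimal sketch of that post-stratification step; the target shares and respondent counts are invented for illustration, and real campaign models stratify on far more than age:

```python
# Post-stratification weighting: force each survey wave to match a fixed
# model of the electorate. All numbers below are invented for illustration.
electorate_model = {"18-34": 0.18, "35-54": 0.37, "55+": 0.45}  # assumed turnout shares
raw_sample = {"18-34": 90, "35-54": 310, "55+": 600}            # completed interviews

n = sum(raw_sample.values())
weights = {
    group: electorate_model[group] / (count / n)
    for group, count in raw_sample.items()
}
# Respondents in underrepresented groups count for more than one person,
# so the weighted age mix is identical in every wave.
for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

Because the targets stay fixed from wave to wave, movement in the numbers reflects changing opinions rather than a changing sample -- which is why Pollock's tracking did not show the "large fluctuations" he saw in the public polls.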

Voter files. What is the basis for these demographic models and less volatile samples of likely voters? For campaign pollsters, more often than not, the answer comes from the publicly available lists of registered voters compiled by each state.

"We have so much rich data nowadays in terms of telling us about past [voting] performance," says Pollock, including commercial data and statistical turnout modeling appended appended by data analysts. Today, he says, pollsters "have ten times more in terms of rich vote history" to draw on.

Mellman would not comment on whether his firm used voter lists to sample the Nevada electorate, but he argued that the "careful modeling" they did of the demographics of the Nevada electorate would have been "impossible without using a voter file."

While all the pollsters I talked to spoke of the value of demographic data gleaned from voter files, not all pollsters endorse drawing samples from them. McInturff says his firm remains committed to the more traditional approach of calling randomly generated numbers (an approach known to pollsters as "random digit dial" or RDD) because of the need to match voter lists to commercial telephone number listings. "Phone matched lists don't match well for blacks, Hispanics, younger voters and especially younger women," he says -- groups that do not typically vote Republican. McInturff explains that they will sample from voter lists in California, where most voters provide a phone number when they register to vote.
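For list-based pollsters the workflow looks roughly like this sketch: screen the state's voter file on vote history, keep the records that matched to a phone number, and sample from what remains. The file name, columns and screening rule below are hypothetical -- and the phone-match step is exactly where McInturff's coverage concern arises:

```python
import csv
import random

# Hypothetical voter-file sample; the file and column names are invented.
with open("nv_voter_file.csv", newline="") as f:
    voters = list(csv.DictReader(f))

# Likely-voter screen: voted in at least two of the last three generals.
history = ["voted_2004", "voted_2006", "voted_2008"]
likely = [v for v in voters if sum(v[c] == "Y" for c in history) >= 2]

# Keep only records that matched to a phone number -- the step that,
# per McInturff, disproportionately drops younger and minority voters.
reachable = [v for v in likely if v.get("phone")]

sample = random.sample(reachable, k=min(600, len(reachable)))
```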

Persistence. Traditional polling methods have always placed a great premium on making multiple efforts to contact and interview the randomly identified respondent before substituting another respondent. Don't do that, Mellman says, and the result is a "random sample of easy to reach voters." He reports that in his Nevada surveys, the easiest to reach voters were typically more Republican than those they had to dial 5 to 6 times before completing an interview. "Those public polls that are moving quickly and cheaply," he adds, "and don't do the callbacks can't possibly get the right answer."

Angle pollster Ulm agrees that their polls, "particularly in Nevada had difficulty reaching and connecting with Hispanic audiences which are a pivotal vote," adding that this difficulty is "even higher this year" than in the previous election. Pollsters struggle, Ulm says, "to reach anybody who's under 45 or of a lower socio-economic class."

Diane Feldman, a Democratic pollster who conducted surveys for Governor Ted Strickland in Ohio, agrees: "We consistently saw that those who were modeled as Democratic on the [registered voter] file were more difficult to reach by phone -- with higher rates of both no answer and busy signals."

While the national media polls typically use the same rigorous call-back procedures that the campaign pollsters describe, few public pollsters at the state level disclose the number of attempts they make to each sampled number. However, the one-night polls used by Rasmussen and some other automated pollsters preclude multiple calls over successive nights.
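The arithmetic behind Mellman's warning is easy to see in a toy simulation. Every number below is invented; the point is only that if the easiest-to-reach voters lean even modestly more Republican, a one-night poll with no callbacks tilts the same way:

```python
import random

random.seed(1)

# Invented toy model: 40% of voters answer on the first call and lean
# more Republican; the rest need repeated attempts and lean more Democratic.
def voter():
    easy = random.random() < 0.40
    is_dem = random.random() < (0.44 if easy else 0.52)
    return easy, is_dem

population = [voter() for _ in range(100_000)]

one_night = [dem for easy, dem in population if easy]  # no callbacks
everyone = [dem for _, dem in population]              # persistent callbacks

print(f"No callbacks:   {sum(one_night) / len(one_night):.1%} Democratic")
print(f"With callbacks: {sum(everyone) / len(everyone):.1%} Democratic")
```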

Cell phones. Nationally, surveys that sampled both landlines and cell phones produced a more accurate forecast of the national House vote than surveys that did not. Mellman says that voters reachable by cell phones only are "critically important," but not just because cell-phone-only voters have differing demographics. "Two years ago, four years ago," he says, "you could just weight by age, weight by race, weight by ethnicity, and get the right answer," because young people reachable only by cell phone gave the same answers as those who were not. Now, he says, "that is no longer true…a pollster that doesn't insist on contacting cell phones is doing a disservice to their client."
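A toy illustration of why weighting stopped working (again, every number is invented): if cell-only young voters hold different opinions from landline young voters of the same age, no amount of age weighting applied to a landline sample can recover the true figure, because the cell-only voters are simply absent from the frame:

```python
import random

random.seed(2)

# Invented toy model: among young voters, the cell-only half now answers
# differently from the landline half.
def young_voter(cell_only):
    dem_share = 0.60 if cell_only else 0.48  # same age, different answers
    return random.random() < dem_share

all_young = [young_voter(random.random() < 0.5) for _ in range(50_000)]
landline_young = [young_voter(False) for _ in range(50_000)]

print(f"All young voters:       {sum(all_young) / len(all_young):.1%} Democratic")
print(f"Landline young voters:  {sum(landline_young) / len(landline_young):.1%} Democratic")
```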

McInturff agrees about the importance of calling cell phones but has a different perspective on the cost. "If you don't do cell phones," he says, "you underrepresent the people under 35, you underrepresent Hispanics and especially in Nevada, those are huge constituencies." He adds that he was "surprised by the Hispanic and African-American turnouts" in Nevada, and "among the reasons is, you can't track the cell phones, they are too expensive."

One of the reasons for this greater expense is a federal regulation that prohibits pollsters from using computer autodialers to place calls to cell phones. The need to hand-dial cell phone numbers inflates the cost of cell phone sampling. That added cost is why, speaking at the Monitor breakfast, McInturff announced plans to lobby members of Congress to "change the law and create an exemption so that political polling can use auto dialers with cell phones." If not, he said, "in 2012 or '14 this industry's going to collapse because too much of this stuff is open to being crap."
