Attacking CNN: Our Time Asks Whether Its Polls Miss Adults Under 30

WASHINGTON -- A fledgling advocacy organization attacked CNN's pollsters in April, but the transgression it alleged did not pan out. Still, the details of this dust-up help illustrate two big challenges for opinion polling: First, the growing number of cellphone-only households is making it much harder for pollsters to reach adults under 30; second, when pollsters are opaque about their methods, the rest of us struggle to make sense of their data.

It all began when Our Time co-founder Matthew Segal noticed something odd in a set of tabulations published by CNN. The data came from a recent national poll conducted for the network by the Opinion Research Corporation (ORC). Segal and his colleagues, who advocate on behalf of Americans under 30, were puzzled that the questions on marijuana legalization and gay marriage were broken out by demographics like age, gender and race but included no data for the 18-to-34 age group. The pollsters had instead inserted the abbreviation "N/A" in place of the numbers.

"The N/A jumped out at us," Segal told The Huffington Post, since the abbreviation commonly means "not available" or "not applicable." He assumed that CNN had not interviewed any respondents between the ages of 18 and 35.

Without contacting the cable news network, Our Time launched an online petition urging visitors to tell CNN that "N/A is not OK when failing to poll Americans under 35." The group's website initially declared that "CNN and Opinion Research think your opinion only matters if you're over 35." Segal said the petition garnered more than 2,500 signatures in just 12 hours.

But as it turns out, CNN and ORC did interview younger Americans -- just not enough of them for a reliable, stand-alone tabulation.

CNN sent Segal a statement explaining that "the 18-34 year-old age group is included in all surveys conducted and released by CNN," but added that "the sample size was too small for statistically valid analysis." The statement also explained that CNN, like all other media pollsters, "adjusts many groups to reflect their actual share of the total adult population as reported by the U.S. Census, so the overall results are unaffected by the small number of 18-to-34 year olds interviewed."

The statement omitted any explanation for the small number of interviews with younger adults and made no reference to CNN's continued use of landline-only samples.

Our Time also received a phone call from Keating Holland, CNN's polling director. Segal says he pressed Holland to disclose how many younger respondents had actually been interviewed. As Segal subsequently wrote in a Huffington Post blog entry, Holland "ballparked that roughly 9-10 percent of the 824 poll respondents were under the age of 35, but refused to release the raw number of young survey respondents."

Segal goes on to explain that the group aged 18 to 34 should represent just over 30 percent of the adult population, a fact supported by the Census Bureau's 2011 Statistical Abstract. So, since CNN's sample included just 9-10 percent, the pollsters had to weight the younger respondents up by a factor of roughly three to correct the age skew.
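Segal's back-of-the-envelope math is a straightforward post-stratification calculation: each group's weight is its Census population share divided by its share of the unweighted sample. A minimal Python sketch, using the illustrative shares quoted in the article (not CNN's actual, undisclosed weighting scheme):

```python
# Hypothetical post-stratification weights: Census share / unweighted
# sample share. Shares below come from the figures in the article
# (roughly 30 percent of adults are 18-34; ~9-10 percent of respondents).
population_share = {"18-34": 0.30, "35+": 0.70}    # Census-based targets
sample_share = {"18-34": 0.095, "35+": 0.905}      # unweighted poll shares

weights = {
    group: population_share[group] / sample_share[group]
    for group in population_share
}

for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

The 18-to-34 group ends up with a weight a little over 3, matching the "factor of roughly three" Segal describes.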

Once informed of CNN's actual practices, Segal and his colleagues struck their original claim that the network was "failing to poll" younger Americans and expressed regret "that a part of our messaging was technically inaccurate," Segal wrote.

However, Our Time remains convinced that "N/A is not OK." Its petition remains in place, and Segal now argues that the number of younger adults interviewed by CNN is "too few to indicate reliably the views of 70 million young Americans." Our Time wants the polling industry to "innovate" and "make the investment to reach our demographic in order to reflect public opinion accurately."

This controversy raises two important questions: First, why are CNN's pollsters having so much trouble reaching younger Americans? Second, does it matter that CNN's landline sample misses so many younger Americans that it has to weight the youngest age group up by a factor of at least three?

The answer to the first question is easy: CNN's unweighted poll was light on younger Americans because its sample covered only households with landline telephone service. As documented by the National Center for Health Statistics, the percentage of American households with a cellphone but no landline has risen steadily over the last ten years, especially among younger Americans. As of last year, 24.9 percent of all adults had wireless service only, but among those aged 25 to 29 that figure was more than half -- 51.3 percent.

Four years ago, the Pew Research Center released a chart showing that the unweighted percentage of 18-to-34-year-olds in its landline samples had declined steadily to roughly 20 percent.

As Pew Research reported last year, those trends have continued unabated. Thus, the age skew in the unweighted data obtained by CNN is about what we would expect from its landline-only samples.

The second question -- can CNN compensate for its shortage of young adults through weighting? -- is tougher. The answer, according to the available data, is that demographic weighting alone does not entirely make up for the missing cellphone-only respondents.

The best support for this conclusion comes from a series of studies involving parallel surveys using dual samples of both landline and mobile phones. Pew Research, which conducted the surveys, has produced a series of reports since 2007 comparing the weighted results obtained by combining landline and mobile phone samples with those obtained from landline calling only.

At first, Pew Research found few significant differences between the two types of surveys that persisted after weighting. But the frequency has grown as the size of the cellphone-only population has steadily increased. Its report last year examined 72 questions and found 29 instances where the difference was 3 percentage points or greater when comparing weighted results from landline-only calling with those from a "dual-frame" methodology, which combines both mobile and landline samples.

Last fall, Pew Research reported that support for Republican Congressional candidates was significantly higher in landline-only samples. Republican margins were roughly five percentage points greater on the weighted landline-only samples (+12.7) than on the combined landline and cellphone samples (+7.6), with the latter estimates coming very close to the actual results of the 2010 elections.

Similarly, the average of final surveys conducted just before the midterm elections by pollsters that called landlines only (including CNN) showed a bigger Republican margin than the average from pollsters that called combined samples of landlines and cellphones. Again, the landline-only polls typically overstated the Republican margin, while the average of the dual-frame samples came much closer.

Part of the reason for these differences is that younger adults with landlines were more likely to vote Republican in recent years than younger adults reachable only by cellphone.

"Estimates from the combined landline and cell sample based on the last three pre-election Pew Research surveys showed Democrats with a 53%-to-38% lead over Republicans among registered voters younger than age 30," the 2010 report noted. "But estimates based only on interviews from the landline sample showed Democratic and Republican candidates running about even among young voters -- 49% said that if the elections were held today they would vote for the Democratic candidate, while 45% backed the Republican candidate in their district. The difference in the margin between the combined sample and the landline sample was 11 points."
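The 11-point figure Pew cites is simply the gap between the two Democratic margins; restated in a few lines of Python using the numbers from the quoted 2010 report:

```python
# Democratic lead among registered voters under 30, by sample type,
# per Pew Research's 2010 report.
dual_frame_margin = 53 - 38   # combined landline + cellphone sample
landline_margin = 49 - 45     # landline sample only

gap = dual_frame_margin - landline_margin
print(dual_frame_margin, landline_margin, gap)  # 15 4 11
```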

These differences help explain why the most recent national surveys conducted by Pew Research, as well as by ABC News/Washington Post, Associated Press/GfK, CBS News/New York Times, Fox News, Gallup (and USA Today), McClatchy/Marist College, NBC News/Wall Street Journal, Quinnipiac University and SurveyUSA, all sampled both landline and mobile telephones. Most of these organizations shifted to dual-frame sampling during 2010 or later.

Will CNN be next? In an email response to The Huffington Post, Holland hedged.

CNN "may change in the future" but the network's polling director first wants to "hear the latest round of cellphone papers" presented at next week's conference of the American Association for Public Opinion Research (AAPOR), Holland said. There is "[n]o sense in making a change that is based on year-old assumptions when there is (presumably) fresh data about to be released -- both on the best practices and whether adding cell phones does more harm than good."

To be fair, the shift to a dual-frame methodology is far from trivial. As Pew Research explains, the costs of a completed cellphone interview can be as much as twice that of a landline call, and the process of combining and weighting two very different samples remains new and complex. Some pollsters fear that the experimental nature of dual-frame sampling may be introducing new forms of bias.

Whatever CNN's decision, this episode does illustrate how confusing this topic can be -- especially when pollsters are opaque about their methods. Our Time's initial petition did get its facts wrong, but the polling releases from CNN provide no description of the sample design, no description of the variables used for weighting and no indication of whether the reported margin of error has been adjusted to reflect the greater error that usually results from weighting. These are all mandated by AAPOR's Standards for Minimal Disclosure.

Despite Our Time's initial misreading of CNN's "N/A" abbreviation, the group continues to ask an important question: How can a leading cable news network continue to produce survey data based on samples that fail to cover nearly half the population of young adults?