CBS and The New York Times unveil a massive new online panel tracking survey conducted by YouGov, showing the GOP with a slight edge in the Senate. The Upshot's David Leonhardt explains the shift in the Times' polling standards, and how they will continue to evolve. This is HuffPollster for Monday, July 28, 2014.
NEW TIMES/CBS/YOUGOV POLL DATA FINDS GOP SENATE EDGE -
Anthony Salvanto, Doug Rivers and Andy Guess: "A new CBS News/New York Times Battleground Tracker estimate finds the Republicans positioned to take the Senate this year, with a likely 51-49 seat edge if the November election were held right now….The data is based on more than 100,000 interviews conducted online for CBS News and the New York Times by YouGov exclusively as part of this joint project, with samples for every individual Senate race and House race, oversamples in competitive races, and each matched to the demographics and voter characteristics in the states and districts….The seat estimate is made through a computer simulation model that considers all the possible outcomes that might arise out of the data, and points us to the likeliest overall Senate tally, as though this were election day. ...Republicans' current edge looks like enough to win at the moment, but that edge is politically tenuous and statistically narrow. It's based on a string of razor-tight races that are all-but tossups, notably one-point race estimates that narrowly favor the GOP in Louisiana, North Carolina, Iowa and Michigan." [CBS]
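The "computer simulation model" described above is, in essence, a Monte Carlo exercise: treat each race as a random draw governed by an estimated win probability, repeat many times, and report the likeliest Senate total. Here is a minimal sketch of that idea; the race probabilities and safe-seat count below are purely illustrative, not the actual CBS/YouGov estimates or model.

```python
import random
from collections import Counter

# Hypothetical GOP win probabilities for a few contested seats; safe seats
# are folded into a fixed base count. Illustrative numbers only.
RACE_PROBS = {"LA": 0.52, "NC": 0.51, "IA": 0.52, "MI": 0.48, "AR": 0.60, "AK": 0.55}
GOP_SAFE_SEATS = 48  # seats assumed won outside the simulated battlegrounds

def simulate_senate(n_sims=10_000, seed=42):
    """Draw each race independently and tally the GOP seat count per simulation."""
    rng = random.Random(seed)
    tallies = Counter()
    for _ in range(n_sims):
        seats = GOP_SAFE_SEATS + sum(rng.random() < p for p in RACE_PROBS.values())
        tallies[seats] += 1
    return tallies

tallies = simulate_senate()
likeliest = tallies.most_common(1)[0][0]
print(f"Likeliest GOP seat count: {likeliest}")
```

The real model also accounts for correlation across races (a national swing that moves several tossups together), which is why a string of one-point leads makes the overall edge "statistically narrow."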
Bigger news: Times/CBS embrace 'non-probability' internet polling - Nate Cohn: "Random-digit dialing has long been the gold standard for public polling, but declining response rates may be complicating the ability of telephone polls to capitalize on the advantages of random sampling...As the young voters who are less likely to respond to telephone surveys become an ever-greater share of the population over time, it is probably more important for analysts to have an ensemble of surveys using diverse sampling and weighting practices...There are still questions about the effectiveness of web panels, which can reach only the 81 percent of Americans who use the Internet....Another issue is that the YouGov panel does not use probability sampling, the theoretical underpinning of modern polling....Instead, YouGov attempts to build a large, diverse panel and then match its panelists to demographically similar respondents from the American Community Survey, an extremely rigorous probability survey conducted by the Census Bureau. This step is intended to mimic probability sampling. But it can require significant assumptions about the composition of the electorate, including partisanship. These assumptions are contestable and based on varying amounts of evidence. All of this is controversial among survey methodologists, who are vigorously debating whether a non-probability web panel should be used for survey research...While the methodology debate rages, it’s probably best to have an eye on a diverse suite of surveys employing diverse methodologies, with the knowledge that none are perfect in an increasingly challenging era for public-opinion research." [NY Times]
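The matching step Cohn describes — pairing each record in a rigorous probability sample (the ACS) with the demographically closest available panelist — can be sketched as a greedy nearest-neighbor match. The field names and mismatch-count distance below are hypothetical; YouGov's actual procedure uses a richer set of variables and a more sophisticated matching algorithm.

```python
# Toy illustration of sample matching: for each record in a probability-based
# target frame (e.g., ACS respondents), pick the closest available panelist.

def distance(frame_rec, panelist):
    """Count mismatches across categorical demographics (a crude metric)."""
    keys = ("age_bracket", "gender", "race", "education", "region")
    return sum(frame_rec[k] != panelist[k] for k in keys)

def match_sample(target_frame, panel):
    """Greedy one-to-one matching; real implementations are more elaborate."""
    available = list(panel)
    matched = []
    for rec in target_frame:
        best = min(available, key=lambda p: distance(rec, p))
        matched.append(best)
        available.remove(best)  # each panelist is used at most once
    return matched
```

The matched sample inherits the demographic profile of the probability frame, which is what lets the panel "mimic probability sampling" — up to the assumptions Cohn flags about which variables matter, including partisanship.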
A BFD in the polling world - Pew Research Director of Survey Research Scott Keeter: "This is a very big deal in the survey world. Until now, no major news organization has put its brand on using surveys based on non-probability methods. The move has set off a very lively debate on Twitter among journalists and pollsters. There are strong opinions about the issue of non-probability samples...I can’t predict what other organizations are going to do, but I do expect this to spur more experimentation – and that’s a good thing for the field. Because the Times and CBS News have good reputations for transparency, I fully expect that we will learn a lot more about the YouGov methodology in the coming weeks. That’s a good thing as well." [Pew Research]

-Washington Post pollster Scott Clement, via Twitter, was more critical: "News of the weekend: @nytimes and @CBSNewsPoll abandon decades of quality research methods." [@sfcpoll]
Our take - HuffPollster is no stranger to YouGov and its work. Doug Rivers, now YouGov's Chief Innovations Officer, co-founded the original Pollster.com, and the company was our principal sponsor. The explanation that Nate Cohn outlined for The Times' decision to publish YouGov's panel data mirrors our own rationale for launching the HuffPost/YouGov polling collaboration nearly two years ago. The reality is that all media polls, whether they begin with random samples or not, now collect data that shows considerable bias in its raw form. All such polls attempt to remove or correct those errors, usually by weighting the data. The use of non-probability panels involves a trade-off: The panels themselves are far from representative, but they come with a richer set of tools to help correct bias. YouGov, as Cohn explains, "has tracked many of its respondents over months, if not years, which gives it additional variables, such as a panelist’s [self-reported] voting history, to try to correct for non-response. After the first 2012 debate, YouGov showed less of a swing than many other polls, and its final pre-election polls were as good as or better than many other surveys in forecasting the results."
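The weighting that "all such polls" apply is typically raking (iterative proportional fitting): respondent weights are adjusted, one variable at a time, until the weighted sample's marginals match known population targets. A minimal sketch, with illustrative variable names and targets rather than any pollster's actual scheme:

```python
# Minimal raking (iterative proportional fitting) sketch. Weights are scaled
# so each variable's weighted category shares converge to the target shares.

def rake(respondents, targets, n_iters=50):
    """respondents: list of dicts of categorical traits.
    targets: {variable: {category: population_share}}. Returns weights."""
    weights = [1.0] * len(respondents)
    for _ in range(n_iters):
        for var, shares in targets.items():
            # Current weighted total in each category of this variable.
            totals = {cat: 0.0 for cat in shares}
            for w, r in zip(weights, respondents):
                totals[r[var]] += w
            grand = sum(totals.values())
            # Scale each respondent's weight toward the target share.
            for i, r in enumerate(respondents):
                cat = r[var]
                if totals[cat] > 0:
                    weights[i] *= (shares[cat] * grand) / totals[cat]
    return weights
```

The trade-off HuffPollster describes shows up here: weighting can only correct along the variables you choose to rake on, so a panel's extra variables (like self-reported vote history) expand what the correction can reach.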
Spurs a change in the New York Times polling standards - Until mid-day on Monday, the New York Times Poll Watch page continued to point to the New York Times Polling Standards as last revised in 2011. Those standards offered an unambiguous ban on the reporting of non-probability internet polls:
Self-selected or 'opt-in' samples — including Internet, e-mail, fax, call-in, street intercept, and non-probability mail-in samples — do not meet The Times’s standards regardless of the number of people who participate....In order to be worthy of publication in The Times, a survey must be representative, that is, based on a random sample of respondents. Any survey that relies on the ability and/or availability of respondents to access the Web and choose whether to participate is not representative and therefore not reliable.
Sometime Monday afternoon, the Poll Watch page started pointing to an updated version that makes no mention of the old rules barring publication of specific categories of "bad polls" [CLARIFICATION: The first paragraph of the quoted text below also appeared in the 2011 Times Standards]:
Polls – both those that meet our standards and those that do not – may be used in larger discussions of the polls themselves as long as it is clear to readers that some of the polls may not be valid measures of opinion...
The world of polling is currently in the midst of significant change, and The Times has begun a process to review its polling standards. While the process is ongoing, the paper will be making individual decisions about which polls meet Times standards and specifically how they should be used. As technology changes, we expect there will be multiple methods for capturing public opinion; we also fully expect that there will continue to be a proliferation of polls that do not meet our standards. (A copy of the standards released in May 2011 is available online.)
Is this an amendment, or do the new standards entirely supplant the old? - It's "somewhere in between," Upshot Editor David Leonhardt explained to HuffPollster via email. "It's an explanation that there will be a new document in the near future. It explains where we are in the interim."
More from Leonhardt's email:
Response rates on traditional polls have fallen, and a few of the newer polls are making serious efforts to reflect public opinion accurately. Given these developments, we have started a process to review our polling standards to decide which polls today meet them. We'll make the decision empirically….Until the process is complete, we'll make individual decisions about which polls meet our standards and how to use them.
Obviously, there is a difference between dropping a single poll number into a story set in a swing state, to use as a snapshot of that race, and talking about a range of polls in the context of a forecasting model or of detailed data analysis. The Times has been doing the latter for several years now, and we'll continue to. Our work with YouGov is partly an attempt to learn more about some of the most promising efforts to capture public opinion through an online panel, with all the benefits and limitations that we described.
MENTIONING OBAMA DRASTICALLY CHANGES PARTISAN VIEWS ON ACA DELAY - Kathy Frankovic: "Whether or not people support delaying the business health insurance mandate depends on whether or not you mention the president in the question. Americans take their cues from the people associated with an action. In last week’s Economist/YouGov Poll, opinions about the employer mandate and its delayed implementation may have suggested significant confusion, but really indicated how partisans react to mentions of President Obama. Those who favored the mandate (who are more Democratic than Republican), when told the Obama Administration had delayed implementation, supported the delay. Those opposed to the employer mandate (more Republicans than Democrats) opposed the delay in implementation. This week, respondents were asked the same question without the information that the delayed implementation was a decision made by the Obama administration. And the responses became much less partisan than they had been. Fundamentally, without party cues and the indication it was the Obama administration that delayed implementation, Americans aren’t quite sure what to think about delaying the employer mandate." [YouGov]
RELIGION, POLITICS REMAIN TIGHTLY LINKED - Frank Newport: "Even as overall party identification trends in the U.S. have shifted over the past six and a half years, the relationship between religion and party identification has remained consistent. Very religious Americans are more likely to identify with or lean toward the Republican Party and less frequently identify with or lean toward the Democratic Party, compared with those who are moderately or nonreligious….From a practical politics standpoint, Republicans face the challenge of expanding their party's appeal beyond the minority of Americans who are very religious, and appealing to Hispanics and Asians given that even the most religious of these growing groups tilt Democratic, albeit not as much as others in these groups who are less religious. Democrats face the challenge of attempting to broaden their party's appeal beyond the base of those who are moderately or nonreligious, a tactic that most likely will require effort to frame the party's positions on social justice and equality issues in a way that is compatible with a high degree of religiousness." [Gallup]
HAMAS TAKES MORE BLAME THAN ISRAEL FOR VIOLENCE - Pew Research: "As fighting continues to rage in Gaza amid calls for a cease-fire, about twice as many Americans say Hamas (40%) as Israel (19%) is responsible for the current violence. Just a quarter (25%) believe that Israel has gone too far in responding to the conflict; far more think Israel’s response has been about right (35%) or that it has not gone far enough (15%). A majority of Republicans (60%) say Hamas is most responsible for the current violence. Democrats are divided: 29% say Hamas is more responsible, 26% Israel, while 18% volunteer that both sides are responsible." [Pew]
HUFFPOLLSTER VIA EMAIL! - You can receive this daily update every weekday via email! Just click here, enter your email address, and click "sign up." That's all there is to it (and you can unsubscribe anytime).
MONDAY'S 'OUTLIERS' - Links to the best of news at the intersection of polling, politics and political data:
-Ben Highton doubts Georgia is turning Democratic. [WashPost]
-Seth Masket ponders why the president's approval rating isn't higher. [Pacific Standard]
-CNN/ORC finds Mitt Romney winning in a rematch against Barack Obama, but losing in a matchup with Hillary Clinton. [CNN]
-Aaron Blake defends the CNN Obama/Romney question. [WashPost]
-An Oregon Republican accuses her Democratic rival of push polling. [Salem Statesman-Journal]