If you’re hoping to find out who "won" the GOP debate just minutes or hours after it ends Wednesday night, you'd probably be better off waiting a few days.
It's not that there won't be plenty of immediate data to look at, from focus groups and instant polls to social media and web traffic. But as the past few months have shown, none of those are especially reliable guides.
A focus group led by GOP pollster Frank Luntz after the first Republican debate in August suggested that voters had soured on Donald Trump -- a prediction not borne out in the business mogul's poll numbers, which continued to rise sharply.
Initial reactions to the first Democratic debate also diverged from the results of later polling on the race. Another Luntz focus group named Sen. Bernie Sanders (I-Vt.) the clear winner and suggested many of Hillary Clinton's supporters had deserted her. Other focus groups and instant online polls told a similar story, as did a surge of web searches and Facebook follows for Sanders. Days later, though, scientific surveys showed a more prosaic outcome: While both candidates made a good impression, the debate, if anything, strengthened Clinton's lead.
Both of those reactions can be true, to some extent: Media coverage, which was largely favorable to Clinton, is at least as important as a candidate's actual performance in shaping opinion. Respondents in later polls are likely influenced by the media’s dominant narrative, especially if the poll includes people who didn’t watch the entire debate.
But there are plenty of other reasons to be skeptical of most of the attempts to instantly gauge reaction after a debate.
Online "instant polls," which don't weight their responses and may allow people to vote multiple times, often end up rewarding enthusiasm over numbers, reflecting the views of a small group of supporters rather than the nation's debate-watchers as a whole.
"The results rely on a self-selecting group of respondents with no regard to political affiliation, age, country, or even whether the person doing the responding actually watched the debate," Slate's Josh Voorhees wrote, explaining why a reader poll posted to the site's homepage had shown Sanders dominating. "They also tend to favor those candidates with active and impassioned fans -- something that Bernie’s fundraising numbers and campaign crowds suggest he clearly has in spades."
Focus groups, a similarly imperfect snapshot of the American electorate, are also often dominated by the loudest voices.
"While these discussions make for far more compelling television than dry survey statistics, they have important limitations," HuffPost's Mark Blumenthal wrote in a 2008 column. "Every group is a small, non-random sample, and it is hard to know the degree to which the views of participants may be influenced by the atmospherics of the telecast, the probes of the moderator or the opinions expressed by others in the group."
And even if a focus group or online poll accurately takes the measure of public opinion, there's another, more fundamental challenge in trying to figure out who "won." Unlike a sporting event or an election, a debate doesn't necessarily produce a clear-cut winner or an objective way of naming one. Supporters of a politician are likely to think their preferred candidate won, regardless of the outcome, and even a strong performance can have relatively little impact on the polls.
"[C]andidate debates, at least as commonly practiced in the United States, do not contain measures for determining an actual winner," political scientist Seth Masket wrote earlier this month in the Pacific Standard. "The audience that tuned in to watch invariably saw their candidate doing well. It re-affirmed the things they liked about their preferred candidate and reminded them of what they didn't like about the other one."
Mark Blumenthal contributed reporting.