Last week, I had the opportunity to attend the annual conference of the American Association for Public Opinion Research. The conference featured presentations from private-sector, government, and academic researchers about their methods and findings, in addition to the release of reports by two AAPOR task forces: one on online panels, one on cell-phone surveying.
On the whole, I had a phenomenal experience. I truly enjoyed the spirit of collaboration as attendees and presenters shared best practices and supported each other's research. I had an opportunity to meet an impressive group of established public opinion researchers, and also got to meet many young students and professionals who are doing fascinating work. (I now believe that pollsters are excellent conversationalists precisely because they're so good at asking questions.)
One of the great benefits of attending AAPOR came in seeing how research is conducted by those in other industries. For example, political pollsters deal with a variety of pressures that are lessened in academic research: the speed of data production, the need to insert your findings into the conversation quickly, as well as client demands and cost pressures. While a major academic study may consume years of a doctoral candidate's life, a campaign poll typically needs rapid turnaround and immediate release in order to remain "fresh." A campaign rarely, if ever, has time to improve its coverage by conducting face-to-face interviews of populations missed by landline and cell-phone surveys, for instance. Weeks or months of post-stratification are a luxury not afforded to those in the world of campaign polling.
At AAPOR, you get exposure to "the ideal": projects refined by the most advanced and rigorous techniques, exploring the toughest challenges of sampling, processing, and analysis that the survey research field faces. It highlights ways to improve your methods, regardless of field, and helps a researcher facing time and cost pressures make informed decisions about what is critical to producing useful data. And for a political pollster, AAPOR is a great time to focus on these issues exclusively, away from discussion about whose clients won more races or whose predictions came closest.
There was one thing that surprised me a bit about the AAPOR conference, and I'd love to hear comments on this from those who have been to the conference before or who have been involved in the organization more deeply. Essentially, if AAPOR is the "American Association for Public Opinion Research," one might logically assume the conference would devote a substantial portion of time to the findings of public opinion research in addition to the methods of collecting data.
A great example of a panel that balanced these two was the Gary Langer/Matthew Warshaw presentation about ABC News' "Where Things Stand" research in Afghanistan. I walked away with a greater understanding of how to conduct research in the most incredibly challenging circumstances, but I also learned what the people of Afghanistan think about the future of their nation.
However, the vast majority of the conference's content was about the process of social science research. In some cases, it was not even about opinion research in the strictest sense of the word "opinion," but rather about the collection of demographics. This is understandable: at a professional conference, everyone is trying to figure out how to do what they do better. Still, I felt there was a very narrow focus on the methods of research and less attention paid to what we're finding.
Why do we conduct opinion research in the first place? We do it to learn about certain groups of people and audiences. Developing a research methodology that perfectly captures cell-only populations is only as useful as the research findings it generates. So what are we finding? Opinion research conducted by another organization about, say, shifting attitudes in America about the media has a great deal of application to my work as a political pollster, even if that research presentation does not change how I do my work in the future.
With the wealth of knowledge possessed by the various professional and academic organizations in AAPOR, it would be great to see more panels highlighting the findings of public opinion professionals.
In the end, I think it is critical that more political pollsters take the opportunity to focus on their methods in order to create the highest-quality data. We often measure political pollsters by the accuracy of their results and how often their numbers are "on the money" when final ballot counts are in. A conference like AAPOR gives researchers the tools to make sure they are right rather than lucky. There is a great deal that political polling professionals can learn from their counterparts in other industries, and I feel very thankful that I had the opportunity to attend this conference and learn from their experiences.