Rating the YouTube Debate Questions


Jack Muse reported on these pages that Monday's first YouTube debate "has been widely praised in all corners of the media" because of the one respect in which it really differed from previous debates: the use of citizen-generated video questions.

But what is a good question? Which of the questions in Monday's debate were better and which worse? Our students and faculty at the Ben Franklin Transatlantic Fellows Initiative and the Wake Forest University Debate program set out to answer just that question and we have some interesting, albeit preliminary, answers.

After we taught the art of questioning (learn it yourself from Damien Pfister: YouTube Questions: Pitfalls; YouTube Questions: Exemplars; Production Tips; The Art of Questioning), the students created and submitted a great set of questions.

The next step was to devise a simple rubric for assessing questions. We wanted something that did not require young students, many from Europe, to have intimate familiarity with the campaign to date. That's both a feature and a bug -- there are qualities of questions that are independent of their content. What makes a question a good one regardless of whether it's on Iraq, education or health care?

During the debate our students and faculty used a Likert scale to evaluate each question on a 1-5 scale (5 being "strongly agree") in four categories:

*Clarity. The question made sense.
*Interesting. The question made me especially interested to hear the answer.
*Demanding. The question demanded that the candidates explain or justify their answer.
*Audio/visual. The video was visually powerful with good sound quality.
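For readers curious how such a rubric turns into a ranking, here is a minimal sketch (not the study's actual code) of summing each question's four 1-5 ratings into a total score and sorting by it. The question labels and ratings below are purely illustrative placeholders, not the study's data.

```python
# Illustrative sketch of Likert-rubric scoring; categories mirror the
# four used in the post, but all ratings here are made-up examples.
CATEGORIES = ["clarity", "interesting", "demanding", "audio_visual"]

def total_score(ratings):
    """Sum one question's 1-5 ratings across the four rubric categories."""
    assert all(1 <= ratings[c] <= 5 for c in CATEGORIES)
    return sum(ratings[c] for c in CATEGORIES)

# Hypothetical ratings for two example questions
questions = {
    "Question A": {"clarity": 5, "interesting": 5, "demanding": 4, "audio_visual": 4},
    "Question B": {"clarity": 5, "interesting": 2, "demanding": 1, "audio_visual": 3},
}

# Rank questions from highest to lowest total score
ranked = sorted(questions, key=lambda q: total_score(questions[q]), reverse=True)
for q in ranked:
    print(q, total_score(questions[q]))
```

Each question's total thus ranges from 4 to 20, and the ranking simply orders questions by that sum.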

You can read more about the method (including its limitations), but this is, to my knowledge, the first report on judging the "winners and losers" among the real stars of the debate, the questioners.

So what "won"? People love to jump straight to the results, and you can click-and-go to the question ranking page. For the stats-inclined, you can view scatter plots of the individual variables vs. total score, or contact me for the raw data.

But aren't the real winners our students whose YouTube submissions have already had over 10,000 views? And the students who are using what we learned from the first debate to submit questions for the GOP debate in September? Can Mitt Romney really contemplate skipping that debate if he understands this story?

The educational effect of the YouTube debate is of inestimable value. The younger generations are not only viewing the debates in record numbers, they are becoming empowered. A note a student left on my Facebook wall should inspire us all: "I learned how much power a kid and a laptop possessed."

Finally, for the patient readers who have not already clicked off to the full results page, a drum roll and the best and worst questions according to our study (go ahead to the results page to see the embedded video with the precise scores for all 41 questions):

Best: CNN's question 9, "Is it OK to cite religion as a reason to deny gay rights?"
Worst: CNN's question 18, "Who was your favorite teacher?"