Depending on which news source you read, House Republicans' final report on their Benghazi investigation, released Tuesday, either uncovered no new wrongdoing by Hillary Clinton and the State Department, or slammed both for inaction and lax security prior to the 2012 terrorist attacks in Libya.
As The Washington Post rightly notes, news outlets reported the facts of the report similarly but framed them in very different ways:
One possible explanation for the headline discrepancy is confirmation bias, a term from psychology and the social sciences for the tendency to interpret new information so that it conforms with what you already believe to be true.
Believe that Benghazi involves an Obama administration coverup? Well, then you might choose to read headlines that depict the report as a damning indictment of Clinton and the State Department.
If, on the other hand, you think Republicans have been picking on Clinton over the last two years for partisan reasons, or because she's a woman, you might click on a headline like the one that The Washington Post ran on its website: "Republicans’ Benghazi goose chase comes up empty."
In other words, your past assumptions and knowledge color the way you see the world, from the way you interpret news stories to the stories you choose to consume in the first place.
Does this hit a little too close to home? It should, because we all do it.
According to a 2009 paper published by the American Psychological Association, a meta-analysis of 91 studies covering nearly 8,000 participants found that people are almost twice as likely to seek out information that confirms their existing beliefs as they are to investigate information that would challenge those beliefs.
"We're all mentally lazy," Scott Lilienfeld, a psychology professor at Emory University, told the Wall Street Journal at the time. "It's simply easier to focus our attention on data that supports our hypothesis, rather than to seek out evidence that might disprove it."
It's not just a problem among laypeople, either. Despite designing double-blind studies and having a reputation for objectivity, scientists are as susceptible to confirmation bias as the rest of us.
“There is clear evidence in the literature that people tend to look for the errors in their analysis only when they get a surprising result or effect,” Saul Perlmutter, a Berkeley professor who won the 2011 Nobel Prize in Physics, told Berkeley News last year.
“This leads to people re-examining their analyses, and since there are often alternative approaches and/or subtle hidden bugs, the final conclusions typically end up more in line with previous results.”
How to fight back against your own confirmation bias
While it may be difficult, try turning your skepticism on yourself for once. Examine your opinions and accept that they could be colored by your worldview.
Next, consider your critics' perspectives, especially the perspectives of individuals whose views don't match your own. Engage with those people. Yes, it will be hard.
And when it comes to consuming news, especially polarizing articles about issues you think you already understand, like the Benghazi investigation, it doesn't hurt to pick up a newspaper you wouldn't normally read. And keep the eye-rolling to a minimum.