So Madonna joked about a "Black Muslim in the White House." It was obvious to most people that she meant it as a joke, but chances are it reinforced belief in the Obama-Muslim myth in some listeners, or even planted a new seed of misinformation in the minds of people who had never thought about Obama's faith at all.
In a recent review paper in Psychological Science in the Public Interest, we follow the trails of misinformation: where it originates, how it is spread, how it is processed, how it affects our cognition, and how its effects can be alleviated.
Misinformation comes in many guises. It can come from jokes, from the grapevine, or from works of fiction. (If you wonder whether people really extract information from fiction, consider that the novelist Michael Crichton was invited to testify before a U.S. Senate committee as a climate "expert.")
The evening news may report something believed to be accurate at the time, but overnight further investigation may reveal new evidence. What is genuinely believed to be a clandestine lab producing biological weapons of mass destruction one day may turn out to be a legitimate commercial laboratory the next.
And then of course there is intentional fabrication and propaganda.
Misinformation is also spread for a variety of reasons. People simply prefer passing on information that is likely to evoke an emotional response in the recipient -- whether or not it is true is not always the top criterion (how many lives does Charlie Sheen actually have? According to Facebook, he keeps dying!).
Sometimes misinformation is spread deliberately: Claims that Obama was born outside the US, or that there is no evidence that humans are causing climate change, have a clear aim and purpose.
Unfortunately, the media often contribute to the spread of unsubstantiated myths through a focus on "balanced" coverage. Alas, the "two sides of a story" don't always deserve equal space, because the evidence is rarely balanced evenly between them.
So why is that a problem? Surely, people can tease apart the truths from the falsehoods, right? Unfortunately, no.
Our research has shown that people continue to rely on misinformation even when there are clear retractions, with retracted misinformation affecting people's memory, inferential reasoning, and decision making. For example, even when people know that health concerns associated with some treatment have been thoroughly debunked, they will hesitate to get that treatment.
But maybe better access to more and more information will eventually solve the problem? Probably not: I was recently asked whether I found it strange that in a time with unprecedented access to credible information, there were so many "truthers," "birthers," and science deniers. Apart from the fact that a minority group can be very vocal without actually being very large, I don't find that strange for a number of reasons.
First, the amount of misinformation available grows proportionally with the availability of valid information. In fact, it may grow even faster because of the lack of fact-checking in much of the new media.
Second, the now-common idea that we can "check the facts ourselves" is often an illusion. Being able to "look things up on the net" can give people the impression that they understand something when they are in fact overlooking important domain-specific details, or trusting the wrong sources. Ultimately, this erodes trust in true experts.
Third, it's easy to get bogged down in a misinformation "echo chamber." The same misinformation can appear on many linked websites, which may create the impression of corroborating evidence from multiple independent sources when the sources are not independent at all.
Fourth, the sheer volume of available information means that it is impossible to critically evaluate every piece of information we encounter. Sometimes we simply have to use "heuristics," or rules of thumb: we believe what fits with what we already know, or what others believe, and so on. Being a skeptic in the true sense of the word -- critically assessing evidence and questioning people's motives, not to be confused with denial! -- requires time and effort, and we often lack one or both.
Usually, these heuristics are benign: they save cognitive effort while we try to maintain a coherent and accurate view of the world. However, when people's beliefs are very strong, those beliefs will bias information processing and lead to what is known as "motivated reasoning" -- people with strong beliefs and motivations preferentially attend to, and interpret, information in ways that support their beliefs.
Motivated reasoning is a major obstacle for rational argument. If someone wants to believe that Obama is a Muslim, or if someone wants to believe that virtually all of the world's climate scientists have conspired to make up a huge global "climate-change hoax," then it is very difficult to change their minds even when the actual evidence is very, very clear.
Misinformation is a problem for any citizen who wants to form opinions and make decisions based on facts. If we want evidence-based practice and policy in a democratic society, then science communication, journalism, and education will have to take on the challenges associated with misinformation. Some guidelines on how best to do that can be found here.