
In my last post, I argued that common sense is vastly overrated as a tool for making sound judgments, and that we need to engage in "reasoned sense" that combines extensive direct experience with critical thinking. Taking steps that include the informal use of the scientific method can help us make better decisions.

However, as recent research has demonstrated, even scientists who adhere to the scientific method can't guarantee they will draw the best possible conclusions. When I read this research, my first thought was, "How could such highly educated and precisely trained professionals veer off the path of objectivity?" The answer is simple: They, like all of us, possess one quality from which it is impossible to divorce themselves. That quality? Being human.

As the fields of psychology and behavioral economics have demonstrated, Homo sapiens is a seemingly irrational species that appears to, more often than not, think and behave in nonsensical rather than commonsensical ways. The reason is that we fall victim to a veritable laundry list of cognitive biases that cause us to engage in distorted, imprecise and incomplete thinking, which, not surprisingly, results in "perceptual distortion, inaccurate judgment or illogical interpretation" (thanks Wikipedia), and, by extension, poor and sometimes catastrophic decisions.

Well-known examples of the results of cognitive biases include the Internet, housing and financial crises of the past decade; the truly stupid use of social media by politicians, celebrities and professional athletes; the existence of the $2.5 billion self-help industry; and, well, believing that a change in the controlling party in Washington will somehow change its toxic political culture.

What is interesting is that many of these cognitive biases must have had, at some point in our evolution, adaptive value. These distortions helped us to process information more quickly (e.g., stalking prey in the jungle), meet our most basic needs (e.g., help us find mates) and connect with others (e.g., be a part of a "tribe").

The biases that helped us survive in primitive times, when life was much simpler (e.g., life goal: live through the day) and the speed of a decision rightfully trumped its absolute accuracy, don't appear to be quite as adaptive in today's much more complex world. Given the complicated nature of life these days, correctness of information, thoroughness of processing, precision of interpretation and soundness of judgment are, in most situations, far more important than the simplest and fastest route to a judgment.

Unfortunately, there is no magic pill that will inoculate us from these cognitive biases. But we can reduce their power over us by understanding these distortions, looking for them in our own thinking and making an effort to counter their influence over us as we draw conclusions, make choices and come to decisions. In other words, just knowing and considering these universal biases (in truth, what most people call common sense is actually common bias) will make us less likely to fall victim to them.

Here are some of the most widespread cognitive biases that contaminate our ability to use common sense:

  • The bandwagon effect (aka herd mentality) describes the tendency to think or act in certain ways because other people do. Examples include the popularity of Apple products, the use of "in-group" slang and clothing styles, and watching "The Real Housewives of ..." reality-TV franchise.

  • The confirmation bias involves the inclination to seek out information that supports our own preconceived notions. The reality is that most people don't like to be wrong, so they surround themselves with people and information that confirm their beliefs. The most obvious example these days is the tendency to follow news outlets that reinforce our political beliefs.
  • Illusion of control is the propensity to believe that we have more control over a situation than we actually do. If we don't actually have control, we fool ourselves into thinking we do. Examples include rally caps in sports and "lucky" items.
  • The Semmelweis reflex (just had to include this one because of its name) is the predisposition to deny new information that challenges our established views. Sort of the yang to the yin of the confirmation bias, it exemplifies the adage "if the facts don't fit the theory, throw out the facts." An example is the "Seinfeld" episode in which George Costanza's girlfriend simply refuses to allow him to break up with her.
  • The causation bias describes the tendency to assume a cause-and-effect relationship where none exists (or where there is merely a correlation or association). An example is believing someone is angry with you because they haven't responded to your email when, more likely, they are busy and just haven't gotten to it yet.
  • The overconfidence effect involves unwarranted confidence in one's own knowledge. Examples include political and sports prognosticators.
  • The false consensus effect is the penchant to believe that others agree with you more than they actually do. An example is the guy who assumes that all guys like sexist humor.
  • Finally, the granddaddy of all cognitive biases, the fundamental attribution error, involves the tendency to attribute other people's behavior to their personalities and our own behavior to the situation. An example: when someone treats you poorly, you probably assume they are a jerk, but when you're not nice to someone, it's because you're having a bad day.
I could go on and on (for an exhaustive list of cognitive biases, do a search on Wikipedia), but you get the point. If you look at your own thinking, you'll likely find yourself at the mercy of these distortions -- though I may just be suffering from the "false consensus" effect. But I really am sure that we fall for cognitive biases all of the time (I may be guilty of the "overconfidence" effect). In any event, all the research I read supports this post's claims (uh-oh, I think I just fell for the "confirmation" bias).

Note to self: Need to continue to work on resisting cognitive biases.
