As human beings, we generally like to conceive of ourselves as rational creatures. We think logically, make decisions based on the best interests of ourselves and others, and do the things we need to do in order to not just survive, but also thrive in the world.
But we're often unaware of the myriad little -- and big -- ways that our thinking is irrational and biased. To become the highly evolved, rational creatures we are today, our brains developed certain handy shortcuts (known in psychology as cognitive biases) to help us identify threats and make quick judgments. And even in the modern world, where we don't face threats to our survival every day, they're still very much present, and they shape the way we experience the world and ourselves.
"Humans suffer... the consequences of living in a time and place we didn't evolve to live in," neuroscientist Dean Buonomano, author of Brain Bugs: How The Brain's Flaws Shape Our Lives, told NPR. "And by peering into the brain, we can learn a lot about why we are good at some things and why we are not very good at others."
Here are eight common thinking errors and cognitive biases that you may not even be aware of -- but that shape the way you view yourself and the world.
We can't help but focus on the negative.
According to psychologist Rick Hanson, author of Hardwiring Happiness, our brains are wired to scout for the bad stuff -- as he puts it, the brain is like Velcro for negative experiences and Teflon for positive ones. The brain is constantly scanning for threats -- which of course was in our favor as we evolved -- and when it finds one, it isolates and fixates on the threat, sometimes losing sight of the big picture. And even though we no longer deal with the threat of being eaten by wild animals in our daily lives, our brain hasn't let go of its sensitivity to perceived threats, even if they come in the form of an email from your boss.
This threat-awareness creates a "negativity bias," which causes the brain to react far more intensely to bad news than to good news. Because negative experiences affect us so much more powerfully, research has even shown that strong, long-lasting relationships require a five-to-one ratio of positive to negative interactions in order to thrive.
"We've got this negativity bias that's a kind of bug in the stone-age brain in the 21st century," Hanson told The Huffington Post last year. "It makes it hard for us to learn from our positive experiences, even though learning from your positive experiences is the primary way to grow inner strength."
We see patterns where there are none.
One of the most basic thinking mistakes is called a Type 1 error: believing a false hypothesis to be true, often by mistaking correlation for causation. (This is one explanation for why we love coincidences so much.) While it does lead to thought errors, thinking this way may have given us an evolutionary advantage.
"Causal thinking evolved because it allows people to understand and control their environment, i.e. to be able to predict that, for example, if you eat a red mushroom you will die," writes an Oxford University Press psychology textbook. "This causal thinking is adaptive but may sometimes lead to Type 1 errors –- where you believe something is true when it isn’t, for example you believe that tying your shoes laces twice causes luck."
This tendency to seek out connections and patterns in random information is what's known as apophenia. This inclination plays out in a number of different ways, from spotting coincidences to conspiracy theories to finding hidden codes or significance in numbers or text.
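The statistical version of this mistake is easy to demonstrate. Here's a minimal Python sketch (the sample sizes, trial counts and threshold are illustrative, not from the article): generate pairs of purely random, unrelated data streams and count how often they look "significantly" correlated. At the conventional 5 percent significance level, roughly one in twenty pure-noise comparisons will appear to show a pattern -- a Type 1 error by construction.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Sample correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
trials, n = 1000, 20
threshold = 0.44  # |r| above this is "significant" at roughly p < .05 for n = 20

false_patterns = 0
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [random.gauss(0, 1) for _ in range(n)]  # generated independently of xs
    if abs(pearson_r(xs, ys)) > threshold:
        false_patterns += 1  # a "pattern" detected in pure noise

print(f"{false_patterns} of {trials} pure-noise pairs looked correlated")
```

Run it and somewhere around 5 percent of the comparisons come out "correlated" even though, by design, no relationship exists -- the same trap as linking the red mushroom (or the double-tied shoelaces) to what happened next.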
And yet we don't see what's right in front of us.
Think you're present and mindful of your environment? While it may be true to a certain extent, you're probably not as aware as you think. In a now-famous 1998 study, researchers from Harvard and Kent State University targeted college campus pedestrians to see how much they noticed about their immediate surroundings. In the experiment, an actor approached one of the pedestrians and asked for directions, and while the pedestrian was giving the directions, two men carrying a large wooden door walked between the actor and the pedestrian, completely blocking their view of one another for several seconds. During that time, the actor was replaced by an actor of different height and build, complete with a different outfit, haircut and voice. Roughly half of the participants did not notice the substitution.
The experiment illustrates the phenomenon of "change blindness," which shows just how selective we are about what we take in from a visual scene. It seems that we rely on memory and pattern-recognition (going back to Type 1 Error) significantly more than we think we do, and that our visual perception may not be as reliable as we think.
We're heavily biased towards things that agree with us.
Our brains have quite a distaste for conflict and disagreement -- and they'll go to great lengths to avoid it. For this reason, we naturally gravitate towards things that we agree with or that reinforce our existing beliefs, and avoid those that oppose any of our beliefs.
Cognitive dissonance -- a psychological term coined in the 1950s to refer to this innate distaste -- underlies the brain's confirmation bias: a tendency to search only for information that confirms our hypothesis, while ignoring information that refutes or challenges it. This is often why we have such a hard time changing our minds about things -- it's mentally taxing and confusing to let go of what we think we know and start collecting evidence for a new hypothesis. But this bias can lead us into error in work, life and politics.
"Paradoxically, the Internet has only made this tendency even worse," the blog io9 notes. And it's true: Whatever your political or religious beliefs (or your stance on anything, really) it's easy to find the information that tells you you're right -- and to simply tune out the rest.
We put ourselves under a harsh spotlight.
Did you ever have something mortifying happen to you in high school, after which your mother advised you to stop panicking, because "people don't notice the little things you do wrong because they're too busy worrying about themselves"? Turns out, she was onto something. We do tend to magnify our mistakes and flaws, thinking that people are paying more attention to them than they really are. This is referred to in psychology as the "spotlight effect" -- our tendency to think that other people notice things about us more than they actually do, a phenomenon that's been demonstrated time and again in social psychology experiments. The effect is basically the result of our naturally egocentric worldview, explains psychologist Nathan Heflick.
"We all are the center of our own universes," Heflick wrote in Psychology Today. "This is not to say we are arrogant, or value ourselves more than others, but rather, that our entire existence is from our own experiences and perspective.... But other people not only lack the knowledge of, for instance, the stain that you have, but they are the center of their own universes too, and in turn, are focused on other things."
Our choices are highly subject to a number of biases.
In American consumer culture, we're faced with a feast of choices for even the most mundane decisions -- we can choose from 35 types of toothpaste at the drugstore, pick a shirt from the 50 hanging in our closet, select a movie to watch from the hundreds available on Netflix, and the options for what to tweet or share on Facebook are practically infinite. And despite the illusion of freedom, all of these choices may be skewing our decisions and leading our minds into error.
Having too many options creates a sort of paralysis, according to psychologist Barry Schwartz. Sometimes the sheer abundance of choice keeps us from making any decision at all. And when we do settle on something, we're more likely to regret or be disappointed by it.
"It's easy to imagine that you could have made a decision that would have been better," Schwartz said in a popular TED talk. "This regrets subtracts from the satisfaction that you would have gotten out of the decision you made, even if it was a good decision. The more options there are, the easier it is to regret anything at all that is disappointing."
And what's more, Schwartz explains, the way we measure the value of things is by comparing them to other things. And when there are lots of things to compare something to, we tend to imagine the attractive features of those other things, diminishing the perceived value of the thing we have. It's a sort of grass-is-always-greener syndrome that keeps us from viewing our choices objectively.
We can't trust our memories.
Most of us would like to think that we recall past events with accuracy -- but we don't need psychologists to tell us that in reality, our memory is highly fallible and subject to a laundry list of biases and errors. Eyewitness testimony is notoriously unreliable, as extensive research has found. One study even demonstrated that 25 percent of people could be induced to remember events that never actually happened to them.
One common error is allowing our view of the past to be colored by our emotions in the present. Just broke up with your boyfriend? Your entire relationship history may start to look pretty grim. Just got a promotion? That grueling, soul-sucking first job suddenly looks like a valuable stepping-stone to bigger and better things. As the band Oasis said, "Don't look back in anger" -- because your anger, or any other emotion you're experiencing, will change the way you think about the past.
As Buonomano explained to NPR:
"One type of memory error that we make -- a memory bug -- is really a product of the fact that in human memory, there's no distinction between storage and retrieval. So when a computer writes something down, it has one laser that's used to store the memory and another laser to retrieve the memory, and those are very distinct processes. In human memory, the distinction between storage and retrieval is not very clear, and this can have very dramatic consequences. ... The act of retrieving a memory can affect the storage."
We're (too) partial to our own kind.
Both historical events and everyday experience demonstrate, again and again, our favoritism towards members of our own social groups. Human beings have a well-documented cognitive bias towards members of their own clans (real or imagined), and this even goes beyond ethnic, social or national groupings. Psychologists have found in-group bias even among randomly assigned groups. Favoring our own doesn't necessarily lead anywhere sinister, but it can foster judging, stereotyping and hostility towards other groups.