On February 12, 1995, a party of three seasoned backcountry skiers set out for a day on the pristine slopes of Utah's Wasatch Mountain Range. Steve Carruthers, 37 years old, was the most experienced of the group, though they were all skilled skiers and mountaineers. Carruthers had skied these hills many times, and was intimately familiar with the terrain. Their plan was to trek over the divide from Big Cottonwood Canyon to Porter Fork, the next canyon to the north.
Within three hours, Carruthers was dead. As the skiers headed across a shallow, treed expanse, they triggered an avalanche. More than 100 metric tons of snow roared down the mountainside at 50 miles per hour, blanketing the slope and pinning Carruthers against an aspen. The rest of the party heard the avalanche and rushed to the rescue, but by the time they dug Carruthers out, he was unconscious. He never regained consciousness.
This anecdote appears in my recent book, "On Second Thought," and I use it to introduce the reader to the heuristic mind -- fast, automatic and often irrational. This irrationality can be quirky and entertaining, and I offer many examples of this in the book. But all too often -- as with this skiing tragedy -- our quirkiness crosses the line into what can only be called perversion. We make self-destructive decisions when we should know better; we choose options that are (seemingly) designed to sabotage our hopes and end up in failure and unhappiness.
One of the powerful, deep-seated cognitive biases that doomed Carruthers is called the "familiarity heuristic." What this means, simply, is that we all favor the familiar over the strange. Things that are unfamiliar or foreign -- people, places, ideas -- may carry unknown risks, so on a gut level we equate familiarity with safety and well-being. Indeed, the familiarity heuristic is one of the most potent cognitive biases at work in the mind, and much of the time this bias serves us well.
But not all the time, and there's the rub. Stanford University psychological scientist Ab Litt and his colleagues suspected that this powerful bias for what's known might lead us into self-defeating choices in routine matters as well. They suspected furthermore that we are more apt to rely on these automatic judgments when we're under pressure, and that bad choices increase the stress, leading to a cycle of poor decisions. Here's how they tested this idea in the laboratory.
The scientists recruited a large group of men and women from an online pool to work on a difficult word puzzle, with the prospect that they could win some money by doing well. Some of the volunteers were told that they could take as long as they wished to complete the puzzle, while others were told that they only had four minutes to complete the task. In other words, some were working under pressure, others were not. Then all of them got to choose between two puzzles -- one short and one long. The only other information the volunteers got was this: The shorter puzzle had been designed by a stranger, while the longer puzzle was the work of someone familiar to the puzzlers.
Litt and his colleagues had basically created a situation in which all objective evidence argued against choosing the longer task. In fact, it was obvious: If the volunteers wanted to succeed, they should choose the shorter puzzle, regardless of who designed it. Yet many of them didn't. Perversely, the volunteers who were under time pressure were more likely to choose the puzzle associated with a known person, even though it would clearly take more time and thus lead to more stress and, likely, to failure. Those who were working at a leisurely pace made the sensible, and less self-destructive, choice that favored their success.
It should be noted that the "familiarity" under these laboratory conditions was incidental and superficial at best. The volunteers had no real reason to trust one puzzle maker over another, so their preference was completely illogical. Yet when the scientists asked the volunteers about their choices, safety and comfort were precisely the rationales they cited. As reported online in the journal Psychological Science, the volunteers said their puzzle choice felt like the "safer" decision; it was "less risky" and offered a better chance of success. And, perhaps most telling, the self-destructive puzzlers said their decision simply felt right, down in their "gut."
Clearly, these gut feelings are untrustworthy. Not only can familiarity be an unreliable guide to what's beneficial, it can lead to choices that actually exacerbate stress -- increasing the likelihood of more poor judgments and potentially creating a destructive cycle of self-defeating actions. This pattern of decision-making may not lead to tragedy, but over time it can eat away at our happiness.