We all tell little fibs now and then to help us get through awkward social situations or stressful workweeks. But the more lies you tell, the easier it becomes to tell them. And although the fibs may start off small, don’t be surprised if you find yourself easily telling big whoppers.
A new study claims to provide the first empirical evidence showing that dishonesty gradually increases over time. By using scans that measured the brain’s response to lying, researchers saw that each new lie resulted in smaller and smaller neurological reactions ― especially in the amygdala, which is the brain’s emotional core.
In effect, each new fib appeared to desensitize the brain, making it easier and easier to tell more lies.
“We need to be careful of small lies, because even though they may be seemingly small, they can escalate,” said Neil Garrett, first author of the study. “It may be beneficial to perhaps nudge people away from even small acts of dishonesty.”
Take the recent scandal at Wells Fargo bank, where thousands of employees opened accounts without their customers’ knowledge. The behavior was initially driven by an aggressive sales-incentive plan, and management’s poor decision to enforce the quotas rather than push back on them eventually led to the dismissal of more than 5,000 staffers.
Or consider the Bernie Madoff scam, a long-running Ponzi scheme that prosecutors say started in the 1970s. The compounding lies required both to sustain the fraud and to keep recruiting new clients to fill Madoff’s coffers are an example of what Garrett described as “minor dishonest decisions” snowballing into big lies over time.
Understanding why some people seem at ease with lying, even over long periods of time, may one day help explain, and perhaps prevent, large-scale fraud from wreaking havoc on society at large.
Would you lie to earn more money?
To see how escalating lies affect the brain, Garrett invited 80 participants to a lab in London, where each was paired with someone who they thought was another participant. In fact, the second participant was an actor helping facilitate the experiment. The pairs then played a game in which they guessed the amount of money in photos depicting jars of pennies.
However, the 80 study participants were incentivized differently according to which scenario they were privately assigned beforehand. Some were incentivized to be accurate and honest with their lab partner, while others were paid more if they lied to their partner to influence the guesses.
The researchers made two important observations. The first: The lies participants told grew larger and larger over the course of the hour-long game, but only when the participants lied for their own benefit. This finding means that in a constant environment with the same incentives, repeated lies escalate over time if doing so benefits the person who is lying.
The second: The parts of the brain most strongly linked to emotions lit up at the first few lies, but over time and with more lies, these regions responded less and less to the dishonesty. This finding suggests that the parts of the brain that regulate our emotions become desensitized to repeated lies over time.
The difference between lies in a lab and real world dishonesty
Of course, a guessing game of coins for low stakes doesn’t even begin to encompass all of the real-world factors that might play into, say, a Madoff employee’s decision to continue perpetuating a financial scam.
A person’s decision to lie in the real world factors in other concerns, such as the behavior or dishonesty of colleagues and the likelihood of being caught.
“In terms of how it translates to the real world, I think there’s still some important questions we need to try to get to the bottom of,” said Garrett, now a postdoctoral researcher at the Princeton Neuroscience Institute. “For example, does lying escalate over the long term, over several years? Does it escalate even if you alter the context by which people tell lies?”
What this small, controlled experiment does demonstrate, however, is that people’s lies can escalate simply because they have repeated opportunities to act dishonestly.
“What our results may suggest is that if someone is repeatedly engaging in dishonest behavior, it’s likely that the person has emotionally adapted to their own lie and lacks the negative emotional response that would usually curb it,” Garrett said.
“If someone is repeatedly engaging in dishonest behavior, it’s likely that the person has emotionally adapted to their own lie.”
His findings align with past research showing that students who took mild beta blockers, a medication that reduces the effect of stress hormones and lowers a person’s physical reactions to fear and other negative stimuli, were twice as likely to cheat on a test as students who took a placebo. The weaker the body’s physical reaction to the dishonesty ― in that experiment, dampened with a pill ― the more likely the students were to cheat.
How to prevent yourself from telling escalating lies
So what can we do to increase honesty in ourselves and others? Behavioral economist Dan Ariely, a co-author on Garrett’s paper, has a few research-backed ways to encourage truthfulness: avoid conflicts of interest, say no to “fuzzy” rules that are open to interpretation (perhaps by signing an ethics agreement before the opportunity to cheat) and remind yourself of your values.
In one prior experiment, Ariely had participants write down as many of the Ten Commandments as they could remember before being given the opportunity to cheat, which resulted in significantly higher levels of honesty in that group compared with the control group.
And perhaps, in the case of Garrett’s coin experiment, one more suggestion: Don’t put yourself in a profit-making situation where dishonesty is baked into the incentives (or, in the case of Wells Fargo, don’t put other people in a position where they may have to choose between honesty and keeping their jobs). You may think you’re strong enough to withstand the temptation, but you’re actually setting yourself up for failure.