How We Make Moral Decisions


Imagine you are hiding from enemy soldiers in a basement, with several other people -- friends, family, neighbours. You can hear the soldiers walking overhead, and any sound will alert them to your presence, leading to everyone's death. In your arms is your infant child, who is about to cry. Do you hold your hand over his mouth, smothering him but saving everyone you are with, or do you let him cry, knowing that doing so will result in the death of not only the baby, but everyone in the basement?

Dilemmas like this, beloved of philosophers and psychologists who work on moral decision-making, may be far-fetched, but they illustrate important discrepancies between our immediate reactions and our logical reasoning, discrepancies we often find difficult, if not impossible, to resolve. While ethics has a long history in philosophy, recent research in psychology has brought a new perspective to issues of moral behaviour and decision-making.

One of the most intriguing models of moral judgment, the Social Intuitionist Model (SIM) proposed by Jonathan Haidt, has its roots in the philosophy of Hume. Hume argued that moral judgments are derived not from reason, but from moral sentiments. In a similar vein, the SIM proposes that emotional intuitions drive the moral judgments we make, while rationales and justifications are generated post hoc. One interesting consequence of this is that unrelated emotions can influence our moral judgments, often without our conscious awareness. For example, feelings of disgust induced by unpleasant smells have been shown to lead to more severe moral judgments. There is some evidence supporting Haidt's theory, but the role of emotions in moral judgments is likely more nuanced.

A slightly different theory suggests that reason plays a bigger role in our moral decisions -- but still not as big as proponents of human rationality would like to believe. According to Joshua Greene's dual process model, two systems influence our moral judgments: an explicit, rational system and an implicit, emotional system. These systems are linked to consequentialist and deontological ethics, respectively. Take the infamous footbridge version of the trolley problem, where we face the choice of whether or not to push one man off a bridge to stop a runaway trolley that would otherwise kill five people in its path. Our immediate, emotional reaction against pushing the man to his death is deontological; it follows the rule that killing, no matter the circumstances, is wrong. The rational, consequentialist choice is to push the one man to save the five.

But not all moral stimuli produce instinctive emotional responses -- Greene distinguishes between personal and impersonal moral dilemmas. The footbridge dilemma is personal, but the original version of the problem, sometimes called the "switch dilemma," is impersonal: Imagine you are deciding whether to flip a switch to divert the trolley from a track on which it will kill five men onto a track where it will kill only one. For many people, flipping the switch seems more acceptable than pushing the man, even though both actions have the same outcome. Making consequentialist decisions in personal moral dilemmas takes longer, and Greene suggests this is because cognitive control has to override our immediate emotional responses, echoing Haidt's idea of a fast, intuitive system and a slower, analytic system.

Splitting ideas on morality into dualities is not an invention of modern psychology, but a tradition that extends far back into the history of philosophy. The dividing lines between deontologists, who focus on the rightness or wrongness of actions themselves, and consequentialists, who focus on the consequences of those actions, go beyond ethics to the still older divide between reason and emotion. Psychologists are now testing the ideas of philosophers, and are beginning to uncover the role of emotions in moral decision-making and the form of their interaction with reason. What Greene's dual process model suggests is that we are all both deontologists and consequentialists, with the moral rules at the heart of deontology expressed as automatic emotional responses. Rather than consistently taking one moral stance, most of us are greatly influenced by contextual factors such as how personal a moral dilemma is, how difficult it is, and even by unrelated emotions.

What's becoming increasingly clear is that it's unproductive to think of morality in terms of strict dualities -- it's common to discuss emotion and intuition interchangeably, but while our intuitive responses are often emotional (we think of intuitions as "feelings"), they may stem from reasoning that, for practical purposes, has been internalized. We have an intuitive response that killing is wrong because, at least in most cases, it is.

Antonio Damasio illustrates this point with evidence from patients with frontal lobe damage, which he uses to support his somatic marker hypothesis: the idea that physiological sensations (like our "gut feelings") and the emotions they evoke can influence our decision-making. According to Damasio, such patients lack the somatic markers that usually warn us when we are about to make a poor decision. In a gambling task, participants with no brain damage showed an increased skin conductance response just before choosing from a "bad" deck with a higher potential for loss, while patients with frontal lobe damage showed no such anticipatory effect and, as a result, selected from these decks more often. Our intuitions and heuristics are not random; they exist because they are generally useful -- that is, precisely because they direct us towards behaviours that are typically reasonable. To return to Greene's dual process model, the evidence suggests that rather than reason overriding emotion, the two are integrated -- both seem to be crucial in determining our moral judgments.

The concept of humans as rational beings whose actions are driven primarily by logic and reason needs to go -- our cognitive resources are more limited than we think, and we take shortcuts through reasoning more often than we know. As research makes us more aware of the processes underlying our moral decision-making, the question we should ask ourselves is not whether we should be deontologists or consequentialists, but how we can become more cognizant of the factors influencing our decisions. Even after hearing the trolley problem countless times, many of us still find it difficult to choose an answer -- and harder still to justify the inconsistent decisions we make in personal versus impersonal dilemmas. We may be closer to understanding how moral judgments are made, but whether there are right or wrong answers, and what those answers are, remains a question outside the realm of science.
