The essence of ultimate decision remains impenetrable to the observer -- often, indeed, to the decider himself. -- John Kennedy (1963)
In my previous post I discussed why crises cannot be predicted and why theories that are not rational expectations cannot be both believed and correct. Rational expectations captures the uncertainty principle in economics: your behavior depends on the theories you believe. In this post I want to elaborate on the relevance of these ideas.
A simple example illustrates the connection between rational expectations and the uncertainty principle. In Conan Doyle's story "The Final Problem," written in 1893, Sherlock Holmes is pursued by his arch-enemy, the genius but evil Professor Moriarty. If Holmes can escape to France, he wins; if Moriarty can catch Holmes first, Moriarty wins. The climactic conclusion of the story finds Holmes on a train bound for Dover, with Moriarty pursuing on another train. The only stop is at Canterbury. If both get off at the same stop, Moriarty catches Holmes and wins; if they get off at different stops, Holmes wins. Despite the supposed brilliance of Holmes and Moriarty, their creator Conan Doyle did not understand rational expectations -- in the story Holmes reasons that Moriarty thinks he is going to Dover, so he gets off at Canterbury while Moriarty continues to Dover and loses the game. But why doesn't the supposedly genius mathematician Moriarty anticipate Holmes's reasoning and get off at Canterbury himself? And why doesn't Holmes anticipate this and stay on to Dover?
Although this logic can be repeated endlessly, there is a rational expectations equilibrium -- it requires that players hold probabilistic rather than certain beliefs. If each believes the other is equally likely to get off at Canterbury or Dover, then each has a 50% chance of winning the game no matter what he does -- and indeed each is willing to flip a coin to decide.
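The 50/50 equilibrium can be checked directly. The sketch below (illustrative only; the payoff encoding and function names are my own, not from the story or any textbook) writes the pursuit as a 2x2 zero-sum game and verifies that against a 50/50 opponent, Moriarty gets the same expected payoff from either stop -- which is exactly why he is willing to flip a coin:

```python
# Holmes-Moriarty as a 2x2 zero-sum game (an illustrative encoding).
# Rows: Moriarty's stop, columns: Holmes's stop; 0 = Canterbury, 1 = Dover.
# Entry = payoff to Moriarty: 1 if he picks the same stop as Holmes
# (and catches him), 0 otherwise. Holmes's payoff is the mirror image.
MORIARTY_PAYOFF = [[1, 0],
                   [0, 1]]

def expected_payoff(p_moriarty, p_holmes):
    """Moriarty's expected payoff when each player randomizes over stops."""
    return sum(p_moriarty[i] * p_holmes[j] * MORIARTY_PAYOFF[i][j]
               for i in range(2) for j in range(2))

mix = [0.5, 0.5]  # the candidate equilibrium strategy for both players

# Against a 50/50 Holmes, each of Moriarty's pure strategies yields 0.5,
# so Moriarty is indifferent -- no deviation improves his chances.
for i, stop in enumerate(["Canterbury", "Dover"]):
    pure = [1.0 if k == i else 0.0 for k in range(2)]
    print(f"Moriarty exits at {stop}: wins with prob {expected_payoff(pure, mix)}")
```

By symmetry the same indifference holds for Holmes, so the pair of 50/50 strategies is a (mixed-strategy) equilibrium: each player's beliefs about the other are correct, and neither can do better given those beliefs.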
I started this essay with a quote from John F. Kennedy: "The essence of ultimate decision remains impenetrable to the observer -- often, indeed, to the decider himself." This inspired a book by the noted political scientist Graham Allison entitled Essence of Decision -- a book highly critical of rational choice theory. That is ironic: it is the essence of rational choice that the ultimate decision indeed remains impenetrable -- perhaps even to the decision maker.
Nobody I know claims that you can't fool some of the people some of the time. If you implement complicated and hard-to-understand policies and regulations, there is little point in waving the banner of "rational expectations" and claiming that everyone will immediately respond in the best possible way. So even if we accept that people rationally adjust their behavior to account for new understandings of reality rather than repeating the same mistakes over and over again, we may wonder how long the adjustment takes. In laboratory studies a situation may need to be repeated 10, 50, or even 500 times before a rational expectations equilibrium is reached. So how long does it take in the vastly more complicated real world? Years? Decades? In practice the adjustment can be astoundingly fast. A dramatic example took place on September 11, 2001.
In the decade from 1988 to 1997 there were approximately 18 hijackings of commercial aircraft per year. The vast majority ended peacefully, and there was strong evidence that the longer a hijacking persisted the better the chance of a peaceful ending. Consequently -- and rationally -- pilots and flight attendants were trained in the FAA-approved "Common Strategy." This dictated that hijackers' demands be complied with, that the plane be landed safely as soon as possible, and that security forces be allowed to handle the situation. Passengers were advised to sit quietly, and they followed that advice. Flight personnel were taught not to endanger the passengers by playing "hero." This "Common Strategy" was well-established, rational, successful, and strongly validated by decades of experience.
Reality changed abruptly on September 11, 2001 when hijackers, rather than landing planes and making demands, used the hijacked aircraft for suicide attacks on ground targets. The rational response was no longer the passive "Common Strategy" but rather to resist at any cost. Indeed since September 11, 2001 passengers and flight crews -- who rarely resisted prior to that time -- have equally rarely failed to resist. A quick search on Google for "passengers subdue hijackers" turns up dozens of hits in the last year alone.
How long did it take to overturn the long-established and successful "Common Strategy"? The timeline is instructive. At 8:42 a.m. on September 11, 2001, United Airlines Flight 93 took off. The first evidence of a regime change came four minutes later, when American Airlines Flight 11 crashed into the North Tower of the World Trade Center. Seventeen minutes later, United Flight 175 crashed into the South Tower of the World Trade Center. Twenty-five minutes after that, at 9:28 a.m., United Airlines Flight 93 was hijacked. Nine minutes more, and American Airlines Flight 77 crashed into the west side of the Pentagon. It took only another 20 minutes for passengers and flight crews to adjust their behavior: at 9:57 a.m. the passengers and crew of United Airlines Flight 93 assaulted their hijackers. It took an hour and 11 minutes from the first evidence of a regime change until the rational response was determined and implemented. It happened on a plane already in the air, based on limited information obtained through a few telephone calls. Yet the response was no minor adjustment. It was dangerous and dramatic. The passengers and crew of Flight 93 risked -- and sacrificed -- their lives.
In our day to day life and in familiar situations it is scarcely surprising that we act rationally -- and behavioral economists and psychologists do not generally argue otherwise. They instead point at dramatic events -- stock market crashes and traders selling in panic for instance -- as evidence of the importance of irrationality. Yet the evidence of Flight 93 on September 11 shows the opposite: a careful, deliberate and supremely rational -- yet extremely rapid -- decision was taken in the face of an unfamiliar and totally unexpected situation. And the evidence of crashes and panics misses the point: if the stock market is crashing, it is completely rational to flee for the exits.
It is true that complicated policies and regulations are not going to be understood right away. Indeed, it seems characteristic of regulations and regulatory agencies that they work better at the beginning, before people have a chance to figure out the loopholes and before firms have a chance to engage in "regulatory capture" by getting the regulatory agencies to work for them. It may be that our best alternative is to implement policies that fool people for a while. But if so, let us do it with the understanding that they are doomed to fail. The only robust policies and institutions -- the ones we may hope will withstand the test of time -- are those based on rational expectations: those that, once understood, will continue to function.
If there is something that ordinary economists can agree with behavioral economists about, it is this op-ed, written in 2010 in the New York Times by two behavioral economists:
As policymakers use it to devise programs, it is becoming clear that behavioral economics is being asked to solve problems it wasn't meant to address. Indeed, it seems in some cases that behavioral economics is being used as a political expedient, allowing policymakers to avoid painful but more effective solutions rooted in traditional economics.