Below is a conversation between Thomas Homer-Dixon and Ron Dembo. Dr. Dembo is a risk expert and founder and CEO of Zerofootprint, a not-for-profit dedicated to reducing our ecological footprint. Dr. Homer-Dixon is Director of the Trudeau Centre for Peace and Conflict Studies and Professor in the Department of Political Science at the University of Toronto. He is also the author of The Ingenuity Gap, winner of the Governor-General's Award. His most recent book, The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization assesses the concatenated risks facing the planet's civilizations, and urges not a solution to this or that problem, but a whole new way of understanding the systems we've built. The challenges are not political or economic (or not fundamentally so); they are structural. And the only way to address them before it's too late, he urges, is to understand that the repertoire of solutions we usually turn to is not going to work.
This is Part 1 of a two-part serialization.
Ron Dembo: There are a couple of things that really stick out for me. My trouble with reading books like this is that I immediately try to think of solutions. In particular the whole notion of resilience interested me because I think you can actually simulate systems to check how resilient they are. You emphasize the concept of resilience and the need to create resilient societies. We don't put a priority on it. Imagine you are now the prime minister, and you can do what you like. What would you do to Canadian society to make it more resilient?
Thomas Homer-Dixon: One of the problems is that I don't think the conditions are ripe for the kind of paradigm change I'm suggesting in terms of resilience. We had this enormous blackout in 2003, and it didn't get us thinking about resilience; it got us thinking about more of the same -- micromanaging and tinkering with the grid in various ways to make sure it never happens again. But the idea of decentralized energy production, more off-grid production, or the ability of individual communities and households to reduce their dependence upon the grid -- that wasn't raised at all. People unfortunately tend to think in dichotomies about these things. For example, they think we need nothing but connectivity, and the more connectivity the better (an example that comes up in Chapter 5). On the other hand, participants at a solar conference I attended tended to assume we want complete autonomy. In the end, we don't want either. This is a Goldilocks situation: we need the "just right" goal, where you have a certain amount of connectivity but not too much. The problem is that to bring about change, this goal is going to have to be shared across the culture, and I'm not sure it's there yet. So it's not going to be easy for a prime minister to change things all at once.
RD: A question on dependence: If you have something that is ultra connected then does this not imply dependence?
THD: Let's say we have scale-free networks, and we are highly dependent upon a few heavily connected nodes. That's potentially a very vulnerable system, especially to a targeted attack. We don't spend a lot of time mapping our networks to determine where we might be vulnerable. It's just not something that policy makers and technocrats are encouraged to do -- there's no demand from the public for resilience. In a sense this is my starting point: the public needs to realize, or needs to be educated to realize, that there is a need for something we call resilience. It will probably cost something in terms of productivity and efficiency, but ultimately we might be better off because we are buying insurance against shocks. So it would be difficult to just plunk me down in the role of prime minister and say, ok, how do you implement resilience? You need some receptivity on the part of the public, and the public becomes receptive only when circumstances change in some way -- say, crises or shocks that tell us the existing systems are not working very well.
RD: So are you saying that you would need to have a crisis before you could think about this?
THD: Not entirely, because towards the end of the book I argue that we need to improve resilience as much as possible even prior to a crisis, because it will help us manage that crisis. All the way through the book I say that we need to keep breakdown constrained. People jumped to the conclusion, without reading the book, that I say catastrophe is certain. I'm not saying that. I'm saying some kind of discontinuity is very likely and may be inevitable. It doesn't have to be catastrophic, and avoiding a cascading failure may well depend upon building resilience into our systems. So if I were prime minister, I guess I would start that educational process. I would say we are very vulnerable to cascading failure within our highly interlinked systems. The case of electricity, which we mentioned a moment ago, is a little different. This is a public grid, and the government or some larger institution needs to provide it -- what social scientists would call a collective action problem or a social dilemma. It's not going to be provided by individuals.
RD: So how do we get the government to realize the need to distribute electricity generation?
THD: It's kind of a chicken-and-egg problem. But that's part of why I wrote the book: to start to get people thinking about resilience. I'll be hammering away at this issue. And others will too.
RD: Some people refer to this as adaptation.
THD: Yeah, but it's not adaptation.
RD: It is just good risk management.
THD: Right. Like the Roman engineers overbuilding their arches, or the San Francisco engineers who built a parallel water-supply system for firefighting in the event that the mains are broken. What I would like to see and encourage is a leader putting this concept of resilience through redundancy into the mainstream of decision-making about the design of our society and our institutions. What's Canada going to do when a nuclear bomb goes off in New York? We are not talking about that, yet some put the probability of that happening at around 50% in the next ten years.
RD: Well, what about an earthquake in Toronto?
THD: Right, and my argument basically is that earthquakes aren't entirely exogenous. We are producing enormous stress within our ecological and social systems that creates the possibility of what I call social earthquakes; the probability is going up of major system discontinuity and we're acting as if the world is going to be the same in 50 years. Meanwhile, we are almost certain that it will be radically different. It may not be dramatically worse, but it is almost certain to be dramatically different and the changes will be discontinuous and non-linear. If we don't build resilience into our society then change could be catastrophic.
RD: Well, when you get close to the limits of a system, any system, small changes can lead to chaotic results.
THD: And you start to get huge amplitude swings in the system. I want to come back to that point in a moment, but let me stay with this risk management idea, because I had it working in the back of my mind. Remember our earlier conversation -- you mentioned the idea then. I like the old distinction between risk and uncertainty, and I'd be interested to hear what you think of it. Risk is something to which you can apply a probability distribution over known outcomes; uncertainty is something where you don't even know what the outcomes might be, let alone their probabilities, so there's no distribution to apply. Uncertainty is characterized by unknown unknowns -- ignorance of our ignorance.
RD: You are thinking of risk as a statistical measure.
THD: And that distinction between risk and uncertainty is common in the community. I think it's useful because what I try to emphasize is that we're facing a white wall of fog, and you don't know what's around the next corner. In a world where you really don't know what's going to happen, you back off the accelerator pedal. So, that leads to your point: when you start to approach the limits of any system, you get these high-amplitude swings. You want to back away from those limits as much as possible, because by the time you approach those cusps it's getting late in the day for adding resilience to the system.
RD: Imagine that we need to choose between two strategies, and suppose that there are only two possible outcomes, scenarios A and B. Imagine we pick one particular strategy; scenario A occurs and, in hindsight, we find we were wrong. Now imagine scenario B occurs and we were still wrong. In other words, there was no right answer. The question we should have asked when we were choosing a strategy was not "which one is right?" but "which will we regret more?" The strategy with the larger regret is the riskier of the two.
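Dembo's regret criterion can be sketched as a small calculation. The cost numbers below are illustrative placeholders, not figures from the conversation: a strategy's regret in a scenario is its cost minus the best cost achievable in that scenario, and the minimax-regret rule picks the strategy whose worst-case regret is smallest.

```python
# Hypothetical cost table (fractions of GDP) for two strategies
# under two scenarios; the numbers are illustrative only.
costs = {
    "act":        {"A": 0.02, "B": 0.02},   # we pay the premium either way
    "do_nothing": {"A": 0.00, "B": 0.50},   # cheap if right, ruinous if wrong
}
scenarios = ["A", "B"]

# Regret of a strategy in a scenario = its cost minus the best
# achievable cost in that scenario.
best = {s: min(costs[k][s] for k in costs) for s in scenarios}
regret = {k: {s: costs[k][s] - best[s] for s in scenarios} for k in costs}

# Minimax regret: prefer the strategy whose worst-case regret is smallest.
worst = {k: max(regret[k].values()) for k in costs}
choice = min(worst, key=worst.get)
print(choice, worst)  # "act" has the smaller worst-case regret
```

Note that the rule never asks which scenario is more likely; it only compares how badly each choice could turn out in hindsight, which is exactly the "which will we regret more?" question.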
THD: Have you heard of "Pascal's wager"? I can't remember how this was stated by Pascal himself, but he proposed settling the question of religious faith with a strategy like the one you just articulated: I can believe in god or I can withhold faith; what are my regrets if I'm wrong in either of those two choices? The downside of withholding faith from a god who turns out to exist is an eternity of regret.
RD: I'd love to know about that because it's really very simple. Imagine the uncertainty around global warming. Now imagine what we did was we took strong action to curb carbon dioxide emissions. What's the downside? The regret is that we will have spent too much on something that turned out not to be useful. That's not so terrible. We do that with insurance premiums all the time. On the other hand, imagine we don't spend that money and we were wrong. The regret is enormous.
THD: And now you can get even more sophisticated, because economists would do what they call an expected value calculation. If we act and we're wrong, we know with 100% certainty that the cost will be, say, 2% of GDP. If we don't act and we're wrong -- let's say the probability of a non-linear shift in the climate is 10% and the cost is 50% of world GDP -- you multiply the two together and come out with a figure that is far larger.
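The expected-cost arithmetic Homer-Dixon describes, using the figures from the conversation (a certain 2%-of-GDP cost if we act needlessly, versus a 10% chance of a 50%-of-GDP loss if we fail to act):

```python
# Figures from the conversation; treat them as illustrative.
expected_cost_act = 1.00 * 0.02    # certain cost: 2% of world GDP
expected_cost_wait = 0.10 * 0.50   # 10% chance of losing 50% of world GDP

ratio = expected_cost_wait / expected_cost_act
print(expected_cost_act, expected_cost_wait, ratio)  # 0.02 vs 0.05: inaction costs 2.5x more in expectation
```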
RD: It's really simple, and it's what I call the "expected regret." This is a simple argument for resilience. Imagine we assume there is enough resilience in our system, so we don't spend money on resilience, and we're wrong. On the other hand, imagine a scenario in which we do spend money on resilience and we're wrong. One way of being wrong is much worse than the other. The interesting thing about regret is that it's about more than numbers: there is also a psychological factor.
THD: You know, in a practical sense that is something we need to internalize within our societies, we need to be doing that kind of investment all the time, sensible management.
RD: One important issue with regret is what you learn from it. If you lived with a stock market that had never gone down, and thought your kids would live into their nineties, you would never plan for the possibility of the market going down, because you'd never experienced regret. That's maybe why teenagers do crazy things: they haven't experienced regret yet. Once they experience it, they can place a higher value on it. So partly, Canadian society has been a fat cat -- we've had a lot of energy, we live a good life -- and our lack of regret may have blunted our faculties. For example, we haven't taken notice of many of the unintended consequences of our lifestyles. Perhaps the age of regret is around the corner. I wondered, though, why you didn't push your argument even further, because at the end of the book you might have said, ok, we must really focus on one or two things. What do you think we should focus on?
THD: I had the fourfold prescription. First, reduce the tectonic stresses as much as possible. Second, adopt what I call a "prospective mind." Third, build resilience into our systems from the local level to the global level. I give examples: at the global level I talk about changing our international financial architecture so that cascading failures are less likely, and about distributed production of key goods and services -- energy and food -- within economies.
RD: There's a funny thing here -- what you're asking for seems both utterly necessary and nearly impossible.
THD: It comes down to what I think the central issue is, and that's the reason I started my answer the way I did. There is a certain amount you can do to convince people about resilience and start building it into systems, but "just in time" production, for example, is deeply entrenched in our economy now. It's very much a product of ferocious competition between corporations. No one corporation is going to give it up unless they all do, and no one country is going to move away from that kind of production process unless everybody does. So there's a larger challenge in the structure of our economic systems, which goes back to the arguments I make in chapters 8 and 9 about the growth imperative and the way our capital system is structured. I don't see that changing in a fundamental way until there is a crisis of some sort.
RD: I think that is the central issue -- that we are now living in a world where every economy is based on growth. You cannot do well unless you grow. Economic systems are set up that way, that if you don't grow you aren't valuable.
THD: No one country can step out of that process independently without huge consequences.
RD: It's hard to imagine changing that without some enormous earthquake. So if you think about reducing the stresses, well it's not that easy to do.
THD: And we're not making a lot of progress, but that doesn't mean we should stop.
RD: That's something we will do when we start thinking like risk managers and less like economists.
THD: And start thinking about the world as a complex system and less like a machine. That's part of my public policy agenda: to get audiences to think about the world as a complex system.
RD: Part of my agenda in risk management was to get people to think in risk terms. You don't address only the risks you believe will actually come to pass; you manage the risks that could. This I really think is a big issue. We should spend time on it.
THD: It takes us to the core problem of our economic systems.
RD: It also takes us to a conversation about cities. We're moving towards urbanization at a shocking pace. I think for example of the massive flight from the countryside going on in China, and then I think back to Rome, the precarious megacity of its day. Do you think that large cities are inherently unstable?
THD: They certainly are if we can't provide the same enormous outputs of high-quality energy that we are providing right now. So energy transition becomes the Achilles heel of this economic model. Rapid urbanization is dependent on energy availability.
RD: Do you think there is such a thing as an argument against resilience?
THD: No and yes. This is why, if you remember, at the end of the book I fly into Lebanon. I had that little discussion of the reaction in the streets of Beirut when there was an energy crisis going on and there were riots across the city. I think they were very vulnerable to energy shocks. We're going to start to see things pull apart in a similar way if there are energy shocks in the future.
RD: That would be enormously challenging given the way the world has gone the last hundred years towards urbanization. To think of reversing that process today, when you think of the numbers of people involved, it becomes very challenging.
THD: This is the footprint thing. It's not clear that people can move out to the countryside and be sustained. There are points at which we reach the thresholds of our imagination. Part of what I do at the end of the book is talk about open source: I want to encourage a conversation that collectively starts to imagine these possibilities -- not just the scenarios where there are shocks and large numbers of people start to move out of cities, but also what the world might look like if we somehow made the transition to a less urbanized way of life. It helps to know where we are going, even if the transition is messy.
RD: But clearly the magnitude of a shock today is unimaginable.
THD: It's the fact that these shocks are unimaginable that makes them dangerous. We are at the thresholds of our imagination because we get so locked into thinking about the way the world is now, and we assume it will be like that indefinitely.
RD: Ok, let's assume the opposite for a moment. Let's talk about a simultaneous shock to four major cities: London, Tokyo, New York, and Singapore.
THD: This is a problem that needs to be investigated. People have talked about it, but nobody has done the research. The global financial system has grown into a scale-free network, and the big hubs are New York, London, Tokyo, and Singapore. What happens if you take out two of those financial centers? I am thinking New York and London.
RD: Taking them out is not a trivial thing.
THD: Let's say you set off nuclear devices in both simultaneously -- and that's not an impossible scenario. So what is the impact on the global economy? It's a non-trivial research problem, but I don't think it's impractical. I'm currently having a conversation with a major bank about getting hold of their data on financial transactions, information flows, and human capital concentrations, so that you can actually map the international financial system, determine to what extent it's scale-free, and see what happens when you pull out these nodes. I think you can come up with a first-order approximation of what the impact would be. I would say this is actually an emergency research issue. Intelligence agencies, for instance, should be investing significant resources in mapping our scale-free networks, including the financial ones. I read a lot on this topic, and I get no sense that anyone is doing the necessary work. And even where intelligence agencies are doing it, the work is classified. I think this needs some sort of public debate. There should be publicly available research done.
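A first-order version of the hub-removal experiment Homer-Dixon proposes can be simulated in a few lines. The network below is a synthetic preferential-attachment graph, not real financial data; it simply illustrates why scale-free topologies fragment badly when their most-connected nodes are taken out.

```python
import random

random.seed(0)

# Grow a synthetic preferential-attachment ("scale-free") network:
# each new node links to an existing node chosen in proportion to its degree.
edges = [(0, 1)]
targets = [0, 1]                      # node IDs repeated once per unit of degree
for new in range(2, 500):
    old = random.choice(targets)
    edges.append((new, old))
    targets += [new, old]

def largest_component(nodes, edges):
    """Size of the largest connected component among `nodes` (iterative DFS)."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:     # ignore edges touching removed nodes
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for n in adj:
        if n in seen:
            continue
        stack, size = [n], 0
        seen.add(n)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best

nodes = set(range(500))
degree = {n: 0 for n in nodes}
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

# A targeted attack: remove the five most-connected hubs and compare
# the surviving giant component with the intact network.
hubs = sorted(nodes, key=degree.get, reverse=True)[:5]
intact = largest_component(nodes, edges)
attacked = largest_component(nodes - set(hubs), edges)
print(intact, "->", attacked)
```

With real transaction data the nodes would be institutions and the edges weighted flows, but the shape of the question is the same: how much of the connected system survives when the few biggest hubs disappear at once?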
RD: Technology is trusted to handle a lot of the basic operational risk measurement and methodology. But how do you measure the information risk of a bank, and how much capital should you put towards it? If you're HSBC and your headquarters are in London, how much capital do you need to cover the possibility that an office might be taken out? I really think these things are hard to quantify. Think about the cost of SARS to our economy: it was really high, and I don't think you could have forecast it. Two small planes hit two towers in New York, and the casualties were small by Baghdad's standards --
THD: -- and it cost a trillion dollars.
RD: We couldn't have forecast that either. Forecasting doesn't work. Risk management and resilience do.
(Part II of this interview will be posted next week.)