By Juan Enriquez, The European Magazine
What can a chronic traffic jam around Washington, D.C., tell us about the future of biotechnology and the hidden costs of excessive caution?
A few decades ago, as strange as it may seem today, cars on the beltway that surrounds Washington, D.C., still used to move. That does not mean there weren't occasional traffic jams and backups. But every morning, in the same place at the same time, traffic slowed, with no obvious obstacle or problem. It drove traffic engineers crazy, until they finally figured out that one particular person had a habit of commuting in the far-left lane at exactly the speed limit.
When this behavior was reported, a flurry of furious letters followed. Then the perpetrator, John Nestor, wrote back stating that the speed limit was posted, he followed it exactly, and it was someone else's problem if they wished to speed. The traffic backups continued.
He Never Approved a Single Drug
The outraged citizens of D.C. dubbed this behavior "Nestoring" -- that is, following the exact letter of the law while slowing everyone else down to a crawl. All this would have been a funny anecdote had Mr. Nestor's job not been to approve critical drugs at the U.S. Food and Drug Administration (FDA).
For years, for decades, Mr. Nestor protected the public interest by following the exact letter of the law. Drug applications stacked up, approvals were postponed, new trials were ordered. Eventually even the most timid and cautious of bureaucrats realized that Nestor was generating an enormous backlog. Nestor was fired, only to be reinstated after public interest groups sued, showing that Nestor was simply and literally following the letter of the law.
Nestor's behavior never wavered. He had to be sure that a drug was safe before it was approved, and he could find no perfectly safe drugs. He eventually retired, and Ralph Nader's group praised him for an unblemished record of protecting the public. Apparently he never approved a single drug.
Absent from the whole debate was the cost of not acting. If a drug that could save thousands of lives per year is delayed for years, should we account for the lives lost in the meantime? Can ethical concerns spark new moral problems? Is it possible that being cautious and risk-averse is just as dangerous as, or more dangerous than, the most avaricious corporate behavior? The truth is that we don't know. We do not systematically measure the cost of not acting, of acting really slowly, of putting so many obstacles in place that nothing gets done.
Because of Fear
We are quick to discuss the negative and unintended consequences of new medical technologies, but could it be that we are killing more people by increasing the cost and time to market of drugs than we are saving through careful, meticulous oversight?
One frustrated former head of the FDA stated:
In all of FDA's history, I am unable to find a single instance where a Congressional committee investigated the failure of FDA to approve a new drug. But, the times when hearings have been held to criticize our approval of new drugs have been so frequent that we aren't able to count them. ... The message to FDA staff could not be clearer.
Broad-based Nestoring can look like careful regulation and protection, but it can devastate research applications and startups. While each defensive regulatory decision may find some justification, when you add them up, the inflation-adjusted cost of bringing a medicine to market has increased roughly a hundredfold over the last 60 years. Far fewer medicines come to market. The needs of the poor are ignored. And the costs of medicine crowd out other spending. At what specific crossover point are more people killed or hurt by excessive costs and regulation than are saved by careful oversight? We do not know, and that is precisely medicine's missing measure.
Why are we so cautious? Because of fear: fear of failure, of change, of genetic modification, of nuclear power, of robots, of nanotech. These and a host of other controversial subjects have to be examined in the context of what will happen if current trends continue, if growing problems and challenges are not solved, if growth does not reduce deficits and produce jobs. Yes, there can be misuse and abuse. Yes, we will make mistakes. But perhaps the worst abuses are committed by those espousing the precautionary principle: "Prove this will never harm anyone, and I'll give you permission." Had such folks been in charge a few decades ago, we would never have had cars, stairs, electricity or hammers.
The Cost of Being Fearful
This debate over the potential, as well as the pitfalls, of technology is particularly relevant in a period of extreme, rapid change, one in which we will double the amount of data generated by all humans, across all time, in the next five years. And it is not just raw data; thousands of experiments, in hundreds of fields, are gradually changing who and what a Homo sapiens is. We are starting to take control, sometimes deliberately, of the evolution of ourselves and of many other species.
Our kids, and maybe even ourselves, will have to deal with difficult and interesting choices and opportunities on life span, organ regeneration, cloning, tissue preservation, when and how we have kids, etc. As we learn how to rebuild many of our body parts, and eventually, maybe, how to store and download memories, the ethical, moral, business and governmental tradeoffs required will become mind-blowingly complex. Simple yes-or-no answers to questions of ethics and risk may end up derailing progress and competitiveness in whole societies and countries. We are entering a time of extraordinary change, one in which the meek and the risk-averse can do enormous damage while "protecting" us. It is essential that we understand who is doing what and why, and what the end consequences of not acting might be.