Exhortations to learn from history are nothing new. But how to make sense of that which is wholly novel, for which history books fail to offer a tidy playbook?
It was only after the fact that Bertrand Russell and Albert Einstein published their 1955 manifesto urging world leaders to consider the unintended consequences of their century's game-changer: nuclear weapons.
Sir Joseph Rotblat, the youngest signatory of the Russell-Einstein statement and the only physicist to actively step away from the Manhattan Project, would eventually be awarded the 1995 Nobel Peace Prize for convening the Pugwash Conferences on Science and World Affairs. By engaging scientists, engineers, policymakers, and other experts to proactively consider challenges created at the nexus of science, technology, and society, these leaders set a precedent of responsibility for generations.
So why then do today's headlines, headaches, and heartaches smack of these very questions? This week, security experts, CEOs, and pundits alike are wrestling with unlocking San Bernardino shooter Syed Farook's iPhone, but what does next week, next month, or next year hold? Unfortunately, answering the government's call to Apple today does little to address society's challenges tomorrow.
From Uber drivers, connected by thumbs, organizing despite never sharing a traditional workplace, to Airbnb ruffling traditional hoteliers' feathers, apps facilitating an increasingly decentralized sharing economy eat into traditional firms' market share while introducing questions of zoning and labor policy. Netflix and Amazon are transforming the ways we watch and shop. Meanwhile, CRISPR-Cas9 technology makes it possible to unzip and replace DNA, driverless cars pose questions of liability, and the very gadgets that allow us to connect and collaborate challenge the traditional brick-and-mortar workplace.
As a proud product of Carnegie Mellon, I am far from a Luddite; rather, I welcome technological disruption and even the market messiness introduced by human ingenuity. But we share a responsibility to navigate the very paradigm shifts we instigate. How is this even possible when, by definition, there is no telling where "blue sky research" will lead us, when curiosity-driven science is conducted separately from real-world application?
A linear model of technological advance assumes that more theoretical basic research leads to more applied research, begetting development as ideas edge toward commercial markets. But this linear assumption is naïve: the stages interact, often introducing feedback loops between research and development.
In contrast with basic research, applied research leverages scientists' findings to make innovation technologically feasible. The development stage is conducted largely by engineers but also includes individuals with direct knowledge of sector and market trends. Development is often less risky than research; however, it can also be expensive, involving prototype construction, pilot plant design, beta testing, and the like. At this stage, firms must also assess manufacturing and marketing costs, intellectual property and legal considerations, health and environmental risks, and so on.
The time between a scientific "eureka" moment and commercialization can run to years, but start-up firms can be agile, responding swiftly to technical or market opportunities. Human ingenuity regularly outstrips existing law and policy.
In stark contrast, the framers of our Constitution designed our government to be slow, purposeful, and eminently deliberate. By baking checks and balances among the executive, legislative, and judicial branches into the mix, thereby limiting the powers of government, they ensured that no single body can impose its will upon the American people. While gridlock regularly frustrates, the founders forced us to have serious discussions. And so we should.
Many of the most interesting developments in our increasingly interconnected, fast-paced world are those that bring to the forefront unanswered questions around science and technology. Year on year, we see the benefits of technologies allowing us to communicate more readily or to travel more freely, to wrestle infectious diseases into submission, or to better link smallholder farmers to markets, thereby lifting them from poverty. Are our innovations outpacing policymakers' ability to keep up, or might it be that dated institutions are not built to understand and act?
While conducting research at Oxford to consider lessons learned from the frothy dot-com era, I discovered that our most influential innovators - be they in science, technology, finance, business, law, or government - were the individuals whose experiences allowed them to develop fluency across otherwise stove-piped communities. A brief glance at global research agendas reveals that today's donors are stimulating cutting-edge work at the very intersection of disciplines.
By now, we have advanced beyond the William Whyte "organization man" era, in which an employee worked an entire career in a single firm. Today's economy and society require boundary spanners, versatile thinkers who can communicate across disciplines, sectors, and, yes, party lines. When considering the most intractable challenges, there is no "silver bullet" solution to be found within a disciplinary silo. If we throw a narrow solution at the world's most challenging problems, we set ourselves up for failure.
So why then do we assume that questions of technology introducing issues of security, liberty, and privacy can be answered by a single tech luminary, no matter how sage and silver-haired? Ditto a single intelligence expert or elected official.
In recent years, Nelson Mandela's Elders, a group of internationally respected public figures trusted for their "almost 1,000 years of collective experience," have been called upon to resolve international challenges. Just as Russell and Einstein called upon world leaders to take responsibility for their actions after atomic power was unleashed on their world, today's leaders must do likewise.
What we lack today and will increasingly need is a blue-ribbon commission to serve as the technological equivalent of the Elders, or perhaps a Pugwash with legs. Composed of a diversity of unassailable voices - young and old, public and private, left and right, innovators and users - such a group must be empowered to consider creative efforts, as well as their potential consequences. Free from political influence and representative of the public, such a panel would derive its value from the collective expertise it brings to bear when issuing responsible recommendations.
Rather than focus myopically on a single iPhone, we must anticipate and field future questions. Sadly, a terrorist attack is not an isolated incident in today's world, but it would be folly not to learn from such tragedies. Words attributed to Einstein remind us that "it has become appallingly obvious that our technology has exceeded our humanity," and that "we cannot solve our problems with the same thinking we used when we created them."
We are a resourceful, generative society. We are problem-solvers. In our midst are Nobel Laureates, seasoned policymakers, legal eagles, military strategists, tech titans, serial start-up Millennials. Rarely are they in the same room, but they could be.
Just as we will continue to innovate, we will continue to introduce controversy. So long as we keep dreaming and doing big - and we will - we must bridge a more conscientious path from blue sky science to blue ribbon policy.