Take a step back in time -- way back -- to the dark days of the early 1990s. These were the days before the Internet was in everyone's office, home, purse and pants pocket. Back then, I served on the senior executive team at AOL, and we had a vision: "To make the Internet as popular as the telephone and the television, but even more valuable." We wanted to change the world of communication, commerce and entertainment.
Others were fearful, and their arguments about the value and dangers of the Internet were very real -- often voiced by important decision-makers:
I love the power of the Internet, but can it track me?
Watch television on my computer? No way!
Never give your credit card number to anyone on the Internet.
Why should I send a computer message to a friend when I have a telephone?
What about the children!?!
From the luxury of 2014, the early anti-Internet naysayers seem as irrelevant as the Know-Nothings of the 19th century. But not long ago, the success of the Internet depended upon finding the right balance: the balance between its potential positive power and the fears and concerns that come along with any new technology.
This is the same challenge we face today. Medicine is advancing not just in the chemical and biological domains, but also in the engineering and technological fields. The future of detection, prevention and treatment for the world's most devastating diseases may not only be a pill or a shot; it may also be found in a microchip that goes inside your body to detect, monitor and treat disease.
People are worried. Their fears and concerns are overshadowing the potential. They're turning to the Internet to voice those concerns, mounting hyperbolic arguments against the potential of new tools to detect and fight disease. (The irony of using one once-new technology to rail against the next, one must assume, is lost on them.)
It's an invasion of privacy!
We're undermining our humanity!
What if electronics "take over" my body?
What about the elderly!?!
To be sure, there are serious questions of ethics and privacy to be worked through, but it would be terribly misguided to let what-if anxieties stand in the way of very legitimate breakthrough health solutions -- especially at a time when age-related diseases like Alzheimer's are progressing at an astronomical pace and ravaging countless lives in the process.
So let's proceed with both haste and caution. We must encourage the "technological revolution" in healthcare, while also balancing this exciting sea-change moment with ethical sensitivity and sound policymaking. It is only by finding the right balance that this new frontier of healthcare can be a win for all involved.
In seeking this balance, however, we cannot lose sight of what's at stake: The health, happiness, and livelihood of so many people around the world can be improved by emerging health technologies.
On one level, digital devices can give people greater independence and a higher quality of life. These technologies run the gamut from the gee-whiz (memory prostheses; smart sensors in home floors to detect falls; therapeutic robots to assist the functionally disabled) to the quotidian (brain-body prostheses for missing or damaged limbs, IC wheelchairs, medication reminders). All these devices bring new technologies -- and data collection -- into the home. And that's a good thing. Data leads to insights, and insights lead to better medical and technological development.
This isn't Big Brother. It's Big Data.
On another level, technologies can begin to work hand-in-hand with prevention and treatment. They can provide information about if, when, and under what circumstances patients need medical or other interventions. To use the Alzheimer's example, scientists and other researchers have a devilishly difficult time identifying and treating each individual's particular brain impairment.
But what if a microchip could understand individual brain impairments and assist with memory and reasoning capabilities? DARPA, the small innovative government agency that developed the precursor to the Internet, is working today to develop such a "memory prosthetic" for victims of traumatic brain injuries. While such a device is still years away, the prospect is exciting. And the power of Moore's Law -- the escalating rate at which microelectronics become cheaper and faster -- can make the extraordinary become the ordinary quite quickly. (Again, one recalls the rise of the Internet!)
Innovations like DARPA's microchip and others mark the emergence of the new field of "bioelectronics." Bioelectronics are devices or systems that interact with the electrical signals sent by your nervous system, allowing each of us to detect, monitor or react to our diseases. These bioelectronic technologies hold the potential to become one of the most revolutionary developments in the history of health and medicine.
It's truly groundbreaking, but parallel attention must be paid to questions of privacy, security and personal control. We must, as we did successfully with the Internet, get the balance right between technology and ethics. Only then can we ensure that the highest and best use is made of these potentially exciting new advances.
The marriage of medicine and digital technology should be embraced. If the history of the Internet teaches us anything -- and surely it does -- it is that technology can and should be guided by carefully crafted policy and social constraints without losing its power to advance society. Let's balance ethical and moral questions with the vigorous pursuit of technological development.
Neither will benefit by ignoring the other.