Please Have a Seat, Your Smartphone Will Be Right With You


How many of you chose to type your symptoms into a Google search box instead of picking up the phone and calling your doctor's office the last time you felt under the weather? Chances are, if you have access to a computer -- and if you're reading this, you probably do -- nearly 100 percent.

While Congress gets caught up in the political circus over the health care bill, overhauling it, amending it, threatening to repeal it, and sending it to the Supreme Court for review, Internet technology is quietly making health care better simply by arming people with more information. More and more people now use the Web to better educate themselves on healthy lifestyle choices, medical conditions and available therapies. Researchers at MIT's New Media Medicine, who are building on this information revolution and working on various projects to empower ordinary people with medical knowledge, call this "destroying the information asymmetry."

With easy access to their medical data, participants in the Media Lab's projects can take a more proactive approach to monitoring their health, whether by tele-collaborating with health care personnel for shared decision-making, contributing to collective community databases built from anecdotal medical information, or simply using their cell phones as self-awareness systems that remind them of daily health routines.

Mobile devices, of course, are already serving as self-awareness systems -- and as man's best friend -- in a variety of areas, from fitness schedules to temperament control.

Smartphone apps are taking the medical information revolution one step further. There is an app (or 20) to keep on top of workout schedules and weight-loss plans, track food intake, conduct vision tests, and monitor blood pressure and cardiovascular health. In addition to overseeing general health and well-being, apps can also help manage more serious illnesses: Chemo Diary keeps a chemotherapy schedule for cancer patients, Glucose Buddy manages blood sugar in diabetics, and Depression Check does just what its name indicates. More ambitious apps, like iTriage -- designed by physicians -- attempt to diagnose conditions based on specific symptoms, even proposing possible treatments.

I know what you're thinking. Wouldn't it be great if this know-it-all, palm-sized device, whose abilities only went as far as dinner planning a decade ago, could also examine bodily fluids before spitting out detailed assessments?

A team of Korean researchers may soon be able to get a mobile phone to do just that. Tapping into the sensitivity of a smartphone's touchscreen, they are working to enable mobile devices to double as test tubes and sample slides. Using a film that can selectively react to biomolecules, phones of the future will be able to respond to molecules in much the same way that they react to electrical signals generated by fingertips.

For anyone who has, like me, dealt with the stubborn resistance of an iPhone to deliver on even simple commands, such as sliding to open or declining a call, this may be hard to fathom. But using the screen's "capacitive sensitivity," the Korean team has achieved near-perfect recognition rates in experiments. So there still may be hope for us lesser mortals to decline that persistent caller -- or perhaps test ourselves for infectious diseases from the comfort of our living rooms. And when we do decide to get off our couches to get professional help, we can find the closest health care facility for treatment or locate clinical trials for the newest drug -- there's an app for that.

But this sort of armchair diagnosis has more to it than reducing inconvenient wait times. Public health experts are hoping that self-testing personal devices will allow people to diagnose themselves for embarrassing illnesses such as sexually transmitted diseases, thus helping treat and curtail transmission.

This is the kind of thing that makes Vinod Khosla wonder if health care of the future will need doctors or algorithms.

An "average doctor" could eventually be replaced by a computer, says Khosla, bringing down the price of doctor's visits while providing patients with convenient care and the reliability of computer-assisted data and information. He is right in his argument that most people rely on the Internet's anecdotal and untested information to diagnose themselves for non-critical illnesses anyway. What the computer would do is put together all of this data, check its reliability, create a consensus, and deliver it within the context of a patient's medical history and biographic and genetic information.

And what about the non-average doctor? The same technology that is bestowing healing powers on smartphones is beginning to equip thousands of robots at hospitals and other caregiving facilities to handle routine tasks and tests -- from transporting medication to stitching up wounds to checking vital signs. Powerful microprocessors, sensitive sensors, motion detectors, and voice-activation capabilities are allowing robots to interact with patients, performing more complicated functions than simply providing an extra pair of hands.

Sophisticated bots like the da Vinci are not only filling in for doctors, but also outperforming them in some areas, such as making precise incisions and small movements -- tasks that a computer-operated machine can perform with higher fidelity than a human hand, thus rendering surgeries less invasive and more precise.

A couple of thousand da Vincis have been deployed around the world, conducting about 200,000 surgeries a year. The da Vinci's newer, cheaper, and more portable cousin, the Raven, is expected to one day operate on an open heart.

Robots are acquiring more specialized skills than ever before, because they are -- let's face it -- better than human beings at learning by repeating the same task over and over again. Robot doctors are proving adept at minimally invasive techniques that are, in some cases, too risky to attempt with clumsy human hands.

More mainstream and universal than remote-controlled, alienesque robotic claws performing heart surgery are devices equipped with videoconferencing capabilities, which allow doctors to diagnose and check on patients remotely, and even conduct psychiatric or psychological therapy online.

With state-of-the-art cameras and video capabilities, smartphones are allowing images of skin lesions and bacterial infections to be transmitted within minutes for expert observation and instantaneous diagnosis. Communication like this can be particularly helpful in developing countries and hard-to-reach areas, where health care workers can obtain a speedy diagnosis by transmitting images to experts thousands of miles away.

But while it's probably easy to envision robots handling precision tools, meticulously scanning sample slides, or overseeing tedious data sets, are we ready to let them encroach upon what has long been considered the human domain -- cerebral comprehension and cognitive processing?

As Sarah Kliff notes, diagnostic robots with their open minds and wealth of information are not far behind.

In addition to blue-collar robots -- as Timothy Hay has aptly named the physically gifted bots -- there may soon be "white-collar" ones that do more than tedious physical labor or precision instrument handling.

Last year, shortly after Watson, IBM's know-it-all supercomputer, trounced its human competitors at Jeopardy!, there was talk of how its vast information sources and unbiased analytical mind could solve many of the problems in health care today. Watson is already proving capable of making individualized assessments and diagnoses, not because it has precise hands or can haul heavy objects, but because it can -- lo and behold -- think. All evidence indicates that the open-mindedness of the robot, combined with its vast stores of Web-enabled information, can make it a fairly reliable medical assistant at a patient's bedside or in a doctor's office (shameless self-plug warning!).

More recently, Watson has moved on to training for cancer diagnosis and treatment.

But what about human interaction? What about a doctor's experience, judgment, and empathy? Notwithstanding Paro, the therapeutic bot trained to provide emotional support, those exclusively human traits can't be replaced yet. However, as health care becomes more and more expensive and doctors have less and less time, empathy is neither a skill we can afford to demand of doctors anymore, nor one they uniquely possess.

As Ezra Klein points out, if prolonged conversation and empathy are a physician's biggest advantages over a robot, they're in trouble (and given my three most recent doctor's visits, I'd have to agree).

Doctors are nowhere close to being wiped out of hospitals by their robotic counterparts, but they could certainly use help in some areas, and swarms of little bots are more than eager to do their bidding. What could possibly be wrong with that?

