There's a growing trend in health care for providers and payors to look beyond traditional medical claims to the data that captures what we're doing in our daily lives -- "for our own good." The objectives, they say: to provide better health care sooner so that we can prevent illness, avert downstream medical complications, and reduce medical spending.
To be sure, data are the lifeblood of research, which can benefit you and me and the people we love. The upsides of Big Data for health are many: amassing huge amounts of information from large numbers of individuals to conduct more effective clinical trials that lead to cures for sick people; helping individuals benchmark themselves against other people like them to stay well and manage chronic conditions; anticipating and preventing the spread of public health epidemics in specific geographies; and managing population health in ways that help drive down medical costs for employers and consumers alike. Many of the Big Benefits of Big Data were presented in the July 2014 issue of Health Affairs, which documented dozens of use cases for data analytics to drive population health and cost savings for the health system at large.
We each create "digital exhaust" quite passively in our daily lives: when we use our smartphones (via the GPS function, for example), swipe credit cards for retail spending, check in on social networks, and use mobile apps that track personal activities. These personal data bits are collected by third-party data brokers and mashed up into Big Data mines. The information can be very valuable for health because it captures us outside of the health care system: where we live, work, play, and learn.
Our health is fostered (or diminished) by external factors outside of the health system and our genes. The health gods are in our daily details -- what and how much we eat, how much we move about, our alcohol consumption, our safe (or un-) sex lives, our moods, and our social connectivity. These bits of data help build health profiles which, when aggregated with other data, better inform research. As Kipp Bradford, a biomedical engineer active in the Maker Health movement, explained to me, "When you do a clinical trial with 1,000 people you're taking 1,000 people's physiologic measurements and generalizing that across a billion people. But if you can have a trial with billions of people, then you can understand the nuances of the effects on one person."
The case for integrating consumer-generated data with traditional medical numbers was made by the Chief Clinical Officer of the Carolinas HealthCare System, who told Business Week, in its story titled "Hospitals Are Mining Patients' Credit Card Data to Predict Who Will Get Sick," that "Information on consumer spending can provide a more complete picture than the glimpse doctors get during an office visit or through lab results."
In Here's Looking At You: How Personal Health Data Are Being Tracked and Used, a report I wrote for the California HealthCare Foundation, I present the case for Big Data in health care, the promising uses for incorporating our individual "small data" into Big Data sets, and the potential darker sides of doing so. There's much promise for people in the analysis of Big Data for health. There's also peril on the personal privacy front for U.S. health citizens if fragmented U.S. regulations don't fast-forward to meet this Brave New Healthcare World, which is already steaming ahead of consumer protections.
There's a lack of transparency in several aspects of the Big Data/health nexus: the lack of accessible, simply communicated privacy policies for the apps and tools we use in daily life, in and beyond our health; the lack of protections for the many kinds of consumer-generated data that aren't covered by HIPAA and the various state laws that make up a very leaky patchwork quilt of privacy protections (contrast this with the European Privacy Directive, which covers the EU's so-called "health citizens"); and the "black box" nature of data analytics algorithms -- those mathematical calculations that mash up our data, make assumptions, and churn out consumer segmentation profiles and report cards that may be flat-out incorrect, and that may also be used for discriminatory ends (say, in financial or employment decisions).
In a nutshell, the algorithm is only as fair as the data it analyzes. Because of that, Fred Trotter of O'Reilly Radar warns, "We are outsourcing decisions to a virtual entity which has none of our humanist notions of fairness or a sense of fair play."
Until Americans have comprehensive privacy laws protecting all aspects of our personal information, people will need to be mindful about their check-ins on social networks (hint: don't boast on Foursquare or Facebook about your nightly visits to bars), their credit card swipes in places they wouldn't want made public, and the mobile apps with which they share personal data. Be your own best steward of your personal data.