Debates about health care frequently focus on the number of people with and without insurance, because it’s a relatively straightforward thing to measure. Either you have coverage or you don’t.
But an equally important question is what kind of insurance you have -- and that includes whether your policy leaves you exposed to large, potentially crippling out-of-pocket expenses. The answer makes a big difference.
A big co-payment might discourage you from filling a prescription to treat a serious condition, or from seeing a specialist whose attention you need. If you get sick and end up in the hospital, a high deductible could leave you owing many thousands of dollars, forcing you into financial distress.
You wouldn’t be uninsured, but you would be underinsured.
You’d also have lots of company.
Recently released data, including two brand-new surveys, suggests that millions of Americans with insurance are in just that situation. And while there would almost certainly be more such people if the Affordable Care Act were not in place, those who still face high medical bills are disproportionately poor and sick -- in other words, the very people who most need the protection that health insurance is supposed to provide.
It’s a complicated situation to parse -- in part because, strange as it sounds, high out-of-pocket costs also have an upside. But the findings from those two new surveys tell a story that gets too little attention from policymakers.
The first comes from the Commonwealth Fund, which for many years has been asking respondents about their medical expenses and how big those expenses are, relative to household income. It defined the underinsured as those whose out-of-pocket expenses are more than 10 percent of household income or, for people who are at or below twice the poverty line, more than 5 percent of household income. By its definition, about a quarter of all working-age adults, some 31 million people, were underinsured as of 2014.
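The Commonwealth Fund's two-tier threshold amounts to a simple income test. A minimal sketch of that rule follows; the function name, the poverty-line figure (assumed here to be the 2015 federal guideline for a family of four) and the example numbers are illustrative assumptions, not part of the survey's published methodology.

```python
POVERTY_LINE = 24_250  # assumed 2015 federal poverty guideline, family of four

def is_underinsured(out_of_pocket: float, household_income: float,
                    poverty_line: float = POVERTY_LINE) -> bool:
    """Apply the survey's threshold as described above: out-of-pocket
    spending above 10 percent of household income, or above 5 percent
    for households at or below twice the poverty line."""
    if household_income <= 0:
        return True  # no income available to absorb medical spending
    # The stricter 5 percent threshold applies at or below 200% of poverty.
    threshold = 0.05 if household_income <= 2 * poverty_line else 0.10
    return out_of_pocket / household_income > threshold

# A family of four earning $40,000 sits below twice the poverty line,
# so $2,500 in out-of-pocket costs (6.25% of income) crosses the 5% bar.
print(is_underinsured(2_500, 40_000))   # True
# The same $2,500 is only 2.5% of a $100,000 income -- under the 10% bar.
print(is_underinsured(2_500, 100_000))  # False
```

The two-tier design reflects the idea that the same dollar amount of medical spending is a heavier burden for a lower-income household.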
That’s proportionally more than twice as many people as were underinsured in 2003, although the percentage has been stable for the last few years. The report is full of interesting and potentially telling details -- like the fact that, among the most populous states, the problem was far worse in Florida and Texas than it was in California and New York.
But perhaps the most important survey finding is about the consequences for people who fit into the underinsured category. Some took financial hits: 47 percent of respondents said they exhausted their savings to pay medical bills, 23 percent were dealing with collection agencies and 7 percent had to declare bankruptcy. Others who were underinsured opted not to get recommended care: 26 percent didn’t fill a prescription because of expense, while 24 percent didn’t get prescribed follow-up treatment or tests for the same reason.
These findings are consistent with other recent studies -- including one from the Henry J. Kaiser Family Foundation, published in the spring, which found that about one-quarter of all non-elderly Americans with private insurance don’t have enough liquid assets to pay the deductibles on their health insurance policies. As Drew Altman, the foundation’s president, wrote in The Wall Street Journal, those people will “have to put off care or incur medical debt” if they develop any serious health problems.
Of course, a major goal of the Affordable Care Act, or “Obamacare,” is to protect people from precisely these situations -- not simply by helping people without insurance to get coverage, but also by setting minimum standards for what insurance must cover. Mostly those standards apply to the “non-group” market -- that is, the market for people who buy coverage on their own, directly from insurers or through the new insurance exchanges, rather than through employers.
The new health care law requires that all plans include “essential benefits,” including benefits such as prescription drugs, rehabilitation, mental health and maternity care that non-group plans frequently left out before. The law also sets limits on out-of-pocket expenses. People with household incomes below 250 percent of the poverty line ($48,500 for a family of four) are eligible for special tax credits that offer yet more protection against high out-of-pocket expenses. These requirements raise premiums overall, which is why so many people (particularly younger and healthier consumers) experienced “rate shock” when the Affordable Care Act first took effect. But the law, by design, discounts those premiums for low- and middle-income buyers through a series of tax credits.
Have those reforms helped make health care more accessible and affordable? If so, by how much? That’s where the other new study -- from the advocacy group Families USA, based on data from the Urban Institute -- can shed some light. More people had insurance, the report noted, and people buying coverage with federal financial assistance tended to have better protection from expenses. Even so, the report pointed out, one in four people in the non-group market were still going without recommended medical care because of the cost.
Ron Pollack, president of Families USA and among the Affordable Care Act’s most enthusiastic proponents, told The Huffington Post that the law “has not only expanded health coverage to more than one out of three of the previously uninsured, it also provides subsidies to low-income people seeking help with deductibles and co-payments. However, even with these significant improvements, more than 25 percent of those with year-round insurance are forgoing needed health care because they find these costs unaffordable. This can best be fixed by reducing the burden of high deductibles.”
Survey data like this is not always the most reliable source of information, since it relies so heavily on personal recollections and impressions. The RAND Health Insurance Experiment, a decades-old study conducted by the RAND Corp. that remains the gold standard of health care economics research, demonstrated that people are pretty lousy at distinguishing bad care from good care. Throw in the fact that physicians don’t always know, either, and it’s likely that at least some of the people skipping medical care were no worse off for it -- since the tests or treatments they avoided were unnecessary.
Economists also argue, plausibly, that high cost-sharing encourages people to be more aggressive consumers of health care -- whether that means shopping for cheaper policies or choosing providers who can perform the same services for less money. In fact, most experts think the increase in out-of-pocket costs is a major reason that private health insurance premiums, and health care costs generally, are rising at historically low rates. When conservatives and business executives talk about giving people “skin in the game,” this is what they have in mind. It’s why employers were increasing cost-sharing before the Affordable Care Act became law -- and why they’ve continued doing so since.
Many experts would say the trick with designing an insurance system is striking the right balance -- in effect, exposing people to just enough costs that they start to think like consumers but can still afford their medical care when they are sick. Others emphasize the need to design benefits in a way that steers patients toward the most effective kinds of care, either by linking cost-sharing to quality (as an initiative called Value-Based Insurance Design would) or simply by managing and coordinating care more effectively (as integrated group practices like Kaiser Permanente traditionally have).
Notwithstanding the caricature of Obamacare as a far-left government takeover of health insurance, the Affordable Care Act was an effort to blend these different approaches. If these two reports are correct, there’s more work to be done.