Your doctor copays are too high!

We’ve chosen a cost-sharing system of high-deductible health care. Here’s why it’s grossly unfair to Americans

Published August 5, 2013 7:28PM (EDT)

(Shutterstock/Alexander Raths)

Until recently, the high-deductible health insurance plan – pay less up front, and more when you get sick – was something of a novelty product, marketed to the young and healthy. Now, however, high-deductible plans are rapidly going mainstream – for young and old, professionals and poor alike.

Let’s say that you are in the market for a health insurance plan for your family. Perusing the choices on United Health’s website, you play it safe, avoid the plan labeled “high-deductible,” and settle on the company’s comprehensive offering, the so-called “Copay Select.”

You expect a hefty annual premium, but are surprised to learn that after the premium is paid, you will still have a sizable deductible – money to be paid out-of-pocket for services ranging from lab tests to surgery – starting at $1,000 and ranging up to $12,500 per year. You may then be subject to “co-insurance” – additional out-of-pocket expenses of up to 30% of your medical bills – with maximums reaching as high as $10,000 per year. Then there are copays for visits to the doctor, as well as four “tiers” of cost sharing for prescription drugs.
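
For concreteness, here is a rough sketch, in Python and with made-up figures, of how those pieces can stack up. The plan parameters and the simple deductible-then-coinsurance structure are illustrative assumptions, not the terms of any actual UnitedHealth policy.

```python
# A hypothetical sketch of how the cost-sharing pieces described above can add up.
# The numbers and the "deductible first, then coinsurance up to a cap" structure
# are illustrative assumptions, not any real plan's benefit formula.

def out_of_pocket(total_bills, deductible=1000, coinsurance_rate=0.30,
                  coinsurance_max=10000, copays=0):
    """Estimate what a family pays itself for a year of care, excluding premiums."""
    paid = min(total_bills, deductible)                     # deductible comes first
    remaining = total_bills - paid
    coinsurance = min(remaining * coinsurance_rate, coinsurance_max)  # capped coinsurance
    return paid + coinsurance + copays

# A $40,000 hospitalization under such a plan:
print(out_of_pocket(40_000))   # $1,000 deductible + $10,000 coinsurance cap = $11,000
```

Even before premiums, in other words, a single serious hospitalization could leave such a family on the hook for five figures.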

Realizing that such expenses might turn a medical illness into a financial catastrophe, you consider waiting until 2014, when you can buy health insurance through the new exchanges created under the Affordable Care Act (ACA). You would, however, be disappointed to learn that the mid-level “silver” plans offered on the exchange are only required to cover, on average, 70% of your annual health care expenses. Out-of-pocket “cost-sharing” – in the form of copays, deductibles, or co-insurance – could go as high as $12,700 a year for your family, depending on your income.

Now those fortunate enough to have insurance through their employer might hope to be free from this phenomenon. The protection, however, is only partial, as plans in the employer market have been trending in a similar direction for years. Between 2006 and 2012, for instance, the percentage of covered workers with a deductible of $1,000 or more tripled, as did the percentage with a deductible of more than $2,000.

Even those sufficiently impoverished to be eligible for public assistance are not immune: in January, the administration moved to allow states to charge Medicaid patients higher copayments for drugs, emergency room use and doctor’s visits.

Welcome to Copay Country.

To understand the origins of this new era of “cost-sharing,” we have to look back to the 1970s, when the rising costs of American health care became for the first time the predominant concern of policymakers.

During that decade, accelerating health care inflation was superimposed on economy-wide stagflation, a situation that only got worse after the expiration of Nixon’s health care sector price controls in 1974. From 1970 to 1980, health care spending jumped from $75 billion to $256 billion – an increase from 7.2% to 9.2% of GDP. The health care cost crisis was born.

Cost sharing is neither a new idea nor a universal one. Nixon’s 1971 health care plan, for instance, featured deductibles and copayments as well as a cap on annual expenses. The British National Health Service, by contrast, has since its inception made health care free at the point of service.

Those who support cost sharing argue that it reduces overall health care spending by deterring patients from seeking unnecessary care or – more recently – by encouraging them to seek less expensive care from competing providers. Those who oppose cost sharing, on the other hand, argue that the incentive to save can also serve as a disincentive to seek needed care, as well as a financial liability for many working-class families.

To address these concerns, in the late 1970s the federal government funded the “Rand Health Insurance Experiment,” a difficult-to-perform study that has never been repeated and that continues to inform discussions of the topic today. The experiment randomly assigned 3,958 people aged 14 to 61 to one of four categories of health insurance: a “free” plan with no cost sharing, and three other plans with varying degrees of cost sharing in the form of copays, deductibles and co-insurance.

Some of the results were predictable. Those who had to pay each time they used health care, for instance, used less of it, making about a third fewer visits to the doctor and being hospitalized about a third less often. But notably, for the group as a whole, such cost sharing didn’t seem to worsen overall health outcomes. The experiment has subsequently been used to argue that there was a potential free lunch to be had: cost sharing could decrease overall spending, and no one would get hurt in the process.

Of course, there was some fine print. First, even at the time of publication, when the researchers looked specifically at the group of patients with low incomes and elevated health risks, the results were concerning: within this group, those in the cost-sharing plans, as compared with the free plan, had worse vision as well as higher blood pressure. In fact, the researchers calculated that this group had an increased risk of dying as a result of cost sharing.

Furthermore, later evaluations demonstrated that while cost sharing clearly reduced health care usage, it reduced both “appropriate care” (for instance, effective care for acute conditions) and “inappropriate care” – not a surprising finding, given that patients generally trust the advice of their physicians and are usually not equipped to decide which care is necessary and which is dispensable.

More recent research on cost sharing has raised even more concerns. When it comes to prescription drugs, for instance, a 2007 study in the Journal of the American Medical Association found that cost sharing results in lower rates of medication usage and worse adherence to prescribed regimens. Additionally, those with higher cost sharing – in particular patients with heart failure and schizophrenia – seemed to use more non-drug medical services, suggesting that not taking medications might be increasing medical costs in other ways.

Cost sharing also frequently puts individuals and families in extremely unenviable situations: should I go for the follow-up CT scan and make sure that some small growth isn’t developing into a cancer, or should I pay my rent? A 2011 study published in the journal Health Affairs looked at the effect of “high-deductible plans” (defined as plans with a deductible of $1,000 or more) on families in which one member was chronically ill. Almost half of the families with a high-deductible plan – more than double the rate among families in traditional plans – faced substantial financial burdens, such as difficulty paying basic bills or having to set up payment plans. Cost sharing may also cause some families to avoid care altogether: a 2012 study published by some of the same researchers found that high-deductible plans were associated with an increase in delayed or forgone care, for both adults and children.

Cost sharing, it is becoming increasingly clear, may not be a free lunch after all, particularly for those who have to pay for it.

But putting aside the empirical data for a moment, there is also something deeply counterintuitive about the underlying logic of cost sharing: that someone who is sick – frequently operating under significant physical, psychological, or financial constraints – will be able to impartially dissect the rationale for his or her doctor’s recommendations, parse the nuances of the frequently controversial and incomplete medical evidence, and safely decline only the unneeded medical interventions. And that the same individual – so the theory goes – will simultaneously engage in a lifelong hunt for high-quality but bargain-priced deals among competing groups of laboratories, doctors, pharmacies, drug manufacturers, hospitals, dialysis providers, imaging centers, and medical supply companies.

Such an expectation is – to put it modestly – entirely unrealistic. But even worse, it is grossly unfair to the sick patient.

Regardless, unless things change, the future promises only more and more cost sharing. In addition to the trends already in place, in 2018 another provision of the ACA will go into effect – a 40% excise tax on high-cost insurance plans, defined as family plans costing more than $27,500 and individual plans costing more than $10,200. These are no doubt very expensive plans in today’s dollars, and if the provision went into effect today, few would be affected. But over the years, as a result of continued health sector inflation, more and more plans will begin to hit that ceiling. And when they do, it will not be in the employer’s or insurer’s interest for premiums to go any higher: further increases in the cost of care will simply take the form of more and more cost sharing. As the New York Times reported in May, many companies are already increasing copays and deductibles so that they can remain below the threshold when 2018 arrives.
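
The arithmetic of the tax is straightforward, at least as it is commonly described: the 40% levy falls on the portion of a plan’s cost above the threshold. Here is a small sketch assuming that reading; the premium figures are made up for illustration.

```python
# A sketch of the arithmetic behind the 2018 excise tax described above.
# It assumes the common reading that the 40% tax applies only to the portion
# of a plan's cost above the threshold; the premium figures are hypothetical.

def cadillac_tax(annual_premium, threshold=27_500, rate=0.40):
    """Excise tax owed on a family plan costing `annual_premium`."""
    excess = max(0, annual_premium - threshold)   # only the amount over the ceiling is taxed
    return rate * excess

print(cadillac_tax(27_000))   # 0.0    -- below the threshold, no tax
print(cadillac_tax(30_000))   # 1000.0 -- 40% of the $2,500 above the threshold
```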

Understanding the logic of this tax is crucial: its purpose is not to raise revenue, but to contain costs by limiting the scope of benefits under these so-called “Cadillac plans.” The problem is that costs are contained only by dumping more of the price of care onto us when we get sick, forcing us either to decline care or, theoretically, to find cheaper alternatives.

The logic of cost sharing is therefore essentially the logic of health care consumerism, increasingly a dominant mantra in political and health policy circles. Make the patient a consumer, and the system will be saved.

Yet the United States already has more health care “consumerism” and cost sharing than other developed countries – and much higher costs. The truth is that there are safer and better-proven methods of cost control that we could employ, methods that wouldn’t involve making a patient pay every time he or she gets sick.

We could, for instance, allow Medicare to bargain directly with drug companies over prescription drug prices, as other wealthy countries already do: by one estimate, the savings from this reform alone could range from $230 billion to $541 billion over ten years.

More ambitiously, we could work towards a “single payer system,” which could save billions through reduced administrative and clerical expenditures, while allowing costs to be directly controlled through global budgets and fee schedules. Gerald Friedman, an economist at the University of Massachusetts at Amherst, recently estimated the savings of such a system at $592 billion annually.

At the same time, we are of course obliged to continue to move away from procedures and tests with high cost and little health benefit, through both physician and patient education.

But rather than borrow these and other ideas from better performing health care systems, we are only making our system more and more like itself.

“Copay Country,” in other words, is not an inevitability – it is a choice. But is it really the country we want to become?


By Adam Gaffney

Adam Gaffney is a physician and writer in Massachusetts. He blogs at www.theprogressivephysician.org and is on Twitter at @awgaffney.
