"The high cost of health care" seems to have settled in as a lingering issue. Every six months or so, the fever sweeps the body politic, diagnoses are pronounced on the opinion pages of newspapers and in popular magazines, and handy prescriptions are written out. "Just take this to your representatives in Congress," the opining experts say to the citizens, "and be sure to follow the directions when Congress dispenses the bitter pills."
More often than not, the remedy of choice is socialized medicine. (That's the accurate designation, but it's usually avoided. For public consumption, the recommended palliative carries brand-name labels such as "a nationwide health-care system.")
It would be nice to think that America still doesn't have a national health service after all these years (that's the way liberals intone it, looking mournfully across the Atlantic) because the American people have enough horse sense not to turn over something as important and intimate as their health care to pols and bureaucrats. Depressingly, though, it seems that a majority of Americans—close to three-quarters in one recent survey—think the feds should start up a national health-care program.
In fact, in an era when the public is otherwise willing to test the less-government tonic, various surveys show that lots of people, as the National Journal recently summarized the poll data, "will go along with just about any change [in health care delivery] that contributes to cost containment." Anything, that is, that they think will cut costs. And that includes everything from requiring patients to get second opinions on surgery to imposing government price controls on hospitals and doctors.
All this is based on the premise that health-care costs are out of line, out of control, and out of the hands of anyone—patients, providers, or payors—involved in the system. It's not clear that that premise ever was quite true, though, and there's increasing evidence that it's out of sync with current reality.
Consider some disparate bits and pieces of evidence:
• Hospitals absorb 40 cents out of every health-care dollar spent in the United States. In 1984, hospital costs rose only 4.5 percent, a whopping 56 percent less than the 10.2 percent rate of increase the year before.
• Part of the reason: hospital admissions dropped in 1984 by nearly 1.5 million—the first decline after decades of uninterrupted increases.
• Health costs are being shared more by patients. In 1984, 21 percent of the health plans for medium and large employers had deductibles of at least $150, up from 7 percent just two years earlier. And, as economic theory predicts and studies now confirm, when a third-party payor doesn't pick up the tab for everything, people exert more commonsense restraint over their demand for medical services (with, interestingly, no measurable ill effects).
• Doctors are increasingly signing on with alternative health-insurance programs that require them to accept payments up to 20 percent less than what they had been charging.
• Hospitals are beginning to offer discounts to attract more of the patients in their communities.
• For years, governments, federal and state, tried to exert control over hospital costs by limiting purchases of high-cost, sophisticated diagnostic equipment so as to avoid duplication. Hospitals generally managed to get around the regulations. But now, the American Hospital Association reports that more hospitals are…you guessed it, sharing high-cost, sophisticated diagnostic equipment to save money.
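The 56 percent figure in the first bullet is the relative decline in the *rate of increase*, not in costs themselves. A quick check of the arithmetic, using only the two growth rates given above:

```python
# Verify: a 4.5 percent rise is about 56 percent less than the
# prior year's 10.2 percent rate of increase.
prior_rate = 10.2   # 1983 hospital-cost growth, in percent
new_rate = 4.5      # 1984 hospital-cost growth, in percent

# Relative decline in the growth rate, as a percentage
relative_decline = (prior_rate - new_rate) / prior_rate * 100
print(f"{relative_decline:.0f} percent less")  # → 56 percent less
```

Note that hospital costs still rose in 1984; it's the pace of increase that was cut roughly in half.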
What's happened in the health-care market, which everyone thought could be restrained only by controls imposed from above by government? Good old competition is part of the answer.
In 1960, there were 86 medical schools in the country, and they turned out 7,100 new doctors that year. By 1979, there were 125 schools, churning out 15,000 new doctors. However much the docs complain about what they deem an "oversupply" of their kind (and a lot of them complain a lot), the fact is that it's great for health consumers. It largely accounts for doctors going along with the new insurance programs that are trying to control costs and also has contributed to the health-care-delivery phenomenon of the decade: the growth of independent, stand-alone clinics, emergency-care facilities, surgery centers, and so on, which are widely hailed by experts as an efficiency-enhancing, cost-saving development.
Meanwhile, for-profit hospitals have been growing, injecting a healthy dose of competition in that sector, too. Between 1970 and 1980, for-profits boosted their share of the country's hospital capacity from 6 to 9 percent. And they're the leading edge in cost-saving innovations such as computerized patient charts.
But there's another new and important factor. Health insurance companies have finally decided to do something about the cost of medical care instead of just paying the bills and raising premiums as needed to cover the costs. The last few years have been marked by new kinds of plans that negotiate charges ahead of time with doctors and hospitals, mandatory second opinions to avoid unnecessary and marginally necessary procedures, incentives to reduce hospital use, and so on. It's activism unheard of since private health insurance started taking off in 1945.
And it's paying off. Blue Cross-Blue Shield, for example, figures it saved subscribers $6 billion in 1984. And in May it announced unprecedented refunds of nearly $750 million because a sharp drop in hospital use had saved so much money.
Incredibly, some people persist in blathering the old saw that the medical market isn't like any other market. Vita Ostrander, for example, president of the 19-million-member American Association of Retired Persons, got up before a conference on the future of Medicare this summer to criticize the Reagan administration's efforts to control Medicare costs by encouraging more competition in the medical market. "The health care marketplace," she charged, "resists the normal forces of supply and demand."
Maybe Ostrander can be excused. Economist Lester Thurow really ought to be ashamed of himself. In a piece written for the New England Journal of Medicine and picked up by Harper's recently, Thurow had to ignore the evidence of the last few years in order to allege that "insurance companies actually have an interest in increasing healthcare spending, since they make money…by taking a management fee that is usually a percentage of total expenditures."
Apparently, the insurance companies are acting against their interests these days. As an economist, Thurow ought to know that doesn't fly as an explanation of anyone's behavior in the marketplace. But then, one has to wonder about his economic learning. In the same article, arguing against greater reliance on market mechanisms, he made this ludicrous statement: "Since the richest 20 percent of all U.S. households have 11 times as much income as the poorest 20 percent, any efficient market mechanism will give 11 times as much medical care to the top 20 percent as it gives to the bottom 20 percent." Do the top 20 percent also consume 11 times as much food as the poorest, drink 11 times as much water, buy 11 times as many cars? Of course not. Demand for necessities doesn't rise in proportion to income; the income elasticity of demand for such goods is well below one, an elementary point any Econ 101 student grasps in a flash.
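Thurow's figure assumes consumption of medical care scales one-for-one with income. To see how quickly that assumption collapses for a necessity, here is a minimal sketch; the elasticity value of 0.2 is a hypothetical illustration chosen for this example, not a number from the article:

```python
# Illustrative only: with income elasticity of demand e < 1,
# consumption scales roughly as income**e, not proportionally.
income_ratio = 11.0   # top quintile vs. bottom quintile (Thurow's figure)
elasticity = 0.2      # hypothetical value for a necessity

consumption_ratio = income_ratio ** elasticity
print(f"{consumption_ratio:.2f}")  # roughly 1.6, not 11
```

Even a generous elasticity of 0.5 would yield a consumption ratio near 3.3, nowhere close to Thurow's 11.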
So what does explain the fact that insurance companies for a long time didn't do anything about spiraling costs but now they are hopping to it? For one thing, world competition in all markets is forcing employers and, increasingly, even unions to look carefully at all costs, and health care accounts for a large chunk of employee benefits. So employers, the major purchasers of private health insurance, have goosed the insurance companies.
Another plausible factor—no doubt not to the liking of Lester Thurow and his fellow liberals—is offered by Clark Havighurst, a specialist in health-care legal issues, in his contribution to a new book from the American Enterprise Institute, Incentives Vs. Controls in Health Policy. It's actually government getting out of the picture that's done a lot to spur insurance companies to act, Havighurst suggests.
When a major cost-containment measure pushed by the Carter administration was defeated in 1979, he explains, employers and insurers alike began to give up on a long-held assumption that the government would sooner or later take regulatory action to get the situation under control. Then along came the Reagan administration, with reforms aimed at controlling the costs of its own programs, Medicare and Medicaid. The result, he notes, is that "both governments and private payers [began] for the first time to act as prudent buyers rather than as passive intermediaries."
There's still a lot of "price stickiness" afflicting the health-care market. As contributors to the AEI volume make clear, there's a lot of stepping aside that government could do to help competition make an even bigger difference to consumers than it has in the last few years. And if the opinion makers would just start bringing people up to speed on what's actually going on these days in the medical marketplace, the public could cure itself of the high-cost-therefore-government-action syndrome. Which would be great for the health of the body politic, given everything we already know about the bracing effects of competition—even on health care.
This article originally appeared in print under the headline "Forget Placebos—Let’s Go for the Strong Stuff".