
How Medicine Became a Growth Business

Below, a guest post by Dr. Clifton Meador. Over the years, Meador has practiced as a family doctor, an epidemiologist, a health care administrator, and Dean of the University of Alabama Medical School. He also has published many books and articles, including a tale set in the not-too-distant future called The Last Well Person, which uses satire to comment on the folly of our obsessive drive to test and screen every well person in America, until we find something wrong with each and every one of them. If you have seen the film version of Money-Driven Medicine, you will remember Meador as the doctor who takes the viewer on a wonderful tour of Nashville. Thanks to Dr. George Lundberg for sending me this essay.

I would add only that I don’t think that Meador is saying that “the worried well” caused the overtreatment that has become so prevalent in our health care system. Rather, they responded to the advertising and the hype as hospitals, drug-makers and others began to persuade us that there is a cure for everything—if you can just detect it early enough.

Meador quotes Lewis Thomas on “the general belief these days seems to be that the body is fundamentally flawed, subject to disintegration at any moment, always on the verge of mortal disease, always in need of continual monitoring and support by health care professionals.” This, I think, is key.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Personal Observations on the Changing Scene In American Medicine: 1955 to 2010

Clifton K. Meador, M.D.

When I entered private practice in 1962 (after graduating from medical school in 1955, completing a medical residency, serving two years in the Army Medical Corps, and completing a NIH Fellowship in Endocrinology), there was no Medicare, no Medicaid, and very little medical insurance of any kind. Patients paid cash, vegetables, meat, or nothing.

We turned no one away for lack of insurance or inability to pay. I had no idea when I saw a patient if they did or did not have insurance or were able to pay. The medical insurance of those times paid only if the patient was admitted to a hospital with a known diagnosis. For those patients with insurance, we admitted all we could in order to get tests and imaging studies paid. This led to an abuse of hospitalizations and many bogus diagnoses. Almost any diagnosis would justify the admission, including pseudo-diagnoses such as achlorhydria, hiatus hernia, or even retroverted uterus. This wrong use of admissions to get insurance coverage was common. There was no insurance for outpatient care. That did not change until late in the 1960s.

To illustrate how irrational medical insurance was, consider this: I had a female patient in the late 1960s with hyperthyroidism whom I wanted to treat with radioiodine (I-131). She had Blue Cross insurance. I called to tell them I could treat her as an outpatient for around $150. If I admitted her, it would cost over $1,000. The insurance company refused to cover her as an outpatient, so I admitted her, costing the company over $900. There were many examples of this sort of absurdity until insurance began to cover outpatient care. When outpatient insurance coverage came into existence, it became very difficult to get patients admitted to hospitals.

We gave free care as a “professional courtesy” to physicians, ministers, and veterinarians and all members of their families. “Professional courtesy” disappeared some time ago and is now illegal under Medicare rules (deliberate free care is judged to be an enticement!). Hospitals were largely charitable organizations, barely able to stay alive. I saw this first hand in the early 60s when we had to pay cash on delivery to get milk for the nursery at the University Hospital in Birmingham. No cash, no milk.

Around 1966, Medicare and Medicaid came into existence. By the late 1960s, for-profit health care companies and hospitals appeared on the scene, relying on the steady stream of money from the federally funded Medicare program. In addition, hospitals' capital costs were allowed to be reimbursed as a “pass-through.” This steady source of federal money for expansion of facilities, coupled with the increasing profits of for-profit companies, funded an explosion of new technologies. Wall Street for the first time saw health care as a fundable commodity and began to fuel the rapidly expanding medical-industrial complex with stock offerings. Stock prices soared, feeding the expansion of programs across the board. The preceding increases in NIH funding in the 1950s and 1960s had led to the discoveries and new knowledge that were needed before new technologies could be developed. (Scientific knowledge precedes technologic or engineering development in most cases.) There was a remarkable convergence of forces and sequence of events:

* Scientific discoveries from increased NIH funding in the 1950s and 1960s
* Inventions in technology and engineering
* Federal funding of Medicare
* Medicare capital pass-through for equipment and expansions
* Wall Street recognition that health care was a fundable commodity
* Creation of for-profit companies with Medicare funds and stock offerings

These events led to the exponential increase in growth and costs for medical care. But to sustain all of this the businesses of medical care needed to grow. And grow it did, not always attending to the clinical need or lack thereof.

The public, previously reluctant to seek medical care, and then only when sick, now came in droves to physicians and hospitals. The AMA’s restraint on physician advertising was struck down by federal court orders beginning in 1975. The floodgates were opened and the public began to be saturated with appealing ads for hospitals, drugs, tests, and procedures, whether needed or not. Each major television channel soon had medical experts extolling the latest device or drug. Drug companies began to monger new drugs to treat new and thinly defined ailments. For the first time, drug ads were aimed directly at the public. The flood of patients became a tsunami, and the cost of health care soared to 17% of the gross national product.

Now add to this federally funded technological growth another factor: the appearance of well people for the first time. As medicine made more inroads and technical improvements, the public came in ever increasing numbers. For the first time, well people started to appear, seeking detection of early disease in the hope that it would be curable. In addition, a whole new category of patients began coming to see doctors. Sidney Garfield of the Kaiser Health System dubbed these the “worried well.” The well had now become worried, noticing smaller and smaller symptoms and generating more and more worries. The public’s expectations for curative medicine became unreasonable and unsustainable.

A system originally designed to find disease in sick, symptomatic people was now faced with finding disease in well, asymptomatic people. This fundamental change has received little attention. One of my favorite quotes says, “There must be something the matter with someone who goes to see a doctor when there is nothing the matter.” Another comes from an unnamed medical resident who, when asked to define a well person, answered, “A well person is someone who has not been completely worked up.” The new technologies, almost all visualizations, could find smaller and smaller lesions, whether they were the source of a patient’s problem or not. False positive test results skyrocketed. As visualizations increased in power through the new technologies, listening to patients decreased. It was as if the whole auditory modality of medicine was vanishing: if you could not see the disease, it did not exist.

Prior to Medicare, patients came only when they were sick or thought they were sick. All had one or more symptoms. The job of the physician was to determine whether the patient was sick and, if so, with what sickness. In those times, disease could be considered “digital”: like an on or off signal, the disease was either present or not. That is no longer the case.

We have entered the age of “analogue” diseases. Consider atherosclerosis of the vessels. It begins early and builds gradually along a scale or continuum. When does the process become a treatable disease, say of the coronary arteries? Consensus seems to say a 50% stenosis (narrowing) of an artery is treatable disease, but we know that some doctors treat 40% or even 30% stenoses. With analogue diseases there is no easily definable answer. This is not like typhoid fever, which is either present or absent. We now have a whole array of analogue diseases, or even pre-diseases. In effect, we have converted nearly the entire population of America into carriers of analogue diseases, needing medical attention at earlier and earlier stages.

Lewis Thomas noticed and wrote about this change in the American public in “Doing Better and Feeling Worse,” published in 1977:

“Nothing has changed so much in the health-care system over the past 25 years as the public’s perception of its own health. The change amounts to a loss of confidence in the human form. The general belief these days seems to be that the body is fundamentally flawed, subject to disintegration at any moment, always on the verge of mortal disease, always in need of continual monitoring and support by health care professionals. This is a new phenomenon in our society.”

Some of this rush to medical care is fueled by a large public misunderstanding of the difference between life expectancy and human life span. The rise in life expectancy from birth has been widely and incorrectly attributed to advances in curative medicine. People believe that medical care is making the human species live longer. This is not the case. The rise in life expectancy from birth is attributable to reductions in deaths in childhood, mostly before the age of one year. Most of the reduction in childhood and infant deaths came from public health measures: clean water, clean milk, and immunizations. More people are living to older ages because they did not die in childhood. Humans as a species are not living longer life spans. The life span of humans is fixed at 85 years, plus or minus 5 years. It has not changed in recorded history.

The lack of understanding and acceptance of a fixed life span is another factor driving up the cost of medical care. A large percentage of Medicare funds is spent in the last six months of recipients' lives. Much of this money goes to futile care that prolongs death, not meaningful or useful life. Sometimes it appears that medicine has abdicated one of its most essential roles: telling a family that all has been done that can be done, and that it is now time to stop futile care and begin pure comfort care and support. We need to define futile care and explain palliative and hospice care to the public.

Another force, sometimes overlooked, in the rising use and costs of medical care is the near disappearance of primary care. At the level of primary care, or first-contact care, somewhere between 25 and 50 percent of patients have stress or a psychological basis for their symptoms. There is simply not a definable medical disease behind every symptom. It takes time and listening to understand the sources of human distress. That is the job of primary care. By listening to and understanding the patient, the primary care physician makes careful use of tests and imaging procedures. The good primary care physician knows the high rate of false positives that follows from jumping too soon to unfocused or indiscriminate testing. There is an old dictum that says, “It is more important to know the patient with the disease than to know the disease.” That is the heart of the primary care physician.

A seldom discussed factor in the decline of primary care occurred in the early 1980s, when managed care began to expand. Primary care internists and family physicians charged for their time, for lab work done in their offices, and for simple x-rays, also done in their offices. This was the classic cottage industry. The internist or family physician could, in one or two visits, define the problem within the office practice. Managed care mistakenly went after primary care reimbursement early on, refusing to pay for lab work or x-rays done in the office on the grounds that primary care physicians were doing unnecessary tests. These office procedures accounted for over 50% of primary care revenues. When reimbursement for office lab work and x-rays was refused, internists and family physicians had to double or even triple the volume of patients seen to stay at the same revenue level. The push for volume led to shorter and shorter visits and much less attention to listening and sorting out psycho-social problems. Overscheduling led to long waits for appointments, and the overcrowding of emergency rooms can be traced to these same root causes. Referrals directly to specialists became a common way of dealing with complex patients. In addition to the reduction in time available for patients, costs rose sharply because commercial labs and imaging facilities now had to be used. The loss of the cottage industry decreased quality and increased costs of care.
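The volume arithmetic above can be checked with a one-line calculation: if a fraction f of a practice's revenue disappears, visit volume must rise by a factor of 1/(1 - f) to hold revenue constant. The specific shares below are illustrative assumptions; the essay says only that office procedures were "over 50%" of revenue.

```python
# If office lab/x-ray work made up a fraction f of a practice's revenue,
# and reimbursement for it stops, visit volume must be multiplied by
# 1 / (1 - f) to keep total revenue constant. Shares are illustrative.

def volume_multiplier(procedure_share):
    """Factor by which visit volume must grow after losing procedure revenue."""
    return 1 / (1 - procedure_share)

print(volume_multiplier(0.50))   # procedures were half of revenue -> double the visits
print(volume_multiplier(2 / 3))  # procedures were two-thirds -> triple the visits
```

At exactly 50% the practice must double its visits, and at two-thirds it must triple them, which matches the essay's "double or even triple" claim.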

As primary care fades away, the process of careful listening and sorting for life stresses is going away with it. The public, unfiltered by primary care, will go directly to specialists, as they already do in large numbers. When this happens, the prevalence of medical diseases (read “probability of disease”) falls, and when prevalence or probability of disease falls, the rate of false positive results increases greatly. (At a 2% probability of disease, testing with a very sensitive and specific test will yield 72% false positives!) Specialists are not trained to sort out social or psychological factors; they are trained to test and do procedures to detect diseases of their specialty organs: heart, kidney, lung, bladder, brain, spinal cord, and so on. Given the analogue, continuum nature of many disease processes, they often find lesions at very early stages. These lesions must then be followed by periodic examinations, so eventually many people will find themselves being followed for one lesion or another by a specialist. The real causes of their distress, if social or psychological, will persist, unaddressed by the specialist.
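The 72% figure follows from Bayes' rule. The essay does not say which test characteristics it assumes; a sketch with 95% sensitivity and 95% specificity (assumed here for illustration) reproduces the number:

```python
# Fraction of positive test results that are false positives, as a
# function of disease prevalence (pre-test probability of disease).
# Sensitivity and specificity of 95% are illustrative assumptions;
# the essay does not state which values it used.

def false_positive_fraction(prevalence, sensitivity=0.95, specificity=0.95):
    true_pos = prevalence * sensitivity            # sick, correctly flagged
    false_pos = (1 - prevalence) * (1 - specificity)  # well, wrongly flagged
    return false_pos / (true_pos + false_pos)

for p in (0.50, 0.10, 0.02):
    frac = false_positive_fraction(p)
    print(f"prevalence {p:.0%}: {frac:.0%} of positive results are false")
```

At 50% prevalence only about 5% of positives are false; at 2% prevalence, roughly 72% are. This is why the same test behaves so differently on sick, symptomatic patients than on the well and worried well.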

So here is the current situation as I see it: The public is seeking care far beyond any need or reason. The influx of well, worried well, and worried sick people into a system designed to find medical diseases in sick people leads to large increases in false positives. Advertising brings even more people into the system, further lowering the probability of disease and producing even more false positives. The excessive overuse of tests leads to huge increases in profits, with built-in incentives to do even more testing and procedures. The analogue, continuum nature of many disease processes leads to more and more findings of early lesions. Working through the false positive problem either generates more and more repeated testing, or it produces people labeled with diseases they do not have.

Removing false labels is very difficult, a nearly impossible task. If anyone doubts this, try stopping thyroid replacement in an obese female proven to have normal thyroid function, or try stopping B12 injections in older patients when there is no proven B12 deficiency.

Almost all suggested approaches and solutions to the high costs of medical care have been exclusively financial or payment based. While financial considerations are very important, the underlying causes of the high costs are based on misdirected clinical and diagnostic thinking together with financial incentives to do more and more. Very few suggested changes focus on the details of the clinical system or on the needed but lacking emphasis on evidence based medicine.

It is quite clear that medicine has become a very big business. It’s also a truism that all businesses must take in more money than they spend; call it black ink or profit. The business we are currently in calls for more and more to be done to patients. The procedures and tests must grow and expand. I suggest we are in the wrong business.

Instead of making money from doing things to people, we should make the needed money from keeping people healthy by doing less and preventing more. The transition from our current business system to a new business system will take a massive shift in thinking. I’m not sure it is possible.

The only way out of this problem is to greatly increase the primary care specialties. These are the doctors and nurses who can listen and think through a clinical problem BEFORE jumping to testing. How to accomplish the needed increase in primary care has not been defined. There is simply too much money in the system and no easy way to redistribute it among the specialties or the multiple businesses.

13 thoughts on “How Medicine Became a Growth Business”

I used to daydream about writing a story I would title “The Richest Man in America”. “They wouldn’t let him die, no matter how hard he tried to end it. Then his money ran out and then they pulled his plug.”

Thanks Maggie,
One concept that I have promoted is that all of the confused and “trapped” US Doctors who obtained their MBA’s in the 80’s and 90’s should now trade them in for MPH’s.
Another phrase that I use is that “we have ALL been royally duped by Big Bio-Medicine”. We are indeed classical suckers.
However, I am more optimistic than Dr. Meador. Why? Because my own observation is that a growing number of patients, much more so than the medical professionals, are becoming aware of “the con.”
Dr. Rick Lippin
Southampton, Pa.

Gerald & Dr. Rick
Gerald–First, welcome to HealthBeat.
I am afraid you are right. If someone is very well-insured (or extremely wealthy and can pay any amount out of pocket), they are more likely to become the victim of overtreatment.
This may be the one important way that the very rich are not as lucky as the average American.
Dr. Rick–
I agree — If more M.D.’s get Masters in Public Health (MPH’s), the
line between the two professions will begin to fade.
“Medicine” really should be about public health (making all of us healthier). MDs and MPHs need to collaborate.
I, too, am optimistic that more patients are beginning to be aware of the “con” in all of those drug ads etc.
The public is slower to see that all of those angioplasties are not helping them. But I’m hopeful that, over time,
this will happen.
Finally, I think that the youngest generation of new doctors is much more aware of, and open to, the idea that overtreatment is a major problem.

Great article, Maggie, thanks for posting it! I agree with Dr. Rick that MD-MPH makes perfect sense, but for full disclosure, that is the combination of my degrees, so am somewhat biased.
There is one other important influence that Dr. Meador seems to have left out, and that is the insidious entry of lawyers into the field of medicine in the form of frivolous malpractice suits. I do not dispute a citizen’s right to litigate, but it has surely spiraled into the ridiculous. I am aware that studies do not place a high monetary value on this component of the national healthcare expenditures, but every doc will tell you that the specter of litigation makes a huge difference psychologically and colors one’s decisions in favor of over-ordering.

“Very few suggested changes focus on the details of the clinical system or on the needed but lacking emphasis on evidence based medicine.”
I think the medical community shares some of the blame here. For example, it used to be that to be diagnosed with hypertension, systolic blood pressure needed to be above 160. Now the cutoff is 140. For diabetes, blood glucose needed to be above 125. Now it’s above 110. High total cholesterol used to mean above 240. Now it’s over 200. As a result of these changes, many millions of patients previously considered healthy now need medical management. I don’t think patients drove that.
As for changes in the payment structure, financial payments are surely important. However, I’m told that in Massachusetts at least, the most expensive multi-specialty group practices have been paid on a capitated basis for at least the last ten years. If market power is great enough, the global payment can still cost as much, if not more, than fee for service would have cost.
The problems with the healthcare system in the U.S. are multi-factorial and will need to be attacked on numerous fronts including robust price and quality transparency tools, substantive tort reform, more widespread use of living wills and advance directives, payers not paying for services, tests, procedures and drugs that either don’t work or cost more than they’re worth and doctors making it part of their job to know and to care about costs, among numerous others. While better and more available primary care would be helpful, I think it’s only a very small part of the solution to our healthcare cost problem.

‘In addition capital costs of hospitals were allowed to be reimbursed as a “pass-through.” ‘
I saw this scam first hand; here is how it works: buy a money-losing hospital with a big debt. Add that debt to the capital pass-through for all your hospitals. Then shut down the debt-ridden hospital and write it all down. Its debt serves to increase your Medicare payments for all your other hospitals. Repeat as often as possible.

How about removing the financial incentive to do more? A yearly salary for an agreed-on minimum number of patients (unless fewer patients try to make appointments). If you have poor patient outcomes, as determined by periodic anonymous review by peers, you are coached and helped; if outcomes remain poor, you lose your position and are reported to the licensing board.

I agree that we should pay primary care physicians for their time, and for listening to patients in ways that lead to better chronic disease management– not for how many tests they order or how many other things they do.

Under the Affordable Care Act, primary care physicians who
decide to set up “medical homes” or decide to join “Accountable
Care Organizations” will be paid based on their patients’ outcomes.
If they decide to make their practice a “medical home,” they will be paid more if they manage their patients’ chronic diseases and keep them out of the hospital.

If they choose to form an “Accountable Care Organization” with a group of other doctors (or doctors and a hospital), they all will be paid more for better outcomes at a lower cost, sharing in the cost savings. (This will happen only if doctors, or doctors and a hospital, learn to work together, because this is the only way that they will receive the bonus: reducing preventable errors caused by lack of communication, etc.)

Today, HHS announced that 89 more groups of doctors or doctors and hospitals have signed on to create “Accountable Care Organizations.”

Marya, Barry, Al
Marya– Yes, I agree the MD MPH is a great combination. But I also think that people who have one of those degrees should collaborate with each other. Together, they can move reform forward.
On malpractice suits, see this two-part post, here: http://www.healthbeatblog.org/2008/05/medical-malpr-1.html and here: http://www.healthbeatblog.org/2008/05/medical-malprac.html
Most suits are not frivolous (the patient was actually seriously injured) and doctors’ fears of being sued are exaggerated. (Only 1 percent of patients who are injured ever sue, and most doctors are never sued.)
Nevertheless, even if physicians’ fears are overblown, they are worried. Even if you win, being sued is a terrible experience. As a result, many doctors say they practice “defensive medicine.”
But when you get down to it, a doctor usually has many reasons for ordering an additional test or treatment. Fear of malpractice may well be one of those reasons, but it’s very difficult to untangle that motive from others, including a general feeling that “doing more” is always better. (This is something med students and residents are taught. “Doing more” is often described as “being thorough,” without much discussion of the risks involved in every test and treatment.)
In terms of malpractice what we really need is a better system to address errors: full disclosure and arbitration– plus fewer errors.
To cut down on malpractice the profession really needs to police itself. Doctors should blow the whistle on physicians who are in some way impaired –(alcohol, drugs, age) — or just reckless. Whistle blowers also need legal protection so that the doctor in question can’t turn around and sue them.
Doctors are human, and mistakes happen. But a small group of doctors account for a large number of repeated malpractice suits, and in some cases, these are people who really shouldn’t be practicing . . .
Barry–
I don’t think Meador is suggesting that the “worried well” caused things like lowering the bar to define what constitutes “high cholesterol.”
(The drug industry was heavily involved there.)
But the “worried well” are quick to buy into the Lipitor ads, and they rarely ask their doctor: “shouldn’t I try changing my diet first, and see if my cholesterol goes down?”
They want a quick fix. This is why they so often go directly to a specialist rather than to a primary care doc: they are hoping for a “procedure” (like angioplasty–see Naomi’s recent post) to fix the problem. The idea of changing diet, exercising, trying medication first is not as appealing.
Specialists are generally far more expensive than primary care docs, and primary care docs can teach patients how to manage their chronic diseases (which are the biggest cost in our health care system). This is why we want patients to get primary care, and why we need to pay primary care docs more, so that they have time to talk to patients about chronic disease management.
On malpractice, see the posts that I recommended to Marya. All of the evidence says that tort reform is not the answer.
They have tried it in Texas, and there is as much or more overtreatment in Texas than in other states.
When it comes to health care costs, Massachusetts is the most expensive spot on the globe, and so not a very good example of whether or not capitation works to lower costs. (We have many examples showing that it does.)
At the very end of your list you come to what is really driving costs– “services, tests, procedures and drugs that either don’t work or cost more than they’re worth.”
As Princeton health care economist Uwe Reinhardt has shown, the ever-rising cost of new medical technology (which includes drugs, new surgical procedures, equipment, and devices) is what drives health care inflation in the U.S. People assume that the “new new thing” is better, so doctors prescribe it, patients want it, and we all pay the bills for treatments and tests that, too often, are ineffective, or no better than the older treatments they are replacing.
This is why comparative effectiveness research is, as you say, so important.
Health care reform is funding that research, and under the Affordable Care Act it will be disseminated widely. Doctors will not be forced to use it, but they will know it’s there. And those who hope to share in savings (in accountable care organizations or in medical homes) are likely to find that if they use the research, they can bring down costs while lifting the quality of care.
Al–
The Affordable Care Act does reward Groups of doctors– as well as doctor/hospital groups– for better outcomes at a lower price, and it does penalize them for less efficient care (poorer outcomes at a higher price) as well as preventable errors.
But research shows that it is very difficult to penalize or reward individual doctors because:
a) the pool of patients that an individual treats is too small, so a few very difficult cases (including non-compliant patients who don’t take their medicine, etc.) skew the results, and
b) these days most sick patients are treated by several doctors, and it is very difficult to determine who is responsible for a poor outcome: the primary care doctor who originally treated the patient (and didn’t manage his chronic disease)? the specialists who then saw him? the hospitalist who was supposed to coordinate his care in the hospital? the surgeon? the doctor who saw him after he left the hospital?
Peer-review can spot outright malpractice and can be useful in those cases.
But medicine is so ambiguous (and there is so much that we don’t know) that it is very difficult to judge quality of care in individual cases. Most doctors are neither extraordinary nor terrible. Like most of us, they fall into the middle of a bell curve.
Finally, if we began rewarding and penalizing individual doctors for outcomes within their relatively small patient pool, doctors would be tempted to avoid the sickest patients and the poorest patients (who often are not as compliant, for a variety of reasons) and of course, these are the patients who most need care.
This is why the Affordable Care Act provides bonuses (and penalties) for large groups of health care providers working together. Paying the group also encourages everyone to collaborate: no one gets the bonus unless everyone rows in the same direction.

“Most suits are not frivolous (the patient was actually seriously injured) and doctors’ fears of being sued are exaggerated. (Only 1 percent of patients who are injured ever sue, and most doctors are never sued.)”
This is unfortunately a generalized and blatantly untrue statement. A study in the NEJM showed that 1 in 14 doctors is sued each year.
Further, the study states: “Overall, the study authors said, 75 percent of physicians practicing in a low-risk specialty will have been sued by the time they are age of 65 years, 19 percent will have made an indemnity payment. For those in the high risk specialties, 99 percent will have been sued by age 65, and 71 percent will have lost.”
Please do not minimize the effects and sheer numbers of litigation. It matters on every level. And yes, even when there is no wrongdoing on the physician’s part, the financial and physical/mental toll on the doctor, and the financial toll on the system, is incredible.

A study done by Harvard PHYSICIANS who looked closely at the cases concluded that “most suits are not frivolous”: the patient was actually injured. In a 2006 study of malpractice claims, researchers pored over claims involving approximately 33,000 physicians, 61 acute care hospitals, and 428 outpatient facilities in four regions of the U.S., and wrote: “Portraits of a malpractice system that is stricken with frivolous litigation are overblown. . . Our findings suggest that moves to curb frivolous litigation, if successful, will have a relatively limited effect on the caseload and costs of litigation.” Because this was such a large study that relied on physicians to investigate claims in their own specialty, it has set a gold standard for malpractice reviews. http://www.healthbeatblog.com/2011/06/myths-about-medical-malpractice-part-2-crisis-or-hoax/
The NEJM study you cite found that “a tiny fraction of the patients harmed by medical mistakes actually file claims.
Why? High up-front costs for hiring expert witnesses and preparing a case. Doctors, hospitals and their insurers, meanwhile, have significant financial and legal firepower. Some states also have caps on malpractice awards. That makes it likely that only strong cases with high expected payouts are pursued.
Given the expense and other challenges, it’s doubtful most claims are filed greedily, the researchers said”
“A lawyer would have to be an idiot to take a frivolous case to court,” one of the authors of the NEJM study said. http://www.cbsnews.com/8301-504763_162-20094133-10391704.html

So, the author believes (or implies) that free market medicine was the problem, yet he acknowledges that it became a massive problem after Medicare, Medicaid, and (reading between the lines) corporate socialism (i.e., private companies getting benefits in money or regulation from government).

I agree with this.

The problems with US healthcare were created by government legislation backing big corporations over individuals, by Medicare and Medicaid, and by unrestricted migration. This has pushed up prices for all. Obamacare merges all this ‘badness’ together.