"They mean better care for the millions of Americans who
require surgery, or who are hospitalized with heart attack, heart failure,
stroke, or pneumonia. They mean better care for children coping with asthma,
better inpatient psychiatric treatment, increased rates of immunization, and
better prevention of dangerous blood clots known as venous
thromboembolism."

And then Monday, the Healthcare
Association of New York State issued its own report card on hospital
ratings, measuring their utility and finding many of them deficient. Among
those that didn't make the grade were some of the most prominent ratings: U.S.
News & World Report, the Leapfrog Group, HealthGrades
and Consumer Reports. Among those that did: The Joint Commission.

“We had been hearing more and more from members their
general frustration of all the different report cards,” said Kathleen Ciccone, director of HANYS’ Quality Institute, according to
a Kaiser
Health News story. “It’s so time consuming for them to be able to respond
to the reports, to be able to see what’s useful about them. They’re really
looking for some guidance on how to use the information.”

The Kaiser story notes that, “By at least one criterion, the
HANYS report on report cards falls short of its own standards: HANYS did not
give the ratings groups an opportunity to preview the report before
publication.”

The takeaways: people are not
about to shop around for a hospital when they need a hospital procedure,
especially one that is not elective, and they want to be near family and
friends. Where does that leave hospital ratings? When it comes to usefulness,
ratings are probably no more helpful than they were 20 years ago, when
marketplace consumerism attempted to gain a toehold in health care.

Lieberman wrote that the best
advice she's heard about hospital care came from Don Berwick, founder of the Institute
for Healthcare Improvement and past administrator of the Centers for Medicare and
Medicaid Services: When you go to the hospital, take someone with you and get
out quickly.

I continue to believe that reporters should tread with
caution. As report cards proliferate, we may just throw up our hands and cry
uncle.

4 comments

It is ironic that the only report card HANYS approved of is the one based on measures on which almost all hospitals perform well. The Joint Commission “grades” hospitals on process measures that show how often hospitals follow best practices to prevent complications and infections and to respond to patients’ needs. Hospitals have been reporting these measures for many years, and for several years nearly all hospitals have followed these practices more than 90% of the time (you can check it out on hospitalcompare.gov). CMS is retiring some of these measures because most hospitals finally “get it” and are doing them as a matter of routine.

It is a good thing that hospitals are following best practices, but they are not a proxy for best outcomes, and consumers need to be aware of that. For example, hospitals are complex environments: all of the right practices can be followed for a surgery patient, but one doctor’s contaminated hand in the patient’s wound can undo all of that and cause an infection. Further, why does the Joint Commission show only the TOP performers? Consumers want to know which hospitals they should avoid, too. The public should demand report cards that are based on outcome measures like infection rates and mortality rates, and on measures that show the real differences among hospitals. Hospitals are not all alike.

Healthcare of old had a bar but no numbers on the contestants’ jerseys, no starting line to jump from, no tape measure on the vertical post, and no scoreboard. Winners were determined by the roar of the crowd.

Today some of the contestants have numbers, still no starting line, 4 different tape measures on the vertical posts, and Jumbotron scoreboards that the fans can’t understand. Winners are determined by the size of their advertising panels.

One great way to audit hospitals would be to set objective criteria for the quality of medical records and then review a random sample of records from a given hospital. Objective criteria might include an evidence-based plan for diagnosis and treatment, a clear listing of all drugs being taken, and a high-quality history. A study from a few years ago (Dunley, 2008) found that hospitals’ average score is about 60%, and that those scoring below 50% had a 40% higher mortality rate than those scoring greater than 75% on the criteria. Ratings from the JC are meaningless as far as I am concerned.
