During my surgical training, I
took care of a patient who could barely afford the dressing supplies I had
prescribed for him. While caring for him, I realized that in the busy day-to-day
pursuit of becoming a good doctor, I had focused narrowly on the clinical details,
neglecting my once-cherished ideal of embracing the social and economic aspects
of health care.

Last fall the journal Academic Medicine reported that the vast
majority of students felt they had received adequate clinical training during
their four years of schooling. But fewer than half felt they had had adequate
exposure to health care systems and practice, an area of study that extends to
subjects such as medical economics, managed care, practice management and
medical record keeping.

When the researchers compared the five-year results from
two different medical schools, they found that students who had attended the
school with more of these types of courses were significantly more satisfied
with their education than students from the school with fewer. Moreover,
regardless of how much of their school’s curriculum was devoted to these
nonclinical topics, students remained equally satisfied with their clinical
preparation.

In this week’s “Doctor and Patient”
column, I interview the researchers, who believe that it is possible
to learn about the economic and social aspects of health care while immersed in
clinical learning and that it is impossible
to become a good clinician without doing so. Do you think learning about the
economic and social aspects of health care is possible within the constraints of the
current medical school curriculum? And is it important to becoming a good doctor? Please leave your comments below or on Tara Parker-Pope's "Well" blog.

Not long ago a friend confessed that her
son, who spends much of his free time volunteering at a children’s hospital and
who is applying to medical school, has been particularly anxious about his
future. “His test scores are just O.K.,”
my friend said, the despair in her voice nearly palpable. “I know he’d be a
great doctor, but who he is doesn’t
seem to matter to medical schools as much as how he does on tests.”

Her comment brought me back to the many
anxious conversations I had had with friends when we were applying to medical
school. Over and over again, we asked ourselves: Do we really need to be good at multiple-choice exams in order to be a
good doctor?

We were referring of course to not just any exam,
but to the Big One -- the Medical College Admission Test, or MCAT, the
standardized cognitive assessment exam that measures mastery of the premedical
curriculum. Back then, as now, American medical school admissions committees
required every applicant to sit for the MCAT. And with the competition for medical school slots fiercer than ever, it seems inevitable that the MCAT plays a crucial role in admissions decisions.

But does that guarantee that the
applicants admitted are also destined to become the best doctors?

Now a new study by three industrial and organizational psychologists finds that another type of exam may be more helpful in determining which students are likely to succeed in, and after, medical school: personality testing.

In this week's "Doctor and Patient" column, I write about the fascinating results of this study and speak with one of the study authors. Is the current admissions process doing the best job in choosing who will be part of the next generation of doctors? And can personality tests improve that process? Please read the full article and leave your thoughts below or on Tara Parker-Pope's "Well" blog.

In this week's "Doctor and Patient" column, I write about the dwindling numbers of U.S. medical students who are choosing to go into primary care. Medical educators have pointed to enormous educational debts, lower income, and less controllable hours and lifestyle as the major reasons that students prefer other specialties over primary care. Even with the current legislative attempts to remedy some of these issues, medical educators continue to worry about a worsening shortage of primary care physicians.

Why? Because primary care also suffers from a terrible image problem.

While the frisson of advancing treatments and approaches to patient care seems to envelop most other specialties, the image of primary care remains one of a vaguely anachronistic practice: a group of doctors who stand not at the forefront of creative change but who are continually left holding the biggest bag of administrative expectations and demands of clinical care coordination.

While a lot of attention has been given
to the long work hours of residents and medical errors, researchers at the Mayo
Clinic in Rochester, Minn., have found that distress, and not only fatigue,
contributed to errors by doctors-in-training. Residents who suffered from burnout
and depression could pose as much risk to patients as those who were exhausted,
independent of their level of fatigue.

In this week's "Doctor and Patient" column, I interview Dr. Colin P. West, the lead author of that study, and write about a colleague who could neither admit to nor find support for the distress in his life. How can we address the high levels of distress in training and practice so that doctors don't burn out, lose their sense of empathy and make errors? How can we help individual doctors thrive?

If
there is one issue in the patient-doctor relationship that defies all reason,
it is this: why don't doctors wash their hands more?

Over
the last 30 years, despite countless efforts at change, poor hand hygiene has
continued to contribute to the high rates of infections acquired in hospitals,
clinics and other health care settings. According to the World Health
Organization,
these infections affect as many as 1.7 million patients in the U.S. each year,
racking up an annual cost of $6.5 billion and contributing to more than 90,000
deaths annually.

But
there is one place where doctors and other caregivers have maintained the
highest levels of hand hygiene and defied these grim statistics. That place is
the operating room.

What
is it about O.R.’s?

In
this week’s “Doctor and Patient” column, I write about maintaining sterile technique in the
operating room and the new effort of the Joint Commission, the nation’s most
important hospital accrediting agency, to change how health care workers and
systems approach hand hygiene. Why haven’t we been able to get doctors and
other caregivers to wash their hands more? And is it possible to change? I'd love to know what you think. Please
leave your comments below or on Tara Parker-Pope’s “Well” blog.