Over at his blog Gene Expression, Razib Khan expressed his disgust with the paternalism of the medical profession. His disgust was not misplaced, but he did bypass some of the subtleties.

Razib, a respected science blogger, ran smack into a frightening piece in TIME. In it, doctors discuss whether and how much genetic information should be given to parents about their kids. The doctors featured in the piece evaluated a child's genome, found a potential for early-onset dementia, and chose not to tell the parents. Let's break this down.

Paternalism vs. autonomy

Paternalism in medicine isn't prima facie bad. We count on professionals to use their superior knowledge, experience, and relative objectivity to help us. The premise is, "you know more than I do; help me." The foundation of medical practice rests on centuries of paternalism taken to every imaginable extreme: surgeries performed without consent, diagnoses being withheld, forced sterilization. Over the last fifty years the reliance on paternalism has given way to our value of patient autonomy. This has been an uneasy struggle, one that lives in a grey world of few right answers.

But there are many things we as a society and medicine as a profession do agree upon. The basis of patient autonomy is informed consent. Patients should be given every opportunity to understand what we propose to do for them and be given a chance to assent or decline any test or treatment.

Once a patient has given consent, information must continue to flow. I cannot ethically order a test or perform a procedure and then withhold the outcome from the patient.

Since there are a lot of grey areas, it's helpful to see if there are any bright lines. A patient who comes to see me signs a general consent to treatment which covers much of what we do. I don't need to explicitly ask permission to look into an ear or to analyze kidney function. These are presumed to fall under a general consent because they are relatively minor interventions that are a normal part of an exam and commonly without grave implications.

Contrast this to a CT scan of the abdomen. This is a more "extraordinary" intervention, one with potential risks (radiation exposure, exposure to potentially toxic intravenous dye) and one whose results may give life-altering information (a tumor, a need for surgery, etc.).

Let's take these roughly-sketched guidelines--specific consent should be obtained for extraordinary procedures with unusual risks or high probability of life-altering results--and take a look at the TIME article.

The piece is not terribly written, but it leaves out a lot of important information. That being said, we can assume that a genomic analysis was conducted on some children, and that doctors puzzled over whether to give the results to the parents.

Ethics require that the parents (who are surrogate decision makers for the children) give informed consent. The genomic analysis should not be done at all without first informing the parents what we can or cannot learn from the test. Genetic predisposition to disease can be very mushy: it's easy to see if someone has sickle cell anemia, but not so easy to see if they will develop hypertension.

For the purposes of this discussion, I'll assume that the "dementia gene" the doctors found gives the child a very strong chance of dementia at an early age. In this case, they cannot ethically withhold this information from the parents. What if there were a "maybe, sort of" predisposition? Same answer.

In fact, if the parents gave their informed consent, and the information included the uncertainties inherent in genomic analysis, they have the right to all the results, along with interpretations from a professional who understands them. If they were not given proper informed consent, the test shouldn't have been run. A doctor who orders a test must have its consequences mapped out in advance, and shouldn't be wondering after the fact what to do with the data.

This scenario assumes an ideal collaboration between doctor and patient, where communication flows freely, where a doctor uses their expertise to guide patients' decisions. The complex nature of medical knowledge and its potential for life-altering impact (and expense) argue against any patient being able to order any test for themselves, including a genomic analysis. The information gained from such an analysis requires professional interpretation so that people can use the information wisely. A test result that sounds horrible to a patient may in fact be insignificant, and vice versa.

There are risks to people having unbounded ability to use medical testing and treatment. For patients to agree to this important premise, we as physicians must respect their autonomy and human dignity.

Peter A. Lipson, ACP Member, is a practicing internist and teaching physician in Southeast Michigan. After graduating from Rush Medical College in Chicago, he completed his internal medicine residency at Northwestern Memorial Hospital. This post first appeared at his blog, White Coat Underground. The blog, which has been around in various forms since 2007, offers "musings on the intersection of science, medicine, and culture." His writing focuses on the difference between science-based medicine and "everything else," but also speaks to the day-to-day practice of medicine, fatherhood, and whatever else migrates from his head to his keyboard.

This past week I started teaching medical students for the first time as an attending physician. It's a new start, but of course it's also a return: the last time I taught medical students was as a senior resident.

Treating patients in the hospital is something I've done even when I haven't been teaching, of course, but teaching increases one's powers of observation if only briefly. What I notice now is that, as doctors, we spend most of our time not in rooms but in corridors in the hospital: hurrying back and forth, holding whispered conversations or rounds at full volume. "Prepare yourself in the anteroom," says Pirkei Avot, a basic ethical-moral work in the Jewish tradition, "to enter the palace."

On the other hand, my daily work is in the outpatient setting, the clinic. Patients wait not in their sickrooms but in their chairs of delay, while we work things through in our chambers.

In the hospital we're always hurrying about while the patients wait; our activity is often an illusion and there's much we can't speed up. In the clinic, we are all on the same clock.

Medicine comes in many different mixtures of waiting and action, passivity and frenzied procedure, and I want some way to tell the students that neither hospital nor clinic is all there is.

Zackary Berger, MD, ACP Member, is a primary care doctor and general internist in the Division of General Internal Medicine at Johns Hopkins. His research interests include doctor-patient communication, bioethics, and systematic reviews. He is also a poet, journalist and translator in Yiddish and English. This post originally appeared at his blog.

About 70% of women who have both breasts removed following a breast cancer diagnosis do so despite a very low risk of facing cancer in the healthy breast, a study found.

The study found that 90% of women who underwent contralateral prophylactic mastectomy reported being very worried about the cancer recurring. But the risk of breast cancer arising in the other breast doesn't increase just because of the initial cancer, reported the study author, an associate professor of internal medicine at the University of Michigan.

Results were announced by the University of Michigan Comprehensive Cancer Center and presented at an American Society of Clinical Oncology symposium.

The study authors looked at 1,446 women who had been treated for breast cancer and who had not had a recurrence. They found that 7% of women had surgery to remove both breasts. Among women who had a mastectomy, nearly 1 in 5 had a double mastectomy.

Women with two or more immediate family members with breast or ovarian cancer or with a positive genetic test for mutations in the BRCA1 or BRCA2 genes may consider having both breasts removed, because they are at high risk of a new cancer developing in the other breast. But women without these indications are very unlikely to develop a second cancer in the healthy breast, the study found.

Since concern about recurrence is one of the biggest factors driving the decision to have contralateral prophylactic mastectomy, women need better education about whether the surgery would actually reduce the risk of recurrence.

Recently, American Medical News featured an article with the disturbing title, "Massive health job losses expected if Medicare sequester prevails." I wasn't entirely sure what a "sequester" was, since I thought it was a verb. Sequestration, I thought, was the noun. (I hear a loud knock. It must be the grammar police.)

The story, as I understand it, is that when our government decided to pull together and raise the debt ceiling, they also passed the Budget Control Act, which was intended to reduce the deficit by $1.2 trillion by 2021. This was to be achieved by a bipartisan Joint Select Committee on Deficit Reduction, which would make well considered cuts in funding for various projects.

They were unable to come up with a plan that they could agree upon (imagine that), and so automatic across the board spending cuts are mandated to go into effect in 2013, excluding only a few programs, such as children's health and disaster funding, and capping yearly cuts to sensitive programs such as Medicare at 2% per year. These funding cuts are called "sequesters."

This sounds so very familiar. Several years ago Congress passed the sustainable growth rate (SGR) formula, which mandated that Medicare costs rise only as fast as inflation. Until recently Medicare costs continued to outpace inflation, so each year Congress is legally required to cut Medicare payments across the board, but then at the last minute it doesn't.

Doctors and patients say that the program will surely not survive, since across the board cuts mean that as well as cutting the numbers of unnecessary procedures and devices that are used, we must also cut payments to primary care docs, who already receive far less than they are willing to accept in payment for office visits. Applying the SGR is then delayed, again, by a last-minute agreement. Congress does this at least yearly.

Across the board cuts are a bad idea, yes they are. Some parts of programs need to be cut and others need to be grown in order to make systems more efficient. Good primary care more than pays for itself in saved hospital costs (I'm making this fact up entirely out of reason and good sense. There are no studies that exactly address this question.) If payments to hospitals shrink, it should be via improved health of populations who then will need less hospital care. But across the board cuts don't allow for this.

So, one might imagine, the specter of across the board cuts would be very effective in making us come to a consensus on how we could control expenditures so that such cuts would never become necessary. It clearly has not been an adequate deterrent. Doctors and others in the field of health care continue to allow their piece of the financial resource pie to grow, to the detriment of all kinds of things.

Nobody, it seems, wants to be seen to cut money that goes to programs when doing so would anger a significant portion of the voting population. And lawmakers don't really understand that there are huge areas of unpopular inefficiencies whose elimination would be mostly painless. They don't know this because they are rarely in the thick of medical care, either as caregivers, health care providers or patients. Those who know about inefficiencies are too busy to speak up or are not likely to be heard by lawmakers.

But on the subject of job losses related to cutting spending on Medicare--yes. That will absolutely happen and there will be economic repercussions. Excess money spent on health care sometimes goes into the pockets of greedy people who already have enough money, but it also supports families, via health care employees who do jobs mired in inefficiency, such as insurance adjusting and device sales and pharmaceutical advertising. These are homegrown jobs, and paychecks often go to local industry and support real live American people pursuing life, liberty and happiness. The inefficiencies of health care sometimes grow our economy, but at the cost of lowering effective take-home pay for all insured workers and creating dependent and indebted individuals who are forced by ill health or poor decisions to make use of acute care.

What to do? I would favor some kind of health care industry/government collaboration to make binding decisions regarding where best money can be removed from the Medicare budget. Failure to come to an agreement should not be an option.

Janice Boughton, MD, ACP Member, practiced in the Seattle area for four years and in rural Idaho for 17 years before deciding to take a few years off to see more places, learn more about medicine and increase her knowledge base and perspective by practicing hospital and primary care medicine as a locum tenens physician. She lives in Idaho when not traveling. Disturbed by various aspects of the practice of medicine that make no sense and concerned about the cost of providing health care to every American, she blogs at Why is American Health care so expensive?, where this post originally appeared.

This is my new office. I signed the lease for this property yesterday, another big step in the process of getting my new practice off the ground. I should feel good about this, shouldn't I? I've had people comment that I've gotten a whole lot accomplished in the four weeks since I've been off, but the whole thing is still quite daunting. Yes, there are days I feel good about my productivity, and there are moments when I feel an evangelical zeal toward what I am doing, but there are plenty more moments where I stare this whole thing in the face and wonder what I am doing.

I walked through the office today with a builder to discuss what I want done with the inside; it quickly became obvious that there was a problem: I don't know what I want done, and nobody can tell me what I should do. Yes, I need a waiting area, at least one exam room, an office for me, a lab area, bathrooms, and a place for my nurse, but since I don't really know which of my ideas about the practice will work, I don't know what my needs will truly be. How much of my day will be spent with patients, how much will be doing online communication, and how much will be spent with my nurse? I want a space for group education, but how many resources should I put toward that? I also want a place to record patient education videos, but some of my "good ideas" just end up being wasted time, and I don't know if this is one of them.

I come across the same problem when I am trying to choose computer systems. I know that I want to do that differently: I want the central record to be the patient record, not what I record in the electronic medical record (EMR). I want patients to communicate with me via secure messaging and video chat, and I want to be able to put any information I think would be useful into their personal health record (PHR). So do I build a "lite" EMR product centered around the PHR, or do I use a standard EMR to feed the PHR product? Do I use an EMR company's "patient portal" product, or do I have a stand-alone PHR which is fed by the EMR? I have lots of thoughts and ideas on this, but I don't really know what will work until I start using it.

Here's the real rub in all of this. There's a large group of patients waiting for me to open my doors and take them in as my patients. These people will need excellent care and all that goes into providing that care. I am confident in my care as a doctor, but the doctor is only a part of the equation; there are referrals, labs, and other care-coordination services that need to be done. If people are going to be trusting me enough to pay a monthly fee in exchange for better care, I have to deliver on that.

This must become a viable business. I quit my other job, and now will rely on this new business to support me and my family. The incredibly low overhead of it all helps a lot, but the final say of any business is this: do I offer a service that is worth what I am charging? Decisions like how to redo the office or what computer systems to use have a twofold impact on this: they affect the quality of the care, and they cost money.

It feels like I have been given the task of learning how to fly in three months. But instead of taking flying lessons and flying in the conventional way, I have to build a whole new kind of airship from the ground up. I need to design it, build it, and then learn to fly well enough to take passengers. My ideas were good enough to take this challenge, and I have lots of smart people willing to help me, but I will be the one who has to make it fly.

That's scary.

Some of this is ego. I wouldn't have quit my old practice for a new way of doing things if I didn't have the confidence to pull this off, much less write about it for thousands of people to see. So when people give me advice, my ego wants to assure them that I know what I am doing. I want to say, "Well, that may work for you, but I am doing something different." But then there's the small fact that I don't really know what I want, so I should at least listen to any advice I get.

In the end, all that matters is that I give good enough care for my patients that they are willing to keep me in business. Keeping that reality in front of me as the center of my focus will give me the best chance to get this baby off the ground. Once I am flying, it will be much easier to know how to improve it from there.

In the meantime I just pray that I don't crash.

After taking a year-long hiatus from blogging, Rob Lamberts, MD, ACP Member, returned with "volume 2" of his personal musings about medicine, life, armadillos and Sasquatch at More Musings (of a Distractible Kind), where this post originally appeared.

The number of drugs that have serious adverse interactions with grapefruit has more than doubled since 2008, reports the same research group that first reported such effects more than 20 years ago.

Since 2008, the number of drugs with serious interactions has increased from 17 to 43, as new chemicals and formulations are introduced that can interact with grapefruit to cause torsade de pointes, rhabdomyolysis, myelotoxicity, respiratory depression, gastrointestinal bleeding or nephrotoxicity, researchers reported in a review in CMAJ. In all, 85 drugs have either major or minor interactions, the authors noted.

These drugs, most often taken by elderly patients, are administered orally, have very low to intermediate absolute bioavailability, and are metabolized by the cytochrome P450 3A4 enzyme (CYP3A4). A single whole grapefruit or 200 mL of grapefruit juice is enough to raise systemic drug concentrations to the point of causing adverse effects. Other citrus fruits that share grapefruit's "culprit compounds," furanocoumarins, include limes and some oranges, but not the navel or Valencia varieties, the authors noted.

"Unless health care professionals are aware of the possibility that the adverse event they are seeing might have an origin in the recent addition of grapefruit to the patient’s diet, it is very unlikely that they will investigate it," the authors wrote. "In addition, the patient may not volunteer this information. Thus, we contend that there remains a lack of knowledge about this interaction in the general health care community. Consequently, current data are not available to provide an absolute or even approximate number representing the true incidence of grapefruit–drug interactions in routine practice."

For more on how the issue commonly appears in a busy internal medicine practice, see ACP Internist's previous coverage.

I will start with full disclosure. I still use paper charts. While I think my practice of medicine is "uber"-up-to-date, the truth is it could be 1950 when you look at my patient records. Charts are huge, and some patients I've seen for decades are on volume 3, just to make them manageable. So this very week I am coming on board with a full-blown, state-of-the-art electronic health record (EHR).

The government is pushing EHRs and, in fact, the Centers for Medicare and Medicaid Services (CMS) has already imposed a 1% penalty on doctors who are not e-prescribing. The penalty goes up to 1.5% in 2013. There are also large incentives tied to the "meaningful use" criteria, a complicated set of requirements put out by CMS that pushes physicians toward investing in an EHR.

With all of these incentives, why haven't more physicians converted? For one, it is darn expensive, and the best systems require large groups or hospital funding to be financially feasible. Staff needs to be trained; equipment, software and licenses must be purchased; information technology (IT) support is needed; and the doctor's productivity and ability to see the same number of patients declines. And it totally changes how you and your staff do your work.

The advantages are numerous, however. Having instant, legible information all in one place, shared by all of the caregivers, is huge. The EHR gives easy access to consultant notes and all tests. When I am on call at night or on weekends I can see my patients' information, which helps prevent medical errors. The EHR can be programmed to give "alerts" for drug reactions, needed screening tests and medical information.

So it is a no-brainer that we all need to move into the 21st century and start using technology to help us deliver better care.

I have already gone through an entire day of training and will be using more of my "free" time this week to abstract my old charts, learn the system and develop my own practice templates in the new EHR. I will need "at my side" IT support when I first start using it with patients. I think my patients will understand if it is clumsy at first. And they will surely like the ability to see their own lab tests and make office appointments on-line.

I am looking forward to the change but also wary of what is ahead. Internal medicine is already a grinding specialty with low pay and long hours. Spending more hours with an EHR is not appealing but I hope the benefit to patients and safety makes it worth it in the long run.

The table below uses a scale of 1-5, where 1=poor, 3=neutral and 5=excellent. You can see that none of the EHRs scored very high with the physician users.

Satisfaction with EHRs by employed internists in large practices (rating average):
Easy to learn: 3.62
Ease of data entry: 3.57
Overall ease of use (intuitive): 3.45
Ease of EHR implementation: 3.43
Reliability: 3.99
Adequacy of vendor training program: 3.55
Vendor continuing customer service: 3.63
Interactivity with other office systems: 3.29
Value for the money: 3.46
Physician overall satisfaction: 3.51
Staff overall satisfaction: 3.55
Appearance/overall usefulness of the end product (eg, notes, consultations): 3.68

Toni Brayer, MD, FACP, is an ACP Internist editorial board member who blogs at EverythingHealth, designed to address the rapid changes in science, medicine, health and healing in the 21st Century, where this post originally appeared.

Nurses do a better job of making sure patients are vaccinated overall, and specifically for flu and pneumococcus, a Canadian meta-analysis found.

Researchers conducted a meta-analysis of more than 100 randomized and nonrandomized studies involving more than 470,000 patients in North America and the UK, each of which tested a quality improvement intervention for raising influenza and pneumococcal vaccination rates among community-dwelling adults.

The most effective flu vaccination interventions were community media campaigns and telephone reminders delivered by clinic staff. For pneumococcal vaccination, office brochures handed to patients by clinic staff before their appointments were most effective; brochures at the point of care were 3.87 times more effective than mailed reminders.

Having nurses assume responsibility for administering vaccinations worked, while having nurses and pharmacists remind physicians didn't, the researchers wrote.

"Configuring additional personnel so that they are able to relieve physicians of vaccinations seems important to successful team change," they wrote.

One of the gripes that patients have about the medical profession is that we physicians don't communicate sufficiently about our patients. In my view, this criticism is spot on.

Patients we see in the office often have several physicians participating in their care. The level of communication among us is variable. While electronic medical records (EMR) have the potential to facilitate communication between physicians' offices and hospitals, the promise has not yet been realized. The physicians in our community, for example, all have different EMR systems which simply can't talk to each other. We can access hospital data banks from our office, but this is cumbersome and burns up time. Ideally, there should be a universal system, an Esperanto approach where all of us utilize the same EMR language.

On the day I wrote this post, I took part in a direct conversation at the hospital bedside with the treating physician, and it vexed me. The scenario would seem ideal from the patient's perspective: at the bedside were the attending physician, the gastroenterologist (the Whistleblower) and the anesthesiologist, conferring about the next appropriate diagnostic step for a patient who had experienced upper gastrointestinal (UGI) bleeding.

I was asked to evaluate this patient with UGI bleeding and to arrange an expeditious endoscopy to examine the esophagus and stomach region in order to identify a bleeding source. Hours before seeing the patient, I scheduled the procedure that I knew would be needed, a shortcut every gastroenterologist takes in order to be efficient. As the patient had other medical conditions, I requested that the sedation be administered by an anesthesiologist, rather than by me, to provide greater safety to the patient.

I arrived and became acquainted with the medical particulars. I agreed with the diagnosis of UGI bleeding and also that an endoscopy was the next logical step in this patient's care. These observations are not sufficient, however, to proceed with the examination. There are other criteria that must be considered:
1) Does the procedure need to be done now?
2) Do the risks justify performing the procedure?
3) Has the patient provided informed consent for the procedure?

After I arrived on the scene, the anesthesiologist approached me and advised me that the anesthesia risks were extraordinarily high. He was concerned that performing the case could have a disastrous outcome. My reaction to his frank assessment? Thank you! The decision then fell to me to decide on whether to proceed.

For me, this was an easy call. The patient did not need an endoscopy at that moment to save his life, the only reason that would justify subjecting him to the prohibitive risks of the procedure. Before discussing this decision with the family, who were awaiting an endoscopy, I summoned the attending hospitalist to relate to him our revised plan.

In my view, when an anesthesiologist and a gastroenterologist advise an attending doctor that it would be unsafe to proceed with a planned procedure, the response should be, "Thank you!" But it wasn't. This physician wanted the test and seemed irritated that the diagnostic plan had been set aside. He wanted a diagnosis, and we declined to proceed after concluding that the risks exceeded the benefits.

I was as comfortable with this medical decision as I have been with any I have made in my career. In other cases, when a consultant advises me against a planned course of action for safety reasons, I am so grateful that a patient has been spared from danger.

We got to the right answer here, but had to set aside an unforeseen obstacle to get there. Communication means listening to another point of view and being able to change your mind. As a doctor, when my finger is on the trigger, I call the shots. In this case, a doctor misfired.

This post by Michael Kirsch, MD, FACP, appeared at MD Whistleblower. Dr. Kirsch is a full time practicing physician and writer who addresses the joys and challenges of medical practice, including controversies in the doctor-patient relationship, medical ethics and measuring medical quality. When he's not writing, he's performing colonoscopies.

Virtual office visits for infections resulted in more prescribing of antibiotics, but could potentially lower health care costs, a study found.

Researchers studied four primary care practices in the University of Pittsburgh Medical Center Health System that offered a secure online portal allowing patients to answer questions about their condition. Based on this written information, physicians diagnosed the infection, ordered care and replied to the patient.

To look at how care differed between virtual and in-office visits, researchers reviewed all office visits and e-visits for sinusitis and urinary tract infections (UTIs) from January 2010 to May 2011. Results appeared online at Archives of Internal Medicine:
--Physicians were less likely to order a UTI-relevant test at an e-visit (8% of e-visits vs. 51% of office visits; P less than .01);
--Few sinusitis-relevant tests were ordered at either office visits or e-visits;
--Antibiotics were prescribed more often at an e-visit for either condition;
--For UTIs, antibiotics were prescribed 32% of the time when a urinalysis or urine culture was not ordered, compared with 61% when the tests were ordered.

Researchers wrote, "When physicians cannot directly examine the patient, physicians may use a 'conservative' approach and order antibiotics. The high antibiotic prescribing rate for sinusitis for both e-visits and office visits is also a concern given the unclear benefit of antibiotic therapy for sinusitis."

Rough cost estimates showed that the lower reimbursement for e-visits ($40 vs. $69 for an office visit) and the lower rate of testing ($11 per urine culture) outweighed the increase in prescriptions ($17 per average prescription). The average cost of a UTI visit was $74 for e-visits compared with $93 for office visits.

Raw food diets have emerged as a pop culture preoccupation. They seem to have considerable traction in the public psyche, as evidenced by the volume of websites they populate, and the coverage they command in print.

It is doubtful they have comparable traction at the dinner table, of course. We have enough trouble getting people to eat a reasonable amount of reasonable foods, and to renounce ingestibles that glow in the dark. In this context, it seems a bit far-fetched that we would shift, en masse, to a strict diet of raw, unprocessed foods.

But our appetites for the concept, and the claims made in its defense, seem insatiable. So let's chew on it.

In pure form, raw food eating is exactly as advertised: No foods are cooked. The diet is based overwhelmingly, if not exclusively, on plant foods. If it does include animal foods, they are consumed raw. If milk is consumed, it is consumed raw, which is to say, unpasteurized. Some versions are strictly vegan, and ban all animal products.

There are, to be sure, potential benefits of such a diet, or of many aspects of it. By placing an emphasis on plant foods, the diet is a rich source of the foods that are in turn the richest sources of valuable nutrients. The diet renounces most processed foods, and thus eliminates trans fat, and provides generally very low levels of saturated fat, sodium, and sugar, while providing nutrient-dense foods, rich in fiber. And because food choice is subject to rather strict constraints, calories are caged, making raw food diets an effective answer to the prevailing problems of weight control.

Many foods are, indeed, most nutritious when raw. Heat can destroy many nutrients, notably some water-soluble vitamins, many antioxidants, and unsaturated fats, including omega-3s. The beneficial effects of dietary fibers, both insoluble and soluble, may be altered, and at times reduced, by cooking.

And there are potential harms of cooking that raw foods sidestep. Cooking meat can lead to charring, which generates carcinogenic compounds known as heterocyclic amines. Cooking of carbohydrates can produce acrylamide, another potential carcinogen.

There is, however, a great leap of faith from some benefit in eating some foods raw some of the time, to raw is always and dramatically better.

There are claims, for instance, that raw food is better because cooking destroys enzymes in plants. Perhaps so, but so does digestion. Very few enzymes survive the hydrochloric acid they encounter in the stomach. Meaningful health effects of swallowing an enzyme that doesn't survive to see the duodenum are dubious at best.

Raw food advocacy ignores the fact that some foods are more nutritious when cooked. The nutrient lycopene makes tomatoes red. It is a potent carotenoid antioxidant, long thought to reduce prostate cancer risk, although that effect per se is in doubt. Lycopene is fat-soluble, and much more "bioavailable", that is to say, available for absorption and making contributions to our health, when tomatoes are heated in combination with an oil. Tomato sauces with olive oil are ideal, and raise blood lycopene levels far more effectively than eating raw tomatoes.

Eggs are a good source of biotin, a nutrient important in many ways, its contributions to healthy hair, skin, and bones noteworthy among them. Raw eggs contain a protein called avidin, which binds and inactivates biotin. Cooking denatures avidin, changing the protein's shape so that it can no longer bind biotin; cooked eggs are therefore a good source of bioavailable biotin.

Even more important than the nutrients that cooking can "add" to food are the things it can take away, namely pathogenic bacteria. Cooking is the best and final defense against salmonella, E. coli, and other microscopic nasties that can hitch a ride on our foods. Raw milk has captured the modern imagination, but pasteurization took hold for good reason. Milk can be contaminated by bacteria, from the cows, the farmers, or farm equipment, and it makes a great growth medium. Pasteurization protects us from the attendant consequences, which were once fairly common.

And finally, there are some truly excellent foods that can't be eaten raw; beans and lentils come to mind. These are nutrition powerhouses, inexpensive, and rich enough in high-quality protein to make a good meat alternative. But they are all but indigestible unless cooked.

Some variations on the theme of raw food eating accommodate this concern, by allowing for cooking at low temperatures. But food cooked low and slow is not really raw, it's slow-cooked, and should call itself that. Cooking is always a product of heat intensity times duration, so when raw food expands to encompass slow cooking, the topic devolves to a debate about cooking methods.

Lastly, there is the notion that cooking is a form of food processing and thus "unnatural." Perhaps. But cooking, and freezing, have figured in humans' interactions with foods since long before the dawn of agriculture. So if cooking is "unnatural," everything about agriculture is even more so. To make it just as blunt as a stone hammer: We cooked meat long before we ever grew potatoes.

What we are left with, then, is a whole lot of hype that runs well ahead of any legitimate science.

All too often, opinions about nutrition are disseminated with religious zeal, as if gospel. I have argued before for the separation of church and plate, and reaffirm my own commitment to it here. I have my own opinions about nutrition. But when they are just opinions, I am careful to treat them as such.

At its best, nutrition is science. That doesn't make it perfect. Our scientific understanding is not perfect in any field, and nutrition is far from an exception. But all opinions about a science must at least run the gauntlet of what we do know. Those that cannot do so and survive are hearsay.

We tend to honor this implicitly in almost every science but nutrition. Unsubstantiated opinions about how to build a suspension bridge, perform neurosurgery, or accelerate atoms are of no particular interest. We recognize in these disciplines that expertise matters, and we differentiate the insights of those with such expertise, generally born of years of study, from the random inclinations of the rest of us riff-raff.

Somehow, though, we make an exception for nutrition. Perhaps the fact that everyone eats invites us to view everyone as comparably expert in the far-reaching implications of what we eat on physiology, pathophysiology, cell biology, and biochemistry. But of course, that just ain't so.

Cyberspace may be the perfect crock pot for haphazard food for thought. Everyone with an Internet connection gets to dish. And so while raw food advocacy has been around for 200 years, it has taken on a whole new prominence only rather recently.

The result of acting as if all food-related opinions are created equal is a whole lot of food for thought unsuitable for human consumption. Ingesting such opinions nonetheless leads to a state I am inclined to label "cognitive indigestion", a condition of unfounded convictions, misplaced trust, and/or perennial confusion.

Many foods can be eaten raw, and many foods are the better for it. An emphasis on eating mostly plants direct from nature is irrefutably good, be they raw or cooked. But as is true of so much in the realm where opinions about nutrition masquerade as gospel, the case for raw food eating is oversold, the rhetoric is overheated, and the claims of universal benefits, substantially overcooked.

David L. Katz, MD, FACP, MPH, FACPM, is an internationally renowned authority on nutrition, weight management, and the prevention of chronic disease, and an internationally recognized leader in integrative medicine and patient-centered care. He is a board certified specialist in both Internal Medicine, and Preventive Medicine/Public Health, and Associate Professor (adjunct) in Public Health Practice at the Yale University School of Medicine. He is the Director and founder (1998) of Yale University's Prevention Research Center; Director and founder of the Integrative Medicine Center at Griffin Hospital (2000) in Derby, Conn.; founder and president of the non-profit Turn the Tide Foundation; and formerly the Director of Medical Studies in Public Health at the Yale School of Medicine for eight years. This post originally appeared on his blog at The Huffington Post.

The Task Force is providing an opportunity for public comment on the draft recommendation until December 17. All public comments will be considered as the Task Force develops its final recommendation.

--The Task Force strongly recommends that clinicians screen all people aged 15 to 65 for HIV infection. Younger adolescents and older adults who are at an increased risk for HIV infection should also be screened.
--The Task Force also strongly recommends that clinicians screen all pregnant women for HIV, including women in labor whose HIV status is unknown.

"The draft recommendation reflects new evidence that demonstrates the benefits of both screening for and earlier treatment of HIV," said Task Force member Douglas K. Owens, MD, FACP. "Because HIV infection usually does not cause symptoms in the early stages, people need to be screened to learn if they are infected."

He continued, "People who are feeling well and learn they are infected with HIV can begin treatment earlier, reduce their chances of developing AIDS and live longer and healthier lives."

Dr. Oz, whose hyperbolic medical claims are the bane of most doctors’ existence, just tweeted another whopper: #OZTip To prevent UTI’s naturally, take 2 teaspoons of horseradish per day. Horseradish contains oils that have anti-bacterial properties.

Let’s start with the second question. Most UTIs are a one-time affair. You get it, you take antibiotics, and that’s that. But for a number of people, UTIs become a recurrent problem, whether because of an immune problem, behavioral factors, an anatomic problem, or, most often, random happenstance.

[Digression: the causes of UTI are complex and interesting, but much of it comes down to bowel bacteria being awfully close to the urethra, especially in women.]

There is a decent body of literature on recurrent UTIs, including evaluation of prophylaxis with antibiotics, post-coital urination, and other medications and behaviors. There is, however, little data on horseradish. There is a little bit in the German literature, but nothing particularly definitive.

One thing the literature does show is that Dr. Oz isn’t the first person to tout horseradish for UTIs. But there is no good evidence it does anything. This appears to be yet another example of Dr. Oz taking an interesting embryo of an idea and presenting it as fully born and raised.

My advice: Save the horseradish for the gefilte fish.

Peter A. Lipson, ACP Member, is a practicing internist and teaching physician in Southeast Michigan. After graduating from Rush Medical College in Chicago, he completed his internal medicine residency at Northwestern Memorial Hospital. This post first appeared at his blog, White Coat Underground. The blog, which has been around in various forms since 2007, offers "musings on the intersection of science, medicine, and culture." His writing focuses on the difference between science-based medicine and "everything else," but also speaks to the day-to-day practice of medicine, fatherhood, and whatever else migrates from his head to his keyboard.

What comes to mind when you hear the term "medical home?" Perhaps you favor the definition put forth by our government (AHRQ):

"The medical home model holds promise as a way to improve health care in America by transforming how primary care is organized and delivered. Building on the work of a large and growing community, the Agency for Healthcare Research and Quality (AHRQ) defines a medical home not simply as a place but as a model of the organization of primary care that delivers the core functions of primary health care."

They go on to describe five functions and attributes that define the medical home:
--comprehensive care
--patient-centered
--coordinated care
--accessible services
--quality and safety.

The presence of these five attributes of care should then constitute a medical home, right? It depends on whom you get your definition from.

Take, for example, the definition put forth by NCQA, the body responsible for certifying practices as providers of the patient centered medical home:

"NCQA 's Patient-Centered Medical Home (PCMH) 2011 is an innovative program for improving primary care. In a set of standards that describe clear and specific criteria, the program gives practices information about organizing care around patients, working in teams and coordinating and tracking care over time. The NCQA Patient-Centered Medical Home standards strengthen and add to the issues addressed by NCQA's original program.

The patient-centered medical home is a health care setting that facilitates partnerships between individual patients, and their personal physicians, and when appropriate, the patient's family. Care is facilitated by registries, information technology, health information exchange and other means to assure that patients get the indicated care when and where they need and want it in a culturally and linguistically appropriate manner."

They one-up AHRQ and say that six elements need to be met for recognition as a PCMH:

--enhance access and continuity
--identify and manage patient populations
--plan and manage care
--provide self-care and community support
--track and coordinate care
--measure and improve performance

Compare both of those with a third definition:

"The Patient Centered Medical Home is a care delivery model whereby patient treatment is coordinated through their primary care physician to ensure they receive the necessary care when and where they need it, in a manner they can understand."

It certainly is the simplest definition.

Regardless of definition, the idea of a patient-centered medical home has become very popular in circles pushing for a primary care centered health care reform. My old practice started the process of certifying for PCMH through the NCQA, and I discovered something: it really isn't that patient-centered. Just like "meaningful use" is not really about using computers in a meaningful way, but is instead an exercise in collecting data in a way prescribed by a non-clinical body, the PCMH should really be called the "data-centered medical home." It's all about gathering and reporting data in a specified way, taking time and resources away from the thing at the center of care (by any definition): taking good care of patients.

Now, I am not totally against these kinds of programs; I certainly think their origins are driven by good intent. But I am wary of anything that comes in the form of prescription by a non-clinical governing body to define care for humans by other humans. I had to work hard to make meaningful use truly meaningful for my patients, and I anticipate that had I stayed at my old practice, a significant impediment in truly providing a good medical home for my patients would have been our effort toward PCMH certification.

A new idea came to me as I planned for my new practice, a practice that doesn't answer to insurance company requirements or government regulations: I am creating a medical home for my patients. I plan on meeting all the criteria put forth by the AHRQ and the NCQA, but not because I want to get certified or paid more, it just seemed like better care. The difference, however, between my version of the medical home and the "official" version is that mine is grown from the ground up; it is simply better care for my patients. I am growing the medical home "organically," not meaning that I am avoiding pesticides, but that I am allowing good care to grow on its own, rather than to do it by meeting a shape defined by a group of people who neither know nor care for my patients.

How will I make an organic medical home?

Access: My patients will have access to me. They will have my cell phone number, and can access me via secure online messaging or in person.

Personalized care: Each person will have their own personalized care plan ("GPS") that will let them know what care they should have, what they've done, what they are due for, and when care is due in the future.

Continuity: My patients will (in my plan) have a personal health record that will serve as their "official" medical record. Any records from any care from me or any other provider should be contained in a single medical record. I believe that should be the patient record, not one kept at a doctor's office.

Self-care and community support: I will provide resources for my patients to know what care they need. I intend on having an online library of information as well as links to websites I think will help them deal with their problems. I will also have education programs for people with certain conditions (dietitians teaching diabetics how to shop for food, for example) and do group visits to link like-minded patients together.

Track and coordinate care: I see this as my main task. I don't give most of the care, I just help people get hooked up to the resources they need. Online contact will be the main vehicle for this, but I'll use whatever means necessary.

Measure and improve performance: for my patients, the main measures of my performance are time and money. How much time are people spending at specialists, ERs, or in the hospital, and how much money are they spending on their care in total? If I can keep people healthy and away from the system, I will be improving the lives of my patients in both physical and financial ways.

So, I guess I can say I am "going organic" in my approach to the medical home. Perhaps I should also point out that my care will be entirely gluten-free? That could be a huge selling-point, a marketing bonanza.

I must be a genius.

After taking a year-long hiatus from blogging, Rob Lamberts, MD, ACP Member, returned with "volume 2" of his personal musings about medicine, life, armadillos and Sasquatch at More Musings (of a Distractible Kind), where this post originally appeared.

I was working late this week; making patient call backs, filling prescriptions, reviewing labs and finishing charts from the day. It was dark out and the medical office was quiet and empty. The janitorial crew started their work of emptying trash, picking up the scattered debris from the busy patient flow and sanitizing surfaces. I looked up and a beautiful Latin woman, age about 30, wearing latex gloves, was emptying the overflowing trash can.

"You are working overtime," she said with a heavy Hispanic accent. I laughed, realizing how late it was and how tired I was and still had more work to do. Then I stopped and really looked at her. She was busy putting liners in cans and dusting surfaces. She was working fast because there were many other offices ahead of her that also needed cleaning.

"You are working overtime too," I said. "Yes," she replied, "I will work until 12:30." (That's AM.)

"I bet this is a second job for you," I guessed, and she replied, "Si, I will go to my job at the food court in the morning."

"Thank you for what you do here," I said. "Without the work you do at night we could not take care of our patients. Coming to work and seeing everything so spotless lets us take care of people so I thank you."

I have thought about this short interpersonal connection many times and how important it is to stop and really look at another person. I had been feeling "put-upon" with my workload, which was nothing compared to the work this young woman had in front of her. In our society a minimum-wage job is not enough to live on. People need two back-to-back jobs to survive. Yet she was working with grace and dignity and without complaint. In fact she was noticing that I was there late. Amazing!

Taking the time to realize that others are facing challenges in their lives and learning from the grace they bring to life is an important touchstone for grounding us in our work.

This post originally appeared at Everything Health. Toni Brayer, MD, FACP, is an ACP Internist editorial board member who blogs at EverythingHealth, designed to address the rapid changes in science, medicine, health and healing in the 21st Century.

Repeat testing is common among Medicare beneficiaries, with 55% having a second echocardiography within three years and 49% repeating a pulmonary function test, a study found.

Researchers conducted a longitudinal study of a 5% random sample of Medicare beneficiaries for the years 2004 to 2006 among the 50 largest metropolitan statistical areas to look at the proportions of the population tested and the proportion of tests repeated.

In addition to the half or more of patients who'd received a second echocardiography or pulmonary test, 44% of imaging stress tests were repeated within 3 years, 46% of chest computed tomographies, 41% of cystoscopies, and 35% of upper endoscopies.

Authors wrote, "Although we expect a certain fraction of examinations to be repeated, we were struck by the magnitude of that fraction: one-third to one-half of these tests are repeated within a 3-year period."

An editorial noted that financial incentives are a major reason why. Practices don't feel they can cut back on testing unless they are paid in some other way.

"To avoid reading an almost identical article about unwarranted geographic variations in these pages 10 years from now, physicians will need to support expansion of peer-designed active electronic clinical guidance systems and faster retirement of fee-for-service incentives," the editorial stated. "No matter what future payment system is implemented, some intercession in clinical decision making will be required to protect patients from too many tests and from too few tests."

Also driving overuse are patients' expectations to be tested and treated. Donna Sweet, MD, MACP, recently attended a conference on overuse of five other tests and procedures, and reported some conclusions to Family Practice News.

Explaining the overprescribing of antibiotics, she said, "American patients aren't very patient. They want to be better now."

This blog is about freedom and personal responsibility. I have opined that cigarette smokers should not be permitted to transfer total responsibility for the consequences of their choices to the tobacco companies, even if this industry has committed legal and ethical improprieties. I do not support the politically correct beverage ban in New York City, sure to spread elsewhere, where the government decides the content and dimensions of beverages that the public desires to purchase. With regard to Obamacare, don't get me started or I'll never get to the intended subject of this post.

First, let me refute a point in advance that is sure to be leveled against me by the pro-breast crowd. I am zealously pro-breast and want all breasts foreign and domestic to remain free of disease. I am against breast cancer and support the goal of striving for early detection of this disease and medical research to prevent it. Indeed, I am against all cancer and boldly express this controversial view in print for all to see.

Breasts and politics have been intertwined for years. Many medical advocacy groups admire and envy the huge amount of research money that is garnered for breast cancer research. Some argue that breast cancer, while worthy, receives a disproportionate share of research dollars at the expense of other crippling and deadly diseases.

There is no clearer example of the contamination of breast cancer with political interference than Mammogate, when the federal government cowardly rejected the sound and impartial recommendations of its own expert panel for political reasons.

Now, a new scene in the government's Breast Fest has appeared where our elected legislators play doctor. States are passing laws that require medical facilities to inform patients who have undergone mammograms if they have dense breast tissue and that they should discuss with their physicians if additional testing is necessary. More details are found in the New York Times report on this issue.

I will defer expressing a medical view on whether women with dense breasts are adequately protected by conventional mammography. If medical professionals, unelected but presumably trained in actual medicine, believe that ultrasound exams or MRI scans are necessary to illuminate dense breast tissue, then brace yourself for an avalanche of unnecessary scans that will generate anxiety, cost a few zillion dollars, and identify entirely innocent false-positive lesions, leading to a breast biopsy bonanza. This cascade will be fueled also by the medical malpractice system, the raptor present in every mammography suite that is ready to sink talons into its prey.

Am I exaggerating here? Ask any radiologist why he has stopped reading mammograms. The guys that still do are scared stiff. These breast images are not sharp iPad images with futuristic resolution. Instead, they look like grainy collages where it can be agonizing for a doctor to decide if a small smudge is nothing or everything. Understandably, in today's litigious climate, radiologists join OperationOVERCALL, rather than risk the opportunity to serve as a defendant years later.

The government is not a physician and should not legislate medical advice. It's hard enough for actual doctors to sort through conflicting and controversial medical data and evidence to determine what is best for our patients. We struggle with this every day. Will the clumsy axe of government be a helpful player in this effort? Do we want folks who are beholden to lobbyists and are political animals by definition to force physicians to practice in certain way?

Why stop at breasts?

Pass laws that will require physicians to:
--obtain a CXR if a patient has a cough and a fever,
--tell every patient who has a negative cardiac stress test that the patient can drop dead of a heart attack within a week and that a cardiac catheterization should be considered,
--advise patients who are scheduled for surgery to obtain a second opinion in case surgery is silly,
--advise patients to pursue the probiotic promise of a panacea.

Sure, there's dense breast tissue out there. But, not nearly as dense as the government. I suppose we should trust them with our lives and our health judging by the sterling performance they demonstrate as legislators. Congress' approval rating is now soaring at 21%.

This post by Michael Kirsch, MD, FACP, appeared at MD Whistleblower. Dr. Kirsch is a full time practicing physician and writer who addresses the joys and challenges of medical practice, including controversies in the doctor-patient relationship, medical ethics and measuring medical quality. When he's not writing, he's performing colonoscopies.

3) Accreditation: comply with national professional norms and requirements.

Regarding #2, the goal is to have trainees demonstrate competence as doctors at the end of the three-year training period (residency). Ideally, they acquire it in steady, graded fashion at distinct mileposts along the way, so that teaching faculty know that residents are making adequate progress and will flourish as independent doctors.

How do professions measure and determine competence? In medical training, residency programs are subject to regulations provided by "Residency Review Committees" which are empowered by a national accrediting body. Those committees come for periodic site visits to inspect our training environment and make sure that we're following best practices (and the rules).

It's up to us to comply with their rules, but we have leeway in interpreting them so that there can be innovation in how we implement our educational models.

Over the last 15 years, the national governing body settled on six "core competencies" that define competence for doctors of all specialties. Regardless of what kind of medicine you practice, there should be fundamental attributes that all doctors share, right?

Those six competencies are:

--patient care (duh)

--medical knowledge (also duh)

--interpersonal skills and communication (hey, I kinda like that.)

--professionalism (for sure, right?)

--systems-based practice (huh?)

--practice-based learning & improvement (I think you lost me on these last two)

Give yourself an exercise: Now that you know the domains of competency, how would you evaluate learners in those domains?

Perhaps unsurprisingly, medical educators began resorting to numeric grading scales to evaluate residents in each of these domains. This allowed for quantification of residents' performances, and a better ability to document both interval progress and ultimate competence.

The problem became that different faculty members interpreted the grading scales differently. Grade inflation started making nearly everyone look the same, as far as their evaluation numbers were concerned. Asked to define what makes a competent physician, faculty responded along the lines of Supreme Court Justice Potter Stewart, who famously quipped about the hard-to-define concept of obscenity, "I know it when I see it."

Voila, welcome to the Next Accreditation System (NAS). Program Directors like me across the country are currently struggling to implement this new system, with a goal of allowing more detailed analyses of learners' performances. Another goal of the new system is to allow more freedom and flexibility in educational innovation by making reporting requirements more frequent but less onerous (hey, will that work?) to keep educators' eyes more fixed on teaching and training than on evaluating and reporting. [I'm imagining that in the near-term, there will be a lot of the latter. I hope the former is not diminished.]

The new system is predicated on developmental milestones that lead doctors to become competent in a range of entrustable professional activities ("EPAs"). These EPAs map to the original six competencies which I shared with you above.

Got it?

At a recent national meeting discussing these changes and strategies for handling them, one colleague likened these new mandates to "repairing the airplane while flying it."

No one ever said that change is easy. Best for us to embrace it and make it a learning opportunity.

This post by John H. Schumann, MD, FACP, originally appeared at GlassHospital. Dr. Schumann is a general internist. His blog, GlassHospital, seeks to bring transparency to medical practice and to improve the patient experience.

Health IT issues are climbing the list of hospital health hazards, a report found.

The ECRI Institute compiled its annual list of top 10 hazards, noting that three of the rising issues involve information technology, whether in integrating these systems at the point of care or in the ways the technology can distract the provider from the patient.

ECRI listed the hazards online (free download) and published them in the November issue of its journal, Health Devices.

Each hazard met at least one of these criteria: it resulted in injury or death; it occurred frequently; it can affect a large number of individuals; it is difficult to recognize; it had high-profile, widespread news coverage; and there are clear steps hospitals can take to minimize these risks, the organization said in a press release.

ECRI updates the annual list based upon the prevalence and severity of incidents reported by health care facilities nationwide; information found in the Institute's medical device problem reporting databases; and staff analysis.

As has likely come to your attention by now, a new study shows that daily multivitamin use is associated with a significant reduction in the overall rate of cancer. This is clearly important, and warrants careful consideration in the context of what we already knew, or thought we knew, about multi-nutrient supplementation.

For a long time, the prevailing view of multis, which as a rule contain a mix of vitamins and minerals, most at or above the level of recommended daily intake, was that they could and probably should help, and couldn't hurt. Nutrient levels were based on the Dietary Reference Intakes of the Institute of Medicine, and in general all or nearly all of the micronutrients known to be essential were in the mix.

There was always a sound rationale for such supplementation. Average intake in the U.S. of quite a few nutrients is lower than recommended, and intake of quite a few more is lower than optimal. In particular, as people get older, there is a tendency for both calories and dietary variety to fall, resulting in a rising risk for nutrient deficiencies. Few of these are bad enough to present as overt deficiency syndromes, but even nominal deficits of key nutrients may compromise health. And some cases of overt deficiency, notably of vitamins B12, folate, iron, and calcium, are seen.

This all seemed to make a robust argument for routine supplementation, in particular by those over age 50. Most doctors recommended the practice routinely, as did I. And, of course, the supplement industry made hay predictably, providing a wide array of products that competed for attention with claims about nutrient quality, quantity, variety, and customization.

But then the notion that multis could help but "couldn't hurt" started to take a beating. First came a long line of clinical trials suggesting lack of benefit and potential harm from high doses of select nutrients. Then came studies showing associations between multivitamin use and adverse outcomes, in particular, a higher rate of breast cancer among women.

Since we never had clear evidence of a benefit, even a hint of potential harm from multis was enough to argue pretty powerfully against their routine use. I stopped taking one, and stopped recommending them to my patients in the absence of a clear reason.

I never abandoned supplementation entirely, of course. In general, my clinic recommends supplements to do a particular job. So, for instance, we use omega-3s routinely to reduce inflammation, probiotics to improve gastrointestinal health and immune function, and vitamin D whenever levels are low. We use a wide range of other supplements when there is a specific case for doing so.

As for multis, I switched over to recommending them only when there was a meaningful likelihood of dietary deficits and, for whatever reason, an inability to fix them with food. I also switched from conventional multis to "whole-food-based supplements." I still think those are a good idea, and here's why:

If multis do harm, there must be a reason, and the most plausible one is a problem of "nutritional noise." Imagine, for instance, that a great electric guitar player from a rock band, a great sax player from a jazz ensemble, and a virtuoso cellist from a symphony orchestra play their own brand of music all at once. No matter how good each is in his or her native context, the result of this mishmash would be unpleasant noise.

In putting together multis, we, not nature, have chosen the dose, preparation, and variety of nutrients, and taken them all out of their native context in food. We know that nutrients, like musicians, work best in concert with one another. If we have assembled them wrong, they might clash. Nutritional noise could be harmful.

Whole-food-based supplements avoid this potential danger because they preserve the native context of nutrients in foods. This may facilitate the work of nutrients in concert with one another, and make far more beautiful music in our metabolism. It's theoretical, but makes good sense.

But now we have the new study, and it does invite some reconsideration of the traditional multi. The findings, reported in JAMA, are based on a randomized, blinded, placebo-controlled intervention among nearly 15,000 U.S. male physicians followed for more than 10 years. There were 8% fewer cancers overall in those who took the multivitamin, and this was statistically significant, although barely so. There were no significant effects on any particular cancer, none on cancer mortality, and none on all-cause mortality.

So the findings are intriguing and promising, but far from the proverbial slam-dunk. And they are limited to a population of male doctors age 50 and older. How they pertain to women, younger people, or populations who behave differently on average than doctors, is unknown.

So where does it leave us?

We had seen the gradual accumulation of evidence for potential harm from multivitamins. This study does not eradicate that, but it does suggest that in some populations at least, there is potential for net benefit. Judicious use of multivitamins by men age 50 and older is, if not obviously advisable, perfectly reasonable given what we do and don't know at present.

The theoretical case for whole-food-based supplements remains valid, and absence of evidence is not evidence of absence. We don't have a trial like this using supplements like that, so we are left to speculate about the potential for greater benefits.

We once thought multis could do good and couldn't do harm. We then learned they could do harm, and developed doubts about them doing any good. Our current understanding is far from perfect, but it seems to suggest some potential for both. This invites discussion between patients and doctors, and customized decisions based on personal circumstance. That may not be an entirely satisfying resolution, but anything else would run ahead of the evidence we've got. This study revises our risk/benefit assessment; it is not a basis to renounce it.

But the most important takeaway here has to do with the size of effects, rather than their direction. A relative reduction of 8% in the overall rate of cancer is better than nothing, but it is a small effect. In contrast, studies from 1993, 2004, 2009, 2010, and 2011, just to name a few, show that the combination of not smoking, eating well, and being active can reduce the risk of all chronic disease, cancer included, and premature death from any cause by as much as 80%. That is ten times the best effect of multivitamins yet shown. You certainly want that math on your side!

So whether you choose to take a multi or not, remember it's a supplement, not a substitute. There is no substitute for the profound health benefits of a daily dose of well-chosen lifestyle as medicine.

David L. Katz, MD, FACP, MPH, FACPM, is an internationally renowned authority on nutrition, weight management, and the prevention of chronic disease, and an internationally recognized leader in integrative medicine and patient-centered care. He is a board certified specialist in both Internal Medicine, and Preventive Medicine/Public Health, and Associate Professor (adjunct) in Public Health Practice at the Yale University School of Medicine. He is the Director and founder (1998) of Yale University's Prevention Research Center; Director and founder of the Integrative Medicine Center at Griffin Hospital (2000) in Derby, Conn.; founder and president of the non-profit Turn the Tide Foundation; and formerly the Director of Medical Studies in Public Health at the Yale School of Medicine for eight years. This post originally appeared on his blog at The Huffington Post.

Diabetes rates grew fastest in the American South and Appalachian states, where prevalence neared 10%, compared with about 7.5% or slightly less in the Midwest, Northeast and West.

Median prevalence of diabetes increased from 4.5% to 8.2% between 1995 and 2010, and by the end of that period it was highest in the South (9.8%), compared with the Midwest (7.5%), Northeast (7.3%), and West (7.3%).

The Centers for Disease Control and Prevention conducted telephone surveys on self-reported diabetes in adults, collected from 1995 to 2010 by the Behavioral Risk Factor Surveillance System. Results appeared online Nov. 16 in MMWR.

The age-adjusted prevalence of diagnosed diabetes increased in the same time span across the U.S. In 1995, age-adjusted prevalence was 6% or more in only three states, Washington DC, and Puerto Rico, but by 2010 it was 6% or more in every state, DC, and Puerto Rico, and 10% or more in six states and Puerto Rico.

In 2010, age-adjusted prevalence was highest (10% or more) in Alabama, Mississippi, Puerto Rico, South Carolina, Tennessee, Texas, and West Virginia, and lowest (6% to 6.9%) in 12 states: Alaska, Colorado, Connecticut, Iowa, Minnesota, Montana, North Dakota, Oregon, South Dakota, Wisconsin, Vermont and Wyoming.

The relative increase in age-adjusted prevalence of diabetes was a median 82.2% for all states, but it was 226.7% in Oklahoma.
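The median figure is consistent with the prevalence numbers above: a rise from 4.5% to 8.2% works out to a relative increase of about 82%. A quick sketch of the arithmetic (the CDC's state-level figures remain the authoritative source):

```python
# Relative increase in prevalence: (new - old) / old, expressed as a percentage.
def relative_increase(old_pct, new_pct):
    return (new_pct - old_pct) / old_pct * 100

# Median U.S. prevalence rose from 4.5% (1995) to 8.2% (2010).
print(round(relative_increase(4.5, 8.2), 1))  # prints 82.2
```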

The growth in diabetes could be from many sources, including changes in diagnostic criteria, enhanced detection of undiagnosed diabetes, an aging population and growth of minority populations who are at greater risk for diabetes, and an increase in obesity and sedentary lifestyles.

The report continued, "Although the contribution of each factor to increasing diabetes incidence cannot be discerned, the increase in diabetes prevalence coincides with the increase in obesity prevalence across the United States."

"This note was produced using [mega-brand] medical dictation software. While every effort has been made to insure accuracy, errors may still exist."

Really? What kind of doctor would admit in a medical chart to being too lazy or incompetent to produce an accurate record?

A lot of them. Dictations are easy to read, if you are willing to confound legibility with accuracy. Dictation software is relatively cheap, and with the continued proliferation of electronic health records (EHRs), it allows the doctor's words to become part of the patient's chart immediately, analogous to writing in a paper chart. In a paper chart, though, I've never written a disclaimer warning of my own potential inaccuracy.

Doctors work in a safety-conscious environment on par with the best 19th century practices. If pilots worked like doctors, the sky would rain planes. Because as Americans we've chosen to maintain a medical culture reminiscent of pre-industrial guilds, with apprentices, journeymen and master craftsmen, medical quality is subject to the whims of individual patients and professionals.

EHRs are a tool that can be used to improve ourselves. Health care information entered into an EHR becomes potentially useful data. If I have 200 diabetic patients with paper charts, I have no easy way of seeing who is getting their yearly eye exams. If the data were entered into an EHR, I could easily produce a report that shows me that information, if I chose to.
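The kind of report described above is trivial once the data are structured. A minimal sketch (the field names and records here are hypothetical, not drawn from any real EHR):

```python
from datetime import date

# Hypothetical structured records, as an EHR might store them.
patients = [
    {"name": "A", "diabetic": True,  "last_eye_exam": date(2012, 3, 1)},
    {"name": "B", "diabetic": True,  "last_eye_exam": None},
    {"name": "C", "diabetic": False, "last_eye_exam": None},
    {"name": "D", "diabetic": True,  "last_eye_exam": date(2010, 6, 15)},
]

def overdue_for_eye_exam(patients, today):
    """Diabetic patients with no eye exam recorded in the past year."""
    return [p["name"] for p in patients
            if p["diabetic"]
            and (p["last_eye_exam"] is None
                 or (today - p["last_eye_exam"]).days > 365)]

print(overdue_for_eye_exam(patients, date(2012, 11, 1)))  # prints ['B', 'D']
```

With paper charts, producing the same list means pulling 200 folders by hand; that difference is the whole case for structured data.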

Without agreed-upon ways to measure quality and share data, the EHR becomes a fancy toy, with amusing dictation errors. Many physicians have chosen to hold off on EHRs until their role in the system is clearer. The government and private insurers have stepped in to encourage EHR use. Medicare started by offering incentive payments, which are being phased into penalties. Private insurers are demanding physicians provide them with reams of data, each company using its own data collection method. What the methods have in common is reliance on doctors to extract and report the data.

Here's the basic conflict: Medical practice needs safety, accountability, and cost-effectiveness. We also need doctors and other providers to focus on patient care. Another lesson from the airline industry is that distraction kills. The role of doctors has until now been to take care of patients: spending time with them, listening, examining, following up on tests. When I look at my desk and see piles of different forms from each insurance company asking me to gather and submit data on all of their patients, data to which, as payers, they already have access, I may just give up, allow myself to miss out on pay incentives (i.e., pay a penalty), and miss an opportunity to improve the quality of care I give.

The private sector has moved in to fill this gap with companies such as WellCentive, which offers to help doctors gather and report data. The general idea is that to improve the quality of medical care and to hold down costs for insurance companies, doctors will purchase EHRs, pay for their upkeep and the extra personnel and hardware, and pay third parties such as WellCentive to gather and report the data, all to avoid the penalty of reduced payments, penalties that for many of us aren't nearly as onerous as the process of avoiding them.

We need to use information technology to help improve safety, costs, and quality of care. But to put the burden directly on the shoulders of doctors, distracting them from patient care, is insane. If we are serious about this, the market, red in tooth and claw, is not the only solution. Until we take a systemic, serious approach to safety, cost, and quality, we will continue to have nonsensical medical practices designed around forms and incentives rather than efficient, data-driven care.

Peter A. Lipson, ACP Member, is a practicing internist and teaching physician in Southeast Michigan. After graduating from Rush Medical College in Chicago, he completed his internal medicine residency at Northwestern Memorial Hospital. This post first appeared at his blog, White Coat Underground. The blog, which has been around in various forms since 2007, offers "musings on the intersection of science, medicine, and culture." His writing focuses on the difference between science-based medicine and "everything else," but also speaks to the day-to-day practice of medicine, fatherhood, and whatever else migrates from his head to his keyboard.

1. Not surprising. I've been telling people for years that there is no great evidence for any particular visit interval. My understanding: the "yearly visit" was invented by those great protectors of American health and welfare, the insurance companies.

2. The studies that show little effect of the annual physical, i.e. the ones which have gotten the most press recently, are measuring the wrong endpoint. Sometimes people like to see their doctor to maintain the relationship, so that the MD is there if they are needed.

3. Doctor-patient communication is not yet as good as it should be across the board. Nor is the visit optimized to get the most out of the relationship. When (if!) those improvements occur, we will move the needle on the next go-round of such systematic reviews.

Zackary Berger, MD, ACP Member, is a primary care doctor and general internist in the Division of General Internal Medicine at Johns Hopkins. His research interests include doctor-patient communication, bioethics, and systematic reviews. He is also a poet, journalist and translator in Yiddish and English. This post originally appeared at his blog.

The U.S. will need nearly 52,000 more primary care physicians by 2025, mostly due to population growth and an aging population, although expanded health insurance will also play a role, researchers concluded.

Researchers used the Medical Expenditure Panel Survey to calculate the use of office-based primary care in 2008 and then applied Census projections and the American Medical Association's Masterfile to calculate the current number of visits per physician and how that would change through 2025.

Results appeared in the November/December issue of Annals of Family Medicine.

Office visits to primary care physicians could increase from 462 million in 2008 to 565 million in 2025. Population growth of 15.2% could require 33,000 additional physicians, population aging (the number of those 65 years and older will grow by 60%) could require 10,000 more, and insurance expansion in 2014 and 2015 could require more than 8,000 additional physicians.
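The component estimates roughly sum to the headline figure, and the visit projections imply the annual workload per additional physician. A back-of-the-envelope check (a sketch of the arithmetic, not the authors' actual model):

```python
# Projected additional office visits, 2008 -> 2025, in millions.
extra_visits = 565 - 462  # 103 million

# Component estimates of additional physicians needed.
components = {
    "population growth": 33_000,
    "aging": 10_000,
    "insurance expansion": 8_000,
}
total = sum(components.values())
print(total)  # prints 51000, close to the ~52,000 headline figure

# Implied annual visits handled per additional physician.
print(round(extra_visits * 1_000_000 / 52_000))  # roughly 2,000 visits/year
```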

The authors wrote, "A rich source of additional primary care physicians exists in the current internal medicine pipeline. The number of internal medicine residents choosing primary care has decreased with most subspecializing. Increasing the number of internal medicine residents pursuing primary care would increase primary care physicians at no additional cost. The [Affordable Care Act] ACA included provisions to increase the attractiveness of primary care. Proposed increases for primary care physician reimbursement from Medicare and Medicaid, emphasis on patient-centered medical homes, and outlines for a national primary care extension service, if funded and implemented, would help support a satisfied and productive primary care physician workforce."

Recently, another study concluded that medical schools are churning out enough graduates to meet the goal of increasing enrollment 30% by the year 2016. But there aren't enough residency slots for them all, creating bottlenecks, said ACP's executive vice president and CEO, Steven Weinberger, MD, FACP.

I am presently really excited about learning all over again what I thought I knew when I finished my medical education about 25 years ago. Since that time I have become wiser, learning how to do things and what works for patients by practicing medicine and reading literature. I also retain a body of knowledge that I absorbed from my grand old doctor professors at Johns Hopkins which is sacred and dear and not necessarily true.

Just recently in my e-mail I got an invitation from the makers of the MKSAP (the Medical Knowledge Self-Assessment Program which I used in studying for my internal medicine boards) to answer a set of not-ready-for-primetime questions in the various subspecialties, for which I will be rewarded with a chance to get the next MKSAP materials for free. I must answer these questions without using outside materials and the answers from all of the folks who do this will be used to standardize the test.

I took the endocrinology section first and had an answer for each of the questions, based on what has been true over the last two decades. I then looked on UpToDate, the online resource that is updated constantly by recognized experts in every field, to find whether I had been right, and yes, sometimes I was right. But the answers I found didn't necessarily even correlate with the multiple choice answers, obviously also written by the world's experts. So on subjects about which it is critical to do the right thing, it is really not clear what that is.

When I graduated from medical school I knew the right answer to questions of the ilk for which there might be a right answer. Like "Is chemotherapy helpful for pancreatic cancer?" or "What are the most effective antibiotics for a simple urinary tract infection?" At some unclear moment in time, those and many other answers that I knew were no longer correct.

While cruising UpToDate I chanced upon a page called "Practice Changing UpDates" in which I found that a whole bunch of things that we do are wrong. I always feel warm inside when I find out that something that had seemed unnecessarily painful or expensive or complex is of no value. I wonder, though, how soon these new recommendations will also be wrong and when, perhaps, the previous ones will be right again, or whether the whole thing is a huge oversimplification and everything we've ever done was perhaps right, given the appropriate circumstances.

But a larger issue, for me, is the fact that it is now completely impractical to be an expert in the field of medicine, unless perhaps the field of knowledge in which one aspires to have wisdom is itty bitty. Research is happening so very fast, communication is nearly instantaneous, and discussion amongst the many diverse practitioners who very much have a right to their own educated and experienced opinion is limited.

For instance, last night I heard another physician tell a patient that he shouldn't drink so much coffee because he was having heart arrhythmias and the coffee would make it worse. He had heard a cardiologist say this and berate another physician for allowing a patient with a heart attack to have a cup of coffee. Studies show that coffee doesn't cause heart arrhythmias and that it is in general good for people in large quantities, reducing risk of liver disease, diabetes and all sorts of realms of misery. But certainly the studies that purport to show that coffee is of no harm and nearly infinite help are not designed to look at this particular individual's risk from caffeine, which definitely can cause an increase in heart rate (check your own pulse after a strong cup if you are not a habitual drinker). What is true, then, about coffee, or anything else for that matter?

But all that said, I do think the news from UpToDate as of September 20, 2012 is pretty interesting, if not necessarily true.

Of the clinical pearls in the article, three stand out as particularly relevant to my practice. First, people with allergy to eggs CAN get a flu shot, even though it is made with eggs, because there is hardly ever any problem. They should be observed for 30 minutes where there are personnel capable of handling an allergy issue after vaccination, but they can go ahead and be vaccinated. The question of how important vaccination is for healthy adults and children is, of course, not addressed, and is still very controversial.

The second is that UpToDate recommends use of Pradaxa (dabigatran), Xarelto (rivaroxaban) or apixaban for prevention of strokes in patients with atrial fibrillation rather than Coumadin (warfarin). I have written several articles about these new drugs, which reduce the ability of the blood to clot and do not require monthly blood test monitoring. They are not easily reversible should abnormal bleeding occur, but honestly neither is warfarin, and the risk of bleeding is so much higher with it because of all of its drug and food interactions and its tendency to be taken wrong. The new drugs are more expensive, but with the expense of monitoring and paying for the morbidity from bleeding or clotting when using warfarin, the costs will end up being similar, and much less once there are generic options. There have been studies looking at various possible risks of the new drugs, including more heart attacks with the use of Pradaxa (dabigatran), but the vast magnitude of error-related illness with warfarin dwarfs these risks.

And still. After all of my ranting, I am absolutely positive that many patients will still find that warfarin is the best drug for preventing clotting. There are many people whose doses are always perfect, who have absolutely no problems, who monitor their own blood tests at home without difficulty, and for whom the drug is, in fact, cheap. I could go on with pros and cons for a very long time, but I won't.

I would really like to see these new anticoagulants replace most of the injectable anticoagulants such as enoxaparin and dalteparin since this will profoundly change the way we treat patients with artificial heart valves and blood clotting disorders such as pulmonary embolism.

Last on the "Oh, cool. Finally." list is that it is unnecessary to follow liver function tests for people taking statin drugs such as Lipitor (atorvastatin), Crestor (rosuvastatin) and simvastatin. They are not liver toxic. We thought they were and they aren't. Again, as with the flu shot, the question of whether so many people should be taking these drugs goes unasked, but at least they don't need to get blood tests all the time. They also don't need to get their cholesterol levels checked all the time if they are on a dose that is stable and works, but that isn't part of the article, just information from long ago that still hasn't made its way into standard medical practice.

Janice Boughton, MD, ACP Member, practiced in the Seattle area for four years and in rural Idaho for 17 years before deciding to take a few years off to see more places, learn more about medicine and increase her knowledge base and perspective by practicing hospital and primary care medicine as a locum tenens physician. She lives in Idaho when not traveling. Disturbed by various aspects of the practice of medicine that make no sense and concerned about the cost of providing health care to every American, she blogs at Why is American Health care so expensive?, where this post originally appeared.

Internal medicine physicians are specialists who apply scientific knowledge and clinical expertise to the diagnosis, treatment, and compassionate care of adults across the spectrum from health to complex illness.
ACP Internist provides news and information for internists about the practice of medicine and reports on the policies, products and activities of ACP. All published material, which is covered by copyright, represents the views of the contributor and does not reflect the opinion of the American College of Physicians or any other institution unless clearly stated.