Thursday, December 29, 2011

How's the second week of winter break going for you? How many times have you screamed at your kids? Fed them Christmas candy for breakfast? Confiscated your toddler's new set of horns, a gift from your relative with a grudge?

I've found a surefire way to cheer myself up whenever I feel exhausted and guilty at the end of a Bad Mommy day. No, I don't look through my kids' "You're the best mom in the world!" cards -- you do realize their teachers force them to write that. No, I cope by thinking about other women who are much worse mothers than I am. I'm talking celebrity moms. Moms who are impossibly rich and beautiful but can't be bothered to whip up some Kraft mac n' cheese. Moms like Britney Spears, who not only smokes around her kids, but also allows them to play with her cigarettes and lighter.

It's probably not fair of me to pick on poor Britney. If the paparazzi were tailing my family, they'd probably catch my kids setting the dog on fire. And I wouldn't look half as good in that bikini.

Keeping your child healthy should be a powerful motivation to quit, but is this protective instinct enough to overcome the addiction? A meta-analysis published in Pediatrics this week found that programs that counsel parents on the dangers of secondhand smoke do increase the chances of quitting successfully, though the benefit was small: 23% in the intervention groups quit, compared to 18% in the control groups. Interestingly, parents of children over the age of 4 were more likely to quit with counseling, while parents of younger children weren't. The authors speculated that despite the increased risk of SIDS, mothers with newborns may be less able to stop smoking during this particularly stressful time. Another possible reason is that as kids get older, parents may be more motivated to model healthy behaviors.
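
For the data nerds among you, those quit rates lend themselves to a classic EBM back-of-envelope calculation: the absolute risk difference and the "number needed to treat" (NNT). This is my own arithmetic on the numbers quoted above, not an analysis from the paper itself. A minimal Python sketch:

```python
# Back-of-envelope EBM arithmetic (my numbers from the text above,
# not the paper's own analysis).

quit_rate_intervention = 0.23  # 23% quit with counseling
quit_rate_control = 0.18       # 18% quit without

# Absolute benefit of counseling (risk difference)
arr = quit_rate_intervention - quit_rate_control

# NNT: how many parents you'd need to counsel for one extra successful quit
nnt = 1 / arr

print(f"Absolute benefit: {arr:.0%}, NNT = {round(nnt)}")
```

In other words, counsel about 20 smoking parents and one extra parent quits for good. Modest, but for a cheap intervention, not bad.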

So for those of you still trying to quit, stick with it. Make a New Year's resolution. Ask your pediatrician to give you a pep talk at your child's next visit, and make an appointment with your own doctor to see if you should be prescribed any medication to help you quit. Don't be surprised if this is the hardest thing you've ever had to do. In fact, the tobacco companies are on to you -- there are more cigarette ads in January and February than at any other time of year. If you relapse, don't beat yourself up. It takes the average smoker eight tries before she's able to quit for good.

Wednesday, December 21, 2011

As a working mom, I accept that my toddler is going to be exposed to drool, snot and microscopic fecal contamination from his fellow daycare inmates. But I draw the line at pus. So imagine my dismay when one of my son's caregivers pulled me aside and said, "I get a lot of boils. Would you mind taking a look?" Whereupon she rolled up her shirt, revealing a lovely specimen, which fortunately had already burst and dried up. Most boils and abscesses are caused by Staphylococcus aureus, and in my area, about 60% are methicillin-resistant.

MRSA (along with some forms of strep) is commonly described as the "flesh-eating bacteria" in the media. While MRSA can result in serious, life-threatening infections, more often it causes nettlesome skin infections that may require incision and drainage and treatment with specific classes of antibiotics. The classic presentation is that of a "spider bite," sans spider.

Source: Dermatlas.org

It used to be that MRSA was seen primarily in hospitalized patients, but in the past 10 to 15 years, we've seen a meteoric rise in a particular strain in the community. MRSA is contagious, and pediatric outbreaks have been described in daycare centers and on sports teams.

So what can you do to protect your child? Unless you plan on raising a bubble boy or girl, MRSA is not entirely preventable. It's best to avoid sharing sweaty sports equipment and towels, which are often colonized. And there is another thing you can do to reduce the risk: Avoid unnecessary antibiotics. Antibiotics wipe out the good bacteria with the bad, allowing resistant strains to flourish. And many conditions frequently treated with antibiotics, such as ear infections, tend to resolve on their own anyway.

A recently published study looked at all the MRSA diagnoses in kids from 400 general practices in the U.K., and compared them to same-age controls. They then looked at the kids' exposure to antibiotics 1 to 6 months prior to the MRSA infection. Children who were infected with MRSA were three times as likely to have received antibiotics during that time period as those who weren't infected. The more antibiotics received, or the stronger the antibiotic (i.e., those with the broadest spectrum of activity), the stronger the association. Of course, it's possible that a child receiving multiple antibiotics is just more prone to infections, and the antibiotics per se are not causing the MRSA. The authors, however, still found a correlation after controlling for baseline diseases such as diabetes and asthma.

I admit I was caught flat-footed by this curbside consult, and I ended up advising my son's caregiver to see her own doctor. I told her a little about decolonization protocols, which involve bathing with antiseptics, taking antibiotics and lacing your nostrils with Bacitracin. Unfortunately, unless you place all your clothes, bedding and pets on a bonfire*, re-colonization is the norm, so these protocols aren't often used. Though I told her she had a bacterial infection, I avoided using the M-word in front of the other parents. I also didn't recommend staying home during her outbreaks, though I did suggest she cover up her boils with gauze. Afterwards I tried to hand off my kid to the other providers as discreetly as possible. This happened many years ago, and I still wonder whether I did the right thing.

Tuesday, December 13, 2011

One of the enduring myths of Christmas (aside from the man with the bag) is that eating all those candy canes will make your kids go bonkers. There are certainly observational studies showing that sugar ingestion leads to inattention and impulsivity. But if you look at only the well-designed, double-blind, placebo-controlled studies (there have been at least a dozen), not one has found that sugar has a deleterious effect on behavior in kids, even in those with attention deficit-hyperactivity disorder (ADHD).

The definitive study was performed in normal preschoolers, as well as 6- to 10-year-olds whose parents had identified them as "sugar sensitive." The care involved in the design of this trial was remarkable. A dietitian supervised the removal of all the food from the home, except for coffee and alcohol "as long as they were not consumed by the children." The families were then provided with meals for the next nine weeks. The experimental diets, which rotated every three weeks, included one that used sucrose (sugar) as a sweetener, one that used aspartame (Nutrasweet), and one that used saccharin. The families were not told the hypothesis, or what substitutions were made. In fact, the investigators created sham diets that changed every week, to throw the parents off the scent. One sham diet, for instance, consisted mainly of red and orange foods (although no artificial food coloring or additives were allowed). Every three weeks, children underwent tests of their memory, attention, motor skills, reading and math performance. Interviewers also surveyed parents and teachers about their kids' behavior.

The blinding was near perfect; only one parent correctly identified the sequence of diets. Even though 48 tests and surveys were conducted per child, almost none found a difference in cognition or behavior among the three diets. The one exception? Children on the sugar diet scored significantly better on the cognition portion of the Pediatric Behavior Scale. I'm tempted to use this as a post hoc justification for letting my kids eat Cap'n Crunch, but it's probably just a chance finding.

Nonetheless, some parents continue to insist that sweets make their kids hyper. Once you believe something, you're more likely to see it. In one study, thirty-five 5- to 7-year-old boys who were reported to be sensitive to sugar were randomized to two groups. In one group, the mothers were told their sons would receive a sugary drink; in the other group, they were told that they would receive a Nutrasweet drink. They then videotaped the boys playing by themselves and with their mom. The boys also wore an "actometer" on their wrists and ankles as an objective measure of their activity level.

Here's the twist: Both groups actually received Nutrasweet. As expected, there was no significant difference in the boys' activity levels by videotape review or actometer readings. But the mothers who thought their sons consumed sugar reported significantly more hyperactivity during the play session than those who knew they were drinking Nutrasweet. The videotape reviewers (who were blinded to the intervention) also found that the mothers who thought their boys drank sugar were more likely to hover around them, yet they scored lower in warmth and friendliness. It was kind of a mean study, if you think about it. First, they lied to the moms, then they slammed them for being more vigilant.

So why, despite the plethora of data to the contrary, has the sugar myth persisted? It's possible that something else in the sweets, such as food coloring or caffeine, causes hyperactivity. And think about it: When do we let our kids consume copious amounts of sugar? On birthdays, Halloween, and Christmas -- all recipes for going a little nuts.

Thursday, December 8, 2011

My kids' favorite cereal is Cap'n Crunch Crunch Berries. Hey, I'm a health-minded mom; I make certain that they get a serving of fruit every morning. Sure, Crunch Berries might not have the same antioxidant, cancer-fighting properties as acai berries. (Amazonian natives don't get cancer, so it must be true!) But I sure ain't gonna feed 'em the fruitless stuff.

They were rooting around in the pantry for their fix when they came across this old cereal container:

I tried to pass off the larvae* as a science experiment/proof of spontaneous generation/Christmas surprise, but my kids would have none of it. I was tempted to call the Captain himself to complain about the inaccurate nutrition labeling, when my husband discovered another little squirmer in a half-opened box of Cinnamon Life. Somehow finding one larva in your cereal is way more disturbing than finding a colony of them.

No doubt some of you are more disgusted that I let my kids eat sugary cereal than by the fact that my pantry is an insect zoo. And indeed, you would be in the right. There are no studies examining the larval content of sugary cereals, but there is a study showing that there is (brace yourself) sugar in sugary cereal.

The Environmental Working Group released a report this week on the sugar content of 84 popular breakfast cereals. Only one in four met the U.S. government's guideline of having less than 26% added sugar by weight. The worst was Kellogg's Honey Smacks, followed closely by Post Golden Crisps (formerly known as Sugar Smacks; there used to be truth in advertising). Cap'n Crunch Crunch Berries came in 9th, with 42% added sugar. A cup has 11 grams of sugar, which is less than a Twinkie but more than two Oreos. You can imagine how upset I was when I read that. I've since reformed my ways, and this is what I now serve my kids in the morning:

Chocolate's an antioxidant, isn't it?

*They weren't maggots; maggots eat meat, not fake berries. I have no idea what these larvae would have metamorphosed into. (Any entomologists among my readers?) We sprayed them with Raid, squished them, burned them, and scattered their ashes in a lovely forest glen.

Tuesday, December 6, 2011

Warning: This is one of my wonkier postings. Read on if you'd like to learn more about the supposed science of economic analysis, and how it shapes healthcare policy.

How much would you pay to keep this little critter away from your child?*

I don't have a compelling personal anecdote about meningitis, and I hope I never do. Meningococcus is one of the more common causes of meningitis, and this bug gives even hardened doctors and nurses the heebie-jeebies. For one thing, it spreads by close contact, so members of the same household, or healthcare workers exposed to secretions, must take antibiotics to ward off the same fate. And if you don't die from meningococcus, you could end up with brain damage or multiple limb amputations, since one of the complications is gangrene.

So we should be thrilled that there's a vaccine against the most common serotypes that cause disease in adolescents and young adults, who are particularly susceptible to this infection. In the past, a single dose at age 11 or 12 was thought to be protective for 10 years, but recent studies have found that immunity lasts for only five. Last week, the American Academy of Pediatrics issued a statement recommending a second, booster dose for 16-year-olds.

No one argues that adding a booster won't save lives. But is it worth the extra cost, given that meningococcus is still a relatively uncommon disease? Already, kids routinely receive about 30 shots in their childhood -- double the number back in 1980. An editorial in the New England Journal of Medicine argued that "routine adolescent [meningococcal vaccine] does not provide good value for money, largely because of low disease incidence rates and relatively high vaccine cost."

You might argue that you can't put a price on a human life, but it turns out you can. Economic analysis is the science of quantifying the cost of healthcare interventions, but as you'll see, there are a lot of smoke and mirrors involved.

Let's start off with the basics. One way to measure the cost-effectiveness of a vaccine (or a pill, or seatbelts, or virtually anything) is to express it in dollars per life-year saved. You can see right away that if you had a vaccine that was equally effective across all age groups, it would be cheaper to save the life of a baby than the life of a 70-year-old, since you could potentially add 80 years to the baby's life, but only 10 years to Grandpa's. It doesn't mean the baby's life is worth more, only that the vaccine is a bargain when given in infancy.
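
To make the baby-versus-Grandpa point concrete, here's a toy calculation. Every number in it is invented purely for illustration (a real analysis would use actual program costs and mortality data):

```python
# Toy illustration of "dollars per life-year saved."
# All numbers are invented for illustration only.

program_cost = 1_000_000  # hypothetical cost of a vaccination program
deaths_averted = 2        # hypothetical deaths prevented in each cohort

def cost_per_life_year(cost, deaths, years_gained_each):
    """Program cost divided by total life-years added."""
    return cost / (deaths * years_gained_each)

# An averted infant death adds ~80 life-years; an averted
# 70-year-old's death adds ~10.
print(cost_per_life_year(program_cost, deaths_averted, 80))  # 6250.0
print(cost_per_life_year(program_cost, deaths_averted, 10))  # 50000.0
```

Same vaccine, same lives saved, but per life-year the infant program looks eight times cheaper.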

Some illnesses rarely cause death, but there may still be value in preventing them, to avoid complications, hospitalizations or lost productivity. So most economists use the measure of dollars per quality-adjusted life-year, or QALY, saved. How do economists quantify quality? Simple: They ask patients, "If 1 is the value of a perfectly healthy life, and 0 is death, how would you rate having this condition?" Suffering through a cold might be 0.999, while being hooked up to a ventilator and feeding tube might be 0.1. (There are no negative numbers in quality-of-life estimates, though there are probably some fates worse than death.) Already, you can see one of the inherent problems with economic analysis -- quality is an extremely subjective measure.
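
Mechanically, the quality adjustment just means each year of life gets multiplied by its utility weight before you divide cost by the result. Here's a sketch using the utility values from the paragraph above, with a made-up intervention cost:

```python
# Sketch of a dollars-per-QALY calculation. The utility weights come
# from the examples in the text; the $40,000 cost is invented.

def cost_per_qaly(cost, years, utility_gain):
    """Cost divided by quality-adjusted life-years gained."""
    return cost / (years * utility_gain)

# Suppose a $40,000 intervention keeps someone off a ventilator
# (utility 0.1) and in full health (utility 1.0) for 10 years.
qalys_gained = 10 * (1.0 - 0.1)  # 9 QALYs
print(round(cost_per_qaly(40_000, 10, 1.0 - 0.1)))  # ~4444 dollars per QALY
```

Notice how sensitive the answer is to that subjective utility number -- nudge 0.1 to 0.3 and the "cost-effectiveness" changes by a third.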

The other methodologic difficulties with this type of research involve knowing what to include in the accounting of costs and benefits, and which estimates to use. Do you analyze the economics from the individual's standpoint, or society's, or the third-party payer's? Each analysis is specific to its country; you can't take an analysis from, say, Singapore, and apply it in the U.K. Although our body of scientific knowledge is constantly changing, economic analyses become rapidly outdated as costs fluctuate. Economic analysis for vaccines is especially tricky, since you have to take herd immunity into account. In other words, the benefits of immunization may extend beyond the immunized.

One popular myth is that a "cost-effective" intervention saves money. In fact, most modern prevention and treatment measures don't save money at all. The biggest exception? Almost all routine early childhood immunizations, such as the measles and polio vaccines, save money. The same isn't true, though, for the newer vaccines targeting tweens and teens. Why is that?

Well, for one thing, adolescents are a hardy group. Their immune system is stronger than infants', and when they do die, it's often a result of their own stupidity -- think of texting while driving. On top of that, there's no loss in productivity when they're sick. (Insert your own lazy teenager joke here.) An adult takes time off from work for illness, and a parent needs to stay home with a sick toddler, but a jobless 16-year-old with the flu can fend for himself. And then there's the fact that the newer vaccines aimed towards this age group are a lot more expensive than the older ones. So let's look at the cost-effectiveness of some of these vaccines in the U.S.:

Meningococcus is one of the more expensive, at $88,000 per QALY saved. Adding the booster dose ends up at about the same cost per QALY, since even though you double the cost, you also save more lives.

Human papillomavirus (HPV) wasn't too bad, at $15,000 to $24,000 per QALY, although it's much more expensive to vaccinate boys than girls, since cervical cancer is more common than penile or anal cancer, and reducing HPV in girls should reduce the frequency of screening and treatment of pre-cancerous lesions.

Hepatitis A ranged from cost-saving in college freshmen to $40,000 per QALY in 15-year-olds. (The wide range should clue you in to the fragility of these economic models.)

The cheapest vaccine? Pertussis booster, at the bargain basement price of $6,300 per life-year saved. Outbreaks of pertussis, or whooping cough, have been linked to waning immunity in adolescents and adults, and while whooping cough is not particularly dangerous to older kids, it's very contagious and can kill unimmunized newborns. Much of its cost-effectiveness derives from herd immunity and the fact that pertussis is an older, cheaper vaccine. Middle school students in California are now required to get the pertussis booster.

Of course, these numbers give the illusion of hardness to a science that's based on the softest of data. And what is the definition of a "cost-effective" intervention anyway? By convention, interventions costing up to $50,000 per QALY saved are considered cost-effective. There's no logical basis for that threshold; it simply appears in the literature, and it hasn't budged in the past two decades, despite inflation. And $50,000 may be a year's salary for one family, or the price of a car for another. But that's the figure in the minds of policy makers when they try to decide whether a new treatment should be covered by insurance.

Here's another way of looking at the numbers: The NEJM editorial laments that the public-sector cost of immunizing one child until adulthood (not including annual flu vaccines) is about $1,450 for males and $1,800 for females. I was surprised to see that this number was so low. After all, we spend much more than that educating and clothing our children. Heck, $100 a year is less than my caffeine budget. Shouldn't we be spending at least that much to keep our kids healthy?

*For you microbiologists out there, this is technically gonorrhea -- but it's in the same family of bacteria. Giant Microbes apparently found there's a bigger market for an STD than for meningitis.

About Me

My name is Stephanie, and I'm the happy but tired mother of two boys (ages 8 and 1) and a girl (age 6). I'm also a general internist who practices in a public teaching hospital in California, and the editor of a medical education website, ProfessorEBM.com. My passion is teaching about evidence-based medicine (EBM) to doctors-in-training. EBM involves critically reading the medical literature and applying it appropriately to patient care. I thought it would be fun and enlightening to examine firsthand the evidence on how best to parent kids. My mission is to debunk bad science and to highlight the gaps in our medical and psychosocial knowledge. But first, a warning: I don't treat children, and my take on the research may or may not apply to your particular kid. Reading this blog shouldn't be a substitute for talking to your pediatrician. Heck, I don't even follow my own advice half the time! Enjoy.