A review in the Journal of Drugs in Dermatology notes oatmeal has been used for centuries as a topical soothing agent on the skin to relieve itch and irritation in dermatology. Of course, that was coming from Johnson & Johnson, which sells a brand of oatmeal lotion. But if it helps with dry skin or a bug bite, I can imagine it having some soothing quality. One study out of Georgetown University, though, shocked me.

There's a class of chemo drugs, like Cetuximab, that can cause an awful rash. Various treatments have been tried and failed. There was no clear preventive or curative treatment for this eruption, until this remarkable study, which you can see in my Oatmeal Lotion for Chemotherapy-Induced Rash video.

The researchers had heard about a study where human skin fragments from plastic surgery were subjected to an inflammatory chemical, and adding an oatmeal extract appeared to help. Of the ten patients with chemo rashes who the researchers were able to get to try some oatmeal lotion, six had a complete response, and four a partial response, giving an overall oatmeal response rate of 100%.

Doctors wrote in from around the world. Significant improvement in all patients seemed too good to be true, but out of desperation they tried it and got the same astonishing results. Oatmeal--a simple topical agent producing such spectacular benefit where more complex therapies have failed. In an age when ever more expensive treatments are consistently being championed, it would be a great pity if this inexpensive, natural approach to relieving distressing symptoms were to be overlooked.

Ironically, two of the cancer cell lines found resistant in vitro to this type of chemotherapy were found to be sensitive to avenanthramides, which are unique phytonutrients found in oats, suggesting that people should be applying oatmeal to their insides as well.

If oatmeal is so powerful that it can clear up some of the ravages of chemotherapy just applied to the skin, what might it do if we actually ate it? That's the subject of my video Can Oatmeal Help Fatty Liver Disease?.

Cetuximab is often given for metastatic colorectal cancer. Better to try to prevent the disease in the first place:

Acne is an epidemic skin disease of industrialized countries, reaching prevalence rates of over 85 percent among teenagers. In nearly half of American men and women, acne even continues after adolescence and into the third decade of life.

Acne is considered a disease of Western civilization, as in places like Okinawa, Japan, acne is rare or even nonexistent. So acne is not some "physiological" phenomenon of puberty, but may represent "a visible risk indicator pointing to aberrant nutrient signaling promoting chronic epidemic diseases of civilization," according to a group of German researchers (See Saving Lives By Treating Acne With Diet). What they mean is that the dairy, junk foods, meat, and egg proteins in Western diets all conspire to raise the activity of the enzyme TOR, contributing to acne and obesity. Therefore, using diet to suppress TOR may not only improve acne, but may also prevent the march to more serious chronic TOR-driven diseases of civilization. The excessive TOR stimulation induced by the standard American diet may initially manifest as premature puberty and acne, but then may later contribute to obesity, diabetes, cancer and Alzheimer's.

A lot of this research is relatively new. Until recently, for example, only a weak association had been accepted for the role of milk and dairy products in acne formation. However, there is now substantial evidence supporting the effects of milk and dairy products as enhancers of acne aggravation. Milk is not just food, but appears to represent a most sophisticated hormone signaling system activating TOR, which is of critical concern given that TOR is recognized as the fundamental driving force for a number of serious chronic diseases.

If milk is naturally supposed to stimulate TOR, why the problem? Because we're drinking milk from the wrong species. Cow's milk is designed for calves. Baby cows grow nearly 40 times faster than human infants. Cow's milk has three times more leucine, the primary activator of TOR, than breast milk, so cow's milk may over-stimulate TOR when consumed by humans. It's like giving donkey milk to rats--it doesn't make sense. Furthermore, milk is for babies, so the continued consumption of any kind of milk during adolescence and adulthood is something that never really happened naturally and may have long-term adverse effects on human health.

In this regard, it's kind of frightening to realize that more than 85 percent of teens in Western countries exhibit acne; it implies that the "majority of our population is living with over-activated TOR signaling, a major disease-causing factor, which may pave the way for the development of other more serious diseases." A history of acne has been associated with breast cancer risk in women, for example, and prostate cancer in men.

Early dietary counseling of teenage acne patients is thus a great opportunity for dermatologists, who will not only help to improve acne but may reduce the long-term adverse effects of the Western diet on more serious TOR-driven diseases. Just like urologists use erectile dysfunction as an opportunity to save lives by putting people on heart-healthy diets, dermatologists can use acne as a way to save lives by putting people on a cancer-prevention diet.

How do you turn acne on and off via dietary manipulation of TOR? A "comprehensive dietary strategy to treat acne can only be achieved by higher consumption of vegetables and fruit and reduction of animal-derived food" given preliminary evidence for the effectiveness of natural plant-derived TOR inhibitors in the treatment of acne.

The Washington State Fruit Commission, representing our largest cherry-producing state, can fund reviews that cherry-pick studies on the anti-inflammatory effects of cherries in a petri dish and animal models. But what we've needed are human studies. For example, if we stuff the human equivalent of up to a thousand cups of cherries down the throats of rats, it appears to have an anti-inflammatory effect, but we could never eat that many. (In fact, if we tried, it could end badly. One poor guy who ate 500 cherries whole--without spitting out the pits--ended up fatally obstructing his colon.)

A decade ago, we didn't have many human studies, but thankfully now we do. A study published in The Journal of Nutrition had men and women eat about 45 cherries a day for a month (I wouldn't mind being part of that study!). The researchers found a 25% drop in C-reactive protein levels (a marker of inflammation), as well as a drop in an inflammatory protein with the inelegant acronym RANTES ("Regulated on Activation, Normal T cell Expressed and Secreted"). Even a month after the study ended, there appeared to be residual anti-inflammatory benefit from the cherry fest.

These subjects were all healthy, with low levels of inflammation to begin with, but a follow-up study, highlighted in my video, Gout Treatment with a Cherry on Top, on folks with higher levels found similar results for C-reactive protein and for a number of other markers for chronic inflammatory diseases. Do cherries then help people who actually have a chronic inflammatory disease?

Back in 1950, in an obscure Texas medical journal, "observations made by responsible physicians" suggested that in a dozen patients with gout, eating half a pound of fresh or canned cherries helped prevent flares of gout. But the issue had never seriously been tested, until recently. Gout is an excruciatingly painful inflammatory arthritis caused by the crystallization of uric acid within joints. Based on the National Health and Nutrition Examination Survey 2007-2008, the prevalence of gout among US adults is estimated to be 3.9%, which translates into 8.3 million people.
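As a quick consistency check, those two survey figures fit together; this sketch derives the adult population they jointly imply (a number not stated in the text, so it is only an inference):

```python
# Check that a 3.9% prevalence and 8.3 million gout sufferers are consistent.
prevalence = 0.039
cases = 8_300_000

implied_us_adults = cases / prevalence
print(f"Implied US adult population: {implied_us_adults / 1e6:.0f} million")
# -> about 213 million, plausible for the 2007-2008 survey period
```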

Hundreds of gout sufferers were studied, and cherry intake was associated with a 35% lower risk of gout attacks, with over half the risk gone at three servings over a two-day period (about 16 cherries a day). That's the kind of efficacy the researchers saw with a low-purine diet (uric acid is a breakdown product of purines). This same research group found that purine intake of animal origin increased the odds for recurrent gout attacks by nearly five-fold. Heavy alcohol consumption isn't a good idea either.

There are some high-purine non-animal foods, like mushrooms and asparagus, but they found no significant link to plant sources of purines. So the researchers recommended eliminating meat and seafood from the diet. This may decrease risk substantially, and adding cherries on top may decrease risk of gout attacks even further. Same thing with the leading drug: allopurinol works, but adding produce appears to work even better.

Often, dietary changes and cherries may be all patients have, as doctors are hesitant to prescribe uric acid-lowering drugs like allopurinol due to rare but serious side-effects.

In addition to fighting inflammation, cherries may also lower uric acid levels. Within five hours of eating a big bowl of cherries, uric acid levels in the blood significantly drop. At the same time, antioxidant levels in the blood go up. So is it just an antioxidant effect? Would other fruit work just as well? No. Researchers tried grapes, strawberries, and kiwi fruit, and none significantly lowered uric acid levels, supporting a specific anti-gout effect of cherries.

There are some new gout drugs out now, costing up to $2,000 per dose and carrying a "risk of toxicity that may be avoided by using nonpharmacologic treatments or prevention in the first place." Given the potential harms and high costs, attention ought to be directed to dietary modification, reducing alcohol and meat intake, particularly sardines and organ meats. "If life serves up a bowl of cherries (consumed on a regular basis), the risk of a recurrent gout attack may be meaningfully reduced."

In a previous video, Alpha Gal and the Lone Star Tick, I started talking about a tick bite-induced meat allergy, called alpha-gal allergy, that is unlike any other food allergy we know. The most interesting feature of the reactions may be that the first symptoms can occur hours after eating meat. Normally, an allergic reaction to a bee sting, for example, happens within minutes. With this meat allergy, we could have a piece of bacon for breakfast and our throat wouldn't start closing off until the afternoon. Because the cause and effect are temporally separated, we often blame other factors, such as what we ate for lunch, or we just call it "spontaneous" or "idiopathic" anaphylaxis, which is just doctor-speak for "we have no idea what the cause is."

The delay likely occurs because the alpha-gal is probably absorbed along with the fat in meat, given that the allergic reaction occurring four to five hours after meat ingestion corresponds to the peak absorption time of fatty acids from the intestinal tract.

What makes the allergy even more difficult to diagnose is that the majority of victims experience only occasional overt reactions, despite regular meat consumption. Fattier meats, like pork rinds, may provoke episodes more consistently and severely, but still don't trigger a reaction every time.

Tick bite-induced meat allergy is on the rise. Ten years ago we didn't even know this thing existed, but now in tick-ridden states as many as 20% of the population have these anti-meat allergic antibodies (See Tick Bites, Meat Allergies, and Chronic Urticaria). And more and more people are coming in affected, though probably no more than 10% of those who test positive go on to experience hives or a serious allergic reaction to meat.

We're also seeing it more and more in kids. Researchers in Virginia have found that it is not uncommon, though identification of the allergy may not be straightforward. Unlike in adults, who frequently present with systemic reactions, the majority of children with this syndrome present with just skin manifestations, such as hives. However, this doesn't mean it's not serious. In fact, nearly half the kids ended up in the ER, and about 1 in 12 needed to be hospitalized.

Up to a quarter of the population breaks out in hives at some time in their lives, but some children can be affected for weeks or months. It can be triggered by infections, foods, drugs, or parasites, or be autoimmune, but in a large subset of cases we don't know what the trigger is, and so call it chronic "idiopathic" urticaria. It's a common thing pediatricians see. The only cure is avoiding and eliminating whatever is triggering it, but in three-quarters of cases we have no clue.

We now know that many children who had been diagnosed with mysterious hives or allergic reactions, and may have been specifically told that the reactions were not the result of a food allergy, may actually have been suffering from alpha-gal meat allergies. The serious nature of the reactions and the rising frequency of allergic swelling and hives across all age groups underscore the importance of identifying what's going on, and physicians should keep this new diagnosis in mind.

Allergies to meat might be more common than previously thought, as much as 2% of the population (which would mean millions of people). But to put this in context, Americans are much more likely to suffer an anaphylactic reaction due to seafood, tick bite or not, no matter where they live. A national survey of emergency rooms found shellfish was by far the most frequently implicated food, and unlike many other allergies, kids don't tend to outgrow fish and shellfish allergies.

Some fish allergies are actually allergies not to the fish itself, but to worms in the fish, like Anisakis, which are found particularly in cod, anchovies, and squid. Exposure to these parasites in fish, living or dead, is a widespread problem. In fact, we can even have an allergic reaction to the parasitic fish worm when we eat chickens that were fed fishmeal. This is one of the ways someone who's allergic to fish could get triggered by chicken.

Because of these worms, researchers recommend that people stop eating seafood and sushi altogether, because besides inducing allergic reactions, the worms may cause a leaky gut syndrome, which often goes unrecognized and can predispose someone to other, more important pathologies than just being itchy all over.

It is estimated that the human body consists of ten or so trillion cells. Almost all of these cells get turned over within approximately 100 days. That means we're like a new person every three months. We reinvent ourselves physically. And since we're physically made of air, water, and food--those are essentially the only inputs--we are what we eat, literally and physically. In a sense our body has to rebuild itself every three months with the building materials we deliver to it through our stomach. Our mouths are like the access road to the continual construction site of our body. Trucks roll in three times a day. What do we want them to deliver? Some shoddy cheap stuff we scrounged around for or bought at the discount outlets that's just going to fall apart? Or do we want to build our foundation solid? We are each walking inside the greatest known architectural structures in the universe. Let's not ruin such grand blueprints by consuming junk.

We only own the biological real estate we're born with, so if we need to rebuild every three months, we also need a wrecking crew. If we're replacing ten trillion cells every hundred days, that means we have to kill off about 100 billion cells every day. Out with the old, in with the new. We do that primarily through "apoptosis," pre-programmed cell death (from the Greek ptosis, meaning "falling", and apo, "away from"). For example, we all used to have webbed fingers and toes. Literally. Each one of us did in the womb until about four months, when apoptosis kicked in, and the cells in the webbing kill themselves off to separate our fingers.
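The turnover figures above are simple back-of-the-envelope arithmetic, which this sketch checks using the same estimates the text quotes:

```python
# Back-of-the-envelope check of the cell-turnover figures quoted in the text.
TOTAL_CELLS = 10 * 10**12      # ~10 trillion cells (the estimate used above)
TURNOVER_DAYS = 100            # near-complete turnover in roughly 100 days

cells_replaced_per_day = TOTAL_CELLS // TURNOVER_DAYS
print(f"{cells_replaced_per_day:,} cells replaced per day")
# -> 100,000,000,000 (about 100 billion), matching the figure in the text
```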

However, some cells overstay their welcome: cancer cells. They don't die when they're supposed to by somehow turning off their suicide genes. What can we do about that? Well, one of the ways the yellow pigment in curry powder kills cancer cells is by reprogramming the self-destruct mechanism back into cancer cells. Let me just run through one of these pathways.

FAS is a so-called death receptor; together with death receptor four and death receptor five, it recruits the Fas-associated death domain protein (FADD). FADD then activates caspase-8, which "ignites the death machine" and kills the cell. (To see the diagram of the pathway, go to my video Turmeric Curcumin Reprogramming Cancer Cell Death.) Where does curry powder fit into all this? In cancer cells, curcumin, the pigment in the spice turmeric that makes curry powder yellow, upregulates and activates death receptors (as shown in human kidney cancer cells, skin cancer cells, and nose and throat cancer cells).

Curcumin can also activate the death machine directly (as shown in lung cancer and colon cancer). Caspases are so-called "executioner enzymes," that when activated, destroy the cancer cell from within by chopping up proteins left and right--kind of like death by a thousand cuts.

And that's just one pathway. Curcumin can also affect apoptosis in a myriad other ways, affecting a multitude of different types of cancer cells. It also tends to leave normal cells alone for reasons that are not fully understood. Overall, researchers "showed that curcumin can kill a wide variety of tumor cell types through diverse mechanisms. And because curcumin can affect numerous mechanisms of cell death at the same time, it's possible that cancer cells may not easily develop resistance to curcumin-induced cell death like they do to most chemotherapy."

It is estimated that many tumors start around the age of 20. However, detection of cancer is normally around the age of 50 or later. Thus, it takes cancer decades to incubate. Why does it take so long? Recent studies indicate that in any given type of cancer, hundreds of different genes must be modified to change a normal cell into a cancer cell. Although cancers are characterized by the dysregulation of cell signaling pathways at multiple steps, most current anticancer therapies involve the modulation of a single target. Chemotherapy has gotten incredibly specific, but the ineffectiveness, lack of safety, and high cost of these monotargeted therapies have led to real disappointment, and drug companies are now trying to develop chemo drugs that take a multitargeted approach.

Many plant-based products, however, accomplish multitargeting naturally and are inexpensive and safe compared to drugs. Yet because drug companies are not usually able to secure intellectual property rights to plants, the development of plant-based anticancer therapies has not been prioritized. They may work (and work better, for all we know), and they may be safer, perhaps even risk-free.

If we were going to choose one plant-based product to start testing, we might choose curcumin, the pigment in the spice turmeric (the reason curry powder looks yellow). Before we start throwing money at research, we might want to ask some basic questions, like "Do populations that eat a lot of turmeric have lower cancer rates?" The incidence of cancer does appear to be significantly lower in regions where turmeric is heavily consumed. Population-based data indicate that some extremely common cancers in the Western world are much less prevalent in regions where turmeric is widely consumed in the diet.

For example, "overall cancer rates are much lower in India than in western countries." U.S. men get 23 times more prostate cancer than men in India. Americans get between 8 and 14 times the rate of melanoma, 10 to 11 times more colorectal cancer, 9 times more endometrial cancer, 7 to 17 times more lung cancer, 7 to 8 times more bladder cancer, 5 times more breast cancer, and 9 to 12 times more kidney cancer. This is not a mere 5, 10, or 20 percent more, but 5, 10, or 20 times more. Hundreds of percent more breast cancer, thousands of percent more prostate cancer--differences even greater than some of those found in the China Study.
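The "times more, not percent more" point is worth spelling out; this sketch just converts the fold-differences quoted above into percentage terms:

```python
# Convert "X times more" into "percent more" to show why fold-differences
# dwarf the percentage differences we usually hear about.
def fold_to_percent_more(fold):
    """A 23-fold rate means (23 - 1) * 100 = 2200% more."""
    return (fold - 1) * 100

# Fold values are the ones quoted in the text.
for cancer, fold in [("prostate", 23), ("breast", 5)]:
    print(f"{fold}x the {cancer} cancer rate = {fold_to_percent_more(fold)}% more")
```

This confirms the text's framing: 5-fold is hundreds of percent more, and 23-fold is thousands of percent more.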

The researchers in this study, highlighted in my video Back to Our Roots: Curry and Cancer, conclude: "Because Indians account for one-sixth of the world's population, and have some of the highest spice consumption in the world, epidemiological studies in this country have great potential for improving our understanding of the relationship between diet and cancer." The lower rates of cancer may, of course, not be due to higher spice intake. Several dietary factors may contribute to the low overall rate of cancer in India, among them a "relatively low intake of meat and a mostly plant-based diet, in addition to the high intake of spices." Forty percent of Indians are vegetarians, and even those who do eat meat don't eat much of it. And it's not only what they don't eat, but what they do eat. India is one of the largest producers and consumers of fresh fruits and vegetables, and Indians eat a lot of pulses (legumes), such as beans, chickpeas, and lentils. They also eat a wide variety of spices in addition to turmeric that constitute, by weight, the most antioxidant-packed class of foods in the world.

Population studies can't prove that dietary turmeric lowers cancer risk, but they can certainly inspire a bunch of research. So far, curcumin has been tested against a variety of human cancers, including colorectal cancer, pancreatic cancer, breast, prostate, multiple myeloma, lung cancer, and head and neck cancer, for both prevention and treatment. For more information on turmeric and curcumin, check out Carcinogen Blocking Effects of Turmeric Curcumin and Turmeric Curcumin Reprogramming Cancer Cell Death.

I'm working on another dozen or so videos on this amazing spice. This is what I have so far:

Aromatherapy -- the use of concentrated essential oils extracted from plants to treat disease -- is commonly used to treat anxiety symptoms. Anxiety disorders are the most prevalent class of psychiatric disorders in the general population. However, their treatment is challenging, because the drugs used for the relief of anxiety symptoms can have serious side effects.

Thankfully, credible studies that examine the effect of essential oils on anxiety symptoms are gradually starting to appear in the medical literature. However, in most of these studies, exposure to the essential oil odor was accompanied by massage. This makes it difficult to draw firm conclusions about the effect of the aroma itself.

A typical example is this study in which patients in the intensive care unit got foot massages with orange-scented oil the day after open-heart surgery. Why not back massages? Because they had just had their chests cracked open and have huge sternotomy wounds. Patients showed a significant psychological benefit from the aromatherapy massage.

But how do we know the essential oil had anything to do with it? Maybe it was just the massage. If that's the case, then great--let's give people massages! I'm all for more ICU foot rubs. "There is considerable evidence from randomized trials that massage alone reduces anxiety, so if massage is effective, then aromatherapy plus massage is also effective." One study where cancer patients got massaged during chemo and radiation even found that the massage without the fragrance may be better. The researchers thought it might be a negative Pavlovian response: the patient smells the citrus and their body thinks, "Oh no, not another cancer treatment!"

More recently the ambient odor of orange was tested in a dental office to see if it reduces anxiety and improves mood. Ambient odor of orange was diffused in the waiting room and appeared to have a relaxant effect--less anxiety, better mood, and more calmness--compared to a control group where there was no odor in the air. No odor, that is, except for the nasty dentist office smell. Maybe the orange scent was just masking the unpleasant odors. Maybe it had nothing to do with any orange-specific molecules. More research was necessary.

So in another study, highlighted in my video, Orange Aromatherapy for Anxiety, researchers exposed some graduate students to an anxiety-producing situation and tested the scent of orange, versus a non-orange aroma, versus no scent at all. The orange did appear to have an anxiety-reducing effect. Interestingly, the observed anxiety-reducing effects were not followed by physical or mental sedation. On the contrary, at the highest dose, the orange oil made the volunteers feel more energetic. So orange aromatherapy may potentially reduce anxiety without the downer effect of Valium-type drugs. Does that mean we can get the benefits without the side effects? I've talked about the concerns of using scented consumer products before, even ones based on natural fragrances (Throw Household Products Off the Scent), and there have been reports of adverse effects of aromatherapy.

Alternative medicine isn't necessarily risk-free. For example, there are dozens of reported cases of people having their hearts ruptured by acupuncture. Ouch.

But the adverse effects of aromatherapy were mostly skin irritation from essential oils applied topically or, even worse, swallowed. Certain citrus oils can also make your skin sensitive to sunlight.

When designing an antibiotic, we can't create a drug that destroys DNA, because that's something humans and bacteria share. It would kill bacteria, but it might kill us, too. Instead, many antibiotics work by attacking bacterial cell walls, which are something bacteria have that we don't.

Similarly, antifungals can attack the unique cell walls of fungus. Pesticides can work by attacking the special exoskeleton of insects. But fighting cancer is harder because cancer cells are our own cells. So fighting cancer comes down to trying to find and exploit differences between cancer cells and normal cells.

Forty years ago, a landmark paper was published showing for the first time that many human cancers have what's called "absolute methionine dependency," meaning that if we try to grow cells in a Petri dish without giving them the amino acid methionine, normal cells thrive but cancer cells die. Normal breast cells grow no matter what, with or without methionine, but cancer cells need that added methionine to grow.

What does cancer do with the methionine? Tumors use it to generate gaseous sulfur-containing compounds that, interestingly, can be detected by specially trained diagnostic dogs. There are mole-sniffing dogs that can pick out skin cancer. There are breath-sniffing dogs that can pick out people with lung cancer. Pee-sniffing dogs that can diagnose bladder cancer and--you guessed it--fart-sniffing dogs for colorectal cancer. Doctors can now bring their lab to the lab!

It gives a whole new meaning to the term pet scan :)

Methionine dependency is not just present in cancer cell lines in a Petri dish. Fresh tumors taken from patients show that many cancers appear to have a biochemical defect that makes them dependent on methionine, including some tumors of the colon, breast, ovary, prostate, and skin. Pharmaceutical companies are fighting to be the first to come out with a drug that decreases methionine levels. But since methionine is sourced mainly from food, a better strategy may be to lower methionine levels by lowering methionine intake, eliminating high methionine foods to control cancer growth as well as improve our lifespan (see Methionine Restriction as a Life-Extension Strategy).

Here's the thinking: smoking cessation, consumption of diets rich in plants, and other lifestyle measures can prevent the majority of cancers. Unfortunately, people don't do them, and as a result hundreds of thousands of Americans develop metastatic cancer each year. Chemotherapy cures only a few types of metastatic cancer. Unfortunately, the vast majority of common metastatic cancers, such as breast, prostate, colon, and lung, are lethal. We therefore desperately need novel treatment strategies for metastatic cancer, and dietary methionine restriction may be one such strategy.

So, where is methionine found? In my video, Starving Cancer with Methionine Restriction, you can see a graph of foods with their respective methionine levels. Chicken and fish have the highest levels. Milk, red meat, and eggs have less, but if we really want to stick with lower methionine foods, fruits, nuts, veggies, grains, and beans are the best. In other words, "In humans, methionine restriction may be achieved using a predominately vegan diet."

So why isn't every oncologist prescribing a low-methionine diet? One researcher notes that "Despite many promising preclinical and clinical studies in recent years, dietary methionine restriction and other dietary approaches to cancer treatment have not yet gained wide clinical application. Most clinicians and investigators are probably unfamiliar with nutritional approaches to cancer." That's an understatement! "Many others may consider amino acid restriction as an 'old idea,' since it has been examined for several decades. However, many good ideas remain latent for decades if not centuries before they prove valuable in the clinic....With the proper development, dietary methionine restriction, either alone or in combination with other treatments, may prove to have a major impact on patients with cancer."

Why might the medical profession be so resistant to therapies proven to be effective? The Tomato Effect may be partially to blame.

Inadequate fruit and vegetable consumption may kill millions around the globe every year, so the public health community is not beyond "appealing to vanity."

How can we tell if someone's healthy? Look for that golden glow that comes from the carotenoids in fruits and vegetables, found to increase the attractiveness of African, Asian, and Caucasian faces. In my video, Eating Better to Look Better, you can see some "before-and-after" shots, before and after increased consumption of fruits and vegetables. Most people find the pictures representing the greater fruit and veggie intake healthier-looking and more attractive.

College students who went from three servings a day to the recommended minimum of nine servings a day for just six weeks were able to significantly improve their skin color, though it's possible smaller dietary changes could help as well.

Public health advocates hope that research suggesting healthy eating may "affect mate choice and sexual selection" could provide a powerful message for promoting healthy eating. Their hope is to boost fruit and veggie intake up to 13 servings a day.

And while a rosy glow associated with cardiovascular health in the face and lips can also increase one's appearance of healthfulness and attractiveness, the color red can also reduce junk food intake. People drink less soda from cups with red stickers than from cups with blue stickers, and eat less from red plates than from blue or white plates. How crazy is that? Researchers speculate that it's because our brains are subconsciously thinking "red traffic lights, stop-signs, red alert," and therefore give us pause when we see the color red while eating.

I previously covered this topic in Golden Glow and Rosy Glow, though I'm so glad we now have data from people of color as well.

I'm certainly not above appealing to vanity. Whatever it takes to get people healthy. Hence videos like: