Monday, December 30, 2013

Does the Vitamin and Mineral Content of Food Influence Our Food Intake and Body Fatness?

The Claim: We Overeat Because Our Diet Is Low in Vitamins and Minerals

We know that animals, including humans, seek certain properties of food. Humans are naturally attracted to food that's high in fat, sugar, starch, and protein, and tend to be less enthusiastic about low-calorie foods that don't have these properties, like vegetables (1). Think cookies vs. plain carrots.

In certain cases, the human body is able to detect a nutritional need and take steps to correct it. For example, people who are placed on a calorie-restricted diet become hungry and are motivated to make up for the calorie shortfall (2, 3). People who are placed on a low-protein diet crave protein and eat more of it after the restriction is lifted (4). Humans and many other animals also crave and seek salt, which supplies the essential minerals sodium and chloride, although today most of us eat much more of it than we need to. At certain times, we may crave something sweet or acidic, and pregnant women are well known to have specific food cravings and aversions, although explanations for this remain speculative. Research suggests that certain animals have the ability to correct mineral deficiencies by selecting foods rich in the missing mineral (5).

These observations have led to a long-standing idea that the human body is able to detect vitamin and mineral (micronutrient) status and take steps to correct a deficit. This has led to the secondary idea that nutrient-poor food leads to overeating, as the body attempts to make up for low nutrient density by eating more food. In other words, we overeat because our food doesn't supply the micronutrients our bodies need, and eating a micronutrient-rich diet corrects this and allows us to eat less and lose body fat. These ideas are very intuitive, but intuition doesn't always get you very far in biology. Let's see how they hold up to scrutiny.

The Evidence

For this hypothesis to be correct, the human body would have to:

Have the ability to evaluate its own micronutrient status for a variety of micronutrients.

Have the ability to detect the micronutrient content of food, whether instinctively or by learning.

Have the ability to promote the consumption of foods that contain the needed micronutrients, or eat more food in general.

Prioritize micronutrient status over calorie balance.

In short, I'm not aware of much scientific evidence that supports any of this in humans. Micronutrient deficiency does have physiological effects on the human body of course, but for the most part I'm not aware of clear evidence that those effects are able to influence behavior in a way that would aim to correct the specific deficiency.

Micronutrient deficiencies are more often associated with a reduction in appetite than an increase, suggesting that the body does not generally attempt to increase overall food intake to correct a deficiency. This is also true of animals, which tend to decrease overall food intake and lose weight on nutrient-deficient diets (6). Correcting the deficiency restores appetite and body weight.

We know the human body can detect deficiencies of calories and protein, and respond accordingly, and at least for calories (and protein to a lesser extent) the systems that regulate this are fairly well understood (7). For micronutrients, there is little evidence that such a detection and regulatory system exists. We know that the body can detect sodium chloride (salt), because we can taste it. Beyond that, I'm not aware of any clear evidence that the human body can detect the micronutrient content of foods or respond to deficiencies by altering behavior. The fact that some animals can do it does leave open the possibility that we simply haven't uncovered this effect in humans yet.

We can detect many food properties that are indirectly associated with micronutrients. For example, sweetness and tartness would have been associated with potassium and vitamin C in the wild, because fruit is rich in both. The flavor and aroma of cooked meat (glutamate and volatiles) would have been associated with B vitamins, iron, and zinc. Some animals can learn which foods are poor and rich sources of minerals (8). Yet in the modern world, these potential nutritional cues are easily fooled by added sugars, glutamate, and flavorings, uncoupling the natural association between flavor/aroma and nutrition. Humans do seek variety in the diet*, perhaps for nutritional reasons, but I'm not aware of evidence that this behavior is modified by the body's nutritional status or the actual micronutrient content of foods.

Another problem with this hypothesis as an explanation for obesity is that modern Americans have relatively good micronutrient status by global and recent historical standards. Although we may not eat an optimal amount of all micronutrients, frank deficiency is uncommon, in large part because much of our food is fortified. The picture was quite different a century ago. Before food fortification, deficiencies of iodine (goiter), vitamin D (rickets), niacin (pellagra), and vitamin C (scurvy) were common in specific regions of the United States. Yet this was true at a time when the prevalence of obesity was much lower than it is today; these people were evidently not eating more food to make up for deficient micronutrient status.

If our bodies can detect micronutrients in food and act to correct deficiencies, and this is a major driver of behavior, why don't people crave vegetables and liver rather than chips and soda? Why eat a larger quantity of micronutrient-poor food to try to make up the nutrient shortfall when it would be much more effective to just swap out the junk for nutrient-dense foods that are readily available? Yet many of these nutrient-dense foods remain unappealing to most people of all weights, while nutrient-poor foods remain appealing, suggesting that what humans seek from food is primarily something other than micronutrients.

Randomized Controlled Trials

The best test of this hypothesis would be to determine if 1) reducing the micronutrient content of the diet independently of other factors increases food intake in lean people, or if 2) increasing the micronutrient content of the diet independently of other factors decreases food intake in overweight people. The second prediction has been tested by using multivitamin/mineral supplementation in randomized controlled trials.

The first trial I came across was published in 2010 (9). In a group of obese Chinese women, a multivitamin/mineral supplement caused no significant change in the food-derived intake of calories, protein, fat, carbohydrate, or any of the 15 measured micronutrients. However, over 26 weeks, the supplement did increase calorie expenditure, modestly reduce fat mass, and improve markers of metabolic and cardiovascular health. This study does not support the idea that we eat to obtain micronutrients, but it does suggest a possible intriguing effect of micronutrient status on calorie expenditure, metabolic health, and body weight, at least in obese Chinese women.

I found this result difficult to accept uncritically, because it's hard for me to believe that an effect this large on markers of metabolic health wouldn't have been noticed in all the other studies involving multivitamin/mineral supplementation. It turns out, there is other evidence that relates to this question. My favorite study was published in 2008 and combined an observational study with a randomized controlled trial (10). The observational study, like many others, showed that people who habitually take multivitamin/mineral supplements tend to be leaner than people who don't. However, as with many diet/lifestyle variables, these people had an overall healthier lifestyle pattern that suggested they were more health-conscious, casting doubt on how much of their leanness could be attributed directly to the supplements as opposed to some other variable associated with it.

To test this hypothesis more directly, the investigators administered a multivitamin or placebo to obese men and women during a 12-week weight loss intervention. Changes in fat mass and energy expenditure didn't differ between groups. Women, but not men, had a reduced "fasting desire to eat" in the supplement group, which was one of four measures of food-related motivation used in the study. The other three measures, "hunger", "fullness", and "prospective food consumption", didn't differ between supplement and placebo groups in either sex. In my view, this study does not support the idea that micronutrient status plays a significant role in appetite and body weight, or increases the effectiveness of fat loss efforts.

The Bottom Line

There's little evidence that the human body is able to detect its own micronutrient status, detect the micronutrient content of foods, or respond behaviorally to manage micronutrient status, aside from sodium chloride**. Neither humans nor other animals appear to eat more calories to make up for a micronutrient shortfall, or eat less when the micronutrient density of the diet is increased, and therefore I doubt that the diet's overall micronutrient density plays a direct role in overeating, although specific micronutrients could still have biological effects that tip the scales in one direction or the other.

There is an intriguing suggestion that micronutrient supplementation can influence calorie expenditure and body fatness, but this will require independent confirmation before I can accept it. Also, we may have more to learn about human food selection in response to nutritional requirements, since we know that some animals are capable of it.

Regardless of this, eating nutrient-dense whole foods remains a good idea, and it does help with fat loss, even if that has nothing to do with vitamins and minerals. Similarly, eating nutrient-poor highly processed foods can lead to overeating, but we don't have to invoke micronutrients to explain that. It's probably more related to the fact that this food is seductive, palatable, calorie-dense, and provides little satiety per calorie, making it easy to overeat and continue eating independently of hunger. There may be other reasons as well, such as effects on the gut microbiota.

* Clear evidence of this comes from research on sensory-specific satiety (11).

** This implies that NaCl was an important and scarce nutrient in our evolutionary past. This remains true of many wild animals and a few human groups with no access to concentrated NaCl. Humans may have an elevated need for NaCl because we sweat more than other animals, but our current intake in affluent nations far exceeds this requirement.

34 comments:

I suspect the interesting study would be of indigenous groups, and seeing where the essential minerals and vitamins come from. Obviously tribes and bands would not survive with serious deficiencies.

I suspect the driving force would be cultural. Grandmother talked about one child in the 1910s who was diagnosed with a vitamin deficiency; the cure was that all the kids had to eat at least a little of everything on the table.

I think it unlikely that all animals but humans could detect micronutrient deficiencies. Why would humans have lost that ability?

I know there have been times in the winter when I craved cabbage. I figured I probably needed vitamin C. This is no proof, of course.

However, I suspect that even if humans can detect micronutrient deficiencies (and this would include things like selenium that aren't usually added to bread or milk), that isn't the major contributor to obesity.

I have occasional intense cravings for oysters. I don't even particularly like the taste of oysters but I will still crave them. Once I eat them I will get a nice sense of wellbeing and the craving will go away again for a couple of weeks.

Citation 6 appears to be for ruminant animals. A citation for humans or an additional citation for meat eaters or other animals would be much more convincing.

In the evidence section, it is good to point out that historically the theory doesn't seem to make sense. For the rest of the section I think you make a logical case that the body is not good at broadly detecting micronutrient status in food at the time it is eaten. However, I don't think that is a requirement of the theory being tested here. Rather, the theory just states that the body recognizes its own micronutrient deficiency, not that of food, and that the body seeks more *calories*, not the actual micronutrients. The reason why (assuming your question was not rhetorical!) is an assumption that food will probably contain some of the needed micronutrients even if it is not rich in them. This thinking brings up other interesting questions: which is more important, nutrients per calorie or nutrients per day? If the body's response to excess calories is to increase metabolism, how much does that cost in terms of micronutrients?

I am just looking at the abstract for Citation 10, but were they actually correcting deficiencies? Unless we have an idea of the micronutrient status of the subjects before and after the study then the study does not seem that useful other than to try to state that the average American is unaffected by supplements. Averages are useful, but in the end we care about anything that affects us, even if it does not affect the average.

The evidence for humans presented here for the *calorie* seeking hypothesis seems very sparse and mixed, with 2 studies both showing some findings that could be interpreted as consistent with the hypothesis.

One piece of evidence for a deficiency-detection device is in the literature on pica, defined as the eating of things not normally eaten, such as dirt or kaolin.

The Wikipedia article on this states "The scant research that has been done on the causes of pica suggests that the disorder is a specific appetite caused by mineral deficiency in many cases, such as iron deficiency, which sometimes is a result of celiac disease[4] or hookworm infection.[11] Often the substance eaten by someone with pica contains the mineral in which that individual is deficient.[12]" It does supply references for these statements.

It is not clear that theoretically humans could not have a system to detect deficiencies and react to them.

As background, it appears humans have a system for detecting foods that are bad for them. As I understand it, a small area of the brain is not protected by the blood-brain barrier. If the cells in this area receive things that hurt them (or kill them), the damage is discovered and a signal sent to the stomach to vomit. Such a system can detect poisons of various types, including ones never before encountered.

This has been extended to pregnant women to explain morning sickness. In the first trimester the fetus is very vulnerable, and this system is reset to be unusually sensitive, rejecting all sorts of foods (or virtually all foods in my wife's case).

I could generalize this. Imagine a cluster of cells that normally grows and divides, and then undergoes apoptosis. As long as they receive adequate nutrition to grow and divide, they send a signal that all is well. However, if they do not grow and divide adequately, a signal is sent that the nutrition is inadequate in some way.

I can imagine that in most cases, the gain from eating more (and getting more of the micro-nutrients) exceeds the damage from overeating, and I can imagine this being calibrated to limit the overeating.

At a slightly more sophisticated level, I can imagine a subroutine that says seek variety in foods, or try new ones, or possibly eat dirt (pica) if micro-nutrient status is inadequate.

I can imagine a system in which the overeating is attempted, and after a short period the body checks to see if things are better. If they are not, it changes behavior or tries another food.

This could also be calibrated to vary with stage in the life cycle (react more if a child, or a teenager). If pregnant, react more, since the growing fetus needs nutrients.

One possible response to a nutrient deficiency could be to eat more, or different foods. Another might be to use less by moving less, producing fewer digestive enzymes, replacing worn out cells at a lower rate, or by doing less creation of new cells.

It might be argued that if the body lacks enough nutrients to grow new cells, it will not be able to grow any, so it needs no device to tell it not to attempt what it cannot do in any case. One answer is there may be disadvantages to attempting what cannot be finished. (Think of the man who starts building a new house he cannot afford to finish).

It is also possible that there are some processes in the body that are more important than others, and that when micro-nutrients are scarce, it should attempt only certain activities. In this case a signal that the body is low in micro-nutrients might be helpful in that certain activities might be slowed, but not all activities that used micro-nutrients.

A switch that says do not start or continue a pregnancy unless there are sufficient micro-nutrients to produce a healthy baby without hurting the mother too much would make sense. One mechanism would monitor fetal growth, and if milestones are not being met, abort. Others can be imagined.

Such a generalized micro-nutrient deficiency detector sounds much easier to evolve than having a long list of ones that look for separate deficiencies such as for copper, zinc, selenium, vitamin E, etc.

Gretchen, he said some animals, not all. Still, losing the ability to detect micronutrients wouldn't be unprecedented. My understanding is that humans have the genes to generate several vitamins besides vitamin D. However, except for vitamin D, all these genes are either broken or inactive. The theory is that we had the ability to generate other vitamins, but because we got so much in our diet the genes weren't advantageous; without positive selective pressure, the mechanisms eventually broke down.

A similar thing could explain why, even if humans had the ability to taste micronutrients in the past, we could have lost it. If early humans already had a highly diverse diet just from necessity, vitamin deficiencies might not have been a problem; if so, we might have lost the ability to detect them simply because it wasn't important enough.

'... modern Americans have relatively good micronutrient status by global and recent historical standards. Although we may not eat an optimal amount of all micronutrients, frank deficiency is uncommon ...'

Stephan, things are changing. Here in Oxford scientists are waking up to the possibility that deficiencies of certain micronutrients are important causes of disease. Years ago they thought it was all due to genes, and they could fix everything with gene therapy. Now they wouldn't be seen dead with those ideas.

'.. eating nutrient-poor highly processed foods can lead to overeating, but we don't have to invoke micronutrients to explain that.'

Oh yes we do. Your own group works on this. The arcuate nucleus modulates ROS levels. POMC neurons signal satiety by raising ROS, and NPY/AgRP neurons signal the opposite by lowering them. If mitochondria are defective due to micronutrient deficiencies, ROS will be too high, and then stress response systems kick in and lower them too low. That means overeating.

It matters only up to a point that an individual may have lost the sensitivity to micronutrients, if the tribe as a whole evolved methods to seek or enhance those nutrients. Most Neolithic food preparation methods (food combination, broth, food fermentation) improve micronutrient availability.

I prefer to think of varying threshold sensitivities for different people. I, too, have experienced feelings of well-being associated with foods, and have used them to modify my diet. I am still amazed, though, at how many decades I spent eating wheat, when a short-term dedicated experiment showed me how much better off I was without it.

Please don't co-opt our research for your argument. Our work has nothing to do with micronutrients. All of the diets we give to our animals, including the fattening ones, are specifically designed to provide more than sufficient quantities of all essential micronutrients. There is no reason to believe that any of the obesity and obesity-related phenotypes we study are related to a deficiency or insufficiency of micronutrients.

A personal anecdote: several years ago, I developed a craving for liver that continued for several months. Finally I was diagnosed with B12 deficiency. The deficiency was fixed with B12 injections and gradually the craving went away. Of course, before taking the test I had no clue I was lacking B12, and I didn't know that liver was a good source to get it. However eating liver had not fixed the deficiency because the problem was in absorption of the vitamin.

Hi Stephan. Yes, I do realise the diets contain everything the animals need. But consider Klevay's experiment on heart disease in mice on a high fat diet, which you have discussed here. Presumably these mice were given everything they needed, but still they developed heart disease on this diet, which they did not do when they were given extra copper. We know now that saturated fat can inhibit copper absorption.

It can also inhibit manganese absorption and increase iron absorption. Too much iron and too little manganese is apparently what causes diabetes in mice on a high fat diet.

Obese people have been found to have excess free iron in their urine. Iron can be carried across membranes by fatty acids. The great puzzle has been how a high fat diet can cause so much damage when fatty acids are not especially toxic.

Copper intestinal absorption in the rat: effect of free fatty acids and triglycerides
http://www.ncbi.nlm.nih.gov/pubmed/8618945
Manganese absorption and retention in rats is affected by the type of dietary fat
http://www.ncbi.nlm.nih.gov/pubmed/11697763
Manganese supplementation protects against diet-induced diabetes in wild type mice by enhancing insulin secretion
http://www.ncbi.nlm.nih.gov/pubmed/23372018
Urinary catalytic iron in obesity
http://www.ncbi.nlm.nih.gov/pubmed/21189275
Transport of Fe2+ across lipid bilayers: possible role of free fatty acids
http://www.sciencedirect.com/science/article/pii/000527368790037X

There is a lot of folklore on how Eskimos get their Vitamin C, since they do not have access to fruits or vegetables (http://www.straightdope.com/columns/read/2374/traditionally-eskimos-ate-only-meat-and-fish-why-didnt-they-get-scurvy). Some have claimed that they get their C from eating lots of collagen. Others suggest that it comes from raw meat, especially liver. Do we have any idea as to whether the Eskimos are prompted by signs of scurvy, or perhaps another mechanism, to ensure that they get their Vitamin C?

Stephan, could I ask you what you meant when you said 'please don't co-opt our research for your argument'? The paper I referred to on ROS in the arcuate nucleus has Michael Schwartz's name on it. I wasn't presenting an argument of my own.

I've heard that vitamins, minerals, and other micronutrients are plentiful, as they essentially come from the ground. Our mechanisms that process them in the body are essentially passive, as these nutrients are cheap and plentiful. We don't need to extract them actively. Makes sense, since plants have plenty of nutrients. We only become deficient in them with industrialised foods.

Thanks for presenting the evidence with a good faith effort to achieve some kind of impartiality, balance and completeness. The complete opposite of a lot of what passes for science these days - partial, imbalanced, personal-axe-grinding, cherry-picking propaganda.

You may want to check out the US Army's budget and research facilities for studying why soldiers in combat, or even awaiting combat, starve. Wansink and more recently Moss in "Sugar, Salt and ..." write about this.

Last time I trawled for info on anorexia and bulimia, stress was often cited as a common initiating/precipitating factor at the condition's onset. Just a correlation; I don't think causation's been established yet.

I remember reading about an experiment done with very young children back, I think, in the 1930s. Food was provided to the children as an ad lib smorgasbord featuring a wide range of from-scratch foods. The children exhibited wide differences in what they ate, both between themselves and over time. But all maintained good health and growth.

I'm not a scholar in this field, and I cannot supply a citation. But my impression has been that the study was rather famous.

To the extent my memory is accurate, the results would be relevant to the question of the ability of humans to respond to the body's need for nutrients with appropriate food selections.

I recall emailing you and Chris Masterjohn maybe a couple of years back with this hypothesis. Just N=1, but I found that to the extent I ate liver and oysters and drank raw whole milk, I was simply not very hungry. These were all too short term to gather meaningful data, but that idea has always persisted.

Now, enter the gut biome, 100 trillion manufacturing plants and we now know that not only does the relative mix vary pretty widely between individuals, but that one can radically alter relative populations in days by a switch in diet.

Integrate the gut-brain connection and my speculation is that if there's truly a nutritional absorption factor in obesity, it's to be found in the other 90% of us.

In my opinion the 2008 study, instead of serving to refute the micronutrient hypothesis, simply confuses the issues. If deficiencies do play a role in weight gain, it would be through influencing our spontaneous food choices. There is no guarantee that such an influence would be apparent at a time when we are making conscious choices in order to restrict the calorie content of our diet (and that's what the participants were doing). It is quite likely, IMO, that "hunger" and "fullness" measures didn't differ simply because the subtler effect of micronutrient deficiency was overshadowed by caloric restriction.

There is also the question of dosage. Is it possible to correct deficiencies quickly using an ordinary multivitamin supplement? (it took me months to get my iron levels back to normal - and I was taking 90 mg a day)

I would also like to see a study that specifically targets the so-called "cravings" (not the same thing as "hunger" or "appetite"). In my opinion this is the most likely area for the micronutrient effect to show. (Personally, I eliminated a craving that had plagued me for two years by taking a high-strength vitamin B supplement).

@Anka, if you had low iron levels it may have been for reasons other than iron deficiency. Perhaps your doctor would like to see this article by Ray Peat explaining why true iron deficiency is rare and how dangerous iron can be.

Another vein that reinforces micronutrient deficiency is how something like saffron, a natural serotonin reuptake inhibitor, controls appetite so effectively.

People eat sugars because the insulin response clears all other amino acids from the bloodstream, affording the shunting of tryptophan past the blood-brain barrier.

Whenever I get a late night sugar craving, which can strike paleo and non-paleo folks alike, I simply take a saffron or 5-HTP supplement. If it's a dopamine deficiency, I find seaweed, bulk-bought at Costco, nails that.

In the past year that I've tried this, it has not let me down once, stopping cravings in their tracks.

I am also thinking of a Maslow-like decision tree where satiety is tiered in layers, of which sodium chloride and its ilk is tier one. The question, it seems to me, is: despite a solid, say, Paleo diet, what cravings remain, and why?

That way you can narrow the scope and specify a much narrower experiment. This is also what concerns me about vitamin studies when they take a broad cohort of sick and healthy folks alike and then cite, say, that vitamin E isn't heart healthy.

That is why the Nurses' Health Study is more valid, because there is a baseline to be had.

I have really wondered this as well, whether anybody can really determine if we are vitamin/mineral deficient and whether our body determines that it needs a certain food. In this last year and a half my wife and I have been taking a real interest in what we eat, and I have been doing my best to transfer what I have learned to my classes at school. Being a PE teacher at the middle school level, it is important for me to teach students proper health/lifestyle benefits. I have always told my kids that if you crave something it usually means that your body is deficient in or lacking a certain mineral. A good example would be a pregnant woman craving charcoal because she might be lacking carbon or other trace minerals. I think we probably can detect the lack of vitamins and minerals, but we are just trained to eat to fill our craving or hunger, and that leads to people becoming overweight. It would be interesting if we could find a better way to train people to eat right.

I have often wondered if we could truly detect a deficiency in our diets. My wife and I have been doing a lot of research on eating and how it affects our body and weight. I know it is important, and I try to bring this information back to my students because I am a Physical Education teacher in Southern California. Is it possible to teach people to look for signs of a deficiency in our diets so we don't eat the wrong foods that would affect our body and fat intake? I heard that when people crave charcoal it is due to a mineral deficiency. What signs can we see or look for that would indicate we are deficient in other vitamins/minerals? I think if we could teach our kids what to look for when they are having cravings for foods, we could find a way to cut the fat intake and increase the food that the body is truly craving.

Marcia Pelchat from the Monell Chemical Senses Center has done some work in this area. Most of the researchers I've spoken with at Monell about this topic agree that cravings are much more psychosocial than they are indicative of mineral deficiencies.

In the clinic, anemic patients crave all kinds of foods and none tend to be great sources of iron.

With regard to vitamin/mineral intake affecting fatness, there's a lot of mixed animal data showing some support that high calcium intake encourages fatty acid oxidation, and low intake spurs lipogenesis. Pretty controversial topic, though.

I remember as a teenager opening the fridge and staring at the food there for minutes, knowing I wanted something but not seeing anything appetizing (although we had a typical fridge full of food). Your article reminded me of this (common?) phenomenon with the information that appetite is reduced with micronutrient deficiency. I wonder if this reduced appetite is only an artifact of a food supply that is lacking the needed variety. In a natural environment, would this deficiency result in an urge to graze farther, to forage farther in seeking what is lacking? Perhaps it's only a reduced appetite in the absence of a truly diverse food source? Of course this does nothing to address your goal in this article, to answer the question of whether or not nutrient deficiency drives overeating. I think you've done a good job of laying out the evidence. I feel fairly convinced of your position. Thanks for the continued contributions to our understanding.

About Me

I'm a writer and science consultant with a background in neuroscience and obesity research. I have a BS in biochemistry and a PhD in neurobiology. I'm the author of "The Hungry Brain: Outsmarting the Instincts That Make Us Overeat".

Copyright 2008-2017

Please feel free to reproduce the contents of this blog, on the condition that you:

1) Attribute the work to me

2) Provide a link to the page where you found it

3) Do not use it for commercial purposes

Financial disclosure

I am a co-creator of the Ideal Weight Program, and I receive revenue from the sale of this program.

In addition, I am registered as an Amazon affiliate. I may receive a small commission on the sale of some of the books I review, or other products sold through Amazon.

Disclaimer

This blog is a compilation of my opinions. It's not advice; it's information that you can take or leave as you please. I don't intend it to replace professional medical consultation or treatment. Your health is in your own hands.