There are more things in heaven and earth

January 31, 2007

I blogged once last year on the growing use of tasers in schools, and I came across another such story today, this time from Ohio:

A Westerville North High School student who stripped naked, lubed his body in oil and ran amok through the school commons during lunch yesterday was arrested after police twice zapped him with a Taser.

The article treats the use of tasers as a minor point in what is otherwise just a local human interest story. A taser electrocutes its target with 50,000 volts of electricity and is excruciatingly painful. It clearly constitutes torture. It was never supposed to be used except in cases where the alternative was to fire a gun. For instance, here's the CBC:

Tasers are supposed to allow police officers to subdue violent individuals without killing them. A police officer can "take down" a threatening suspect without worrying that a stray bullet might kill or injure an innocent bystander.

Yeah, it would've been really dangerous to shoot a naked teenager in a crowded cafeteria-- what choice did the poor cops have?

I've followed the use of tasers and other non-lethal weapons quite a bit. Amnesty International keeps statistics on taser deaths, and theirs are actually among the most conservative of figures. According to a 2006 report:

Sixty-one people died in 2005 after being shocked by law enforcement agency TASERs, a 27 percent increase from 2004's tally of 48 deaths, finds an Amnesty International study released today.

More disturbingly, the taser seems to produce Abu Ghraib-style behavior in policemen (link):

In a massive report released late last year, Amnesty International documented hundreds of cases in the last three years in which Taser-happy police used the weapon on everyone from disturbed children to old men and women who didn’t follow orders fast enough to a Florida man — strapped down on a hospital bed — who wouldn’t provide a urine sample.

Given the behavior of taser-carrying cops, what kind of school principal could allow such a weapon to be present on school grounds? And what the hell are cops doing in schools in the first place? Are we such frightened sheep that one Columbine means we will accede to having our schools remade into boot camps?

I want to make a second point regarding taser use against children which I have not seen brought up anywhere. It has become clear that people on cocaine have a dramatically higher risk of dying from being tasered, even when they are young and otherwise physically fit. Even if you're not on cocaine the taser may stop your heart momentarily (it all depends on when in your heartbeat cycle the electrocution begins). Among taser victims whose circulatory systems are already stressed by stimulants, the chance the heart rhythm will be fatally disrupted or that the heart will not re-start is considerably increased.

And who are some of the primary users of stimulants in the United States? Schoolchildren, of course. Ritalin is extremely similar to cocaine, chemically speaking, except that its effects in the brain are stronger and its cardiovascular effects last longer (reference). Adderall, an amphetamine (essentially speed), has been known to cause sudden death due to heart attack or stroke, which is why it was temporarily taken off the market in Canada. Regarding Ritalin:

Ritalin, extensively prescribed to calm hyperactive children... should carry the highest-level warning that it may increase the risk of death from heart attacks, US experts recommended yesterday.

<snip>

This class of drugs, known as methylphenidates, are amphetamine-based and it is thought they could cause heart problems in some children and adults because they raise blood pressure [and heart rate, may I add].

<snip>

[The British version of the FDA] added that methylphenidate "is recognised to cause cardiovascular adverse effects", such as a racing or abnormal heartbeat and palpitations and increased blood pressure.

It's actually not that easy to find out how many schoolkids are taking stimulant medications. The most common estimate seems to be about 3% of schoolchildren, but the number of prescriptions written to children nationwide, divided by the number of schoolchildren in K through 12 schools, suggests it's twice that percentage. Among high school students, the number using prescription stimulants illicitly (not as prescribed, or without a prescription) is as much as 10% per year. Illicit use might mean once, or it might mean repeatedly over a long period (Ritalin and Adderall are addictive, after all). And although it's possible to buy Ritalin or Adderall online without a prescription, many studies look only at prescriptions filled. So in any given high school, I'd guesstimate that between 3 and 8 percent of students are on stimulants on any given day (but even then, it's highly variable between schools). The kids on meds (legitimately or illicitly) are disproportionately the kids with behavioral problems, who would be statistically more likely to run afoul of the authorities.
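To make that guesstimate concrete, here's a rough back-of-the-envelope calculation. The prescription and illicit-use rates are the ballpark figures from the paragraph above, and the share of illicit users dosing on any given day is my own assumption, not a measured number.

```python
# Rough estimate of the share of high-schoolers on stimulants on a given day.
# All inputs are ballpark figures from the text, not hard data.

prescribed_low, prescribed_high = 0.03, 0.06  # 3% common estimate; scripts/enrollment suggests ~6%
illicit_annual = 0.10                         # up to 10% per year use stimulants illicitly
illicit_on_given_day = 0.02                   # assume only a fraction of illicit users dose daily

low = prescribed_low                          # conservative: low prescription estimate only
high = prescribed_high + illicit_on_given_day # liberal: high estimate plus daily illicit use

print(f"Estimated share on stimulants on a given day: {low:.0%} to {high:.0%}")
```

Plug in different assumptions and the range moves, but it's hard to get it below a few percent of any given student body.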

Someone's child is going to die from being tasered in school while taking stimulant meds. Of course, the school administrators and the police will pretend that it was impossible to predict that Ritalin + 50,000 volts would produce death... sort of like Condi claiming no one could have predicted planes flying into the WTC, in spite of months of warnings predicting exactly that.

Any principal allowing the use of tasers in a population where there is a minority group with major cardiovascular risk factors-- i.e. the typical American student body with its stimulant users-- is criminally negligent, and should be prosecuted if a student dies from tasering under his/her watch.

January 28, 2007

Call me a cynic, but lately I'm suspicious of anything that "everybody knows" about health. For instance, I distrust the common wisdom that the best thing you can do to prevent osteoporosis is to eat gigantic amounts of calcium. I think we know which industry is responsible for the calcium PR campaign, and all I have to say to them is, "Got Science?"

Calcium is not an independent actor in the body. It needs to be in balance with magnesium, in order to build the strongest bones, to allow nerves to fire properly, to regulate blood pressure and prevent arterial plaques (which are basically calcium + unsaturated fat deposits), and in numerous other processes. A really severe imbalance between calcium and magnesium can cause a coronary spasm, i.e. a heart attack. In traditional diets (paleolithic or peasant), calcium and magnesium were consumed in roughly equal amounts. Yet the modern recommended intake of calcium is now 1,200 mg per day, while the RDA for magnesium is only 400 mg: a 3:1 ratio in favor of calcium.

Actually, the modern diet is probably even more skewed than 3:1, because people often get more calcium than they think. The frenzy over calcium has many people convinced they can't get enough-- they drink their skim milk, and the OJ with a third of their daily calcium in every 8 ounces, eat yogurt at lunch, etc. (They may forget entirely that vegetables contain calcium, since the Milk Lobby has succeeded in creating a "calcium = dairy" mentality.) On top of it all, a health-conscious person may eat 1,200mg of calcium just from supplements, since what we like to see on the pill bottle label is "100%" under the RDA column.
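To see how far the ratio can skew in practice, here's a hypothetical day's tally for that health-conscious eater. The per-serving calcium figures are typical US label values I'm assuming for illustration; they aren't from any source cited in this post.

```python
# Hypothetical daily calcium tally for a health-conscious eater.
# Per-serving amounts are typical US label values (assumed for illustration).
calcium_mg = {
    "skim milk (8 oz)": 300,
    "calcium-fortified OJ (8 oz)": 350,
    "yogurt (6 oz)": 250,
    "supplement (100% RDA)": 1200,
}
magnesium_rda_mg = 400  # the magnesium RDA, for comparison

total_ca = sum(calcium_mg.values())
print(f"Total calcium: {total_ca} mg")
print(f"Ca:Mg ratio vs. the 400 mg magnesium RDA: {total_ca / magnesium_rda_mg}:1")
```

Even before counting the calcium in vegetables, this person is well past a 5:1 ratio — and that assumes they actually hit the magnesium RDA, which most people don't.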

But let me poke a hole in the "calcium prevents osteoporosis" argument: the Chinese get only around 500mg of calcium per day, yet osteoporosis is less common than in the US. Our calcium intake is, relatively speaking, sky high, yet so is our rate of osteoporosis. What gives?

Vegetable calcium sources are more important in the Chinese diet than in the typical Western diet because of the lower intake of dairy products by Chinese. Vegetables and soy products provide 41% of the calcium intake of Chinese, in contrast with 10% in the United States (6). Calcium absorption from several commonly consumed Chinese vegetables is higher than that from milk by almost 10% (7).

Unfortunately, this editorial is titled "Calcium requirements: The need to understand racial differences." According to the author, the Chinese have less osteoporosis because they have lucky genes and absorb and utilize calcium really well. The word "magnesium" does not appear once in this document. Yet green vegetables are known for their magnesium content, which indicates that calcium in Chinese diets tends to arrive packaged along with magnesium. Not only is calcium better absorbed when magnesium is present, but magnesium also promotes a hormone called calcitonin that causes calcium to be stored in bones instead of in soft tissue. This article argues that osteoporosis is actually caused by magnesium deficiency, not a lack of calcium in the diet.

But don't worry about magnesium, says the National Osteoporosis Foundation: "Minerals such as magnesium and phosphorus also are important, but usually are obtained through food or multivitamins." I disagree.

Most of the magnesium in multivitamins or included in calcium supplements is magnesium oxide. This form of magnesium is cheap, which is why they use it. But there's a reason it's also used as a laxative: we don't absorb it. It just goes right on through.

And while magnesium is in every plant, there's only enough to be beneficial to humans if there's an abundance of magnesium in the soil. I ask you, if you grow the same crop on the same soil using agribusiness farming... how much magnesium is still in that dirt after 10 years? One study gives us a hint:

A fascinating study, conducted at the Earth Summit in Rio in June of 1992, compared the mineral content of soils today with soils 100 years ago, and revealed some startling facts. Researchers found that in African soils there were 74% less minerals present in the soil today than there were 100 years ago. Asian soils have 76% less, European soils have 72% less, South American soils have 76% less. And the soils in the US And Canada contain 85% less minerals today than they did 100 years ago.

On top of that, we don't simply pick a vegetable and eat it, or chew wheat berries. We refine our foods, often to ill effect:

Many factors affect magnesium availability from foods.... Much magnesium can be lost in the processing and refining of foods and in making oils from the magnesium-rich nuts and seeds. Nearly 85 percent of the magnesium in grains is lost during the milling of flours. Soaking and boiling foods can leach magnesium into the water.... Oxalic acid in vegetables such as spinach and chard and phytic acid in some grains may form insoluble salts with magnesium, causing it to be eliminated rather than absorbed.... [M]any people get insufficient magnesium from their diets.

Why, though, should I be so obsessed with magnesium anyway? Check out this list of Mg benefits, or this article. It turns out this mineral is vital to many aspects of health, but perhaps most importantly, it helps to prevent hypertension and heart attacks. Then why don't we hear more about magnesium?

One reason is that it's not easy to measure magnesium in the body or detect a deficiency via a simple lab test. You can't do it by blood samples, for instance, because only about 1% of the magnesium in your body is found in your blood. Those blood magnesium levels are very tightly regulated, so that you could be fairly deficient in magnesium yet show nothing unusual in a blood sample.

Secondly, there's no particular food industry that would benefit from touting this mineral. As far as I am aware there's no Buckwheat Consortium, nor any Brazil Nut Committee. There's no simple equation to be advertised, such as "dairy = calcium". Even if the Spinach & Kale Lobby wanted to market their goods by talking up the magnesium content, that content can vary widely depending on geography, which complicates any advertising campaign.

And then, of course, there's Big Pharma. Magnesium is dirt cheap. Even the new designer forms of magnesium, such as magnesium amino acid chelate (the best absorbed form), are less expensive than aspirin.

According to one MD who wrote an entire book on magnesium, it can be extremely beneficial during a heart attack. Specifically:

Magnesium is able to:

Dilate blood vessels

Prevent spasm in the heart muscle and blood vessel walls

Counteract the action of calcium, which increases spasm

Help dissolve blood clots

Dramatically lessen the site of injury and prevent arrhythmia

Act as an antioxidant against the free radicals forming at the site of injury [1-4]

. . .

A drug trial called ISIS sought to disprove the effects of magnesium. [But] in the ISIS trial, the protocol was not followed, in that magnesium was not the first drug given, and often it was not given for many hours or days after a heart attack was well established, causing widespread damage and blood clotting. Yet, drug reps... tell their doctor clients that ISIS proved that magnesium is worthless for heart disease! [6] Since the LIMIT-2 and ISIS trials, another smaller trial with only 200 people who were given IV magnesium at the onset of a heart attack, experienced a 74 percent lower death rate [emphasis mine]. [7]

So in other words, in the larger trials they intentionally did not follow the alternative protocol for using magnesium in heart attacks, yet they made claims that magnesium is ineffectual. Big Pharma knows that all people remember is the headline: "Magnesium doesn't help with heart attacks." Never mind that half the time they gave the magnesium on day 2 or 3 (and, no doubt, in the wrong form, or combined with medications that negate its biochemical activity, or who knows what else-- they're quite creative). Often when you see the phrase "In a smaller trial," as above, this translates to "In a trial without Big Pharma funding."

Of course, when it comes to the influence of money and the corruption of science, that's by no means confined to trials backed by Pfizer. The Institute of Medicine, the NIH, the CDC, and the FDA are all in bed with Big Pharma to varying degrees. We lost as many people to Vioxx, in less than 5 years, as we lost in the Vietnam War, though the FDA knew of major safety concerns within months of it hitting the market. In some prominent location at FDA headquarters their motto ought to be carved in stone: Bought and Paid For. These people make Jack Abramoff look like an ethicist.

So yes, I can believe that magnesium is ignored in treating hypertension, heart arrhythmia, heart attacks, osteoporosis, and a list of other conditions you can find in the above links. And that instead of trying nutritional solutions, drugs (almost always with dangerous side effects) are a doctor's first thought. Sure I can believe it. Big Pharma employs roughly 90,000 drugs salespeople in the US alone, who hound doctors on a daily basis. Curing illness with minerals isn't even on the table.

If a mineral sells milk, though... then you'll probably hear about that one.

January 19, 2007

You've no doubt heard the phrase "calories in, calories out" in reference to weight gain and weight loss. I admit that I once said it myself, during a group presentation on obesity in a grad school class. This is what you will constantly see repeated in much of the public health literature.

It's total bunk, of course.

Let's just start with what a calorie is (when talking about food, this really means a "kilocalorie"). A food calorie is the amount of energy it takes to heat 1kg of water by 1 degree Celsius, and that energy is measured by setting the food on fire. I know it seems silly, but that's how it's done. A carefully measured amount of a certain food goes into something called a bomb calorimeter, the food is set alight, and the temperature rise of 1kg of nearby water is measured. If it goes up 9 degrees C, then the food inside contained 9 calories.
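The calorimeter arithmetic is just heat = mass × specific heat × temperature rise; a minimal sketch, using water's specific heat of 1 kcal per kg per degree C (which is precisely what defines the food calorie):

```python
# Bomb-calorimeter arithmetic: heat absorbed by the water bath equals
# mass * specific heat * temperature rise. One food calorie (kcal) is
# defined as heating 1 kg of water by 1 degree C.

def food_calories(water_kg: float, temp_rise_c: float) -> float:
    specific_heat = 1.0  # kcal per kg per degree C, for water
    return water_kg * specific_heat * temp_rise_c

# The example from the text: 1 kg of water warming by 9 C.
print(food_calories(1.0, 9.0))
```

That's all a calorie label measures: combustion. Nothing in the formula knows anything about digestion.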

Just think about this for one moment. You eat a cracker, your body makes glucose, that gets turned into ATP by one of two complicated processes, and you've got energy. Where, in this process, is anything set aflame? The calorimeter method is like measuring the energy released in a chemical reaction by burning the component chemicals. This is hardly an accurate measure of the true potential energy of the ingredients.

Protein and carbohydrates both contain, according to setting-them-on-fire tests, 4 calories per gram. But carbohydrates are far more easily and quickly converted into glucose, which means they're more readily used as fuel. Protein, on the other hand, is required for various structural purposes (in new cells, for example), and much of the protein we eat is never converted into energy. The energy in 1 gram of carbs is easy to estimate because there's really nothing the body can do with carbs except make glucose. But the energy in 1 gram of protein is unknown. If that's the only gram of protein in your lunch, there is no way the body will waste it by turning it into glucose. If, on the other hand, you eat a big skinless chicken breast and nothing else for lunch, then yeah, some of the protein will likely be made into glucose and used as fuel. In short, the real energy content of protein depends on what else you are eating, and varies between 0 and 4 calories per gram.

The primary function of dietary proteins, for example, is body cell manufacture and repair: making skin, blood, hair, finger‑ and toe‑nails, etc. The amount of protein needed for this purpose is generally accepted to be about one gram per kilogram of lean body weight. As meats contain approximately 23 grams of protein per 100 grams, a person weighing, say, 70 kg [154 pounds] needs to eat about 300 g (11 oz) of meat, or its equivalent, every day just to supply his basic protein needs. Even eating this volume of lean chicken would provide some 465 calories. These calories are not used to supply energy, they contribute nothing to the body's calorie needs and so must be deducted if you are counting calories.
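The excerpt's numbers check out arithmetically; here's a quick verification using its own figures (the 155 kcal per 100 g of lean chicken is an assumed typical value, consistent with the excerpt's ~465 kcal for 300 g):

```python
# Verifying the excerpt's protein arithmetic with its own figures.
lean_weight_kg = 70                      # the 154-pound example person
protein_needed_g = lean_weight_kg * 1.0  # ~1 g protein per kg lean body weight
protein_per_100g_meat = 23               # grams of protein per 100 g of meat

meat_needed_g = protein_needed_g / protein_per_100g_meat * 100
print(f"Meat needed: ~{meat_needed_g:.0f} g/day")  # rounds to the excerpt's ~300 g

# At ~155 kcal per 100 g of lean chicken (assumed typical value):
calories = 300 / 100 * 155
print(f"Calories riding along with ~300 g of lean chicken: {calories:.0f}")
```

Those calories are spoken for by cell repair, not fuel, which is the excerpt's whole point.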

This suggests that a quarter of the typically recommended 2,000-calorie diet is never "burned" or converted into energy. That is, we're counting them as "calories in," but our bodies are not in fact using them as calories. Throws quite a wrench into the "calories in, calories out" equation, doesn't it?

Similarly, as I wrote in my last post, fats are needed for structural purposes throughout the body. Much of the fat you eat, as with protein, is never "burned" as fuel. It's true that the liver can convert fats into energy, just as with protein. But since your brain is about 70% fat, and the myelin coating your nerves is 70% fat, and since every cell membrane in your body contains fat, then preferably your body will not have to waste precious fats by expending them as fuel. The set-it-on-fire data say that if your body takes a gram of fat and breaks it down for energy, you'll get about 9 calories out of it, which is more than twice what you get from carbs. But again, what happens to a gram of ingested fat depends on what else you've been eating. The actual calorie contribution could be anywhere from 0 to 9 calories, and even that is a bad measurement since we don't, in reality, obtain energy via combustion.

When you do convert consumed fats into energy, it takes longer to convert them to glucose, and your blood sugar levels are less likely to "spike". You're therefore less likely to experience a flood of insulin in your blood-- insulin which signals your adipose fat tissue to start grabbing glucose and socking it away as stored fat. The more often you experience peaks or spikes of blood sugar and insulin, the more fat you're probably storing away.

This is where it's helpful to know the "glycemic index" of foods. This is a measure of how fast the food is converted into glucose in the blood. Higher glycemic foods (white bread, white rice, potatoes, etc) get converted to glucose more quickly, are more likely to create blood sugar spikes, and are therefore more fattening. Whole grains and higher-fiber foods have a lower glycemic index (thus, traditional 19th century bread was less fattening than most commercial breads). You can also change the overall glycemic index of a meal by adding more fat and protein, which will slow the digestion, absorption, and conversion to glucose of any carbs in the meal. Clearly, the calorie content alone cannot tell you how much body fat you are likely to put on from eating a given meal. "A calorie is a calorie" is false.
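The standard way to estimate a mixed meal's glycemic index is to weight each food's GI by its share of the meal's available carbohydrate; here's a sketch with illustrative GI and carb-gram values (not measured data):

```python
# Weighted glycemic index of a mixed meal: each food's GI is weighted by
# its share of the meal's total available carbohydrate. The GI and
# carb-gram values below are illustrative, not measured.
meal = [
    # (food, glycemic index, grams of available carbohydrate)
    ("white bread", 75, 30),
    ("lentils",     30, 20),
]

total_carbs = sum(carbs for _, _, carbs in meal)
meal_gi = sum(gi * carbs for _, gi, carbs in meal) / total_carbs
print(f"Meal GI: {meal_gi:.0f}")

# Caveat: this simple weighting ignores how added fat and protein slow
# digestion, which is exactly why real mixed meals tend to behave even
# better than the raw numbers suggest.
```

Swapping half the white bread's carbs for a low-GI food drags the whole meal's number down, which is the practical takeaway.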

As a last comment, trans-fats result in more weight gain than naturally occurring fats. If you eat biscuits made with butter every day at dinner, and your identical twin eats biscuits made with shortening, your twin will weigh more than you after a while. Plus, even within a particular category of fats (saturated, mono-unsaturated, etc.) there are shorter and longer fat molecules. Shorter molecules are more likely to be used as energy, certain medium-chain fatty acids are used in the immune system, long saturated fatty acids are preferred for cell membranes, and very long-chain fats are sent straight to the brain. Between the "short-chain" vs. "long-chain" fats issue and the high-glycemic / low-glycemic carbs issue, not even the breakdown of carbohydrates, proteins, and fats in a meal will tell you the whole story.

Thankfully, a lot of traditional meals are the least fattening, so you can forget all the above complexity and just think 19th century in your eating. Bacon, eggs, and heavily buttered whole-grain toast is less fattening than a bagel with fat-free cream cheese (and anyway, what IS fat-free "cream cheese"? It was never cream and it certainly isn't cheese). The low-fat recommendations of the medical community have resulted in swarms of people who eat carbs without enough fat or protein, get a huge and sudden blood sugar boost, pack the sugar away in fat cells (as instructed by insulin), and then become exhausted and hypoglycemic-- and then, naturally, crave more carbs. The success of the Atkins and South Beach diets can attest to the role carbs play in determining many people's weight.

It makes me incredibly mad to hear mainstream news reports on obesity (which always involve wagging a finger at the overweight) when the public health community has given advice that has actually made people fatter. Consider our government's recommended "food pyramid", with the 49 servings of carbohydrates or whatever it is, while fats and oils are at the top, labeled "use sparingly". Use sparingly? The traditional human diet, when there is enough food to go around, is 30-40% fat! Perhaps many of these medical folks are suffering from brain atrophy as a result of insufficient fat intake.

January 17, 2007

About a year ago, I read the book Know Your Fats by Mary Enig, who has a PhD in nutritional biochemistry and specifically studies lipid biochemistry (i.e., fats and oils). She's a certified and practicing nutritionist. And she happens to believe that saturated fats are good for you, while polyunsaturated fats, for the most part, are not.

There is a summary of her arguments here, which I promise you is very eye-opening. For instance, it turns out that the percentage of fat in our diet which comes from animals has fallen during the 20th century. Butter consumption, to take one example, was 18 pounds per person per year in 1910, but had fallen to 4 pounds per person per year by 1970. What did increase, during the decades when heart disease was skyrocketing, was consumption of highly unsaturated vegetable oils and hydrogenated vegetable oils. Where we used to use lard or tallow, we now use oils or shortening. Kind of hard to then point the finger at saturated fats, isn't it?

On another front, consider evolution. During the past 200,000 years of human evolution, we got our fats from animals, nuts, and olives, but mostly animals. Olives weren't pressed for their oil until the last few millennia, and coconut and palm oils weren't exported around the globe until a couple of centuries ago. Nuts have been fairly expensive and in short supply for most people for most of history. In short, then, we evolved eating highly saturated animal fats. No proto-hominid ever tasted corn, canola, or soybean oil. Which kind of fat is more likely to be well-utilized in our bodies, then? The one we've been eating for almost a quarter million years, or the one we've been eating for 75?

On yet another note, the more unsaturated a fat is, the more reactive it is with other chemicals, and the more easily it oxidizes (goes rancid). The unsaturated fats in olives and nuts won't hurt you, because olives and nuts happen to be chock full of antioxidants, which keep the oil from spoiling. Nature makes a perfect package: fragile unsaturated oil + stuff to make sure that oil doesn't oxidize. But we take it all apart by extracting and "refining" these oils, which is to say, we strip out all the antioxidants and leave behind a rapidly oxidizing oil stuffed with free radicals. Yum! (If you buy cold-pressed, dark green, dusty-looking olive oil, you're fine, because it hasn't been refined. Same with very dark brown, sediment-containing sesame oil. If you like to cook with flavorless oils try coconut oil, which, being mostly saturated, will not poison you with free radicals.)

Even worse than rancid veggie oils are hydrogenated oils, i.e. trans-fats. Trans-fats are normal on one end of the molecule, but are screwed up on the other end. Every cell membrane in your body is made up in large part of fatty acids (preferably, saturated fatty acids). So when you eat trans-fats, the cell thinks "Oh, good, here's that fat molecule we needed," and plugs the normal end into its outer membrane. Only this trans-fat isn't a natural molecule. It (essentially) does not exist in nature. The other end of the molecule does not do its job in terms of letting toxins out, and letting nutrients in. As a result, there are hundreds of biochemical processes which do not occur correctly in tissues containing trans-fats. Besides that, 2g of trans-fat per day increases your risk of coronary heart disease by 20 percent, and 6g per day may as much as double your risk of dying of heart disease. (The average fast food meal contains 22g.) Trans-fats also block the absorption and proper use of omega-3 fatty acids, which can cause depression and anxiety, or developmental and behavioral problems in children. Yet, this is the poison the medical community convinced people to use instead of butter! It takes a hell of a lot of hubris to tell people not to eat butter, which we've consumed for about 12,000 years, but instead to eat a substance originally developed for use in candles, and which does not exist in nature.

Just to throw in another point: in case we suffer a shortage of fat in our diets, our livers are capable of making fats. As I said above, every single cell in your body requires fat in its structure, and the brain is basically made of fat and cholesterol and little else. Well, the liver only makes one kind of fat: saturated. If saturated fats were bad for us, why would this be the only kind our bodies can manufacture? Also, may I point out, people eating no-fat diets, in addition to stressing out their livers, are utilizing solely saturated fats. The only difference between eating entirely saturated fats and eating no fat at all is that in the latter case, your liver is working much harder.

This may seem a bit theoretical so far, but I'll include a couple of excerpts from Mary Enig's summary page. One famous study, the decades-long Framingham heart study of over 6,000 people, contradicted popular opinion on saturated fats:

After 40 years, the director of this study had to admit: "In Framingham, Mass, the more saturated fat one ate, the more cholesterol one ate, the more calories one ate, the lower the person's serum cholesterol.... [W]e found that the people who ate the most cholesterol, ate the most saturated fat, ate the most calories, weighed the least and were the most physically active."3 The study did show that those who weighed more and had abnormally high blood cholesterol levels were slightly more at risk for future heart disease; but weight gain and cholesterol levels had an inverse correlation with fat and cholesterol intake in the diet.4

Somehow I missed that in the news. Here's another:

The U.S. Multiple Risk Factor Intervention Trial (MRFIT), sponsored by the National Heart, Lung and Blood Institute, compared mortality rates and eating habits of over 12,000 men. Those with "good" dietary habits (reduced saturated fat and cholesterol, reduced smoking, etc.) showed a marginal reduction in total coronary heart disease, but their overall mortality from all causes was higher. Similar results have been obtained in several other studies. The few studies that indicate a correlation between fat reduction and a decrease in coronary heart disease mortality also document a concurrent increase in deaths from cancer, brain hemorrhage, suicide and violent death.6

Please click the first link in this post when you've got time, because there's so much more data than I can possibly quote here. Comparisons of populations with higher or lower saturated fat intakes also contradict the common wisdom that saturated fats are harmful, for example.

But... why would common opinion have this all wrong? Certain food industries certainly have benefitted, which is part of the answer.

Soybean production was virtually non-existent in 1900. Now it's one of the three biggest crops in the US and hydrogenated soybean oil is everywhere. Soybeans chased out the imported coconut and palm oils that had been used before. People started throwing animal fat away (e.g., the lard left over when you cook bacon), and buying Crisco instead, in the misguided belief that this was healthier. (The primary fat in lard, by the way, is mono-unsaturated.)

Secondly, dairy farmers can now make cows produce something like 20+ times their normal amount of milk. And sure, it's weak, gray, practically fat-free, watery stuff. A 19th century person would never have let it pass their lips, but in these Orwellian times we think of it as healthy (i.e., "skim"). Skim milk, first of all, was never "skimmed". There wasn't much fat in it to begin with, nor any of the other nutrients cow's milk is supposed to have. Secondly, there is no point whatsoever in putting vitamins A or D in fat-free milk. Vitamins A and D are fat-soluble; without the fat, you absorb no vitamin A or D at all. Thirdly, they beef it up with spray-dried milk derivatives, which means they're adding oxidized proteins and oxidized cholesterol in an attempt to at least make it opaque. But they can boost their profit margin by doing this, so they've sold the public on the idea that this crappy milk is a health food. Considering hormones, antibiotics, and other disgusting byproducts of factory farmed dairy, skim milk is very far from being healthy.

Processed food makers know they can boost their profit margins using cheaper vegetable oils rather than real heavy cream or butter. Who needs vanilla ice cream made with actual cream, right? Just mix up some skim milk, veggie oil, guar gum, carageenan, and emulsifiers, and it's more or less the same consistency... who will ever know? And plus you get to brag about how it contains zero saturated fat! Next time you're at the store just consider the marketing and advertising dollars that have gone into low-fat and low-saturated-fat products-- and how much money has simultaneously been saved using soybean oil instead of butter, or skim milk + locust bean gum instead of cream.

Having said all this, dairy and animal meat fats are not what they are supposed to be these days. Butter is supposed to contain an anti-cancer and anti-abdominal-fat compound called Conjugated Linoleic Acid, but American butter does not because the cows don't eat grass. Beef is supposed to contain a fair bit of omega-3 fatty acids, as should lamb meat, but only if they are grass-fed. Luckily, we are returning somewhat to local farming, and the coming (permanent) oil shortage will make factory farming less and less profitable. I hope that in another 10 years I can buy my eggs, butter, milk, and cream from someone local, who lets their animals eat grass. I have a lot of respect for traditional foods, and-- if only they would stop misleading us-- research shows that those traditions were healthiest.