HOUSTON – A diet that incorporates a daily dose of pistachios may help reduce the risk of lung and other cancers, according to data presented at the American Association for Cancer Research Frontiers in Cancer Prevention Research Conference, held Dec. 6-9.

“It is known that vitamin E provides a degree of protection against certain forms of cancer. Higher intakes of gamma-tocopherol, which is a form of vitamin E, may reduce the risk of lung cancer,” said Ladia M. Hernandez, M.S., R.D., L.D., senior research dietitian in the Department of Epidemiology at the University of Texas M. D. Anderson Cancer Center, and doctoral candidate at Texas Woman’s University – Houston Center.

“Pistachios are a good source of gamma-tocopherol. Eating them increases intake of gamma-tocopherol so pistachios may help to decrease lung cancer risk,” she said.

Pistachios are known to provide heart-healthy benefits: they have a cholesterol-lowering effect and supply the antioxidants typically found in foods of plant origin. Hernandez and colleagues conducted a six-week, controlled clinical trial to evaluate whether consuming pistachios would increase dietary intake and serum levels of gamma-tocopherol. A pistachio-rich diet could potentially help reduce the risk of other cancers as well, according to Hernandez.

“Because epidemiologic studies suggest gamma-tocopherol is protective against prostate cancer, pistachio intake may help,” she said. “Other rich sources of gamma-tocopherol include nuts such as peanuts, pecans, and walnuts, as well as soybean and corn oils.”

The study, conducted at Texas Woman’s University – Houston Center, included 36 healthy participants who were randomized into either a control group or the intervention group consisting of a pistachio diet. There were 18 participants in the control group and 18 in the intervention group. There was a two-week baseline period, followed by a four-week intervention period in which the intervention group was provided with 68 grams (about 2 ounces or 117 kernels) of pistachios per day; the control group continued with their normal diet.

The researchers investigated the effect on dietary intake and on cholesterol-adjusted serum gamma-tocopherol. Intake was calculated using the Nutrition Data System for Research Version 2007, and consumption was monitored using diet diaries and by weighing the returned pistachios.

Hernandez and colleagues found a significant increase in energy-adjusted dietary intake of gamma-tocopherol at weeks three and four in those on the pistachio diet compared with those on the control diet. A similar effect was seen at weeks five and six. For those on the pistachio diet, cholesterol-adjusted serum gamma-tocopherol was significantly higher at the end of the intervention period than at baseline.

“Pistachios are one of those ‘good-for-you’ nuts, and 2 ounces per day could be incorporated into dietary strategies designed to reduce the risk of lung cancer without significant changes in body mass index,” said Hernandez.

Public release date: 8-Dec-2009

Testosterone does not induce aggression

New scientific evidence refutes the preconception that testosterone causes aggressive, egocentric, and risky behavior. A study at the Universities of Zurich and Royal Holloway London with more than 120 experimental subjects has shown that the sex hormone with the poor reputation can encourage fair behavior when this serves to secure one’s own status.

Popular scientific literature, art, and the media have been attributing the role of aggression to the arguably best-known sex hormone for decades. Research appeared to confirm this: castrating male rodents evidently reduced combativeness among the animals. The prejudice thus grew over decades that testosterone causes aggressive, risky, and egocentric behavior. The inference from these animal experiments that testosterone produces the same effects in humans has proven to be false, however, as a combined study by neuroscientist Christoph Eisenegger and economist Ernst Fehr, both of the University of Zurich, and economist Michael Naef of Royal Holloway in London demonstrates. “We wanted to verify how the hormone affects social behavior,” Dr. Christoph Eisenegger explains, adding, “we were interested in the question: what is truth, and what is myth?”

For the study, published in the renowned journal Nature, some 120 test subjects took part in a behavioral experiment where the distribution of a real amount of money was decided. The rules allowed both fair and unfair offers. The negotiating partner could subsequently accept or decline the offer. The fairer the offer, the less probable a refusal by the negotiating partner. If no agreement was reached, neither party earned anything.
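The bargaining rules described here match the classic ultimatum game. As an illustrative sketch (the amounts and function name below are hypothetical, not taken from the study), the payoff structure works like this:

```python
# Minimal sketch of the bargaining payoffs described above: one player
# proposes a split of a fixed pot, the other accepts or rejects, and a
# rejection leaves both parties with nothing.

def ultimatum_payoffs(pot, offer, accepted):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if not accepted:
        return (0, 0)          # no agreement: neither party earns anything
    return (pot - offer, offer)

# A fair offer that is accepted splits the pot...
print(ultimatum_payoffs(10, 5, accepted=True))
# ...while a rejected lowball offer costs both players everything.
print(ultimatum_payoffs(10, 1, accepted=False))
```

Because rejection wipes out both payoffs, fairer offers reduce the proposer’s risk of walking away empty-handed, which is exactly the incentive the study’s design exploits.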

Before the game the test subjects were administered either a dose of 0.5 mg testosterone or a corresponding placebo. “If one were to believe the common opinion, we would expect subjects who received testosterone to adopt aggressive, egocentric, and risky strategies – regardless of the possibly negative consequences on the negotiation process,” Eisenegger elucidates.

Fairer with testosterone

The study’s results, however, sharply contradict this view. Test subjects with an artificially enhanced testosterone level generally made better, fairer offers than those who received placebos, thus minimizing the risk of having their offers rejected. “The preconception that testosterone only causes aggressive or egoistic behavior in humans is thus clearly refuted,” sums up Eisenegger. Instead, the findings suggest that the hormone increases sensitivity to status. In animal species with relatively simple social systems, an increased awareness of status may express itself in aggressiveness. “In the socially complex human environment, pro-social behavior secures status, not aggression,” surmises study co-author Michael Naef of Royal Holloway London. “The interplay between testosterone and the socially differentiated environment of humans, and not testosterone itself, probably causes fair or aggressive behavior.”

Moreover, the study shows that the popular wisdom that the hormone causes aggression is apparently deeply entrenched: those test subjects who believed they had received the testosterone compound, and not the placebo, stood out with their conspicuously unfair offers. It is possible that these persons exploited the popular wisdom to legitimize their unfair actions. Economist Michael Naef states: “It appears that it is not testosterone itself that induces aggressiveness, but rather the myth surrounding the hormone. In a society where qualities and manners of behavior are increasingly traced to biological causes and thereby partly legitimated, this should make us sit up and take notice.” The study clearly demonstrates the influence of both social and biological factors on human behavior.

Public release date: 14-Dec-2009

Antidepressants may increase risk of stroke and death

Study examines data from more than 136,000 postmenopausal women

Postmenopausal women who take antidepressants face a small but statistically significant increased risk of stroke and death compared with those who do not take the drugs. The new findings are from the federally funded, multi-institution Women’s Health Initiative study sponsored by the National Institutes of Health, and the results are published in the December 14 online edition of Archives of Internal Medicine.

Senior author Sylvia Wassertheil-Smoller, Ph.D., is a principal investigator in the Women’s Health Initiative and is division head of epidemiology and professor of epidemiology & population health at Albert Einstein College of Medicine of Yeshiva University. In addition to Einstein, the institutions involved in the study included Massachusetts General Hospital, where the paper’s lead author, Jordan W. Smoller, M.D., Sc.D., is based; he is also an associate professor of psychiatry at Harvard Medical School. Researchers from the University of California San Diego, the University of Washington, the University of Hawaii, the University of Iowa, the University of Massachusetts Medical School, and Emory University School of Medicine also contributed to the study.

The study examined data from 136,293 study participants, aged 50 to 79, who were not taking antidepressants when they enrolled in the study, and who were followed for an average of six years. Data from 5,496 women who were taking antidepressants at their first follow-up visit were compared with data from 130,797 not taking antidepressants at follow-up. The researchers compared the two groups with respect to the incidence of fatal or nonfatal stroke, fatal or nonfatal heart attack and death due to all causes.

The researchers found no difference in coronary heart disease (defined as fatal and non-fatal heart attacks). However, they did observe a significant difference in stroke rates: antidepressant users were 45 percent more likely to experience strokes than women who weren’t taking antidepressants.

The study also found that when overall death rates (all-cause mortality) were compared between the two groups, those on antidepressants had a 32 percent higher risk of death from all causes compared with non-users.

Dr. Wassertheil-Smoller notes that the overall risk for women taking antidepressants is relatively small: a 0.43 percent risk of stroke annually versus a 0.3 percent annual risk of stroke for women not taking antidepressants. However, because antidepressants are among the most widely prescribed drugs in the U.S. – especially among postmenopausal women – small risk increases can have significant implications for large patient populations.
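The relative and absolute figures above are two views of the same numbers. As an arithmetic sketch (using the annual rates quoted in the article; the unadjusted ratio comes out near the 45 percent figure, which the study derived from an adjusted model):

```python
# Annual stroke risk as reported in the article
risk_users = 0.0043      # 0.43% per year among antidepressant users
risk_nonusers = 0.0030   # 0.30% per year among non-users

# Relative risk: how many times more likely users are to have a stroke.
# The unadjusted ratio is ~1.43, close to the 45% adjusted increase reported.
relative_risk = risk_users / risk_nonusers

# Absolute risk difference: extra strokes per person per year
# (~13 additional strokes per 10,000 women per year).
absolute_difference = risk_users - risk_nonusers

print(f"Relative risk: {relative_risk:.2f}")
print(f"Absolute difference: {absolute_difference:.4f} per person-year")
```

A large relative increase on a small baseline still yields a small absolute risk for any individual woman, which is the point Dr. Wassertheil-Smoller makes; at the population scale, however, the absolute numbers add up.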

Dr. Wassertheil-Smoller cautioned that “it remains unclear” from the data whether antidepressants are solely responsible for the greater mortality rate among users. The link observed in this study between antidepressant use and increased stroke risk for older women might partially be due to the underlying depression, since several studies have found that depression itself is a risk factor for cardiovascular problems.

In their analysis, the researchers tried to control for depression’s effects, but they could not rule out the possibility that underlying depression in the antidepressant group contributed to its increased stroke risk. The study found no difference in stroke risk between the two major classes of antidepressants, selective serotonin reuptake inhibitors (SSRIs) and tricyclic antidepressants (TCAs). However, the SSRIs did appear to convey a higher risk of hemorrhagic stroke, which is caused by bleeding in the brain.

Antidepressants are valuable drugs for treating a condition that can be debilitating or even fatal. Dr. Wassertheil-Smoller advises women who may be concerned about taking their antidepressants based on this study to discuss the matter with their physicians. “You have to weigh the benefits that you get from these antidepressants against the small increase in risk that we found in this study,” she says.

Dr. Jordan Smoller adds, “While this study did find an association between antidepressants and cardiovascular events, additional research needs to be done to determine exactly what it signifies. Older women taking antidepressants, like everyone else, should also work on modifying their other risk factors for cardiovascular disease, such as maintaining a healthy weight and controlling cholesterol levels and blood pressure.”

The researchers also pointed out other limitations of their findings. This was an observational study, so the findings are not as conclusive of causality as those of a randomized controlled trial would be; and because the WHI study comprises primarily older white women, the findings might not extend to other groups.

The group’s paper, “Antidepressant Use and Risk of Incident Cardiovascular Morbidity and Mortality Among Post-Menopausal Women in the Women’s Health Initiative Study,” appears in the December 14 online edition of Archives of Internal Medicine.

Dr. Wassertheil-Smoller is also the Dorothy and William Manealoff Foundation & Molly Rosen Professor of Social Medicine and the principal investigator for the Women’s Health Initiative at Einstein.

Ralph’s Note – It is interesting that the researcher broke the increased chance of death down to a yearly basis, as opposed to the six years of the study. This minimizes the apparent risk.

Public release date: 14-Dec-2009

Regular coffee, decaf and tea all associated with reduced risk for diabetes

Individuals who drink more coffee (regular or decaffeinated) or tea appear to have a lower risk of developing type 2 diabetes, according to an analysis of previous studies reported in the December 14/28 issue of Archives of Internal Medicine.

By the year 2025, approximately 380 million individuals worldwide will be affected by type 2 diabetes, according to background information in the article. “Despite considerable research attention, the role of specific dietary and lifestyle factors remains uncertain, although obesity and physical inactivity have consistently been reported to raise the risk of diabetes mellitus,” the authors write. A previously published meta-analysis suggested that drinking more coffee may be linked with a reduced risk, and the amount of available information has more than doubled since then.

Rachel Huxley, D.Phil, of The George Institute for International Health, University of Sydney, Australia, and colleagues identified 18 studies involving 457,922 participants and assessing the association between coffee consumption and diabetes risk published between 1966 and 2009. Six studies involving 225,516 individuals also included information about decaffeinated coffee, whereas seven studies with 286,701 participants reported on tea consumption.

When the authors combined and analyzed the data, they found that each additional cup of coffee consumed in a day was associated with a 7 percent reduction in the excess risk of diabetes. Individuals who drank three to four cups per day had an approximately 25 percent lower risk than those who drank between zero and two cups per day.
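The per-cup and heavier-drinking figures are roughly consistent if the per-cup effect is treated as multiplicative. A brief sketch (this compounding assumption is an illustration, not the meta-analysis’s actual model):

```python
# Hypothetical illustration: if each additional daily cup multiplies the
# excess diabetes risk by 0.93 (a 7% reduction), the effect of several
# cups compounds multiplicatively.
per_cup_factor = 0.93

for cups in range(1, 5):
    remaining = per_cup_factor ** cups
    print(f"{cups} extra cup(s): risk at {remaining:.0%} of baseline")

# Four extra cups leave roughly 75% of the baseline excess risk, in line
# with the ~25% lower risk reported for those drinking three to four cups.
```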

In addition, in the studies that assessed decaffeinated coffee consumption, those who drank more than three to four cups per day had about a one-third lower risk of diabetes than those who drank none. Those who drank more than three to four cups of tea had a one-fifth lower risk than those who drank no tea.

“That the apparent protective effect of tea and coffee consumption appears to be independent of a number of potential confounding variables raises the possibility of direct biological effects,” the authors write. Because of the association between decaffeinated coffee and diabetes risk, the association is unlikely to be solely related to caffeine. Other compounds in coffee and tea—including magnesium, antioxidants known as lignans or chlorogenic acids—may be involved, the authors note.

“If such beneficial effects were observed in interventional trials to be real, the implications for the millions of individuals who have diabetes mellitus, or who are at future risk of developing it, would be substantial,” they conclude. “For example, the identification of the active components of these beverages would open up new therapeutic pathways for the primary prevention of diabetes mellitus. It could also be envisaged that we will advise our patients most at risk for diabetes mellitus to increase their consumption of tea and coffee in addition to increasing their levels of physical activity and weight loss.”

Public release date: 16-Dec-2009

New study links DHA type of omega-3 to better nervous-system function

Deficiencies may factor into mental illnesses

WASHINGTON — The omega-3 essential fatty acids commonly found in fatty fish and algae help animals avoid sensory overload, according to research published by the American Psychological Association. The finding connects low omega-3s to the information-processing problems found in people with schizophrenia; bipolar, obsessive-compulsive, and attention-deficit hyperactivity disorders; Huntington’s disease; and other afflictions of the nervous system.

The study, reported in the journal Behavioral Neuroscience, provides more evidence that fish is brain food. The key finding was that two omega-3 fatty acids – docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA) – appear to be most useful in the nervous system, maybe by maintaining nerve-cell membranes.

“It is an uphill battle now to reverse the message that ‘fats are bad,’ and to increase omega-3 fats in our diet,” said Norman Salem Jr., PhD, who led this study at the Laboratory of Membrane Biochemistry and Biophysics at the National Institute on Alcohol Abuse and Alcoholism.

The body cannot make these essential nutrients from scratch. It gets them by metabolizing their precursor, α-linolenic acid (LNA), or from foods or dietary supplements with DHA and EPA in a readily usable form. “Humans can convert less than one percent of the precursor into DHA, making DHA an essential nutrient in the human diet,” added Irina Fedorova, PhD, one of the paper’s co-authors. EPA is already known for its anti-inflammatory and cardiovascular effects, but DHA makes up more than 90 percent of the omega-3s in the brain (which has no EPA), retina and nervous system in general.

In the study, the researchers fed four different diets with no or varying types and amounts of omega-3s to four groups of pregnant mice and then their offspring. They measured how the offspring, once grown, responded to a classic test of nervous-system function in which healthy animals are exposed to a sudden loud noise. Normally, animals flinch. However, when they hear a softer tone in advance, they flinch much less. It appears that normal nervous systems use that gentle warning to prepare instinctively for future stimuli, an adaptive process called sensorimotor gating.

Only the mice raised on DHA and EPA, but not on their precursor LNA, showed normal, adaptive sensorimotor gating by responding in a significantly calmer way to the loud noises that followed soft tones. The mice in all other groups, even when warned, were startled nearly as much by the loud sound. When DHA was deficient, the nervous system most obviously did not downshift, resulting in an abnormal state that could leave animals perpetually startled and easily overwhelmed by sensory stimuli.

The authors concluded that not enough DHA in the diet may reduce the ability to handle sensory input. “It only takes a small decrement in brain DHA to produce losses in brain function,” said Salem.

In humans, weak sensorimotor gating is a hallmark of many nervous-system disorders such as schizophrenia or ADHD. Given mounting evidence of the role omega-3s play in the nervous system, there is intense interest in their therapeutic potential, perhaps as a supplement to medicines. For example, people with schizophrenia have lower levels of essential fatty acids, possibly from a genetic variation that results in poor metabolism of these nutrients.

More broadly, the typical American diet is much lower in all types of omega-3 than in omega-6 essential fatty acids, according to Salem. High intake of omega-6, or linoleic acid, reduces the body’s ability to incorporate omega-3s. As a result, “we have the double whammy of low omega-3 intake and high omega-6 intake,” he said.

BIRMINGHAM, Ala. – Autism and related developmental disorders are becoming more common, with a prevalence rate approaching 1 percent among American 8-year-olds, according to new data from researchers at the University of Alabama at Birmingham (UAB) School of Public Health and the Centers for Disease Control and Prevention (CDC).

The study is a partnership between UAB, the CDC and 10 other U.S. research sites. It shows that one in 110 American 8-year-olds is classified as having an autism spectrum disorder (ASD), a 57 percent increase in ASD cases compared to four years earlier.

The new findings, published Dec. 18 in the CDC’s Morbidity and Mortality Weekly Report (MMWR), highlight the need for social and educational services to help those affected by the condition, said Beverly Mulvihill, Ph.D., a UAB associate professor of public health and co-author on the study.

ASDs are a group of developmental disabilities such as autism and Asperger disorder that are characterized by delays or changes in childhood socialization, communication and behavior.

“This is a dramatic increase in the number of kids classified as autistic or documented on the spectrum of similar disorders,” Mulvihill said. “It is not entirely clear what is causing the rise, but we know major collaborative efforts are needed to improve the understanding and lives of people and families impacted.”

The MMWR study discusses possible factors that might contribute to the increase in ASD cases, including a broader definition of autism disorders and heightened awareness of ASD among parents, doctors, educators, and other professionals. The findings do not address how much of the increase is attributable to a true rise in the risk of developing ASD, to more frequent and earlier diagnoses, or to other factors.

Data comes from the Autism and Developmental Disabilities Monitoring (ADDM) Network, a collection of 11 sites in Alabama, Arizona, Colorado, Florida, Georgia, Maryland, Missouri, North Carolina, Pennsylvania, South Carolina and Wisconsin. ADDM reviewers are uniformly trained to review and confirm cases; some children included in the study have documented ASD symptoms but never received a diagnosis.

The study also found that boys are 4.5 times more likely than girls to have ASD, a finding that confirms earlier studies, says Martha Wingate, Dr.P.H., a UAB assistant professor of public health and study co-author.

“It still is not clear why males more frequently are affected,” Wingate said. “One thing we know for sure is that more research is needed to quantify the effects of single or multiple factors such as diagnosis patterns, inclusion of milder cases and other components.”

The ADDM sites are not selected based on any statistical pattern, but the 300,000-plus children included in the study represent 8 percent of the nation’s 8-year-olds.

Ralph’s Note – Why only use 8-year-olds? What is the estimated percentage of all children under 18?

Public release date: 21-Dec-2009

The brain may feel other people’s pain

By Amy Norton

NEW YORK (Reuters Health) – If you’ve ever thought that you literally feel other people’s pain, you may be right. A brain-imaging study suggests that some people have true physical reactions to others’ injuries.

Using an imaging technique called functional MRI, UK researchers found evidence that people who say they feel vicarious pain do, in fact, have heightened activity in pain-sensing brain regions upon witnessing another person being hurt.

The findings, published in the journal Pain, could have implications for understanding, and possibly treating, cases of unexplained “functional” pain.

“Patients with functional pain experience pain in the absence of an obvious disease or injury to explain their pain,” explained Dr. Stuart W. G. Derbyshire of the University of Birmingham, one of the researchers on the new study.

“Consequently,” he told Reuters Health in an email, “there is considerable effort to uncover other ways in which the pain might be generated.”

Derbyshire said he now wants to study whether the brains of patients with functional pain respond to images of injury in the same way that the current study participants’ did.

For the study, Derbyshire and colleague Jody Osborn first had 108 college students view several images of painful situations — including athletes suffering sports injuries and patients receiving an injection. Close to one-third of the students said that, for at least one image, they not only had an emotional reaction, but also fleetingly felt pain in the same site as the injury in the image.

Derbyshire and Osborn then took functional MRI scans of 10 of these “responders,” along with 10 “non-responders” who reported no pain while viewing the images.

Functional MRI charts changes in brain blood flow, allowing researchers to see which brain areas become more active in response to a particular stimulus. Here, the researchers scanned participants’ brains as they viewed either images of people in pain, images that were emotional but not painful, or neutral images.

The investigators found that while viewing the painful images, both responders and non-responders showed activity in the emotional centers of the brain. But responders showed greater activity in pain-related brain regions compared with non-responders, and as compared with their own brain responses to the emotional images.

“We think this confirms that at least some people have an actual physical reaction when observing others being injured or expressing pain,” Derbyshire said.

He noted that the responders also tended to say that they avoided horror movies and disturbing images on the news “so as to avoid being in pain” — which, the researcher said, is more than just an empathetic response.

As far as the potential practical implications of the findings, Derbyshire said it would be a “reach” to think that such brain mechanisms might be behind all functional pain. But, he added, “they might explain some of it.”

________________________________

These reports are done with appreciation of all the doctors, scientists, and other medical researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without base aspirations for fame or fortune.