25. Treatment with light benefits Alzheimer’s patients, Wayne State University finds

26. Diet counts: Iron intake in teen years can impact brain in later life

27. Grapes may help prevent age-related blindness

28. Expectant mothers on antidepressants risk newborns with high blood pressure

29. Researchers: Honeybee deaths linked to seed insecticide exposure

30. NIH study shows 32 million Americans have autoantibodies that target their own tissues

31. Study finds curcumin as effective as antidepressants

Spinal Manipulation and Home Exercises Better than Medication for Relieving Neck Pain

Most people experience neck pain at some point in their lives. While usually not serious, neck pain can interfere greatly with daily activities. Usual care for neck pain may include medication or chiropractic care, yet little evidence has been published on the efficacy of spinal manipulation therapy (SMT) for neck pain. Researchers conducted a randomized trial to test the hypothesis that SMT is more effective than medication or home exercise with advice for acute and subacute neck pain. Two hundred seventy-two patients aged 18 to 65 with nonspecific neck pain of two to 12 weeks’ duration were randomly assigned to receive SMT, medication, or home exercise with advice for 12 weeks. Up to one year after treatment, patients who received 12 weeks of SMT reported greater pain relief than patients in the medication group. Patients in the home exercise with advice group reported just as much pain relief as those in the SMT group over the same period; however, patients receiving SMT reported greater satisfaction with their care than those in either of the other groups. The researchers conclude that SMT and home exercise are similarly effective and that both are more effective than medication for neck pain.

Changes seen in cerebrospinal fluid levels before onset of Alzheimer dementia

CHICAGO – Cerebrospinal fluid levels of Aβ42 appear to be decreased at least five to 10 years before some patients with mild cognitive impairment develop Alzheimer disease (AD) dementia, whereas other spinal fluid levels appear to be later markers of disease, according to a report in the January issue of Archives of General Psychiatry, one of the JAMA/Archives journals.

The researchers note as background that disease-modifying therapies, such as immunotherapy, are more likely to be successful if started in the early stages of the disease, so there is a need to identify patients with Alzheimer disease before neurodegeneration becomes too severe.

Peder Buchhave, M.D., Ph.D., who is affiliated with Lund University and Skane University, Sweden, and colleagues conducted an extended follow-up of the cohort from a previous study of 137 patients with mild cognitive impairment (MCI) at baseline. The median follow-up was 9.2 years.

During the follow-up, 72 patients (53.7 percent) developed AD and 21 (15.7 percent) progressed to other forms of dementia. At baseline, cerebrospinal fluid Aβ42 levels were reduced, and levels of the biomarkers T-tau and P-tau were elevated, in patients who converted to AD during follow-up compared with patients who did not develop AD.

The study indicates that baseline CSF Aβ42 levels were equally reduced in patients with MCI who converted to AD within five years (the early converters) and in those who converted later, between five and 10 years. However, T-tau and P-tau levels were significantly higher in early converters than in later ones.

Researchers suggest that “approximately 90 percent of patients with MCI and pathologic CSF biomarkers at baseline will develop AD within 9.2 years.”

“Therefore, these markers can identify individuals at high risk for future AD at least five to 10 years before conversion to dementia. Hopefully, new therapies that can retard or even halt progression of the disease will soon be available. Together with an early and accurate diagnosis, such therapies could be initiated before neuronal degeneration is too widespread and patients are already demented,” the authors conclude.

When overeating, calories, not protein, contribute to increase in body fat

CHICAGO – In a study of 25 healthy individuals living in a controlled setting who were randomized to overconsumption of diets with different levels of protein, those consuming the low-protein diet gained less weight than those consuming the normal- and high-protein diets, and calories alone, not protein, appeared to contribute to the increase in body fat, according to a study in the January 4 issue of JAMA. The researchers also found that protein did contribute to changes in energy expenditure and lean body mass.

“Obesity has become a major public health concern with more than 60 percent of adults in the United States categorized as overweight and more than 30 percent as obese,” according to background information in the article. The role of diet composition in response to overeating and energy dissipation is unclear.

George A. Bray, M.D., of the Pennington Biomedical Research Center, Baton Rouge, La., and colleagues conducted a study to determine whether the level of dietary protein differentially affected body composition, weight gain, or energy expenditure under tightly controlled conditions. The randomized controlled trial included 25 healthy, weight-stable U.S. male and female volunteers, ages 18 to 35 years, with a body mass index between 19 and 30. The first participant was admitted to the inpatient metabolic unit in June 2005 and the last in October 2007. After consuming a weight-stabilizing diet for 13 to 25 days, participants were randomized to diets containing 5 percent of energy from protein (low protein), 15 percent (normal protein), or 25 percent (high protein), which they were overfed during the last 8 weeks of their 10- to 12-week stay in the inpatient metabolic unit. Compared with energy intake during the weight-stabilization period, the overfeeding diets provided approximately 40 percent more energy, which corresponds to 954 calories a day.
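As a quick arithmetic check on the overfeeding figures (the 954 extra calories and the 40 percent surplus are from the article; the baseline intake below is derived from them, not stated in the study):

```python
# Overfeeding arithmetic from the JAMA overeating study.
# The article states the overfeeding diets supplied ~40% more energy than
# the weight-stabilizing diet, corresponding to 954 extra calories a day.

extra_fraction = 0.40   # 40% above weight-stabilizing intake (from the article)
extra_kcal = 954        # reported daily surplus, kcal (from the article)

# Implied baseline (weight-stabilizing) intake, derived from the two figures above:
baseline_kcal = extra_kcal / extra_fraction
total_kcal = baseline_kcal + extra_kcal

print(round(baseline_kcal))  # implied baseline intake, kcal/day
print(round(total_kcal))     # implied intake during overfeeding, kcal/day
```

The two reported figures are mutually consistent only if the weight-stabilizing diet provided roughly 2,385 kcal/day, which is a plausible intake for healthy young adults.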

All participants in the study gained weight, and there were no differences by sex. The rate of weight gain in the low-protein diet group was significantly less than in the other two groups (6.97 lbs [3.16 kg] vs. 13.3 lbs [6.05 kg] in the normal-protein diet group and 14.4 lbs [6.51 kg] in the high-protein diet group).
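The paired pound/kilogram figures can be checked against each other; this small sketch, using the standard factor of 2.20462 lb per kg, simply confirms that the reported conversions agree to rounding precision:

```python
# Cross-check the weight-gain figures reported in both pounds and kilograms.
KG_TO_LB = 2.20462  # standard conversion factor

pairs = [  # (reported lb, reported kg), taken from the article
    (6.97, 3.16),   # low-protein group
    (13.3, 6.05),   # normal-protein group
    (14.4, 6.51),   # high-protein group
]

for lb, kg in pairs:
    # Reported lb value should match the kg value converted and rounded.
    assert abs(kg * KG_TO_LB - lb) < 0.06, (lb, kg)
```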

“Body fat increased similarly in all 3 protein diet groups and represented 50 percent to more than 90 percent of the excess stored calories. Resting energy expenditure, total energy expenditure, and body protein did not increase during overfeeding with the low protein diet,” the authors write.

Lean body mass (body protein) decreased during the overeating period by 1.5 lbs. (0.70 kg) in the low protein diet group compared with a gain of 6.3 lbs. (2.87 kg) in the normal protein diet group and 7 lbs. (3.18 kg) in the high protein diet group. Resting energy expenditure (normal protein diet: 160 calories/day; high protein diet: 227 calories/day) increased significantly with the normal and high protein diets.

“In summary, weight gain when eating a low protein diet (5 percent of energy from protein) was blunted compared with weight gain when eating a normal protein diet (15 percent of energy from protein) with the same number of extra calories. Calories alone, however, contributed to the increase in body fat. In contrast, protein contributed to the changes in energy expenditure and lean body mass, but not to the increase in body fat,” the researchers write.

“The key finding of this study is that calories are more important than protein while consuming excess amounts of energy with respect to increases in body fat.”

In an accompanying editorial, Zhaoping Li, M.D., Ph.D., and David Heber, M.D., Ph.D., of the University of California, Los Angeles, write that this study “informs primary care physicians and policy makers about the benefits of protein in weight management.”

“The results suggest that overeating low protein diets may increase fat deposition, leading to loss of lean body mass despite lesser increases in body weight. Policy makers and primary care physicians need to understand the role of the Western diet in promoting overweight and obesity. Because this diet increases the risk of overnutrition through fat deposition beyond what is detected by body mass index, the magnitude of the obesity epidemic, as currently assessed by body mass index, may have been underestimated. Clinicians should consider assessing a patient’s overall fatness rather than simply measuring body weight or body mass index, and concentrate on the potential complications of excess fat accumulation. The goals for obesity treatment should involve fat reduction rather than simply weight loss, along with a better understanding of nutrition science.”

Deer antlers inspire a new theory on osteoporosis

IMAGE: Three broken antlers and an intact one. The weakening is due to manganese depletion.

The loss of manganese could mean that calcium does not stick to bones, which could cause osteoporosis. This is the new theory put forward by researchers at the University of Castilla-La Mancha (UCLM) in Spain after studying deer antlers. The hypothesis, published this month in the journal Frontiers in Bioscience, still needs to be confirmed by the scientific community.

Through the study of deer antlers, researchers at the Research Institute of Hunting Resources (IREC, a joint UCLM-CSIC-JCCM centre) suggest that the origin of osteoporosis may lie not directly in a lack of calcium but rather in the lack of a mineral essential to calcium absorption. In particular, they believe that this mineral could be manganese, according to the new theory published in the latest issue of Frontiers in Bioscience.

According to Tomás Landete, sub-director of the IREC and one of the team’s researchers, “previous antler studies show that manganese is necessary for calcium absorption. Our hypothesis is that when the human body absorbs less manganese, or when it is sent from the skeleton to other organs that require it, such as the brain, the calcium that is extracted at the same time is not properly absorbed and is excreted in the urine. It is in this way that osteoporosis can slowly strike.”

The theory must now be validated with more studies and medical trials but its creators believe that it is a “step in a totally new direction in osteoporosis research as it considers calcium loss to be a consequence of the disease and not the origin.”

The idea for the new proposal came from a dramatic increase in antler breakages seen in Spain in 2005. When scientists analysed these antlers in detail, they realised that weakening was due to manganese depletion caused by the deer’s diet. That year saw an intensely cold winter which in turn caused plants to reduce their manganese concentrations in response to such stress.

“Antlers grow by transferring 20% of the skeleton’s calcium towards their structure. We therefore saw that it was not calcium deficiency that caused the weakening but rather the deficiency of manganese,” clarifies Landete. “The lack of manganese was almost as if the ‘glue’ that sticks calcium to antler bones was missing.”

Links to Alzheimer’s and Parkinson’s Disease

In the case of humans, the researchers suggest that manganese is extracted from the bones when it is required by the “most important” organs, such as the brain. The researcher adds that “maintaining the bones is important, but even more so is sustaining the working of the brain, which uses 25% of our energy intake when at rest.”

The team also points out that when this vital mineral runs out after the onset of osteoporosis, conditions like Alzheimer’s disease, Parkinson’s disease, and senile dementia could strike. To put this theory to the test, they analysed data from 113 patients who were operated on for osteoporosis and osteoarthritis (wear and tear of joint cartilage) at Hellín Hospital in Albacete, Spain between 2008 and 2009. Some 40% of those operated on for osteoporosis showed some form of cerebral dysfunction whereas this was not the case in any of the 68 patients operated on for osteoarthritis.

Furthermore, the percentage increased with age and only amongst those patients with osteoporosis. The exhaustion of manganese reserves could be behind the bone disease and the cerebral degeneration. “We are collecting human bones to confirm this. However, studies on rats in which Alzheimer’s disease has been induced by aluminium intoxication show that as the severity of this disease increases, manganese levels in the bones decrease,” says Landete.

The researcher also recalls studies that link manganese to Parkinson’s disease and show that astrocytes, which provide support to neurons, have specific enzymes that require manganese. In any case, researchers outline that their theory “is not a final solution to such diseases but constitutes the first step in a new direction” – a new direction that requires validation and confirmation from the scientific community.

Finding lays basis for studies in animal models of diabetic kidney disease

SAN ANTONIO (Jan. 2, 2012) — Hydrogen sulfide, a gas notorious for its rotten-egg smell, may have redeeming qualities after all. It reduces high glucose-induced production of scarring proteins in kidney cells, researchers from The University of Texas Health Science Center San Antonio report in the Journal of Biological Chemistry. The paper is scheduled for print publication in early 2012.

“There is interest in gases being mediators of biological events,” said B.S. Kasinath, M.D., professor of medicine and a nephrologist with UT Medicine San Antonio, the clinical practice of the School of Medicine at the UT Health Science Center. “We found that when we added sodium hydrosulfide, a substance that releases hydrogen sulfide, to kidney cells exposed to high glucose, it decreased the manufacture of matrix proteins that scar the kidney.”

Consistent with this finding, enzymes in the kidney that facilitate production of hydrogen sulfide were reduced in mice with type 1 or type 2 diabetes, Dr. Kasinath and his team reported. Scarring in the kidney, called renal fibrosis, is a core defect leading to end-stage kidney disease. Nearly half of end-stage kidney disease in the U.S. is related to diabetes, a disease marked by poor regulation of blood glucose.

“We have found a way to decrease matrix protein synthesis, which is a problem in diabetes,” Dr. Kasinath said. Because the studies are limited to cells, the finding should not be extrapolated to the treatment of human diabetic kidney disease, he emphasized. The finding paves the way for studies in mice or other animal models. Both the safety and effectiveness of hydrogen sulfide should be established in animal models of kidney disease before human trials can be considered. This precaution is required because hydrogen sulfide, at higher concentrations, is known to be a toxic agent.

Journal of Biological Chemistry editors selected the team’s manuscript as a Paper of the Week, a recognition reserved for the top 1 percent of manuscripts in significance and overall importance; about 50 to 100 papers are selected each year from the more than 6,600 the journal publishes. Hak Joo Lee, Ph.D., a postdoctoral fellow in the Division of Nephrology, is the lead author on the study. Dr. Kasinath, also a member of the Barshop Institute for Longevity and Aging Studies at the Health Science Center, is the senior author and wishes to acknowledge the contributions of his co-authors. This work was supported by a grant from the National Institutes of Health/National Institute of Diabetes and Digestive and Kidney Diseases (DK077295) and a U.S. Department of Veterans Affairs research grant to Dr. Kasinath, the principal investigator.

Ralph’s Note: Hydrogen sulfide can be easily obtained through consumption of garlic oil.

Inflammation in depression: Chicken or egg?

New study in Biological Psychiatry attempts to answer the question

Philadelphia, PA — An important ongoing debate in the field of psychiatry is whether inflammation in the body is a consequence of or contributor to major depression. A new study in Biological Psychiatry has attempted to resolve the issue.

Inflammation in the body is common to many diseases, including high blood pressure, coronary artery disease, and diabetes. Depression has also been linked to an inflammation marker in blood called C-reactive protein (CRP).

Dr. William Copeland at Duke University Medical Center and his colleagues tested the direction of association between depression and CRP in a large sample of adolescent and young adult volunteers. By following the children into young adulthood, they were able to assess the changes over time in both their CRP levels and any depressive symptoms or episodes.

They found that elevated levels of CRP did not predict later depression, but the number of cumulative depressive episodes was associated with increased levels of CRP.

“Our results support a pathway from childhood depression to increased levels of CRP, even after accounting for other health-related behaviors that are known to influence inflammation. We found no support for the pathway from CRP to increased risk for depression,” said Copeland.

These findings suggest that, by this measure, depression is more likely to contribute to inflammation in the body as opposed to arise as a consequence of inflammation in the body. The highest levels of CRP were found in those who had endured the wear and tear of multiple depressive episodes. This suggests the possibility that long-term emotional distress, beginning in childhood, may lay the foundation for inflammatory processes that lead, in middle age, to cardiovascular disease and diabetes.

“Depression is a recurring disorder for many people. Thus the finding that repeated episodes of depression contribute to inflammation in the body highlights a potentially important role for untreated depression as a contributor to a range of serious medical problems,” commented Dr. John Krystal, Editor of Biological Psychiatry. “These data add to growing evidence of the medical importance of effectively treating depression.”

Research proving link between virus and MS could point the way to treatment and prevention

Wednesday 4 January 2012

A new study from researchers at Queen Mary, University of London shows how a particular virus tricks the immune system into triggering inflammation and nerve cell damage in the brain, which is known to cause MS.

Previous research has suggested a link between the Epstein-Barr virus (EBV) and multiple sclerosis, but the link has remained controversial since scientists have so far failed to substantiate it. The new study shows the virus is involved in a manner more sophisticated and subtle than previously imagined, and may offer new ways to treat or prevent the disease.

MS is a neurological condition that affects around 100,000 people in the UK. It can cause vision problems, difficulties with walking and fatigue, and tends to strike mainly young and middle-aged women. Its causes are not completely understood, but both genes and environment are known to play a role. Some previous research has suggested that EBV triggers MS, but subsequent studies have failed to find the connection.

The new research, published in the journal Neurology, looked at post-mortem brains of MS patients, examining areas where neurological damage had recently occurred. Dr Ute-Christiane Meier from Barts and the London Medical School, part of Queen Mary, led the research. She explained: “EBV is quite a clever virus; when it’s not growing and spreading it can hide away in our immune cells. In this study we used a different technique which allowed us to detect the virus in the brains of some people affected by MS, even when it was hiding away in the cells.”

Dr Meier and her team of collaborators found that, although the virus was not actively spreading, it was releasing a chemical message into nearby areas of the brain. This chemical message – made up of small RNA molecules – was activating the body’s immune system, causing inflammation. This damages nerve cells in the brain and causes MS symptoms.

Dr Meier continued: “We have to be careful and have to study more MS brains, but this is potentially very exciting research. Now we understand how EBV gets smuggled into the brain by cells of the immune system and that it is found at the crime scene, right where the attack on our nervous system occurs. Now we know this, we may have a number of new ways of treating or even preventing the disease.”

One possibility is the widely used cancer treatment rituximab, a drug known to kill the cells of the immune system in which the virus hides. It is now being trialled as a treatment for MS.

Another possible approach, using anti-viral treatment, will be tested in clinical trials currently in preparation by Professor Gavin Giovannoni and colleagues, also at Queen Mary. “If we can pinpoint EBV as a trigger, it’s possible that we could alter the course of MS or potentially even prevent the condition by treating the virus,” Dr Meier added. “MS so often strikes young women and its unpredictable nature makes it an incredibly difficult disease to live with. We desperately need better ways to tackle the condition.” Interestingly, the research also hinted that infection with EBV and its action on the immune system could also be playing a role in other brain diseases such as cancer and stroke. This research was supported by the Medical Research Council and MS charities, Roan Charitable Trust and Aims2Cure.

DALLAS — Low levels of vitamin D have been linked to depression, according to UT Southwestern Medical Center psychiatrists working with the Cooper Center Longitudinal Study. It is believed to be the largest such investigation ever undertaken.

Low levels of vitamin D already are associated with a cavalcade of health woes from cardiovascular diseases to neurological ailments. This new study – published in Mayo Clinic Proceedings – helps clarify a debate that erupted after smaller studies produced conflicting results about the relationship between vitamin D and depression. Major depressive disorder affects nearly one in 10 adults in the U.S.

“Our findings suggest that screening for vitamin D levels in depressed patients – and perhaps screening for depression in people with low vitamin D levels – might be useful,” said Dr. E. Sherwood Brown, professor of psychiatry and senior author of the study, done in conjunction with The Cooper Institute in Dallas. “But we don’t have enough information yet to recommend going out and taking supplements.”

UT Southwestern researchers examined the results of almost 12,600 participants from late 2006 to late 2010. Dr. Brown and colleagues from The Cooper Institute found that higher vitamin D levels were associated with a significantly decreased risk of current depression, particularly among people with a prior history of depression. Low vitamin D levels were associated with depressive symptoms, particularly in those with a history of depression, so primary care patients with a history of depression may be an important target for vitamin D assessment. The study did not address whether increasing vitamin D levels reduced depressive symptoms.

The scientists have not determined the exact relationship – whether low vitamin D contributes to symptoms of depression, whether depression itself contributes to lower vitamin D levels, or chemically how that happens. But vitamin D may affect neurotransmitters, inflammatory markers and other factors, which could help explain the relationship with depression, said Dr. Brown, who leads the psychoneuroendocrine research program at UT Southwestern.

Vitamin D levels are now commonly tested during routine physical exams, and they already are accepted as risk factors for a number of other medical problems: autoimmune diseases; heart and vascular disease; infectious diseases; osteoporosis; obesity; diabetes; certain cancers; and neurological disorders such as Alzheimer’s and Parkinson’s diseases, multiple sclerosis, and general cognitive decline.

Investigators used information gathered by the institute, which has 40 years of data on runners and other fit volunteers. UT Southwestern has a partnership with the institute, a preventive medicine research and educational nonprofit located at the Cooper Aerobics Center, to develop a joint scientific medical research program aimed at improving health and preventing a wide range of chronic diseases. The institute maintains one of the world’s most extensive databases – known as the Cooper Center Longitudinal Study – that includes detailed information from more than 250,000 clinic visits that has been collected since Dr. Kenneth Cooper founded the institute and clinic in 1970.

Study finds statin costs 400 percent higher in US compared to UK

(Boston) — In the United States, the cost paid for statins (drugs to lower cholesterol) in people under the age of 65 who have private insurance is approximately 400 percent higher than comparable costs paid by the government in the United Kingdom (U.K.). These findings, from the Boston University School of Medicine (BUSM) Boston Collaborative Drug Surveillance Program, are the first results of a comprehensive comparison of prescription drug costs between the U.S. and U.K. The study appears on-line in the journal Pharmacotherapy.

Expenditures for prescription drugs remain a large part of the ongoing debate on the costs of medical care in the U.S. and U.K. Because of the many complex and interactive variables that contribute to these costs, well-defined estimates of the actual and relative usage and costs for the two countries have not been reliably documented.

Data for this study came from two large electronic medical databases, one in each country. U.S. costs were derived from private health insurance claims, while U.K. costs came from a general practice research database established in 1990.

The study is based upon a 2005 sample of 280,000 people aged 55-64 in each country. Statins were prescribed to an estimated 32.7 percent of people in the U.S. sample and 24.4 percent in the U.K. sample. In the U.S., the estimated annual cost of statins ranged from a high of $1,428 for simvastatin (generic unavailable at the time) to a low of $314 for lovastatin (available in generic formulation). In the U.K., the annual cost varied from a high of $500 for atorvastatin (generic not available) to a low of $164 for simvastatin (available in generic). For each statin prescribed in both countries, the estimated cost per pill was at least twice as high in the U.S. as in the U.K.

Combining the annual cost per statin user with the number of users, the total estimated annual cost for statin users covered by private insurance in the U.S. was $69.5 million, compared with $15.7 million for statin users covered by the government in the U.K.
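The per-user averages implied by these totals can be reconstructed with a little arithmetic. This sketch assumes the quoted totals refer to the same 280,000-person samples, which the article does not state explicitly, so the averages should be read as rough estimates:

```python
# Implied per-user statin costs, reconstructed from the article's figures.
# Assumption (not stated in the article): the quoted totals cover exactly
# the 280,000-person sample in each country.

sample = 280_000

us_users = sample * 0.327   # 32.7% prescribed statins in the U.S. sample
uk_users = sample * 0.244   # 24.4% in the U.K. sample

us_total = 69.5e6           # total estimated annual statin cost, U.S. ($)
uk_total = 15.7e6           # total estimated annual statin cost, U.K. ($)

us_per_user = us_total / us_users
uk_per_user = uk_total / uk_users

print(round(us_users))      # estimated U.S. statin users in the sample
print(round(uk_users))      # estimated U.K. statin users in the sample
print(round(us_per_user))   # implied average annual cost per U.S. user ($)
print(round(uk_per_user))   # implied average annual cost per U.K. user ($)
```

Under this assumption the sample contains roughly 91,560 U.S. and 68,320 U.K. statin users, at an implied average of about $759 versus $230 per user per year.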

“In addition to differences in overall statin use and per unit costs, another significant factor contributing to the disparity of costs appears to be the availability and utilization of generics,” said lead author Hershel Jick, MD, Director Emeritus of BUSM’s Collaborative Drug Surveillance Program and associate professor of medicine at BUSM.

According to the researchers, simvastatin was approved for sale in generic formulation in the U.S. in late June 2006. Within the next six months, more than 60 percent of users switched from the brand preparation to the generic, and the resulting estimated cost fell by more than 60 percent. According to the researchers, however, it was still four times higher than in the U.K.

Statins may increase risk of interstitial lung abnormalities in smokers

Use of statins may influence susceptibility to or the progression of interstitial lung disease (ILD) in smokers, according to a new study.

While some studies have suggested that statins might be beneficial in the treatment of fibrotic lung disease, others have suggested that they may contribute to the progression of pulmonary fibrosis by enhancing secretion of inflammasome-regulated cytokines, and numerous case reports have suggested that statins may contribute to the development of various types of ILD.

“Based on earlier case reports of statin-associated ILD and data suggesting that smoking is associated with the interstitial lung abnormalities (ILA) which underlie ILD, we hypothesized that statins would increase the risk for ILA in a population of smokers,” said George R. Washko, MD, MMSc, and Gary M. Hunninghake, MD, MPH, of the Division of Pulmonary and Critical Care at Brigham and Women’s Hospital in Boston. “Accordingly, we evaluated the association between statin use and ILA in a large cohort of current and former smokers from the COPDGene study. In addition to the association between statin use and ILA we found in humans, we also demonstrated that statin administration aggravated lung injury and fibrosis in bleomycin-treated mice.” Bleomycin has been shown to induce lung inflammation and fibrosis.

The findings were published online ahead of print publication in the American Thoracic Society’s American Journal of Respiratory and Critical Care Medicine.

Assessment included pulmonary function testing and CT scanning for radiologic features of ILA. Among 1,184 subjects with no evidence of ILA, 315 (27%) used statins, compared with 66 of 172 (38%) subjects with ILA. After adjustment for a number of covariates, including a history of high cholesterol or coronary artery disease, statin users had a 60 percent increase in the odds of having ILA compared to subjects not taking statins. No other positive associations between ILA and cardiovascular medications or disorders were detected. The association between statin use and ILA was greatest with statins of higher hydrophilicity (those readily dissolved in water), such as pravastatin, and in higher age groups.
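The 60 percent figure is a covariate-adjusted odds ratio; a crude (unadjusted) odds ratio can be reconstructed from the reported counts as a rough plausibility check, since the adjusted and crude estimates should be in the same ballpark:

```python
# Crude odds ratio for statin use and ILA, reconstructed from the counts
# in the article: 315 of 1,184 subjects without ILA used statins, vs.
# 66 of 172 subjects with ILA. The published 60% increase (OR ~1.6) is
# adjusted for covariates, so this unadjusted estimate differs slightly.

statin_ila, total_ila = 66, 172          # statin users among subjects with ILA
statin_no_ila, total_no_ila = 315, 1184  # statin users among subjects without ILA

odds_ila = statin_ila / (total_ila - statin_ila)           # odds of statin use, ILA group
odds_no_ila = statin_no_ila / (total_no_ila - statin_no_ila)

crude_or = odds_ila / odds_no_ila
print(round(crude_or, 2))  # crude OR, close to the adjusted OR of ~1.6
```

The crude odds ratio comes out to about 1.72, consistent with the adjusted 60 percent increase in odds reported by the authors.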

The effects of statins on lung injury and fibrogenesis were also examined in a study in mice, which were pretreated with pravastatin prior to intratracheal bleomycin administration. Statin use was found to exacerbate bleomycin-induced lung fibrosis. In a further in vitro study, statin pretreatment was shown to enhance Nlrp3-inflammasome activation through mitochondrial reactive oxygen species generation in macrophages. “These results implicate activation of the NLRP3 inflammasome in fibrotic lung disease,” said Jin-Fu Xu MD, and Augustine M. K. Choi, MD, of the Department of Pulmonary Medicine, Shanghai Pulmonary Hospital, Tongji University School of Medicine, in Shanghai, China and the Division of Pulmonary and Critical Care at Brigham and Women’s Hospital in Boston, respectively.

There were some limitations to both studies. Findings in the mouse model were not replicated in human samples. All study subjects were current or former smokers, perhaps limiting the applicability of the results to others. Cigarette smoking by itself may lead to pulmonary inflammation. Finally, the duration and dosage of statin therapy was not available for the majority of patients.

“While statin use was associated with ILA in our study, caution should be used when extrapolating these findings to the care of patients,” concluded Dr. Hunninghake. “The significant benefits of statin therapy in patients with cardiovascular disease probably outweigh the risk of developing ILA, and statin use may benefit some patients with respiratory disease. Clinicians should be aware, though, that radiological evidence of ILD can develop in some patients treated with statins.”

How poor maternal diet can increase risk of diabetes — new mechanism discovered

Researchers funded by the Biotechnology and Biological Sciences Research Council have shown one way in which poor nutrition in the womb can put a person at greater risk of developing type 2 diabetes and other age-related diseases in later life. This finding could lead to new ways of identifying people who are at a higher risk of developing these diseases and might open up targets for treatment.

The team, from the University of Cambridge and the Medical Research Council (MRC) Toxicology Unit at the University of Leicester, publish their findings today (Friday 6 January) in the journal Cell Death and Differentiation.

The research shows that, in both rats and humans, individuals who experience a poor diet in the womb are less able to store fats correctly in later life. Storing fats in the right areas of the body is important because otherwise they can accumulate in places like the liver and muscle where they are more likely to lead to disease.

Professor Anne Willis of the MRC Toxicology Unit at the University of Leicester explains “One of the ways that our bodies cope with a rich modern western diet is by storing excess calories in fat cells. When these cells aren’t able to absorb the excess then fats get deposited in other places, like the liver, where they are much more dangerous and can lead to type 2 diabetes.”

The team found that this process is controlled by a molecule called miR-483-3p, which was produced at higher levels in individuals who had experienced a poor diet in their mothers’ wombs than in those who were better nourished.

When pregnant rats were fed low protein diets their offspring had higher levels of miR-483-3p. This led to them developing smaller fat cells and left them less able to store fats in adulthood. These rats were less likely to get fat when fed a high calorie diet but were at a higher risk of developing diabetes. Rats are known to be a good model for studying human dietary diseases and the team also found that miR-483-3p was present in elevated levels in a group of people who were born with a low birth weight.

Dr Susan Ozanne, a British Heart Foundation Senior Fellow, who led the work at the University of Cambridge, adds “It has been known for a while that your mother’s diet during pregnancy plays an important role in your adult health, but the mechanisms in the body that underlie this aren’t well understood. We have shown in detail how one mechanism links poor maternal diet to diabetes and other diseases that develop as we age.”

Dr Ozanne and Professor Willis and their team found that miR-483-3p works by suppressing a protein called GDF3. When they studied a group of adult humans who were born with a low birth weight, they found that GDF3 protein was present at around only thirty percent of the levels found in people born at a normal weight.

Professor Willis, Director of the MRC Toxicology Unit, adds “Improving people’s diets and encouraging exercise is clearly the best way to combat the epidemic of diabetes and diet-related disease which is sweeping through our society. However some people are at particular risk of these diseases, despite not looking visibly overweight. This research will hopefully allow us to help these people to take precautionary steps to reduce their likelihood of developing type 2 diabetes.”

Professor Douglas Kell, Chief Executive of BBSRC said “People are continuing to live ever longer and healthier lives thanks to improvements in nutrition and healthcare. However modern diets and lifestyles are posing new challenges to which our bodies sometimes seem poorly adapted – and this has caused unforeseen health problems.

“If we are to remain healthy throughout our lives and into old age it is vital that scientists work to understand our fundamental biology in the context of social and environmental changes. By identifying a mechanism that links maternal diet to diabetes this research has made an important contribution to the fight against a growing epidemic of metabolic diseases.”

Bacteria in the gut of autistic children different from non-autistic children

The underlying reason autism is often associated with gastrointestinal problems is unknown, but new results to be published in the online journal mBio® on January 10 reveal that the guts of autistic children differ from those of other children in at least one important way: many children with autism harbor a type of bacteria in their guts that non-autistic children do not. The study was conducted by Brent Williams and colleagues at the Mailman School of Public Health at Columbia University.

Earlier work has revealed that autistic individuals with gastrointestinal symptoms often exhibit inflammation and other abnormalities in their upper and lower intestinal tracts. However, scientists do not know what causes the inflammation or how the condition relates to the developmental disorders that characterize autism. The research results appearing in mBio® indicate the communities of microorganisms that reside in the gut of autistic children with gastrointestinal problems are different from those of non-autistic children. Whether these differences are a cause or an effect of autism remains to be seen.

“The relationship between different microorganisms and the host and the outcomes for disease and development is an exciting issue,” says Christine A. Biron, the Brintzenhoff Professor of Medical Science at Brown University and editor of the study. “This paper is important because it starts to advance the question of how the resident microbes interact with a disorder that is poorly understood.”

Bacteria belonging to the group Sutterella represented a relatively large proportion of the microorganisms found in 12 of 23 tissue samples from the guts of autistic children, but these organisms were not detected in any samples from non-autistic children. Why this organism is present only in autistic kids with gastrointestinal problems and not in unaffected kids is unclear.

“Sutterella has been associated with gastrointestinal diseases below the diaphragm, and whether it’s a pathogen or not is still not clear,” explains Jorge Benach, Chairman of the Department of Microbiology at Stony Brook University and a reviewer of the report. “It is not a very well-known bacterium.”

In children with autism, digestive problems can be quite serious and can contribute to behavioral problems, making it difficult for doctors and therapists to help their patients. Autism itself is poorly understood, and the frequent link between this set of developmental disorders and problems in the gut is understood even less.

Benach says the study was uniquely powerful because it used tissue samples from the guts of patients. “Most work that has been done linking the gut microbiome with autism has been done with stool samples,” says Benach, but the microorganisms shed in stool don’t necessarily represent the microbes that line the intestinal wall. “What may show up in a stool sample may be different from what is directly attached to the tissue,” he says.

Tissue biopsy samples must be acquired surgically, a difficult process for the patient, which underscores the seriousness of the gastrointestinal problems many autistic children and their families must cope with.

Benach emphasizes that the study is statistically powerful, but future work is needed to determine what role Sutterella plays, if any, in the problems in the gut. “It is an observation that needs to be followed through,” says Benach.

Nicotine replacement therapies may not be effective in helping people quit smoking

Boston, MA – Nicotine replacement therapies (NRTs) designed to help people stop smoking, specifically nicotine patches and nicotine gum, do not appear to be effective in helping smokers quit long-term, even when combined with smoking cessation counseling, according to a new study by researchers at Harvard School of Public Health (HSPH) and the University of Massachusetts Boston.

The study appears January 9, 2012 in an advance online edition of Tobacco Control and will appear in a later print issue.

“What this study shows is the need for the Food and Drug Administration, which oversees regulation of both medications to help smokers quit and tobacco products, to approve only medications that have been proven to be effective in helping smokers quit in the long-term and to lower nicotine in order to reduce the addictiveness of cigarettes,” said co-author Gregory Connolly, director of the Center for Global Tobacco Control at HSPH.

In the prospective cohort study the researchers, including lead author Hillel Alpert, research scientist at HSPH, and co-author Lois Biener of the University of Massachusetts Boston’s Center for Survey Research, followed 787 adult smokers in Massachusetts who had recently quit smoking. The participants were surveyed over three time periods: 2001-2002, 2003-2004, and 2005-2006. Participants were asked whether they had used a nicotine replacement therapy in the form of the nicotine patch (placed on the skin), nicotine gum, nicotine inhaler, or nasal spray to help them quit, and if so, what was the longest period of time they had used the product continuously. They also were asked if they had joined a quit-smoking program or received help from a doctor, counselor, or other professional.

The results showed that, for each time period, almost one-third of recent quitters reported having relapsed. The researchers found no difference in relapse rates between those who used NRT for more than six weeks and those who did not, with or without professional counseling. No difference in quitting success with use of NRT was found for either heavy or light smokers.

“This study shows that using NRT is no more effective in helping people stop smoking cigarettes in the long-term than trying to quit on one’s own,” Alpert said. He added that even though clinical trials (studies) have found NRT to be effective, the new findings demonstrate the importance of empirical studies regarding effectiveness when used in the general population.

Biener said that using public funds to provide NRT to the population at large is of questionable value, particularly when it reduces the amount of money available for smoking interventions shown in previous studies to be effective, such as media campaigns, promotion of no smoking policies, and tobacco price increases.

Smoking cessation medications have been available over the counter since 1996, yet U.S. Centers for Disease Control and Prevention statistics show that the earlier decline in adult smoking rates, along with quit rates, has stalled in the past five years.

Statin use in postmenopausal women associated with increased diabetes risk

CHICAGO – The use of statins in postmenopausal women is associated with increased diabetes risk, according to a study published Online First by the Archives of Internal Medicine, one of the JAMA/Archives journals.

But researchers note statins address the cardiovascular consequences of diabetes and current American Diabetes Association guidelines for primary and secondary prevention should not change. Likewise, researchers write that guidelines for statin use in nondiabetic populations also should not change.

In this study, researchers used data from the Women’s Health Initiative (WHI) through 2005 and included 153,840 women without diabetes and with a mean (average) age of 63.2 years. Statin use was assessed at enrollment and in year three. At baseline, 7.04 percent of women reported taking statin medication.

The results indicate 10,242 new cases of diabetes, and statin use at baseline was associated with an increased risk of the disease. This association remained after adjusting for other potential variables, including age, race/ethnicity and body mass index, and was observed for all types of statins.

“The results of this study imply that statin use conveys an increased risk of new-onset DM in postmenopausal women. In keeping with the findings of other studies, our results suggest that statin-induced DM is a medication class effect and not related to potency or to individual statins,” the researchers write.

“However, the consequences of statin-induced DM (diabetes mellitus) have not been specifically defined and deserve more attention. Given the wide use of statins in the aging population, further studies among women, men, and diverse ethnicities will clarify DM risk and risk management to optimize therapy,” researchers conclude.

Marijuana use associated with cyclic vomiting syndrome in young males

Researchers have found clear associations between marijuana use in young males and cyclic vomiting syndrome (CVS), where patients experience episodes of vomiting separated by symptom free intervals.

The study, published in the January issue of Neurogastroenterology and Motility, looked at 226 patients seen at the Mayo Clinic in Rochester, Minnesota, USA, over a 13-year period.

The patients were divided into three groups. Eighty-two patients with CVS were randomly matched with 82 patients with Irritable Bowel Syndrome (IBS) based on age, gender and geographic referral region. Researchers also examined the records of 62 patients with functional vomiting (FV), recurrent vomiting that cannot be attributed to a specific physical or psychiatric cause.

“Our study showed that CVS and FV had very similar clinical features, apart from marijuana use” says Dr G Richard Locke III from the Division of Gastroenterology and Hepatology at the Clinic.

Key findings of the study included:

Members of the CVS group were younger than members of the FV group (30 versus 36 years) and more likely to be male (53% versus 46%).

No statistically significant association was detected between membership of the CVS and FV groups and marital status, education level, body mass index, employment status, alcohol use or smoking history.

37% of the CVS group had used marijuana (81% male), together with 13% of the FV group (equally split between male and female) and 11% of the IBS group (73% male).

Marijuana users were 2.9 times more likely to be in the CVS group than the FV group. When this was adjusted for age and gender, males using marijuana were 3.9 times more likely to be in the CVS group and women using marijuana were 1.2 times more likely.
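The group comparisons above are odds ratios. As a hedged illustration of how such a figure is derived from a 2×2 exposure table (the counts below are made up for demonstration, not the study's data):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Crude odds ratio from a 2x2 table: (a * d) / (b * c)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts for illustration only:
# 12 marijuana users in group A, 4 in group B; 6 non-users in A, 8 in B.
print(odds_ratio(12, 4, 6, 8))  # -> 4.0
```

The study's adjusted figures (3.9 for men, 1.2 for women) would come from a regression model controlling for age and gender, not from this raw calculation.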

The research team also looked at gastrointestinal symptoms and migraine as these have previously been associated with CVS. They found that

The prevalence of gastrointestinal symptoms, including abdominal pain and nausea, was similar in CVS and FV patients, with the exception of retching, which was more common in patients with CVS (69% versus 31%).

Patients in the CVS group were more likely to have headaches and migraines than patients in the FV group, but the difference was not statistically significant. Migraine headache and psychiatric disorders did not appear to commonly co-exist in CVS patients, unlike in the IBS group.

The researchers also measured rapid gastric emptying, which is when undigested food enters the small bowel too quickly, causing nausea, vomiting and other symptoms. This showed that there were much higher rates of fast gastric emptying in patients in the CVS and FV groups (45% and 46%, respectively), compared with the IBS group (8%). A novel finding was that the patterns of fast, normal and delayed gastric emptying were similar in the CVS and FV groups.

“Our study confirms that cyclic vomiting syndrome occurs most often in young males and is significantly associated with marijuana use, unlike functional vomiting” says Dr Locke. “The current treatment options for this condition remain challenging and are limited by the lack of randomised controlled trials. Further research is clearly needed.”

Illinois scientists link dietary DHA to male fertility

URBANA – Who knew that male fertility depends on sperm-cell architecture? A University of Illinois study reports that a certain omega-3 fatty acid is necessary to construct the arch that turns a round, immature sperm cell into a pointy-headed super swimmer with an extra long tail.

“Normal sperm cells contain an arc-like structure called the acrosome that is critical in fertilization because it houses, organizes, and concentrates a variety of enzymes that sperm use to penetrate an egg,” said Manabu Nakamura, a U of I associate professor of biochemical and molecular nutrition.

The study shows for the first time that docosahexaenoic acid (DHA) is essential in fusing the building blocks of the acrosome together. “Without DHA, this vital structure doesn’t form and sperm cells don’t work,” said Timothy Abbott, a doctoral student who co-authored the study.

Men concerned about their fertility may wonder what foods contain DHA. Marine fish, such as salmon or tuna, are excellent sources of this omega-3 fatty acid.

The scientists became intrigued with DHA’s role in creating healthy sperm when they experimented with “knockout” mice that lack a gene essential to its synthesis. “We looked at sperm count, shape, and motility, and tested the breeding success rate. The male mice that lacked DHA were basically infertile,” Nakamura said.

But when DHA was introduced into the mice’s diet, fertility was completely restored. “It was very striking. When we fed the mice DHA, all these abnormalities were prevented,” he said.

The scientists then used confocal laser scanning (3D) microscopy to look at thin slices of tissue in progressive stages of a sperm cell’s development. By labeling enzymes with fluorescence, they could track their location in a cell.

“We could see that the acrosome is constructed when small vesicles containing enzymes fuse together in an arc. But that fusion doesn’t happen without DHA,” he said.

In the absence of DHA, the vesicles are formed but they don’t come together to make the arch that is so important in sperm cell structure, he noted.

Nakamura finds the role this omega-3 fatty acid plays in membrane fusion particularly exciting. Because DHA is abundant in specific tissues, including the brain and the retina as well as the testes, the scientists believe their research findings could also impact research relating to brain function and vision.

“It’s logical to hypothesize that DHA is involved in vesicle fusion elsewhere in the body, and because the brain contains so much of it, we wonder if deficiencies could play a role, for example, in the development of dementia. Any communication between neurons in the brain involves vesicle fusion,” he noted.

The Illinois scientists will continue to study sperm; meanwhile, Nakamura has sent some of his DHA-deficient knockout mice to other laboratories where scientists are studying DHA function in the brain and the retina.

70 percent of Europeans suffer from low vitamin D levels

The article has been published in the Maturitas journal

IMAGE: A healthy lifestyle should include exposure to the sun for 15 minutes three to four times per week.

A group of experts has prepared a report on vitamin D supplementation for menopausal women after it was revealed that Europeans have suffered an alarming decrease in their levels of this vitamin. In their opinion, the ideal would be to maintain blood levels above 30 ng/ml. Vitamin D is essential to the immune system and processes such as calcium absorption.

“We believe that many diseases can be aggravated by a chronic deficiency of vitamin D,” states Faustino R. Pérez-López, researcher at the University of Zaragoza. In particular, this is worse during the menopause as low levels of vitamin D in the blood are associated with an increased risk of osteoporosis, loss of motor coordination and bone fractures.

Vitamin D deficiency is a real problem in Europe as levels in the blood are low in 50% to 70% of the population. Pérez-López points out that “healthcare professionals should be aware that this is a common problem which affects a large part of the population in Europe, even those who live in sunny places.”

Therefore, a group of experts from the European Menopause and Andropause Society (EMAS), led by Pérez-López, has prepared a report about vitamin D supplementation and the health of postmenopausal women. The text has been signed by 11 experts from international institutions such as the John Radcliffe Hospital in Oxford.

As Pérez-López explains, “we analysed the conditions and diseases that are associated with vitamin D deficiency and we recommended the intake of supplements in postmenopausal women.”

Improvements in bone health

According to these experts, vitamin D supplements improve the mineral density of the bones and neuromuscular function and reduce the risk of fracture. Pérez-López believes that “the World Health Organisation or other relevant bodies belonging to the European Union should establish minimum requirements or recommendations on the fortification of foods with vitamin D.”

There are recommendations of this type in some European countries but in others there are either no regulations or they are not strictly observed. There is not even a consensus amongst the medical community itself regarding the advantages of supplements.

Pérez-López insists, however, that “they are effective, but their efficacy has not yet been accepted.”

The researcher outlines that “it is unknown what will happen in the future but we make our recommendations from the EMAS. This is the first statement on the matter in Europe directed towards menopausal women.”

As well as stimulating calcium and phosphorus absorption, the vitamin D system has numerous functions. Low vitamin D levels are linked to rickets, osteomalacia, osteoporosis and the risk of bone fracture, cardiovascular disease, diabetes, cancer, infections and degenerative diseases.

“In healthy postmenopausal women, we have seen that a good level of vitamin D is linked to good physical fitness and has an effect on body fat mass as well as muscle strength and balance,” state the authors of the article published in the Maturitas journal.

A ray of sunshine

The researchers describe how “a healthy lifestyle should include exposure to the sun for 15 minutes three to four times per week when the weather permits, since 90% of vitamin D is synthesized when the skin comes into contact with sunlight.”

Vitamin D is synthesized through sunlight exposure. Therefore, a modern lifestyle that involves little or no sun exposure and few outdoor activities causes deficiency.

As with everything, we have to strike a balance. Pérez-López adds that “prolonged sun exposure is not recommended as it increases the risk of different types of cancer, along with aging of the skin.”

Substitutes to sunlight

For the experts the ideal would be to maintain blood levels above 30 ng/ml but there is no agreement as to optimum levels.

However, a large number of women are unable to obtain the required quantity of vitamin D through diet and sun exposure. As a way of making up for this deficiency, daily intake of 600 IU (international units) of vitamin D is recommended for women of up to 70 years of age and 800 IU/day for women over 70 years.

The researcher explains that “patients with risk factors associated with hypovitaminosis (obesity, pigmented skin, intestinal malabsorption syndromes and living in regions close to the North and South poles) should increase their intake to up to 4,000 IU per day.” There is scientific evidence that a daily dose of 4,000 IU is not toxic in healthy people.
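For readers more used to mass units, the standard vitamin D conversion (1 microgram = 40 IU) lets the recommendations above be restated in micrograms; a minimal sketch:

```python
IU_PER_MICROGRAM = 40  # standard conversion for vitamin D: 1 microgram = 40 IU

def iu_to_micrograms(iu):
    """Convert a vitamin D dose from international units to micrograms."""
    return iu / IU_PER_MICROGRAM

# The recommended daily intakes above, expressed in micrograms:
for dose in (600, 800, 4000):
    print(dose, "IU =", iu_to_micrograms(dose), "micrograms")
```

So the 600 IU/day and 800 IU/day recommendations correspond to 15 and 20 micrograms per day, and the 4,000 IU upper figure to 100 micrograms per day.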

A new study finds that low concentrations of the chemical methylisothiazolinone have subtle but measurable negative effects on the neural development of tadpoles. The chemical is found in some cosmetics, although the study does not provide any evidence that cosmetics are unsafe for humans.

Cosmetic chemical hinders brain development in tadpoles

PROVIDENCE, R.I. [Brown University] — Scientists, health officials, and manufacturers already know that a chemical preservative found in some products, including cosmetics, is harmful to people and animals in high concentrations, but a new Brown University study in tadpoles reports that it can also interrupt neurological development even in very low concentrations.

In the cosmetics industry, the biocide methylisothiazolinone, or MIT, is considered safe at concentrations of less than 100 parts per million. Lab studies, however, have found that lower concentrations affected the growth of animal neurons. Picking up from there, the Brown researchers performed a series of experiments to investigate how 10 days of exposure at concentrations as low as 1.5 ppm would affect whole, living tadpoles as they develop. Their results appear in an advance online edition of the journal Neuroscience.

“The lower concentrations we studied didn’t kill the animals or cause any big deformities or affect the behavior you’d see just by looking at them,” said Carlos Aizenman, associate professor of neuroscience and the study’s senior author. “But then we decided to do a series of functional tests and we found that exposure to this compound during a period of development that’s critical for the fine wiring of the nervous system disrupted this period of fine tuning.”

Aizenman emphasized that there is no evidence in the study that any products with MIT, such as shampoos or cosmetics, are harmful to consumers.

Neurotoxic effects

IMAGE: Carlos Aizenman. In the laboratory, behavioral effects observed in tadpoles suggested a further search for a “non-obvious but real deficit in neural function.” Credit: David Orenstein/Brown University

When Aizenman and lead author Ariana Spawn explored the consequences of exposing tadpoles to two nonlethal concentrations, 1.5 ppm and 7.6 ppm, they found some deficits both in behavior and in basic brain development.

In one experiment they shined moving patterns of light into one side of the tadpole tanks from below. As they expected, the unexposed tadpoles avoided the light patterns, swimming to the other side. Tadpoles that had been exposed to either concentration of MIT, however, were significantly less likely to avoid the signals.

In another experiment, Aizenman and Spawn, who was an undergraduate at the time and has since graduated, exposed the tadpoles to another chemical known to induce seizures. The tadpoles who were not exposed to MIT and those exposed to the lower concentration each had the same ability to hold off seizures, but the ones who had been exposed to the 7.6 ppm concentration succumbed to the seizures significantly more readily.

In these experiments, seizure susceptibility had nothing to do with epilepsy, Aizenman said, but was instead a measure of more general neural development.

After observing the two significant behavioral effects in the tadpoles, Aizenman and Spawn then sought the underlying physiological difference between exposed and unexposed tadpoles that might cause them. They performed an electrophysiological analysis of each tadpole’s optic tectum, a part of the brain responsible for processing visual information. They found evidence that the chemical seems to have stunted the process by which tadpoles prune and refine neural connections, a key developmental step.

“The neural circuits act like the neural circuits of a much more immature tadpole,” Aizenman said. “This is consistent with the previous findings in cell cultures.”

Aizenman said consumers should know about the study’s results and pay attention to the ingredients in the products they use, but should not become worried based on the basic science study.

Aizenman said one area where further studies may be warranted is in cases of repeated exposure in industrial or occupational settings, but the study’s broader message may be that chemical manufacturers and independent labs should test more for neurodevelopmental effects of even low concentrations of products. In the specific case of MIT in tadpoles, he noted, “It’s resulting in a non-obvious but real deficit in neural function.”

Brown University and the Whitehall Foundation funded the research.

Parabens in breast tissue not limited to women who have used underarm products

New research into the potential link between parabens and breast cancer has found traces of the chemicals in breast tissue samples from all of the women in the study. Parabens are commonly used as preservatives in cosmetics, food products and pharmaceuticals. As the research shows that parabens are measurable in the tissue of women who do not use underarm cosmetics, the parabens must enter the breast from other sources.

Breast tissue samples were taken from 40 women, with the results showing that all of the women had at least one paraben in their tissue. The research, published in the Journal of Applied Toxicology, was a collaborative study led by Dr Philippa Darbre, University of Reading and Mr Lester Barr, University Hospital of South Manchester.

The research team studied tissue samples from 40 women undergoing mastectomies between 2005 and 2008 for first primary breast cancer in England. In total, 160 samples were collected, four from each woman, covering serial locations from the axilla (nearest the armpit) to the sternum (breast bone). Of the tissue samples, 99% contained at least one paraben and 60% contained all five of the most common parabens.

A number of studies since 1998 have raised concerns about the potential role of parabens in breast cancer as these chemicals possess oestrogenic properties and oestrogen is known to play a central role in the development, growth and progression of breast cancer. In particular, a link was proposed between the disproportionate incidence of breast cancer in the upper outer quadrant of the breast and oestrogenic chemicals in that region, maybe from local application of underarm cosmetic products.

“Our study appears to confirm the view that there is no simple cause and effect relationship between parabens in underarm products and breast cancer” said Mr Lester Barr, consultant surgeon at the University Hospital of South Manchester and Chairman of the Genesis Breast Cancer Prevention Appeal, which part sponsored the study.

“The intriguing discovery that parabens are present even in women who have never used underarm products raises the question: where have these chemicals come from?”

Key findings of the study included:

One or more paraben esters were detected in 158 of the 160 samples studied (99%) and 96 of the samples (60%) contained all five of the most common paraben esters.

The overall median value for total parabens in the breast tissue was 85.5 ng/g (billionths of a gram of parabens per gram of breast tissue), ranging from 0 ng/g to 5134.5 ng/g. This level was four times higher than the 20.6 ng/g recorded by a smaller study in 2004.

There was a disproportionate incidence of breast cancer in the upper outer quadrant nearest the armpit and significantly higher levels of n-propylparaben were detected in the axilla region, closest to the armpit, than in the mid or medial regions. The other four parabens were equally distributed across all parts of the breast.
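As a quick sanity check on the comparison in the findings above (illustrative only, using the two median values as quoted), the fold difference can be computed directly:

```python
# Median total-paraben concentrations quoted above (ng/g of breast tissue)
median_this_study = 85.5  # the present study
median_2004_study = 20.6  # the smaller 2004 study

fold = median_this_study / median_2004_study
print(round(fold, 2))  # about 4.15, consistent with "four times higher"
```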

“The fact that parabens were detected in the majority of the breast tissue samples cannot be taken to imply that they actually caused breast cancer in the 40 women studied,” said Dr Philippa Darbre, Reader in Oncology at the University of Reading, who also led the 2004 study. “However, the fact that parabens were present in so many of the breast tissue samples does justify further investigation.”

Chemotherapy may influence leukemia relapse

IMAGE: Research by, from left, Timothy Ley, MD, John DiPersio, MD, PhD, Richard Wilson, PhD, and other Washington University physicians and scientists suggests that chemotherapy may contribute to relapse in some…

The chemotherapy drugs required to push a common form of adult leukemia into remission may contribute to DNA damage that can lead to a relapse of the disease in some patients, findings of a new study suggest.

The research, by a team of physicians and scientists at Washington University School of Medicine in St. Louis, is published Jan. 11 in the advance online edition of Nature.

For patients with acute myeloid leukemia (AML), initial treatment with chemotherapy is essential for putting the cancer into remission. Without it, most patients would die within several months. But even so, about 80 percent of AML patients die within five years when chemotherapy treatment fails to keep the cancer in remission and the disease returns.

Results of the new research provide evidence for a theory that scientists have long held: Chemotherapy contributes to relapse in cancer patients by damaging DNA and generating new mutations that allow tumor cells to evolve and become resistant to treatment.

“The mutations in AML patients who have relapsed are different from those present in the primary tumor, and they are more likely to have a telltale signature of DNA damage,” says senior author John F. DiPersio, MD, PhD, the Virginia E. and Sam J. Golman Professor of Medicine and chief of the division of oncology. “This suggests that mutations in the relapse cells are influenced by the chemotherapy drugs the patients receive.”

Chemotherapy is known to damage the DNA of both cancer cells and healthy cells. But until now, scientists have had little direct evidence to suggest that chemotherapy itself helps shape the evolution of cancer cells and may contribute to disease recurrence. The researchers suspect this phenomenon is not unique to AML and may occur in other cancers as well.

“Chemotherapy drugs are absolutely necessary to get leukemia patients into remission, but we also pay a price in terms of DNA damage,” says co-author Timothy J. Ley, MD, the Lewis T. and Rosalind B. Apple Professor of Oncology. “They may contribute to disease progression and relapse in many different cancers, which is why our long-term goal is to find targeted therapies based on the mutations specific to a patient’s cancer, rather than use drugs that further damage DNA.”

For the current study, scientists at Washington University’s Genome Institute sequenced the genomes – the entire DNA – of cancer cells before and after relapse in eight patients with AML and compared the genetic sequences to healthy cells from the same patients. The data essentially allowed them to map the evolution of cancer cells in each patient.

All the patients received cytarabine and an anthracycline drug to induce remission plus additional chemotherapy in an attempt to keep the cancer from returning. Using technology developed at the Genome Institute, the researchers isolated the DNA segments that contained every mutation in the samples of cancer cells and sequenced those regions nearly 600 times each, far more than the usual 30 times each, which substantially increased the statistical accuracy of the results.
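The jump from the usual 30x to roughly 600x coverage matters because the precision of a mutation-frequency estimate scales with the square root of sequencing depth. A minimal sketch of that relationship, assuming each read is an independent binomial draw (an idealization; real sequencing error is more complex):

```python
import math

def allele_fraction_se(p, depth):
    """Standard error of an estimated mutant-allele fraction,
    treating each sequencing read as an independent binomial draw."""
    return math.sqrt(p * (1 - p) / depth)

# A heterozygous mutation carried by every tumor cell should appear
# in about half the reads (p = 0.5).
for depth in (30, 600):
    print(f"{depth}x coverage: standard error ~ {allele_fraction_se(0.5, depth):.3f}")
```

At 30x the standard error on a 50 percent allele fraction is about 0.09; at 600x it drops to about 0.02, a roughly 4.5-fold gain in precision (√(600/30) ≈ 4.5), which is what makes it possible to distinguish founding clones from subclones by their mutation frequencies.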

The researchers found that the relapsed cancer cells did not contain a large number of new mutations, as some had predicted. In fact, while the relapsed cells in all the patients had gained some mutations, the percentage was relatively small compared to the number of mutations in the primary tumor.

The scientists also discovered a type of mutation in the relapsed cells that is associated with DNA damage. The frequency of these alterations, known as transversions, was significantly higher for relapse-specific mutations (46 percent) than for primary-tumor mutations (31 percent), suggesting that the chemotherapy may have contributed to some of these mutations, the researchers report. Transversions are also more commonly found in the tumor cells of lung cancer patients who smoke.

Genome sequencing also revealed two major patterns of evolution of cancer cells linked to AML relapse. All patients had a single founding clone: a cluster of cancer cells – all with the same mutations – that define the leukemia. In some patients, the founding clone gains mutations, enabling it to survive chemotherapy and evolve into the relapse clone. In others, a subclone derived from the founding clone survives chemotherapy, gains mutations and evolves to become the dominant clone at relapse.

“It’s the same tumor coming back but with a twist,” says co-author Richard K. Wilson, PhD, director of the Genome Institute. “It’s always the founding clone or a subclone that comes back with new mutations that give the cells new strategies for surviving attack by whatever drugs are thrown at them. This makes a lot of sense but it’s been hard to prove without whole-genome sequencing.”

In all cases, the chemotherapy failed to kill the founding clone, an indication that eradicating the founding clone and subclones is the key to achieving a cure.

Sequencing the entire genomes of the cancer cells was essential to the researchers’ discoveries. Most of the mutations in the relapse samples occurred in the regions of the genome that don’t include genes and would have been missed if the researchers had sequenced only a portion of the patients’ DNA.

“If we only look at the genes, we typically find a total of 10 to 25 mutations in each patient with AML,” says lead author and Genome Institute scientist Li Ding, PhD, research assistant professor of genetics. “That’s not enough to see significant changes in the mutational patterns of the primary tumor cells versus those in the relapsed cells. Whole-genome sequencing identifies hundreds of mutations in each patient, which provides the resolution and confidence necessary for us to dig deeper to understand how cancer evolves.”

DiPersio, who regularly treats patients with AML, says, “Our preconceived notion of the clonal evolution of AML and other cancers has been altered by our study, which suggests that it is much more complicated and dynamic than we initially suspected and can even be impacted by the therapy that is given to treat the disease.”

About 13,000 cases of acute myeloid leukemia are diagnosed each year in the United States. It occurs most often among those age 60 or older and becomes more difficult to treat as patients age. According to the American Cancer Society, the five-year survival rate for AML is 21 percent.

Short, sharp shock treatment for E. coli

A short burst of low voltage alternating current can effectively eradicate E. coli bacteria growing on the surface of even heavily contaminated beef, according to a study published in the International Journal of Food Safety, Nutrition and Public Health. The technique offers an inexpensive and easy to implement approach to reducing the risk of food poisoning, which can occur despite handlers complying with hygiene standards.

Food poisoning is a serious public-health issue, especially with the emergence of lethal and highly virulent strains of Escherichia coli (E. coli O157:H7, for example). Infection with this bacterium causes severe diarrhea, dehydration and kidney problems, and can lead to serious long-term complications or even death in children, the elderly and people with pre-existing health problems. Tens of thousands of people are affected by E. coli infection each year through eating contaminated beef and other food products. The US Centers for Disease Control and Prevention (CDC) estimates that about 2500 people are hospitalized and there are several dozen deaths each year.

Now, Ajit Mahapatra and colleagues at Fort Valley State University in Georgia and at Virginia Tech have demonstrated that applying a low-voltage alternating current to beef samples inoculated with large numbers of the potentially lethal E. coli O157:H7 can almost completely deactivate the bacterium, which is usually present on the surface of contaminated meat. The team points out that the level of contamination used in their tests far exceeded what would be seen on commercial carcasses after slaughter.

Previous researchers had demonstrated that electricity can kill bacteria effectively. The study by Mahapatra and colleagues demonstrates efficacy against E. coli O157:H7 at low voltage and low alternating current, offering a quick and easy way to decontaminate at-risk but otherwise wholesome beef without recourse to microbicidal chemicals or more complicated treatment processes.

Increase dietary fiber, decrease disease

Review confirms benefits of more roughage in the diet

We should all be eating more dietary fiber to improve our health – that’s the message from a health review by scientists in India. The team looked at research conducted on dietary fiber across the globe during the last few decades and suggests that, to avoid initial problems such as intestinal gas and loose stool, it is best to increase intake gradually and to spread high-fiber foods throughout the day, at meals and snacks. Writing in the International Journal of Food Safety, Nutrition and Public Health, the team suggests fruit, vegetables, whole-grain foods such as muesli and porridge, and beans and pulses as readily available foods rich in dietary fiber.

Dietary fiber, also known as roughage, is the general term for the non-digestible parts of the fruit and vegetable products we eat. It comes in two forms: soluble and insoluble. Soluble (prebiotic, viscous) fiber is readily broken down, or fermented, in the colon into physiologically active byproducts and gases. Insoluble fiber is metabolically inert but absorbs water as it passes through the digestive system, providing bulk for the intestinal muscles to work against and easing defecation.

Vikas Rana of the Rain Forest Research Institute in Assam, India, and colleagues point out that research has shown that modern food habits have led to an increase in the incidence of obesity, cardiovascular disease and type 2 diabetes. These conditions are growing more common even in developing nations, where a “western” diet of highly processed foods – high in sugars, saturated fats, beef and dairy products and low in dietary fiber – is displacing more traditional options. The team suggests that the evidence points to a loss of dietary fiber from the diet as a major risk factor for these health problems, yet one of the simplest to remedy, requiring no major changes in diet nor the addition of supplements or so-called functional foods and nutraceuticals.

Given that dietary fiber has physiological actions such as reducing cholesterol and attenuating blood glucose, maintaining gastrointestinal health, and positively affecting calcium bioavailability and immune function, it is important for the current generation and future generations that this component of our diets be reasserted through education and information.

“Consuming adequate quantities of DF can lead to improvements in gastrointestinal health, and reduction in susceptibility to diseases such as diverticular disease, heart disease, colon cancer, and diabetes. Increased consumption has also been associated with increased satiety and weight loss,” the team concludes. Given the ready availability of vegetables, fruit and other foods high in dietary fiber, particularly in the West and in the relatively richer parts of the developing world, it is a matter of recommending that people eat more dietary fiber rather than consistently taking the unhealthy low-fiber option throughout their lives.

Omega-3 fatty acids may protect nerves from injury, study suggests

Research from Queen Mary, University of London suggests that omega-3 fatty acids, which are found in fish oil, have the potential to protect nerves from injury and help them to regenerate.

When nerves are damaged because of an accident or injury, patients experience pain, weakness and muscle paralysis which can leave them disabled, and recovery rates are poor.

The new study, published this week in the Journal of Neuroscience, suggests that omega-3 fatty acids could play a significant role in speeding recovery from nerve injury.

The study focused on peripheral nerve cells. Peripheral nerves are the nerves which transmit signals between the brain and spinal cord, and the rest of the body.

These nerves have the ability to regenerate but, despite advances in surgical techniques, patients usually only have good recovery when their injury is minor.

Omega-3 fatty acids are vital for the body’s normal growth and development and have been widely researched for their health benefits. Because the body cannot manufacture omega-3 fatty acids, they have to be consumed in foods such as oily fish.

In the new study, researchers first looked at isolated mouse nerve cells. They simulated the type of damage caused by accident or injury by either stretching the cells or starving them of oxygen. Both types of damage killed a significant number of nerve cells, but enriching the cells with omega-3 fatty acids gave them significant protection and decreased cell death.

Next the researchers studied the sciatic nerves of mice. They found that a high level of omega-3 fatty acids helped mice to recover from sciatic nerve injury more quickly and more fully, and that their muscles were less likely to waste following nerve damage.

The research was carried out by a group led by Adina Michael-Titus, Professor of Neuroscience at Barts and The London Medical School and lead of the Neurotrauma and Neurodegeneration group in the Centre for Neuroscience and Trauma, Queen Mary, University of London.

She explained: “Our previous research has shown that these fatty acids could have beneficial effects in a number of neurological conditions. This new study suggests that they could also have a role in treating peripheral nerve injuries.

“More work is needed but our research indicates that omega-3 fatty acids can protect damaged nerve cells, which is a critical first step in a successful neurological recovery.”

Why coffee drinking reduces the risk of Type 2 diabetes

Why do heavy coffee drinkers have a lower risk of developing Type 2 diabetes, a disease on the increase around the world that can lead to serious health problems? Scientists are offering a new solution to that long-standing mystery in a report in ACS’ Journal of Agricultural & Food Chemistry.

Ling Zheng, Kun Huang and colleagues explain that previous studies show that coffee drinkers are at a lower risk for developing Type 2 diabetes, which accounts for 90-95 percent of diabetes cases in the world. Those studies show that people who drink four or more cups of coffee daily have a 50 percent lower risk of Type 2 diabetes. And every additional cup of coffee brings another decrease in risk of almost 7 percent. Scientists have implicated the misfolding of a substance called human islet amyloid polypeptide (hIAPP) in causing Type 2 diabetes, and some are seeking ways to block that process. Zheng and Huang decided to see if coffee’s beneficial effects might be due to substances that block hIAPP.
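Taken at face value, the two figures above imply a rough dose-response curve. The sketch below assumes the "almost 7 percent" per extra cup compounds multiplicatively on the 50 percent relative risk at four cups; the report does not specify how the figures combine, so this is an illustration only.

```python
# Illustrative only: relative risk of Type 2 diabetes versus non-drinkers,
# assuming the quoted per-cup reduction applies multiplicatively.
rr_at_four_cups = 0.50   # "50 percent lower risk" at four cups a day
per_extra_cup = 0.07     # "almost 7 percent" further decrease per cup

for cups in range(4, 8):
    rr = rr_at_four_cups * (1 - per_extra_cup) ** (cups - 4)
    print(f"{cups} cups/day: relative risk ~ {rr:.2f}")
```

Under this reading, seven cups a day would correspond to a relative risk of roughly 0.40, i.e. a 60 percent reduction, though the underlying studies report associations rather than a precise dose-response law.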

Indeed, they identified two categories of compounds in coffee that significantly inhibited hIAPP. They suggest that this effect explains why coffee drinkers show a lower risk for developing diabetes. “A beneficial effect may thus be expected for a regular coffee drinker,” the researchers conclude.

Treatment with light benefits Alzheimer’s patients, Wayne State University finds

Detroit – Exposure to light appears to have therapeutic effects on Alzheimer’s disease patients, a Wayne State University researcher has found.

In a study published recently in the Western Journal of Nursing Research, LuAnn Nowak Etcher, Ph.D., assistant professor of nursing, reported that patients treated with blue-green light were perceived by their caregivers as having improved global functioning.

Caregivers said patients receiving the treatment seemed more awake and alert, were more verbally competent and showed improved recognition, recollection and motor coordination. They also said patients seemed to recapture their personalities and were more engaged with their environment. Patients’ moods also were described as improved.

Etcher’s work is inspired by her interest in a phenomenon known as “sundowning,” when Alzheimer’s patients sleep during the day, wake up later and may be up all night long. Part of her doctoral research was to utilize light, a common intervention for circadian disorders, to regulate the rest-activity patterns of women with Alzheimer’s.

This study, Etcher said, was an effort to address disagreement among researchers on the effect of therapeutic light in regulating rest-activity patterns in Alzheimer’s patients. The study involved 20 women older than age 65 with Alzheimer’s dementia from nursing homes in southeast Michigan. Each patient was assigned randomly to an experimental group receiving blue-green light treatments or a control group receiving dim red light.

A commercially available visor used to treat seasonal affective disorder and jet lag was used to administer the light to patients. Caregivers — patients’ family members and nursing facility personnel — were not told which type of light was hypothesized to have physiologic effects.

Although blue-green light recipients comprised the active experimental group, Etcher said she was surprised when some recipients of red light — the placebo group — also were reported as showing improvements, with caregivers saying their patients were calmer and had reduced resistance to care.

The level of effects varied, Etcher said, noting that while the blue-green group recipients were largely reported by caregivers as showing improvement, a few showed little to no effect from the treatments.

“Some of the rest-activity pattern disruptions that we see associated with Alzheimer’s dementia may not necessarily be circadian based,” Etcher said. “They may be due to unmet needs, pain or other phenomena, and therefore would not respond to an intervention aimed at regulation of the circadian system.”

Calling her study preliminary, she said it now needs to be replicated with a larger sample and different demographics.

In addition to ascertaining which behaviors are circadian based, establishing which methods are most appropriate to analyze data like Etcher’s requires exploration, she said. She is proposing further work that uses two different nonlinear analytic methods to examine sensitivity and specificity to detect change in circadian patterns, with a long-term goal of developing interventions to regulate those patterns to the benefit of patients’ overall function.

“If they sleep better at night, and are more awake during the day, they can eat, they can interact with other people and they can take advantage of other cueing agents in the environment,” she said. “In addition to light during daytime and darkness during the nighttime, smells at mealtimes, food intake, interactions — all these things in conjunction help regulate our day.”

Diet counts: Iron intake in teen years can impact brain in later life

Iron is a popular topic in health news. Doctors prescribe it for medical reasons, and it’s available over the counter as a dietary supplement. And while it’s known that too little iron can result in cognitive problems, it’s also known that too much promotes neurodegenerative diseases.

Now, researchers at UCLA have found that in addition to causing cognitive problems, a lack of iron early in life can affect the brain’s physical structure as well.

UCLA neurology professor Paul Thompson and his colleagues measured levels of transferrin, a protein that transports iron throughout the body and brain, in adolescents and discovered that these transferrin levels were related to detectable differences in both the brain’s macro-structure and micro-structure when the adolescents reached young adulthood.

The researchers also identified a common set of genes that influences both transferrin levels and brain structure. The discovery may shed light on the neural mechanisms by which iron affects cognition, neurodevelopment and neurodegeneration, they said.

Their findings appear in the current online edition of the journal Proceedings of the National Academy of Sciences.

Iron and the proteins that transport it are critically important for brain function. Iron deficiency is the most common nutritional deficiency worldwide, causing poor cognitive achievement in school-aged children. Yet later in life, iron overload is associated with damage to the brain, and abnormally high iron concentrations have been found in the brains of patients with Alzheimer’s, Parkinson’s and Huntington diseases.

Since both a deficiency and an excess of iron can negatively impact brain function, the body’s regulation of iron transport to the brain is crucial. When iron levels are low, the liver produces more transferrin for increased iron transport. The researchers wanted to know whether brain structure in healthy adults was also dependent on transferrin levels.

“We found that healthy brain wiring in adults depended on having good iron levels in your teenage years,” said Thompson, a member of UCLA’s Laboratory of Neuro Imaging. “This connection was a lot stronger than we expected, especially as we were looking at people who were young and healthy — none of them would be considered iron-deficient.

“We also found a connection with a gene that explains why this is so. The gene itself seems to affect brain wiring, which was a big surprise,” he said.

To assess brain volume and integrity, Thompson’s team collected brain MRI scans on 615 healthy young-adult twins and siblings, who had an average age of 23. Of these subjects, 574 were also scanned with a type of MRI called a “diffusion scan,” which maps the brain’s myelin connections and their strength, or integrity. Myelin is the fatty sheath that coats the brain’s nerve axons, allowing for efficient conduction of nerve impulses, and iron plays a key role in myelin production.

Eight to 12 years before the current imaging study, researchers measured the subjects’ blood transferrin levels. They hoped to determine whether iron availability in the developmentally crucial period of adolescence impacted the organization of the brain later in life.

“Adolescence is a period of high vulnerability to brain insults, and the brain is still very actively developing,” Thompson said.

By averaging the subjects’ transferrin levels, which had been assessed repeatedly — at 12, 14 and 16 years of age — the researchers estimated iron availability to the brain during adolescence, he said.

The team discovered that subjects who had elevated transferrin levels — a common sign of poor iron levels in a person’s diet — had structural changes in brain regions that are vulnerable to neurodegeneration. And further analyses of the twins in the study revealed that a common set of genes influences both transferrin levels and brain structure.

One of the genetic links — a specific variation in a gene called HFE, which is known to influence blood transferrin levels — was associated with reduced brain-fiber integrity, although subjects carrying this gene variant did not yet show any symptoms of disease or cognitive impairment.

“So this is one of the deep secrets of the brain,” Thompson said. “You wouldn’t think the iron in our diet would affect the brain so much in our teen years. But it turns out that it matters very much. Because myelin speeds your brain’s communications, and iron is vital for making myelin, poor iron levels in childhood erode your brain reserves which you need later in life to protect against aging and Alzheimer’s.

“This is remarkable, as we were not studying iron-deficient people, just around 600 normal healthy people. It underscores the need for a balanced diet in the teenage years, when your brain’s command center is still actively maturing.”

The findings, he said, may aid future studies of how iron transport affects brain function, development and the risk of neurodegeneration.

Grapes may help prevent age-related blindness

FRESNO, Calif. – Can eating grapes slow or help prevent the onset of age-related macular degeneration (AMD), a debilitating condition affecting millions of elderly people worldwide? Results from a new study published in Free Radical Biology and Medicine suggest this might be the case. The antioxidant actions of grapes are believed to be responsible for these protective effects.

The study compared the impact of an antioxidant-rich diet on vision using mice prone to developing retinal damage in old age in much the same way as humans do. The mice received either a grape-enriched diet, a diet with added lutein, or a normal diet.

The result? Grapes proved to offer dramatic protection: the grape-enriched diet protected against oxidative damage of the retina and prevented blindness in the mice that consumed it. While lutein was also effective, grapes were found to offer significantly more protection.

“The protective effect of the grapes in this study was remarkable, offering a benefit for vision at old age even if grapes were consumed only at young age,” said principal investigator Silvia Finnemann, PhD, Department of Biological Sciences, Fordham University in New York.

Dr. Finnemann noted that results from her study suggest that age-related vision loss is a result of cumulative, oxidative damage over time. “A lifelong diet enriched in natural antioxidants, such as those in grapes, appears to be directly beneficial for RPE and retinal health and function.”

Age-related macular degeneration is a progressive eye condition, leading to the deterioration of the center of the retina, called the macula. It is the leading cause of blindness in the elderly. Aging of the retina is associated with increased levels of oxidative damage, and oxidative stress is thought to play a pivotal role in the development of AMD.

In AMD, there is a known decline in the function of retinal pigment epithelium (RPE) cells, the support cells for the photoreceptors in the retina that are critical to the process of converting light into sight. The RPE dysfunction is caused by 1) a build-up of metabolic waste products (known as lipofuscin) in the RPE itself and 2) an oxidative burden on the RPE; together these compromise important metabolic pathways. The resulting dysfunction, distress and often death of the RPE cells leads to AMD.

This study showed that adding grapes to the diet prevented blindness in mice by significantly decreasing the build-up of lipofuscin and preventing the oxidative damage to the RPE, thus ensuring optimal functioning of this critical part of the retina.

“Preserving eye health is a key concern as we age and this study shows that grapes may play a critical role in achieving this,” said Kathleen Nave, president of the California Table Grape Commission. “This is good news for consumers of all ages who enjoy grapes, and adds to the growing body of evidence that grapes offer an array of health benefits.”

Expectant mothers on antidepressants risk newborns with high blood pressure

Mothers who take anti-depressants during pregnancy are more likely to give birth to children with persistent pulmonary hypertension (high blood pressure in the lungs), finds a study published today on bmj.com.

Persistent pulmonary hypertension is an increase in blood pressure in the lungs leading to shortness of breath and difficulty breathing. It is a rare, but severe disease with strong links to heart failure.

The study, carried out by researchers at the Centre for Pharmacoepidemiology at Karolinska Institutet in Stockholm Sweden, reviewed 1.6 million births in total between 1996 and 2007 in five Nordic countries: Denmark, Finland, Iceland, Norway and Sweden. The babies were assessed after 231 days (33 weeks).

A total of 1,618,255 singleton births were included in the study. Approximately 11,000 of the mothers filled a prescription for anti-depressants in late pregnancy and approximately 17,000 in early pregnancy. Those who filled a prescription were generally older mothers and more likely to smoke. A further 54,184 mothers were identified as having previously undergone psychiatric diagnosis but were not taking any medication at the time.

Factors taken into account during the study included persistent pulmonary hypertension, maternal smoking, BMI (in early pregnancy), year of birth, gestational age at birth, birth weight and maternal diseases including epilepsy, malignancies, arthritis, bowel disease, lupus and pre-eclampsia.

Several drugs were analysed, including fluoxetine, citalopram, paroxetine, sertraline, fluvoxamine and escitalopram. The researchers found, however, that fluvoxamine had rarely been used and that none of the children with persistent pulmonary hypertension had been exposed to it.

The results showed that of the 11,014 mothers who used anti-depressants in late pregnancy, just 33 babies (0.3%) were born with persistent pulmonary hypertension, and of the 17,053 mothers who used anti-depressants in early pregnancy, just 32 babies (less than 0.2%) were diagnosed with the condition. A total of 114 babies whose mothers had previously been diagnosed with a mental illness were found to be suffering from the disease.

For mothers using anti-depressants, factors such as being born small for gestational age, or by C-section did not influence the likelihood of having a child with persistent pulmonary hypertension.

While the authors acknowledge that the absolute risk of persistent pulmonary hypertension is low (around three cases per 1000 women who take anti-depressants in late pregnancy, more than double the background rate), they still advise caution when treating pregnant women with SSRIs.
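The absolute rates behind these figures can be checked directly from the counts reported above; the short calculation below expresses them as cases per 1,000 exposed births.

```python
# Cases of persistent pulmonary hypertension per 1,000 exposed births,
# using only the counts reported in the study.
groups = {
    "late-pregnancy antidepressant use": (33, 11014),
    "early-pregnancy antidepressant use": (32, 17053),
}
for name, (cases, births) in groups.items():
    print(f"{name}: {cases / births * 1000:.1f} per 1,000 births")
```

Late-pregnancy exposure works out to about 3 per 1,000 births, matching the "around three cases per 1000" figure, while early-pregnancy exposure is just under 2 per 1,000.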

In an accompanying editorial, researchers from the Motherisk Program Hospital for Sick Children in Toronto and the School of Pharmacy at the University of Oslo support the view that mothers who take SSRIs in late pregnancy are more likely to give birth to children with persistent pulmonary hypertension.

Researchers: Honeybee deaths linked to seed insecticide exposure

January 11, 2012

WEST LAFAYETTE, Ind. – Honeybee populations have been in serious decline for years, and Purdue University scientists may have identified one of the factors that cause bee deaths around agricultural fields.

Analyses of bees found dead in and around hives from several apiaries over two years in Indiana showed the presence of neonicotinoid insecticides, which are commonly used to coat corn and soybean seeds before planting. The research showed that those insecticides were present at high concentrations in waste talc that is exhausted from farm machinery during planting.

The insecticides clothianidin and thiamethoxam were also consistently found at low levels in soil (up to two years after treated seed was planted), on nearby dandelion flowers, and in corn pollen gathered by the bees, according to the findings released in the journal PLoS One this month.

“We know that these insecticides are highly toxic to bees; we found them in each sample of dead and dying bees,” said Christian Krupke, associate professor of entomology and a co-author of the findings.

The United States is losing about one-third of its honeybee hives each year, according to Greg Hunt, a Purdue professor of behavioral genetics, honeybee specialist and co-author of the findings. Hunt said no single factor is to blame; scientists believe that several stressors, such as mites and insecticides, are working together against the bees, which are important for pollinating food crops and wild plants.

“It’s like death by a thousand cuts for these bees,” Hunt said.

Krupke and Hunt received reports that bee deaths in 2010 and 2011 were occurring at planting time in hives near agricultural fields. Toxicological screenings performed by Brian Eitzer, a co-author of the study from the Connecticut Agricultural Experiment Station, for an array of pesticides showed that the neonicotinoids used to treat corn and soybean seed were present in each sample of affected bees. Krupke said other bees at those hives exhibited tremors, uncoordinated movement and convulsions, all signs of insecticide poisoning.

Seeds of most annual crops are coated in neonicotinoid insecticides for protection after planting. All corn seed and about half of all soybean seed is treated. The coatings are sticky, and in order to keep seeds flowing freely in the vacuum systems used in planters, they are mixed with talc. Excess talc used in the process is released during planting and routine planter cleaning procedures.

“Given the rates of corn planting and talc usage, we are blowing large amounts of contaminated talc into the environment. The dust is quite light and appears to be quite mobile,” Krupke said.

Krupke said the corn pollen that bees were bringing back to hives later in the year tested positive for neonicotinoids at levels below roughly 100 parts per billion.

“That’s enough to kill bees if sufficient amounts are consumed, but it is not acutely toxic,” he said.

On the other hand, the exhausted talc showed extremely high levels of the insecticides – up to about 700,000 times the lethal contact dose for a bee.

“Whatever was on the seed was being exhausted into the environment,” Krupke said. “This material is so concentrated that even small amounts landing on flowering plants around a field can kill foragers or be transported to the hive in contaminated pollen. This might be why we found these insecticides in pollen that the bees had collected and brought back to their hives.”

Krupke suggested that efforts could be made to limit or eliminate talc emissions during planting.

“That’s the first target for corrective action,” he said. “It stands out as being an enormous source of potential environmental contamination, not just for honeybees, but for any insects living in or near these fields. The fact that these compounds can persist for months or years means that plants growing in these soils can take up these compounds in leaf tissue or pollen.”

Although corn and soybean production does not require insect pollinators, that is not the case for most plants that provide food. Krupke said protecting bees benefits agriculture since most fruit, nut and vegetable crop plants depend upon honeybees for pollination. The U.S. Department of Agriculture estimates the value of honeybees to commercial agriculture at $15 billion to $20 billion annually.

Hunt said he would continue to study the sublethal effects of neonicotinoids. He said for bees that do not die from the insecticide there could be other effects, such as loss of homing ability or less resistance to disease or mites.

“I think we need to stop and try to understand the risks associated with these insecticides,” Hunt said.

The North American Pollinator Protection Campaign and the USDA’s Agriculture and Food Research Initiative funded the research.

NIH study shows 32 million Americans have autoantibodies that target their own tissues

More than 32 million people in the United States have autoantibodies, which are proteins made by the immune system that target the body’s tissues and define a condition known as autoimmunity, a study shows. The first nationally representative sample looking at the prevalence of the most common type of autoantibody, known as antinuclear antibodies (ANA), found that the frequency of ANA is highest among women, older individuals, and African-Americans. The study was conducted by the National Institute of Environmental Health Sciences (NIEHS), part of the National Institutes of Health. Researchers at the University of Florida in Gainesville also participated.

Earlier studies have shown that ANA can actually develop many years before the clinical appearance of autoimmune diseases, such as type 1 diabetes, lupus, and rheumatoid arthritis. ANA are frequently measured biomarkers for detecting autoimmune diseases, but the presence of autoantibodies does not necessarily mean a person will get an autoimmune disease. Other factors, including drugs, cancer, and infections, are also known to cause autoantibodies in some people.

“Previous estimates of ANA prevalence have varied widely and were conducted in small studies not representative of the general population,” said Frederick Miller, M.D., Ph.D., an author of the study and acting clinical director at NIEHS. “Having this large data set that is representative of the general U.S. population and includes nearly 5,000 individuals provides us with an accurate estimate of ANA and may allow new insights into the etiology of autoimmune diseases.” The findings appear online in the Jan. 11 issue of the journal Arthritis & Rheumatism.

Miller, who studies the causes of autoimmune diseases, explains that the body’s immune system makes large numbers of proteins called antibodies to help the body fight off infections. In some cases, however, antibodies are produced that are directed against one’s own tissues. These are referred to as autoantibodies.

A multidisciplinary team of researchers evaluated blood serum samples using a technique called immunofluorescence to detect ANA in 4,754 individuals from the 1999-2004 National Health and Nutrition Examination Survey (NHANES). The overall prevalence of ANA in the population was 13.8 percent, and was found to be modestly higher in African-Americans compared to whites. ANA generally increased with age and were higher in women than in men, with the female to male ratio peaking at 40-49 years of age and then declining in older age groups.
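As a rough sanity check, the article’s headline figure of “more than 32 million” follows from applying the 13.8 percent sample prevalence to the U.S. reference population. The sketch below assumes a reference population of about 232 million (the study’s NHANES sample covers individuals aged 12 and older); that population figure is an illustrative assumption, not a number from the article:

```python
# Hedged sketch: extrapolating a national count from sample prevalence.
# The 13.8% prevalence comes from the article; the reference population
# of ~232 million (U.S. residents aged 12+) is an assumed figure.
prevalence = 0.138            # ANA prevalence in the NHANES sample
us_pop_12_plus = 232_000_000  # assumed reference population

estimated_ana_positive = prevalence * us_pop_12_plus
print(f"Estimated ANA-positive individuals: {estimated_ana_positive:,.0f}")
# roughly 32 million, consistent with the article's headline figure
```

Note that the published estimate would have been computed with NHANES survey weights rather than this simple multiplication, so the sketch only illustrates the order of magnitude.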

“The peak of autoimmunity in females compared to males during the 40-49 age bracket is suggestive of the effects that the hormones estrogen and progesterone might be playing on the immune system,” said Linda Birnbaum, Ph.D., director of NIEHS and an author on the paper.

The paper also found that the prevalence of ANA was lower in overweight and obese individuals than persons of normal weight. “This finding is interesting and somewhat unexpected,” said Edward Chan, Ph.D., an author on the study and professor of the Department of Oral Biology at the University of Florida.

“It raises the likelihood that fat tissues can secrete proteins that inhibit parts of the immune system and prevent the development of autoantibodies, but we will need to do more research to understand the role that obesity might play in the development of autoimmune diseases,” said Minoru Satoh, M.D., Ph.D., another author on the study and associate professor of rheumatology and clinical immunology at the University of Florida.

The researchers say the paper should serve as a useful baseline for future studies looking at changes in ANA prevalence over time and the factors associated with ANA development. The paper is the first in a series analyzing these data from the NHANES dataset, and exploring possible environmental associations with ANA.

Study finds curcumin as effective as antidepressants

In a recently published study, a highly absorbable form of curcumin called BCM-95 was compared to fluoxetine (one brand name is Prozac®) and imipramine (one brand name is Tofranil®) in an animal model of depression and found to be as effective in alleviating depression as either prescription drug. The study [Sanmukhani J, Anovadiya A, Tripathi CB. Evaluation of Antidepressant-Like Activity of Curcumin and its Combination with Fluoxetine and Imipramine: an Acute and Chronic Study. Acta Pol Pharm. 2011 Sep-Oct;68(5):769-75] also found no significant adverse safety issues with BCM-95 curcumin use. The study authors theorize that “antidepressant-like activity could be due to an increase in serotonin, norepinephrine and dopamine levels in the brain.” BCM-95 curcumin is an extract of the spice turmeric, but only 2 to 5 percent of turmeric is curcumin. BCM-95 curcumin has been shown in published human studies to have up to 10 times the absorption of standard curcumin. [Antony B, Merina B, Iyer VS, et al. A Pilot Cross-Over Study to Evaluate Human Oral Bioavailability of BCM-95CG (Biocurcumax), A Novel Bioenhanced Preparation of Curcumin. Indian J Pharm Sci. 2008;70(4):445-449]

Dr. Benny Antony, lead author of the absorption trial, commented on the new study, “It does not matter how much you take—it matters how much you absorb. BCM-95 curcumin is not only significantly better absorbed than standard curcumin; the curcuminoids are absorbed in the ratio in which they occur in nature. I personally feel this plays a role in BCM-95’s effectiveness, and I am glad to see more studies illuminating the health benefits of this extraordinary herb.”

According to the Centers for Disease Control and Prevention (CDC), an estimated one in ten adults report depression, which translates to over 30 million individuals in the United States alone. Experts estimate that 20 to 45 percent of antidepressant users fail to respond to treatment and between 5 and 20 percent of patients (depending on type of drug) stop using these medications due to severe adverse effects.

________________________________

These reports are compiled with appreciation for all the doctors, scientists, and other medical researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without base aspirations for fame or fortune. Just honorable people, doing honorable things.