The aim of this study was to explore the relationship of coexisting severe frailty and malnutrition with all-cause mortality among the oldest old in nursing homes. The study was conducted among all residents (n=160) aged 85 years and older living in two nursing homes in Japan. Information about participants' health status was gathered from medical history, medical documentation, and assessments of frailty and nutrition according to the Canadian Study of Health and Aging-Clinical Frailty Scale (CSHA-CFS) and the Mini Nutritional Assessment Short Form (MNA-SF). Seventy-five residents (46.9%) were identified as having coexisting severe frailty and malnutrition. After a 12-month follow-up period, 42 residents (26.3%) had died. In Cox regression analysis, coexisting severe frailty and malnutrition, as well as heart failure, were associated with mortality during the 12-month follow-up among the oldest old nursing home residents (adjusted HR 10.89, 95% CI 4.04-29.33, p<0.0001; and adjusted HR 7.83, 95% CI 3.25-18.88, p<0.0001, respectively). The present study suggests that coexisting severe frailty and malnutrition is very frequent and is associated with all-cause mortality among the oldest old in nursing homes.

KEYWORDS:

Frailty; Malnutrition; Mortality; Nursing home; Oldest old

[It looks as though only the Spanish full text is available.]

Changes in fatty liver index after consuming a Mediterranean diet: 6-year follow-up of the PREDIMED-Malaga trial.

To analyze the effect of an intervention with a Mediterranean diet supplemented with either extra virgin olive oil or nuts, on the fatty liver index (FLI), compared to a low-fat control diet.

METHODS:

Participants of the PREDIMED-Malaga trial who were free from cardiovascular disease at baseline, but at high risk of developing it, were included in this study. Anthropometric measurements were taken and blood samples were drawn to calculate participants' FLI at study baseline and after 1, 3, 5, and 6 years. Mixed linear models were used to explore the fixed effects of the 3 intervention groups on the FLI, as well as their interaction with time.
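The FLI referenced here is conventionally computed from triglycerides, BMI, gamma-glutamyl transferase (GGT), and waist circumference with the formula of Bedogni and colleagues; whether PREDIMED-Malaga used exactly this variant is an assumption. A minimal sketch, with illustrative input values:

```python
import math

def fatty_liver_index(tg_mg_dl, bmi, ggt_u_l, waist_cm):
    """Fatty liver index (0-100) per the widely used Bedogni et al. formula."""
    y = (0.953 * math.log(tg_mg_dl) + 0.139 * bmi
         + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
    return math.exp(y) / (1 + math.exp(y)) * 100

# Illustrative values only; FLI >= 60 is the cutoff usually taken to suggest fatty liver.
print(round(fatty_liver_index(tg_mg_dl=150, bmi=29, ggt_u_l=40, waist_cm=98), 1))
```

Because the logistic transform bounds the index between 0 and 100, small changes in the FLI near the extremes correspond to larger changes in the underlying risk score.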

RESULTS:

A total of 276 participants were included in the study. Average participant age was 67 years, with 66% of participants being women. The baseline prevalence of NAFL was 57%. The change in the FLI of the control group increased significantly over time (1.13±0.41; P=.006). In the MedDiet+EVOO group, the time trend of the change in the FLI was similar to that of the control group, although it was seen to be lower (-3.90±1.9; P=.038). In the MedDiet+Nuts group, the trend was significantly lower than that of the control group (-1.63±0.62; P=.009). In the MedDiet+Nuts group, the trend of changes in participants' BMI was 0.100 points lower per year compared to the control group (P=.004). In the control group, the change in waist circumference increased significantly over time (0.61±0.16cm/year; P<.001) in contrast to the MedDiet+EVOO group, in which this variable remained stable (-0.51±0.22; P=.019).

CONCLUSIONS:

A dietary intervention consisting of a Mediterranean diet could delay or slow the natural progression of NAFL and thus be beneficial for its prevention and treatment. However, further studies supporting these conclusions have yet to be carried out.

In comparison with 1980-1990, an overall decrease in the incidence of suicide was found. The annual averages for male, female, and total suicide rates were 32.2, 11.0, and 21.3, respectively, representing decreases of 17.6%, 25.7%, and 19.3%. Decreases were observed in all age groups except for males aged 80-84 years, where the suicide rate was 123.5 (15.1% increase), and for males aged 85 years or over, where the rate was 148.9 (25.4% increase). Hanging is still the most frequently used suicide method in Austria, despite steady decreases during recent decades. During the 1990s, hanging was used in 47.5% of male suicides and 34.8% of female cases. Shooting is the next most common method for male suicides (23.5% of cases) and has become more frequent for both sexes.

CONCLUSIONS:

The main findings reveal that the decrease in suicide incidence in Austria is greater for females than for males, reflecting the increased suicide risk within the oldest male age groups. This population subgroup should thus be a particular target for suicide-prevention efforts in Austria. A further aim within a national strategy for suicide prevention should be to curb the increasing use of shooting as a suicide method.

KEYWORDS:

Austria; Epidemiology; Suicide; Suicide prevention

Role of Insulin-Stimulated Adipose Tissue Perfusion in the Development of Whole-Body Insulin Resistance.

After food ingestion, macronutrients are transported to and stored in skeletal muscle and adipose tissue. They can subsequently be used as an energy source in times of energy deprivation. Uptake of these nutrients by myocytes and adipocytes depends largely on adequate tissue perfusion. Interestingly, insulin is able to dilate skeletal muscle arterioles, which facilitates the delivery of nutrients and of insulin itself to muscle tissue. Insulin-stimulated skeletal muscle perfusion is impaired in several insulin-resistant states and is believed to contribute to impaired skeletal muscle and, consequently, whole-body glucose uptake. Insulin-resistant individuals also exhibit blunted postprandial adipose tissue perfusion. However, the relevance of this impairment to metabolic dysregulation is less clear. In this review, we provide an overview of adipose tissue perfusion in healthy and insulin-resistant individuals, its regulation (among other factors, by insulin), and the possible influence of impaired adipose tissue perfusion on whole-body insulin sensitivity. Finally, we propose a novel hypothesis that acute overfeeding affects the distribution of macronutrients by reducing skeletal muscle perfusion while adipose tissue perfusion remains intact.

Every week, we're on the lookout for thought-provoking and off-the-radar bits of health news. Here's a selection of stories from this week's health newsletter, Second Opinion.

...

Perils of calculating your date of demise

Date with death

How long will you live? How many days do you have left? A number of online sites claim to offer the answer. So-called "death clocks" ask for a variety of details, and then calculate your life expectancy.

Now a British economist has tested a random sample of those sites and found the results to be as unpredictable as, well, life itself. In the end, he was given life expectancies ranging from 67 to 89 years.

But before you investigate your own death clock, you might want to heed his advice: "Death clocks should come with a health warning: calculating your date of demise is somewhat sobering and the results should be taken with a pinch of salt."

Hopefully reading this isn't making you drowsy. But if it is, how was your sleep last night?

A panel of experts has weighed in on what constitutes quality sleep. Among the findings: continuity is key. That means falling asleep in less than 30 minutes, waking up no more than once during the night, and falling back asleep within 20 minutes.

Overall, they found that sleeping at least 85 per cent of the time spent lying in bed constitutes quality sleep. The authors say that with all the new sleep-tracking gadgets on the market, it's important to have accepted indicators so people can better gauge their sleep.
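That 85 per cent figure corresponds to sleep efficiency: time asleep divided by time in bed. A tiny sketch of the arithmetic (the threshold comes from the panel's indicator above; the sample numbers are made up):

```python
def sleep_efficiency(minutes_asleep, minutes_in_bed):
    """Sleep efficiency as a percentage: time asleep relative to time in bed."""
    return minutes_asleep / minutes_in_bed * 100

# Hypothetical night: 7 hours asleep out of 8 hours in bed.
print(sleep_efficiency(420, 480))  # 87.5, which clears the 85% quality-sleep bar
```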

To provide evidence-based recommendations and guidance to the public regarding indicators of good sleep quality across the life-span.

Methods

The National Sleep Foundation assembled a panel of experts from the sleep community and representatives appointed by stakeholder organizations (Sleep Quality Consensus Panel). A systematic literature review identified 277 studies meeting inclusion criteria. Abstracts and full-text articles were provided to the panelists for review and discussion. A modified Delphi RAND/UCLA Appropriateness Method with 3 rounds of voting was used to determine agreement.

Results

For most of the sleep continuity variables (sleep latency, number of awakenings >5 minutes, wake after sleep onset, and sleep efficiency), the panel members agreed that these measures were appropriate indicators of good sleep quality across the life-span. However, overall, there was less or no consensus regarding sleep architecture or nap-related variables as elements of good sleep quality.

Conclusions

There is consensus among experts regarding some indicators of sleep quality among otherwise healthy individuals. Education and public health initiatives regarding good sleep quality will require sustained and collaborative efforts from multiple stakeholders. Future research should explore how sleep architecture and naps relate to sleep quality. Implications and limitations of the consensus recommendations are discussed.

At baseline, 42% of the 5,888 participants were men and 84% were white. At enrollment, 3.7% (215 of 5,888) met the criteria for prevalent epilepsy. During 14 years of follow-up totaling 48,651 person-years, 120 participants met the criteria for incident epilepsy, yielding an incidence rate of 2.47 per 1,000 person-years. The period prevalence of epilepsy by the end of follow-up was 5.7% (335 of 5,888). Epilepsy incidence rates were significantly higher among blacks than nonblacks: 4.44 vs 2.17 per 1,000 person-years (p < 0.001). In multivariable analyses, risk of incident epilepsy was significantly higher among blacks compared to nonblacks (hazard ratio [HR] 4.04, 95% confidence interval [CI] 1.99-8.17), those 75 to 79 compared to those 65 to 69 years of age (HR 2.07, 95% CI 1.21-3.55), and those with history of stroke (HR 3.49, 95% CI 1.37-8.88).
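The reported rate and prevalence follow directly from the counts in the abstract; a quick arithmetic check:

```python
def rate_per_1000_person_years(cases, person_years):
    """Incidence rate expressed per 1,000 person-years of follow-up."""
    return cases / person_years * 1000

# 120 incident cases over 48,651 person-years of follow-up
print(round(rate_per_1000_person_years(120, 48651), 2))  # 2.47 per 1,000 person-years

# Period prevalence by end of follow-up: 335 of 5,888 participants
print(round(335 / 5888 * 100, 1))  # 5.7 (%)
```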

CONCLUSIONS:

Epilepsy was common among older adults in the United States. Blacks, the very old, and those with a history of stroke had a higher risk of incident epilepsy. The association with race remains unexplained.

Emerging evidence has demonstrated that the gut microbiome plays essential roles in the pathogenesis of human diseases in distal organs. Amyotrophic lateral sclerosis (ALS) is a fatal neurodegenerative disease characterized by the progressive loss of motor neurons. Treatment with riluzole, the only drug approved by the US Food and Drug Administration for use in ALS, extends a patient's life span by only a few months. Thus, there is an urgent need to develop novel interventions that alleviate disease progression and improve quality of life in patients with ALS. Here we present evidence that intestinal dysfunction and dysbiosis may actively contribute to ALS pathophysiology.

METHODS:

We used G93A transgenic mice as a model of human ALS. The G93A mice show abnormal intestinal microbiome and damaged tight junctions before ALS disease onset. The mice were given 2% butyrate, a natural bacterial product, in the drinking water.

RESULTS:

In mice fed butyrate, intestinal microbial homeostasis was restored, gut integrity was improved, and life span was prolonged compared with control mice. At the cellular level, abnormal Paneth cells (specialized intestinal epithelial cells that regulate host-bacterial interactions) were significantly decreased in ALS mice treated with butyrate. In both ALS mice and intestinal epithelial cells cultured from humans, butyrate treatment was associated with decreased aggregation of the G93A superoxide dismutase 1 mutant protein.

IMPLICATIONS:

The findings from this study highlight the complex role of the gut microbiome and intestinal epithelium in the progression of ALS and present butyrate as a potential therapeutic reagent for restoring ALS-related dysbiosis.

Low-grade chronic inflammation is associated with several chronic conditions, and diet is known to play a role in chronic inflammation. We aimed to evaluate the association between the inflammatory potential of the diet and mortality in the Spanish population from the European Prospective Investigation into Cancer and Nutrition (EPIC-Spain).

METHODS AND RESULTS:

The study included 41,199 participants (62% female) aged 29-69 years from 5 Spanish regions. During 18 years of follow-up, 3,316 deaths were identified. The dietary inflammatory potential was assessed by means of an inflammatory score of the diet (ISD), calculated using 30 dietary components and their corresponding inflammatory scores (weights). The association between the ISD and mortality was analyzed by multivariate Cox regression models. There was a significant association between ISD and mortality: subjects in the fifth quintile of the ISD (the most pro-inflammatory diets) had a hazard ratio of 1.42 (95% confidence interval 1.25-1.60) compared with those in the first quintile; the corresponding figures were 1.89 (1.48-2.40) for cardiovascular disease mortality and 1.44 (1.22-1.69) for death by cancer.
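The ISD described above is a weighted sum over dietary components. A schematic sketch; the component names, weights, and intakes below are hypothetical, and only the weighted-sum structure mirrors the abstract's description (the real score uses 30 components):

```python
# Hypothetical weights: positive = pro-inflammatory, negative = anti-inflammatory.
weights = {"red_meat": 0.5, "refined_grains": 0.3, "fish": -0.4, "vegetables": -0.6}

def inflammatory_score(intakes, weights):
    """Weighted sum of (standardized) intakes; higher = more pro-inflammatory diet."""
    return sum(weights[c] * intakes[c] for c in weights)

# Two hypothetical diets: one meat-heavy, one vegetable- and fish-heavy.
pro = {"red_meat": 2.0, "refined_grains": 1.5, "fish": 0.2, "vegetables": 0.5}
anti = {"red_meat": 0.3, "refined_grains": 0.5, "fish": 1.5, "vegetables": 2.0}
print(inflammatory_score(pro, weights) > inflammatory_score(anti, weights))  # True
```

Participants would then be ranked into quintiles of this score, as in the Cox models above.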

CONCLUSION:

Consuming more pro-inflammatory diets, expressed by means of the ISD, is associated with higher mortality; this effect seems to be stronger for deaths by cardiovascular diseases.

Upon eating, the body needs to distribute glucose and fight the bacteria ingested, triggering an inflammatory response that activates the immune system.

Food is more than just a way to acquire energy; eating also involves the consumption of significant quantities of bacteria. The body is then faced with the job of distributing glucose to cells via the bloodstream while confronting hostile bacteria. Researchers from the University Hospital Basel have discovered that in healthy individuals, the body triggers the immune system into action, resulting in a natural inflammatory response. In overweight people, however, this inflammatory response fails for some reason, increasing an individual's risk of diabetes.

It is well documented that adult-onset diabetes (type 2 diabetes) leads to widespread inflammation. Doctors treat this form of diabetes with drugs that stem the overproduction of a chemical called interleukin-1beta (IL-1beta). In diabetic patients, this compound is known to kill insulin-producing cells and cause chronic inflammation. The report was published in the journal Nature Immunology.

Immune Cells Activated During Meal Time

The study points out that inflammation plays an important role in the healing process, as it activates the immune system to respond to threats. Short-term inflammation in healthy people also plays a key role in regulating sugar uptake, the process of transporting glucose to the body's cells.

In the study, Professor Marc Donath and his team of researchers demonstrated that a number of immune cell types in the intestinal walls measurably increase at meal time. These scavenger cells produce the chemical IL-1beta, whose concentration varies depending on glucose levels in the bloodstream. The immune cells also stimulate the production of insulin from the pancreas, and higher insulin in turn increases the concentration of IL-1beta in the blood. In regulating blood sugar levels, insulin and interleukin-1beta thus appear to work together. Interleukin-1beta seems to ensure a steady supply of glucose, which is vital to keep the immune system active.

Our Immune Systems Vulnerable When Lacking Nutrients

When we eat sufficient nutrients, foreign bacteria can be effectively combated by our immune systems. But if we are lacking in nutrients, our bodies must conserve the remaining calories for vital functions, leaving the immune system vulnerable. This can explain why infectious diseases are commonplace during times of famine. Researchers now believe that the mechanisms of the immune system and metabolism are closely dependent on the levels of nutrients and bacteria we consume during our meals.

The deleterious effect of chronic activation of the IL-1β system on type 2 diabetes and other metabolic diseases is well documented. However, a possible physiological role for IL-1β in glucose metabolism has remained unexplored. Here we found that feeding induced a physiological increase in the number of peritoneal macrophages that secreted IL-1β, in a glucose-dependent manner. Subsequently, IL-1β contributed to the postprandial stimulation of insulin secretion. Accordingly, lack of endogenous IL-1β signaling in mice during refeeding and obesity diminished the concentration of insulin in plasma. IL-1β and insulin increased the uptake of glucose into macrophages, and insulin reinforced a pro-inflammatory pattern via the insulin receptor, glucose metabolism, production of reactive oxygen species, and secretion of IL-1β mediated by the NLRP3 inflammasome. Postprandial inflammation might be limited by normalization of glycemia, since it was prevented by inhibition of the sodium-glucose cotransporter SGLT2. Our findings identify a physiological role for IL-1β and insulin in the regulation of both metabolism and immunity.

Dietary Supplements and Risk of Cause-Specific Death, Cardiovascular Disease, and Cancer: A Systematic Review and Meta-Analysis of Primary Prevention Trials.

Our aim was to assess the efficacy of dietary supplements in the primary prevention of cause-specific death, cardiovascular disease (CVD), and cancer by using meta-analytical approaches. Electronic and hand searches were performed until August 2016. Inclusion criteria were as follows: 1) minimum intervention period of 12 mo; 2) primary prevention trials; 3) mean age ≥18 y; 4) interventions included vitamins, fatty acids, minerals, supplements containing combinations of vitamins and minerals, protein, fiber, prebiotics, and probiotics; and 5) primary outcome of all-cause mortality and secondary outcomes of mortality or incidence from CVD or cancer. Pooled effects across studies were estimated by using random-effects meta-analysis. Overall, 49 trials (69 reports) including 287,304 participants met the inclusion criteria. Thirty-two trials were judged to be at low risk of bias, 15 at moderate risk, and 2 at high risk of bias. Supplements containing vitamin E (RR: 0.88; 95% CI: 0.80, 0.96) significantly reduced cardiovascular mortality risk, whereas supplements with folic acid reduced the risk of CVD (RR: 0.81; 95% CI: 0.70, 0.94). Vitamins D, C, and K; selenium; zinc; magnesium; and eicosapentaenoic acid showed no significant risk reduction for any of the outcomes. In contrast, vitamin A was linked to an increased cancer risk (RR: 1.16; 95% CI: 1.00, 1.35). Supplements with β-carotene showed no significant effect overall; however, in the subgroup given β-carotene alone, a 6% increase in all-cause mortality (RR: 1.06; 95% CI: 1.02, 1.10) was observed. Taken together, we found insufficient evidence to support the use of dietary supplements in the primary prevention of cause-specific death, incidence of CVD, and incidence of cancer. Some supplements generated small beneficial effects; however, the heterogeneous types and doses of supplements limit the generalizability of these findings to the overall population.
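The pooled RRs quoted above come from random-effects meta-analysis; a standard estimator for the between-study variance is DerSimonian-Laird, which can be sketched as follows. The inputs here are synthetic, not the trials in this review:

```python
import math

def dersimonian_laird(log_rr, var):
    """Random-effects pooling of log relative risks (DerSimonian-Laird tau^2)."""
    k = len(log_rr)
    w = [1 / v for v in var]                                      # fixed-effect weights
    y_fe = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)     # fixed-effect mean
    q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, log_rr))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                            # between-study variance
    w_re = [1 / (v + tau2) for v in var]                          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Synthetic example: three trials with RRs of 0.85, 0.95, and 0.80.
rr, ci = dersimonian_laird([math.log(0.85), math.log(0.95), math.log(0.80)],
                           [0.01, 0.02, 0.015])
print(round(rr, 2), round(ci[0], 2), round(ci[1], 2))
```

When the heterogeneity statistic Q falls below k-1, tau² is truncated to zero and the estimate collapses to the fixed-effect result, as happens with these synthetic inputs.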

Magnesium is a mineral that is essential to hundreds of biochemical reactions in the body – notably, the metabolism of glucose and the production of cellular energy. F. Guerrero-Romero, from the Institute at Durango (Mexico), and colleagues enrolled 116 men and non-pregnant women, ages 30 to 65 years, with hypomagnesaemia and newly diagnosed with prediabetes, in a study in which subjects were randomized to receive either magnesium chloride (30 mL, 5% solution; equivalent to 382mg of magnesium) or an inert placebo, once daily for four months. The group that received magnesium chloride displayed significant reductions in fasting and post-meal blood glucose levels, as well as reduced insulin resistance. Further, the magnesium group also showed reductions in triglycerides and increases in high-density lipoprotein (HDL, “good” cholesterol). The study authors report that: “Our results show that magnesium supplementation reduces plasma glucose levels, and improves the glycaemic status of adults with prediabetes and hypomagnesaemia.”

This study evaluated the efficacy of oral magnesium supplementation in the reduction of plasma glucose levels in adults with prediabetes and hypomagnesaemia.

METHODS:

A total of 116 men and non-pregnant women, aged 30 to 65 years with hypomagnesaemia and newly diagnosed with prediabetes, were enrolled into a randomized double-blind placebo-controlled trial to receive either 30 mL of MgCl2 5% solution (equivalent to 382 mg of magnesium) or an inert placebo solution once daily for four months. The primary trial endpoint was the efficacy of magnesium supplementation in reducing plasma glucose levels.

Magnesium is the fourth most abundant mineral in the body. It has been recognized as a cofactor for more than 300 enzymatic reactions, where it is crucial for adenosine triphosphate (ATP) metabolism. Magnesium is required for DNA and RNA synthesis, reproduction, and protein synthesis. Moreover, magnesium is essential for the regulation of muscular contraction, blood pressure, insulin metabolism, cardiac excitability, vasomotor tone, nerve transmission, and neuromuscular conduction. Imbalances in magnesium status (primarily hypomagnesemia, which occurs more commonly than hypermagnesemia) might result in unwanted neuromuscular, cardiac, or nervous disorders. Based on magnesium's many functions within the human body, it plays an important role in the prevention and treatment of many diseases. Low levels of magnesium have been associated with a number of chronic diseases, such as Alzheimer's disease, insulin resistance and type 2 diabetes mellitus, hypertension, cardiovascular disease (e.g., stroke), migraine headaches, and attention deficit hyperactivity disorder (ADHD).

Higher dietary intake of potassium, calcium, and magnesium is protective against ischemic stroke and is also associated with a decreased risk of all-cause dementia. The effect of dietary iron intake on cerebral function is less clear, but iron is also implicated in Alzheimer neuropathology. The aim of this study was to investigate whether dietary intake of these minerals was also associated with increased risk of amnestic mild cognitive impairment (MCI) and other mild cognitive disorders (MCD).

METHODS:

Associations between dietary mineral intake and risk of MCI/MCD were assessed in cognitively healthy individuals (n = 1406, 52% female, mean age 62.5 years) living in the community, who were followed up over 8 years. Relative risk was assessed with Cox hazard ratios (HRs) after controlling for health and socio-demographic covariates.

These findings suggest that dietary intake of minerals implicated in biological processes associated with vascular and Alzheimer's pathology may contribute to progression early in the disease process; this warrants further attention.

Green tea is one of the most widely consumed beverages in Asia. While a possible protective role of green tea against various chronic diseases has been suggested in experimental studies, evidence from human studies remains controversial.

Dyslipidaemia is characterized by increased blood levels of total or LDL cholesterol and triglycerides, or decreased HDL cholesterol levels, and is a risk factor for cardiovascular disease. Dyslipidaemia has a high worldwide prevalence, and many patients are turning to alternatives to pharmacotherapy to manage their lipid levels. Lifestyle modification should be emphasized in all patients to reduce cardiovascular risk and can be initiated before pharmacotherapy in primary prevention of cardiovascular disease. Many functional foods and natural health products have been investigated for potential lipid-lowering properties. Those with good evidence for a biochemical effect on plasma lipid levels include soy protein, green tea, plant sterols, probiotic yogurt, marine-derived omega-3 fatty acids and red yeast rice. Other products such as seaweed, berberine, hawthorn and garlic might confer some limited benefit in certain patient groups. Although none of these products can reduce lipid levels to the same extent as statins, most are safe to use in addition to other lifestyle modifications and pharmacotherapy. Natural health products marketed at individuals with dyslipidaemia, such as policosanol, guggulsterone and resveratrol, have minimal definitive evidence of a biochemical benefit. Additional research is required in this field, which should include large, high-quality randomized controlled trials with long follow-up periods to investigate associations with cardiovascular end points.

The aim of this study was to evaluate the effect of the Mediterranean diet (MedDiet) on the incidence of heart failure (HF), a pre-specified secondary outcome in the PREDIMED (PREvención con DIeta MEDiterránea) primary nutrition-intervention prevention trial.

METHODS AND RESULTS:

Participants at high risk of cardiovascular disease were randomly assigned to one of three diets: MedDiet supplemented with extra-virgin olive oil (EVOO), MedDiet supplemented with nuts, or a low-fat control diet. Incident HF was ascertained by a Committee for Adjudication of events blinded to group allocation. Among 7403 participants without prevalent HF followed for a median of 4.8 years, we observed 29 new HF cases in the MedDiet with EVOO group, 33 in the MedDiet with nuts group, and 32 in the control group. No significant association with HF incidence was found for the MedDiet with EVOO and MedDiet with nuts, compared with the control group [hazard ratio (HR) 0.68; 95% confidence interval (CI) 0.41-1.13, and HR 0.92; 95% CI 0.56-1.49, respectively].

CONCLUSION:

In this sample of adults at high cardiovascular risk, the MedDiet did not result in lower HF incidence. However, this pre-specified secondary analysis may have been underpowered to provide valid conclusions. Further randomized controlled trials with HF as a primary outcome are needed to better assess the effect of the MedDiet on HF risk.

The incidence of prostate cancer is much lower in Asian than in Western populations. Lifestyle and dietary habits may play a major role in the etiology of this cancer. Given the possibility that risk factors for prostate cancer differ by disease aggressiveness, and the fact that 5-year relative survival rate of localized prostate cancer is 100%, identifying preventive factors against advanced prostate cancer is an important goal. Using data from the Japan Public Health Center-based Prospective Study, the author elucidates various lifestyle risk factors for prostate cancer among Japanese men. The results show that abstinence from alcohol and tobacco might be important factors in the prevention of advanced prostate cancer. Moreover, the isoflavones and green tea intake in the typical Japanese diet may decrease the risk of localized and advanced prostate cancers, respectively.

[It looked to me like the authors incorrectly described their data, at least in the abstract.]

Risk of incident ischemic stroke according to the metabolic health and obesity states in the Vascular-Metabolic CUN cohort.

BACKGROUND:

Whether obesity is a major risk factor for cardiovascular disease in the absence of metabolic comorbidities remains under debate. Indeed, some obese individuals may be at low risk of metabolic-related complications, while normal-weight individuals may not be "healthy."

AIMS:

To assess the incidence of ischemic stroke according to the metabolic health and obesity states of 5171 participants from the Vascular-Metabolic CUN cohort.

METHODS:

A Cox proportional-hazards analysis was conducted to estimate the hazard ratios and their 95% confidence intervals for stroke according to metabolic health and obesity states, based on the TyG index and Adult Treatment Panel-III criteria, during 9.1 years of follow-up.

RESULTS:

After 50,056.2 person-years of follow-up, 162 subjects developed an ischemic stroke (incidence rate 3.23 per 1000 person-years). Metabolically healthy obese subjects did not show greater risk of stroke, while metabolically unhealthy participants, obese and non-obese, had an increased risk of stroke compared with healthy non-obese participants. The hazard ratios in the multivariable-adjusted model were 1.55 (95% CI: 1.36-1.77) and 1.86 (95% CI: 1.57-2.21), respectively.

CONCLUSIONS:

Metabolically unhealthy individuals exhibited a greater risk of ischemic stroke than metabolically healthy obese individuals.

The achievement of sustainability and health objectives in Western countries requires a transition to a less meat-based diet. This article investigates whether the alleged link between meat consumption and particular framings of masculinity, which emphasize that 'real men' eat meat, may stand in the way of achieving these objectives. From a theoretical perspective, it was assumed that the meat-masculinity link is not invariant but dependent on the cultural context, including ethnicity. In order to examine the link in different contexts, we analyzed whether meat-related gender differences varied across ethnic groups, using samples of young second generation Chinese Dutch, Turkish Dutch and native Dutch adults (aged 18-35) in the Netherlands. The Turkish group was the most traditional; it showed the largest gender differences and the strongest meat-masculinity link. In contrast, the native group showed the smallest gender differences and the weakest meat-masculinity link. The findings suggest that the combination of traditional framings of masculinity and the Western type of food environment where meat is abundant and cheap is bound to seriously hamper a transition to a less meat-based diet. In contrast, less traditional framings of masculinity seem to contribute to more healthy food preferences with respect to meat. It was concluded that cultural factors related to gender and ethnic diversity can play harmful and beneficial roles for achieving sustainability and health objectives.

The present studies examine how culturally held stereotypes about gender (that women eat more healthfully than men) implicitly influence food preferences. In Study 1, priming masculinity led both male and female participants to prefer unhealthy foods, while priming femininity led both male and female participants to prefer healthy foods. Study 2 extended these effects to gendered food packaging. When the packaging and healthiness of the food were gender schema congruent (i.e., feminine packaging for a healthy food, masculine packaging for an unhealthy food) both male and female participants rated the product as more attractive, said that they would be more likely to purchase it, and even rated it as tasting better compared to when the product was stereotype incongruent. In Study 3, packaging that explicitly appealed to gender stereotypes (“The muffin for real men”) reversed the schema congruity effect, but only among participants who scored high in psychological reactance.

Transient ischemic attack (TIA) epidemiology may have changed in recent years as a consequence of improved identification and treatment of vascular risk factors. Our aim was to provide updated information about TIA epidemiology in Italy.

METHODS:

Cases of first-ever TIA were ascertained from January 1, 2011, until December 31, 2012, in a population-based prospective registry. All residents in the L'Aquila district with an incident TIA were included and followed up to 2 years after the event. Outcome events were recurrent TIA, nonfatal and fatal stroke, nonfatal and fatal myocardial infarction, and all-cause mortality.

RESULTS:

A total of 210 patients with a TIA according to the traditional time-based definition were included (51.4% women); 151 patients (71.9%) with transient symptoms and negative brain neuroimaging were broadly considered as tissue-based TIA, 29 patients (13.8%) had transient symptoms and evidence of a congruous acute ischemic lesion, and 30 patients (14.3%) had an acute neurovascular syndrome. The crude annual incidence rate for traditional time-based TIA was 35.2 per 100 000 (95% confidence interval, 30.6-40.3) and 28.6 per 100 000 (95% confidence interval, 24.1-33.5) when standardized to the 2011 European population. The incidence peaked in subjects aged ≥85 years, in both sexes. At 2 years, outcome events occurred in 50 patients (23.8%) including 15 patients (7.1%) with nonfatal or fatal strokes.

CONCLUSIONS:

Our population-based study found a low annual TIA incidence rate and a fair TIA prognosis, confirming the effectiveness of preventive strategies for cardiovascular diseases. We also found that the tissue-based definition was not readily applicable in our district.

The incidence of osteoporotic fractures is lower in countries in the Mediterranean basin. Virgin olive oil, a key component of the Mediterranean Diet (MedDiet), with recognised beneficial effects on metabolism and cardiovascular health, may decrease the risk of osteoporotic fractures. The aim of this study was to explore the effect of chronic consumption of total olive oil and its varieties on the risk of osteoporosis-related fractures in a middle-aged and elderly Mediterranean population.

METHODS:

We included all participants (n = 870) recruited in the Reus (Spain) centre of the PREvención con DIeta MEDiterránea (PREDIMED) trial. Individuals aged 55-80 years at high cardiovascular risk were randomized to a MedDiet supplemented with extra-virgin olive oil, a MedDiet supplemented with nuts, or a low-fat diet. The present analysis was an observational cohort study nested in the trial. A validated food frequency questionnaire was used to assess dietary habits and olive oil consumption. Information on total osteoporotic fractures was obtained from a systematic review of medical records. The association between yearly repeated measurements of olive oil consumption and fracture risk was assessed with multivariable Cox proportional hazards models.

RESULTS:

We documented 114 incident cases of osteoporosis-related fractures during a median follow-up of 8.9 years. Treatment allocation had no effect on fracture risk. Participants in the highest tertile of extra-virgin olive oil consumption had a 51% lower risk of fractures (HR 0.49; 95% CI 0.29-0.81; P for trend = 0.004) compared with those in the lowest tertile after adjusting for potential confounders. Total and common olive oil consumption was not associated with fracture risk.

CONCLUSIONS:

Higher consumption of extra-virgin olive oil is associated with a lower risk of osteoporosis-related fractures in middle-aged and elderly Mediterranean population at high cardiovascular risk.

KEYWORDS:

Aging; Olive oil; Osteoporotic fractures; Prevention

Risk of cardiovascular disease morbidity and mortality in frail and pre-frail older adults: results from a meta-analysis and exploratory meta-regression analysis.

Frailty is common and associated with poorer outcomes in the elderly, but its role as a potential cardiovascular disease (CVD) risk factor requires clarification. We thus aimed to meta-analytically evaluate the evidence for frailty and pre-frailty as risk factors for CVD. Two reviewers selected all studies comparing data on CVD prevalence or incidence rates between frail/pre-frail and robust participants. The association between frailty status and CVD in cross-sectional studies was explored by calculating and pooling crude and adjusted odds ratios (ORs) with 95% confidence intervals (CIs); the data from longitudinal studies were pooled using the adjusted hazard ratios (HRs). Eighteen cohorts with a total of 31,343 participants were meta-analyzed. Using estimates from 10 cross-sectional cohorts, both frailty and pre-frailty were associated with higher odds of CVD relative to robust participants. Longitudinal data were obtained from 6 prospective cohort studies. After a median follow-up of 4.4 years, we identified an increased risk of incident CVD of any type in the frail (HR=1.70 [95% CI, 1.18-2.45]; I2=66%) and pre-frail (HR=1.23 [95% CI, 1.07-1.36]; I2=67%) vs. robust groups. Similar results were apparent for time to CVD mortality in the frail and pre-frail groups. In conclusion, frailty and pre-frailty constitute addressable and independent risk factors for CVD in older adults.
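
Pooled hazard ratios of the kind reported above typically come from a random-effects model; the standard DerSimonian-Laird approach can be sketched in a few lines. The per-study inputs below are illustrative placeholders, not the cohorts actually pooled in this meta-analysis:

```python
import math

# Hypothetical per-study hazard ratios with 95% CIs (hr, ci_low, ci_high).
# These are illustrative placeholders, not data from the abstract.
studies = [(1.3, 1.2, 1.4), (2.2, 2.0, 2.4), (1.6, 1.4, 1.8)]

# Work on the log scale; the standard error is recovered from the CI width.
log_hr = [math.log(hr) for hr, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for hr, lo, hi in studies]

# Fixed-effect (inverse-variance) weights and pooled estimate.
w = [1 / s ** 2 for s in se]
pooled_fe = sum(wi * yi for wi, yi in zip(w, log_hr)) / sum(w)

# DerSimonian-Laird between-study variance tau^2 from Cochran's Q.
k = len(studies)
q = sum(wi * (yi - pooled_fe) ** 2 for wi, yi in zip(w, log_hr))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (k - 1)) / c)

# Random-effects weights fold tau^2 into each study's variance.
w_re = [1 / (s ** 2 + tau2) for s in se]
pooled_re = sum(wi * yi for wi, yi in zip(w_re, log_hr)) / sum(w_re)
print(round(math.exp(pooled_re), 2))  # pooled random-effects HR
```

When between-study heterogeneity is high (as the I2 values of 66-67% above suggest), tau^2 grows and the random-effects weights become more equal across studies, widening the pooled confidence interval relative to a fixed-effect analysis.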

KEYWORDS:

cardiovascular disease; frailty; meta-analysis


Potential involvement of dietary advanced glycation end products in impairment of skeletal muscle growth and muscle contractile function in mice.

Egawa T, Tsuda S, Goto A, Ohno Y, Yokoyama S, Goto K, Hayashi T.

Br J Nutr. 2017 Jan;117(1):21-29. doi: 10.1017/S0007114516004591.

PMID: 28093090

Abstract

Diets enriched with advanced glycation end products (AGE) have recently been related to muscle dysfunction processes. However, it remains unclear whether long-term exposure to an AGE-enriched diet impacts physiological characteristics of skeletal muscles. Therefore, we explored the differences in skeletal muscle mass, contractile function and molecular responses between mice receiving a diet high in AGE (H-AGE) and low in AGE (L-AGE) for 16 weeks. There were no significant differences between L-AGE and H-AGE mice with regard to body weight, food intake or epididymal fat pad weight. However, extensor digitorum longus (EDL) and plantaris (PLA) muscle weights in H-AGE mice were lower compared with L-AGE mice. Higher levels of Nε-(carboxymethyl)-L-lysine, a marker for AGE, in EDL muscles of H-AGE mice were observed compared with L-AGE mice. H-AGE mice showed lower muscle strength and endurance in vivo and lower muscle force production of PLA muscle in vitro. mRNA expression levels of myogenic factors including myogenic factor 5 and myogenic differentiation in EDL muscle were lower in H-AGE mice compared with L-AGE mice. The phosphorylation status of 70-kDa ribosomal protein S6 kinase Thr389, an indicator of protein synthesis signalling, was lower in EDL muscle of H-AGE mice than that of L-AGE mice. These findings suggest that long-term exposure to an AGE-enriched diet impairs skeletal muscle growth and muscle contractile function, and that these muscle dysfunctions may be attributed to the inhibition of myogenic potential and protein synthesis.

This study aimed to examine the association between vitamin B6, folate and vitamin B12 biomarkers and plasma fatty acids in European adolescents. A subsample from the Healthy Lifestyle in Europe by Nutrition in Adolescence study with valid data on B-vitamin and fatty acid blood parameters, and on all the other covariates used in the analyses (BMI, Diet Quality Index, education of the mother and questionnaire-assessed physical activity), was selected, resulting in 674 cases (43 % males). B-vitamin biomarkers were measured by chromatography and immunoassay, and fatty acids by enzymatic analyses. Linear mixed models elucidated the association between B-vitamin and fatty acid blood parameters (changes in fatty acid profiles per 10-unit change in vitamin B biomarkers). DHA, EPA and n-3 fatty acids showed positive associations with B-vitamin biomarkers, mainly those corresponding to folate and vitamin B12. Conversely, negative associations were found with the n-6:n-3 ratio, trans-fatty acids and the oleic:stearic ratio. With total homocysteine (tHcy), all the associations found with these parameters were opposite (for instance, an increase of 10 nmol/l in red blood cell folate or holotranscobalamin in females produces an increase of 15·85 µmol/l of EPA (P value <0·01), whereas an increase of 10 nmol/l of tHcy in males produces a decrease of 2·06 µmol/l of DHA (P value <0·05)). Positive associations between B-vitamins and specific fatty acids might suggest underlying mechanisms linking B-vitamins and CVD, and merit the attention of public health policies.

Dairy consumption is associated with a lower incidence of the metabolic syndrome in middle-aged and older Korean adults: the Korean Genome and Epidemiology Study (KoGES).

Kim D, Kim J.

Br J Nutr. 2017 Jan;117(1):148-160. doi: 10.1017/S000711451600444X.

PMID: 28098053

Abstract

This cohort study examined the association between total and individual dairy products and the risk of developing the metabolic syndrome (MetS) and its components in Korean adults from the Korean Genome and Epidemiology Study. We prospectively analysed 5510 participants aged 40-69 years without the MetS at baseline during a 10-year follow-up period. Dairy consumption was assessed with a semi-quantitative FFQ at baseline and after 4 years. The MetS was defined according to the criteria by the National Cholesterol Education Program Adult Treatment Panel III. The Cox's proportional hazard model was used to examine the association between consumption of total dairy products, milk and yogurt in servings per week and the risk of incident MetS or individual components. A total of 2103 subjects developed the MetS (38·2 %) during an average follow-up of 67·4 months (range 17-104 months). Frequent dairy consumption (>7 servings of total dairy and milk/week, ≥4 servings of yogurt/week) was associated with a reduced risk of incident MetS and its components. In the multivariable adjusted model, hazard ratios for the MetS were 0·51 (95 % CI 0·43, 0·61) for total dairy products, 0·50 (95 % CI 0·38, 0·66) for milk and 0·67 (95 % CI 0·57, 0·78) for yogurt in frequent consumers compared with non-consumers. An inverse association between milk/yogurt and low HDL-cholesterol was shown only in women. In conclusion, high consumption of individual dairy products including milk and yogurt as well as total dairy were associated with a reduced risk of incident MetS and individual components in Korean adults.

To clarify the correlation between chronic sleep restriction (CSR) and sporadic Alzheimer disease (AD), we determined in wild-type mice the impact of CSR on cognitive performance, beta-amyloid (Aβ) peptides, and their feed-forward regulators relevant to AD pathogenesis.

METHODS:

Sixteen nine-month-old C57BL/6 male mice were equally divided into the CSR and control groups. CSR was achieved by application of a slowly rotating drum for 2 months. The Morris water maze test was used to assess cognitive impairment. The concentrations of Aβ peptides, amyloid precursor protein (APP) and β-secretase 1 (BACE1), and the mRNA levels of BACE1 and BACE1-antisense (BACE1-AS) were measured.

RESULTS:

Following CSR, impairments of spatial learning and memory consolidation were observed in the mice, accompanied by Aβ plaque deposition and an increased Aβ concentration in the prefrontal and temporal lobe cortex. CSR also upregulated the β-secretase-induced cleavage of APP by increasing the protein and mRNA levels of BACE1, particularly the BACE1-AS.

CONCLUSIONS:

This study shows that CSR accelerates AD pathogenesis in wild-type mice. An upregulation of the BACE1 pathway appears to participate in both cortical Aβ plaque deposition and memory impairment caused by CSR. BACE1-AS is likely activated to initiate a cascade of events that lead to AD pathogenesis. Our study provides, therefore, a molecular mechanism that links CSR to sporadic AD.

To evaluate the effect of age and chosen factors related to aging such as dentition, muscle strength, and nutrition on masticatory muscles electromyographic activity during chewing in healthy elderly women.

BACKGROUND:

With longer lifespans, there is a need to maintain optimal quality of life and health in older age. Skeletal muscle strength deteriorates in older age; this deterioration is also observed within the masticatory muscles.

METHODS:

A total of 30 women, aged 68-92 years, were included in the study: 10 individuals had natural functional dentition, 10 were missing posterior teeth in the upper and lower jaw reconstructed with removable partial dentures, and 10 were edentulous, using complete removable dentures. Surface electromyography was performed to evaluate masticatory muscle activity. Afterwards, measurement of masseter thickness with ultrasound imaging was performed, body mass index and body cell mass index were calculated, and isometric handgrip strength was measured.

RESULTS:

Isometric maximal voluntary contraction decreased in active masseters with increasing age, and in active and passive temporalis muscles with increasing age and increasing body mass index. In the active masseter, mean electromyographic activity during the sequence (the time from the start of chewing until the test food became ready to swallow) decreased with increasing age, and during the cycle (single bite time) it decreased with increasing age and increasing body mass index. In active and passive temporalis muscles, mean electromyographic activity during the sequence and the cycle decreased with increasing age, increasing body mass index, and loss of natural dentition. Individuals with natural dentition had significantly higher mean muscle activity during the sequence and the cycle in active temporalis muscles, and higher maximal activity during the cycle in active and passive temporalis muscles, than complete denture wearers.

CONCLUSION:

Decrease in electromyographic activity of masticatory muscles in elderly women is related to age, deterioration of dental status, and body mass index.

Age-related macular degeneration (AMD) is the leading cause of severe, irreversible vision loss in older adults. Evidence for an association between AMD and mortality remains inconclusive despite evidence for an association with cardiovascular and inflammatory diseases. We aim to compare all-cause, cardiovascular and cancer mortality between those with early or late AMD and control study participants.

METHODS:

A protocol was registered at PROSPERO (CRD42015020622). A systematic search of Medline (Ovid), PubMed, and Embase (Ovid) was conducted on 6 June 2015. Reference lists from identified studies and four clinical trial registries were searched for additional studies. Participants were required to be over the age of 40 years, and AMD status must have been objectively assessed. The Risk Of Bias In Non-Randomized Studies - of Interventions (ROBINS-I) tool was used to assess the risk of bias. Random-effects meta-analyses were performed.

RESULTS:

A total of 12 reports from 10 studies were included in the meta-analysis. Late AMD was associated with elevated rates of all-cause (nine studies, hazard ratio (HR) 1.20, 95% confidence interval, CI, 1.02-1.41) and cardiovascular mortality (six studies, HR 1.46, 95% CI 1.13-1.98), but early AMD was not (all-cause mortality, 10 studies, HR 1.06, 95% CI 0.98-1.14; cardiovascular mortality, five studies, HR 1.12, 95% CI 0.96-1.31). There was no evidence of an association between early or late AMD and cancer mortality (early AMD, three studies, HR 1.17, 95% CI 0.78-1.75; late AMD, three studies, HR 1.01, 95% CI 0.77-1.33).

CONCLUSION:

Late AMD is associated with increased rates of all-cause and cardiovascular mortality, suggesting shared pathways between late AMD and systemic disease.

Associations between long-term exposure to ambient fine particulate matter (PM2.5) and all-cause and cardiovascular mortality are well documented; however, less is known regarding possible interactions with cigarette smoking. We previously reported a supra-additive synergistic relationship between PM2.5 and cigarette smoking for lung cancer mortality. Here we examine interactions for all-cause and cardiovascular mortality among 429,406 current or never smoking participants in the prospective American Cancer Society Cancer Prevention Study-II with modeled PM2.5 concentrations. Cox proportional and additive hazards models were used to estimate mortality associations and interactions on the multiplicative and additive scales. A total of 146,495 all-cause and 64,339 cardiovascular (plus diabetes) deaths were observed. The hazard ratio (HR) (95% confidence interval (CI)) for cardiovascular mortality for high vs. low PM2.5 exposure (>14.44 µg/m3 vs ≤10.59 µg/m3; 75th vs 25th percentile) was 1.09 (95% CI 1.05, 1.12) in never smokers. The HR for cigarette smoking was 1.89 (95% CI 1.82, 1.96) in those with low PM2.5. The HR for both high PM2.5 and cigarette smoking was 2.08 (95% CI 2.00, 2.17). A small significant excess relative risk due to interaction (0.10; 95% CI 0.02, 0.19) was observed. Quantification of the public health burden attributed to the interaction between PM2.5 and cigarette smoking indicated a total of 32 (95% CI -6, 71) additional cardiovascular deaths per 100,000 person-years due to this interaction. In conclusion, PM2.5 was associated with all-cause and cardiovascular mortality in both smokers and never smokers, with some evidence for a small additive interaction with cigarette smoking. Reductions in cigarette smoking will result in the greatest impact on reducing all-cause and cardiovascular death at the levels of PM2.5 observed in this study.
However, reductions in PM2.5 will also contribute to preventing a proportion of mortality attributed to cigarette smoking.
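
The reported excess relative risk due to interaction (RERI) can be reproduced from the three hazard ratios in the abstract; on the additive scale, RERI = HR(both exposures) - HR(exposure A alone) - HR(exposure B alone) + 1. A quick check:

```python
# Hazard ratios for cardiovascular mortality, as reported in the abstract.
hr_both = 2.08    # high PM2.5 and current smoking
hr_pm25 = 1.09    # high PM2.5 alone (never smokers)
hr_smoke = 1.89   # current smoking alone (low PM2.5)

# Additive-scale interaction: RERI = HR_11 - HR_10 - HR_01 + 1.
reri = hr_both - hr_pm25 - hr_smoke + 1
print(round(reri, 2))  # 0.1, matching the reported 0.10
```

A RERI above zero indicates that the joint effect exceeds the sum of the two individual effects, which is the "small additive interaction" the authors describe.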

Dietary fatty acid composition likely affects prediabetic conditions such as isolated impaired fasting glucose (IFG) or impaired glucose tolerance (IGT); however, this risk has not been evaluated in a large population, nor has it been followed prospectively.

DESIGN:

Diet, physical activity, anthropometric, socio-economic and blood glucose data from the Atherosclerosis Risk in Communities (ARIC) study were obtained from BioLINCC. Cox proportional hazards regression models were used to evaluate associations of dietary SFA, MUFA, PUFA, n-3 fatty acid (FA) and n-6 FA intakes with incidence of one (isolated IFG) or two (IFG with IGT) prediabetic conditions at the end of 12-year follow-up.

SETTING:

Study volunteers were from counties in North Carolina, Mississippi, Minnesota and Maryland, USA.

SUBJECTS:

Data from 5288 volunteers who participated in the ARIC study were used for all analyses reported herein.

RESULTS:

The study population was 62 % male and 84 % white, with mean age 53·5 (sd 5·7) years and mean BMI 26·2 (sd 4·6) kg/m2. A moderately high intake of dietary MUFA (10-15 % of total daily energy) was associated with a 10 % reduced risk of isolated IFG incidence, while a high intake of n-3 FA (>0·15 % of total daily energy) was associated with a 10 % increase in risk. Curiously, moderately high intake of n-6 PUFA (4-5 % of total daily energy) was associated with a 12 % reduction in IFG and IGT incidence.

CONCLUSIONS:

MUFA, n-3 and n-6 FA contribute differently to the development of isolated IFG v. IFG with IGT; and their mechanism may be more complex than originally proposed.

KEYWORDS:

Impaired fasting glucose; Impaired glucose tolerance; MUFA; n-6

Comprehensive Review of the Impact of Dairy Foods and Dairy Fat on Cardiometabolic Risk.

Because regular-fat dairy products are a major source of cholesterol-raising saturated fatty acids (SFAs), current US and Canadian dietary guidelines for cardiovascular health recommend the consumption of low-fat dairy products. Yet, numerous randomized controlled trials (RCTs) have reported rather mixed effects of reduced- and regular-fat dairy consumption on blood lipid concentrations and on many other cardiometabolic disease risk factors, such as blood pressure and inflammation markers. Thus, the focus on low-fat dairy in current dietary guidelines is being challenged, creating confusion within health professional circles and the public. This narrative review provides perspective on the research pertaining to the impact of dairy consumption and dairy fat on traditional and emerging cardiometabolic disease risk factors. This comprehensive assessment of evidence from RCTs suggests that there is no apparent risk of potential harmful effects of dairy consumption, irrespective of the content of dairy fat, on a large array of cardiometabolic variables, including lipid-related risk factors, blood pressure, inflammation, insulin resistance, and vascular function. This suggests that the purported detrimental effects of SFAs on cardiometabolic health may in fact be nullified when they are consumed as part of complex food matrices such as those in cheese and other dairy foods. Thus, the focus on low-fat dairy products in current guidelines apparently is not entirely supported by the existing literature and may need to be revisited on the basis of this evidence. Future studies addressing key research gaps in this area will be extremely informative to better appreciate the impact of dairy food matrices, as well as dairy fat specifically, on cardiometabolic health.

This review was supported in part by an unrestricted grant from the Dairy Research Consortium (Dairy Farmers of Canada, Centre national interprofessionnel de l’économie laitière, Dairy Research Institute, Dairy Australia Ltd., Dutch Dairy Association, and Danish Dairy Research Foundation)

Fruit and vegetable intake and the risk of overall cancer in Japanese: A pooled analysis of population-based cohort studies.

A series of recent reports from large-scale cohort studies involving more than 100,000 subjects reported no or only very small inverse associations between fruit and vegetable intake and overall cancer incidence, despite having sufficient statistical power to detect them. To date, however, no such data have been reported for Asian populations.

OBJECTIVE:

To provide some indication of the net impact of fruit and vegetable consumption on overall cancer prevention, we examined these associations in a pooled analysis of large-scale cohort studies in Japanese populations.

METHODS:

We analyzed original data from four cohort studies that measured fruit and vegetable consumption using validated questionnaires at baseline. Hazard ratios (HRs) in the individual studies were calculated, with adjustment for a common set of variables, and combined using a random-effects model.

RESULTS:

During 2,318,927 person-years of follow-up for a total of 191,519 subjects, 17,681 cases of overall cancers were identified. Consumption of fruit or vegetables was not associated with decreased risk of overall cancers: corresponding HRs for the highest versus lowest quartiles of intake for men and women were 1.03 (95% CI, 0.97-1.10; trend p = 1.00) and 1.03 (95% CI, 0.95-1.11; trend p = 0.97), respectively, for fruit and 1.07 (95% CI, 1.01-1.14; trend p = 0.18) and 0.98 (95% CI, 0.91-1.06; trend p = 0.99), respectively, for vegetables, even in analyses stratified by smoking status and alcohol drinking.

CONCLUSIONS:

The results of this pooled analysis do not support inverse associations of fruit and vegetable consumption with overall cancers in the Japanese population.

KEYWORDS:

Cancer risk; Fruit and vegetable intake; Japanese; Pooled analysis

Dietary antioxidant vitamins intake and mortality: A report from two cohort studies of Chinese adults in Shanghai.

Few studies have evaluated dietary antioxidant vitamin intake in relation to the risk of mortality in Asia.

METHODS:

We examined the associations between total carotene, vitamin C, and vitamin E from diet and risk of mortality from all causes, cancer, and cardiovascular disease in 134,358 participants (59,739 men and 74,619 women) from the Shanghai Men's Health Study and Shanghai Women's Health Study, two prospective cohort studies of middle-aged and elderly Chinese adults in urban Shanghai. Participants were followed up for a median period of 8.3 and 14.2 years for men and women, respectively. Hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards regression models.

RESULTS:

During the 495,332 and 1,029,198 person-years of follow-up for men and women, respectively, there were 10,079 deaths (4170 men and 5909 women). For men, compared with the lowest quintiles, the multivariable-adjusted risk reductions in the highest categories were 17% (HR 0.83; 95% CI, 0.76-0.92) for dietary total carotene and 17% (HR 0.83; 95% CI, 0.75-0.91) for dietary vitamin C. Associations were weaker in women than in men, though they were still statistically significant (highest versus lowest quintiles of dietary total carotene, HR 0.87; 95% CI, 0.80-0.95; dietary vitamin C: HR 0.83; 95% CI, 0.77-0.91). Significant inverse associations were observed between dietary total carotene, vitamin C, and risk of cardiovascular disease mortality but not cancer mortality.

CONCLUSION:

This study suggests that total carotene and vitamin C intake from diet were inversely associated with deaths from all causes and cardiovascular disease in middle-aged or elderly people in China.

KEYWORDS:

Antioxidants; Cohort studies; Mortality; Vitamins

Systematic Review of the Association between Dairy Product Consumption and Risk of Cardiovascular-Related Clinical Outcomes.

The objective of this systematic review was to determine if dairy product consumption is detrimental, neutral, or beneficial to cardiovascular health and if the recommendation to consume reduced-fat as opposed to regular-fat dairy is evidence-based. A systematic review of meta-analyses of prospective population studies associating dairy consumption with cardiovascular disease (CVD), coronary artery disease (CAD), stroke, hypertension, metabolic syndrome (MetS), and type 2 diabetes (T2D) was conducted on the basis of the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement. Quality of evidence was rated by using the Grading of Recommendations Assessment, Development, and Evaluation scale. High-quality evidence supports favorable associations between total dairy intake and hypertension risk and between low-fat dairy and yogurt intake and the risk of T2D. Moderate-quality evidence suggests favorable associations between intakes of total dairy, low-fat dairy, cheese, and fermented dairy and the risk of stroke; intakes of low-fat dairy and milk and the risk of hypertension; total dairy and milk consumption and the risk of MetS; and total dairy and cheese and the risk of T2D. High- to moderate-quality evidence supports neutral associations between the consumption of total dairy, cheese, and yogurt and CVD risk; the consumption of any form of dairy, except for fermented, and CAD risk; the consumption of regular- and high-fat dairy, milk, and yogurt and stroke risk; the consumption of regular- and high-fat dairy, cheese, yogurt, and fermented dairy and hypertension risk; and the consumption of regular- and high-fat dairy, milk, and fermented dairy and T2D risk. Data from this systematic review indicate that the consumption of various forms of dairy products shows either favorable or neutral associations with cardiovascular-related clinical outcomes. 
The review also emphasizes that further research is urgently needed to compare the impact of low-fat with regular- and high-fat dairy on cardiovascular-related clinical outcomes in light of current recommendations to consume low-fat dairy.

Author disclosures: D Brassard, M Tessier-Grenier, JA Côté, and M-È Labonté, no conflicts of interest. B Lamarche is Chair of Nutrition at Laval University. This Chair is supported by unrestricted endowments from the Royal Bank of Canada, Pfizer, and Provigo/Loblaws. He has received funding in the last 5 y for his research from the Canadian Institutes of Health Research (CIHR), Natural Sciences and Engineering Research Council of Canada, Agriculture and Agrifood Canada, the Canola Council of Canada, Dairy Farmers of Canada (DFC), the Dairy Research Institute (DRI), Atrium Innovations, the Danone Institute, and Merck Frosst. He has received speaker honoraria over the last 5 y from DFC and the DRI. He is Chair of the Expert Scientific Advisory Panel of DFC and of the ad hoc committee on saturated fat of the Heart and Stroke Foundation of Canada. P Couture has received funding in the last 5 y from the CIHR, Agriculture and Agrifood Canada, DFC, DRI, Merck Frosst, and Kaneka Corporation. S Desroches has received funding in the last 5 y from the CIHR and the Danone Institute. J-P Drouin-Chartier has received speaker honoraria over the last year from DFC.

Question Is engaging in a mentally stimulating activity in old age associated with neurocognitive function?

Findings In this population-based cohort study, 1929 cognitively normal participants 70 years or older were followed for approximately 4 years. The following activities were associated with a significantly decreased risk of new-onset mild cognitive impairment: computer use, craft activities, social activities, and playing games.

Meaning Engaging in a mentally stimulating activity even in late life may decrease the risk of mild cognitive impairment.

Abstract

Importance Cross-sectional associations between engagement in mentally stimulating activities and decreased odds of having mild cognitive impairment (MCI) or Alzheimer disease have been reported. However, little is known about the longitudinal outcome of incident MCI as predicted by late-life (aged ≥70 years) mentally stimulating activities.

Objectives To test the hypothesis of an association between mentally stimulating activities in late life and the risk of incident MCI and to evaluate the influence of the apolipoprotein E (APOE) ε4 genotype.

Design, Setting, and Participants This investigation was a prospective, population-based cohort study of participants in the Mayo Clinic Study of Aging in Olmsted County, Minnesota. Participants 70 years or older who were cognitively normal at baseline were followed up to the outcome of incident MCI. The study dates were April 2006 to June 2016.

Main Outcomes and Measures At baseline, participants provided information about mentally stimulating activities within 1 year before enrollment into the study. Neurocognitive assessment was conducted at baseline, with evaluations at 15-month intervals. Cognitive diagnosis was made by an expert consensus panel based on published criteria. Hazard ratios (HRs) and 95% CIs were calculated using Cox proportional hazards regression models after adjusting for sex, age, and educational level.

Results The final cohort consisted of 1929 cognitively normal persons (median age at baseline, 77 years [interquartile range, 74-82 years]; 50.4% [n = 973] female) who were followed up to the outcome of incident MCI. During a median follow-up period of 4.0 years, playing games (HR, 0.78; 95% CI, 0.65-0.95), engaging in craft activities (HR, 0.72; 95% CI, 0.57-0.90), computer use (HR, 0.70; 95% CI, 0.57-0.85), and social activities (HR, 0.77; 95% CI, 0.63-0.94) were associated with a decreased risk of incident MCI. In a stratified analysis by APOE ε4 carrier status, the data point toward the lowest risk of incident MCI for APOE ɛ4 noncarriers who engage in mentally stimulating activities (eg, computer use: HR, 0.73; 95% CI, 0.58-0.92) and toward the highest risk of incident MCI for APOE ɛ4 carriers who do not engage in mentally stimulating activities (eg, no computer use: HR, 1.74; 95% CI, 1.33-2.27).

Conclusions and Relevance Cognitively normal elderly individuals who engage in specific mentally stimulating activities even in late life have a decreased risk of incident MCI. The associations may vary by APOE ε4 carrier status.

Folate is a vital component of a healthy diet, being essential for numerous bodily functions. Folate deficiency is common, with studies suggesting a prevalence of deficiency as high as 85.5% among women aged 16-49 years living in the UK. Causes of folate deficiency range from diet and lifestyle to pathological and pharmacological processes. Because of the well-known role of folate in the prevention of neural tube defects, numerous countries have implemented strategies to increase folate intake, such as mandatory grain fortification programs. As a result, the intake of folate in these countries is often higher than the recommended dietary allowance for many groups of people. Although folate is believed to be non-toxic, the potential adverse effects of excessive intake of folic acid (the synthetic form of folate) have not been well communicated by authorities to people taking supplements; despite this, many studies have addressed this issue. However, the results of these studies are discrepant, leading to confusion as to whether mandatory folic acid fortification should be introduced in other countries. The purpose of this review was to summarize the evidence related to high folic acid ingestion and the unwanted effects it may have on certain groups within the general population.

Impact of weight gain on the evolution and regression of prediabetes: a quantitative analysis.

The quantitative impact of weight gain on prediabetic glucose dysregulation remains unknown; only one study quantitated the impact of weight loss. We quantified the impact of weight gain on the evolution and regression of prediabetes (PDM).

SUBJECTS/METHODS:

In 4234 subjects without diabetes, using logistic regression analysis with a 4.8-year follow-up period, we analyzed the relationship between (1) δBMI (BMI at follow-up minus BMI at baseline) and the progression from normal glucose regulation (NGR) to PDM or diabetes, and (2) δBMI and the regression from PDM to NGR.

RESULTS:

Mean (±s.d.) δBMI was 0.17 (±1.3) kg/m2 in subjects with NGR and δBMI was positively and independently related to progression (adjusted odds ratio (ORadj) (95% CI), 1.24 (1.15-1.34), P<0.01). Mean (±s.d.) δBMI was -0.03 (±1.25) kg/m2 in those with PDM and δBMI was negatively related to the regression (ORadj, 0.72 (0.65-0.80), P<0.01). The relation of δBMI to the progression was significant in men (ORadj, 1.42 (1.28-1.59), P<0.01) but not in women (ORadj, 1.05 (0.94-1.19), P=0.36). Also, the negative impact of δBMI on the regression was significant only in men (men, ORadj, 0.65 (0.57-0.75), P<0.01; women, ORadj, 0.94 (0.77-1.14), P=0.51).

CONCLUSIONS:

In Japanese adults, an increase in BMI by even 1 kg/m2 was associated with a 24% increase in the risk of developing PDM or diabetes in NGR subjects and with a 28% reduction in the regression from PDM to NGR. In women, we did not note any significant impact of weight gain on the evolution or regression of PDM.
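The percentage figures in the conclusion follow directly from the adjusted odds ratios reported in the results; a minimal sketch of that arithmetic (illustrative only, not the study's code; note the abstract treats odds ratios as approximating relative risk changes):

```python
def odds_ratio_to_percent_change(odds_ratio):
    """Percent change in the odds of the outcome per unit increase
    in the exposure (here, per +1 kg/m2 change in BMI)."""
    return (odds_ratio - 1.0) * 100.0

# Values reported in the abstract:
progression_or = 1.24  # NGR -> PDM/diabetes, per +1 kg/m2 BMI
regression_or = 0.72   # PDM -> NGR, per +1 kg/m2 BMI

print(odds_ratio_to_percent_change(progression_or))  # ~ +24 (% higher odds of progression)
print(odds_ratio_to_percent_change(regression_or))   # ~ -28 (% lower odds of regression)
```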

Colonic fermentation of dietary fiber to short-chain fatty acids (SCFA) may protect against obesity and diabetes, but excess production of colonic SCFA has been implicated in the promotion of obesity. We aimed to compare the effects of two fermentable fibers on postprandial SCFA and second-meal glycemic response in healthy overweight or obese (OWO) vs lean (LN) participants.

SUBJECTS/METHODS:

Using a randomized crossover design, 13 OWO and 12 LN overnight fasted participants were studied for 6 h on three separate days after consuming 300 ml water containing 75 g glucose (GLU) as control or with 24 g inulin (IN) or 28 g resistant starch (RS). A standard lunch was served 4 h after the test drink.

RESULTS:

Within the entire group, compared with control, IN significantly increased serum SCFA (P<0.001) but had no effect on free-fatty acids (FFA) or second-meal glucose and insulin responses. In contrast, RS had no significant effect on SCFA but reduced FFA rebound (P<0.001) and second-meal glucose (P=0.002) and insulin responses (P=0.024). OWO had similar postprandial serum SCFA and glucose concentrations but significantly greater insulin and FFA than LN. However, the effects of IN and RS on SCFA, glucose, insulin and FFA responses were similar in LN and OWO.

CONCLUSIONS:

RS has favorable second-meal effects, likely related to changes in FFA rather than SCFA concentrations. However, a longer study may be needed to demonstrate an effect of RS on SCFA. We found no evidence that acute increases in SCFA after IN reduce glycemic responses in humans, and we were unable to detect a significant difference in SCFA responses between OWO vs LN subjects.

Fatty acid intake and its dietary sources in relation with markers of type 2 diabetes risk: The NEO study.

The aim of this study was to examine the relations between intakes of total, saturated, mono-unsaturated, poly-unsaturated and trans fatty acids (SFA, MUFA, PUFA and TFA), and their dietary sources (dairy, meat and plant) with markers of type 2 diabetes risk.

SUBJECTS/METHODS:

This was a cross-sectional analysis of baseline data of 5675 non-diabetic, middle-aged participants of the Netherlands Epidemiology of Obesity (NEO) study. Associations between habitual dietary intake and fasting and postprandial blood glucose and insulin, Homeostatic Model Assessment of Insulin Resistance (HOMA-IR), HOMA of β-cell function (HOMA-B) and Disposition Index were assessed through multivariable linear regression models with adjustments for demographic, lifestyle and dietary factors.

Our study suggests that the relations between fatty acid intakes and markers of type 2 diabetes risk may depend on the dietary sources of the fatty acids. More epidemiological studies on diet and cardiometabolic disease are needed, addressing possible interactions between nutrients and their dietary sources.

Background/Objectives: Vitamin D insufficiency in cystic fibrosis is common. Vitamin D3 is currently preferred over D2. We aimed to study the efficacy of vitamin D2 and D3 at increasing serum 25-hydroxyvitamin D (s25OHD) concentrations and their effect on respiratory health in cystic fibrosis.

Subjects/Methods: Sixteen CF patients were randomized to receive vitamin D2 or D3 or to serve as controls. The starting dose of 5000 IU/day (<16 years old) or 7143 IU/day (≥16 years old) was further adjusted individually. Three months of intervention were followed by two months of washout (ClinicalTrials.gov NCT01321905).

Results: To increase s25OHD, the mean daily dose of vitamin D2 and D3 had to be increased up to 15,650 and 8,184 IU, respectively. Plasma IL-8 decreased in the combined group of vitamin D2- and D3-treated patients (P<0.05). Patients given vitamin D3 improved FVC at the end of the trial (P<0.05). Change in s25OHD was positively correlated with changes in the adult Quality-of-Life respiratory score at the end of supplementation (P=0.006, r=0.90), and with changes in FEV1 (P=0.042, r=0.62) and FVC (P=0.036, r=0.63) at one month of washout.

The aim of this study was to determine the effects of n-3 ingestion on periodontal disease. In addition, we investigated the relationship between plasma concentrations of eicosapentaenoic acid (EPA), docosahexaenoic acid (DHA) and/or arachidonic acid (AA) and periodontal disease. An electronic search was performed in several databases with the following keywords: "n-3," DHA, EPA and polyunsaturated fatty acids (PUFA) in combination with the term "periodontal disease" (PD). Only studies conducted with humans, involving clinical parameters of PD assessment and use of n-3, were selected, without restriction on date of publication. The search returned 1368 articles, 11 of which were selected. The results were separated according to the type of n-3 ingestion: supplementation or n-3 content in the normal diet. In the studies where n-3 was supplemented, there was no significant difference in the clinical severity of PD compared with control subjects. However, in patients whose n-3 levels were evaluated under a usual diet, lower disease severity was reported. We detected a preventive effect of plasma levels of EPA and DHA against PD progression. Thus, n-3 ingestion may beneficially interfere with PD progression, depending on the duration and dosage of consumption.

KEYWORDS:

dietary fats; fatty acids; periodontitis

Lack of associations between modifiable risk factors and dementia in the very old: findings from the Cambridge City over-75s cohort study.

To investigate the association between modifiable risk and protective factors and severe cognitive impairment and dementia in the very old. Additionally, the present study tests the predictive validity of the 'LIfestyle for BRAin health' (LIBRA) score, an index developed to assess an individual's dementia prevention potential.

METHOD:

Two hundred seventy-eight individuals aged 85 years or older from the Cambridge City over-75s cohort study were followed up until death. Included risk and protective factors were: diabetes, heart disease, hypertension, depression, smoking, low-to-moderate alcohol use, high cognitive activity, and physical inactivity. Incident severe cognitive impairment was based on the Mini-Mental State Examination (score: 0-17) and incident dementia was based on either post-mortem consensus clinical diagnostic assessments or death certificate data. Logistic regressions were used to test whether individual risk and protective factors and the LIBRA score were associated with severe cognitive impairment or dementia after 18 years of follow-up.

RESULTS:

None of the risk and protective factors or the LIBRA score was significantly associated with increased risk of severe cognitive impairment or dementia. Sensitivity analyses using a larger sample, longer follow-up period, and stricter cut-offs for prevalent cognitive impairment showed similar results.

CONCLUSION:

Associations between well-known midlife risk and protective factors and risk for severe cognitive impairment or dementia might not persist into very old age, in line with suggestions that targeting these factors through lifestyle interventions should start earlier in life.

KEYWORDS:

Dementia; cohort study; epidemiology; prevention; risk factors

Associations of objectively measured moderate-to-vigorous-intensity physical activity and sedentary time with all-cause mortality in a population of adults at high risk of type 2 diabetes mellitus.

The relationships of physical activity and sedentary time with all-cause mortality in those at high risk of type 2 diabetes mellitus (T2DM) are unexplored. To address this gap in knowledge, we examined the associations of objectively measured moderate-to-vigorous-intensity physical activity (MVPA) and sedentary time with all-cause mortality in a population of adults at high risk of T2DM. In 2010-2011, 712 adults (Leicestershire, U.K.), identified as being at high risk of T2DM, consented to be followed up for mortality. MVPA and sedentary time were assessed by accelerometer; those with valid data (≥ 10 hours of wear-time/day with ≥ 4 days of data) were included. Cox proportional hazards regression models, adjusted for potential confounders, were used to investigate the independent associations of MVPA and sedentary time with all-cause mortality. 683 participants (250 females (36.6%)) were included, and during a mean follow-up period of 5.7 years, 26 deaths were registered. Every 10% increase in MVPA time/day was associated with a 5% lower risk of all-cause mortality [Hazard Ratio (HR): 0.95 (95% Confidence Interval (95% CI): 0.91, 0.98); p = 0.004]; for the average adult in this cohort, undertaking approximately 27.5 minutes of MVPA/day, a 10% increase corresponds to only 2.75 additional minutes of MVPA/day. Conversely, sedentary time showed no association with all-cause mortality [HR (every 10-minute increase in sedentary time/day): 0.99 (95% CI: 0.95, 1.03); p = 0.589]. These data support the importance of MVPA in adults at high risk of T2DM. The association between sedentary time and mortality in this population needs further investigation.
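The 2.75-minute example in the abstract above is simple proportional arithmetic on the reported hazard ratio; a minimal sketch (illustrative, not the study's analysis code; extrapolating beyond one 10% step assumes the log hazard is linear in percent MVPA, which is an assumption here):

```python
# Reported values from the abstract:
mean_mvpa_min = 27.5   # average MVPA min/day in the cohort
hr_per_10pct = 0.95    # hazard ratio per 10% increase in MVPA/day

extra_minutes_for_10pct = mean_mvpa_min * 0.10  # 10% of 27.5 = 2.75 min/day
risk_reduction_pct = (1 - hr_per_10pct) * 100   # 5% lower hazard per 10% step

# Multiplicative scaling under a log-linear assumption:
# a 20% increase corresponds to two 10% steps.
hr_for_20pct = hr_per_10pct ** 2                # ~0.9025

print(extra_minutes_for_10pct, risk_reduction_pct, hr_for_20pct)
```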

Copper and zinc are essential micronutrients and cofactors of many enzymatic reactions that may be involved in liver-cancer development. We aimed to assess pre-diagnostic circulating levels of copper, zinc and their ratio (Cu/Zn) in relation to hepatocellular carcinoma (HCC), intrahepatic bile duct (IHBD) and gall bladder and biliary tract (GBTC) cancers.

METHODS:

A nested case-control study was conducted within the European Prospective Investigation into Cancer and Nutrition cohort. Serum zinc and copper levels were measured in baseline blood samples by total reflection X-ray fluorescence in cancer cases (HCC n=106, IHBD n=34, GBTC n=96) and their matched controls (1:1). The Cu/Zn ratio, an indicator of the balance between the two micronutrients, was computed. Multivariable-adjusted odds ratios and 95% confidence intervals (OR; 95% CI) were used to estimate cancer risk.

A recent clinical trial found a protective role of niacinamide, a derivative of niacin, against skin cancer recurrence. However, no epidemiologic study has assessed the association between niacin intake and risk of skin cancer (basal cell carcinoma [BCC], squamous cell carcinoma [SCC], and melanoma). We prospectively evaluated whether total, dietary and supplemental niacin intake was associated with skin cancer risk among 72,308 women in the Nurses' Health Study (1984-2010) and 41,808 men in the Health Professionals Follow-up Study (1986-2010). Niacin intake was assessed every 2 to 4 years during follow-up and cumulatively averaged. Cox proportional hazards models were used to compute hazard ratios (HR) and 95% confidence intervals (CI), and cohort-specific results were pooled using a random-effects model. During the follow-up, we documented 23,256 BCC, 2,530 SCC and 887 melanoma cases. Total niacin intake was inversely associated with SCC risk; the pooled HR for top vs. bottom quintiles was 0.84 (95% CI = 0.74-0.95; Ptrend = 0.08). On the other hand, there was a marginally positive association between total niacin intake and BCC risk; the pooled HR for top vs. bottom quintiles was 1.05 (95% CI = 1.01-1.10; Ptrend < 0.01). Higher total niacin intake was also marginally positively associated with melanoma risk in men, but not in women. The results were similar in analyses stratified by sun exposure-related factors and by body location of melanoma and SCC. Our study supports a potential beneficial role of niacin intake in relation to SCC but not BCC or melanoma.
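The abstract above pools cohort-specific hazard ratios with a random-effects model; one common choice is DerSimonian-Laird inverse-variance weighting, sketched below. The per-cohort log-HRs and standard errors are hypothetical, for illustration only, and are not the study's numbers:

```python
import math

def pool_random_effects(log_hrs, ses):
    """DerSimonian-Laird random-effects pooling of cohort-specific
    log hazard ratios (log_hrs) with standard errors (ses).
    Returns the pooled hazard ratio."""
    w = [1.0 / se**2 for se in ses]                       # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_hrs)) / sw
    # Cochran's Q and the between-cohort variance tau^2:
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_hrs))
    df = len(log_hrs) - 1
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights incorporate tau^2:
    w_star = [1.0 / (se**2 + tau2) for se in ses]
    pooled_log = sum(wi * y for wi, y in zip(w_star, log_hrs)) / sum(w_star)
    return math.exp(pooled_log)

# Hypothetical estimates for two cohorts (NOT the study's values):
pooled_hr = pool_random_effects([math.log(0.82), math.log(0.87)], [0.05, 0.07])
print(pooled_hr)  # lies between the two cohort HRs
```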

Impact of betablockers on general and local outcome in patients hospitalized for lower extremity peripheral artery disease: The COPART Registry.

Lower extremity peripheral artery disease (PAD) is one manifestation of atherosclerosis. Patients with PAD have an increased rate of mortality due to concurrent coronary artery disease and hypertension. Betablockers (BB) may, therefore, be prescribed, especially in case of heart failure. However, BB safety in PAD is controversial, because of presumed peripheral hemodynamic consequences of BB that could lead to worsening of symptoms in patients with PAD. In this context, we aimed to determine the impact of BB on all-cause and cardiovascular mortality and amputation rate at 1 year after hospitalization for PAD in the COPART Registry population. This is a prospective multicenter observational study collecting data from consecutive patients hospitalized for PAD in vascular medicine departments of 4 academic hospitals in France. Patients with either claudication, critical limb ischemia or acute lower limb ischemia related to a documented PAD were included. We compared the outcomes of patients with BB versus those without BB in their prescription list at hospital discharge. The mean age of the study population was 70.9 years, and it was predominantly male (71%). Among the 1267 patients at admission, 28% were treated by BB for hypertension, prior myocardial infarction or heart failure. During their hospital stay, 40% underwent revascularization (including bypass surgery in 29% and angioplasty in 74%), 17% required an amputation, and 5% died. In a multivariate analysis, only prior myocardial infarction was found to be associated with BB prescription, with an odds ratio (OR) of 3.11, P < 0.001. Conversely, chronic obstructive pulmonary disease and PAD with ulcer impeded BB prescription (OR: 0.57 and 0.64; P = 0.007 and P = 0.001, respectively). One-year overall mortality of patients with BB did not differ from that of those without (23% vs. 23%, P = 0.95). The 1-year amputation rate did not differ either (4% vs. 6%, P = 0.14).
Patients hospitalized for PAD with a BB in their prescription did not have worse outcomes at 1 year than patients without BB. Based on these safety data, a prospective study could be conducted to assess the effect of BB on long-term mortality and amputation rate in patients with mild, moderate, and severe PAD.

The effects of folic acid and pyridoxine supplementation on characteristics of migraine attacks in migraine patients with aura: A double-blinded randomized placebo-controlled clinical trial

This study was performed to assess the effects of folic acid alone and in combination with pyridoxine on characteristics of migraine attacks in adult migraine patients with aura.

Methods

This double-blinded randomized placebo-controlled clinical trial was conducted on 95 migraine patients with aura, aged 18-65 years, in Isfahan, Islamic Republic of Iran, during 2014. Patients were randomly allocated to receive folic acid (5 mg/d) plus pyridoxine (80 mg/d), folic acid alone (5 mg/d), or placebo (lactose) for 3 months. Characteristics of migraine attacks, including headache severity, attack frequency, duration and headache diary result (HDR), were obtained for each patient at baseline and at the end of the study.

Conclusions

Folic acid in combination with pyridoxine supplementation could decrease the characteristics of migraine attacks, including headache severity, attack frequency and HDR; however, further studies are needed to shed light on our findings.

Keywords:

Pyridoxine, Folic acid, Migraine, Headache

Effect of Magnesium Supplementation on Insulin Resistance in Humans: a Systematic Review

•We evaluated evidence for the effectiveness of magnesium supplementation in the control of insulin resistance;

•This systematic review provides evidence of the benefits of magnesium supplementation on insulin resistance in subjects with hypomagnesemia;

•Magnesium has a potentially significant role for improving insulin sensitivity. However, larger-scale studies over a longer duration of treatment are needed to confirm this conclusion.

Objectives

Recent studies have demonstrated a role of minerals in glucose metabolism disorders in humans. Magnesium, in particular, is an extensively studied mineral that has been shown to play a role in the management of hyperglycemia, hyperinsulinemia, and insulin resistance. The aim of this study was to investigate the effect of magnesium supplementation on insulin resistance in humans via a systematic review of the available clinical trials.

Methods

This review was conducted in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations. A search was conducted to select clinical trials on the effects of this mineral on insulin sensitivity using the following databases: PubMed, SciVerse Scopus, ScienceDirect, and SciVerse Cochrane.

Results

After the selection process, 12 articles were identified as eligible for this review, covering different clinical conditions without restriction on sex, age, or ethnicity, and with differing doses and forms of magnesium. The results of eight clinical trials showed that magnesium supplementation influenced serum fasting glucose concentrations, and five trials found an effect on fasting insulin levels. Seven studies showed that the mineral supplementation reduced homeostasis model assessment for insulin resistance values.

Conclusions

The data of this systematic review provide evidence on the benefits of magnesium supplementation in reducing insulin resistance in subjects with hypomagnesemia presenting insulin resistance. However, new intervention studies are needed to elucidate the role of the nutrient in protection against this metabolic disorder, and to standardize the type, dose, and duration of magnesium supplementation.

25-hydroxyvitamin D [25(OH)D] deficiency is associated with increased cardiovascular disease risk, perhaps mediated through dyslipidemia. Deficient 25(OH)D is cross-sectionally associated with dyslipidemia, but little is known about longitudinal lipid changes. Our objective was to determine the association of 25(OH)D deficiency with longitudinal lipid changes and risk of incident dyslipidemia.

Research Methods

This is a longitudinal community-based study of 13,039 ARIC participants who had 25(OH)D and lipids measured at baseline (1990-1992) and lipids re-measured in 1993-1994 and 1996-1998. Mixed-effect models were used to assess associations of 25(OH)D with lipid trends after adjusting for clinical characteristics and for baseline or incident use of lipid-lowering therapy. Risk of incident dyslipidemia was determined for those without baseline dyslipidemia.

Deficient 25(OH)D was prospectively associated with lower TC and HDL-C and greater TC/HDL-C ratio after considering factors such as diabetes and adiposity. Further work including randomized controlled trials is needed to better assess how 25(OH)D may impact lipids and cardiovascular risk.

A new report offers a rigorous review of scientific research published since 1999 and based on more than 10,000 scientific abstracts

New Report: Health Effects of Marijuana and Cannabis-Derived Products

Recent statistics estimate that more than 22 million Americans have used cannabis in the last 30 days. Currently, twenty-six states and the District of Columbia have laws legalizing marijuana in some form. Of concern is the lack of conclusive evidence on the effects of cannabis and marijuana use.

In an effort to provide accurate information in the ongoing debate about the use of cannabis in the treatment of medical conditions and recreational use, the National Academies of Sciences, Engineering, and Medicine has conducted a study of the current research performed since 1999. Ten thousand abstracts were reviewed and the committee offered 100 conclusions. Here are the results of their findings.

Therapeutic Effects

Some of the research reviewed indicated the following therapeutic effects:

A significant reduction of the symptoms of chronic pain in adults.

Relief of muscle spasms in adults with multiple sclerosis

A reduction of nausea and vomiting in adults undergoing chemotherapy treatment.

For those with schizophrenia and other psychoses, better performance on learning and memory tasks.

Risk Factors

Additional research reviewed indicated the following risks:

An increased risk of being involved in a vehicular accident following the use of cannabis

Increased risk of unintentional cannabis overdose injuries among children in states where it is legalized.

Increased respiratory problems including coughing, phlegm production and chronic bronchitis

Increased learning, memory and attention difficulties, especially in children and young adults

An increased rate of unemployment and low income in users

A positive correlation between the use of cannabis and problems related to it

An increase in the likelihood of developing other substance abuse problems

Inconclusive Evidence

More research is needed in several areas in the study of the effects of cannabis, including:

Cancer, stroke, heart attack and diabetes

Serious respiratory diseases such as asthma, and chronic pulmonary disease

The effects of cannabis on the human immune system

The effects on fetuses and infants

As a final note in its report the National Academies of Sciences, Engineering, and Medicine committee commented on the challenges faced in studying the beneficial and harmful effects of the use of cannabis and marijuana. Adequate funding is an issue, as well as the fact that cannabis is classified as a Schedule 1 substance, which makes it difficult to obtain consistent, reliable amounts to use in their research. More research is still needed to provide conclusive information about the effects of marijuana and cannabis use.


Copies of The Health Effects of Cannabis and Cannabinoids: The Current State of Evidence and Recommendations for Research are available from the National Academies Press at http://www.nap.edu or by calling 1-800-624-6242. Reporters may obtain a copy from the Office of News and Public Information.

Keyword: Medical Marijuana, Cannabidiol (CBD)

Patterns of sitting and mortality in the Nord-Trøndelag health study (HUNT).

Current evidence concerning sedentary behaviour and mortality risk has used single time point assessments of sitting. Little is known about how changes in sitting levels over time affect subsequent mortality risk.

Aim

To examine the associations between patterns of sitting time assessed at two time points 11 years apart and risk of all-cause and cardio-metabolic disease mortality.

Methods

Participants were 25,651 adults aged ≥20 years from the Nord-Trøndelag Health Study with self-reported total sitting time in 1995-1997 (HUNT2) and 2006-2008 (HUNT3). Four categories characterised patterns of sitting: (1) low at HUNT2/low at HUNT3, ‘consistently low sitting’; (2) low at HUNT2/high at HUNT3, ‘increased sitting’; (3) high at HUNT2/low at HUNT3, ‘reduced sitting’; and (4) high at HUNT2/high at HUNT3, ‘consistently high sitting’. Associations of sitting pattern with all-cause and cardio-metabolic disease mortality were analysed using Cox regression adjusted for confounders.
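The four sitting-pattern categories described above amount to a lookup on the dichotomised HUNT2/HUNT3 levels; a minimal sketch, assuming each participant's sitting time has already been classified as "low" or "high" at each survey (the study's cut-points are not reproduced here):

```python
# Pattern labels as defined in the methods above, keyed by
# (HUNT2 level, HUNT3 level):
PATTERNS = {
    ("low", "low"):   "consistently low sitting",
    ("low", "high"):  "increased sitting",
    ("high", "low"):  "reduced sitting",
    ("high", "high"): "consistently high sitting",
}

def sitting_pattern(hunt2_level, hunt3_level):
    """Map a participant's dichotomised sitting levels at the two
    surveys to one of the four exposure categories."""
    return PATTERNS[(hunt2_level, hunt3_level)]

print(sitting_pattern("high", "low"))  # "reduced sitting"
```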

Examining patterns of sitting over time augments single time-point analyses of risk exposures associated with high sitting time. Whilst sitting habits can be stable over a long period, life events (e.g., changing jobs, retiring or illness) may influence sitting trajectories and therefore sitting-attributable risk. Reducing sitting may yield mortality risks comparable to a stable low-sitting pattern.

We sought to assess the relation between plasma pyridoxal 5' phosphate (PLP; the active form of vitamin B-6) and serum DHA, EPA, AA, linoleic acid, eicosadienoic, and α-linolenic acid concentrations during pregnancy.

METHODS:

A prospective cohort study in 186 healthy pregnant Brazilian women (aged 20-40 y) who were not using supplements was conducted in Rio de Janeiro, Brazil. Participants were enrolled in the first trimester of pregnancy (5-13 gestational weeks) and were followed up twice between 20-26 and 30-36 wk of gestation. Longitudinal linear mixed-effects regression models were used to evaluate the associations between 1) first-trimester PLP and PUFA concentrations across pregnancy and 2) ΔPLP (i.e., difference between third- and first-trimester plasma PLP concentrations) and PUFA concentrations across pregnancy. Models were adjusted for gestational week, first-trimester body mass index, smoking habit, and dietary intakes of vitamin B-6, fish, total fat, and PUFAs.

Maternal vitamin B-6 status during pregnancy was positively associated with the circulating concentration of DHA and inversely associated with n-6:n-3 FAs in Brazilian women who were not taking vitamin supplements. Further study is required to determine the impact of poor vitamin B-6 status on fetal neurodevelopment.

We used a precise feeding protocol in adult rats to determine whether optimizing the postmeal muscle protein synthesis (MPS) response, by modifying the meal distribution of protein and the amino acid leucine (Leu), would affect muscle mass.

METHODS:

Two studies were conducted with the use of male Sprague-Dawley rats (∼300 g) trained to consume 3 meals/d, then assigned to diet treatments with identical macronutrient contents (16% of energy from protein, 54% from carbohydrates, and 30% from fat) but differing in protein quality or meal distribution. Study 1 provided 16% protein at each meal with the use of whey, egg white, soy, or wheat gluten, with Leu concentrations of 10.9%, 8.8%, 7.7%, and 6.8% (wt:wt), respectively. Study 2 used whey protein with 16% protein at each meal [balanced distribution (BD)] or meals with 8%, 8%, and 27% protein [unbalanced distribution (UD)]. MPS and translation factors 4E binding protein 1 (4E-BP1) and ribosomal protein p70S6 (S6K) were determined before and after breakfast meals at 2 and 11 wk. Muscle weights and body composition were measured at 11 wk.

RESULTS:

In study 1, the breakfast meal increased MPS and S6K in the whey and egg treatments but not in the wheat or soy treatments. Gastrocnemius weight was greater in the whey group (2.20 ± 0.03 g) than in the soy group (1.95 ± 0.04 g) (P < 0.05) and was intermediate in the egg and wheat groups. The wheat group had >20% more body fat than the soy, egg, or whey groups (P < 0.05). In study 2, postmeal MPS and translation factors were 30-45% greater in the BD group than in the UD group (P < 0.05), resulting in 6% and 11% greater gastrocnemius and soleus weights, respectively, at 11 wk (P < 0.05).

CONCLUSION:

These studies show that meal distribution of protein and Leu influences MPS and long-term changes in adult muscle mass.

Dairy food intake has been associated with infertility; however, little is known with regard to associations with reproductive hormones or anovulation.

OBJECTIVE:

We investigated whether intakes of dairy foods and specific nutrients were associated with reproductive hormone concentrations across the cycle and the risk of sporadic anovulation among healthy women.

Each serving increase in total and low- and high-fat dairy foods and all increases in amounts of all dairy nutrients tested were associated with an ∼5% reduction in serum estradiol concentrations but were not associated with anovulation. Total and high-fat dairy food intakes were positively associated with serum luteinizing hormone concentrations. We observed associations between intakes of >0 servings of yogurt (RR: 2.1; 95% CI: 1.2, 3.7) and cream (RR: 1.8; 95% CI: 1.0, 3.2) and a higher risk of sporadic anovulation compared with no intake.

CONCLUSIONS:

Our study showed associations between increasing dairy food and nutrient intakes and decreasing estradiol concentrations as well as between cream and yogurt intakes and the risk of sporadic anovulation. These results highlight the potential role of dairy in reproductive function in healthy women.

The objective of this review was to elucidate the relationship between VaD and various nutritional factors based on epidemiological studies.

BACKGROUND:

Vascular dementia (VaD) is the second most common type of dementia. The prevalence of VaD continues to increase as the US population continues to grow and age. Currently, control of potential risk factors is believed to be the most effective means of preventing VaD. Thus, identification of modifiable risk factors for VaD is crucial for development of effective treatment modalities. Nutrition is one of the main modifiable variables that may influence the development of VaD.

METHODS:

A systematic review of literature was conducted using the PubMed, Web of Science, and CINAHL Plus databases with search parameters inclusive of vascular dementia, nutrition, and vascular cognitive impairment (VCI).

RESULTS:

Fourteen articles were found that proposed a potential role of specific nutritional components in VaD. These components included antioxidants, lipids, homocysteine, folate, vitamin B12, and fish consumption. Antioxidants, specifically Vitamin E and C, and fatty fish intake were found to be protective against VaD risk. Fried fish, elevated homocysteine, and lower levels of folate and vitamin B12 were associated with increased VaD. Evidence for dietary lipids was inconsistent, although elevated midlife serum cholesterol may increase risk, while late-life elevated serum cholesterol may be associated with decreased risk of VaD.

CONCLUSION:

Currently, the most convincing evidence as to the relationship between VaD and nutrition exists for micronutrients, particularly Vitamin E and C. Exploration of nutrition at the macronutrient level and additional long term prospective cohort studies are warranted to better understand the role of nutrition in VaD disease development and progression. At present, challenges in this research include limitations in sample size, which was commonly cited. Also, a variety of diagnostic criteria for VaD were employed in the studies reviewed, indicating the need for constructing a correct nosological definition of VaD for consistency and conformity in future studies and accurate clinical diagnosis of VaD.

High total cholesterol levels in late life associated with a reduced risk of dementia.

To examine the longitudinal association between plasma total cholesterol and triglyceride levels and incident dementia.

METHODS:

Neuropsychiatric, anthropometric, laboratory, and other assessments were conducted for 392 participants of a 1901 to 1902 birth cohort first examined at age 70. Follow-up examinations were at ages 75, 79, 81, 83, 85, and 88. Information on those lost to follow-up was collected from case records, hospital linkage system, and death certificates. Cox proportional hazards regression examined lipid levels at ages 70, 75, and 79 and incident dementia between ages 70 and 88.

High cholesterol in late life was associated with decreased dementia risk, which is in contrast to previous studies suggesting high cholesterol in mid-life is a risk factor for later dementia. The conflicting results may be explained by the timing of the cholesterol measurements in relationship to age and the clinical onset of dementia.

The perimenopausal and postmenopausal periods are times of pronounced physiological change in body mass index (BMI), physical activity and energy intake. Understanding these changes in middle age could contribute to formation of potential public health targets.

METHOD:

A longitudinal cohort of 5119 perimenopausal women was recruited from the Aberdeen Prospective Osteoporosis Screening Study (APOSS) between 1990 and 1994, with follow-up visits in 1997-1999 and 2009-2011. At each visit, participants were weighed, measured and completed socioeconomic and demographic questionnaires. Participants at the first visit were asked to recall their body weights at 20, 30 and 40 years of age. We assessed trends in BMI, physical activity and energy intake across and within visits.

RESULTS:

Over 2 decades, obesity prevalence doubled from 14% to 28% of participants, with 69% of participants categorised as overweight or obese. More than 70% of participants gained >5% of their baseline BMI, with weight gain occurring across all weight categories. Energy intake and physical activity levels (PALs) did not change during the 2 decades after menopause (p trend=0.06 and 0.11, respectively), but, within the second visit, energy intake increased concomitantly with a decrease in physical activity across increasing quartiles of BMI (p trend <0.001 for all).

CONCLUSIONS:

Overweight and obesity increased by over 50% over the course of 20 years. Weight gain occurred across the adult life course regardless of starting weight. The marked increase in dietary intake and decrease in PALs in middle age suggest a potential critical period for intervention to curb excess weight gain.

CoolSculpting is fast and non-invasive, and improved technology has led to an explosion in demand.

Freezing Fat Fad

Liposuction might be giving way to freezing fat. You read that right: freezing fat. The beauty of freezing fat is that it is non-invasive. This fat reduction method does not involve needles, cutting of the skin, anaesthetics or other invasive procedures. In fact, one can read a book, surf the web on a smartphone or even play a video game while the fat is being frozen.


Review of the Mechanisms and Effects of Noninvasive Body Contouring Devices on Cellulite and Subcutaneous Fat.

Today, different kinds of non-invasive body contouring modalities, including cryolipolysis, radiofrequency (RF), low-level laser therapy (LLLT), and high-intensity focused ultrasound (HIFU), are available for reducing the volume of subcutaneous adipose tissue or cellulite. Each procedure has distinct mechanisms for stimulating apoptosis or necrosis of adipose tissue. In addition to the techniques mentioned, some investigations are underway analyzing the efficacy of other techniques such as whole body vibration (WBV) and extracorporeal shockwave therapy (ESWT). In the present review, the mechanisms, effects and side effects of these methods are discussed, and their effect on cellulite or subcutaneous fat reduction is assessed.

EVIDENCE ACQUISITION:

We searched PubMed, Google Scholar and the Cochrane databases for systematic reviews, review articles, meta-analyses and randomized clinical trials up to February 2015. The keywords were subcutaneous fat, cellulite, obesity, noninvasive body contouring, cryolipolysis, RF, LLLT, HIFU, ESWT and WBV, with full names and abbreviations.

RESULTS:

We included seven reviews and 66 original articles in the present narrative review. Most were conducted in normal-weight or overweight participants (body mass index < 30 kg/m2) of both sexes across a broad range of ages (18 to 50 years on average). Among the original articles, the numbers of included methods were: 10 HIFU, 13 RF, 22 cryolipolysis, 11 LLLT, 5 ESWT and 4 WBV therapies. Six of the articles evaluated combination therapies and seven compared the effects of different devices.

CONCLUSIONS:

Some of the noninvasive body contouring devices in animal and human studies, such as cryolipolysis, RF, LLLT and HIFU, showed statistically significant effects on body contouring, removing unwanted fat and cellulite in some body areas. However, the clinical effects are mild to moderate, for example a 2-4 cm circumference reduction as a sign of subcutaneous fat reduction over the total treatment sessions. Overall, there is no definitive noninvasive treatment method for cellulite. Additionally, due to the methodological differences in the existing evidence, comparing the techniques is difficult.

In October 2016, the Advisory Committee on Immunization Practices (ACIP) voted to approve the Recommended Adult Immunization Schedule for Adults Aged 19 Years or Older, United States, 2017. The 2017 adult immunization schedule summarizes ACIP recommendations in 2 figures, footnotes for the figures, and a table of contraindications and precautions for vaccines recommended for adults (Figure). These documents can also be found at www.cdc.gov/vaccines/schedules. The full ACIP recommendations for each vaccine can be found at www.cdc.gov/vaccines/hcp/acip-recs/index.html. The 2017 adult immunization schedule was also reviewed and approved by the American College of Physicians, the American Academy of Family Physicians, the American College of Obstetricians and Gynecologists, and the American College of Nurse-Midwives.

Slumped posture is a diagnostic feature of depression. While research shows upright posture improves self-esteem and mood in healthy samples, little research has investigated this in depressed samples. This study aimed to investigate whether changing posture could reduce negative affect and fatigue in people with mild to moderate depression undergoing a stressful task.

METHODS:

Sixty-one community participants who screened positive for mild to moderate depression were recruited into a study purportedly on the effects of physiotherapy tape on cognitive function. They were randomized to sit with usual posture or upright posture and physiotherapy tape was applied. Participants completed the Trier Social Stress Test speech task. Changes in affect and fatigue were assessed. The words spoken by the participants during their speeches were analysed.

RESULTS:

At baseline, all participants had significantly more slumped posture than normative data. The postural manipulation significantly improved posture, increased high-arousal positive affect, and decreased fatigue compared to usual posture. The upright group spoke significantly more words than the usual posture group and used fewer first-person singular personal pronouns, but more sadness words. Upright shoulder angle was associated with lower negative affect and lower anxiety across both groups.

LIMITATIONS:

The experiment was only brief and a non-clinical sample was used.

CONCLUSIONS:

This preliminary study suggests that adopting an upright posture may increase positive affect, reduce fatigue, and decrease self-focus in people with mild-to-moderate depression. Future research should investigate postural manipulations over a longer time period and in samples with clinically diagnosed depression.

While many studies have found the built environment to be associated with walking, most have used cross-sectional research designs and few have examined more distal cardiometabolic outcomes. This study contributes longitudinal evidence based on changes in walking, body mass index (BMI), and cardiometabolic risk following residential relocation.

METHODS:

We examined 1,079 participants in the CARDIA study who moved residential locations between 2000 and 2006 (ages 32-46 in 2000, 49% white/51% black, 55% female). We created a walkability index from measures of population density, street connectivity, and food and physical activity resources, measured at participants' pre- and post-move residential locations. Outcomes measured before and after the move included walking, BMI, waist circumference, blood pressure, insulin resistance, triglycerides, cholesterol, atherogenic dyslipidemia, and C-reactive protein. Fixed effects (FE) models were used to estimate associations between within-person change in walkability and within-person change in each outcome. These estimates were compared to those from random effects (RE) models to assess the implications of unmeasured confounding.
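The within-person ("fixed effects") estimation described above can be illustrated in a few lines of plain Python. The data and the resulting coefficient below are hypothetical, purely to show the mechanics: subtracting each person's own mean before fitting the slope means only within-person change identifies the estimate, removing time-invariant confounders such as residential self-selection.

```python
# Minimal sketch of the within (fixed effects) estimator on a toy
# panel: (person_id, walkability, outcome), two observations per
# person (pre- and post-move). Data are hypothetical.
from statistics import mean

data = [
    ("a", 1.0, 120.0), ("a", 2.0, 118.0),
    ("b", 3.0, 130.0), ("b", 5.0, 126.0),
]

# Group observations by person.
persons = {}
for pid, x, y in data:
    persons.setdefault(pid, []).append((x, y))

# Demean within each person: time-invariant traits cancel out.
xd, yd = [], []
for obs in persons.values():
    mx = mean(x for x, _ in obs)
    my = mean(y for _, y in obs)
    for x, y in obs:
        xd.append(x - mx)
        yd.append(y - my)

# OLS slope on demeaned data (no intercept needed after demeaning).
beta = sum(x * y for x, y in zip(xd, yd)) / sum(x * x for x in xd)
print(beta)  # -2.0 on this toy data
```

A random effects model, by contrast, also uses between-person variation, which is why the abstract's Hausman tests can flag it as biased when people who choose walkable places differ in unmeasured ways.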

RESULTS:

In FE models, a one-SD increase in walkability was associated with a 0.81 mmHg decrease in systolic blood pressure [95% CI: (-1.55, -0.07)] and a 7.36 percent increase in C-reactive protein [95% CI: (0.60, 14.57)]. Although several significant associations were observed in the RE models, Hausman tests suggested that these estimates were biased for most outcomes. RE estimates were most commonly biased away from the null or in the opposite direction of effect as the FE estimates.

CONCLUSIONS:

Greater walkability was associated with lower blood pressure and higher C-reactive protein in FE models, potentially reflecting competing health risks and benefits in dense, walkable environments. RE models tended to overstate or otherwise misrepresent the relationship between walkability and health. Approaches that base estimates on variation between individuals may be subject to bias from unmeasured confounding, such as residential self-selection.

Studies have found differences in practice patterns between male and female physicians, with female physicians more likely to adhere to clinical guidelines and evidence-based practice. However, whether patient outcomes differ between male and female physicians is largely unknown.

OBJECTIVE:

To determine whether mortality and readmission rates differ between patients treated by male or female physicians.

DESIGN, SETTING, AND PARTICIPANTS:

We analyzed a 20% random sample of Medicare fee-for-service beneficiaries 65 years or older hospitalized with a medical condition and treated by general internists from January 1, 2011, to December 31, 2014. We examined the association between physician sex and 30-day mortality and readmission rates, adjusted for patient and physician characteristics and hospital fixed effects (effectively comparing female and male physicians within the same hospital). As a sensitivity analysis, we examined only physicians focusing on hospital care (hospitalists), among whom patients are plausibly quasi-randomized to physicians based on the physician's specific work schedules. We also investigated whether differences in patient outcomes varied by specific condition or by underlying severity of illness.

Elderly hospitalized patients treated by female internists have lower mortality and readmissions compared with those cared for by male internists. These findings suggest that the differences in practice patterns between male and female physicians, as suggested in previous studies, may have important clinical implications for patient outcomes.

An oxidized/reduced state of plasma albumin reflects malnutrition due to an insufficient diet in rats.

We examined whether protein- and food-intake restrictions modulate the oxidized/reduced state of plasma albumin in Sprague-Dawley rats. Rats were fed a 3%, 5%, 10% or 20% casein diet for 2 weeks. The plasma albumin concentration significantly decreased with decreasing protein intake. However, no significant difference in plasma albumin concentration was seen between rats fed the 5% or 10% casein diet. In rats fed the 5% casein diet, the percentage of mercaptalbumin within total plasma albumin was significantly lower and that of nonmercaptalbumin-1 was significantly higher than in rats fed the 10% casein diet. In experiments with food-intake restriction for 2 weeks, rats were fed 50% or 75% of the amount of a 20% casein diet consumed by control rats. The percentage of mercaptalbumin was significantly lower and that of nonmercaptalbumin-2 was significantly higher in rats with food-intake restriction than in control rats. When rats with malnutrition were refed with the 20% casein diet ad libitum, the percentage of mercaptalbumin rapidly increased. The change in the percentage of mercaptalbumin was correlated with the plasma transthyretin concentration. These results indicate that the oxidized/reduced state of plasma albumin may be applied as a sensitive marker of nutritional status reflecting dietary pattern.

KEYWORDS:

insufficient diet; malnutrition; rats; reduced albumin; transthyretin

Association Between 900 Steps a Day and Functional Decline in Older Hospitalized Patients.

Statins showed mixed results in heart failure (HF) patients. The benefits in major HF outcomes, including all-cause mortality and sudden cardiac death (SCD), have always been discordant across systematic reviews and meta-analyses. We intended to systematically identify and appraise the available evidence that evaluated the effectiveness of statins in clinical outcomes for HF patients.

We identified 24 randomized clinical trials that evaluated the efficacy of statins for HF patients. All randomized clinical trials were assessed for risk of bias and pooled together in a meta-analysis. Pre-specified outcomes were sudden cardiac death, all-cause mortality, and hospitalization for worsening heart failure.

Statins do not reduce sudden cardiac death or all-cause mortality, but may slightly decrease hospitalization for worsening heart failure in HF patients. The evaluation of the risk of bias suggested moderate quality of the published results. Until new evidence is available, this study supports the 2013 ACCF/AHA guidelines to not systematically prescribe statins in "only" HF patients, which should help avoid unnecessary polypharmacy.

I am pleased to introduce this special section on centenarians and dementia. These articles were submitted as a group and are based on a symposium presented by the authors at the Annual Meeting of The Gerontological Society of America in November 1997. The three papers provide critical and diverse insights into this important topic. Because the oldest old are the fastest growing portion of the population in most industrialized and even some nonindustrialized nations, the importance of understanding the oldest old is clear. These articles focus on centenarians but vary considerably in both size and representativeness of this special population. Currently, it is very difficult to find large numbers of people who have survived to be centenarians whether the population base is a large metropolitan area such as Boston or even a larger but regional geographical area such as Georgia. Given these limitations, the data these three articles represent are especially unique. The authors use the data to examine the basis of dementia, a question that is likely to become increasingly important in the years to come as the number of elderly people in general, as well as the number of centenarians specifically, significantly increases. Some authors conclude that the probability of dementia increases significantly and linearly with age whereas others suggest a different pattern of occurrence indicating that dementia may develop in a linear fashion up until the achievement of late old age (i.e., 100 years). These authors suggest that survival to age 100 marks the beginning of a crossover effect, making the probability of dementia less likely and less predictable. Because the availability of children as support providers decreases as one achieves centenarian status and caregiving of people with dementia is very difficult, these findings have important clinical, policy, and service implications. 
Centenarians will have survived many of their children or will have children who are, themselves, 80 years old and likely to have their own significant health problems or functional limitations. Silver, Jilinskaia, and Perls (2001) report on 43 confirmed centenarians who are part of the New England Centenarian Study. The authors find a dementia prevalence of about two thirds, with one third of this population assessed as having no or very mild dementia. Hagberg, Alfredson, Poon, and Homma (2001) examined between 100 and 200 centenarians in each of three countries: Japan, Sweden, and the United States. They report a dementia prevalence rate between 40% and 63%. Hagberg and colleagues concur with the assumption of increased cognitive differentiation with age and report similar results across the three countries. Andersen-Ranberg, Vasegaard, and Jeune (2001), using a population-based survey of all persons living in Denmark during a specified 1-year period (approximately 276 people), found that 50% of their sample could be described as mildly to severely demented.

Clearly there is much gerontologists still do not know about the relationship between dementia and age. Thus far, researchers have been limited by studies with quite small numbers. However, the number of centenarians will increase significantly in the next decades. This fact is foreshadowed by the demographics already existing in China. Although not represented in this special section, I have recently become aware of an ongoing multiwave, longitudinal study of centenarians in China that includes over 2,000 centenarians as well as 3,000 80-year-olds and 3,000 90-year-olds. This study included only 50% of the counties in 22 provinces of Mainland China. Such figures make it incredibly clear that the number of centenarians alive in this century is increasing exponentially. We, as gerontologists, need to understand this unique group of people and be prepared to meet their needs. It is the goal of this special section to contribute to both the recognition and description of this unique population of older people.

"The prevalences of dementia reported in the Japanese and Swedish studies are quite different—63% in the Japanese study, with 43% men and 71% women, and 27–40% in the Swedish study, with 16% men and 30% women. The U.S. study excluded participants with dementia. In other centenarian studies, the reports of dementia prevalence also differ. In the Hungarian study, the prevalence of dementia is 43% for men and 63% for women (Beregi and Klinger 1989). The Finnish study reported 20% for men and almost 50% for women (Louhija 1994). There are at least two possible reasons for the discrepancy across studies: the definition of dementia and the sample representativeness employed. A primary reason for the different prevalence figures may be the use of different criteria in defining dementia in oldest old persons. Questionnaires with normative cutoff scores were used in the Japanese study. DSM-III-R criteria in terms of psychometric tests and clinical assessment were used in the Swedish study. In both cases, the paramount question is whether a compensation for aging decline was made in applying the normative cutoff. This was a practice in the Japanese study (Fig. 3)."


Cognitive functional status of age-confirmed centenarians in a population-based study.

The New England Centenarian Study is a population-based study of all centenarians in 8 towns near Boston, MA. Age was confirmed for 43 centenarians all alive on a designated date. To determine prevalence of dementia in centenarians, the authors analyzed neuropsychological, medical, and functional status data for 34 (79%) of the centenarians. Definition of dementia was based on the Consortium to Establish a Registry for Alzheimer's Disease criteria, and a Clinical Dementia Rating (CDR) score was formulated for each participant. Seven (21%) had no dementia (CDR score 0), and an additional 4 (12%) were assigned a CDR score of 0.5, uncertain or deferred diagnosis. The remaining 22 (64%) had at least some degree of dementia. The authors calculated Barthel Index scores to determine ability to perform activities of daily living. There was a statistically significant correlation between CDR scores and Barthel Index scores (r = -0.73). Correlation was strongest for those with no or severe dementia, with the greatest range of function measured among those with moderate dementia.


Dementia is not inevitable: a population-based study of Danish centenarians.

The authors evaluated the prevalence of dementia in centenarians. In this population-based survey, persons living in Denmark who turned 100 during the period April 1, 1995--May 31, 1996 (N = 276) were interviewed and examined at their residences. Additional health information was retrieved from medical files, including the National Discharge Registry. The participation rate was 75%, and no differences were found between participants and nonparticipants regarding sex and type of housing. The prevalence of mild to severe dementia in centenarians was 51%; 37% had no signs of dementia. Among the 105 demented centenarians, 13 (12%) had diseases (vitamin B12 and folic acid deficiencies, hypothyroidism, Parkinson's disease) that could contribute to a dementia diagnosis. Of the remaining 92 demented participants, 46 (50%) had one or more cerebro- or cardiovascular diseases known to be risk factors in the development of dementia. The prevalence of these risk factors was the same in demented and nondemented participants, whereas hypertension was significantly more frequent in nondemented than demented participants. Dementia is common but not inevitable in centenarians. Cerebro- and cardiovascular diseases are equally common in demented and nondemented persons.

Dietary soy and natto intake and cardiovascular disease mortality in Japanese adults: the Takayama study.

Whether soy intake is associated with a decreased risk of cardiovascular disease (CVD) remains unclear. A traditional Japanese soy food, natto, contains a potent fibrinolytic enzyme. However, its relation to CVD has not been studied.

OBJECTIVE:

We aimed to examine the association of CVD mortality with the intake of natto, soy protein, and soy isoflavones in a population-based cohort study in Japan.

DESIGN:

The study included 13,355 male and 15,724 female Takayama Study participants aged ≥35 y. At recruitment in 1992, each subject was administered a validated semiquantitative food-frequency questionnaire. Deaths from CVD were ascertained over 16 y.

RESULTS:

A total of 1678 deaths from CVD, including 677 from stroke and 308 from ischemic heart disease, occurred during follow-up. The highest quartile of natto intake compared with the lowest was significantly associated with a decreased risk of mortality from total CVD after control for covariates: the HR was 0.75 (95% CI: 0.64, 0.88; P-trend = 0.0004). There were no significant associations between the risk of mortality from total CVD and intakes of total soy protein, total soy isoflavone, and soy protein or soy isoflavone from soy foods other than natto. The highest quartiles of total soy protein and natto intakes were significantly associated with a decreased risk of mortality from total stroke (HR = 0.75, 95% CI: 0.57, 0.99, P-trend = 0.03 and HR = 0.68, 95% CI: 0.52, 0.88, P-trend = 0.0004, respectively). The highest quartile of natto intake was also significantly associated with a decreased risk of mortality from ischemic stroke (HR = 0.67, 95% CI: 0.47, 0.95, P-trend = 0.03).

CONCLUSION:

Data suggest that natto intake may contribute to the reduction of CVD mortality.

Our trial INTACT (Intensive Nutrition in Acute Lung Injury Trial) was designed to compare the impact of feeding from acute lung injury (ALI) diagnosis to hospital discharge, an interval that, to our knowledge, has not yet been explored. It was stopped early because participants who were randomly assigned to energy intakes at nationally recommended amounts via intensive medical nutrition therapy experienced significantly higher mortality hazards than did those assigned to standard nutrition support care that provided energy at 55% of recommended amounts.

OBJECTIVE:

We assessed the influence of dose and timing of feeding on hospital mortality.

DESIGN:

Participants (n = 78) were dichotomized as died or discharged alive. Associations between the energy and protein received overall, early (days 1-7), and late (days ≥8) and the hazards of hospital mortality were evaluated between groups with multivariable analysis methods.

Food fortification has been recommended to improve a population's micronutrient status. Biofortification techniques modestly elevate the zinc content of cereals, but few studies have reported a positive impact on functional indicators of zinc status.

OBJECTIVE:

We determined the impact of a modest increase in dietary zinc that was similar to that provided by biofortification programs on whole-body and cellular indicators of zinc status.

DESIGN:

Eighteen men participated in a 6-wk controlled consumption study of a low-zinc, rice-based diet. The diet contained 6 mg Zn/d for 2 wk and was followed by 10 mg Zn/d for 4 wk. To reduce zinc absorption, phytate was added to the diet during the initial period. Indicators of zinc homeostasis, including total absorbed zinc (TAZ), the exchangeable zinc pool (EZP), plasma and cellular zinc concentrations, zinc transporter gene expression, and other metabolic indicators (i.e., DNA damage, inflammation, and oxidative stress), were measured before and after each dietary-zinc period.

RESULTS:

TAZ increased with increased dietary zinc, but plasma zinc concentrations and EZP size were unchanged. Erythrocyte and leukocyte zinc concentrations and zinc transporter expressions were not altered. However, leukocyte DNA strand breaks decreased with increased dietary zinc, and the level of proteins involved in DNA repair and antioxidant and immune functions were restored after the dietary-zinc increase.

CONCLUSIONS:

A moderate 4-mg/d increase in dietary zinc, similar to that which would be expected from zinc-biofortified crops, improves zinc absorption but does not alter plasma zinc. The repair of DNA strand breaks improves, as do serum protein concentrations that are associated with the DNA repair process.

Epidemiological evidence from Western populations suggests that dairy food intake may reduce the risk of hypertension, probably through its calcium content. However, there are no epidemiological studies among Asian populations with generally lower dairy and calcium consumption.

OBJECTIVE:

The relation between dairy or calcium intake and risk of hypertension was evaluated in a Chinese population in Singapore.

METHODS:

The analysis included 37,124 Chinese men and women aged 45-74 y who participated in the Singapore Chinese Health Study in 1993-1998. The subjects included in the present study had no history of cancer, hypertension, or cardiovascular disease at baseline and completed ≥1 follow-up interview. Diet at baseline was assessed by using a validated 165-item semiquantitative food-frequency questionnaire. The occurrence of new, physician-diagnosed hypertension was ascertained through follow-up interviews during 1999-2004 and 2006-2010. Cox proportional hazards regression was used to compute HRs and 95% CIs with adjustment for potential confounders.

Process point of view (POV) models of mortality, such as the Strehler-Mildvan and stochastic vitality models, represent death in terms of the loss of survival capacity through challenges and dissipation. Drawing on hallmarks of aging, we link these concepts to candidate biological mechanisms through a framework that defines death as challenges to vitality, where distal factors define the age-evolution of vitality and proximal factors define the probability distribution of challenges. To illustrate the process POV, we hypothesize that the immune system is a mortality nexus, characterized by two vitality streams: increasing vitality representing immune system development and immunosenescence representing vitality dissipation. Proximal challenges define three mortality partitions: juvenile and adult extrinsic mortalities and intrinsic adult mortality. Model parameters, generated from Swedish mortality data (1751-2010), exhibit biologically meaningful correspondences to economic, health and cause-of-death patterns. The model characterizes the twentieth century epidemiological transition mainly as a reduction in extrinsic mortality resulting from a shift from high magnitude disease challenges on individuals at all vitality levels to low magnitude stress challenges on low vitality individuals. Of secondary importance, intrinsic mortality was described by a gradual reduction in the rate of loss of vitality, presumably resulting from a reduction in the rate of immunosenescence. Extensions and limitations of a distal/proximal framework for characterizing more explicit causes of death, e.g. the young adult mortality hump or cancer in old age, are discussed.
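The Strehler-Mildvan model named above can be stated compactly. This is the standard textbook form, not necessarily the exact parameterization the authors use:

```latex
% Vitality declines linearly with age:
V(t) = V_0 (1 - B t)

% Challenges arrive at rate K with exponentially distributed
% magnitudes (mean \varepsilon); death occurs when a challenge
% exceeds remaining vitality, giving the mortality rate
m(t) = K \exp\!\left(-\frac{V(t)}{\varepsilon}\right)
     = \underbrace{K\, e^{-V_0/\varepsilon}}_{m_0}\; e^{(V_0 B/\varepsilon)\, t}

% i.e. the Gompertz law m(t) = m_0 e^{bt} with b = V_0 B / \varepsilon.
```

In this reading, the "distal" factors shape \(V(t)\) and the "proximal" factors shape the challenge distribution (\(K\), \(\varepsilon\)), matching the framework the abstract describes.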

One bout of aerobic exercise and regular participation in aerobic exercise have been shown to lower office and ambulatory blood pressure in hypertensive individuals. Higher-intensity aerobic exercise, up to 70% of maximal oxygen consumption, does not produce a greater hypotensive effect compared with moderate-intensity aerobic exercise. Intermittent aerobic and anaerobic exercise, however, performed at an intensity >70% of maximal oxygen uptake has been shown to significantly reduce office and ambulatory blood pressure of hypertensive individuals. Thus, faster, more intense forms of exercise can also bring about blood pressure reduction in the hypertensive population. Compared with continuous moderate-intensity aerobic exercise, high-intensity intermittent exercise typically results in a greater aerobic fitness increase in less time and produces greater changes in arterial stiffness, endothelial function, insulin resistance and mitochondrial biogenesis. One of the characteristics of high-intensity intermittent training is that it typically involves markedly lower training volume compared with traditional aerobic and resistance exercise programmes, making it a time-efficient strategy to accrue adaptations and blood pressure benefits. This review briefly summarizes the results of studies that have examined the effects of single and repeated bouts of aerobic and resistance exercise on office and ambulatory blood pressure of hypertensive individuals. A more detailed summary of studies examining the effect of high-intensity intermittent exercise and training on hypertension is then provided.

Interactive effects of obesity and physical fitness on risk of ischemic heart disease.

Obesity and low physical fitness are known risk factors for ischemic heart disease (IHD), but their interactive effects are unclear. Elucidation of interactions between these common, modifiable risk factors may help inform more effective preventive strategies. We examined interactive effects of obesity, aerobic fitness and muscular strength in late adolescence on risk of IHD in adulthood in a large national cohort.

SUBJECTS/METHODS:

We conducted a national cohort study of all 1 547 407 military conscripts in Sweden during 1969-1997 (97-98% of all 18-year-old males each year). Aerobic fitness, muscular strength and body mass index (BMI) measurements were examined in relation to IHD identified from outpatient and inpatient diagnoses through 2012 (maximum age 62 years).

RESULTS:

There were 38 142 men diagnosed with IHD in 39.7 million person years of follow-up. High BMI or low aerobic fitness (but not muscular strength) was associated with higher risk of IHD, adjusting for family history and socioeconomic factors. The combination of high BMI (overweight/obese vs normal) and low aerobic fitness (lowest vs highest tertile) was associated with highest IHD risk (incidence rate ratio, 3.11; 95% confidence interval (CI), 2.91-3.31; P<0.001). These exposures had no additive and a negative multiplicative interaction (that is, their combined effect was less than the product of their separate effects). Low aerobic fitness was a strong risk factor even among those with normal BMI.

CONCLUSIONS:

In this large cohort study, low aerobic fitness or high BMI at age 18 was associated with higher risk of IHD in adulthood, with a negative multiplicative interaction. Low aerobic fitness appeared to account for a similar number of IHD cases among those with normal vs high BMI (that is, no additive interaction). These findings suggest that interventions to prevent IHD should begin early in life and include not only weight control but aerobic fitness, even among persons of normal weight.
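
The additive- and multiplicative-scale interaction logic described in this abstract can be sketched numerically. The rate ratios below are hypothetical, chosen only to illustrate the definitions; they are not the study's estimates.

```python
# Assess additive and multiplicative interaction between two binary
# exposures from rate ratios (RRs) relative to the doubly unexposed
# group. All numbers are hypothetical, for illustration only.

rr_obese = 1.5   # high BMI alone (hypothetical)
rr_unfit = 2.0   # low aerobic fitness alone (hypothetical)
rr_both = 2.5    # both exposures combined (hypothetical)

# Multiplicative scale: compare the joint RR to the product of the
# separate RRs. A ratio < 1 is a negative multiplicative interaction.
mult_interaction = rr_both / (rr_obese * rr_unfit)

# Additive scale: RERI (relative excess risk due to interaction).
# RERI = 0 means no additive interaction; RERI > 0 means positive.
reri = rr_both - rr_obese - rr_unfit + 1

print(f"multiplicative interaction ratio: {mult_interaction:.2f}")  # 0.83
print(f"RERI (additive scale): {reri:.2f}")  # 0.00
```

With these illustrative numbers the pattern matches the one reported above: no additive interaction (RERI = 0) alongside a negative multiplicative interaction (ratio < 1).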

Sirtuin (SIRT) is a key regulator of metabolism and lifespan, and its importance has been implicated in protection against aging-related diseases. The purpose of this study was to identify the pattern of serum SIRT1 activity according to age and sex, and to investigate how serum SIRT1 activity correlates with other metabolic parameters in Korean adults. The Biobank of Jeju National University Hospital, a member of the Korea Biobank Network, provided serum samples from 250 healthy adults. Aging- and metabolism-related factors were analyzed in serum, and the data were compared after stratification by age and sex. Basal metabolic rate (BMR) decreased with age and was significantly lower in men in their fifties and older, and in women in their forties and older, than in men and women in their twenties, respectively. SIRT1 activity varied by age and sex; in particular, women in their thirties showed the highest SIRT1 activity. Correlation analysis showed that SIRT1 activity was positively correlated with serum triglyceride (TG) in men, and with waist circumference, systolic blood pressure, diastolic blood pressure, and serum TG in women. SIRT1 activity was negatively correlated with the aspartate aminotransferase/alanine aminotransferase ratio in women (r = -0.183, p = 0.039). A positive correlation was observed between SIRT1 activity and BMR in women (r = 0.222, p = 0.027), but not in men. Taken together, these findings suggest that serum SIRT1 activity may be useful as a biomarker of aging. In addition, the positive correlation between SIRT1 activity and BMR in women suggests that serum SIRT1 activity may reflect energy expenditure in humans.

The predictive value of cumulative blood pressure (BP) for all-cause mortality and cardiovascular and cerebrovascular events (CCEs) has hardly been studied. In this prospective cohort study of 52,385 participants from the Kailuan Group who attended three medical examinations and were free of CCEs, the impact of cumulative systolic BP (cumSBP) and cumulative diastolic BP (cumDBP) on all-cause mortality and CCEs was investigated. The mean (standard deviation) age of the study population was 48.82 (11.77) years, and 40,141 (76.6%) participants were male. Follow-up for all-cause mortality and CCEs was 3.96 (0.48) and 2.98 (0.41) years, respectively. Multivariate Cox proportional hazards regression showed that for every 10 mm Hg·year increase in cumSBP and 5 mm Hg·year increase in cumDBP, the hazard ratios for all-cause mortality were 1.013 (1.006, 1.021) and 1.012 (1.006, 1.018); for CCEs, 1.018 (1.010, 1.027) and 1.017 (1.010, 1.024); for stroke, 1.021 (1.011, 1.031) and 1.018 (1.010, 1.026); and for myocardial infarction (MI), 1.013 (0.996, 1.030) and 1.015 (1.000, 1.029). In natural spline function analysis, cumSBP and cumDBP showed a J-shaped relationship with CCEs and a U-shaped relationship with stroke (ischemic and hemorrhagic). Therefore, increases in cumSBP and cumDBP were predictive of all-cause mortality, CCEs, and stroke.
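
Because a Cox model is log-linear in the exposure, a per-increment hazard ratio like the one reported above can be compounded to larger increments. A minimal sketch, using the abstract's HR of 1.013 per 10 mm Hg·year of cumulative SBP (the 50 mm Hg·year example is illustrative, not from the study):

```python
import math

def scale_hr(hr_per_unit: float, n_units: float) -> float:
    """HR for n_units increments, assuming Cox log-linearity:
    log-hazard is linear in the exposure, so HRs multiply."""
    return math.exp(n_units * math.log(hr_per_unit))

hr_10 = 1.013              # all-cause mortality, per 10 mm Hg·year (abstract)
hr_50 = scale_hr(hr_10, 5)  # implied HR for a 50 mm Hg·year increase
print(f"implied HR for +50 mm Hg·year cumSBP: {hr_50:.3f}")  # ~1.067
```

Note that this extrapolation assumes log-linearity holds over the whole range, which the spline analysis above suggests is only approximate (J- and U-shaped relations were observed).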

Unhealthy gut microbes can trigger a rise in blood pressure, which can lead to hypertension.

Probiotics for High Blood Pressure

Scientists have determined that microorganisms within the intestines partially determine blood pressure levels in rats, and that these microorganisms play a critical role in the onset of high blood pressure. The findings were recently published in Physiological Genomics. The study matters a great deal because rat biology is similar enough to human biology for findings in rat studies to be relevant to humans. Let's delve into the study details to explain how the findings came about and what they mean in the context of human health.

Study Details

Two groups of rats were studied in the above-referenced research. One group had high blood pressure and was characterized as the “hypertensive” group. The second group had normal blood pressure and was appropriately referred to as the “normal” group. Researchers collected microbe-containing material from each group's large intestines.

The animals were given antibiotics for 10 days to deplete their native microbiota. After this course of antibiotics, microbiota from the hypertensive rats were transplanted into the rats with normal blood pressure, and microbiota from the normal rats were transplanted into the hypertensive group.

The Results

The scientists found that the rats treated with the hypertensive microbiota developed high blood pressure. The surprising result was that rats given the normal microbiota did not show a meaningful decrease in blood pressure, although readings did drop by a small margin. This finding will likely spur further studies of the role of microbiota in the onset of hypertension in human beings.

The finding also lends credence to the notion that probiotics, the helpful microorganisms found within the gut, may have a role to play in the treatment of hypertension. Adding probiotics to one's diet may have a positive impact on blood pressure.

Gut dysbiosis has been linked to cardiovascular diseases, including hypertension. We tested the hypothesis that hypertension could be induced in a normotensive strain of rats, or attenuated in a hypertensive strain, by exchanging the gut microbiota between the two strains. Cecal contents from stroke-prone spontaneously hypertensive rats (SHRSP) were pooled; similarly, cecal contents from normotensive WKY rats were pooled. Four-week-old recipient WKY and SHR rats, previously treated with antibiotics to reduce the native microbiota, were gavaged with WKY or SHRSP microbiota, resulting in four groups: WKY with WKY microbiota (WKY g-WKY), WKY with SHRSP microbiota (WKY g-SHRSP), SHR with SHRSP microbiota (SHR g-SHRSP), and SHR with WKY microbiota (SHR g-WKY). Systolic blood pressure (SBP) was measured weekly using tail-cuff plethysmography. At 11.5 weeks of age, systolic blood pressure was 26 mm Hg higher in WKY g-SHRSP than in WKY g-WKY (182±8 versus 156±8 mm Hg, p=0.02). Although SBP in SHR g-WKY tended to decrease compared with SHR g-SHRSP, the difference was not statistically significant. Fecal pellets were collected at 11.5 weeks of age for identification of the microbiota by sequencing of the 16S ribosomal RNA gene. We observed a significant increase in the Firmicutes:Bacteroidetes ratio in the hypertensive WKY g-SHRSP compared with the normotensive WKY g-WKY (p=0.042). Relative abundance of multiple taxa correlated with SBP. We conclude that gut dysbiosis can directly affect SBP, and that manipulation of the gut microbiota may represent an innovative treatment for hypertension.
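
The Firmicutes:Bacteroidetes ratio reported above is a simple summary statistic computed from phylum-level 16S relative abundances. A minimal sketch, with hypothetical abundances (not the study's data):

```python
# Compute the Firmicutes:Bacteroidetes (F:B) ratio from phylum-level
# relative abundances. The abundance values are hypothetical.

def fb_ratio(abundances: dict) -> float:
    """F:B ratio from relative abundances (any consistent scale)."""
    return abundances["Firmicutes"] / abundances["Bacteroidetes"]

# Hypothetical phylum-level profiles for two groups of samples
normotensive = {"Firmicutes": 0.48, "Bacteroidetes": 0.40, "Other": 0.12}
hypertensive = {"Firmicutes": 0.62, "Bacteroidetes": 0.25, "Other": 0.13}

print(f"normotensive F:B: {fb_ratio(normotensive):.2f}")  # 1.20
print(f"hypertensive F:B: {fb_ratio(hypertensive):.2f}")  # 2.48
```

A higher F:B ratio in the hypertensive-microbiota recipients, as in this sketch, is the direction of change the study reports.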

One of the most popular forms of large group aerobic exercise classes is Zumba fitness. The aim of this study was to compare and relate energy expenditure and the amount and intensity of physical effort during a Zumba fitness class in women with different body mass index (BMI).

METHODS:

Body displacements of 61 adult women who performed a one-hour Zumba session were evaluated with triaxial accelerometers. To observe the effect of BMI, women were divided into normal weight (n=26), overweight (n=21) and obese (n=14) groups.

RESULTS:

The average number of steps was 4533.3 ± 1351, and moderate-to-vigorous intensity activity accounted for 53.8 ± 14.4% of total class time (% MVPA). The average metabolic intensity was 3.64 ± 1.1 MET, with an energy expenditure relative to total body mass of 3.9 ± 1.6 kcal/kg. By group, normal weight women took more steps (5184.2 ± 1561.1 steps/class) than overweight (4244.8 ± 1049.3 steps/class) and obese women (3756.9 ± 685.7 steps/class) (p < 0.05). The normal weight group also spent a lower percentage of class time at the lower intensity levels (sedentary and lifestyle activity) and more time at the highest levels (vigorous and very vigorous) than obese women (p < 0.05). Participants of normal weight achieved a higher % MVPA (62.1 ± 15%) than the overweight (50.1 ± 9.4%) and obese (44.1 ± 11.9%) groups (p < 0.05). Metabolic intensity in the normal weight group (4.6 ± 1.9 MET) was higher than in the overweight (3.5 ± 1.0 MET, p < 0.05) and obese (3.1 ± 1.2 MET, p < 0.05) groups. The subjective perception of effort was 7.84 ± 0.9 (Borg CR10), with no differences between groups. Across all participants, higher BMI values were associated with lower energy expenditure per kilogram of body weight (r = -0.40; p < 0.001), metabolic intensity (r = -0.39; p < 0.001), step counts (r = -0.43; p < 0.001) and % MVPA (r = -0.50; p < 0.001).

CONCLUSIONS:

These results show that a higher BMI is associated with a lower intensity of effort, energy expenditure and amount of physical activity during a one-hour Zumba class, which may prevent overweight and obese women from achieving the effort parameters recommended to control weight and improve cardiovascular fitness.
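
The relation between the MET values and the kcal/kg figures reported above can be sketched with the common approximation that 1 MET corresponds to roughly 1 kcal per kg of body mass per hour. The body mass in the example is hypothetical; the MET values are the group means from the abstract.

```python
# Energy expenditure from MET values, using the common approximation
# 1 MET ~ 1 kcal per kg of body mass per hour.

def kcal_per_kg(met: float, hours: float) -> float:
    """Energy expenditure per kg of body mass (kcal/kg)."""
    return met * hours

def total_kcal(met: float, hours: float, mass_kg: float) -> float:
    """Total energy expenditure (kcal) for a given body mass."""
    return met * hours * mass_kg

# One-hour class at the reported group-mean intensities
for group, met in [("normal weight", 4.6), ("overweight", 3.5), ("obese", 3.1)]:
    print(f"{group}: {kcal_per_kg(met, 1.0):.1f} kcal/kg")

# e.g. a hypothetical 60 kg participant at 4.6 MET for one hour:
print(f"total: {total_kcal(4.6, 1.0, 60):.0f} kcal")  # 276
```

Under this approximation, the reported overall intensity of 3.64 MET over one hour implies about 3.6 kcal/kg, close to the 3.9 ± 1.6 kcal/kg the study measured.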

Use of Framingham risk score and new biomarkers to predict cardiovascular mortality in older people: population based observational cohort study.

To investigate the performance of classic risk factors, and of some new biomarkers, in predicting cardiovascular mortality in very old people from the general population with no history of cardiovascular disease.

DESIGN:

The Leiden 85-plus Study (1997-2004) is an observational prospective cohort study with 5 years of follow-up.

SETTING:

General population of the city of Leiden, the Netherlands.

PARTICIPANTS:

Population based sample of participants aged 85 years (215 women and 87 men) with no history of cardiovascular disease; no other exclusion criteria.

MAIN MEASUREMENTS:

Cause specific mortality was registered during follow-up. All classic risk factors included in the Framingham risk score (sex, systolic blood pressure, total and high density lipoprotein cholesterol, diabetes mellitus, smoking and electrocardiogram based left ventricular hypertrophy), as well as plasma concentrations of the new biomarkers homocysteine, folic acid, C reactive protein, and interleukin 6, were assessed at baseline.

RESULTS:

During follow-up, 108 of the 302 participants died; 32% (35/108) of deaths were from cardiovascular causes. Classic risk factors did not predict cardiovascular mortality when used in the Framingham risk score (area under receiver operating characteristic curve 0.53, 95% confidence interval 0.42 to 0.63) or in a newly calibrated model (0.53, 0.43 to 0.64). Of the new biomarkers studied, homocysteine had most predictive power (0.65, 0.55 to 0.75). Entering any additional risk factor or combination of factors into the homocysteine prediction model did not increase its discriminative power.
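
The area under the ROC curve used above has a direct probabilistic reading: it is the probability that a randomly chosen case has a higher marker value than a randomly chosen control (the Mann-Whitney statistic). A minimal sketch with hypothetical marker values (not the study's data):

```python
# Rank-based AUC: the probability that a random case outranks a random
# control, with ties counted as one half. Values below are hypothetical.

def auc(cases, controls):
    """Area under the ROC curve via pairwise comparisons."""
    wins = 0.0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Hypothetical homocysteine-like marker values (umol/L)
died_cvd = [18.0, 22.5, 15.0, 30.1, 19.4]   # cases
survived = [12.0, 16.5, 14.2, 11.8, 20.0, 13.3]  # controls

print(f"AUC: {auc(died_cvd, survived):.2f}")  # 0.87
```

An AUC of 0.5 means the marker is no better than chance, which is roughly where the Framingham score landed in this cohort (0.53), while homocysteine reached 0.65.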

CONCLUSIONS:

In very old people from the general population with no history of cardiovascular disease, concentrations of homocysteine alone can accurately identify those at high risk of cardiovascular mortality, whereas classic risk factors included in the Framingham risk score do not. These preliminary findings warrant validation in a separate cohort.

Carotid Plaque Score and Risk of Cardiovascular Mortality in the Oldest Old: Results from the TOOTH Study.

Accumulating evidence suggests that predictability of traditional cardiovascular risk factors declines with advancing age. We investigated whether carotid plaque scores (CPSs) were associated with cardiovascular disease (CVD) death in the oldest old, and whether asymmetrical dimethylarginine (ADMA), a marker of endothelial dysfunction, moderated the association between the CPS and CVD death.

METHODS:

We conducted a prospective cohort study of Japanese subjects aged ≥85 years without CVD at baseline. We followed this cohort for 6 years to investigate the association of CPS with CVD death via multivariable Cox proportional hazard analysis. We divided participants into three groups according to CPS (no, 0 points; low, 1.2-4.9 points; high, ≥5.0 points). The predictive value of CPS for estimating CVD death risk over CVD risk factors, including ADMA, was examined using C-statistics.

RESULTS:

We analyzed 347 participants (151 men, 196 women; mean age, 87.6 years), of whom 135 (38.9%) had no carotid plaque at baseline and 48 (13.8%) had a high CPS. In total, 29 (8.4%) participants experienced CVD-related death during the study period. Multivariable analysis revealed a significant association of high CPS with CVD-related mortality relative to no CPS (hazard ratio, 3.90; 95% confidence interval, 1.47-10.39). ADMA was not associated with CVD death, but the significant association between CPS and CVD death was observed only among participants with lower ADMA levels. The addition of CPS to other risk factors improved the predictability of CVD death (p=0.032).

CONCLUSIONS:

A high CPS was significantly associated with a higher risk of CVD death in the oldest old with low cardiovascular risk. Ultrasound evaluation of carotid plaque might facilitate risk evaluation for CVD death in the very old.

The aims of this meta-analysis were to evaluate the effects of coenzyme Q10 (CoQ10) supplementation on inflammatory mediators, including C-reactive protein (CRP), interleukin-6 (IL-6) and tumor necrosis factor-α (TNF-α), by analyzing published randomized controlled trials (RCTs). A systematic search of PubMed, the Cochrane Library and Clinicaltrials.gov was performed to identify eligible RCTs. Data synthesis was performed using a random- or fixed-effects model depending on the results of heterogeneity tests, and pooled data are displayed as the weighted mean difference (WMD) and 95% confidence interval (CI). Seventeen RCTs were selected for the meta-analysis. CoQ10 supplementation significantly reduced the levels of circulating CRP (WMD: -0.35 mg/L, 95% CI: -0.64 to -0.05, P=0.022), IL-6 (WMD: -1.61 pg/mL, 95% CI: -2.64 to -0.58, P=0.002) and TNF-α (WMD: -0.49 pg/mL, 95% CI: -0.93 to -0.06, P=0.027). Meta-regression showed that the changes in CRP were independent of baseline CRP, treatment duration, dosage, and patient characteristics. In the meta-regression analyses, a higher baseline IL-6 level was significantly associated with greater effects of CoQ10 on IL-6 levels (P for interaction=0.006). In conclusion, this meta-analysis of RCTs suggests significant lowering effects of CoQ10 on CRP, IL-6 and TNF-α. However, the results should be interpreted with caution because of evidence of heterogeneity and the limited number of studies.
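
The weighted mean differences reported above rest on inverse-variance pooling, the basic building block of fixed-effect meta-analysis. A minimal sketch with hypothetical per-trial inputs (mean difference and its standard error; these are not the included trials' data):

```python
import math

# Fixed-effect inverse-variance pooling of mean differences.
# Each trial contributes weight 1/SE^2; the pooled SE is
# sqrt(1 / sum of weights). Trial inputs below are hypothetical.

def pool_fixed(effects_ses):
    """Return (pooled_effect, pooled_se) by inverse-variance weighting."""
    weights = [1.0 / se ** 2 for _, se in effects_ses]
    pooled = sum(w * e for (e, _), w in zip(effects_ses, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical CRP mean differences (mg/L) from three trials
trials = [(-0.50, 0.20), (-0.20, 0.15), (-0.40, 0.25)]
effect, se = pool_fixed(trials)
lo, hi = effect - 1.96 * se, effect + 1.96 * se
print(f"pooled WMD: {effect:.2f} mg/L (95% CI {lo:.2f} to {hi:.2f})")
```

A random-effects model, used above when heterogeneity is present, additionally inflates each trial's variance by a between-study component before weighting, widening the confidence interval.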

Chronic inflammation contributes to the onset and development of metabolic diseases. Clinical evidence has suggested that coenzyme Q10 (CoQ10) has some effects on inflammatory markers. However, these results are equivocal. The aim of this systematic review was to assess the effects of CoQ10 on serum levels of inflammatory markers in people with metabolic diseases.

METHODS:

Electronic databases were searched up to February 2016 for randomized controlled trials (RCTs). The outcome parameters were related to inflammatory factors, including interleukin-6 (IL-6), tumor necrosis factor-alpha (TNF-α) and C reactive protein (CRP). RevMan software was used for meta-analysis. Meta-regression analysis, Egger line regression test and Begg rank correlation test were performed by STATA software.

CONCLUSIONS:

CoQ10 supplementation may partly improve the inflammatory state. The effects of CoQ10 on inflammation should be further investigated in larger, well-designed trials of sufficient duration.

Emerging evidence from cohort studies indicates that adiposity is associated with a greater incidence of head and neck cancer (HNC). However, most studies have used self-reported anthropometry, which is prone to error.

Vegetable crop consumption is one of the main sources of dietary exposure to toxic trace elements (TEs). A paired survey of soil and vegetable samples was conducted at 589 agricultural sites in the Youxian prefecture, southern China, to investigate the effect of soil factors on the accumulation of arsenic, cadmium, mercury, and lead in different vegetables. A site-specific model was developed to estimate the health risk from vegetable consumption. TE concentrations varied across plant species, and rape can be cultivated in contaminated areas for its potential to restrict the transfer of TEs from soil to edible plant parts. The accumulation of TEs in vegetables was governed by multiple factors, mainly element interactions, metal availability (CaCl2-extractable fraction), and soil pH. Soil Zn may promote Cd accumulation in vegetables when the soil Cd/Zn ratio is >0.02. Cadmium is the major hazardous component: about 80.8% of the adult population consuming locally produced vegetables had a daily Cd intake above the safe standard. Among the investigated vegetables, radish is potentially hazardous because of its high consumption rate and high Cd content but low Zn accumulation. Radish cultivated in highly acidic soil (4<pH≤5) with high Cd contamination (CaCl2-Cd = 1.0 mg kg-1) had a high probability (89.4%) of exceeding the safe standard, whereas this risk decreased to 8.9% in soil of near-neutral pH (6<pH≤7). The wide range of TE concentrations and soil factors suggests that site-specific risk assessment is needed for better and safer vegetable production.
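
The daily-intake comparison underlying this kind of risk assessment is a simple mass-balance calculation: estimated daily intake (EDI) equals the element concentration in the food times the consumption rate, divided by body weight, compared against a reference limit. A minimal sketch; every number below, including the reference limit, is hypothetical and for illustration only.

```python
# Estimated daily intake (EDI) of a trace element from vegetable
# consumption, compared against a reference limit.
# All numbers are hypothetical, for illustration only.

def edi_mg_per_kg_bw(conc_mg_per_kg: float, intake_kg_per_day: float,
                     body_weight_kg: float) -> float:
    """EDI = concentration x consumption rate / body weight."""
    return conc_mg_per_kg * intake_kg_per_day / body_weight_kg

cd_in_radish = 0.25   # mg Cd per kg fresh vegetable (hypothetical)
consumption = 0.30    # kg vegetables eaten per day (hypothetical)
body_weight = 60.0    # kg (hypothetical)

edi = edi_mg_per_kg_bw(cd_in_radish, consumption, body_weight)
limit = 0.001         # mg/kg bw/day, hypothetical reference value
print(f"EDI: {edi * 1000:.2f} ug/kg bw/day; exceeds limit: {edi > limit}")
```

A site-specific model like the one described above replaces the fixed concentration here with a prediction from soil properties (pH, CaCl2-extractable metal, element ratios) at each site.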

The population is aging, and multimorbidity is becoming a common problem in the elderly.

OBJECTIVE:

To explore the effect of multimorbidity patterns on mortality for all causes at 3- and 5-year follow-up periods.

MATERIALS AND METHODS:

A prospective community-based cohort (2009-2014) embedded within a randomized clinical trial was conducted in seven primary health care centers, including 328 subjects aged 85 years at baseline. Sociodemographic variables, sensory status, cardiovascular risk factors, comorbidity, and geriatric tests were analyzed. Multimorbidity patterns were defined as combinations of two or three of 16 specific chronic conditions in the same individual.

Multimorbidity, as specific combinations of chronic conditions, showed an effect on mortality that was higher than the risk attributable to the individual morbidities. The most important pattern predicting mortality at 3 years was the combination of atrial fibrillation (AF), chronic kidney disease (CKD), and visual impairment. These findings suggest that a new approach is required to target multimorbidity in octogenarians.

Because patients with type 2 diabetes (T2D) have a high risk of coronary heart disease (CHD) and all-cause mortality, and smoking is a major single risk factor for total and CHD mortality, it is important to understand the impact of smoking on outcome events compared with people without T2D. Studies of the excess risk of CHD incidence and mortality, and of all-cause mortality, related to smoking in T2D patients are controversial. We aimed to assess the risk of CHD incidence and mortality, and all-cause mortality, according to smoking status in a large Finnish population cohort of people with and without T2D.

METHODS:

Prospective follow-up of 28 712 men and 30 700 women aged 25-64 years living in eastern and south-western Finland. Data on mortality were obtained from the nationwide death register using the unique national personal identification number. Follow-up information on CHD was based on the Finnish Hospital Discharge Register for non-fatal outcomes. Cox proportional hazards models were used to estimate the association between diabetes and smoking subgroups and the risk of total and CHD mortality.

RESULTS:

T2D patients who smoked had higher all-cause mortality in both men (HR 3.76; 95% CI 2.95-4.78) and women (HR 4.51; 95% CI 2.91-7.00) than non-smoking diabetic men (HR 2.03; 95% CI 1.51-2.74) and women (HR 2.11; 95% CI 1.71-2.59). The CHD mortality risk for smoking men with T2D was higher (HR 6.15; 95% CI 4.22-8.96) than for non-smoking diabetic men (HR 2.62; 95% CI 1.60-4.29). Similar results were found in women, with corresponding HRs for CHD mortality of 6.92 (95% CI 2.79-17.19) for smoking T2D women and 4.06 (95% CI 2.83-5.82) for non-smoking T2D women. Although the risk of CHD incidence in T2D patients who had stopped smoking was statistically significantly higher than in their non-smoking, non-diabetic counterparts, their CHD incidence was lower than in T2D patients who continued to smoke (HR in men 3.00; HR in women 2.80).

CONCLUSION:

It is important to address tobacco consumption in T2D patients, especially during primary health care contacts, in order to reduce their high risk of CHD and all-cause mortality.

KEYWORDS:

Coronary heart disease; Diabetes mellitus; Mortality; Smoking

Male caffeine and alcohol intake in relation to semen parameters and in vitro fertilization outcomes among fertility patients.

Much of the literature on the impact of male caffeine and alcohol intake on reproductive outcomes has utilized semen quality as a proxy for male fertility, although semen parameters have a limited predictive value for spontaneous pregnancy. The objective of this study was to investigate whether male caffeine and alcohol intakes are associated with semen parameters and assisted reproductive technology outcome. The Environment and Reproductive Health Study, an ongoing prospective cohort study, enrolls subfertile couples presenting for treatment at an academic fertility center (2007-2012). A total of 171 men with 338 semen analyses and 205 assisted reproductive technology cycles were included in this analysis. Diet was assessed using a 131-item food frequency questionnaire. Mixed models adjusting for potential confounders were used to evaluate the relationships of male caffeine and alcohol intakes with semen parameters and assisted reproductive technology outcomes. There was no association between male caffeine and alcohol intake and semen quality. Male caffeine intake was negatively related to live birth after assisted reproductive technologies (p-trend < 0.01), and male alcohol intake was positively related to live birth after assisted reproductive technologies (p-trend = 0.04). Adjusted live birth rate among couples with a male partner in the highest quartile of caffeine intake (≥272 mg/day) compared to couples with a male partner in the lowest quartile of intake (<99 mg/day) was 19% vs. 55%, respectively, p < 0.01. In terms of alcohol intake, adjusted live birth rate among couples with a male partner in the highest quartile of alcohol intake (≥22 g/day) compared to couples with a male partner in the lowest quartile of intake (<3 g/day) was 61% vs. 28%, respectively, p = 0.05. 
In conclusion, male pre-treatment caffeine and alcohol intakes were associated with live birth after assisted reproductive technologies, but not with semen parameters, among fertility patients.

You’ve heard it many times before from your doctor: If you’re taking antibiotics, don’t stop taking them until the pill vial is empty, even if you feel better.

The rationale behind this commandment has always been that stopping treatment too soon would fuel the development of antibiotic resistance — the ability of bugs to evade these drugs. Information campaigns aimed at getting the public to take antibiotics properly have been driving home this message for decades.

But the warning, a growing number of experts say, is misguided and may actually be exacerbating antibiotic resistance.

The reasoning is simple: Exposure to antibiotics is what drives bacteria to develop resistance. Taking drugs when you aren’t sick anymore simply gives the hordes of bacteria in and on your body more incentive to evolve to evade the drugs, so the next time you have an infection, they may not work.

The traditional reasoning from doctors “never made any sense. It doesn’t make any sense today,” Dr. Louis Rice, chairman of the department of medicine at the Warren Alpert Medical School at Brown University, told STAT.

Some colleagues credit Rice with being the first person to declare the emperor was wearing no clothes, and it is true that he challenged the dogma in lectures at major meetings of infectious diseases physicians and researchers in 2007 and 2008. A number of researchers now share his skepticism of health guidance that has been previously universally accepted.

The question of whether this advice is still appropriate will be raised at a World Health Organization meeting next month in Geneva. A report prepared for that meeting of the agency’s expert committee on the selection and use of essential medicines already notes that the recommendation isn’t backed by science.

In many cases “an argument can be made for stopping a course of antibiotics immediately after a bacterial infection has been ruled out … or when the signs and symptoms of a mild infection have disappeared,” suggests the report, which analyzed information campaigns designed to get the public on board with efforts to fight antibiotic resistance.

No one is doubting the lifesaving importance of antibiotics. They kill bacteria. But the more the bugs are exposed to the drugs, the more survival tricks the bacteria acquire. And the more resistant the bacteria become, the harder they are to treat.

The concern is that the growing number of bacteria that are resistant to multiple antibiotics will lead to more incurable infections that will threaten medicine’s ability to conduct routine procedures like hip replacements or open heart surgery without endangering lives.

So how did this faulty paradigm become entrenched in medical practice? The answer lies back in the 1940s, the dawn of antibiotic use.

A Petri dish of penicillin showing its inhibitory effect on some bacteria but not on others.

At the time, resistance wasn’t a concern. After the first antibiotic, penicillin, was discovered, more and more gushed out of the pharmaceutical product pipeline.

Doctors were focused only on figuring out how to use the drugs effectively to save lives. An ethos emerged: Treat patients until they get better, and then for a little bit longer to be on the safe side. Around the same time, research on how to cure tuberculosis suggested that under-dosing patients was dangerous — the infection would come back.

“The problem is once it gets baked into culture, it’s really hard to excise it,” said Dr. Brad Spellberg, who is also an advocate for changing this advice. Spellberg is an infectious diseases specialist and chief medical officer at the Los Angeles County-University of Southern California Medical Center in Los Angeles.

We think of medicine as a science, guided by mountains of research. But doctors sometimes prescribe antibiotics more based on their experience and intuition than anything else. There are treatment guidelines for different infections, but some provide scant advice on how long to continue treatment, Rice acknowledged. And response to treatment will differ from patient to patient, depending on, among other things, how old they are, how strong their immune systems are, or how well they metabolize drugs.

There’s little incentive for pharmaceutical companies to conduct expensive studies aimed at finding the shortest duration of treatment for various conditions. But in the years since Rice first raised his concerns, the National Institutes of Health has been funding such research and almost invariably the ensuing studies have found that many infections can be cured more quickly than had been thought. Treatments that were once two weeks have been cut to one, 10 days have been reduced to seven and so on.

There have been occasional exceptions. Just before Christmas, scientists at the University of Pittsburgh reported that 10 days of treatment for otitis media — middle ear infections — was better than five days for children under 2 years of age.

It was a surprise, said Spellberg, who noted that studies looking at the same condition in children 2 and older show the shorter treatment works.

More of this work is needed, Rice said. “I’m not here saying that every infection can be treated for two days or three days. I’m just saying: Let’s figure it out.”

In the meantime, doctors and public health agencies are in a quandary. How do you put the new thinking into practice? And how do you advise the public? Doctors know full well some portion of people unilaterally decide to stop taking their antibiotics because they feel better. But that approach is not safe in all circumstances — for instance tuberculosis or bone infections. And it’s not an approach many physicians feel comfortable endorsing.

“This is a very tricky question. It’s not easy to make a blanket statement about this, and there isn’t a simple answer,” Dr. Lauri Hicks, director of the Centers for Disease Control and Prevention’s office of antibiotic stewardship, told STAT in an email.

“There are certain diagnoses for which shortening the course of antibiotic therapy is not recommended and/or potentially dangerous. … On the other hand, there are probably many situations for which antibiotic therapy is often prescribed for longer than necessary and the optimal duration is likely ‘until the patient gets better.’”

The CDC’s Get Smart campaign, on appropriate antibiotic use, urges people never to skip doses or stop the drugs because they’re feeling better. But Hicks noted the CDC recently revised that advice to add “unless your healthcare professional tells you to do so.”

And that’s one way to deal with the situation, said Dr. James Johnson, a professor of infectious diseases medicine at the University of Minnesota and a specialist at the Minnesota VA Medical Center.

“In fact sometimes some of us give that instruction to patients. ‘Here, I’m going to prescribe you a week. My guess is you won’t need it more than, say, three days. If you’re all well in three days, stop then. If you’re not completely well, take it a little longer. But as soon as you feel fine, stop.’ And we can give them permission to do that.”

Spellberg is more comfortable with the idea of people checking back with their doctor before stopping their drugs — an approach that requires doctors to be willing to have that conversation. “You should call your doc and say ‘Hey, can I stop?’ … If your doctor won’t get on the phone with you for 20 seconds, you need to find another doctor.”

Serum uric acid and eGFR_CKDEPI differently predict long-term cardiovascular events and all causes of deaths in a residential cohort.

Significant (p<0.00001) differences across SUA quintiles were seen for SBP, total and HDL cholesterol, body mass index and eGFR_CKDEPI, whereas cigarette smoking and blood glucose did not differ significantly. Increasingly larger proportions of all events were observed across SUA quintiles (p values from <0.05 to <0.0001). Among the 4 major continuous variables, SUA was largely accurate (area under the ROC curve >0.610) in predicting all end-points, whereas eGFR_CKDEPI was the worst univariate predictor. In multivariate analysis, age, gender, SBP and cigarette smoking were significant predictors of all end-points. Total cholesterol was a significant predictor only for CHDH events. Blood glucose and SUA were contributors for CVDH events (RR, per 1 mg/dl of SUA, 1.09, 95% CI 1.01-1.17), CVD deaths (RR 1.11, 95% CI 1.03-1.20) and all-cause deaths (RR 1.08, 95% CI 1.03-1.14), whereas eGFR_CKDEPI squared was a contributor for all-cause deaths only (RR 1.02, 95% CI 1.00-1.04).

CONCLUSION:

SUA predicts the long-term incidence of cardiovascular events, cardiovascular deaths and all-cause mortality, and should be considered in risk prediction instruments, whereas eGFR_CKDEPI predicts only all-cause mortality, via a U-shaped relation.

Although the incidence of metabolic syndrome has increased substantially during the last few decades, it remains largely unclear whether this metabolic disorder is associated with total cancer mortality. The present study was carried out to investigate this important question.

METHODS:

A total of 687 cancer deaths were identified from 14,916 participants in the third National Health and Nutrition Examination Survey by linking them to the National Death Index database through December 31, 2006. Cox proportional hazards regression was performed to calculate hazard ratios (HR) and 95% confidence intervals (CI) for total cancer mortality in relation to metabolic syndrome and its individual components.
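
Hazard ratios and confidence intervals from a Cox model such as this are obtained by exponentiating the regression coefficient and its Wald interval. A small sketch, with an invented coefficient and standard error (not the study's estimates):

```python
import math

def hr_with_ci(beta, se, z=1.96):
    """Exponentiate a Cox regression coefficient and its Wald interval
    to obtain a hazard ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for 'metabolic syndrome: yes vs no'.
hr, lo, hi = hr_with_ci(beta=0.285, se=0.11)
print(f"HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # HR 1.33 (95% CI 1.07-1.65)
```

A coefficient of about 0.285 corresponds to a roughly 33% elevation in hazard, which is the scale of effect reported in the results below; the numbers here are purely illustrative.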

RESULTS:

After adjustment for confounders, a diagnosis of metabolic syndrome was associated with a 33% increase in total cancer mortality. Compared with individuals without metabolic syndrome, those with 3, 4 and 5 abnormal components had HRs (95% CIs) of 1.28 (1.03-1.59), 1.24 (0.96-1.60), and 1.87 (1.34-2.63), respectively (p-trend = 0.0003). Systolic blood pressure and serum glucose were associated with an increased risk of death from total cancer [HR (95% CI) for highest vs. lowest quartiles: 1.67 (1.19-2.33), p-trend = 0.002 and 1.34 (1.04-1.74), p-trend = 0.003, respectively]. Overall null results were obtained for lung cancer mortality. The effects of metabolic syndrome and its components on non-lung cancer mortality were generally similar to, but somewhat larger than, those for total cancer mortality.

CONCLUSION:

Our study is among the first to reveal that metabolic syndrome is associated with increased total cancer mortality.

Whether antiplatelet agents have a preventive effect on cognitive function remains unknown. We examined the potential association between the use of cilostazol, an antiplatelet agent and cyclic adenosine monophosphate phosphodiesterase 3 inhibitor, and the risk of dementia in an Asian population. Patients initiating cilostazol therapy between 1 January 2004 and 31 December 2009 without a prior history of dementia were identified from Taiwan's National Health Insurance database. Participants were stratified by age, sex, comorbidities, and comedication. The outcome of interest was all-cause dementia (ICD-9-CM codes 290.0, 290.4, 294.1, 331.0). Cox regression models were used to estimate the hazard ratio (HR) of dementia. The cumulative cilostazol dosage was stratified by quartile of defined daily doses, with no cilostazol use as the reference. A total of 9148 participants 40 years of age or older and free of dementia at baseline were analyzed. Patients using cilostazol (n = 2287) had a significantly decreased risk of incident dementia compared with patients not using the drug [n = 6861; adjusted HR (aHR) 0.75; 95% confidence interval (CI) 0.61-0.92]. Notably, cilostazol use showed a dose-dependent association with a reduced rate of incident dementia (p for trend = 0.001). Subgroup analysis identified a reduced risk of dementia in cilostazol users with diagnosed ischemic heart disease (aHR 0.44, 95% CI 0.24-0.83) and cerebrovascular disease (aHR 0.34, 95% CI 0.21-0.54). These observations suggest that cilostazol use may reduce the risk of developing dementia, and that a high cumulative dose further decreases that risk. These findings should be examined further in randomized clinical trials.

The biological functions of high-density lipoproteins (HDLs) contribute to explaining the cardioprotective role of the lipoprotein beyond quantitative HDL cholesterol levels. A few small-scale interventions with a single antioxidant have improved some HDL functions. However, to date, no long-term, large-scale, randomized controlled trial has been conducted to assess the effects of an antioxidant-rich dietary pattern (such as a traditional Mediterranean diet [TMD]) on HDL function in humans.

METHODS:

This study was performed in a random subsample of volunteers from the PREDIMED Study (Prevención con Dieta Mediterránea; n=296) after a 1-year intervention. We compared the effects of 2 TMDs, one enriched with virgin olive oil (TMD-VOO; n=100) and the other enriched with nuts (TMD-Nuts; n=100), with respect to a low-fat control diet (n=96). We assessed the effects of both TMDs on the role of HDL particles on reverse cholesterol transport (cholesterol efflux capacity, HDL ability to esterify cholesterol, and cholesteryl ester transfer protein activity), HDL antioxidant properties (paraoxonase-1 arylesterase activity and total HDL antioxidant capacity on low-density lipoproteins), and HDL vasodilatory capacity (HDL ability to induce the release of nitric oxide in endothelial cells). We also studied the effects of a TMD on several HDL quality-related characteristics (HDL particle oxidation, resistance against oxidative modification, main lipid and protein composition, and size distribution).

Conscious sedation has been widely utilized in plastic surgery. However, inadequate research has been published evaluating adequate drug dosage and depth of sedation. In clinical practice, sedation is often inadequate or accompanied by complications when sedatives are administered according to body weight alone. The purpose of this study was to identify variables influencing the depth of sedation during conscious sedation for plastic surgery.

Alcohol intake and female sex were positively associated with the mean BIS (P<0.01). Age was negatively associated with the mean BIS (P<0.01). Body mass index (P=0.263), creatinine clearance (P=0.832), smoking history (P=0.398), glucose (P=0.718), AST (P=0.729), and ALT (P=0.423) were not associated with the BIS.

CONCLUSIONS:

Older patients tended to have a greater depth of sedation, whereas females and patients with greater alcohol intake had a shallower depth of sedation. Thus, precise dose adjustments of sedatives, accounting for not only weight but also age, sex, and alcohol consumption, are required to achieve safe, effective, and predictable conscious sedation.

Clostridium difficile infection (CDI) is the most common cause of infectious diarrhea and represents an important burden for healthcare worldwide. Symptoms of severe CDI include watery, foul-smelling diarrhea, peripheral leucocytosis, increased C-reactive protein (CRP), acute renal failure, hypotension and pseudomembranous colitis. Recent studies indicate that the main cause of CDI is dysbiosis, an imbalance in the normal gut microbiota. The restoration of a healthy gut microbiota composition via fecal microbiota transplantation (FMT) has recently become more popular. The aim of the present study was to assess the effect of FMT on the healing of CDI and to analyze the changes in the levels of pro-inflammatory markers (C-reactive protein, fecal calprotectin) and pro-inflammatory cytokines. Eighteen patients (6 males and 12 females) with recurrent and/or severe CDI were included in our study. The FMT was performed using colonoscopy in 17 patients, of whom 16 received a one-time FMT and 1 needed 2 additional FMTs. One patient was treated with a single round of FMT using push-and-pull enteroscopy. In all CDI patients, before and 3 weeks after FMT, the following parameters were analyzed: C-reactive protein, fecal calprotectin, and plasma interleukin (IL)-6, IL-8 and IL-12, and tumor necrosis factor-alpha (TNF-α). In addition, the plasma level of LL-37, a cathelicidin peptide, was assessed by fluorescence-activated cell sorting (FACS) before and 3 months after FMT. Finally, in 7 patients a microbiome analysis was performed by 16S rRNA sequencing of stool samples obtained before and 3 weeks after FMT. The healing rate of CDI was 94%. In all successfully treated patients no recurrent CDI was observed during follow-up (16 months). The serum levels of pro-inflammatory cytokines (TNF-α, IL-1β, IL-6, IL-8 and IL-12) decreased significantly after FMT. Similarly, CRP and fecal calprotectin normalized after FMT. Three months after FMT, a significant increase of LL-37 in the plasma of successfully treated patients was observed. The sequencing analysis demonstrated an elevated abundance of beneficial bacterial taxa such as Lactobacillaceae, Ruminococcaceae, Desulfovibrionaceae, Sutterellaceae and Porphyromonadaceae after FMT. No serious side effects were observed. We conclude that FMT represents a very effective and safe treatment of recurrent and/or severe CDI and leads to favorable shifts in the composition of the gut microbiome.

Body weight trajectories and risk of oesophageal and gastric cardia adenocarcinomas: a pooled analysis of NIH-AARP and PLCO Studies.

Elevated body mass index (BMI, kg m⁻²) has been consistently associated with oesophageal adenocarcinoma (EA) and gastric cardia adenocarcinoma (GCA) incidence. However, the effects of adiposity over the life course in relation to EA/GCA have not been thoroughly explored.

Compared with individuals with a BMI <25 kg m⁻² at all time points, exceeding a BMI of 25 kg m⁻² at age 20 was associated with increased risks of EA (HR=1.76, 95% CI: 1.35-2.29) and GCA (HR=1.62, 95% CI: 1.16-2.25). Similarly, a BMI trajectory of overweight (⩾25-<30 kg m⁻²) at age 20 progressing to obesity (⩾30 kg m⁻²) by age 50 was associated with increased risks of EA (HR=2.90, 95% CI: 1.67-5.04) and GCA (HR=4.07, 95% CI: 2.32-7.15), compared with individuals with a normal-weight (⩾18.5-<25 kg m⁻²) trajectory. Weight gain of ⩾20 kg between age 20 and baseline was also associated with an approximately twofold increased risk of EA (HR=1.97, 95% CI: 1.43-2.73) and, more modestly, with GCA (HR=1.40, 95% CI: 0.96-2.05).
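
The trajectory groups above are built from the stated BMI cut points (⩾18.5-<25 normal weight, ⩾25-<30 overweight, ⩾30 obese). A minimal helper sketching that categorization; the function and label names are ours, not the study's:

```python
def bmi_category(bmi):
    """Categorize BMI (kg/m^2) using the cut points stated in the text."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    return "obese"

def trajectory(bmi_age20, bmi_age50):
    """Label a life-course weight trajectory from two time points."""
    return f"{bmi_category(bmi_age20)} -> {bmi_category(bmi_age50)}"

# The highest-risk trajectory reported above:
print(trajectory(26.0, 31.5))  # prints "overweight -> obese"
```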

CONCLUSIONS:

Being overweight in early adulthood and weight gain later in life were each associated with increased risks of EA and GCA. This underscores the potential of weight control programs for reducing EA and GCA risk.

Sprint interval training (SIT) is a time-efficient strategy to improve cardiorespiratory fitness (CRF); however, most protocols have been studied in laboratory settings and require specialized equipment. We investigated the efficacy of brief intense stair climbing as a practical model of SIT to improve CRF.

The acute phase of study 1 established that the mean HR, blood [lactate], and RPE were similar when participants (n = 8) performed an SIT protocol that involved 3 × 20-s "all-out" efforts of either continuously ascending stairs or cycling. The chronic phase demonstrated that CRF, as determined by peak oxygen uptake (V˙O2peak), increased by 12% or ~1 MET (8.27 ± 1.05 to 9.25 ± 1.01 METs, P = 0.002) when participants (n = 12) performed the 3 × 20-s stair climbing protocol 3 d·wk⁻¹ for 6 wk. The acute phase of study 2 established that HR and RPE were similar when participants (n = 11) performed three different stair climbing protocols: the 3 × 20-s continuous ascent model used in study 1 and two 3 × 60-s models of ascending and descending either one or two flights of stairs (P > 0.05). The chronic phase demonstrated that V˙O2peak increased by 7% (8.91 ± 1.30 to 9.51 ± 1.52 METs, P = 0.01) when the same group of participants performed the one-flight 3 × 60-s protocol 3 d·wk⁻¹ for 6 wk. The Cederholm index determined from an oral glucose tolerance test was 57 ± 17 and 64 ± 21 mg·L²·mmol⁻¹·mU⁻¹·min⁻¹ before and after training, respectively (P = 0.056).
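
V˙O2peak is reported here in METs; by convention 1 MET equals 3.5 mL O2·kg⁻¹·min⁻¹, so the group means convert directly to absolute oxygen uptake. A unit-conversion sketch (study 1 means taken from the text; the conversion itself is the standard convention, not study-specific code):

```python
ML_O2_PER_KG_MIN_PER_MET = 3.5  # conventional definition of 1 MET

def mets_to_vo2(mets):
    """Convert METs to oxygen uptake in mL O2 per kg per min."""
    return mets * ML_O2_PER_KG_MIN_PER_MET

pre, post = 8.27, 9.25  # study 1 group means, METs
print(round(mets_to_vo2(pre), 1), round(mets_to_vo2(post), 1))  # 28.9 32.4
print(f"{(post - pre) / pre:.0%} improvement")  # 12% improvement
```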

We performed a systematic review of patient-centered outcomes after the concomitant use of proton pump inhibitors (PPIs) and other drugs.

METHODS:

We searched 4 databases in July 2016 to find studies that reported mortality and morbidity after the concomitant use of PPIs and other drugs. We conducted direct meta-analyses using a random-effects model and graded the quality of evidence according to the Grading of Recommendations Assessment, Development and Evaluation working group approach.
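
Random-effects meta-analyses of the kind described are commonly pooled with the DerSimonian-Laird estimator of between-study variance. A compact sketch on invented log-risk-ratio inputs (the numbers below are illustrative, not extracted from this review):

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effects (e.g. log risk ratios) under a random-effects
    model using the DerSimonian-Laird tau^2 estimator."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance estimate
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Invented per-study log risk ratios and their variances.
effects = [0.20, 0.35, 0.10, 0.50]
variances = [0.04, 0.02, 0.05, 0.03]
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(f"pooled RR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f}-"
      f"{math.exp(pooled + 1.96 * se):.2f})")
```

When the studies are homogeneous, tau² collapses to zero and the estimate reduces to the fixed-effect (inverse-variance) pooled result.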

FINDINGS:

We included data from 17 systematic reviews and meta-analyses, 16 randomized controlled trials, and 16 observational studies that examined the concomitant use of PPIs with medications from 10 drug classes. Low-quality evidence suggests that the use of PPIs is associated with greater morbidity when administered with antiplatelet drugs, bisphosphonates, antibiotics, anticoagulants, metformin, mycophenolate mofetil, or nelfinavir. Concomitant PPIs reduce drug-induced gastrointestinal bleeding and are associated with greater docetaxel and cisplatin response rates in patients with metastatic breast cancer. Where statistically significant risks or benefits of concomitant PPI use have been demonstrated, the magnitudes of the effects are small, with <100 attributable events per 1000 patients treated, and the effects are inconsistent among specific drugs. Among individual PPIs, the concomitant use of pantoprazole or esomeprazole, but not omeprazole or lansoprazole, is associated with an increased risk for all-cause mortality, nonfatal myocardial infarction, or stroke. Clopidogrel is associated with a greater risk for myocardial infarction compared with prasugrel. Conflicting results between randomized controlled trials and observational studies and a high risk for bias in the body of evidence lessened our confidence in the results.

IMPLICATIONS:

Available evidence suggests a greater risk for adverse patient outcomes after the concomitant use of PPIs and medications from 9 drug classes and warns against inappropriate drug combinations.

In this Viewpoint, Chobanian proposes blood pressure goals for treatment of hypertension in light of data from the ACCORD, SPRINT, and HOPE-3 trials, which tested differences in patient outcomes by treatment targets.

Although several cross-sectional studies have reported that pain is associated with functional disability in the elderly, data regarding a longitudinal association between pain and disability are inconsistent. This study aimed to investigate the association of pain severity with subsequent functional disability due to all causes as well as stroke, dementia, and joint disease/fracture.

METHODS:

The authors conducted a prospective cohort study of 13,702 Japanese individuals aged 65 yr or older. Information regarding pain severity during the previous 4 weeks and other lifestyle factors was collected via questionnaire in 2006. Data on the incidence of functional disability were retrieved from the Long-term Care Insurance database. Cox proportional hazards regression analysis was used to estimate the multivariate-adjusted hazard ratios for incident functional disability.

RESULTS:

The authors documented 2,686 (19.6%) cases of incident functional disability. The multivariate hazard ratio of functional disability was 1.15 (95% CI, 1.02 to 1.31) among respondents with moderate pain and 1.31 (95% CI, 1.12 to 1.54) among respondents with severe pain in comparison with those without pain (P trend < 0.001). These positive associations were particularly remarkable for disability due to joint disease/fracture: the multivariate hazard ratio was 1.88 (95% CI, 1.37 to 2.58) for moderate pain and 2.76 (95% CI, 1.93 to 3.95) for severe pain (P trend < 0.001). There was a negative association between pain severity and disability due to dementia (P trend = 0.041) and no significant association between pain severity and disability due to stroke.

CONCLUSIONS:

Among elderly Japanese individuals, the authors found a significant positive association between pain severity and future incident functional disability.

Recent changes to the U.S. Food and Drug Administration boxed warning for metformin will increase its use in persons with historical contraindications or precautions. Prescribers must understand the clinical outcomes of metformin use in these populations.

On the basis of quantitative and qualitative syntheses involving 17 observational studies, metformin use is associated with reduced all-cause mortality in patients with CKD, CHF, or CLD with hepatic impairment, and with fewer heart failure readmissions in patients with CKD or CHF.

LIMITATIONS:

Strength of evidence was low, and data on multiple outcomes of interest were sparse. Available studies were observational and varied in follow-up duration.

CONCLUSION:

Metformin use in patients with moderate CKD, CHF, or CLD with hepatic impairment is associated with improvements in key clinical outcomes. Our findings support the recent changes in metformin labeling.

Oral supplementation with vitamin B3 protects against glaucoma development in mice.

Vitamin B3 protects mice from glaucoma

Glaucoma is the most common cause of age-related blindness in the United States. There is currently no cure, and once vision is lost, the condition is irreversible. Williams et al. now report that vitamin B3 (also known as niacin) prevents eye degeneration in glaucoma-prone mice (see the Perspective by Crowston and Trounce). Supplementing the diets of young mice with vitamin B3 averted early signs of glaucoma. Vitamin B3 also halted further glaucoma development in aged mice that already showed signs of the disease. Thus, healthy intake of vitamin B3 may protect eyesight.

Glaucomas are neurodegenerative diseases that cause vision loss, especially in the elderly. The mechanisms initiating glaucoma and driving neuronal vulnerability during normal aging are unknown. Studying glaucoma-prone mice, we show that mitochondrial abnormalities are an early driver of neuronal dysfunction, occurring before detectable degeneration. Retinal levels of nicotinamide adenine dinucleotide (NAD+, a key molecule in energy and redox metabolism) decrease with age and render aging neurons vulnerable to disease-related insults. Oral administration of the NAD+ precursor nicotinamide (vitamin B3), and/or gene therapy (driving expression of Nmnat1, a key NAD+-producing enzyme), was protective both prophylactically and as an intervention. At the highest dose tested, 93% of eyes did not develop glaucoma. This supports therapeutic use of vitamin B3 in glaucoma and potentially other age-related neurodegenerations.

A dietary supplement may offer protection from loss of vision in glaucoma

Summary

Advancing age predisposes us to a number of neurodegenerative diseases, yet the underlying mechanisms are poorly understood. With some 70 million individuals affected, glaucoma is the world's leading cause of irreversible blindness. Glaucoma is characterized by the selective loss of retinal ganglion cells that convey visual messages from the photoreceptive retina to the brain. Age is a major risk factor for glaucoma, with disease incidence increasing near exponentially with increasing age. Treatments that specifically target retinal ganglion cells or the effects of aging on glaucoma susceptibility are currently lacking. On page 756 of this issue, Williams et al. (1) report substantial advances toward filling these gaps by identifying nicotinamide adenine dinucleotide (NAD+) decline as a key age-dependent risk factor and showing that restoration with long-term dietary supplementation or gene therapy robustly protects against neuronal degeneration.

The antimicrobial triclosan, found in soaps and shampoos, causes liver fibrosis and cancer in laboratory mice through molecular mechanisms

Soap & Cancer

Triclosan is an antimicrobial agent commonly added to soaps and shampoos. Robert H. Tukey, from the University of California/San Diego (California, USA), and colleagues found that triclosan disrupted liver integrity and compromised liver function in mouse models. Mice exposed to triclosan for six months (roughly equivalent to 18 human years) were more susceptible to chemical-induced liver tumors. Their tumors were also larger and more frequent than in mice not exposed to triclosan. The team posits that the mechanism of action may be that triclosan interferes with the constitutive androstane receptor, a protein responsible for detoxifying foreign chemicals in the body. To compensate for this stress, liver cells proliferate and turn fibrotic over time. Repeated triclosan exposure and continued liver fibrosis eventually promote tumor formation. The study authors urge that: “These findings strongly suggest there are adverse health effects in mice with long-term [triclosan] exposure, especially on enhancing liver fibrogenesis and tumorigenesis, and the relevance of [triclosan] liver toxicity to humans should be evaluated.”

The commonly used antimicrobial additive triclosan is a liver tumor promoter.

Triclosan [5-chloro-2-(2,4-dichlorophenoxy)phenol; TCS] is a synthetic, broad-spectrum antibacterial chemical used in a wide range of consumer products including soaps, cosmetics, therapeutics, and plastics. The general population is exposed to TCS because of its prevalence in a variety of daily care products as well as through waterborne contamination. TCS is linked to a multitude of health and environmental effects, ranging from endocrine disruption and impaired muscle contraction to effects on aquatic ecosystems. We discovered that TCS was capable of stimulating liver cell proliferation and fibrotic responses, accompanied by signs of oxidative stress. Through a reporter screening assay with an array of nuclear xenobiotic receptors (XenoRs), we found that TCS activates the nuclear receptor constitutive androstane receptor (CAR) and, contrary to previous reports, has no significant effect on mouse peroxisome proliferation activating receptor α (PPARα). Using the procarcinogen diethylnitrosamine (DEN) to initiate tumorigenesis in mice, we discovered that TCS substantially accelerates hepatocellular carcinoma (HCC) development, acting as a liver tumor promoter. TCS-treated mice exhibited a large increase in tumor multiplicity, size, and incidence compared with control mice. TCS-mediated liver regeneration and fibrosis preceded HCC development and may constitute the primary tumor-promoting mechanism through which TCS acts. These findings strongly suggest there are adverse health effects in mice with long-term TCS exposure, especially on enhancing liver fibrogenesis and tumorigenesis, and the relevance of TCS liver toxicity to humans should be evaluated.

Sirtuins have received considerable attention since the discovery that silent information regulator 2 (Sir2) extends the lifespan of yeast. Sir2, a nicotinamide adenine dinucleotide- (NAD-) dependent histone deacetylase, serves as both a transcriptional effector and energy sensor. Oxidative stress and apoptosis are implicated in the pathogenesis of neurodegenerative eye diseases. Sirtuins confer protection against oxidative stress and retinal degeneration. In mammals, the sirtuin (SIRT) family consists of seven proteins (SIRT1-SIRT7). These vary in tissue specificity, subcellular localization, enzymatic activity, and targets. In this review, we present the current knowledge of the sirtuin family and discuss their structure, cellular location, and biological function with a primary focus on their role in different neuro-ophthalmic diseases including glaucoma, optic neuritis, and age-related macular degeneration. The potential role of certain therapeutic targets is also described.

Improvement of the omega 3 index of healthy subjects does not alter the effects of dietary saturated fats or n-6PUFA on LDL profiles.

Dietary fat composition is known to modulate circulating lipid and lipoprotein levels. Although supplementation with long chain omega-3 polyunsaturated fatty acids (LCn-3PUFA) has been shown to reduce plasma triglyceride levels, the effect of the interactions between LCn-3PUFA and the major dietary fats consumed has not been previously investigated.

METHODS:

In a randomized controlled parallel-design clinical intervention, we examined the effect of diets rich in either saturated fatty acids (SFA) or omega-6 polyunsaturated fatty acids (n-6PUFA) on plasma lipid levels and lipoprotein profiles (lipoprotein size, concentration and distribution in subclasses) in subjects with an adequate omega 3 index. Twenty-six healthy subjects went through a four-week pre-supplementation period with LCn-3PUFA and were then randomized to diets rich in either n-6PUFA or SFA, both supplemented with LCn-3PUFA.

Evidence of a role for type 2 diabetes in overall cancer risk and risk for specific types of cancer is limited in ethnic Chinese populations. We therefore investigated whether there is an association between diabetes and cancer incidence in Taiwan.

METHODS:

This study recruited a total of 3602 adults aged 35 years or over (average 54.9 ± 12.3 years, 52.8% women). Participants with fasting glucose ≥126 mg/dL, or taking hypoglycemic medications, were classed as having type 2 diabetes. Cancer incidence was established through regular follow-up interviews and medical records. Cox proportional hazard regression models were used to examine associations for diabetes with risk of all-cause and site-specific cancers.

Vitamin D deficiency may be a risk factor for mortality but previous meta-analyses lacked standardization of laboratory methods for 25-hydroxyvitamin D (25[OH]D) concentrations and used aggregate data instead of individual participant data (IPD). We therefore performed an IPD meta-analysis on the association between standardized serum 25(OH)D and mortality.

METHODS:

In a European consortium of eight prospective studies, including seven general population cohorts, we used the Vitamin D Standardization Program (VDSP) protocols to standardize 25(OH)D data. Meta-analyses using a one step procedure on IPD were performed to study associations of 25(OH)D with all-cause mortality as the primary outcome, and with cardiovascular and cancer mortality as secondary outcomes. This meta-analysis is registered at ClinicalTrials.gov, number NCT02438488.

FINDINGS:

We analysed 26916 study participants (median age 61.6 years, 58% females) with a median 25(OH)D concentration of 53.8 nmol/L. During a median follow-up time of 10.5 years, 6802 persons died. Compared to participants with 25(OH)D concentrations of 75 to 99.99 nmol/L, the adjusted hazard ratios (with 95% confidence interval) for mortality in the 25(OH)D groups with 40 to 49.99, 30 to 39.99, and <30 nmol/L were 1.15 (1.00-1.29), 1.33 (1.16-1.51), and 1.67 (1.44-1.89), respectively. We observed similar results for cardiovascular mortality, but there was no significant linear association between 25(OH)D and cancer mortality. There was also no significantly increased mortality risk at high 25(OH)D levels up to 125 nmol/L.

INTERPRETATION:

In the first IPD meta-analysis using standardized measurements of 25(OH)D we observed an association between low 25(OH)D and increased risk of all-cause mortality. It is of public health interest to evaluate whether treatment of vitamin D deficiency prevents premature deaths.

Preclinical evidence shows that short-term fasting (STF) protects healthy cells against side effects of chemotherapy and makes cancer cells more vulnerable to it. This pilot study examines the feasibility of STF and its effects on tolerance of chemotherapy in a homogeneous patient group with early breast cancer (BC).

METHODS:

Eligible patients had HER2-negative, stage II/III BC. Women receiving (neo)-adjuvant TAC (docetaxel/doxorubicin/cyclophosphamide) were randomized to fast 24 h before and after commencing chemotherapy, or to eat according to the guidelines for healthy nutrition. Toxicity in the two groups was compared. Chemotherapy-induced DNA damage in peripheral blood mononuclear cells (PBMCs) was quantified by the level of γ-H2AX analyzed by flow cytometry.

RESULTS:

Thirteen patients were included, of whom seven were randomized to the STF arm. STF was well tolerated. Mean erythrocyte and thrombocyte counts 7 days post-chemotherapy were significantly higher (P = 0.007, 95% CI 0.106-0.638 and P = 0.00007, 95% CI 38.7-104, respectively) in the STF group than in the non-STF group. Non-hematological toxicity did not differ between the groups. Levels of γ-H2AX were significantly increased 30 min post-chemotherapy in CD45+ CD3− cells in non-STF, but not in STF, patients.

CONCLUSIONS:

STF during chemotherapy was well tolerated and reduced hematological toxicity of TAC in HER2-negative BC patients. Moreover, STF may reduce a transient increase in, and/or induce a faster recovery of DNA damage in PBMCs after chemotherapy. Larger studies, investigating a longer fasting period, are required to generate more insight into the possible benefits of STF during chemotherapy.

The effects of vitamin and mineral supplementation on symptoms of schizophrenia: a systematic review and meta-analysis.

When used as an adjunctive with antipsychotics, certain vitamins and minerals may be effective for improving symptomatic outcomes of schizophrenia, by restoring nutritional deficits, reducing oxidative stress, or modulating neurological pathways.

METHOD:

We conducted a systematic review of all randomized controlled trials (RCTs) reporting effects of vitamin and/or mineral supplements on psychiatric symptoms in people with schizophrenia. Random-effects meta-analyses were used to calculate the standardized mean difference between nutrient and placebo treatments.
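
The standardized mean difference pooled in meta-analyses like this is typically Cohen's d with Hedges' small-sample correction. A minimal sketch on invented end-of-trial symptom summaries (the group means, SDs, and sizes below are hypothetical, not trial data):

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Cohen's d) with Hedges'
    small-sample correction factor J."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)  # small-sample correction
    return j * d

# Hypothetical symptom scores (lower = better): nutrient vs placebo arm.
g = hedges_g(m1=42.0, sd1=10.0, n1=30, m2=47.0, sd2=12.0, n2=30)
print(round(g, 2))  # prints -0.45; negative here favors the nutrient arm
```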

There is preliminary evidence that certain vitamin and mineral supplements may reduce psychiatric symptoms in some people with schizophrenia. Further research is needed to examine how the benefits of supplementation relate to nutrient deficits and the impact upon underlying neurobiological pathways, in order to establish optimal nutrient formulations for improving clinical outcomes in this population. Future studies should also explore the effects of combining beneficial nutrients within multi-nutrient formulas.

KEYWORDS:

Adjunctive; diet; food; nutrition; psychosis

Living near major roads and the incidence of dementia, Parkinson's disease, and multiple sclerosis: a population-based cohort study

Emerging evidence suggests that living near major roads might adversely affect cognition. However, little is known about its relationship with the incidence of dementia, Parkinson's disease, and multiple sclerosis. We aimed to investigate the association between residential proximity to major roadways and the incidence of these three neurological diseases in Ontario, Canada.

Methods

In this population-based cohort study, we assembled two population-based cohorts including all adults aged 20–50 years (about 4·4 million; multiple sclerosis cohort) and all adults aged 55–85 years (about 2·2 million; dementia or Parkinson's disease cohort) who resided in Ontario, Canada on April 1, 2001. Eligible patients were free of these neurological diseases, Ontario residents for 5 years or longer, and Canadian-born. We ascertained the individual's proximity to major roadways based on their residential postal-code address in 1996, 5 years before cohort inception. Incident diagnoses of dementia, Parkinson's disease, and multiple sclerosis were ascertained from provincial health administrative databases with validated algorithms. We assessed the associations between traffic proximity and incident dementia, Parkinson's disease, and multiple sclerosis using Cox proportional hazards models, adjusting for individual and contextual factors such as diabetes, brain injury, and neighbourhood income. We did various sensitivity analyses, such as adjusting for access to neurologists and exposure to selected air pollutants, and restricting to never movers and urban dwellers.

Findings

Between 2001 and 2012, we identified 243 611 incident cases of dementia, 31 577 cases of Parkinson's disease, and 9247 cases of multiple sclerosis. The adjusted hazard ratio (HR) of incident dementia was 1·07 (95% CI 1·06–1·08) for people living less than 50 m from a major traffic road, 1·04 (1·02–1·05) for 50–100 m, 1·02 (1·01–1·03) for 101–200 m, and 1·00 (0·99–1·01) for 201–300 m, versus further than 300 m (p for trend=0·0349). The associations were robust to sensitivity analyses and seemed stronger among urban residents, especially those who lived in major cities (HR 1·12, 95% CI 1·10–1·14 for people living <50 m from a major traffic road) and those who never moved (1·12, 1·10–1·14 for people living <50 m from a major traffic road). No association was found with Parkinson's disease or multiple sclerosis.

Interpretation

In this large population-based cohort, living close to heavy traffic was associated with a higher incidence of dementia, but not with Parkinson's disease or multiple sclerosis.

A rise in fructose consumption has been implicated in the etiology of obesity, diabetes and cardiovascular disease. Serum uric acid (UA) rises after fructose ingestion, increasing cardiovascular risk. However, the impact of fructose ingestion on nitric oxide (NO) has not yet been established. The aim of this study was to investigate the postprandial metabolic and endocrine responses following an acute ingestion of fructose or glucose in healthy subjects.

METHOD:

This was a double-blinded, randomized, crossover postprandial trial. Eighteen healthy young subjects (9 males and 9 females) with a mean age of 23.6 ± 2.3 years and mean BMI of 20.2 ± 1.5 kg/m2 completed the experiment, which was conducted in Hangzhou, China. Volunteers were randomized to two groups (A and B): after an 8-h overnight fast, they ingested 300 mL of either a 25% glucose (group A) or a 25% fructose (group B) solution at 08:30, within 5 min. After a one-week washout period, volunteers were crossed over to receive the alternate test solution. Blood pressure was measured at 0 h, 1 h, 2 h and 3 h, and venous blood was drawn at 0 h, 0.5 h, 1 h, 2 h and 3 h after ingestion of the test solution.

Ingestion of a 75 g fructose load led to acute, unfavorable changes in certain metabolic and endocrine responses, including increased serum concentrations and 3-h AUCs of UA, AR and LDH, increased SBP, and decreased endothelial NO production, when compared with the same amount of ingested glucose.

Stress-response pathways have evolved to maintain cellular homeostasis and to ensure the survival of organisms under changing environmental conditions. Whereas severe stress is detrimental, mild stress can be beneficial for health and survival, known as hormesis. Although the universally conserved heat-shock response regulated by transcription factor HSF-1 has been implicated as an effector mechanism, the role and possible interplay with other cellular processes, such as autophagy, remains poorly understood. Here we show that autophagy is induced in multiple tissues of Caenorhabditis elegans following hormetic heat stress or HSF-1 overexpression. Autophagy-related genes are required for the thermoresistance and longevity of animals exposed to hormetic heat shock or HSF-1 overexpression. Hormetic heat shock also reduces the progressive accumulation of PolyQ aggregates in an autophagy-dependent manner. These findings demonstrate that autophagy contributes to stress resistance and hormesis, and reveal a requirement for autophagy in HSF-1-regulated functions in the heat-shock response, proteostasis and ageing.

Impact of the dietary fatty acid intake on C-reactive protein levels in US adults.

Growing evidence suggests that the effects of diet on cardiovascular disease (CVD) occur through mechanisms involving subclinical inflammation. We assessed whether reported dietary fatty acid intake correlates with serum high-sensitivity C-reactive protein (hs-CRP) concentration in a population-based sample of US men and women.

In this cross-sectional analysis, participants were selected from the US National Health and Nutrition Examination Survey (NHANES) and restricted to those with available data on dietary intake and biochemical and anthropometric measurements from 2001 to 2010. All statistical analyses accounted for the survey design and sample weights by using SPSS Complex Samples v22.0 (IBM Corp, Armonk, NY).

Of the 17,689 participants analyzed, 8607 (48.3%) were men. The mean age was 45.8 years in the overall sample, 44.9 years in men, and 46.5 years in women (P = 0.047). The age-, race-, and sex-adjusted mean dietary intakes of total polyunsaturated fatty acids (PUFAs), PUFAs 18:2 (octadecadienoic), and PUFAs 18:3 (octadecatrienoic) monotonically decreased across hs-CRP quartiles (P < 0.001), whereas dietary cholesterol increased across hs-CRP quartiles (P < 0.001).

This study provides further evidence of an association between fatty acid intake and subclinical inflammation markers. hs-CRP concentrations are likely modulated by dietary fatty acid intake. However, the causality of this association needs to be demonstrated in clinical trials.

Metformin is an oral anti-diabetic used as first-line therapy for type 2 diabetes. Because benefits of metformin extend beyond diabetes to other age-related pathology, and because its effect on gene expression profiles resembles that of caloric restriction, metformin has potential as an anti-aging intervention and may soon be assessed as an intervention to extend healthspan. However, beneficial actions of metformin in the central nervous system have not been clearly established. The current study examined the effect of chronic oral metformin treatment on motor and cognitive function when initiated in young, middle-aged, or old male mice. C57BL/6 mice aged 4, 11, or 22 months were randomly assigned to either a metformin group (2 mg/ml in drinking water) or a control group. The mice were monitored weekly for body weight, as well as food and water intake, and a battery of behavioral tests for motor, cognitive and visual function was initiated after the first month of treatment. Liver, hippocampus and cortex were collected at the end of the study to assess redox homeostasis. Overall, metformin supplementation in male mice failed to affect blood glucose, body weight and redox homeostasis at any age. It also had no beneficial effect on age-related declines in psychomotor, cognitive or sensory functions. However, metformin treatment had a deleterious effect on spatial memory and visual acuity, and reduced SOD activity in brain regions. These data suggest that metformin treatment may have deleterious effects resulting from its action on the central nervous system.

Nephrolithiasis is a highly prevalent disease worldwide, with rates ranging from 7 to 13% in North America, 5–9% in Europe, and 1–5% in Asia. Due to high rates of new and recurrent stones, management of stones is expensive and the disease has a high level of acute and chronic morbidity. The goal of this study is to review the epidemiology of stone disease in order to improve patient care. A review of the literature was conducted through a search on PubMed®, Medline®, and Google Scholar®. This review was presented and peer-reviewed at the 3rd International Consultation on Stone Disease during the 2014 Société Internationale d'Urologie Congress in Glasgow. It represents an update of the 2008 consensus document based on expert opinion of the most relevant studies. There has been a rising incidence of stone disease throughout the world, with a narrowing of the gender gap. Increased stone prevalence has been attributed to population growth and increases in obesity and diabetes. General dietary recommendations of increased fluid, decreased salt, and moderate intake of protein have not changed. However, specific recommended values have either changed or are more frequently reported. Geography and environment influence the likelihood of stone disease, and more information is needed regarding stone disease in a large portion of the world, including Asia and Africa. Randomized controlled studies are lacking but are necessary to improve recommendations regarding diet and fluid intake. Understanding the impact of associated conditions that are rapidly increasing will improve the prevention of stone disease.


Within each geographic location there are seasonal variations in temperature, which have been described as the “stone season.” Several studies have found that higher temperatures in summer months result in an increase in stone formation. One study obtained total emergency department admissions from the Taiwan National Health Insurance Research Database (1999–2003), which provided monthly urinary calculi attack rates per 100,000 population [45]. The seasonal trend in monthly urinary calculi attack rates revealed a peak in July–September, followed by a sharp decline in October. Another study from Saudi Arabia evaluated 307 renal stones analyzed over a 1-year period, from September 2000 to August 2001, at different hospitals in Riyadh and found that the maximum number of stones was collected in peak summer months [34].

Trends in global warming will likely result in shifting and expansion of areas at increased risk for stone formation [46]. A study modeling the impact of climate change on stone disease found that the fraction of the U.S. population living in high-risk zones for nephrolithiasis will grow from 40% in 2000 to 56% by 2050, and to 70% by 2095. There is a predicted increase of 1.6–2.2 million lifetime cases of nephrolithiasis by 2050, which increases expenditures by 25%.

Diet

In the ICUD consensus document published in 2008, aspects of the relationship between urolithiasis and diet are compared to the updated ICUD 2008–2014 recommendations combined with the American Urological Association (AUA) guidelines [47], which are presented in Table 3. Since 2008, our search identified 30 articles [48–78]. Only six studies involving human subjects have been published since that time, and of these, only two were RCTs [70, 74]. Moreover, only the former investigation had stone recurrence as its end point for establishing the efficacy of the dietary protocols under investigation. Scrutiny of the remaining articles reveals that the core dietary risk factors—calcium, oxalate, animal protein, carbohydrates, and sodium—remain unchanged. No “new” dietary risk factors were proposed as being significant, although dietary fat was mentioned in two articles [60, 65]. The implied recommendations emerging from these articles were to reduce the intake of saturated fats and to increase the intake of omega-3 essential fatty acids.

Table 3

Summary of dietary and supplemental findings and recommendations from the ICUD Consensus Document 2008 compared with the updated literature and AUA guidelines

Item===ICUD 2008 Consensus Document===Updated literature and AUA guidelines
Vitamin D supplement===Explicit recommendation not given, but restriction is inferred===If indicated, vitamin D supplementation should not be withheld solely on the basis of stone disease; however, over-repletion may be detrimental, so careful repletion is recommended
Atkins diet===Explicit recommendation not given, but avoidance is inferred===Avoid


Several additional details (qualitative and quantitative) which were not described in the 2008 Consensus Document emerged during the present review. Regarding the protective effect of (modestly) increasing dietary calcium, Sellaturay et al. suggested that adding two glasses of milk per day to the diet is strongly associated with a decreased risk of kidney stones [49]; Worcester and Coe provided a more quantitative recommendation of 800–1000 mg/day of dietary calcium [55]. The most recent AUA guidelines increased the recommendation to 1000–1200 mg/day of dietary calcium [47]. With respect to the intake of dietary oxalate, comprehensive lists of oxalate-rich foods were given in two articles [55, 65] and a guideline for the upper limit for oxalate intake was recommended [55]. The limits on protein intake which have been recommended since 2008 are fairly consistent [55, 60, 65]. Johri et al. reported that 40–50 g protein is approximately equivalent to 140–160 g animal flesh, irrespective of whether it is red meat, fish, or poultry [65]. Since sodium and salt (NaCl) are often used interchangeably, care has to be exercised in interpreting the suggested upper limits recommended in different articles: 2–3 g Na/day [79], 6 g NaCl/day (equivalent to 2.4 g Na/day) [60], and 100 mmol Na/day (equivalent to 2.3 g Na/day) [55]. Supplemental calcium is not regarded favorably [49, 55]. Restriction of vitamin C intake continues to be advised [56, 60, 61], with an upper limit of 1500 mg/day being recommended [60]. The role of vitamin D supplementation is controversial. Vitamin D supplementation is regarded as a risk factor [56] and over-repletion of vitamin D can be deleterious as noted in one study [60]. Others have noted that vitamin D therapy, if indicated, should not be withheld solely on the basis of stone disease [77]. 
Monitoring these patients may be difficult given that one study did not show a relationship between serum vitamin D level and 24-h urine calcium excretion in stone formers [78]. Therefore, repletion should be done carefully. The deleterious effects on urinary stone risk factors caused by high protein, low carbohydrate, ketogenic (Atkins) diets have been reinforced due to increase in urinary calcium and intracellular acidosis [51, 64, 80].
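The salt-versus-sodium equivalences quoted above are simple molar-mass arithmetic. A minimal sketch (using standard atomic masses, not figures drawn from the cited articles) shows how the different recommended upper limits line up:

```python
# Convert salt (NaCl) and molar sodium limits to grams of elemental sodium,
# so that upper limits quoted in different units can be compared directly.
NA_MOLAR_MASS = 22.99   # g/mol, sodium (standard atomic weight)
CL_MOLAR_MASS = 35.45   # g/mol, chlorine (standard atomic weight)

# Mass fraction of sodium in table salt, about 0.393.
NA_FRACTION = NA_MOLAR_MASS / (NA_MOLAR_MASS + CL_MOLAR_MASS)

def nacl_grams_to_na_grams(nacl_g):
    """Elemental sodium contained in a given mass of NaCl."""
    return nacl_g * NA_FRACTION

def na_mmol_to_na_grams(na_mmol):
    """Elemental sodium corresponding to a molar amount in mmol."""
    return na_mmol * NA_MOLAR_MASS / 1000.0

print(round(nacl_grams_to_na_grams(6.0), 2))   # 6 g NaCl/day -> 2.36 g Na (~2.4 g)
print(round(na_mmol_to_na_grams(100.0), 2))    # 100 mmol Na/day -> 2.3 g Na
```

This confirms that the 6 g NaCl/day and 100 mmol Na/day limits in the articles cited above are essentially equivalent to the 2.3–2.4 g Na/day figures quoted alongside them.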

The associations between diet and stone disease are not always conclusive or consistent, with several meta-analyses finding inconclusive results [52, 59]. Interestingly, Goldfarb et al. pointed out in the 2008 Consensus Document that the association of a high protein diet with stone prevalence has not been uniformly reported in epidemiological studies, nor has any RCT demonstrated that a low animal protein diet has benefits with respect to prevention of stone formation [81]. While Dussol et al. found that a low animal protein diet administered over 4 years did not protect against stone recurrence in 175 idiopathic calcium stone formers [70], the association between a high animal protein diet and nephrolithiasis continues to be widely reported and, as a result, restriction of dietary animal protein is commonly advocated and practiced [49, 50, 53–56, 60, 64, 65]. At this time, there is insufficient compelling evidence to either discount or advocate a reduced dietary intake of animal protein with respect to reducing stone formation.

In summary, the present review has shown that since publication of the 2008 Consensus Document, there has not been any major shift in global expert opinion about dietary risk factors for kidney stone formation, nor has there been any new compelling evidence to unambiguously demonstrate an association between dietary interventions and stone recurrence. The recommendations given in Table 4 must be regarded as guidelines, rather than rigid rules. As is common practice, each patient must be evaluated individually. Dietary interventions can then be tailored according to the patient’s specific metabolic profile.

Intake of n-3 fatty acids and adherence to the Mediterranean diet (MedDiet) have been suggested to slow the progression of age-related cognitive decline, but the results are mixed. We summarized and evaluated the effect of n-3 fatty acids and the MedDiet on cognitive outcomes in a cognitively healthy aged population.

METHODS:

Relevant published studies from January 2000 to May 2015 were identified by searching three electronic databases: Pubmed, Web of Science/MEDLINE, and CINHAL. Observational studies and randomized controlled trials (RCTs) were considered.

RESULTS:

Twenty-four studies were included in the systematic review. n-3 Fatty acids were associated with better global cognition and some specific cognitive domains, though some results were conflicting. Adherence to the MedDiet was also significantly associated with better cognitive performance and less cognitive decline. Finally, better cognitive performance was observed in men compared to women, and conflicting results were found for the influence of APOE4 genotype on the association between n-3 fatty acids or the MedDiet and cognition.

CONCLUSIONS:

Studies suggest that n-3 fatty acids in the diet and adherence to the MedDiet are beneficial in slowing age-related cognitive decline. However, more high-quality RCTs would be useful to clarify the effect of n-3 fatty acid supplements on cognition.

Bone mineral content (BMC) and bone mineral density (BMD) are positively correlated with dietary protein intakes, which account for 1-8% of BMC and BMD variances. However, the relation between bone strength and microstructure, which are variables that are not captured by areal bone mineral density (aBMD), and dietary protein intakes, particularly from specific dietary sources, has not been clearly established.

OBJECTIVE:

We investigated the association between the peripheral skeleton-predicted failure load and stiffness, bone microstructure, and dietary protein intakes from various origins (animal, divided into dairy and nondairy, and vegetable origins) in healthy postmenopausal women.

DESIGN:

In a cross-sectional study in 746 Caucasian women aged 65.0 ± 1.4 y, we measured the aBMD with the use of dual-energy X-ray absorptiometry, the distal radius and tibia bone microstructures with the use of high-resolution peripheral quantitative computerized tomography, and bone strength with the use of a finite element analysis, and we evaluated dietary protein and calcium with the use of a validated food-frequency questionnaire.

RESULTS:

Mean dietary calcium and protein intakes were greater than the recommended amounts for this age group. The predicted failure load and stiffness at the distal radius and tibia were positively associated with total, animal, and dairy protein intakes but not with vegetable protein intake. Failure load differences were accompanied by modifications of the aBMD and of cortical and trabecular bone microstructures. The associations remained statistically significant after adjustment for weight, height, physical activity, menopause duration, calcium intake, and the interaction between calcium and protein intake. A principal component analysis of the volumetric BMD and bone microstructure indicated that trabecular bone mainly contributed to the positive association between protein intakes and bone strength.

CONCLUSIONS:

These results, which were recorded in a very homogeneous population of healthy postmenopausal women, indicate that there is a beneficial effect of animal and dairy protein intakes on bone strength and microstructure. Specifically, there is a positive association between the bone failure load and stiffness of the peripheral skeleton and dietary protein intake, which is mainly related to changes in the trabecular microstructure.

The conclusions from epidemiological studies are controversial between apple and pear consumption and type 2 diabetes mellitus (T2DM) risk. The present study aimed to investigate whether apple and pear consumption was inversely associated with T2DM risk, and to evaluate the potential dose-response relationship. The Cochrane library, Embase and PubMed databases were searched up to Nov 2016. Prospective cohort studies, which reported the association of apple and pear consumption with incidence of T2DM, were included. Multivariate-adjusted relative risks (RRs) for the highest versus lowest category were combined by using a random-effects model. A restricted cubic spline regression model was performed to examine the dose-response relationship. A total of 5 independent prospective cohort studies were included (14 120 T2DM incident cases and 228 315 participants). The summary estimate showed that consumption of apples and pears was associated with 18% reduction in T2DM risk (95% confidence interval (CI): 0.75, 0.88; I2 = 0.00%). Dose-response analysis showed that one serving per week increment of apple and pear consumption was associated with a 3% (95% CI: 0.96, 0.98; p for trend <0.001) reduction in T2DM risk. The present meta-analysis provides significant evidence of an inverse association between apple and pear consumption and T2DM risk.
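Pooled estimates like the 18% risk reduction above come from a random-effects meta-analysis of study-level log relative risks. A minimal DerSimonian–Laird sketch follows; the study estimates in it are hypothetical placeholders, not the values from the five cohorts analyzed here:

```python
import math

def pool_random_effects(rrs, cis):
    """DerSimonian-Laird random-effects pooling of relative risks.

    rrs: study relative risks; cis: matching (lower, upper) 95% CIs.
    Returns (pooled RR, tau-squared). Works on the log scale, as is standard.
    """
    ys = [math.log(r) for r in rrs]
    # Back-calculate standard errors from the 95% CI width on the log scale.
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1.0 / se**2 for se in ses]                        # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, ys))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(ys) - 1)) / c)               # between-study variance
    w_star = [1.0 / (se**2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, ys)) / sum(w_star)
    return math.exp(pooled), tau2

# Hypothetical study-level estimates (placeholders, not the cited cohorts):
rrs = [0.78, 0.85, 0.90]
cis = [(0.65, 0.94), (0.72, 1.00), (0.80, 1.01)]
pooled_rr, tau2 = pool_random_effects(rrs, cis)  # pooled RR ~ 0.86 here
```

With these placeholder inputs Cochran's Q falls below its degrees of freedom, so tau² is truncated to zero and the pooled estimate coincides with the fixed-effect one; heterogeneous inputs would yield tau² > 0 and wider random-effects weights.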

We conducted a dose-response meta-analysis to summarize the evidence from prospective cohort studies regarding the association of fruit and vegetable consumption with risk of type 2 diabetes mellitus (T2DM).

To clarify and quantify the potential dose-response association between the intake of fruit and vegetables and risk of type 2 diabetes.

DESIGN:

Meta-analysis and systematic review of prospective cohort studies.

DATA SOURCE:

Studies published before February 2014 identified through electronic searches using PubMed and Embase.

ELIGIBILITY CRITERIA FOR SELECTING STUDIES:

Prospective cohort studies with relative risks and 95% CIs for type 2 diabetes according to the intake of fruit, vegetables, or fruit and vegetables.

RESULTS:

A total of 10 articles including 13 comparisons, with 24,013 cases of type 2 diabetes and 434,342 participants, were included in the meta-analysis. Evidence of curvilinear associations was seen between fruit and green leafy vegetable consumption and risk of type 2 diabetes (p=0.059 and p=0.036 for non-linearity, respectively). The summary relative risk of type 2 diabetes for an increase of 1 serving of fruit consumed/day was 0.93 (95% CI 0.88 to 0.99), without heterogeneity among studies (p=0.477, I²=0%). For vegetables, the combined relative risk of type 2 diabetes for an increase of 1 serving consumed/day was 0.90 (95% CI 0.80 to 1.01), with moderate heterogeneity among studies (p=0.002, I²=66.5%). For green leafy vegetables, the summary relative risk of type 2 diabetes for an increase of 0.2 serving consumed/day was 0.87 (95% CI 0.81 to 0.93), without heterogeneity among studies (p=0.496, I²=0%). The combined estimates showed no significant benefit of increasing the consumption of fruit and vegetables combined.

Increasing evidence has suggested an association between sleep duration and osteoporosis risk, although the results of previous studies have been inconsistent. To our knowledge, this is the first meta-analysis to provide quantitative estimates of the association between sleep duration and risk of osteoporosis in population-based studies of middle-aged and elderly women.

METHODS:

Pertinent studies were identified by searching the PubMed and EMBASE databases up to February 2016. Five of the six included studies were cross-sectional and one was a prospective cohort study. Together they included 72,326 participants from three different countries, of whom 31,625 individuals were extracted for our meta-analysis.

RESULTS:

A pooled odds ratio analysis in women aged 40 to 86 years indicated a relationship between sleep duration and osteoporosis (overall OR = 1.07, 95% CI: 1.00–1.15). An association of long sleep duration (8 h or more per day) with increased osteoporosis risk was observed in middle-aged and elderly women (OR = 1.22, 95% CI: 1.06–1.38), but not in women with short sleep duration (7 h or less per day) (OR = 0.98, 95% CI: 0.90–1.05).

CONCLUSION:

This meta-analysis suggests that long sleep duration (8 h or more per day) may be associated with a higher risk of osteoporosis in middle-aged and elderly women. Further prospective cohort studies with longer follow-up periods and valid instruments for measuring sleep duration and sleep quality are warranted to clarify the possible relationship between sleep duration and osteoporosis risk in women.

A systematic literature search was performed up to November 2016 using PubMed and Scopus databases. The differences of fatty acid content between cases and controls were calculated as weighted mean differences (WMD) by using a random-effects model. The intervention effects of RCTs were calculated as WMD for net changes in ALT, AST, liver fat, TAG and fasting glucose levels, respectively. Meta-regression with restricted maximum likelihood estimation was used to evaluate a potential linear relationship between confounding factors and effect sizes. Generalized least square was performed for dose-response analysis.

there are limited data comparing conditions and health service use across care settings in centenarians. To improve health service delivery for centenarians, the aim of this study was to compare the proportions of centenarians who have chronic conditions, take medication and use health care services across different care settings.

Methods

this cohort study uses routine data from a major health insurance company serving Berlin, Germany, and the surrounding region, containing almost complete information on health care transactions. The sample comprised all insured individuals aged 100 years and older (N = 1,121). Community-dwelling and institutionalised individuals were included. The Charlson comorbidity index was based on 5 years of recordings. Hospital stays, medical specialist visits and medication prescribed in the previous year were analysed.

Results

while 6% of the centenarians did not receive any support, 45% received family homecare or homecare from professional care services, and 49% were in long-term care. The most frequent conditions were dementia and rheumatic disease/arthritis, with the highest prevalence found among long-term care residents. A total of 97% of the centenarians saw a general practitioner in the previous year. Women were more often in long-term care and less often without any care. Centenarians in long-term care showed higher proportions of comorbidities, greater medication use, and more visits to medical specialists compared with centenarians in other care settings.

Conclusions

the higher prevalence of dementia and rheumatic disease/arthritis in long-term care compared to other care settings emphasises the role of these diseases in relation to the loss of physical and cognitive functioning.

Previous studies have suggested that acute respiratory infection (ARI) and the use of nonsteroidal anti-inflammatory drugs (NSAIDs) could trigger acute myocardial infarction (AMI). In some countries, physicians prescribe NSAIDs to patients with ARI for symptom relief. However, no research has evaluated whether NSAID use during ARI episodes may increase the risk of AMI.

Methods.

We identified 9793 patients with an incident hospitalization for AMI (index date) between 2007 and 2011. Using a case-crossover design, we compared the following exposure statuses between the case period (1–7 days before the index date) and the matched control period (366–372 days before the index date): NSAID use during ARI episodes, ARI episodes without NSAID use, NSAID use only, or no exposure. Multivariable conditional logistic regression models were used to estimate odds ratios adjusted for potential confounders.

Acute respiratory infections (ARIs) are increasingly recognized as triggers of acute cardiovascular events [1]. Observational studies using large electronic healthcare databases have shown an approximately 2–5-fold transient increase in the risk of acute myocardial infarction (AMI), stroke, and other thrombotic events such as deep venous thrombosis after ARI [2–4]. Although the definition of ARI in these studies is frequently based on clinical, rather than microbiological, criteria, evidence that some specific infections trigger vascular events comes indirectly from vaccine trials. One meta-analysis of randomized controlled trials of influenza vaccine in patients with existing cardiovascular disease showed a reduced risk of major adverse cardiovascular outcomes at 1 year [5]. For pneumococcal vaccine, meta-analyses of observational studies suggest a small protective effect against cardiovascular events in people aged >65 years [6, 7], but controlled trial evidence is lacking. In people who develop ARI, unanswered...

The use of dietary supplements fuels a multibillion-dollar industry, with the figure estimated to reach $60 billion by 2020.1 The use of vitamin and mineral supplements increases in people with a history of anxiety and/or depression, among other health challenges.2 Women are 50% more susceptible to depression, generalized anxiety disorder, panic disorder, phobias and insomnia compared to men due to a variety of factors, including conflicting societal and family roles and limited free time.3-5 Stressed, anxious and time-poor women may therefore be particularly susceptible to self-diagnosis, and especially vulnerable to the lure of dietary supplement advertisements and labeling targeting a reduction in these mood states. Indeed, a number of surveys have found that dietary supplement usage is more prevalent among women than men, lending support to this hypothesis.6,7 In addition, complementary or alternative therapies, which include dietary supplements, are estimated to be used by over half of the individuals who are diagnosed with anxiety or mood disorders.8 This estimate is likely to be higher if one includes those who self-diagnose, using the advice of friends and family to guide both purchasing and usage decisions. Unfortunately, these acquisitions are based largely on personal hope and media hype regarding supplement use, as evidenced by the systematic review published in this issue of the JBI Database of Systematic Reviews and Implementation Reports that highlights a lack of research-based evidence to support their use for anxiety and stress reduction in the general female population.9

The comprehensive search identified 14 eligible studies, all of which investigated the effectiveness of specific nutrients on female anxiety or stress, with 80 percent of these studies investigating these mood states during a hormonal phase, such as pregnancy, premenstrual tension (PMT), peri-menopause or menopause.9 The effectiveness of specific nutrient interventions on the reduction of anxiety or stress was evident in a number of studies where a specific hormonal phase was present, but not in other studies where no hormonal phase was present. In addition, during similar hormonal phases, combining nutrients led to a reduction in anxiety in one group, while another group using them separately did not experience a similar reduction. These mixed results, among others in our review, have limited usefulness for women attempting the use of dietary supplementation to manage anxiety or stress.9 In addition, intervention duration may be an important factor to take into account in determining the effects of specific nutrients, the idea of which is supported in a number of studies in our review.9 The need for increased intervention duration and possible effectiveness using nutrient combination may therefore need to be communicated to women who are aiming to manage these mood states with supplements during specific hormonal phases. Unfortunately, dietary supplementation labeling does not communicate these concepts adequately, and medical practitioners, along with allied health care providers, may also be remiss in explaining them sufficiently, leaving supplement users vulnerable to the opinion and advice of friends and family to guide supplement use. Another factor, which was not discussed in any of our selected papers, is whether nutrient dosage should be adjusted according to body weight. 
To date, only one essential fatty acid (EFA) dietary supplement suggests intake based on body weight.10 Evidence on the effectiveness of combination-nutrient dietary supplementation addressing female anxiety and stress, regardless of hormonal phase, is also sparse, with results from a number of studies yielding mixed results, 85% of which were partly funded by commercial supplement manufacturers introducing a possible risk of bias.9

There is clear biochemical evidence to indicate that nutrients play a critically important role in central nervous system (CNS) functioning.11 In addition, subclinical deficiencies in nutrients may influence psychological wellbeing before physical ailments are noted.12 There is also evidence to suggest that there is widespread prevalence of nutrient deficiencies in developing and developed countries, due to both poor dietary choices and soil nutrient deficiencies.13,14 Stress is likely to exacerbate these deficiencies due to increased nutrient requirements during periods of chronic stress, leading to a cascade of negative physical and mental sequelae such as cardiovascular disease, lifestyle-related diabetes, metabolic and immune dysfunction along with cognitive decline and depression, which has been linked to chronic stress.15-17 Self-diagnosis coupled with experiencing a lack of effectiveness from the use of supplementation may lead to a worsening of stress and anxiety-induced symptoms. Dietary supplementation may therefore play an important role in managing female anxiety and stress, but, as evidenced by the heterogeneity of the studies in our systematic review, and the mixed results, future studies examining the effectiveness of specific nutrient intervention on female anxiety and stress, regardless of hormonal phase, are warranted and sorely needed. However, future studies should take into account the relationship between nutrient intake and status at baseline, intervention duration, the combination of specific nutrients, dosage and individual differences, such as body weight, while controlling for hormonal phase. Unless studies address these relationships, any resulting systematic reviews will be afflicted by the same limitations from which our review suffered. Therefore, at this point in time, the majority of women may not experience the full benefit of specific dietary supplementation for anxiety or stress management.

Data were extracted using the standardized data extraction instruments from the Joanna Briggs Institute.

DATA SYNTHESIS:

Due to heterogeneity of the included studies, narrative synthesis was performed.

RESULTS:

Fourteen studies were included in this review. Essential fatty acids (EFAs) were effective in reducing perceived stress and salivary cortisol levels during pregnancy, anxiety in premenstrual women, and anxiety during menopause in the absence of depression, but were ineffective when depression was disregarded. Disregarding hormonal phase, EFAs were ineffective in reducing stress or anxiety in four groups of women. Combined magnesium and vitamin B6 supplementation reduced premenstrual anxiety, but neither nutrient was effective in isolation, and neither the combination nor either nutrient alone affected stress in women suffering from dysmenorrhea. Older women experienced anxiety reduction with vitamin B6, but not with folate or vitamin B12. High-dose sustained-release vitamin C was effective in reducing anxiety and blood pressure in response to stress.

CONCLUSION:

The current review suggests that EFAs may be effective in reducing prenatal stress and salivary cortisol and may reduce anxiety during premenstrual syndrome and during menopause in the absence of depression. Magnesium and vitamin B6 may be effective in combination in reducing premenstrual stress, and vitamin B6 alone may reduce anxiety effectively in older women. High-dose sustained-release vitamin C may reduce anxiety and mitigate increased blood pressure in response to stress.

IMPLICATIONS FOR PRACTICE:

Essential fatty acids may be effective in reducing prenatal stress and salivary cortisol levels, and premenstrual or menopausal anxiety in the absence of depression. Combining magnesium and vitamin B6 may reduce premenstrual anxiety and vitamin B6 may reduce anxiety in older women. High-dose sustained-release vitamin C may reduce anxiety and mitigate increased blood pressure in response to stress.

IMPLICATIONS FOR RESEARCH:

Investigating supplementation in longer term studies is warranted and should include compliance testing, the use of inert substances as controls and reliable outcome measures.

[The paper below is not available as a PDF.]

Healthy lifestyle and normal waist circumference are associated with a lower 5-year risk of type 2 diabetes in middle-aged and elderly individuals: Results from the healthy aging longitudinal study in Taiwan (HALST).

Type 2 diabetes mellitus (DM) is known to be closely associated with lifestyle and obesity and has a prevalence that increases with age. This study aimed to assess the short-term composite effect of diet, physical activity, psychosocial health, and waist circumference (WC) on the incidence of DM in the elderly and to provide a lifestyle-based predictive index. We used baseline measurements (2009-2013) of 5349 community-dwelling participants (aged 55 years and older, 52% female) of the Healthy Aging Longitudinal Study in Taiwan (HALST) for fasting plasma glucose, HbA1C, serum cholesterol, triglycerides, blood pressures, WC, and outcomes of a home-visit questionnaire. Principal component analysis (PCA) was used to identify participants with a healthy lifestyle (HLF: higher diet, physical activity, and psychosocial scores) and a lower WC, with cutoffs determined by the receiver-operating characteristics. A Cox regression model was applied to 3424 participants without DM at baseline by linking to their National Health Insurance records (median follow-up of 3.1 years). In total, 247 new DM cases (7.2%) were identified. The HLF and lower WC group had a relative risk (RR) of DM of 0.54 (95% CI 0.35-0.82) compared to the non-HLF and higher WC group. When stratified by the presence of impaired glucose tolerance (IGT) or metabolic syndrome (MS), only participants with IGT/MS showed significant risks (RR 0.55; 95% CI 0.33-0.92). However, except for WC, the individual lifestyle factors were nonsignificant in the overall model without PCA. A composite protective effect of HLF and normal WC on DM within 5 years was observed, especially in those with IGT or MS. Psychosocial health constituted an important lifestyle factor in the elderly. The cutoffs identified could be used as a lifestyle-based risk index for DM. Maintaining an HLF to prevent DM is especially important for the elderly.

Hazard ratios (HR) and 95% confidence intervals (95% CI) from Cox proportional hazard analyses of self-reported physician-diagnosed incident kidney cancer versus MET-hours per week in 91,820 subjects recruited between 1991 and 1993 (7.7 yr follow-up of 42,833 subjects) and between 1998 and 1999 (6.4 yr follow-up of 33,053 subjects) as part of the National Runners' Health Study and between 1998 and 1999 as part of the National Walkers' Health Study (5.7 yr follow-up of 15,934 subjects).

The scope and purpose of this review was to summarize the aims, methods, findings, and future of centenarian and (semi-)supercentenarian studies in Japan, particularly those from our own interdisciplinary laboratory. Medically, approximately 97% of centenarians contract chronic diseases, including hypertension and gastrointestinal disease; however, they present with few cardiovascular risk factors. The low prevalence of diabetes mellitus and carotid atherosclerotic plaques are peculiarities of centenarians, which could be associated with high adiponectin levels. While conducting the Tokyo Centenarian Study (TCS), we found that only 20% of centenarians enjoyed physical and cognitive independence at the age of 100 years, although most remained independent in daily living until well into their 90s. Those who maintained physical independence at 100 years of age were highly likely to become semi-supercentenarians (over 105 years) or even supercentenarians (beyond 110 years). We also describe some results of the Japan Semi-supercentenarian Study (JSS), which showed that suppression of chronic inflammation is an important driver of successful aging at extreme old age. Telomere maintenance and an extremely low frequency of APOE-ε4 alleles are genetic peculiarities of (semi-)supercentenarians. The available data confirm our conviction that semi-supercentenarians are a more appropriate model for the study of human longevity.

Impact of Diabetes Mellitus on Long-Term Mortality in Patients Presenting for Coronary Angiography.

To understand the current impact of diabetes mellitus (DM) on long-term outcomes among patients referred for coronary angiography, we studied 14,337 consecutive patients (5,279 diabetic patients [37%]) referred for coronary angiography for assessment or treatment of coronary artery disease. We investigated long-term all-cause mortality and its interaction with hypoglycemic therapy and presenting coronary status. At baseline, patients with DM had more hypertension, hyperlipidemia, and renal failure; more were women or overweight, and more had undergone previous coronary interventions. Mortality was higher in those with DM and was related to treatment status: the multivariate adjusted hazard ratio during a median follow-up period of 78 months was 1.41 (95% CI 1.11 to 1.80, p = 0.006) for diet only-treated DM, 1.63 (95% CI 1.51 to 1.77, p <0.001) for DM treated with oral hypoglycemics, and 2.50 (95% CI 2.20 to 2.85, p <0.001) for DM requiring insulin therapy. These findings were similar in magnitude in patients presenting with acute or stable coronary syndromes. In addition, long-term mortality of medically treated DM presenting with a stable coronary syndrome was even higher than that of nondiabetic patients presenting with an acute coronary syndrome (hazard ratio 1.21, 95% CI 1.08 to 1.35, p = 0.001). In conclusion, in patients referred for coronary angiography in the current era, DM remained an independent predictor of long-term mortality regardless of coronary presentation, and mortality increased in direct relation to the intensity of hypoglycemic therapy at presentation.

Prospective studies on the association between soft drink consumption and incident risk of the metabolic syndrome (MetS) have not been carried out in Asians. We explored the sex-specific association between soft drink consumption and incident risk of the MetS in Korean adults during 10 years of follow-up. A total of 5797 subjects who were free of the MetS at baseline were studied. Soft drink consumption was assessed using a semi-quantitative FFQ. Time-dependent Cox proportional hazard model was used to examine hazard ratios (HR) of incidence of the MetS and its components in relation to soft drink consumption. In women, the multivariable-adjusted HR for developing the MetS was 1·8-fold higher in frequent consumers of soft drinks (≥4 servings/week) compared with rare consumers (95 % CI 1·23, 2·64). The adjusted HR for elevated blood pressure increased by 2-fold (95 % CI 1·24, 3·14) and for hypertriacylglycerolaemia by 1·9-fold (95 % CI 1·19, 2·88) in frequent consumers of soft drinks compared with rare consumers. However, in men, there was no association between soft drink consumption and incident risk of the MetS or its components. Frequent soft drink consumption was associated with increased risk of developing the MetS and its components only in middle-aged Korean women, suggesting sex differences for the risk of the MetS related to diet.

Data sources Medline, Embase, the Cochrane Central Register of Controlled Trials, Web of Science, ClinicalTrials.gov, and the International Standard Randomised Controlled Trials Number registry from inception to December 2015.

Eligibility criteria for study selection Randomised, double blind, placebo controlled trials of supplementation with vitamin D3 or vitamin D2 of any duration were eligible for inclusion if they had been approved by a research ethics committee and if data on incidence of acute respiratory tract infection were collected prospectively and prespecified as an efficacy outcome.

Results 25 eligible randomised controlled trials (total 11 321 participants, aged 0 to 95 years) were identified. IPD were obtained for 10 933 (96.6%) participants. Vitamin D supplementation reduced the risk of acute respiratory tract infection among all participants (adjusted odds ratio 0.88, 95% confidence interval 0.81 to 0.96; P for heterogeneity <0.001). In subgroup analysis, protective effects were seen in those receiving daily or weekly vitamin D without additional bolus doses (adjusted odds ratio 0.81, 0.72 to 0.91) but not in those receiving one or more bolus doses (adjusted odds ratio 0.97, 0.86 to 1.10; P for interaction=0.05). Among those receiving daily or weekly vitamin D, protective effects were stronger in those with baseline 25-hydroxyvitamin D levels <25 nmol/L (adjusted odds ratio 0.30, 0.17 to 0.53) than in those with baseline 25-hydroxyvitamin D levels ≥25 nmol/L (adjusted odds ratio 0.75, 0.60 to 0.95; P for interaction=0.006). Vitamin D did not influence the proportion of participants experiencing at least one serious adverse event (adjusted odds ratio 0.98, 0.80 to 1.20, P=0.83). The body of evidence contributing to these analyses was assessed as being of high quality.

Conclusions Vitamin D supplementation was safe and it protected against acute respiratory tract infection overall. Patients who were very vitamin D deficient and those not receiving bolus doses experienced the most benefit.

A clinically useful effect remains uncertain despite hints in a new analysis

Vitamin D supplementation is a hot topic, provoking passionate arguments for and against widespread supplementation. Recently in The BMJ we discussed the evidence, concluding that vitamin D supplements should not be taken by adults to prevent non-musculoskeletal disease.1 Three months later comes a meta-analysis by Martineau and colleagues (doi:10.1136/bmj.i6583), concluding that prevention of acute respiratory tract infection is a “major new indication for vitamin D supplementation.”2 Given the short time between articles, why are the conclusions so different? Is this really a major new development, providing the long sought reliable evidence of benefits of vitamin D on a non-skeletal outcome in the general population? Or is it yet another hypothesis about vitamin D supplementation that needs testing in adequately powered randomised controlled trials?

Eight trial level meta-analyses have examined this topic since 2012, with conflicting findings: three reported benefits and five no consistent benefits from vitamin D.3-10 Martineau and colleagues extend this work by analysing individual patient data from 25 randomised controlled trials with acute respiratory tract infection as an outcome, involving 11 321 participants of all ages, some with existing chest disease. The headline result is a 12% reduction in the odds of an acute respiratory tract infection from supplementation.

There are reasons for viewing the headline result cautiously. In absolute terms, the primary result is a reduction from 42% to 40% in the proportion of participants experiencing at least one acute respiratory tract infection. It seems unlikely that the general population would consider a 2% absolute risk reduction sufficient justification to take supplements. Furthermore, the definition of acute respiratory tract infection varied between studies, consisting of a mixture of diverse conditions such as acute otitis media, laboratory confirmed influenza, self reported colds, parent reported colds or chest infections, or radiograph confirmed pneumonia. It is difficult to know whether a reduction in this mixture of conditions is applicable to the general population and how it should be interpreted clinically.
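The step from the 12% reduction in odds to the roughly two-point absolute reduction can be checked with a short back-calculation. This is a crude sketch assuming the ~42% baseline proportion quoted above; the paper's own adjusted figures will differ slightly from this arithmetic.

```python
# Crude back-calculation: apply a pooled odds ratio to a baseline risk
# and read off the absolute risk reduction. The 0.42 baseline and 0.88
# odds ratio are the figures quoted in the text above.

def risk_after_odds_ratio(baseline_risk: float, odds_ratio: float) -> float:
    """Return the risk implied by applying an odds ratio to a baseline risk."""
    odds = baseline_risk / (1.0 - baseline_risk)  # risk -> odds
    new_odds = odds * odds_ratio                  # apply the odds ratio
    return new_odds / (1.0 + new_odds)            # odds -> risk

baseline = 0.42                      # ~42% had at least one infection
new_risk = risk_after_odds_ratio(baseline, 0.88)
arr = baseline - new_risk            # absolute risk reduction
print(f"new risk ≈ {new_risk:.1%}, ARR ≈ {arr:.1%}")
```

On these crude numbers the absolute reduction comes out near three percentage points, the same order of magnitude as the two-point reduction the adjusted analysis implies.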

The meta-analysis includes individual patient data from 25 studies, which is an impressive achievement. Obtaining and analysing individual patient data for meta-analyses is difficult and time consuming. However, the selection of trials was sometimes unclear. A table of excluded trials with reasons for their exclusion would have been helpful.11,12 Conversely, prospective data collection was one of the authors’ inclusion criteria,13 but they also included two trials that collected data retrospectively.14,15,16 The differing conclusions from previous systematic reviews may be due in part to methodological differences such as these, as occurs in other overlapping meta-analyses of vitamin D supplements.17,18

As in previous reviews, there is noticeable heterogeneity in the trial results (authors’ figure 2). Individual patient data analyses allow exploration of heterogeneity to a much greater degree than trial level analyses. The authors found potentially important factors modifying the response to supplementation: those with 25-hydroxyvitamin D levels less than 25 nmol/L and those receiving daily or weekly doses rather than bolus doses had greater benefits, although vitamin D status was available for less than 40% of trial participants. Although not statistically significant, there could be a clinically relevant interaction with age: in table 2 the benefits from vitamin D appear largely confined to the smallest subgroup, children aged 1.1-15.9 years (n=1079, absolute risk reduction 13%). In the three other larger subgroups (≤1 year n=5571, 16-65 years n=3051, >65 years n=1232), the absolute reductions were small and statistically non-significant, ranging from 0% to 3%.

Should these results change clinical practice? Probably not. The results are heterogeneous and not sufficiently applicable to the general population. We think that they should be viewed as hypothesis generating only, requiring confirmation in well designed, adequately powered randomised controlled trials. Several such very large randomised controlled trials of vitamin D supplements will report on the effects on respiratory infections within the next few years. These trials have not targeted individuals with very low serum concentrations of vitamin D, and there is still a need for trials in these population groups. We consider that current evidence does not support the use of vitamin D supplementation to prevent disease, except for those at high risk of osteomalacia, currently defined as 25-hydroxyvitamin D levels less than 25 nmol/L.

More than 90% of the world’s population lives in unhealthy air, and the total number of deaths from outdoor air pollution reached about 4.2 million in 2015, according to a report released on 14 February. Deaths due to inhalation of fine airborne particles increased by more than 20% from 1990 to 2015, according to the State of Global Air 2017 report from the Global Burden of Disease project and the Health Effects Institute in Boston, Massachusetts. This type of air pollution is especially high in North Africa and the Middle East, but is also a major issue in Bangladesh, India and China. Particulate matter is now the fifth major health risk, behind high blood pressure, smoking, high blood sugar and high cholesterol, says the report.

Because certain flavonols and phenolic acids are found in pollen and nectar of most angiosperms, they are routinely ingested by Apis mellifera, the western honey bee. The flavonol quercetin and the phenolic acid p-coumaric acid are known to upregulate detoxification enzymes in adult bees; their presence or absence in the diet may thus affect the toxicity of ingested pesticides. We conducted a series of longevity assays with one-day-old adult workers to test if dietary phytochemicals enhance longevity and pesticide tolerance. One-day-old bees were maintained on sugar syrup with or without casein (a phytochemical-free protein source) in the presence or absence of quercetin and p-coumaric acid as well as in the presence or absence of two pyrethroid insecticides, bifenthrin and β-cyfluthrin. Dietary quercetin (hazard ratio, HR = 0.82), p-coumaric acid (HR = 0.91) and casein (HR = 0.74) were associated with extended lifespan and the two pyrethroid insecticides, 4 ppm bifenthrin (HR = 9.17) and 0.5 ppm β-cyfluthrin (HR = 1.34), reduced lifespan. Dietary quercetin enhanced tolerance of both pyrethroids; p-coumaric acid had a similar effect trend, although of reduced magnitude. Casein in the diet appears to eliminate the life-prolonging effect of p-coumaric acid in the absence of quercetin. Collectively, these assays demonstrate that dietary phytochemicals influence honey bee longevity and pesticide stress; substituting sugar syrups for honey or yeast/soy flour patties may thus have hitherto unrecognized impacts on adult bee health.
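For readers less used to hazard ratios, their survival implication can be sketched under the proportional-hazards assumption, where the survival curves relate as S_treated(t) = S_control(t)**HR. The 50% control survival used below is a hypothetical value chosen for illustration; only the hazard ratios come from the study.

```python
# Sketch: survival implied by a hazard ratio under proportional hazards.
# The control survival value is hypothetical; the HRs are from the study.

def survival_under_hr(control_survival: float, hazard_ratio: float) -> float:
    """Survival in the comparison group implied by a hazard ratio."""
    return control_survival ** hazard_ratio

s_control = 0.50  # hypothetical: 50% of control bees alive at some day t
for label, hr in [("quercetin", 0.82), ("casein", 0.74), ("bifenthrin", 9.17)]:
    print(f"{label}: HR = {hr} -> survival ≈ {survival_under_hr(s_control, hr):.2f}")
```

An HR below 1 lifts the survival curve (0.5 rises to about 0.57 for quercetin), while the bifenthrin HR of 9.17 collapses survival to near zero at the same time point.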

Liver cancer is the third leading cause of cancer mortality worldwide, with hepatocellular carcinoma (HCC) representing more than 90% of primary liver cancers. Most HCC patients also suffer from chronic liver disease (CLD). Evidence is emerging that the composition of the diet plays an important role in HCC and CLD development and may also have a chemoprotective role. In contrast to other types of cancer, there are few studies investigating the role of diet in hepatocarcinogenesis. From the available data it is evident that high intakes of red meat and dietary sugar correlate positively with HCC occurrence. Conversely, high consumption of white meat, fish, vegetables, fruits, and cereals is inversely associated with HCC risk. This letter discusses the potential role of dietary interventions in the prevention of hepatocarcinogenesis. The increasing incidence of HCC and its high fatality make HCC prevention an urgent matter. Dietary modifications have been found to offer protection against HCC; however, new studies from well-designed, large prospective trials are required to confirm these results.

Forecasting life expectancy by age and sex is broadly used for research and planning of health services, social services, pensions, and economics, and has been developed at the national and multicountry levels.1, 2, 3 The basic idea for predicting life expectancy is closely related to the concept of epidemiological transition.4 However, the idea is neither entirely based on evidence nor well-defined methodologically.2 Improvement in life expectancy is achieved through reductions in infant and younger age mortality and the progressive delay of mortality among older people.5

Given the lack of comparable longitudinal mortality data and the difficulty in understanding the outputs of compound forecasting models, it is clear that better longitudinal data on age-specific mortality are needed to reliably predict extending longevity. In The Lancet, Vasilis Kontis and colleagues6 have tried to avoid the pitfalls of such modelling. Using probabilistic projections from 21 different models with a 90% complete dataset, the authors predict that life expectancy will increase uninterruptedly in 35 industrialised countries by 2030, with a persistent female advantage. The sex difference is predicted to diminish by 2030 in most industrialised countries, except in Mexico, Chile, France, and Greece. A high probability of surpassing what some researchers considered the maximum human lifespan7, 8 is predicted in both men and women.6

The study notes the extensive shortage of relevant data about unexpected events (eg, climate fluctuations and disease outbreaks) and changes in social determinants (eg, migration patterns and economic crises), which lead to uncertainties in projections of life expectancy. Most population projections are deterministic models that use simple extrapolations and do not fully capture the non-linearity observed; hence, these models are unable to quantify the uncertainty of future trends.2 This study proposes a Bayesian method for probabilistic projection across multiple models to fully capture the uncertainty of future trends in mortality and life expectancy.

The study predicted uninterrupted gains in life expectancy, with a higher probability for men (85%) than for women (65%), across the 35 countries, with sex gaps in life expectancy expected to shrink in all countries except Mexico. Female life expectancy would break the 90-year barrier with more than 50% probability by 2030, a barrier believed unreachable at the turn of the 21st century. The forecasts of life expectancy at birth are broadly similar to the UN predictions.9 The findings raise crucial questions about which health-policy and health-service responses are appropriate to tackle such disparities. Identifying the best model for forecasting life expectancy will help assess appropriate health prevention strategies in different populations.

In terms of relevant policy implications, this study provides substantial evidence of longevity gains and identifies the groups predicted to contribute most to those gains. Accurate forecasting of life expectancy is needed: country differences in life expectancy have persisted over 25 years. Global Health 203510 indicates that most low-income and middle-income countries achieved a “grand convergence” in health, and countries are moving towards universal health coverage. Forecasting life expectancy at birth and at age 65 years can help governments and health services make the right investments in health, such as averting deaths due to infectious diseases and reducing maternal and child mortality. Achieving universal health coverage is a worthy and plausible goal, and efforts toward it should continue.

>>>>>>>>>>>>>>>>>>>>>>>>

Lancet, in press 2017

Future life expectancy in 35 industrialised countries: projections with a Bayesian model ensemble

Projections of future mortality and life expectancy are needed to plan for health and social services and pensions. Our aim was to forecast national age-specific mortality and life expectancy using an approach that takes into account the uncertainty related to the choice of forecasting model.

Methods

We developed an ensemble of 21 forecasting models, all of which probabilistically contributed towards the final projections. We applied this approach to project age-specific mortality to 2030 in 35 industrialised countries with high-quality vital statistics data. We used age-specific death rates to calculate life expectancy at birth and at age 65 years, and probability of dying before age 70 years, with life table methods.
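The life-table step in the methods above can be sketched as follows. This is a minimal illustration with an invented mortality schedule; it truncates at the last age supplied, whereas real life tables close with an open-ended oldest-age interval.

```python
import math

# Minimal life-table sketch: convert single-year age-specific death
# rates m_x into life expectancy at birth. The rate schedule below is
# invented for illustration only.

def life_expectancy(death_rates):
    """Life expectancy at birth from a list of single-year death rates m_x."""
    survivors = 1.0        # survivors to exact age x (radix 1)
    person_years = 0.0
    for m in death_rates:
        q = 1.0 - math.exp(-m)                    # death rate -> death probability
        deaths = survivors * q
        person_years += survivors - deaths / 2.0  # the dying live ~half the year
        survivors -= deaths
    return person_years    # truncated at the last age supplied

# Hypothetical schedule: some infant mortality, low rates through
# adulthood, rising sharply at older ages.
rates = [0.004] + [0.0005] * 39 + [0.005] * 30 + [0.05] * 40
print(f"e0 ≈ {life_expectancy(rates):.1f} years")
```

Life expectancy at age 65 works the same way, starting the loop at the rates for ages 65 and above; the study applies this arithmetic to projected national rate schedules rather than an invented one.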

Findings

Life expectancy is projected to increase in all 35 countries with a probability of at least 65% for women and 85% for men. There is a 90% probability that life expectancy at birth among South Korean women in 2030 will be higher than 86·7 years, the same as the highest worldwide life expectancy in 2012, and a 57% probability that it will be higher than 90 years. Projected female life expectancy in South Korea is followed by those in France, Spain, and Japan. There is a greater than 95% probability that life expectancy at birth among men in South Korea, Australia, and Switzerland will surpass 80 years in 2030, and a greater than 27% probability that it will surpass 85 years. Of the countries studied, the USA, Japan, Sweden, Greece, Macedonia, and Serbia have some of the lowest projected life expectancy gains for both men and women. The female life expectancy advantage over men is likely to shrink by 2030 in every country except Mexico, where female life expectancy is predicted to increase more than male life expectancy, and in Chile, France, and Greece where the two sexes will see similar gains. More than half of the projected gains in life expectancy at birth in women will be due to enhanced longevity above age 65 years.

Interpretation

There is more than a 50% probability that by 2030, national female life expectancy will break the 90 year barrier, a level that was deemed unattainable by some at the turn of the 21st century. Our projections show continued increases in longevity, and the need for careful planning for health and social services and pensions.

Coffee, tea and melanoma risk: findings from the European Prospective Investigation into Cancer and Nutrition.

In vitro and animal studies suggest that bioactive constituents of coffee and tea may have anticarcinogenic effects against cutaneous melanoma, however epidemiological evidence is limited to date. We examined the relationships between coffee (total, caffeinated or decaffeinated) and tea consumption and risk of melanoma in the European Prospective Investigation into Cancer and Nutrition (EPIC). EPIC is a multi-centre prospective study that enrolled over 500,000 participants aged 25-70 years from ten European countries in 1992-2000. Information on coffee and tea drinking was collected at baseline using validated country-specific dietary questionnaires. We used adjusted Cox proportional hazards regression models to calculate hazard ratios (HR) and 95% confidence intervals (95% CI) for the associations between coffee and tea consumption and melanoma risk. Overall, 2,712 melanoma cases were identified during a median follow-up of 14.9 years among 476,160 study participants. Consumption of caffeinated coffee was inversely associated with melanoma risk among men (HR for highest quartile of consumption vs. non-consumers 0.31, 95% CI 0.14-0.69) but not among women (HR 0.96, 95% CI 0.62-1.47). There were no statistically significant associations between consumption of decaffeinated coffee or tea and the risk of melanoma among both men and women. The consumption of caffeinated coffee was inversely associated with melanoma risk among men in this large cohort study. Further investigations are warranted to confirm our findings and clarify the possible role of caffeine and other coffee compounds in reducing the risk of melanoma.

The infrapatellar fat pad (IPFP) represents intra-articular adipose tissue that may contribute to intra-articular inflammation and pain by secretion of proinflammatory cytokines. Here we examined the impact of weight loss by diet and/or exercise interventions on the IPFP volume.

The average weight loss amounted to 1.0% in the E group, 10.5% in the D group, and 13.0% in the D+E group. A significant (p < 0.01) reduction in IPFP volume was observed in the E (2.1%), D (4.0%), and D+E (5.2%) groups. The IPFP volume loss in the D+E group was significantly greater than that in the E group (p < 0.05) when not adjusting for parallel comparisons. Across intervention groups, there were significant correlations between IPFP volume change, individual weight loss (r = 0.40), and change in total body fat mass (dual-energy X-ray absorptiometry; r = 0.44, n = 88) and in subcutaneous thigh fat area (computed tomography; r = 0.32, n = 82).

CONCLUSIONS:

As a potential link between obesity and knee osteoarthritis, the IPFP was sensitive to intervention by diet and/or exercise, and its reduction was correlated with changes in weight and body fat.

To assess the net impact of vital exhaustion on cardiovascular events and all-cause mortality, we conducted a systematic search of PubMed, EMBASE, and PsychINFO (through April 2016) to identify all studies which investigated the relation between vital exhaustion (VE) and health outcomes. Inclusion criteria were as follows: (1) a cohort study (prospective cohort or historical cohort) consisting of adults (>18 years); (2) at least 1 self-reported or interview-based assessment of VE or exhaustion; (3) evaluated the association between vital exhaustion or exhaustion and relevant outcomes; and (4) reported adjusted risk estimates of vital exhaustion/exhaustion for outcomes. Maximally adjusted effect estimates with 95% CIs along with variables used for adjustment in multivariate analysis were also abstracted. Primary study outcome was cardiovascular events. Secondary outcomes were stroke and all-cause mortality. Seventeen studies (19 comparisons) with a total of 107,175 participants were included in the analysis. Mean follow-up was 6 years. VE was significantly associated with an increased risk for cardiovascular events (relative risk 1.53, 95% CI 1.28 to 1.83, p <0.001) and all-cause mortality (relative risk 1.48, 95% CI 1.28 to 1.72, p <0.001). VE also showed a trend for increased incident stroke (relative risk 1.46, 95% CI 0.97 to 2.21, p = 0.07). Subgroup analyses yielded similar results. VE is a significant risk factor for cardiovascular events, comparable in potency to common psychosocial risk factors. Our results imply a need to more closely study VE, and potentially related states of exhaustion, such as occupational burnout.
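The pooling step behind summary estimates such as these is typically inverse-variance weighting of log relative risks, with each study's standard error recovered from its 95% CI. The sketch below is a minimal fixed-effect version with hypothetical inputs; the review itself would have used its own (likely random-effects) model.

```python
import math

# Sketch of inverse-variance pooling on the log scale. The study-level
# relative risks and CIs below are hypothetical, not the review's data.

def pool(estimates):
    """estimates: list of (rr, lower_ci, upper_ci). Returns pooled (RR, lo, hi)."""
    num = den = 0.0
    for rr, lo, hi in estimates:
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from a 95% CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

studies = [(1.4, 1.0, 1.96), (1.7, 1.2, 2.4), (1.3, 0.9, 1.9)]  # hypothetical
rr, lo, hi = pool(studies)
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Larger studies (narrower CIs, hence smaller standard errors) dominate the weighted average, which is why a pooled CI is tighter than any single study's.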

Reduced caloric intake including fasting, as well as the dietary composition or the timing of food intake impact longevity, likely through a modification in the onset or the severity of chronic aging-related diseases such as cancer. As with pre- and post-operative dietary recommendations, evidence-based nutritional advice from healthcare professionals during and after cancer treatment is often vague or conflicting. We hypothesize that preventive dietary recommendations can help in the context of both chronic cancer treatment efficacy and the avoidance of development of secondary malignancies, as well as in the context of protection from the acute stress of surgery. In this perspective review, we will discuss the latest findings on the potential role of short-term dietary restriction in cancer treatment and improvement of surgical outcome.

Electronic health records offer the opportunity to discover new clinical implications for established blood tests, but international comparisons have been lacking. We tested the association of total white cell count (WBC) with all-cause mortality in England and New Zealand.

SETTING:

Primary care practices in England (ClinicAl research using LInked Bespoke studies and Electronic health Records (CALIBER)) and New Zealand (PREDICT).

We found 'J'-shaped associations between WBC and mortality; the second quintile was associated with lowest risk in both cohorts. High WBC within the reference range (8.65-10.05×109/L) was associated with significantly increased mortality compared to the middle quintile (6.25-7.25×109/L); adjusted HR 1.51 (95% CI 1.43 to 1.59) in CALIBER and 1.33 (95% CI 1.06 to 1.65) in PREDICT. WBC outside the reference range was associated with even greater mortality. The association was stronger over the first 6 months of follow-up, but similar across ethnic groups.

CONCLUSIONS:

Clinically recorded WBC within the range considered 'normal' is associated with mortality in ethnically different populations from two countries, particularly within the first 6 months. Large-scale international comparisons of electronic health record cohorts might yield new insights from widely performed clinical tests.

Emerging evidence suggests that arterial stiffness, an important marker of cardiovascular health, is associated with alcohol consumption. However, the role of longer-term consumption patterns in the progression of arterial stiffness over time remains unclear. A longitudinal cohort design was used to evaluate the association between alcohol consumption over 25 years and subsequent changes in arterial stiffness.

This work demonstrates that consistently heavy alcohol consumption is associated with higher cardiovascular risk, especially among males, and also provides new insights into the potential impact of changes in drinking levels over time. It discusses the additional insights gained by capturing longitudinal consumption patterns rather than relying on recent intake alone.

Fibroblast growth factor 21 (FGF21) is a peptide hormone that is synthesized by several organs and regulates energy homeostasis. Excitement surrounding this relatively recently identified hormone is based on the documented metabolic beneficial effects of FGF21, which include weight loss and improved glycemia. The biology of FGF21 is intrinsically complicated owing to its diverse metabolic functions in multiple target organs and its ability to act as an autocrine, paracrine, and endocrine factor. In the liver, FGF21 plays an important role in the regulation of fatty acid oxidation both in the fasted state and in mice consuming a high-fat, low-carbohydrate ketogenic diet. FGF21 also regulates fatty acid metabolism in mice consuming a diet that promotes hepatic lipotoxicity. In white adipose tissue (WAT), FGF21 regulates aspects of glucose metabolism, and in susceptible WAT depots, it can cause browning. This peptide is highly expressed in the pancreas, where it appears to play an anti-inflammatory role in experimental pancreatitis. It also has an anti-inflammatory role in cardiac muscle. Although typically not expressed in skeletal muscle, FGF21 is induced in situations of muscle stress, particularly mitochondrial myopathies. FGF21 has been proposed as a novel therapeutic for metabolic complications such as diabetes and fatty liver disease. This review aims to interpret and delineate the ever-expanding complexity of FGF21 physiology.

KEYWORDS:

adipose; diet; fat; liver; obesity; β-klotho

...

Dr. E. Maratos-Flier has consulted for Sanofi-Aventis, Novo Nordisk, and Novartis on a one-time basis. Lilly has provided the laboratory with FGF21 under a materials transfer agreement.

Although more than 90% of patients with breast cancer have early stage disease at diagnosis, about 25% will eventually die of distant metastasis.1 Many patients with breast cancer seek information from a variety of sources about behaviours that may reduce their risk of recurrence.2 Making positive lifestyle changes can also be psychologically beneficial to patients by empowering them, since the feeling of loss of control is one of the biggest challenges of a cancer diagnosis.

In this review, we identify which lifestyle changes can be recommended to patients as an adjunct to standard breast cancer treatments, to reduce their risk of distant recurrence and death. We review the role of lifestyle factors, particularly weight management, exercise, diet, smoking, alcohol intake and vitamin supplementation, on the prognosis of patients with breast cancer. ...

Proton pump inhibitor (PPI) use is associated with an increased risk of acute kidney injury (AKI), incident chronic kidney disease (CKD), and progression to end-stage renal disease (ESRD). PPI-associated CKD is presumed to be mediated by intervening AKI. However, whether PPI use is associated with an increased risk of chronic renal outcomes in the absence of intervening AKI is unknown. To evaluate this, we used the Department of Veterans Affairs national databases to build a cohort of 144,032 incident users of acid suppression therapy, comprising 125,596 PPI and 18,436 histamine H2 receptor antagonist (H2 blocker) users. Over 5 years of follow-up in survival models, cohort participants were censored at the time of AKI occurrence. Compared with incident users of H2 blockers, incident users of PPIs had an increased risk of an estimated glomerular filtration rate (eGFR) under 60 ml/min/1.73 m² (hazard ratio 1.19; 95% confidence interval 1.15-1.24), incident CKD (1.26; 1.20-1.33), eGFR decline over 30% (1.22; 1.16-1.28), and ESRD or eGFR decline over 50% (1.30; 1.15-1.48). Results were consistent in models that excluded participants with AKI either before chronic renal outcomes, during the time in the cohort, or before cohort entry. The proportion of the PPI effect mediated by AKI was 44.7%, 45.47%, 46.00%, and 46.72% for incident eGFR under 60 ml/min/1.73 m², incident CKD, eGFR decline over 30%, and ESRD or over 50% decline in eGFR, respectively. Thus, PPI use is associated with increased risk of chronic renal outcomes in the absence of intervening AKI. Hence, reliance on antecedent AKI as a warning sign to guard against the risk of CKD among PPI users is not sufficient as a sole mitigation strategy.
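The "proportion mediated" figures above come from a formal causal mediation analysis; as a rough illustration only, a common log-scale decomposition compares the total effect with the direct effect that remains when the mediator (here, AKI) is blocked. The hazard ratios below are hypothetical, not the study's:

```python
import math

# Log-scale approximation of proportion mediated:
#   PM ≈ 1 - log(HR_direct) / log(HR_total)
# Hypothetical values; the study's own mediation method may differ.
hr_total = 1.30    # total effect of exposure on the renal outcome
hr_direct = 1.16   # direct effect with the mediator removed

proportion_mediated = 1 - math.log(hr_direct) / math.log(hr_total)
print(f"{proportion_mediated:.1%}")  # → 43.4%
```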

There have been treatments approved by the US Food and Drug Administration for relapsing-remitting multiple sclerosis (RRMS) for more than 20 years. During this time, the efficacy of newly approved therapies has steadily improved, and we are now seeing that many patients do well with these treatments. Some patients have fared so well with treatments that neurologists have begun to use the term NEDA (no evidence of disease activity) to describe patients with multiple sclerosis (MS) who do not experience relapses, disability progression, or new lesions on magnetic resonance imaging, which is becoming the new goal of MS therapy.1 Particularly for patients with recent-onset RRMS, it is clear that clinicians have a number of options.

---------------

Key Points

Question

What are the long-term outcomes after autologous hematopoietic stem cell transplantation for the treatment of multiple sclerosis?

Findings

In this multicenter cohort study of 281 patients with predominantly progressive forms of multiple sclerosis who underwent autologous hematopoietic stem cell transplant between 1995 and 2006, transplant-related mortality was 2.8% within 100 days of transplant, and neurological progression-free survival was 46% at 5 years. Younger age, relapsing form of multiple sclerosis, fewer prior immunotherapies, and lower neurological disability score were significantly associated with better outcomes.

Meaning

The results support the rationale for further randomized clinical trials of autologous hematopoietic stem cell transplantation for the treatment of multiple sclerosis.

Abstract

Importance

Autologous hematopoietic stem cell transplantation (AHSCT) may be effective in aggressive forms of multiple sclerosis (MS) that fail to respond to standard therapies.

Objective

To evaluate the long-term outcomes in patients who underwent AHSCT for the treatment of MS in a large multicenter cohort.

Design, Setting, and Participants

Data were obtained in a multicenter, observational, retrospective cohort study. Eligibility criteria were receipt of AHSCT for the treatment of MS between January 1995 and December 2006 and the availability of a prespecified minimum data set comprising the disease subtype at baseline; the Expanded Disability Status Scale (EDSS) score at baseline; information on the administered conditioning regimen and graft manipulation; and at least 1 follow-up visit or report after transplant. The last patient visit was on July 1, 2012. To avoid bias, all eligible patients were included in the analysis regardless of their duration of follow-up. Data analysis was conducted from September 1, 2014 to April 27, 2015.

Exposures

Demographic, disease-related, and treatment-related exposures were considered variables of interest, including age, disease subtype, baseline EDSS score, number of previous disease-modifying treatments, and intensity of the conditioning regimen.

In this observational study of patients with MS treated with AHSCT, almost half of them remained free from neurological progression for 5 years after transplant. Younger age, relapsing form of MS, fewer prior immunotherapies, and lower baseline EDSS score were factors associated with better outcomes. The results support the rationale for further randomized clinical trials of AHSCT for the treatment of MS.

Multiple sclerosis (MS) is a central nervous system disorder characterized by inflammation, loss of the insulating tissue (myelin) surrounding and protecting nerve axons, and multifocal scarring.1 Multiple sclerosis is a prototypic autoimmune disease likely mediated by pathogenic T and B lymphocytes. There has been substantial progress in the management of MS during the past decade, with 10 variably effective therapies now available and approved by the US Food and Drug Administration (FDA). These therapies suppress the early relapsing-remitting form of MS; however, the late neurodegenerative phase of the disease, progressive MS, remains largely untreatable.

------------------------

Abstract

IMPORTANCE:

No current therapy for relapsing-remitting multiple sclerosis (MS) results in significant reversal of disability.

OBJECTIVE:

To determine the association of nonmyeloablative hematopoietic stem cell transplantation with neurological disability and other clinical outcomes in patients with MS.

DESIGN, SETTING, AND PARTICIPANTS:

Case series of patients with relapsing-remitting MS (n = 123) or secondary-progressive MS (n = 28) (mean age, 36 years; range, 18-60 years; 85 women) treated at a single US institution between 2003 and 2014 and followed up for 5 years. Final follow-up was completed in June 2014.

INTERVENTIONS:

Treatment with cyclophosphamide and alemtuzumab (22 patients) or cyclophosphamide and thymoglobulin (129 patients) followed by infusion of unmanipulated peripheral blood stem cells.

Outcome analysis was available for 145 patients with a median follow-up of 2 years and a mean of 2.5 years. Scores from the EDSS improved significantly from a pretransplant median of 4.0 to 3.0 (interquartile range [IQR], 1.5 to 4.0; n = 82) at 2 years and to 2.5 (IQR, 1.9 to 4.5; n = 36) at 4 years (P < .001 at each assessment). There was significant improvement in disability (decrease in EDSS score of ≥1.0) in 41 patients (50%; 95% CI, 39% to 61%) at 2 years and in 23 patients (64%; 95% CI, 46% to 79%) at 4 years. Four-year relapse-free survival was 80% and progression-free survival was 87%. The NRS scores improved significantly from a pretransplant median of 74 to 88.0 (IQR, 77.3 to 93.0; n = 78) at 2 years and to 87.5 (IQR, 75.0 to 93.8; n = 34) at 4 years (P < .001 at each assessment). The median MSFC scores were 0.38 (IQR, -0.01 to 0.64) at 2 years (P < .001) and 0.45 (IQR, 0.04 to 0.60) at 4 years (P = .02). Total quality-of-life scores improved from a mean of 46 (95% CI, 43 to 49) pretransplant to 64 (95% CI, 61 to 68) at a median follow-up of 2 years posttransplant (n = 132) (P < .001). There was a decrease in T2 lesion volume from a pretransplant median of 8.57 cm3 (IQR, 2.78 to 22.08 cm3) to 5.74 cm3 (IQR, 1.88 to 14.45 cm3) (P < .001) at the last posttransplant assessment (mean follow-up, 27 months; n = 128).

CONCLUSIONS AND RELEVANCE:

Among patients with relapsing-remitting MS, nonmyeloablative hematopoietic stem cell transplantation was associated with improvement in neurological disability and other clinical outcomes. These preliminary findings from this uncontrolled study require confirmation in randomized trials.

Glaucomas are neurodegenerative diseases that cause vision loss, especially in the elderly. The mechanisms initiating glaucoma and driving neuronal vulnerability during normal aging are unknown. Studying glaucoma-prone mice, we show that mitochondrial abnormalities are an early driver of neuronal dysfunction, occurring before detectable degeneration. Retinal levels of nicotinamide adenine dinucleotide (NAD+, a key molecule in energy and redox metabolism) decrease with age and render aging neurons vulnerable to disease-related insults. Oral administration of the NAD+ precursor nicotinamide (vitamin B3), and/or gene therapy (driving expression of Nmnat1, a key NAD+-producing enzyme), was protective both prophylactically and as an intervention. At the highest dose tested, 93% of eyes did not develop glaucoma. This supports therapeutic use of vitamin B3 in glaucoma and potentially other age-related neurodegenerations.

Plasma concentrations and intakes of amino acids in male meat-eaters, fish-eaters, vegetarians and vegans: a cross-sectional analysis in the EPIC-Oxford cohort.

We aimed to investigate the differences in plasma concentrations and in intakes of amino acids between male meat-eaters, fish-eaters, vegetarians and vegans in the Oxford arm of the European Prospective Investigation into Cancer and Nutrition.

SUBJECTS/METHODS:

This cross-sectional analysis included 392 men, aged 30-49 years. Plasma amino acid concentrations were measured with a targeted metabolomic approach using mass spectrometry, and dietary intake was assessed using a food frequency questionnaire. Differences between diet groups in mean plasma concentrations and intakes of amino acids were examined using analysis of variance, controlling for potential confounding factors and multiple testing.

RESULTS:

In plasma, concentrations of 6 out of 21 amino acids varied significantly by diet group, with differences of -13% to +16% between meat-eaters and vegans. Concentrations of methionine, tryptophan and tyrosine were highest in fish-eaters and vegetarians, followed by meat-eaters, and lowest in vegans. A broadly similar pattern was seen for lysine, whereas alanine concentration was highest in fish-eaters and lowest in meat-eaters. For glycine, vegans had the highest concentration and meat-eaters the lowest. Intakes of all 18 dietary amino acids differed by diet group; for the majority of these, intake was highest in meat-eaters followed by fish-eaters, then vegetarians and lowest in vegans (up to 47% lower than in meat-eaters).

CONCLUSIONS:

Men belonging to different habitual diet groups have significantly different plasma concentrations of lysine, methionine, tryptophan, alanine, glycine and tyrosine. However, the differences in plasma concentrations were less marked than, and did not necessarily mirror, those seen for amino acid intakes.

Those who follow a gluten-free diet may be at risk of increased arsenic and mercury exposure.

Gluten-Free Diet with a Side of Risks

One percent of Americans have been diagnosed with celiac disease and must adhere to a gluten-free diet. Additionally, many people have turned to a gluten-free diet because they feel it helps them with health issues, such as inflammation. However, a recent study shows that such a diet might boost the risk of mercury and arsenic exposure. These are highly toxic metals that can cause cardiovascular disease, neurological problems and even cancer. The details of the research were recently published in the journal Epidemiology.

The Risks of a Gluten-free Diet

Gluten is a protein in barley, rye and wheat. Food products that are gluten-free tend to be made with rice flour rather than wheat. Rice bioaccumulates some toxic metals like mercury and arsenic. This accumulation occurs as a result of exposure to metal-laden fertilizers, water and soil. Though little is known about the ramifications of a diet that is high in rice, it is widely known that toxic metals are terrible for human health.

A whopping 25 percent of Americans went gluten-free in 2015, a roughly two-thirds increase from two years earlier. Going gluten-free makes sense for those with celiac disease, as gluten induces a chaotic immune response. Many others have determined that they feel better refraining from eating gluten for various physical and mental reasons.

About the Study

Maria Argos, an assistant professor of epidemiology at the UIC School of Public Health, and her colleagues examined data derived from the National Health and Nutrition Examination Survey. The researchers were on the prowl for a connection between gluten-free diets and the signature biomarkers of toxic metals in urine and blood. Argos' team pinpointed over 70 participants who consumed gluten-free foods out of nearly 7,500 who filled out the survey between 2009 and 2014. The study's participants were between 6 and 80 years old.

Those who reported consuming gluten-free foods had an elevated concentration of mercury in their blood and an elevated level of arsenic in their urine. It is particularly interesting to note that arsenic levels in the gluten-free crowd were nearly double the levels of those who consumed foods with gluten. The gluten-free crowd's mercury levels were 70 percent higher than those of people who consumed food with gluten.

How the Study Results Should be Interpreted

The study results show that consuming foods devoid of gluten can have unintended consequences. Additional studies must be performed to determine whether there are legitimate health consequences from heightened exposure to mercury and arsenic as a result of a gluten-free diet. Subsequent research will help piece together this puzzle.

There are regulations for arsenic levels in food across Europe. Maybe it is in the public's interest for the government of the United States and other nations to legislate similar regulations as more people turn to gluten-free diets.

What to do about this

It's not simply going gluten-free that raises these risks, but how one goes gluten-free. Consider limiting processed gluten-free foods made with rice flour and/or rice syrup, and reaching for alternative flours, such as almond and coconut, when baking. Brown rice tends to have more arsenic than white rice. According to Consumer Reports, Basmati rice from California is the lowest in arsenic. Millet is a viable alternative, as it is fluffy like rice when cooked and has far less arsenic. Quinoa, a low-arsenic grain high in protein, may be substituted as well.

Specific foods such as fish and rice have high concentrations of metals such as arsenic, mercury, lead, cadmium, and cobalt. Many gluten-free diets (GFDs) include these foods, so we evaluated whether a GFD was associated with increased metal bioaccumulation.

METHODS:

We performed a population-based cross-sectional study using data collected from the National Health and Nutrition Examination Survey (NHANES), from 2009 through 2012, collecting information on the diagnosis of celiac disease and adherence to a GFD. We tested NHANES blood samples to identify individuals with undiagnosed celiac disease, using assays for immunoglobulin A tissue transglutaminase followed by a confirmatory test for endomysial antibody. Among a total of 11,353 NHANES participants, celiac disease was diagnosed in 55 participants, based on test results or a reported clinical diagnosis. We collected NHANES survey data on blood levels of lead, mercury, and cadmium from subjects who were on a GFD (n=115) and participants who were not on a GFD (n=11,235). Levels of total arsenic in urine samples were available from 3901 subjects not following a GFD and 32 individuals following a GFD. NHANES participants were asked questions about fish and shellfish consumption. We performed multivariate logistic regression analyses to associate gluten-related conditions with blood concentrations of mercury, cadmium, and lead and urine concentration of total arsenic, adjusting for demographic characteristics, as well as for rice consumption or seafood intake. Geometric means were reported for urinary concentrations of total arsenic and blood concentrations of mercury, cadmium, and lead for demographic groups and subjects with gluten-related conditions (subjects without celiac disease who avoid gluten).
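The geometric means mentioned in the methods are standard for right-skewed biomarker concentrations: the exponential of the mean of the log values. A small sketch with made-up mercury measurements (mcg/L), not NHANES data:

```python
import math

def geometric_mean(values):
    """Exponential of the arithmetic mean of the logs; values must be positive."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical blood mercury concentrations in mcg/L
mercury = [0.4, 0.9, 1.3, 2.8, 0.6]
print(round(geometric_mean(mercury), 2))  # → 0.95
```

Because the log transform damps the influence of a few high values, the geometric mean here (0.95) sits below the arithmetic mean of the same data (1.2).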

RESULTS:

Persons following a GFD had significantly increased total blood mercury (1.37 mcg/L) compared with persons not on a GFD (0.93 mcg/L) (P=.008), as well as increased blood levels of lead (1.42 mcg/L vs 1.13 mcg/L; P=.007) and cadmium (0.42 mcg/L vs 0.34 mcg/L; P=.03). Urine samples from subjects on a GFD had higher concentrations of total arsenic (15.15 mcg/L) than urine samples from subjects not on a GFD (8.38 mcg/L) (P=.002). After controlling for demographic characteristics, levels of all heavy metals remained significantly higher in persons following a GFD compared with those not following a GFD. After exclusion of persons with celiac disease, people without celiac disease on a GFD (n=101) had significantly higher blood concentrations of total mercury (1.40 mcg/L) than persons without celiac disease and not on a GFD (n=10,890; 0.93 mcg/L; P=.02), higher blood concentrations of lead (1.44 mcg/L vs 1.13 mcg/L; P=.01), and higher urine concentrations of total arsenic (14.69 mcg/L, n=28, vs 8.32 mcg/L, n=3632; P=.01). Blood samples from persons without celiac disease avoiding gluten had higher levels of cadmium (0.42 mcg/L) than persons without celiac disease and not following a GFD (0.34 mcg/L), but this difference was not significant (P=.06).

CONCLUSION:

In an analysis of data collected from NHANES, persons on a GFD have significantly higher urine levels of total arsenic and blood levels of mercury, lead, and cadmium than persons not avoiding gluten. Studies are needed to determine the long-term effects of accumulation of these elements in persons on a GFD.

In the United States, colorectal cancer (CRC) incidence and mortality have declined by roughly 3% per year since 2001 (1). Screening probably explains much of this public health success; however, the optimal method for it remains unclear. Colonoscopy accounts for at least 60% of all CRC screening in the United States, despite its greater expense and risk for complications compared with other options (2). Surprisingly little published evidence supports the predominance of colonoscopy. Unlike for fecal occult blood testing or flexible sigmoidoscopy, no controlled studies have shown that colonoscopy reduces CRC incidence or mortality. Most studies have reported that the cost-effectiveness of other CRC screening methods equals or exceeds that of colonoscopy (3). Recently, the clinical effectiveness of screening colonoscopy itself came under fire, with several studies showing excellent protection against left-sided CRC but far less against right-sided disease (4). Long-awaited trials comparing colonoscopy with stool blood–based screening methods are under way, but informative results will not be available for years.

To update and reanalyze 2 systematic reviews to examine the effects of calcium intake on cardiovascular disease (CVD) among generally healthy adults.

DATA SOURCES:

MEDLINE; Cochrane Central Register of Controlled Trials; Scopus, including EMBASE; and previous evidence reports from English-language publications from 1966 to July 2016.

STUDY SELECTION:

Randomized trials and prospective cohort and nested case-control studies with data on dietary or supplemental intake of calcium, with or without vitamin D, and cardiovascular outcomes.

DATA EXTRACTION:

Study characteristics and results extracted by 1 reviewer were confirmed by a second reviewer. Two raters independently assessed risk of bias.

DATA SYNTHESIS:

Overall risk of bias was low for the 4 randomized trials (in 10 publications) and moderate for the 27 observational studies included. The trials did not find statistically significant differences in risk for CVD events or mortality between groups receiving supplements of calcium or calcium plus vitamin D and those receiving placebo. Cohort studies showed no consistent dose-response relationships between total, dietary, or supplemental calcium intake levels and cardiovascular mortality and highly inconsistent dose-response relationships between calcium intake and risks for total stroke or stroke mortality.

Obesity-related diseases, including type 2 diabetes and cardiovascular disease, have reached epidemic proportions in industrialized nations, and dietary interventions for their prevention are therefore important. Resistant starches (RS) improve insulin sensitivity in clinical trials, but the mechanisms underlying this health benefit remain poorly understood. Because RS fermentation by the gut microbiota results in the formation of physiologically active metabolites, we chose to specifically determine the role of the gut microbiota in mediating the metabolic benefits of RS. To achieve this goal, we determined the effects of RS when added to a Western diet on host metabolism in mice with and without a microbiota.

RESULTS:

RS feeding of conventionalized mice improved insulin sensitivity and redressed some of the Western diet-induced changes in microbiome composition. However, parallel experiments in germ-free littermates revealed that RS-mediated improvements in insulin levels also occurred in the absence of a microbiota. RS reduced gene expression of adipose tissue macrophage markers and altered cecal concentrations of several bile acids in both germ-free and conventionalized mice; these effects were strongly correlated with the metabolic benefits, providing a potential microbiota-independent mechanism to explain the physiological effects of RS.

CONCLUSIONS:

This study demonstrated that some metabolic benefits exerted by dietary RS, especially improvements in insulin levels, occur independently of the microbiota and could involve alterations in the bile acid cycle and adipose tissue immune modulation. This work also sets a precedent for future mechanistic studies aimed at establishing the causative role of the gut microbiota in mediating the benefits of bioactive compounds and functional foods.

Cisplatin is one of the most widely used antineoplastic drugs in the treatment of cancer, but its clastogenic potential has become of great interest. In patients treated with long-term cisplatin, genetic damage can be observed during chemotherapy or many years later. The aim of this study was to investigate the possible anticlastogenic effect of pretreatment with olive, extra virgin olive, canola or corn oil on cisplatin-induced chromosomal aberrations in Wistar rat bone marrow cells. The animals received pretreatment with a single dose of vegetable oil (5 ml/kg b.w.) by gavage before cisplatin i.p. (5 mg/kg b.w.), and were sacrificed 24 h after cisplatin injection. Pretreatment with a single dose of olive, extra virgin olive or canola oil caused a statistically significant decrease in the total number of chromosomal aberrations and abnormal metaphases induced by cisplatin when compared with the groups treated with cisplatin alone. The anticlastogenic effects observed with olive, extra virgin olive and canola oil pretreatment are plausibly ascribed to the oils' contents. In conclusion, we suggest that these oils have some antioxidant effect, and that their anticlastogenesis mechanisms need to be explored further before their use during cisplatin chemotherapy.

Cross-Sectional Positive Association of Serum Lipids and Blood Pressure With Serum Sodium Within the Normal Reference Range of 135-145 mmol/L.

Serum sodium concentration is maintained by osmoregulation within the normal range of 135 to 145 mmol/L. Previous analysis of data from the ARIC study (Atherosclerosis Risk in Communities) showed an association of serum sodium with the 10-year risk scores of coronary heart disease and stroke. The current study evaluated the association of within-normal-range serum sodium with cardiovascular risk factors.

APPROACH AND RESULTS:

Only participants who did not take cholesterol or blood pressure medications and had sodium within the normal 135 to 145 mmol/L range were included (n=8615), and the cohort was stratified based on race, sex, and smoking status. Multiple linear regression analysis of data from the ARIC study was performed, with adjustment for age, blood glucose, insulin, glomerular filtration rate, body mass index, waist-to-hip ratio, and calorie intake. The analysis showed positive associations of sodium with total cholesterol, low-density lipoprotein cholesterol, the total cholesterol to high-density lipoprotein cholesterol ratio, apolipoprotein B, and systolic and diastolic blood pressure. The increases in lipids and blood pressure associated with a 10 mmol/L increase in sodium are similar to the increases associated with 7 to 10 years of aging. Analysis of sodium measurements made 3 years apart demonstrated that serum sodium is stable within 2 to 3 mmol/L, explaining its association with long-term health outcomes. Furthermore, elevated sodium promoted lipid accumulation in cultured adipocytes, suggesting direct causative effects on lipid metabolism.

CONCLUSIONS:

Serum sodium concentration is a cardiovascular risk factor even within the normal reference range. Thus, decreasing sodium to the lower end of the normal range by modification of water and salt intake is a personalizable strategy for decreasing cardiovascular risks.

To examine the association between serum sodium concentration and incident major cardiovascular disease (CVD) outcomes and total mortality in older men.

METHODS AND RESULTS:

A prospective study of 3099 men aged 60-79 years without a history of cardiovascular disease followed up for an average of 11 years, during which there were 528 major CVD events (fatal coronary heart disease [CHD], non-fatal myocardial infarction, stroke and CVD death) and 873 total deaths. A U-shaped relationship was seen between serum sodium concentration and major CVD events and mortality. Hyponatremia (<136 mEq/L) and low sodium within the normal range (136-138 mEq/L) showed significantly increased risk of major CVD events and total mortality compared with men within the upper normal range (139-143 mEq/L) after adjustment for a wide range of confounders and traditional risk factors [adjusted HRs 1.55 (1.13, 2.12) and 1.40 (1.14, 1.72) for major CVD events, respectively, and 1.30 (1.02, 1.66) and 1.30 (1.11, 1.53), respectively, for total mortality]. Hyponatremia was associated with inflammation, NT-proBNP, low muscle mass and alkaline phosphatase; these factors contributed to the increased total mortality associated with hyponatremia but did not explain the increased risk of CVD events associated with hyponatremia or low normal sodium concentration. Hypernatremia (≥145 mEq/L) was associated with significantly increased risk of CVD events and mortality due to CVD causes.

CONCLUSION:

Mild hyponatremia, even within the normal sodium range, and hypernatremia are both associated with increased total mortality and major CVD events in older men without CVD, an association that is not explained by known adverse CV risk factors.

KEYWORDS:

Cardiovascular disease; Mortality; Serum sodium; Stroke

Small increases in plasma sodium are associated with higher risk of mortality in a healthy population.

Elevated blood pressure (BP) is the most common cause of cardiovascular disease. Salt intake has a strong influence on BP, and plasma sodium (pNa) increases with progressive increases in salt intake. However, the association between pNa and BP has been reported inconsistently. We evaluated the association between pNa and BP, and estimated the risks of all-cause mortality according to pNa levels. On the basis of data collected from health checkups during 1995-2009, 97,009 adult subjects were included. Positive correlations between pNa and systolic BP (SBP), diastolic BP (DBP), and pulse pressure (PP) were noted in participants with pNa ≥138 mmol/L (P<0.001). In participants aged ≥50 yr, SBP, DBP, and PP were positively associated with pNa. In participants with metabolic syndrome components, the differences in SBP and DBP according to pNa were greater (P<0.001). Cumulative incidence of mortality increased with increasing pNa in women aged ≥50 yr during the median 4.2-yr follow-up (P<0.001). In women, unadjusted risks for mortality increased according to sodium levels. After adjustment, pNa ≥145 mmol/L was related to mortality. The positive correlation between pNa and BP is stronger in older subjects, women, and subjects with metabolic syndrome components. The incidence and adjusted risks of mortality increase with increasing pNa in women aged ≥50 yr.
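The positive correlations reported above are presumably Pearson coefficients between pNa and each pressure measure; a toy sketch of that calculation, with invented paired values:

```python
import math

# Pearson correlation from scratch; the paired pNa/SBP values are invented.
pna = [138, 139, 140, 141, 142, 143, 144, 145]   # plasma sodium, mmol/L
sbp = [118, 121, 120, 124, 126, 125, 129, 131]   # systolic BP, mmHg

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(round(pearson(pna, sbp), 2))  # → 0.97
```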

The Effects of Moderate Whole Grain Consumption on Fasting Glucose and Lipids, Gastrointestinal Symptoms, and Microbiota.

This study was designed to determine if providing wheat, corn, and rice as whole grains (WG) or refined grains (RG) under free-living conditions would change parameters of health over a six-week intervention in healthy, habitual non-WG consumers. Measurements of body composition, fecal microbiota, fasting blood glucose, total cholesterol, high-density lipoprotein (HDL), low-density lipoprotein (LDL), and triglycerides were made at baseline and post-intervention. Subjects were given adequate servings of either WG or RG products based on their caloric need and asked to keep weekly records of grain consumption, bowel movements, and GI symptoms. After six weeks, subjects repeated baseline testing. Significant decreases in total, LDL, and non-HDL cholesterol were seen after the WG treatment but not after the RG treatment. During Week 6, bowel movement frequency increased with increased WG consumption. No significant differences in microbiota were seen between baseline and post-intervention, although the abundance of the order Erysipelotrichales increased in RG subjects who ate more than 50% of the RG market basket products. Increasing consumption of WGs can alter parameters of health, but more research is needed to better elucidate the relationship between the amount consumed and the health-related outcome.

The aim of the present work was to verify whether extra-virgin olive oil, a food naturally containing phenolic antioxidants, has the potential to protect from the pro-aging effects of a high-calorie diet. Male rats were fed, from age 12 months to senescence, a high-calorie diet containing either corn oil (CO) or extra-virgin olive oil with high (H-EVOO) or low (L-EVOO) amounts of phenols. The prolonged high fat intake led to obesity, liver lipid degeneration, and insulin resistance, which were not counteracted by high phenol intake. No difference in overall survival was found at the end of the experiment in the animals treated with H-EVOO compared to the other groups. However, we did detect a protective effect of olive oil on some age-related pathologies and on blood pressure, of which the former was associated with the antioxidant content. Concomitantly, a decrease in DNA oxidative damage in blood cells and plasma TBARS and an increase in liver superoxide dismutase were detected following H-EVOO consumption. Thus, although olive oil phenols cannot reverse the detrimental effects of a prolonged intake of high amounts of fat, improving the quality of olive oil in terms of antioxidant content can be beneficial.

Transition through life span is accompanied by numerous molecular changes, such as dysregulated gene expression, altered metabolite levels, and accumulated molecular damage. These changes are thought to be causal factors in aging; however, because they are numerous and are also influenced by genotype, environment, and other factors in addition to age, it is difficult to characterize the cumulative effect of these molecular changes on longevity. We reasoned that age-associated changes, such as molecular damage and tissue composition, may influence life span when used in the diet of organisms that are closely related to those that serve as a dietary source. To test this possibility, we used species-specific culture media and diets that incorporated molecular extracts of young and old organisms and compared the influence of these diets on the life span of yeast, fruitflies, and mice. In each case, the "old" diet or medium shortened the life span for one or both sexes. These findings suggest that age-associated molecular changes, such as cumulative damage and altered dietary composition, are deleterious and causally linked with aging and may affect life span through diet.

KEYWORDS:

Yeast; aging; damage; flies; lifespan; mice; molecular changes

[Association between body mass index and both total and cause-specific mortality in China: findings from data through the China Kadoorie Biobank].

Objective: To evaluate the associations between body mass index (BMI) and both total and cause-specific mortality.

Methods: After excluding participants with heart disease, stroke, cancer, chronic obstructive pulmonary disease, or diabetes at baseline, 428 593 participants aged 30-79 years in the China Kadoorie Biobank study were chosen for this study. Participants were categorized into 9 groups according to their BMI status. Cox regression analysis was used to estimate the hazard ratios (HRs) and 95% confidence intervals (CIs) of mortality by BMI category.
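
The categorization step can be sketched in a few lines. Note this is only an illustration: the reference category (BMI 20.5-22.4) and the cutpoints 18.5, 20.5, 22.5, and 35.0 appear in the abstract, but the intermediate boundaries are assumptions, not taken from the paper.

```python
# Sketch of the 9-group BMI categorization described in the methods.
# Cutpoints 18.5, 20.5, 22.5, and 35.0 appear in the abstract; the
# intermediate boundaries are assumptions added for illustration only.
BMI_CUTS = [18.5, 20.5, 22.5, 24.0, 26.0, 28.0, 30.0, 35.0]  # 8 cuts -> 9 groups

def bmi_group(bmi: float) -> int:
    """Return the 0-based BMI category index for one participant."""
    group = 0
    for cut in BMI_CUTS:
        if bmi >= cut:
            group += 1
    return group

# The reference category in the results (BMI 20.5-22.4) is group 2.
print(bmi_group(17.0), bmi_group(21.0), bmi_group(36.0))
```

The resulting group index would then enter the Cox model as a categorical covariate, with group 2 as the reference level.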

Results: Among 3 085 054 person-years of follow-up between 2004 and 2013 (median 7.2 years), a total of 7 862 men and 6 315 women died. After adjusting for known or potential confounders, increased risks of all-cause death were seen among participants with a BMI less than 18.5 (HR=1.40, 95%CI: 1.31-1.50), between 18.5-20.4 (HR=1.11, 95%CI: 1.05-1.17), and more than 35.0 (HR=2.05, 95%CI: 1.60-2.61), compared to those with BMI between 20.5-22.4. Ranges of BMI with lower risk of cause-specific mortality were: 18.5-23.9 for ischemic heart disease, <26.0 for cerebrovascular disease, 26.0-34.9 for cancers, and 24.0-25.9 for respiratory diseases.

Conclusions: In this large prospective study, both underweight and obesity were associated with increased total and certain cause-specific mortality, independent of other risk factors of death. Extended follow-up and thorough analysis of BMI and the incidence of major chronic diseases are needed to better understand the impact of BMI on human health.

KEYWORDS:

Body mass index; Chronic disease; Mortality; Prospective study

Body mass index and the risk of infection - from underweight to obesity.

In children and adolescents, underweight is a significant risk factor for infection, especially in developing countries, probably reflecting malnutrition and poor hygienic standards. Data from industrialized countries suggest that the infection rate is also increased in obese children and adolescents. Similarly, several studies suggest a U-shaped relationship, with increased infection rates in both underweight and obese adults. In the latter, infections of the skin and respiratory tract, as well as surgical site infections, have consistently been reported to be more common than in normal-weight subjects. Paradoxically, in some studies mortality of critically ill patients was reduced in obesity.

IMPLICATIONS:

Several studies in children and adults suggest that both underweight and obesity are associated with increased infection risk. However, confounding factors such as malnutrition, hygienic status, and underlying disease or co-morbidities might hamper accurate assessment of the impact of body weight on infection risk.

Advanced glycation end products (AGEs) increase in dysmetabolic conditions. Lifestyle, including diet, has been shown to be effective in preventing the development of metabolic syndrome (MetS). We investigated whether AGE metabolism is affected by diets with different fat quantity and quality in MetS patients.

A low AGE content in the HMUFA (high monounsaturated fatty acid) diet reduces serum AGEs (sAGEs) and modulates the expression of genes related to AGE metabolism in MetS patients, which may be used as a therapeutic approach to reduce the incidence of MetS and related chronic diseases.

Prostatitis is a chronic inflammation of the prostate gland that can compromise a man's quality of life. Naif Alwithanani, from Case Western Reserve University (Ohio, USA), and colleagues studied 27 men, ages 21 years and older, each of whom was diagnosed with prostatitis within the past year (via biopsy and prostate-specific antigen [PSA] test). The men were assessed for symptoms of prostate disease by answering questions on the International Prostate Symptom Score (IPSS) test. Of the 27 participants, 21 had no or mild inflammation, 15 had biopsy-confirmed malignancies, and 2 had both inflammation and a malignancy. Each of the subjects had at least 18 teeth, and all of them showed moderate to severe gum disease. They received treatment and were tested again for periodontal disease four to eight weeks later, showing significant improvement. During the periodontal care, the men received no treatment for their prostate conditions. But even without prostate treatment, 21 of the 27 men showed decreased levels of PSA. Those with the highest levels of inflammation benefited the most from the periodontal treatment. Six participants showed no changes. Symptom scores on the IPSS test also showed improvement. The study authors write that: "Periodontal treatment improved prostate symptom score and lowered PSA value in men afflicted with chronic periodontitis."

Objective: To assess changes in voiding symptoms, serum PSA and inflammatory cytokine levels after non-surgical periodontal treatment in men with chronic periodontitis.

Patients and methods: Twenty-seven men who underwent prostate biopsy because of abnormal findings on digital rectal examination or elevated PSA (≥4 ng/ml) participated in the study. Dental plaque (PI) and gingival (GI) indices, bleeding on probing (BOP), probing depth (PD), clinical attachment level (CAL), gingival recession (GR), PSA, IPSS, IL-1β, and C-reactive protein (CRP) were determined before and after periodontal treatment. The Mann-Whitney test was used to compare PSA level at baseline with prostate inflammation, prostate malignancy, and Gleason score. The Wilcoxon rank-sum test was used to examine differences in baseline and post-periodontal-treatment values. Change in PSA level after periodontal treatment was correlated with change in the other parameters studied, using Spearman's correlation.

Malnutrition and frailty are two geriatric syndromes that significantly affect independent living and health in community-dwelling older adults. Although the pathophysiology of malnutrition and physical frailty share common pathways, it is unknown to what extent these syndromes overlap and how they relate to each other.

METHODS:

A systematic review was performed resulting in a selection of 28 studies that assessed both malnutrition and frailty in community-dwelling older adults. Furthermore, a meta-analysis was performed on 10 studies that used the Mini-Nutritional Assessment and the Fried frailty phenotype to estimate the prevalence of malnutrition within physical frailty and vice versa.

RESULTS:

In the systematic review, 25 of the 28 studies used the Mini-Nutritional Assessment (long or short form) for malnutrition screening. For frailty assessment, 23 of the 28 studies focused on the physical frailty phenotype, of which 19 followed the original Fried phenotype. Fifteen studies analyzed the association between malnutrition and frailty, which was significant in 12 of these. The meta-analysis included 10 studies with a total of 5447 older adults. In this pooled population of community-dwelling older adults [mean (standard deviation) age: 77.2 (6.7) years], 2.3% were characterized as malnourished and 19.1% as physically frail. The prevalence of malnutrition was significantly associated with the prevalence of physical frailty (P < .0001). However, the syndromes were not interchangeable: 68% of the malnourished older adults were physically frail, whereas only 8.4% of the physically frail population were malnourished.

CONCLUSIONS:

The systematic review and meta-analysis revealed that malnutrition and physical frailty in community-dwelling older adults are related, but not interchangeable, geriatric syndromes. Two out of 3 malnourished older adults were physically frail, whereas close to 10% of the physically frail older adults were identified as malnourished.

KEYWORDS:

Malnutrition; community-dwelling; frailty

The Long-term Effect of Acupuncture for Migraine Prophylaxis: A Randomized Clinical Trial.

The long-term prophylactic effects of acupuncture for migraine are uncertain.

OBJECTIVE:

To investigate the long-term effects of true acupuncture compared with sham acupuncture and being placed in a waiting-list control group for migraine prophylaxis.

DESIGN, SETTING, AND PARTICIPANTS:

This was a 24-week randomized clinical trial (4 weeks of treatment followed by 20 weeks of follow-up). Participants were randomly assigned to true acupuncture, sham acupuncture, or a waiting-list control group. The trial was conducted from October 2012 to September 2014 in outpatient settings at 3 clinical sites in China. A total of 249 participants aged 18 to 65 years with migraine without aura, based on the criteria of the International Headache Society and occurring 2 to 8 times per month, were enrolled.

INTERVENTIONS:

Participants in the true acupuncture and sham acupuncture groups received treatment 5 days per week for 4 weeks for a total of 20 sessions. Participants in the waiting-list group did not receive acupuncture but were informed that 20 sessions of acupuncture would be provided free of charge at the end of the trial.

MAIN OUTCOMES AND MEASURES:

Participants used diaries to record migraine attacks. The primary outcome was the change in the frequency of migraine attacks from baseline to week 16. Secondary outcome measures included the migraine days, average headache severity, and medication intake every 4 weeks within 24 weeks.

RESULTS:

A total of 249 participants 18 to 65 years old were enrolled, and 245 were included in the intention-to-treat analyses. One hundred eighty-nine (77.1%) were women. Baseline characteristics were comparable across the 3 groups. The mean (SD) change in frequency of migraine attacks differed significantly among the 3 groups at 16 weeks after randomization (P < .001); the mean (SD) frequency of attacks decreased in the true acupuncture group by 3.2 (2.1), in the sham acupuncture group by 2.1 (2.5), and in the waiting-list group by 1.4 (2.5); a greater reduction was observed in the true acupuncture group than in the sham acupuncture group (difference of 1.1 attacks; 95% CI, 0.4-1.9; P = .002) and than in the waiting-list group (difference of 1.8 attacks; 95% CI, 1.1-2.5; P < .001). Sham acupuncture was not statistically different from the waiting-list group (difference of 0.7 attacks; 95% CI, -0.1 to 1.4; P = .07).

CONCLUSIONS AND RELEVANCE:

Among patients with migraine without aura, true acupuncture may be associated with a long-term reduction in migraine recurrence compared with sham acupuncture or being assigned to a waiting list.

Study determines that reading books may help us live longer by as much as two years, and the more frequently you read, the better.

Read More Books to Increase Longevity

Despite the recent popularity of the Kindle and other e-readers, sales of printed books are increasing. In 2015, there were 571 million units sold in the United States, compared to 559 million the previous year. Reading books is a popular way of relaxing and escaping stressful thoughts, as well as passing the time. Reading can also preserve structural integrity in the brain as people age. Now, it is believed to have the added benefit of helping us to live longer.

Becca R. Levy, a professor of epidemiology at the Yale School of Public Health, and her colleagues analyzed data provided by the Health and Retirement Study (a nationally representative sample of American adults, 50 years of age or older). A total of 3,635 men and women were included in the study, and all self-reported their reading habits. They were followed up for approximately 12 years and their survival was monitored. Those who read books for up to 3.5 hours weekly were 17% less likely to die over the 12-year follow-up, compared to those who did not read books. Those who read for over 3.5 hours per week were 23% less likely to die. Over the course of the 12 years, the adults who read books survived almost 2 years longer than the adults who did not read.

Females were found to be most likely to read books, along with those who were college-educated, and those who had a higher income. Those who reported reading magazines and newspapers also displayed increased survival over those who did not read at all, though the difference was less significant than with the book reading. After accounting for various factors such as age, wealth, sex, self-reported health, education, and marital status, the study results remained consistent.

The mechanisms by which book reading may increase longevity were not identified; however, the team suggests that the effect may result from reading's cognitive benefits. They concluded that "These findings suggest that the benefits of reading books include a longer life in which to read them." Levy and her team published their findings in the journal Social Science & Medicine.

Although books can expose people to new people and places, whether books also have health benefits beyond other types of reading materials is not known. This study examined whether those who read books have a survival advantage over those who do not read books and over those who read other types of materials, and if so, whether cognition mediates this book reading effect. The cohort consisted of 3635 participants in the nationally representative Health and Retirement Study who provided information about their reading patterns at baseline. Cox proportional hazards models were based on survival information up to 12 years after baseline. A dose-response survival advantage was found for book reading by tertile (HRT2 = 0.83, p < 0.001, HRT3 = 0.77, p < 0.001), after adjusting for relevant covariates including age, sex, race, education, comorbidities, self-rated health, wealth, marital status, and depression. Book reading contributed to a survival advantage that was significantly greater than that observed for reading newspapers or magazines (tT2 = 90.6, p < 0.001; tT3 = 67.9, p < 0.001). Compared to non-book readers, book readers had a 23-month survival advantage at the point of 80% survival in the unadjusted model. A survival advantage persisted after adjustment for all covariates (HR = .80, p < .01), indicating book readers experienced a 20% reduction in risk of mortality over the 12 years of follow up compared to non-book readers. Cognition mediated the book reading-survival advantage (p = 0.04). These findings suggest that the benefits of reading books include a longer life in which to read them.

The ketogenic diet (KD) is a very low-carbohydrate, high-fat, adequate-protein diet that, without limiting calories, induces distinct metabolic adaptations, e.g., increased levels of circulating ketone bodies and a shift to lipid metabolism. Our objective was to assess the impact of a 6-week non-energy-restricted KD on physical performance, body composition, and blood parameters in healthy adults beyond cohorts of athletes.

METHODS:

Our single-arm, before-and-after comparison study consisted of a 6-week KD with a preceding preparation period including detailed instructions during classes and individual counselling by a dietitian. Compliance with the dietary regimen was monitored by measuring urinary ketones daily and by 7-day food records. All tests were performed after an overnight fast: cardiopulmonary exercise testing via cycle spiroergometry, blood samples, body composition, indirect calorimetry, handgrip strength, and questionnaires addressing complaints and physical sensations.

RESULTS:

Forty-two subjects aged 37 ± 12 years with a BMI of 23.9 ± 3.1 kg/m2 completed the study. Urinary ketosis was detectable on 97% of the days, revealing very good compliance with the KD. Mean energy intake during the study did not change from the habitual diet, and 71.6, 20.9, and 7.7% of total energy intake came from fat, protein, and carbohydrates, respectively. Weight loss was -2.0 ± 1.9 kg (P < 0.001), with equal losses of fat-free and fat mass. VO2peak and peak power decreased from 2.55 ± 0.68 l/min to 2.49 ± 0.69 l/min by 2.4% (P = 0.023) and from 241 ± 57 W to 231 ± 57 W by 4.1% (P < 0.001), respectively, whereas handgrip strength rose slightly from 40.1 ± 8.8 to 41.0 ± 9.1 kg by 2.5% (P = 0.047). The blood lipids TG and HDL-C remained unchanged, whereas total cholesterol and LDL-C increased significantly by 4.7 and 10.7%, respectively. Glucose, insulin, and IGF-1 dropped significantly by 3.0, 22.2, and 20.2%, respectively.

CONCLUSIONS:

We detected a mildly negative impact of this 6-week non-energy-restricted KD on physical performance (reduced endurance capacity and peak power, faster exhaustion). Our findings lead us to assume that a KD does not impact physical fitness in a clinically relevant manner that would impair activities of daily living and aerobic training. However, a KD may be a matter of concern for competitive athletes.

To investigate the association between serum potassium, mortality, and kidney outcomes in the general population and whether potassium-altering medications modify these associations.

PATIENTS AND METHODS:

We studied 15,539 adults in the Atherosclerosis Risk in Communities Study. Cox proportional hazards regression was used to investigate the association of serum potassium at baseline (1987-1989), evaluated categorically (hypokalemia, <3.5 mmol/L; normokalemia, ≥3.5 and <5.5 mmol/L; hyperkalemia, ≥5.5 mmol/L) and continuously using linear spline terms (knots at 3.5 and 5.5 mmol/L), with mortality, sudden cardiac death, incident chronic kidney disease, and end-stage renal disease. The end date of follow-up for all outcomes was December 31, 2012. We also evaluated whether classes of potassium-altering medications modified the association between serum potassium and adverse outcomes.
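
The linear spline coding described in the methods can be sketched as follows — a minimal illustration of how two knots at 3.5 and 5.5 mmol/L yield piecewise-linear potassium terms for a Cox model. The function name and sample values are illustrative, not from the study.

```python
import numpy as np

# Linear spline basis for serum potassium with knots at 3.5 and 5.5 mmol/L:
# the model gets three columns, so the fitted log-hazard can change slope
# at each knot. Variable names and data are illustrative only.
KNOTS = (3.5, 5.5)

def potassium_spline_terms(k: np.ndarray) -> np.ndarray:
    """Design-matrix columns [k, max(k - 3.5, 0), max(k - 5.5, 0)]."""
    return np.column_stack([
        k,
        np.maximum(k - KNOTS[0], 0.0),  # extra slope above the lower knot
        np.maximum(k - KNOTS[1], 0.0),  # extra slope above the upper knot
    ])

# One hypokalemic, one normal, one hyperkalemic value (mmol/L):
print(potassium_spline_terms(np.array([3.0, 4.5, 6.0])))
```

These columns would then be entered, alongside the adjustment covariates, into a Cox proportional hazards fit; the coefficients on the second and third columns capture how the potassium-mortality slope changes beyond each knot.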

RESULTS:

Overall, 413 (2.7%) of the participants had hypokalemia and 321 (2.1%) had hyperkalemia. In a fully adjusted model, hyperkalemia was significantly associated with mortality (hazard ratio, 1.24; 95% CI, 1.04-1.49) but not sudden cardiac death, chronic kidney disease, or end-stage renal disease. Hypokalemia as a categorical variable was not associated with any outcome; however, associations of hypokalemia with all-cause mortality and kidney outcomes were observed among those who were not taking potassium-wasting diuretics (all P for interaction, <.001).

CONCLUSIONS:

Higher values of serum potassium were associated with a higher risk of mortality in the general population. Lower levels of potassium were associated with adverse kidney outcomes and mortality among participants not taking potassium-wasting diuretics.

[The paper below is available as a PDF.]

The Relation of Serum Potassium Concentration with Cardiovascular Events and Mortality in Community-Living Individuals.

Hyperkalemia is associated with adverse outcomes in patients with CKD and in hospitalized patients with acute medical conditions. Little is known regarding hyperkalemia, cardiovascular disease (CVD), and mortality in community-living populations. In a pooled analysis of two large observational cohorts, we investigated associations between serum potassium concentrations and CVD events and mortality, and whether potassium-altering medications and eGFR<60 ml/min per 1.73 m2 modified these associations.

DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS:

Among 9651 individuals from the Multi-Ethnic Study of Atherosclerosis (MESA) and the Cardiovascular Health Study (CHS), who were free of CVD at baseline (2000-2002 in the MESA and 1989-1993 in the CHS), we investigated associations between serum potassium categories (<3.5, 3.5-3.9, 4.0-4.4, 4.5-4.9, and ≥5.0 mEq/L) and CVD events, mortality, and mortality subtypes (CVD versus non-CVD) using Cox proportional hazards models, adjusting for demographics, time-varying eGFR, traditional CVD risk factors, and use of potassium-altering medications.

RESULTS:

Compared with serum potassium concentrations between 4.0 and 4.4 mEq/L, those with concentrations ≥5.0 mEq/L were at higher risk for all-cause mortality (hazard ratio, 1.41; 95% confidence interval, 1.12 to 1.76), CVD death (hazard ratio, 1.50; 95% confidence interval, 1.00 to 2.26), and non-CVD death (hazard ratio, 1.40; 95% confidence interval, 1.07 to 1.83) in fully adjusted models. Associations of serum potassium with these end points differed among diuretic users (Pinteraction<0.02 for all), such that participants who had serum potassium ≥5.0 mEq/L and were concurrently using diuretics were at higher risk of each end point compared with those not using diuretics.