Wednesday, July 21, 2010

Cleaning products for use in commercial, agricultural and domestic settings could be contributing to a rise in bacterial resistance in foodborne pathogens, including Salmonella, BBSRC-funded scientists at Birmingham University have found. They recommend a decrease in the "frivolous" use of biocides, particularly in domestic products, to ensure the number of resistant bacterial strains does not increase. Improving the control of bacteria that cause food poisoning reduces losses and wastage throughout the food production pipeline, thus helping to ensure future food security.

Biocides are chemicals that kill pathogenic (disease-causing) bacteria and are therefore commonly used in hospitals, farms, food processing outlets and, increasingly, the home, to eradicate bacteria and prevent sources of possible infection. Increased use of biocides in domestic products has led to accumulation and persistence of some biocides in the environment. It is this persistence which is of interest to Dr Mark Webber and his team.

"The use of biocides in many settings is crucial but the increasing accumulation of biocides in the environment can lead to bacteria being repeatedly exposed to them in low concentrations. We have found in the laboratory that this exposure results in bacteria which adapt and become more tolerant to biocides but are also in some cases cross-resistant to antibiotics used for treatment of infection," said Dr Webber.

While many uses of biocides are essential, some could be described as frivolous. Biocide use should be restricted to where there is a clear need or benefit in order to lessen the likelihood of biocide and antibiotic resistance emerging. There are already a number of pathogenic bacteria which cannot be killed using conventional antibiotic treatment. Alternative treatment regimes are being investigated, but it is recommended that measures, such as limiting unnecessary use of biocides, are taken to ensure the number of antibiotic resistant strains remains low.

High blood pressure could be the result of the kidneys triggering a reaction in the nervous system, according to a scientific study revealing a new level of interaction between the body’s vital organs.

High blood pressure can be attributed to a disruption of blood flow to the kidneys, known as renovascular hypertension, which is caused by a narrowing or obstruction of the blood vessels that supply the kidneys. Until now, renovascular hypertension had been understood as the kidney’s reaction to this disrupted blood flow, which triggers hormone release from the kidneys, causing retention of body fluids and thereby elevating blood pressure.

However, researchers at the University of Bristol have revealed that the brain is also involved in the development of high blood pressure. The implication is that renovascular hypertension triggers messages to the brain that activate the part of the nervous system (the so-called sympathetic nervous system) which makes the heart beat harder and narrows blood vessels, causing blood pressure to rise.

The findings of the study, funded by the British Heart Foundation, are revealed in a paper published in the journal Hypertension.

Julian Paton, Professor of Physiology at Bristol, said: “This exciting study demonstrates that the kidney talks to the brain when it is starved of blood and oxygen. This conversation causes blood pressure to rise to levels sufficient to satisfy the kidney's own needs but at the cost of inducing high blood pressure throughout the body.”

High blood pressure is a major killer worldwide, with one in three people now affected; the number affected is set to rise to over 1.56 billion by 2025. Most people die from stroke, heart attacks or kidney failure when their blood pressure gets too high.

Remarkably, the researchers were able to completely prevent high blood pressure by blocking a signaling mechanism in the brainstem that causes the excessive sympathetic activity during restrictions in blood flow to the kidneys.

“If translatable to man, the results from this animal study should make clinicians think twice in their management and treatment of renovascular hypertension,” added Prof Paton.

Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, said: "This careful, clever research adds further good evidence for the idea that high blood pressure is strongly controlled by the brain. The fact that the team were able to prevent high blood pressure in rats by blocking certain brain signals raises the hope that new treatments could work in the same way.”

With carbon dioxide in the atmosphere approaching alarming levels, even halting emissions altogether may not be enough to avert catastrophic climate change. Could scrubbing carbon dioxide from the air be a viable solution? A new study by scientists at the Carnegie Institution suggests that while removing excess carbon dioxide would cool the planet, complexities of the carbon cycle would limit the effectiveness of a one-time effort. To keep carbon dioxide at low levels would require a long-term commitment spanning decades or even centuries.

Previous studies have shown that reducing carbon dioxide emissions to zero would not lead to appreciable cooling, because carbon dioxide already within the atmosphere would continue to trap heat. For cooling to occur, greenhouse gas concentrations would need to be reduced. “We wanted to see what the response would be if carbon dioxide were actively removed from the atmosphere,” says study coauthor Ken Caldeira of Carnegie’s Department of Global Ecology. “Our study is the first to look at how much carbon dioxide you would need to remove and for how long to keep atmospheric carbon dioxide concentrations low. This has obvious implications for the public and for policy makers as we weigh the costs and benefits of different ways of mitigating climate change.”

For the study, Caldeira and lead author Long Cao, also at Carnegie, did not focus on any specific method of capturing and storing carbon dioxide from the ambient air. The possibilities include approaches as diverse as industrial-scale chemical technologies and changing land use so more carbon dioxide is naturally absorbed by vegetation. Instead, the researchers used an Earth system model under projected conditions at the middle of this century, when global surface temperatures will have risen 2° C (3.6° F). They then simulated the effects of an idealized case in which carbon emissions were reduced to zero and carbon dioxide in the atmosphere was instantaneously restored to pre-industrial levels.

The researchers found that removing all human-emitted carbon dioxide from the atmosphere caused temperatures to drop, but it offset less than half of CO2-induced warming. Why would removing all the extra carbon dioxide have such a small effect? The researchers point to two primary reasons. First, slightly more than half of the carbon dioxide emitted by fossil fuels over the past two centuries has been absorbed in the oceans, rather than staying in the atmosphere. When carbon dioxide is removed from the atmosphere, it is partially replaced by gas coming out of ocean water. Second, the rapid drop in atmospheric carbon dioxide and the change in surface temperature alter the balance of the land carbon cycle, causing the emission of carbon dioxide from the soil to exceed its uptake by plants. As a result, carbon dioxide is released back into the atmosphere.
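The ocean-rebound part of this argument can be illustrated with a toy two-reservoir calculation. The partition fraction and reservoir sizes below are illustrative assumptions, not figures from the Carnegie study:

```python
# Toy two-reservoir illustration of why removing atmospheric CO2 is
# partially undone by ocean outgassing. All numbers are illustrative.

f_atm = 0.45          # assumed fraction of excess carbon that sits in the
                      # atmosphere once air and ocean re-equilibrate
excess_atm = 240.0    # Gt C of excess carbon in the atmosphere (assumed)
excess_ocean = 260.0  # Gt C of excess carbon absorbed by the ocean (assumed)

# One-time scrub: remove all excess carbon from the atmosphere.
removed = excess_atm
excess_atm = 0.0

# The ocean now holds more than its equilibrium share, so carbon
# outgasses until the atmospheric fraction is restored.
total = excess_atm + excess_ocean
rebound = f_atm * total          # excess CO2 that returns to the air

print(f"removed from air:   {removed:.0f} Gt C")
print(f"rebound from ocean: {rebound:.0f} Gt C")
print(f"net reduction:      {removed - rebound:.0f} Gt C "
      f"({(removed - rebound) / removed:.0%} of the removal)")
```

With these assumed numbers, roughly half of the one-time removal is clawed back by the ocean, which is the qualitative point the researchers make; the real model adds the land carbon cycle feedback on top of this.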

According to the simulations, for every 100 billion tons of carbon removed from the atmosphere, average global temperatures would drop 0.16° C (0.28° F).
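Taken at face value, that figure gives a rough sense of the scale involved. A back-of-envelope extrapolation, assuming purely for illustration that the response stays linear, looks like this:

```python
# Back-of-envelope scaling of the study's headline figure: roughly
# 0.16 deg C of cooling per 100 Gt of carbon removed. Linear scaling
# beyond that point is an assumption, not a result of the study.

cooling_per_100gt = 0.16   # deg C per 100 Gt C removed (from the article)

def removal_needed(target_cooling_c):
    """Gt of carbon that would need removing for a given cooling."""
    return 100.0 * target_cooling_c / cooling_per_100gt

for target in (0.5, 1.0, 2.0):
    print(f"{target:.1f} deg C of cooling needs about "
          f"{removal_needed(target):,.0f} Gt C removed")
```

Under that linear assumption, offsetting a full degree of warming would take on the order of 600 Gt of carbon removal, which helps explain why the authors describe a sustained, multi-decade commitment.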

Further simulations showed that in order to keep carbon dioxide at low levels, the process of extracting carbon dioxide from the air would have to continue for many decades, and perhaps centuries, after emissions were halted.

“If we do someday decide that we need to remove carbon dioxide from the atmosphere to avoid a climate crisis, we might find ourselves committed to carbon dioxide removal for a long, long time,” says Caldeira. “A more prudent plan might involve preventing carbon dioxide emissions now rather than trying to clean up the atmosphere later.”

A new analysis suggests that woolly mammoths and other large mammals went extinct more than 10,000 years ago because they fell victim to the same “trophic cascade” of ecosystem disruption caused today by the global decline of top predators, including wolves, cougars and sharks.

In each case, the cascade was originally set in motion by human disruption of ecosystems, the new study concludes, but around 15,000 years ago the problem was not the loss of a key predator but the addition of one – human hunters with spears.

In a study published in the journal BioScience, researchers propose that this mass extinction was caused by newly arrived humans tipping the balance of power and competing with major predators such as saber-tooth cats. An equilibrium that had survived for thousands of years was disrupted, possibly explaining the loss of two-thirds of North America’s large mammals during this period.

“For decades, scientists have been debating the causes of this mass extinction, and the two theories with the most support are hunting pressures from the arrival of humans, and climate change,” said William Ripple, a professor of forest ecosystems and society at Oregon State University, and an expert on the ecosystem alterations that scientists are increasingly finding when predators are added or removed.

“We believe humans indeed may have been a factor, but not as most current theory suggests, simply by hunting animals to extinction,” Ripple said. “Rather, we think humans provided competition for other predators that still did the bulk of the killing. But we were the triggering mechanism that disrupted the ecosystem.”

In the late Pleistocene, researchers say, major predators dominated North America in an uneasy stability with a wide range of mammals: mammoths, mastodons, ground sloths, camels, horses, and several species of bison. The new study cites previous evidence from carnivore tooth wear and fracture, growth rates of prey, and other factors that suggest that there were no serious shortages of food caused by environmental change 10,000 to 15,000 years ago.

Quite to the contrary, the large herbivores seemed to be growing quickly and just as quickly had their numbers reduced by a range of significant carnivorous predators, not the least of which were lions, dire wolves, and two species of saber-tooth cats. Food was plentiful for herbivores, the system was balanced, but it was dominated by predators.

“When human hunters arrived on the scene, they provided new competition with these carnivores for the same prey,” said Blaire Van Valkenburgh, an expert at UCLA on the paleobiology of carnivores, and a co-author with Ripple on this study.

“The humans were also omnivores, and could live on plant foods if necessary,” Van Valkenburgh said. “We think this may have triggered a sequential collapse not only in the large herbivores but ultimately their predators as well. Importantly, humans had some other defenses against predation, such as fire, weapons and living in groups, so they were able to survive.”

But the driving force in eliminating the large mammals, according to the new theory, was not humans – they just got the process started. After that, predators increasingly desperate for food may have driven their prey to extinction over long periods of time – and then eventually died out themselves.
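One way to see the logic of this competition argument is a toy Lotka-Volterra sketch in which human hunters act as an extra per-capita loss rate on the shared prey. All parameter values here are illustrative and are not drawn from the study:

```python
# Toy Lotka-Volterra sketch of the competition argument:
#
#   dH/dt = r*H - a*H*P - h*H   (prey: growth, predation, human hunting)
#   dP/dt = b*a*H*P - m*P       (predators)
#
# At equilibrium the predator density is P* = (r - h) / a, so every unit
# of human offtake h comes straight out of the predators' share of prey.

r, a, b, m = 0.8, 0.01, 0.1, 0.2   # illustrative rates, not fitted values

def predator_equilibrium(h):
    """Equilibrium predator density given human harvest rate h."""
    return max((r - h) / a, 0.0)

print("no humans:     P* =", predator_equilibrium(0.0))
print("light hunting: P* =", predator_equilibrium(0.2))
print("heavy hunting: P* =", predator_equilibrium(0.8))  # predators gone
```

In this caricature the hunters never need to do most of the killing themselves; their harvest simply lowers the prey base the native carnivores can live on, matching the paper's argument that humans were the trigger rather than the main executioner.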

In recent studies in Yellowstone National Park and elsewhere, scientists from OSU and other institutions have explored these “trophic cascades,” often caused by the loss or introduction of a single major predator in an ecosystem. With the elimination of wolves from Yellowstone, for instance, the numbers of elk exploded. This caused widespread overgrazing; damage to stream ecosystems; the slow demise of aspen forests; and ultimate effects on everything from trees to beaver, fish, birds, and other life forms. When wolves were re-introduced to Yellowstone, studies are showing that those processes have begun to reverse themselves.

“We think the evidence shows that major ecosystem disruptions, resulting in these domino effects, can be caused either by subtracting or adding a major predator,” Ripple said. “In the case of the woolly mammoths and saber-tooth cats, the problems may have begun by adding a predator, in this case, humans.”

The new analysis draws on many other existing studies in making its case.

For instance, other research describes this process playing out in modern times in Alaska. There, allowing limited human hunting of moose caused wolves to switch some of their predation to sheep, ultimately resulting in a precipitous decline in populations not only of moose but also of wolves and sheep.

The loss of species in North America during the late Pleistocene was remarkable; about 80 percent of 51 large herbivore species went extinct, along with more than 60 percent of important large carnivores. Previous research has documented the growth rates of North American mammoths by studying their tusks, revealing no evidence of reduced growth caused by inadequate food – thus offering no support for climate-induced habitat decline.

It seems that diverse and abundant carnivores kept herbivore numbers below levels where food becomes limiting. At the same time, predators such as dire wolves and saber-tooth cats were numerous enough to compete intensely with one another for food, as evidenced by heavy tooth wear.

“Heavily worn and fractured teeth are a result of bone consumption, something most carnivores avoid unless prey is difficult to acquire,” said Van Valkenburgh.

Trophic cascades initiated by humans are broadly demonstrated, the researchers report. In North America, it may have started with the arrival of the first humans, but continues today with the extirpation of wolves, cougars and other predators around the world. The hunting of whales in the last century may have led to predatory killer whales turning their attention to other prey such as seals and sea otters - and the declines in sea otter populations have led to an explosion of sea urchins and the collapse of kelp forest ecosystems.

“In the terrestrial realm, it is important that we have a better understanding of how Pleistocene ecosystems were structured as we proceed in maintaining and restoring today’s ecosystems,” the researchers wrote in their conclusion. “In the aquatic realm, the Earth’s oceans are the last frontier for megafaunal species declines and extinctions.”

“The tragic cascade of species declines due to human harvesting of marine megafauna happening now may be a repeat of the cascade that occurred with the onset of human harvesting of terrestrial megafauna more than 10,000 years ago. This is a sobering thought, but it is not too late to alter our course this time around in the interest of sustaining Earth’s ecosystems.”

A protein found in cells throughout the body must be present in a specific set of neurons in the brain to prevent weight gain after chronic feeding on high-calorie meals, new findings from UT Southwestern Medical Center researchers suggest.

Nicknamed the “longevity” protein because of its apparent role in mediating the effects of dietary restriction on life span, SIRT1 has been studied as a potential target for anti-aging drugs. Prior research also has shown that this metabolic sensor protein in peripheral tissues plays an important role in regulating metabolism, but its physiological relevance in brain neurons remained unclear.

“This is the first study to show that SIRT1 in hypothalamic neurons, specifically POMC neurons, is required for preventing diet-induced obesity and maintaining normal body weight,” said Dr. Roberto Coppari, assistant professor of internal medicine at UT Southwestern and senior author of the mouse study, available online and in the July 7 issue of Cell Metabolism.

POMC, or pro-opiomelanocortin, neurons are found in the hypothalamus region of the brain and are known to play an important role in suppressing appetite and inducing weight loss. There are about 3,000 POMC neurons in a mouse brain.

The researchers genetically engineered mice to lack SIRT1 only in these specific hypothalamic neurons. They found that when fed a high-calorie diet, the mice lacking SIRT1 in POMC neurons gained more weight and were generally more susceptible to diet-induced obesity than those with the metabolic sensor protein intact.

The mutant mice also had almost twice as much abdominal fat and more of the hormone leptin than those mice with their SIRT1 intact, despite the fact that all the mice maintained the same food intake and movement levels.

“We found that SIRT1 must be present in POMC neurons in order for the hormone leptin to properly engage its receptors in these neurons. Without SIRT1, leptin sensing is altered and the animals gain more fat mass when fed a high-calorie diet,” Dr. Coppari said.

In addition, the researchers found that SIRT1 must be present in POMC neurons for leptin to stimulate the remodeling of white adipose tissue into brown fat tissue, which “burns” fat to generate heat. White adipocytes primarily store fat.

“When SIRT1 is present in POMC neurons, the neurons properly convey a signal from leptin to the white perigonadal fat, which is designed to store energy. This signal is needed for the fat to undergo a remodeling process and expand the brown fat cells component as a protective measure against obesity,” Dr. Coppari said. “If you don’t have these kinds of defense mechanisms, you likely become hypersensitive to diet-induced obesity. A primary defect in SIRT1 in POMC neurons might be present in some individuals who are more prone to develop obesity when constantly exposed to an abundance of high-fat, high-calorie foods.”

Dr. Coppari said the idea of a drug that could selectively target neurons controlling specific fat depots – and that could trigger the remodeling of white fat into brown fat – has high potential.

“The drawback to harnessing adrenergic receptors to make more brown adipocytes, as a lot of people are thinking about doing, is that it puts a lot of pressure on the cardiovascular system,” he said. “However, the idea of having a drug that could selectively affect specific hypothalamic neurons that then control specific branches of the sympathetic nervous system suggests that one could avoid acting on unwanted cells and act selectively on those able to burn calories, such as brown adipocytes.

“We could control the remodeling of a particular fat depot into brown, which would then be more likely to cause weight loss without increasing the risk of cardiovascular problems,” he said.

The next step, Dr. Coppari said, is to determine whether SIRT1 is mediating other signaling pathways in the brain that in addition to regulating body weight are key for normal glucose balance.

Scientists have designed a nanoparticle that appears to effectively deliver genetic material into cells with minimal toxic effects.

In lab experiments, the researchers have found that this device, a vector, is able to deliver DNA deeply enough into a cell to allow genetic material to be activated – a critical step in gene therapy. This vector is between 2 ½ and 10 times more effective than other experimental materials, according to the research.

Biomedical researchers continue to pursue gene therapy as a treatment option for a variety of diseases known to be caused by a genetic defect. That pursuit includes efforts to ensure the safety of the therapy and find the most effective way to deliver the genes.

In many experiments, deactivated viruses that retain their ability to infect other cells are used as vectors to deliver normal genes intended to replace, or turn off, defective genes. But because some of the viruses can generate an immune response that complicates the treatment, scientists also are pursuing nonviral vector techniques for gene therapy.

In this case, Ohio State University scientists combined two ingredients – calcium phosphate and a lipid shell – to create a nanoparticle that protects DNA during its journey to the cell and then dissolves to allow for gene activation in the target cell. Nano refers to the tiny size of the particle in question – its general structure can be detected only by an atomic force microscope.

Calcium phosphate is a mineral found in bones and teeth. Lipids are fatty molecules that help maintain the structure of cell membranes. Alone, calcium phosphate is toxic and lipids get absorbed by cells. Together, they form a protective and inflexible structure that, thanks to complex chemical reactions, self-destructs once inside a cell.

“Our nanoparticle is a foreign body just like a viral vector is, but it has a self-destructive mechanism so it does not generate a strong response from the immune system,” said Chenguang Zhou, a graduate student in pharmaceutics at Ohio State and lead author of the study. “The material we use is also biocompatible. Calcium phosphate is in our bones and the lipids we use are synthetic, but can be biologically degraded. That’s why there is low toxicity.”

The research is published in a recent issue of the International Journal of Pharmaceutics.

Zhou noted that other researchers have tried to use liposomes – nanometer-sized bubbles made out of the same material as a cell membrane – to create nonviral vectors for gene delivery. While the material did a good job of protecting the DNA, it did not do a good job of releasing the gene into a cell.

“The liposome gets internalized into cells. It’s sort of like eating food that gets stuck in the stomach or intestines, but never gets to the rest of the body,” he said.

Similarly, calcium phosphate alone has been considered as a gene delivery vehicle. But because of its properties as a salt, it becomes unstable and grows in size, which makes it too big to penetrate some cell and vascular walls, and which can cause the immune system to reject it.

“So what we do is encapsulate a calcium phosphate core inside the liposome,” Zhou said. “And when this calcium phosphate gets inside a cell and that environment becomes acidic, it gets dissolved and then the gene can be very effectively released into the cytoplasm and transported to the nucleus. That is the theory.”

Zhou and colleagues have developed what they consider an easy method to manufacture this particle. They create a synthetic lipid and place it in a solution that contains calcium and phosphate, which become integrated with the lipid. As the acidity of the solution changes, the calcium phosphate forms a core.

The scientists then mix a solution containing plasmid DNA with their newly formed particle, and all the materials become bundled together. Plasmid DNA is a circular DNA molecule that is able to turn on gene activity that starts a protein-building process without altering an entire genome, or the complete hereditary information of an organism.

Because this particular vector is intended for injection into the bloodstream as a cancer treatment, the particle is designed to protect the DNA from being digested by enzymes as it travels to target cells. A test that exposed this solution to serum, a component of blood, showed that the hybrid particle provided this protection, while unprotected DNA was digested by enzymes.

The researchers next applied this DNA-infused particle solution to mouse cells. The DNA contained the gene code for green fluorescent protein that would be turned on only after it entered the cell. They observed that the particle penetrated the cell membrane and, after a series of interactions occurred, the green fluorescent protein lit up inside the cell, indicating the DNA had reached its target.

For comparison, the group also monitored DNA movement on its own and in other types of vectors. Their hybrid vector was 24 times more effective at delivering genetic material to the cell than was DNA on its own, and 10 times more effective than calcium phosphate preparations.

“We know the particle gets to where it needs to go and what happens to the particle,” Zhou said. “Do we know that the DNA reaches the nucleus? That is something we still need to find out. But because we saw the green fluorescent protein expressed, we think it got to the nucleus or at least as far as the cytoplasm. What’s important is that the protein got inside the cell.”

The study also showed that this hybrid particle maintained its structure for at least 21 days and, when compared with a variety of other potential vector substances, did very little damage to cells, meaning it is not as toxic as most other materials.

With viral vectors, gene therapy is considered a one-time treatment because when the virus carrying new genes infects a cell, that interaction changes the recipient’s entire genome, effectively canceling the activity of the defective gene. Zhou said that with this nonviral vector, treatment would be designed as an intravenous injection on a regular basis until cells are “infected enough to make a change.”

The researchers next plan to test the particle’s ability to travel through the bloodstream and enter target cells in animals.