Unseen Is Free

Tuesday, March 3, 2015

Violent collisions between the growing Earth and other objects in the solar system generated significant amounts of iron vapor, according to a new study by LLNL scientist Richard Kraus and colleagues.

The results show that iron vaporizes easily during impact events, which forces planetary scientists to change how they think about the growth of planets and evolution of our solar system.

This artist’s illustration shows a planetary scale impact on the Moon.

Illustration by W.K. Hartmann

For planetary scientists, one of the most important and complex research areas is predicting how planets form and evolve to their current state. Generally speaking, planets form by a series of impacts, with the speed of the impacts being slow at first, a few miles per hour, but then faster as the planets grow larger, up to 100,000 miles per hour.

At the end stages of formation, when the impact speeds are high and the material conditions are extreme (high temperatures and pressures), planetary scientists don't have great models for how to describe what happens to the colliding bodies.

"One major problem is how we model iron during impact events, as it is a major component of planets and its behavior is critical to how we understand planet formation," Kraus said. "In particular, it is the fraction of that iron that is vaporized on impact that is not well understood."

The Sandia Z Machine was used to develop a new shock-wave technique to measure an important material property.

Photo by Randy Montoya.

Using Sandia National Laboratories' Z machine, the team developed a new shock-wave technique to measure an important material property -- the entropy gain during shock compression. By measuring the entropy, they determined the critical impact conditions needed to vaporize the iron within objects that collide with the growing Earth.

The scientists found that iron will vaporize at significantly lower impact speeds than previously thought. This translates to more iron being vaporized during Earth's period of formation.
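
As a rough, illustrative energy check (not the method of the study, which measured entropy directly), one can compare the kinetic energy an impact delivers per kilogram with the energy needed to heat, melt and boil iron. The ~6.3 MJ/kg figure and the conversion efficiency below are assumed, textbook-scale values; real thresholds depend on how efficiently shock compression converts impact energy into heat.

```python
# Back-of-envelope check (illustrative only): compare the specific kinetic
# energy of an impact with the energy needed to vaporize iron.
E_VAPORIZE_IRON = 6.3e6  # J/kg, assumed energy to heat, melt and boil iron

def min_impact_speed(energy_per_kg, efficiency=1.0):
    """Impact speed (m/s) at which kinetic energy per kg (v^2 / 2)
    matches the vaporization energy, for a given conversion efficiency."""
    return (2.0 * energy_per_kg / efficiency) ** 0.5

v_ideal = min_impact_speed(E_VAPORIZE_IRON)       # perfect conversion
v_real = min_impact_speed(E_VAPORIZE_IRON, 0.1)   # assumed 10% efficiency
print(f"{v_ideal/1000:.1f} km/s (ideal), {v_real/1000:.1f} km/s (10% efficient)")
```

Even this crude estimate shows why the threshold speed is so sensitive to how efficiently shock heating deposits energy, which is exactly what the entropy measurement pins down.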

"This causes a shift in how we think about processes like the formation of Earth's iron core," Kraus said. "Rather than the iron in the colliding objects sinking down directly to the Earth's growing core, the iron is vaporized and spread over the surface within a vapor plume. After cooling, the vapor would have condensed into an iron rain that mixed into the Earth's still-molten mantle.

"The timing of Earth's core formation can only be determined via chemical signatures in Earth's mantle, a technique that requires assumptions about how well the iron is mixed. This new information actually changes our estimates for the timing of when Earth's core was formed," Kraus added.

Dust plays an extremely important role in the universe - both in the formation of planets and new stars. But dust was not there from the beginning and the earliest galaxies had no dust, only gas.

The Atacama Large Millimeter/submillimeter Array (ALMA) in northern Chile consists of 66 radio telescopes that collectively observe the sky in the millimeter and submillimeter wavelength range. By observing the distant galaxy with the ALMA telescopes, scientists could detect the far-infrared light emitted by its dust, redshifted into ALMA's observing band.

Credit: ALMA/ESO

Now an international team of astronomers, led by researchers from the Niels Bohr Institute, has discovered a dust-filled galaxy from the very early universe. The discovery demonstrates that galaxies were very quickly enriched with dust particles containing elements such as carbon and oxygen, which could form planets. The results are published in the scientific journal, Nature.

X-shooter is the world's most sensitive instrument of its kind to observe the Universe from Earth.

Cosmic dust consists of smoke-like particles made up of either carbon (fine soot) or silicates (fine sand). The dust consists primarily of elements such as carbon, silicon, magnesium, iron and oxygen. These elements are synthesised by nuclear fusion in stars and driven out into space when the star dies and explodes. In space, they gather in clouds of dust and gas, which form new stars, and with each generation of new stars, more elements are formed. This is a slow process, and in the very earliest galaxies in the history of the universe, dust had not yet formed.

Our cosmic roots: By scrutinising images from the Hubble Ultra Deep Field – the deepest images of the sky ever made – researchers have gained fascinating new insights into the distant universe.

But now a team of researchers have discovered a very distant galaxy that contains a large amount of dust, changing astronomers' previous calculations of how quickly the dust was formed.

"It is the first time dust has been discovered in one of the most distant galaxies ever observed - only 700 million years after the Big Bang. It is a galaxy of modest size and yet it is already full of dust. This is very surprising and it tells us that ordinary galaxies were enriched with heavier elements far faster than expected," explains Darach Watson, an astrophysicist with the Dark Cosmology Centre at the Niels Bohr Institute at the University of Copenhagen.

Darach Watson led the project, with Lise Christensen from the Dark Cosmology Centre and researchers from Sweden, Scotland, France and Italy.

Lucky location

Because the galaxy is very distant and therefore incredibly faint, it would not usually be detectable from Earth. But a fortunate circumstance means the light from it has been amplified: a large cluster of galaxies called Abell 1689 lies between the galaxy and Earth. The light is bent by the gravity of the galaxy cluster, which magnifies the distant galaxy. The phenomenon is called gravitational lensing and it works like a magnifying glass.

This is an electron microscope image of dust particles from interstellar space. The dust particles are typically about 100 nanometers across (one nanometer is one millionth of a millimeter) and are made up of either carbon (soot) or silicates (fine sand). Credit: A. Takigawa/R. Stroud/L. Nittler

"We looked for the most distant galaxies in the universe. Based on the colours of the light observed with the Hubble Space Telescope, we can see which galaxies could be very distant. Using observations from the very sensitive X-shooter spectrograph on the Very Large Telescope (VLT) in Chile, we measured the galaxy's spectrum and from that calculated its redshift, i.e. the change in the light's wavelength as the object recedes from us. From the redshift we can calculate the galaxy's distance from us, and it turned out to be, as we suspected, one of the most distant galaxies we know of to date," explains Lise Christensen, an astrophysicist at the Dark Cosmology Centre at the Niels Bohr Institute.
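
The redshift arithmetic Christensen describes can be sketched in a few lines. The Lyman-alpha rest wavelength is a real value; the observed wavelength below is a hypothetical number chosen to give z ≈ 7.5, roughly the regime of a galaxy seen about 700 million years after the Big Bang.

```python
# Redshift from the stretching of a known spectral line:
# z = (observed wavelength - rest wavelength) / rest wavelength
LYMAN_ALPHA_REST_NM = 121.567  # real rest wavelength of the Lyman-alpha line

def redshift(observed_nm, rest_nm=LYMAN_ALPHA_REST_NM):
    """Fractional stretching of the wavelength caused by cosmic expansion."""
    return observed_nm / rest_nm - 1.0

z = redshift(1033.3)  # hypothetical observed wavelength in nm
print(f"z = {z:.2f}")
```

Turning z into a distance additionally requires a cosmological model, which is why the quote speaks of calculating the distance from the redshift rather than reading it off directly.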

Early planet formation

Darach Watson explains that the team then studied the galaxy with the ALMA telescopes, which can observe far-infrared wavelengths - and that is when it became really interesting, because now they could see that the galaxy was full of dust. Young stars in early galaxies emit hot ultraviolet light, which heats the surrounding ice-cold dust; the heated dust in turn emits light in the far-infrared.

"It is this far-infrared light, which tells us that there is dust in the galaxy. It is very surprising and it is the first time that dust has been found in such an early galaxy. The process of star formation must therefore have started very early in the history of the universe and be associated with the formation of dust. The detection of large amounts of solid material shows that the galaxy was enriched very early with solids which are a prerequisite for the formation of complex molecules and planets," explains Darach Watson.

This image, taken by the Hubble Space Telescope, shows a part of the galaxy cluster Abell 1689, whose gravitational field amplifies the distant galaxy behind it. The distant dust-filled galaxy is shown magnified in the box. Credit: Hubble Space Telescope

Now the researchers hope that future observations of a large number of distant galaxies using the ALMA telescopes could help unravel how frequently such evolved galaxies occur in this very early epoch of the history of the universe.

Quantum mechanics tells us that light can behave both as a particle and as a wave. However, there has never been an experiment able to capture both natures of light at the same time; the closest we have come is seeing either the wave or the particle behaviour, but always at different times. Taking a radically different experimental approach, EPFL scientists have now been able to take the first-ever snapshot of light behaving both as a wave and as a particle. The breakthrough work is published in Nature Communications.

When UV light hits a metal surface, it causes an emission of electrons. Albert Einstein explained this "photoelectric" effect by proposing that light - thought to only be a wave - is also a stream of particles. Even though a variety of experiments have successfully observed both the particle- and wave-like behaviors of light, they have never been able to observe both at the same time.

A new approach on a classic effect

A research team led by Fabrizio Carbone at EPFL has now carried out an experiment with a clever twist: using electrons to image light. The researchers have captured, for the first time ever, a single snapshot of light behaving simultaneously as both a wave and a stream of particles.

The experiment is set up like this: A pulse of laser light is fired at a tiny metallic nanowire. The laser adds energy to the charged particles in the nanowire, causing them to vibrate. Light travels along this tiny wire in two possible directions, like cars on a highway. When waves traveling in opposite directions meet each other they form a new wave that looks like it is standing in place. Here, this standing wave becomes the source of light for the experiment, radiating around the nanowire.

This is where the experiment's trick comes in: The scientists shot a stream of electrons close to the nanowire, using them to image the standing wave of light. As the electrons interacted with the confined light on the nanowire, they either sped up or slowed down. Using the ultrafast microscope to image the position where this change in speed occurred, Carbone's team could now visualize the standing wave, which acts as a fingerprint of the wave-nature of light.

While this phenomenon shows the wave-like nature of light, it simultaneously demonstrated its particle aspect as well. As the electrons pass close to the standing wave of light, they "hit" the light's particles, the photons. As mentioned above, this affects their speed, making them move faster or slower. This change in speed appears as an exchange of energy "packets" (quanta) between electrons and photons. The very occurrence of these energy packets shows that the light on the nanowire behaves as a particle.
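
The quantised exchange described here can be illustrated numerically: an electron's energy shifts only by integer multiples of the photon energy E = hc/λ. The physical constants below are exact; the 800 nm wavelength is an assumed, typical laser value, not a parameter taken from the EPFL experiment.

```python
# Energy "packets" (quanta) exchanged between electrons and photons.
H = 6.62607015e-34    # Planck constant, J*s (exact)
C = 2.99792458e8      # speed of light, m/s (exact)
EV = 1.602176634e-19  # joules per electronvolt (exact)

def photon_energy_ev(wavelength_m):
    """Energy of one photon, E = h*c/lambda, in electronvolts."""
    return H * C / wavelength_m / EV

e_photon = photon_energy_ev(800e-9)  # ~1.55 eV for an assumed 800 nm photon
# An electron that absorbs or emits n photons shifts by n * e_photon:
shifts = [n * e_photon for n in range(-2, 3)]
print(f"photon energy ~ {e_photon:.2f} eV, allowed shifts (eV): {shifts}")
```

It is precisely this discreteness in the electrons' energy spectrum that reveals the particle nature of the light on the nanowire.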

"This experiment demonstrates that, for the first time ever, we can film quantum mechanics - and its paradoxical nature - directly," says Fabrizio Carbone. In addition, the importance of this pioneering work can extend beyond fundamental science and to future technologies. As Carbone explains: "Being able to image and control quantum phenomena at the nanometer scale like this opens up a new route towards quantum computing."

This work represents a collaboration between the Laboratory for Ultrafast Microscopy and Electron Scattering of EPFL, the Department of Physics of Trinity College (US) and the Physical and Life Sciences Directorate of the Lawrence Livermore National Laboratory. The imaging was carried out on EPFL's ultrafast energy-filtered transmission electron microscope - one of only two such instruments in the world.

Monday, March 2, 2015

Chemists at Bielefeld University have developed a molecule containing copper that binds specifically with DNA and prevents the spread of cancer. First results show that it kills the cancer cells more quickly than cisplatin – a widely used anti-cancer drug that is frequently administered in chemotherapy. When developing the anti-tumour agent, Professor Dr. Thorsten Glaser and his team cooperated with biochemists and physicists. The design of the new agent is basic research. ‘How and whether the copper complex will actually be given to cancer patients is something that medical research will have to determine in the years to come,’ says the chemist.

The new agent containing copper (above) ‘docks’ precisely with the DNA molecule (below) of a cancer cell and stops it from growing. As a result, the cancer cell dies.

Photo: Bielefeld University

Ever since the end of the 1970s, doctors have been using cisplatin to treat cancer. For lung cancer and testicular cancer, the drug promotes healing; however, it does not work for all types of cancer. Cisplatin is also one of the anti-cancer drugs that most frequently induce nausea, vomiting, and diarrhoea. ‘Therefore we wanted to develop an alternative agent that would work differently, have fewer side effects, and treat other types of cancer as well,’ says Thorsten Glaser, Professor of Inorganic Chemistry at Bielefeld University. ‘In addition, we wanted an agent that would treat cancers that have become immune to cisplatin through its use in earlier treatments.’ Glaser and his team are using methods from chemistry to produce new molecules that are not found in nature, and to equip these with specific properties.

Cisplatin attacks the DNA of cancer cells. DNA is composed of nucleobases, phosphates, and sugar. Whereas cisplatin binds with the nucleobases, the new molecule developed by the researchers attacks the phosphate in the DNA. ‘We did this by integrating two metal ions of copper in our molecule that preferentially bind with phosphates.’ As soon as the ions bind with the phosphate, the DNA of the cancer cell changes. This disrupts the cellular processes, prevents the cell from reproducing, and leads to the destruction of the pathological cell.

‘Just as a key only works in one specific lock, our molecule only fits the phosphates and blocks them,’ says Glaser. Two copper ions protrude from the new molecule, a bit like the ends of a horseshoe. The gap between the two ends of the horseshoe corresponds exactly to the spacing between the phosphates in the DNA, so that the two can dock together in a perfect fit. ‘Because two phosphates bind simultaneously, the binding strength is greater. And that increases the efficacy.’

‘Much of the research on anti-cancer drugs concentrates on variants of cisplatin. Our copper complex, in contrast, is a completely new agent,’ says Prof. Dr. Thorsten Glaser.

Photo: Bielefeld University

The scientists at Bielefeld University have developed a procedure for manufacturing the new molecule. They have proved that their copper agent can bind with DNA and change it. And they have studied whether and how well their agent prevents the replication of the DNA and thereby the proliferation of the cells. The replication of the genome in cells proceeds in a similar way to a polymerase chain reaction (PCR). The researchers have confirmed that the copper complex stops this chain reaction.

Finally, the scientists applied the agent to cancer cells. They administered the substance to a cell culture with cancer cells. The result was that ‘the copper complex is more effective than cisplatin,’ says Glaser. ‘The highest number of cancer cells died at a concentration of 10 micromolar. With cisplatin, you need 20 micromolar.’

When carrying out the research on the new agent, Professor Glaser and his team cooperated with the research teams of Professor Dr. Dario Anselmetti (Biophysics and nanoscience) and Professor Dr. Gabriele Fischer von Mollard (Biochemistry) – both also at Bielefeld University. Dario Anselmetti’s colleagues used atomic force microscopy to produce the images confirming that the copper complex binds with the DNA. Gabriele Fischer von Mollard’s team tested how the cancer cell culture responded to the agent.

The three research groups are cooperating within Collaborative Research Centre (CRC) 613 ‘Physics of Single-Molecule Processes and of Molecular Recognition in Organic Systems’.

Modern astronomy began with a supernova. In November 1572, Danish astronomer Tycho Brahe discovered a new star – and destroyed the idea of a sky of fixed stars. Today, we know that Brahe was observing the death of a star, which ended in a massive explosion. Friedrich Röpke aims to find out how these supernova explosions proceed.

Three-dimensional simulation of a Type Ia supernova explosion

Image: F. K. Röpke, MPI for Astrophysics, Garching

The astrophysicist is now leader of the new research group "Physics of Stellar Objects" (PSO) at the Heidelberg Institute for Theoretical Studies (HITS). On March 1, 2015, he was also appointed professor of Theoretical Astrophysics at Heidelberg University; his workplace remains HITS. The joint appointment is proof of the close cooperation between the two institutions. With Friedrich Röpke and Volker Springel, there are now two HITS astrophysicists who are also professors at Heidelberg University.

“The new group is another important component of our concept,” says Klaus Tschira, who founded HITS in 2010 as a non-profit research institute. “Research on stellar astrophysics, such as Friedrich Röpke conducts, is a perfect complement to the work of Volker Springel’s group on large-scale processes like galaxy formation.”

Friedrich Röpke (40) studied Physics at the University of Jena and the University of Virginia, Charlottesville/USA, and received his PhD in 2003 from the Technische Universität München. In the following years, he worked as a postdoc at the Max-Planck-Institute for Astrophysics (MPA) in Garching and at the University of California, Santa Cruz/USA.

In 2008, Friedrich Röpke habilitated at the TU München and became leader of an Emmy Noether research group at MPA. Three years later, he was appointed professor of Astrophysics at the University of Würzburg. In 2010, the researcher was awarded the "ARCHES Award" by the German Federal Ministry for Education and Research, together with Prof. Avishay Gal-Yam from the Weizmann Institute, Rehovot/Israel. The award honors young scientists whose work shows great potential to have noticeable impact on their respective fields of research.

Friedrich Röpke studies Type Ia supernovae. Observation of these cosmic explosions allows astronomers to determine distances in space. In 2011, the Nobel Prize in Physics was awarded to researchers who proved the accelerated expansion of the Universe with supernovae. The PSO group collaborates closely with one of the laureates from 2011, Brian Schmidt (Australian National University, Canberra) in a program supported by the German Academic Exchange Service DAAD. Friedrich Röpke’s research aims to understand exactly what happens when stars die.

Remnant of SN 1572 as seen in X-ray light from the Chandra X-ray Observatory. The supernova of 1572 is often called "Tycho's supernova", because of Tycho Brahe's extensive work De nova et nullius aevi memoria prius visa stella ("Concerning the Star, new and never before seen in the life or memory of anyone", published in 1573 with reprints overseen by Johannes Kepler in 1602, and 1610), a work containing both Tycho Brahe's own observations and the analysis of sightings from many other observers.

Credit: Chandra X-ray Observatory.

Together with other scientists, he used computer simulations to show that some highly luminous supernovae are the result of two compact stars, so-called "white dwarfs", merging together. He also investigates alternatives by modeling the explosion of a white dwarf when it reaches its maximum stable mass (the so-called Chandrasekhar limit), using highly complex simulations on supercomputers. White dwarfs are only about the size of the Earth and are extremely dense. When they explode as supernovae, they shine brighter than an entire galaxy. "Our detailed simulations helped us to predict data that closely reproduce actual telescope observations of Type Ia supernovae," explains the astrophysicist.

“Modelling of supernova explosions is, however, just one part of our research at HITS,” says Friedrich Röpke. “We also strive for a better understanding of how stars evolve and how the elements that make up our world are formed within them.” Classical astrophysics follows stellar evolution based on very simplifying assumptions. "To improve the predictive power of the models, we have to describe the physical processes taking place within stars in a dynamic way," says the astrophysicist. He and his group have developed a new computer code that – combined with the rapidly increasing capacities of supercomputers – opens new perspectives for the modelling of stars.

In contrast to what we are used to from our solar system, most stars in the Universe exist as part of multiple star systems. The interaction between those stars greatly affects their evolution, but the physical processes involved are still poorly understood. The two astrophysics groups at HITS are cooperating on new computer simulations to bring some light into the darkness.

An international team of researchers has demonstrated a way to assess the quality of water on Earth from space by using satellite technology that can visualise pollution levels otherwise invisible to the human eye through ‘Superhero vision’.

The research team from the University of Leicester, the Hungarian Academy of Sciences and industrial partners has used the MEdium Resolution Imaging Spectrometer (MERIS) instrument hosted on the ENVISAT satellite to measure pollution levels in lakes on Earth. This ‘Superhero vision’ allows the satellite to see wavelengths invisible to the human eye, which only sees red, green and blue light.

Satellite image mapping of Lake Balaton

Credit: Viktor Tóth

While these methods have previously been used for seas and oceans, they are not readily available for lakes, especially shallow lakes with complex optical environments defined by a mix of different natural substances in the water.

University of Leicester PhD student Stephanie Palmer, who worked in Hungary at the Balaton Limnological Institute of the Hungarian Academy of Sciences for three years, analysed ten years of data from the European ENVISAT satellite in the search for specific chronological sequences describing algal blooms.

Lake Balaton in Hungary is a popular tourist area especially vulnerable to environmental and meteorological changes that could result in the build-up of algae.

Professor Heiko Balzter, Director of the Leicester Centre for Landscape and Climate Research in the University of Leicester’s Department of Geography and co-author of the study, said: “Lake Balaton is incredibly important for the Hungarian tourism sector, since up to 1 million tourists visit it every year. It is also home to a large diversity of fish and other species. Taking water quality samples from a ship is not only a logistic nightmare; the data collection costs a lot of money and only provides point measurements, making estimates for the lake as a whole very speculative.

“The frequent satellite data of MERIS adds a synoptic observation to the study of algal blooms and how they develop.

“Leonardo da Vinci said that ‘Water is the driving force of all nature’. There is a lot of truth in that. Clean water is the basis of human life, and while we are used to turning the tap on and having unlimited access to clean drinking water here in the UK, this is not the case for everyone in the world.”

When algae grow in lakes they contain the green pigment chlorophyll-a, which is measured at the Balaton Limnological Institute at Lake Balaton using a research ship.

During the research, which was supported by the European Commission, Marie Curie Programme and the GIONET project (European Centre of Excellence in Earth Observation Research Training, grant number PITN-GA-2010-264509), over 1,000 satellite images were processed at the satellite data analysis centre of Airbus Defence and Space based in Farnborough. They were turned into maps of the green chlorophyll concentration in the water, and their quality checked with over 250 ship-based measurements taken over five years.
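
A quality check of this kind typically compares satellite retrievals against coincident ship measurements with a simple error statistic. The sketch below uses invented sample values purely for illustration; it is not data from the study, which used over 250 in-situ measurements.

```python
# Illustrative satellite-vs-ship match-up comparison for chlorophyll-a.
import math

satellite = [5.2, 12.1, 30.5, 8.7, 18.9]  # mg/m^3, hypothetical retrievals
in_situ   = [4.8, 13.0, 28.0, 9.5, 20.1]  # mg/m^3, hypothetical ship data

def rmse(a, b):
    """Root-mean-square error between paired measurements."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

print(f"RMSE = {rmse(satellite, in_situ):.2f} mg/m^3")
```

A small RMSE relative to the observed chlorophyll range is what justifies using the satellite maps in place of sparse, expensive point sampling.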

Viktor Tóth, research fellow of the Balaton Limnological Institute, added: “It is incredible that such ecological differences occur within one given lake and this makes Lake Balaton so special and suitable for remote sensing projects. The work of Stephanie Palmer has proved that with proper calibration remote sensing is a valuable tool not only for water quality assessment but also for algologists.”

The study, ‘Validation of Envisat MERIS algorithms for chlorophyll retrieval in a large, turbid and optically-complex shallow lake’, is published in the February edition of the top-rated remote sensing journal Remote Sensing of Environment (Volume 157, pages 158–169). It is open access.

A second paper, ‘Satellite remote sensing of phytoplankton phenology in Lake Balaton using 10 years of MERIS observations’, outlines a satellite remote sensing approach to retrieving and mapping freshwater phytoplankton phenology in application to Lake Balaton, Hungary.

A major challenge facing full electric vehicles is the limited range between charges. An EU-backed project has designed a new-generation battery that can potentially power a car for up to 500 km, instead of the current 150 km, before needing to be recharged. This breakthrough could encourage more people to buy electric vehicles - good for the environment and Europe's competitiveness.

Credit: Robert Hoetink - Fotolia

Lithium-ion batteries are all around us. They are popular in portable devices, such as smart phones, laptops and cameras. They are also common in electric vehicles and can even be found in aerospace applications. The reason for this proliferation is that lithium-ion batteries have a high energy density, a slow loss of charge and no “memory effect” – the gradual capacity loss some battery chemistries suffer when they are repeatedly recharged after only partial discharge.

However, when it comes to electric vehicles that are powered only by a battery, the energy density of lithium-ion batteries still falls short. Such full electric vehicles currently have a top range of just 150 km before they need to be recharged.

“At present, the best technologies are lithium-ion batteries, but they are somewhat limited. The energy content can be increased by up to 50% but not much more than that,” explains Stefano Passerini, a professor specialising in electrochemical energy storage at Germany’s Helmholtz-Institute Ulm. He says new battery technologies and chemistries, known as next-generation batteries, are needed to make electric cars more viable.
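
The arithmetic behind Passerini's point can be made explicit. Assuming range scales roughly linearly with stored energy (a simplification that ignores weight and efficiency effects), the quoted 50% ceiling on lithium-ion improvement falls well short of the 500 km target:

```python
# Range scaling under the assumption range ∝ stored energy.
CURRENT_RANGE_KM = 150.0  # today's full-electric range, from the article

def range_with_energy_gain(gain_fraction, base_km=CURRENT_RANGE_KM):
    """Estimated range if battery energy content grows by gain_fraction."""
    return base_km * (1.0 + gain_fraction)

best_li_ion = range_with_energy_gain(0.5)     # 50% more energy: still only 225 km
needed_gain = 500.0 / CURRENT_RANGE_KM - 1.0  # energy gain needed for 500 km
print(f"best Li-ion: {best_li_ion:.0f} km; gain needed for 500 km: {needed_gain:.0%}")
```

Roughly tripling the energy content is what motivates moving to an entirely different chemistry rather than refining lithium-ion cells.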

And that was precisely the focus of the EU-funded LABOHR project, which Passerini coordinated when he was with Germany’s Westfälische Wilhelms-University Münster. The project completed its work in March 2014.

The next-generation technology in question is the lithium-air (Li-air) battery, which is environmentally safe and requires no fossil fuels, says Passerini. Although the concept was originally proposed in the 1970s, materials technology at the time had not advanced enough to design and build anything remotely on the scale required to power a vehicle.

But the past few years have brought renewed interest as electric cars, buses, motorcycles and other forms of transport have finally begun to come into their own and researchers have been keen to find ways to overcome their limitations.

“There has been quite a lot of work on lithium-air batteries in recent years, but it has been focused on the fundamental science or has been limited to very small cells,” notes Elie Paillard, a senior researcher in Passerini’s group who is working on the technical and scientific challenges of lithium-air batteries within LABOHR.

Driven by powerful ambition

Even though no workable blueprint existed for a Li-air battery to power a vehicle, LABOHR set out to design a prototype for a battery that could not only propel a vehicle but radically increase its range.

The concept uses environmentally benign ionic liquid electrolytes and nano-structured electrodes. These harvest dry oxygen from the air during discharge and return the oxygen to the atmosphere when the battery is recharging. This design helps to avoid cathode clogging, a common problem with conventional batteries.

LABOHR focused on both design aspects and fundamental research. The project investigated the possibility of scaling up Li-air technology into a battery-pack for electric vehicles. It studied key technological issues, such as the stabilisation of the lithium-metal electrode and the development of porous carbons and catalysts for the air electrode.

“We came up with a design for a large battery system for cars and we also proved that the principle works on a large scale, but we don’t have a prototype yet,” says Passerini.

He adds: “If we can close the gap between the engineering and the chemistry, it will be possible to make a mid-size car like the Volkswagen Golf travel 500 km with one charge.”

With the project now complete, the former project partners plan to work on getting to the prototype stage. Passerini notes that it would take about a decade before such a battery could be put into production.

Describing it as the “Holy Grail” for the automotive sector, Passerini says that interest from European industry in this technology is enormous. In addition to Volkswagen, which was involved in LABOHR, BMW is also very interested and is already financing related work by the team.

LABOHR’s work not only contributes to the EU’s environmental, energy-efficiency and transportation objectives; it can also help to advance the EU’s renewable energy policy goals.

“This kind of battery can also be used to store renewable energy, such as that generated by wind turbines,” says Passerini. “But if we can be successful with electrical vehicles, then stationary applications like this will follow, as they are simpler systems.”

Houses of the future could be partially built with bacteria. It sounds like science fiction but researchers involved in an EU-backed project in Madrid are working towards making this a concrete reality.

Credit: EU Research and Innovation

It starts with a common type of soil bacterium being revitalised in a mixture of urea and nutrients at a constant temperature of around 30 degrees Celsius.

Piero Tiano, a biologist with the Italian Institute for the Conservation and Preservation of Cultural Heritage, told euronews how it works: “Inside this mix, the bacteria start to develop; they basically grow in number. The bacteria have to reach a certain quantity in order to make cement. After around three hours of fermentation, our mix is ready for use”.

The scientists then add the revitalized bacteria to a mix of sand, industrial cement waste and the ash of rice husks.

Cement manufacturing accounts for some five percent of global carbon emissions, researchers say. This project aims to prove that greener, ecologically friendly cement is possible.

“Our raw materials are basically all waste. So we don’t have added costs,” said Laura Sánchez Alonso, a mining engineer and the Eco-Cement project coordinator.

“For instance, we don’t need to extract and transport the limestone commonly used to produce cement. And we also save the energy costs.”

Less heat means lower cost and emissions, as James Stuart, a sustainable design consultant, explained: “In ordinary cement, they have to use very high temperatures, up to 1,400-1,500 degrees Celsius in order to turn limestone into cement. That is part of the process. And that takes an awful lot of energy. Here we only need bacteria to multiply at 30 degrees. So that is a massive difference. And that amount of heat energy is saved because we are using a biological process to bind the particles together”.
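
A rough sensible-heat comparison illustrates Stuart's point. The specific heat values below are assumed textbook figures, and the calculation deliberately ignores the large additional calcination energy conventional cement also needs, so it understates the real gap:

```python
# Sensible heat per kilogram: Q = c * (T_end - T_start)
def sensible_heat_kj_per_kg(c_kj_per_kg_k, t_start_c, t_end_c):
    """Heat needed to warm 1 kg of material from t_start_c to t_end_c."""
    return c_kj_per_kg_k * (t_end_c - t_start_c)

# Assumed specific heats: limestone ~0.9 kJ/kg*K, watery broth ~4.2 kJ/kg*K
q_kiln  = sensible_heat_kj_per_kg(0.9, 20, 1450)  # heating limestone to kiln temp
q_broth = sensible_heat_kj_per_kg(4.2, 20, 30)    # warming bacterial broth to 30 C
print(f"kiln: {q_kiln:.0f} kJ/kg vs broth: {q_broth:.0f} kJ/kg")
```

Even before accounting for the chemistry of turning limestone into clinker, heating the raw material costs on the order of thirty times more energy per kilogram than keeping the bacterial mix at 30 degrees.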

The way the bacteria bind the particles together is by naturally producing calcium carbonate.

Initial tests have proved promising. Researchers – microbiologists and chemists – are working to help the bacteria become more efficient.

“It’s important to know the ideal density of bacteria in the mix,” said Linda Wittig, an industrial chemist with Fraunhofer-IFAM.

“We’ve done research on that. We know for instance that greater bacteria density does not always mean that the product will be more resilient. On the contrary, sometimes beyond a certain point, too many bacteria can undermine the strength of the final product. So we need to find the optimal number of cells to produce the cement”.

Preliminary results of ongoing tests on the material’s plasticity, elasticity and resistance to stress or deformation are already showing the way for eventual applications.

“We decided to use this material as mortar and not as concrete because it is not as strong as traditional concrete. But it can be easily transformed. This is the reason why we decided to use this material as mortar,” said Nikos Bakas, a civil engineer at Neapolis University in Cyprus.

Whatever the final applications, researchers hope the new material could be a reality on European construction sites in less than a decade.

Friday, February 27, 2015

A new type of methane-based, oxygen-free life form that can metabolize and reproduce similar to life on Earth has been modeled by a team of Cornell University researchers.

Taking a simultaneously imaginative and rigidly scientific view, chemical engineers and astronomers offer a template for life that could thrive in a harsh, cold world - specifically Titan, the giant moon of Saturn. A planetary body awash with seas not of water, but of liquid methane, Titan could harbor methane-based, oxygen-free cells.

Credit: NASA/JPL/University of Arizona

Their theorized cell membrane, composed of small organic nitrogen compounds and capable of functioning in liquid methane temperatures of 292 degrees below zero, is published in Science Advances, Feb. 27. The work is led by chemical molecular dynamics expert Paulette Clancy and first author James Stevenson, a graduate student in chemical engineering. The paper's co-author is Jonathan Lunine, director of Cornell's Center for Radiophysics and Space Research.

A representation of a nine-nanometer azotosome, about the size of a virus, with a piece of membrane cut away to show the interior.

Credit: James Stevenson

Lunine is an expert on Saturn's moons and an interdisciplinary scientist on the Cassini-Huygens mission that discovered methane-ethane seas on Titan. Intrigued by the possibilities of methane-based life on Titan, and armed with a grant from the Templeton Foundation to study non-aqueous life, Lunine sought assistance about a year ago from Cornell faculty with expertise in chemical modeling. Clancy, who had never met Lunine, offered to help.

"We're not biologists, and we're not astronomers, but we had the right tools," Clancy said. "Perhaps it helped, because we didn't come in with any preconceptions about what should be in a membrane and what shouldn't. We just worked with the compounds that we knew were there and asked, 'If this was your palette, what can you make out of that?'"

On Earth, life is based on the phospholipid bilayer membrane, the strong, permeable, water-based vesicle that houses the organic matter of every cell. A vesicle made from such a membrane is called a liposome. Thus, many astronomers seek extraterrestrial life in what's called the circumstellar habitable zone, the narrow band around a star in which liquid water can exist. But what if cells weren't based on water, but on methane, which has a much lower freezing point?

The engineers named their theorized cell membrane an "azotosome," "azote" being the French word for nitrogen. "Liposome" comes from the Greek "lipos" and "soma" to mean "lipid body;" by analogy, "azotosome" means "nitrogen body."

The azotosome is made from nitrogen, carbon and hydrogen molecules known to exist in the cryogenic seas of Titan, but shows the same stability and flexibility that Earth's analogous liposome does. This came as a surprise to chemists like Clancy and Stevenson, who had never thought about the mechanics of cell stability before; they usually study semiconductors, not cells.

The engineers employed a molecular dynamics method that screened for candidate compounds from methane for self-assembly into membrane-like structures. The most promising compound they found is an acrylonitrile azotosome, which showed good stability, a strong barrier to decomposition, and a flexibility similar to that of phospholipid membranes on Earth. Acrylonitrile - a colorless, poisonous, liquid organic compound used in the manufacture of acrylic fibers, resins and thermoplastics - is present in Titan's atmosphere.

Excited by the initial proof of concept, Clancy said the next step is to try and demonstrate how these cells would behave in the methane environment - what might be the analogue to reproduction and metabolism in oxygen-free, methane-based cells.

Lunine looks forward to the long-term prospect of testing these ideas on Titan itself, as he put it, by "someday sending a probe to float on the seas of this amazing moon and directly sampling the organics."

Stevenson said he was in part inspired by science fiction writer Isaac Asimov, who wrote about the concept of non-water-based life in a 1962 essay, "Not as We Know It."

Said Stevenson: "Ours is the first concrete blueprint of life not as we know it."

A team of researchers from Italy, Israel and the United Kingdom has succeeded in generating mature, functional skeletal muscles in mice using a new approach for tissue engineering. The scientists grew a leg muscle starting from engineered cells cultured in a dish to produce a graft. The graft was subsequently implanted close to a normal, contracting skeletal muscle, where the new muscle was nurtured and grown. In time, the method could allow for patient-specific treatments for a large number of muscle disorders. The results are published in EMBO Molecular Medicine.

Skeletal muscle

Credit: Eastern Kentucky University

The scientists used muscle precursor cells - mesoangioblasts - grown in the presence of a hydrogel (support matrix) in a tissue culture dish. The cells were also genetically modified to produce a growth factor that stimulates blood vessel and nerve growth from the host. Cells engineered in this way express a protein growth factor that attracts other essential cells that give rise to the blood vessels and nerves of the host, contributing to the survival and maturation of newly formed muscle fibres. After the graft was implanted onto the surface of the skeletal muscle underneath the skin of the mouse, mature muscle fibres formed a complete and functional muscle within several weeks. Replacing a damaged muscle with the graft also resulted in a functional artificial muscle very similar to a normal Tibialis anterior.

Tissue engineering of skeletal muscle is a significant challenge but has considerable potential for the treatment of the various types of irreversible damage to muscle that occur in diseases like Duchenne muscular dystrophy. So far, attempts to re-create a functional muscle either outside or directly inside the body have been unsuccessful. In vitro-generated artificial muscles normally do not survive the transfer in vivo because the host does not create the necessary nerves and blood vessels that would support the muscle's considerable requirements for oxygen.

"The morphology and the structural organisation of the artificial organ are extremely similar to if not indistinguishable from a natural skeletal muscle," says Cesare Gargioli of the University of Rome, one of the lead authors of the study.

In future, irreversibly damaged muscles could be restored by implanting the patient's own cells within the hydrogel matrix on top of a residual muscle, adjacent to the damaged area. "While we are encouraged by the success of our work in growing a complete, intact and functional mouse leg muscle, we emphasize that a mouse muscle is very small and scaling up the process for patients may require significant additional work," comments EMBO Member Giulio Cossu, one of the authors of the study. The next step in the work will be to use larger animal models to test the efficacy of this approach before starting clinical studies.

Thursday, February 26, 2015

Scientists have discovered the brightest quasar in the early universe, powered by the most massive black hole yet known at that time. The international team led by astronomers from Peking University in China and from the University of Arizona announce their findings in the scientific journal Nature on Feb. 26.

This is an artist's impression of a quasar with a supermassive black hole in the distant universe.

Credit: Zhaoyu Li/NASA/JPL-Caltech/Misti Mountain Observatory

The discovery of this quasar, named SDSS J0100+2802, marks an important step in understanding how quasars, the most powerful objects in the universe, have evolved from the earliest epoch, only 900 million years after the Big Bang, which is thought to have happened 13.7 billion years ago. The quasar, with its central black hole mass of 12 billion solar masses and the luminosity of 420 trillion suns, is at a distance of 12.8 billion light-years from Earth.
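The quoted age and distance are consistent under a simple light-travel-time reading. A minimal sketch using only the article's figures (it glosses over the distinction between light-travel distance and comoving distance):

```python
# Light-travel-time arithmetic for SDSS J0100+2802, using the
# figures quoted in the article.
AGE_OF_UNIVERSE_GYR = 13.7   # billions of years since the Big Bang
LIGHT_TRAVEL_GYR = 12.8      # 12.8 billion light-years -> 12.8 Gyr travel time

# Time after the Big Bang at which the light we now see was emitted:
quasar_epoch_gyr = AGE_OF_UNIVERSE_GYR - LIGHT_TRAVEL_GYR

print(f"Light left the quasar ~{quasar_epoch_gyr * 1000:.0f} million years after the Big Bang")
```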

The discovery of this ultraluminous quasar also presents a major puzzle to the theory of black hole growth in the early universe, according to Xiaohui Fan, Regents' Professor of Astronomy at the UA's Steward Observatory, who co-authored the study.

"How can a quasar so luminous, and a black hole so massive, form so early in the history of the universe, at an era soon after the earliest stars and galaxies have just emerged?" Fan said. "And what is the relationship between this monster black hole and its surrounding environment, including its host galaxy?

"This ultraluminous quasar with its supermassive black hole provides a unique laboratory to the study of the mass assembly and galaxy formation around the most massive black holes in the early universe."

The quasar dates from a time close to the end of an important cosmic event that astronomers refer to as the "epoch of reionization": the cosmic dawn when light from the earliest generations of galaxies and quasars is thought to have ended the "cosmic dark ages" and transformed the universe into how we see it today.

Discovered in 1963, quasars are the most powerful objects beyond our Milky Way galaxy, beaming vast amounts of energy across space as the supermassive black hole in their center sucks in matter from its surroundings. Thanks to the new generation of digital sky surveys, astronomers have discovered more than 200,000 quasars, with ages ranging from 0.7 billion years after the Big Bang to today.

The newly discovered quasar SDSS J0100+2802 is the one with the most massive black hole and the highest luminosity among all known distant quasars. The background photo, provided by Yunnan Observatory, shows the dome of the 2.4-meter telescope and the sky above it.

Credit: Zhaoyu Li/Shanghai Observatory

Shining with the equivalent of 420 trillion suns, the new quasar is seven times brighter than the most distant quasar known (which is 13 billion years away). It harbors a black hole with mass of 12 billion solar masses, proving it to be the most luminous quasar with the most massive black hole among all the known high redshift (very distant) quasars.

"By comparison, our own Milky Way galaxy has a black hole with a mass of only 4 million solar masses at its center; the black hole that powers this new quasar is 3,000 time heavier," Fan said.

Feige Wang, a doctoral student from Peking University who is supervised jointly by Fan and Prof. Xue-Bing Wu at Peking University, the study's lead author, initially spotted this quasar and flagged it for further study.

"This quasar was first discovered by our 2.4-meter Lijiang Telescope in Yunnan, China, making it the only quasar ever discovered by a 2-meter telescope at such distance, and we're very proud of it," Wang said. "The ultraluminous nature of this quasar will allow us to make unprecedented measurements of the temperature, ionization state and metal content of the intergalactic medium at the epoch of reionization."

Following the initial discovery, two telescopes in southern Arizona did the heavy lifting in determining the distance and mass of the black hole: the 8.4-meter Large Binocular Telescope, or LBT, on Mount Graham and the 6.5-meter Multiple Mirror Telescope, or MMT, on Mount Hopkins. Additional observations with the 6.5-meter Magellan Telescope at Las Campanas Observatory, Chile, and the 8.2-meter Gemini North Telescope on Mauna Kea, Hawaii, confirmed the results.

"This quasar is very unique," said Xue-Bing Wu, a professor of the Department of Astronomy, School of Physics at Peking University and the associate director of the Kavli Institute of Astronomy and Astrophysics. "Just like the brightest lighthouse in the distant universe, its glowing light will help us to probe more about the early universe."

Wu leads a team that has developed a method to effectively select quasars in the distant universe based on optical and near-infrared photometric data, in particular using data from the Sloan Digital Sky Survey and NASA's Wide-field Infrared Survey Explorer, or WISE, satellite.

"This is a great accomplishment for the LBT," said Fan, who chairs the LBT Scientific Advisory Committee and also discovered the previous record holders for the most massive black hole in the early universe, about a fourth of the size of the newly discovered object. "The especially sensitive optical and infrared spectrographs of the LBT provided the early assessment of both the distance of the quasars and the mass of the black hole at the quasar's center."

For Christian Veillet, director of the Large Binocular Telescope Observatory, or LBTO, this discovery demonstrates both the power of international collaborations and the benefit of using a variety of facilities spread throughout the world.

"This result is particularly gratifying for LBTO, which is well on its way to full nighttime operations," Veillet said. "While in this case the authors used two different instruments in series, one for visible light spectroscopy and one for near-infrared imaging, LBTO will soon offer a pair of instruments that can be used simultaneously, effectively doubling the number of observations possible in clear skies and ultimately creating even more exciting science."

To further unveil the nature of this remarkable quasar, and to shed light on the physical processes that led to the formation of the earliest supermassive black holes, the research team will carry out further investigations on this quasar with more international telescopes, including the Hubble Space Telescope and the Chandra X-ray Telescope.

It started with a trip to the basement of the American Museum of Natural History in New York to inspect preserved animal hides. Later, Georgia Institute of Technology researchers built a wind tunnel about 2 feet tall, complete with a makeshift eye. By putting both steps together, the team discovered that 22 species of mammals - from humans, to hedgehogs, to giraffes - are the same: their eyelash length is one-third the width of their eye. Anything shorter or longer, including the fake eyelashes that are popular in Hollywood and make-up aisles, increases airflow around the eye and leads to more dust hitting the surface.

Giraffes and 21 other mammals, including humans, all have eyelashes that are one-third the width of their eye.
Credit: Georgia Institute of Technology

"Eyelashes form a barrier to control airflow and the rate of evaporation on the surface of the cornea," said Guillermo Amador, a Georgia Tech Ph.D. candidate in the George W. Woodruff School of Mechanical Engineering who authored the study. "When eyelashes are shorter than the one-third ratio, they have only a slight effect on the flow. Their effect is more pronounced as they lengthen up until one-third. After that, they start funneling air and dust particles into the eye."

The study is currently published in the Journal of the Royal Society Interface.

Amador and the research team, which is led by Assistant Professor David Hu, sent a student to the museum in 2012 to measure eyes and eyelashes of various animals. Aside from an elephant, which has extremely long eyelashes, every species studied had evolved to the same ratio of lash length to eye width.

These are the eyelashes of a goat.

Credit: Georgia Institute of Technology

The team then built the wind tunnel to re-create air flows over a mimic of an adult human eye. A 4-millimeter-deep, 20-millimeter-diameter aluminum dish served as the cornea. It sat on top of an acrylic plate, which imitated the rest of the face. Mesh surrounded the dish to replicate the eyelashes.

They discovered the ideal ratio while varying the mesh length during evaporation and particle deposition studies.

"As short lashes grew longer, they reduced air flow, creating a layer of slow-moving air above the cornea," said Hu. "This kept the eye moist for a longer time and kept particles away. The majority of air essentially hit the eyelashes and rolled away from the eye."

This image shows the eye and eyelashes of an ostrich.

Credit: Georgia Institute of Technology

The opposite process occurred with longer eyelashes. The lashes extended further into the airflow and created a cylinder. The air and its molecules channeled toward the eye and led to faster evaporation.

"This is why long, elegant, fake eyelashes aren't ideal," said Amador. "They may look good, but they're not the best thing for the health of your eyes."

There are exceptions, though. The research team notes that people who can't grow eyelashes could wear fake ones, if they're the correct length, for extra protection and to reduce dry eye.

"Even if they're not the correct length, more eyelashes are always better than less," said Alexander Alexeev, an associate professor in the School of Mechanical Engineering. "If fake eyelashes are dense enough, they may give the same overall effect in protecting the eye even if they are longer than one-third."

The team also says the findings could be used to create eyelash-inspired filaments to protect solar panels, photographic sensors or autonomous robots in dusty environments.

Most of the laws of nature treat particles and antiparticles equally, but stars and planets are made of particles, or matter, and not antiparticles, or antimatter. That asymmetry, which favors matter to a very small degree, has puzzled scientists for many years.

UCLA physicists offer a possible solution to the mystery of the origin of matter in the universe.

Credit: NASA

New research by UCLA physicists, published in the journal Physical Review Letters, offers a possible solution to the mystery of the origin of matter in the universe.

Alexander Kusenko, a professor of physics and astronomy in the UCLA College, and colleagues propose that the matter-antimatter asymmetry could be related to the Higgs boson particle, which was the subject of prominent news coverage when it was discovered at Switzerland's Large Hadron Collider in 2012.

Specifically, the UCLA researchers write, the asymmetry may have been produced as a result of the motion of the Higgs field, which is associated with the Higgs boson, and which could have made the masses of particles and antiparticles in the universe temporarily unequal, allowing for a small excess of matter particles over antiparticles.

If a particle and an antiparticle meet, they disappear by emitting two photons or a pair of some other particles. In the "primordial soup" that existed after the Big Bang, there were almost equal amounts of particles and antiparticles, except for a tiny asymmetry: one extra particle per 10 billion. As the universe cooled, the particles and antiparticles annihilated each other in equal numbers, and only a tiny number of particles remained; this tiny remainder makes up all the stars, planets and gas in today's universe, said Kusenko, who is also a senior scientist with the Kavli Institute for the Physics and Mathematics of the Universe.
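The bookkeeping behind "one particle per 10 billion" can be made concrete with a toy count (illustrative numbers only, not a physical simulation):

```python
# Toy illustration of the matter-antimatter bookkeeping described above:
# for every 10 billion antiparticles there were 10 billion + 1 particles.
# Annihilation removes matched particle-antiparticle pairs; only the
# tiny excess of matter survives.
antiparticles = 10_000_000_000
particles = antiparticles + 1        # the one-per-10-billion asymmetry

annihilated_pairs = min(particles, antiparticles)
surviving_matter = particles - annihilated_pairs

print(surviving_matter)  # the lone survivor that becomes today's matter
```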

The research also is highlighted by Physical Review Letters in a commentary in the current issue.

The 2012 discovery of the Higgs boson particle was hailed as one of the great scientific accomplishments of recent decades. The Higgs boson was first postulated some 50 years ago as a crucial element of the modern theory of the forces of nature, and is, physicists say, what gives everything in the universe mass. Physicists at the LHC measured the particle's mass and found its value to be peculiar; it is consistent with the possibility that the Higgs field in the first moments of the Big Bang was much larger than its "equilibrium value" observed today.

The Higgs field "had to descend to the equilibrium, in a process of 'Higgs relaxation,'" said Kusenko, the lead author of the UCLA research.

Two of Kusenko's graduate students, Louis Yang of UCLA and Lauren Pearce of the University of Minnesota, Minneapolis, were co-authors of the study. The research was supported by the U.S. Department of Energy (DE-SC0009937), the World Premier International Research Center Initiative in Japan and the National Science Foundation (PHYS-1066293).

An unusual comet skimmed past the sun on Feb. 18-21, 2015, as captured by the European Space Agency (ESA) and NASA's Solar and Heliospheric Observatory, or SOHO.

This comet was interesting for two reasons. First it's what's called a non-group comet, meaning it's not part of any known family of comets. Most comets seen by SOHO belong to the Kreutz family - all of which broke off from a single giant comet many centuries ago.

Image Credit: NASA/Goddard Space Flight Center/Duberstein

Watch the video to see the comet fly around the sun. Toward the end of the video, as the comet begins to develop a tail, the sun releases an eruption of solar material, called a coronal mass ejection, or CME, to add something more to the scene.

Image Credit: ESA/NASA/SOHO/Hill

The second reason it's interesting is that the vast majority of comets that come close enough to the sun to be seen by SOHO do not survive the trip. Known as sungrazers, these comets usually evaporate in the intense sunlight. This comet made it to within 2.2 million miles of the sun's surface - but survived the trip intact.

A description of sungrazer comets and where they come from.

Image Credit: NASA/Goddard Space Flight Center/Duberstein

"There's a half-decent chance that ground observers might be able to detect it in the coming weeks," said Karl Battams, a solar scientist at the Naval Research Lab in Washington, D.C. "But it's also possible that events during its trip around the sun will cause it to die fairly fast."

Since launching in 1995, SOHO has become the number one comet finder of all time -- this was comet discovery number 2,875. However, SOHO sees non-group comets like this only a few times a year.

Quasars--supermassive black holes found at the center of distant massive galaxies--are the most-luminous beacons in the sky. These central supermassive black holes actively accrete the surrounding materials and release a huge amount of their gravitational energy. An international team of astronomers, including Carnegie's Yuri Beletsky, has discovered the brightest quasar ever found in the early universe, which is powered by the most massive black hole observed for an object from that time. Their work is published February 26 by Nature.

This is an artist's rendering of a very distant, very ancient quasar, courtesy of the European Southern Observatory.

Credit: ESO/M. Kornmesser

The quasar was found at a redshift of z=6.30, a measurement of how much the wavelength of the light that reaches us on Earth has been stretched by the expansion of the universe. As such, redshift can be used to calculate the quasar's age and distance from our planet: a higher redshift means a larger distance, and hence looking further back in time.
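Concretely, redshift relates the two wavelengths by: observed wavelength = (1 + z) x emitted wavelength. A minimal sketch at z = 6.30, using hydrogen's Lyman-alpha line as the example (the choice of line is illustrative, not from the article):

```python
# Wavelength stretch at redshift z = 6.30.
z = 6.30
stretch = 1 + z               # observed / emitted wavelength ratio

lyman_alpha_nm = 121.6        # rest-frame Lyman-alpha line of hydrogen, nm
observed_nm = lyman_alpha_nm * stretch

# An ultraviolet line arrives stretched into the near-infrared.
print(f"{lyman_alpha_nm} nm emitted -> {observed_nm:.0f} nm observed")
```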

At a distance of 12.8 billion light years from Earth, this quasar, named SDSS J0100+2802, formed only 900 million years after the Big Bang. Studying it will help scientists understand how quasars evolved in the earliest days of the universe. Only 40 known quasars have a redshift higher than 6, a point that marks the beginning of the early universe.

"This quasar is very unique. Just like the brightest lighthouse in the distant universe, its glowing light will help us to probe more about the early universe," said team-leader Xue-Bing Wu of Peking University and the Kavli Institute of Astronomy and Astrophysics.

With a luminosity 420 trillion times that of our own sun, this new quasar is seven times brighter than the most distant quasar known (which is 13 billion light years away). It harbors a black hole with a mass of 12 billion solar masses, proving it to be the most luminous quasar with the most massive black hole among all the known high redshift quasars.

The team developed a method of detecting quasars at redshifts of 5 and higher. These detections were verified by the 6.5-meter Multiple Mirror Telescope (MMT) and 8.4-meter Large Binocular Telescope (LBT) in Arizona; the 6.5-meter Magellan Telescope at Carnegie's Las Campanas Observatory in Chile; and the 8.2-meter Gemini North Telescope in Hawaii.

"This quasar is a unique laboratory to study the way that a quasar's black hole and host galaxy co-evolve," Beletsky said. "Our findings indicate that in the early Universe, quasar black holes probably grew faster than their host galaxies, although more research is needed to confirm this idea."

Tuesday, February 24, 2015

What connects Earth's largest, hottest desert to its largest tropical rain forest?

The Sahara Desert is a near-uninterrupted brown band of sand and scrub across the northern third of Africa. The Amazon rain forest is a dense green mass of humid jungle that covers northeast South America. But after strong winds sweep across the Sahara, a tan cloud rises in the air, stretches between the continents, and ties together the desert and the jungle. It’s dust. And lots of it.

The lidar instrument aboard the CALIPSO satellite sends out pulses of light that bounce off particles in the atmosphere and back to the satellite. It distinguishes dust from other particles based on optical properties.
Image Credit: NASA Goddard's Scientific Visualization Studio

For the first time, a NASA satellite has quantified in three dimensions how much dust makes this trans-Atlantic journey. Scientists have not only measured the volume of dust, they have also calculated how much phosphorus – a remnant in Saharan sands of the desert’s past as a lake bed – gets carried across the ocean from one of the planet’s most desolate places to one of its most fertile.

For the first time, a NASA satellite has quantified in three dimensions how much dust makes the trans-Atlantic journey from the Sahara Desert to the Amazon rain forest. Among this dust is phosphorus, an essential nutrient that acts like a fertilizer, which the Amazon depends on in order to flourish.

Image Credit: NASA's Goddard Space Flight Center

A new paper published Feb. 24 in Geophysical Research Letters, a journal of the American Geophysical Union, provides the first satellite-based estimate of this phosphorus transport over multiple years, said lead author Hongbin Yu, an atmospheric scientist at the University of Maryland who works at NASA's Goddard Space Flight Center in Greenbelt, Maryland. A paper published online by Yu and colleagues Jan. 8 in Remote Sensing of Environment provided the first multi-year satellite estimate of overall dust transport from the Sahara to the Amazon.

This trans-continental journey of dust is important because of what is in the dust, Yu said. Specifically, much of the dust is picked up from the Bodélé Depression in Chad, an ancient lake bed where rock minerals composed of dead microorganisms are loaded with phosphorus. Phosphorus is an essential nutrient for plant proteins and growth, which the Amazon rain forest depends on in order to flourish.

Nutrients – the same ones found in commercial fertilizers – are in short supply in Amazonian soils. Instead they are locked up in the plants themselves. Fallen, decomposing leaves and organic matter provide the majority of nutrients, which are rapidly absorbed by plants and trees after entering the soil. But some nutrients, including phosphorus, are washed away by rainfall into streams and rivers, draining from the Amazon basin like a slowly leaking bathtub.

The phosphorus that reaches Amazon soils from Saharan dust, an estimated 22,000 tons per year, is about the same amount as that lost from rain and flooding, Yu said. The finding is part of a bigger research effort to understand the role of dust and aerosols in the environment and on local and global climate.

Dust in the Wind

"We know that dust is very important in many ways. It is an essential component of the Earth system. Dust will affect climate and, at the same time, climate change will affect dust," said Yu. To understand what those effects may be, "First we have to try to answer two basic questions. How much dust is transported? And what is the relationship between the amount of dust transport and climate indicators?"

Conceptual image of dust from the Sahara Desert crossing the Atlantic Ocean to the Amazon rainforest in South America.

Credit: Conceptual Image Lab, NASA/Goddard Space Flight Center

The new dust transport estimates were derived from data collected by a lidar instrument on NASA's Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation, or CALIPSO, satellite from 2007 through 2013.

The data show that wind and weather pick up on average 182 million tons of dust each year and carry it past the western edge of the Sahara at longitude 15W. This volume is the equivalent of 689,290 semi trucks filled with dust. The dust then travels 1,600 miles across the Atlantic Ocean, though some drops to the surface or is flushed from the sky by rain. Near the eastern coast of South America, at longitude 35W, 132 million tons remain in the air, and 27.7 million tons – enough to fill 104,908 semi trucks – fall to the surface over the Amazon basin. About 43 million tons of dust travel farther to settle out over the Caribbean Sea, past longitude 75W.
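The figures above can be cross-checked against each other. A sketch of the bookkeeping, using only the numbers quoted (the implied truck payload is derived, not stated in the article):

```python
# Mass bookkeeping for the trans-Atlantic dust transport, using the
# article's figures. All masses in tons.
leaving_sahara = 182e6     # past the Sahara's western edge (15W)
reaching_sam = 132e6       # still airborne near South America (35W)
amazon_deposit = 27.7e6    # falls over the Amazon basin
past_caribbean = 43e6      # still airborne past the Caribbean (75W)

# Implied payload of one "semi truck" in the article's comparison:
tons_per_truck = leaving_sahara / 689_290

# Mass removed over the Atlantic before reaching South America,
# by settling to the surface or being rained out:
lost_over_atlantic = leaving_sahara - reaching_sam

print(f"~{tons_per_truck:.0f} tons per truck")
print(f"{lost_over_atlantic / 1e6:.0f} million tons removed en route")
```

Note that the Amazon check is consistent: 27.7 million tons divided by 104,908 trucks gives the same ~264-ton payload.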

Yu and colleagues focused on the Saharan dust transport across the Atlantic Ocean to South America and then beyond to the Caribbean Sea because it is the largest transport of dust on the planet.

Dust collected from the Bodélé Depression and from ground stations on Barbados and in Miami give scientists an estimate of the proportion of phosphorus in Saharan dust. This estimate is used to calculate how much phosphorus gets deposited in the Amazon basin from this dust transport.

The seven-year data record, while too short for looking at long-term trends, is nevertheless very important for understanding how dust and other aerosols behave as they move across the ocean, said Chip Trepte, project scientist for CALIPSO at NASA's Langley Research Center in Virginia, who was not involved in either study.

"We need a record of measurements to understand whether or not there is a fairly robust, fairly consistent pattern to this aerosol transport," he said.

Looking at the data year by year shows that the pattern is actually highly variable. There was an 86 percent change between the highest amount of dust transported in 2007 and the lowest in 2011, Yu said.

Why so much variation? Scientists believe it has to do with the conditions in the Sahel, the long strip of semi-arid land on the southern border of the Sahara. After comparing the changes in dust transport to a variety of climate factors, the one Yu and his colleagues found correlated was the previous year's Sahel rainfall: when Sahel rainfall increased, the next year's dust transport was lower.

The mechanism behind the correlation is unknown, Yu said. One possibility is that increased rainfall means more vegetation and less soil exposed to wind erosion in the Sahel. A second, more likely explanation is that the amount of rainfall is related to the circulation of winds, which are what ultimately sweep dust from both the Sahel and Sahara into the upper atmosphere where it can survive the long journey across the ocean.

CALIPSO collects "curtains" of data that show valuable information about the altitude of dust layers in the atmosphere. Knowing the height at which dust travels is important for understanding, and eventually using computers to model, where that dust will go and how the dust will interact with Earth's heat balance and clouds, now and in future climate scenarios.

"Wind currents are different at different altitudes," said Trepte. "This is a step forward in providing the understanding of what dust transport looks like in three dimensions, and then comparing with these models that are being used for climate studies."

Climate studies range in scope from global to regional changes, such as those that may occur in the Amazon in coming years. In addition to dust, the Amazon is home to many other types of aerosols like smoke from fires and biological particles, such as bacteria, fungi, pollen, and spores released by the plants themselves. In the future, Yu and his colleagues plan to explore the effects of those aerosols on local clouds – and how they are influenced by dust from Africa.

Does your mind wander when performing monotonous, repetitive tasks? Of course! But daydreaming involves more than just beating back boredom. In fact, according to a new study published in the Proceedings of the National Academy of Sciences, a wandering mind can impart a distinct cognitive advantage.

Art by Giora Eshkol

Scientists at Bar-Ilan University are the first to demonstrate how an external stimulus of low-level electricity can literally change the way we think, producing a measurable uptick in the rate at which daydreams - or spontaneous, self-directed thoughts and associations - occur. Along the way, they made another surprising discovery: that while daydreams offer a welcome "mental escape" from boring tasks, they also have a positive, simultaneous effect on task performance.

The new study was carried out in Bar-Ilan's Cognitive Neuroscience Laboratory supervised by Prof. Moshe Bar, part of the University's Gonda (Goldschmied) Multidisciplinary Brain Research Center which Prof. Bar also directs.

What Makes a Mind Wander?

While a far cry from the diabolical manipulation of dream content envisioned in "Inception" - the science-fiction thriller starring Leonardo DiCaprio - the Bar-Ilan University study is the first to prove that a generic external stimulus, unrelated to sensory perception, triggers a specific type of cognitive activity.

In the experiment - designed and executed by Prof. Bar's post-doctoral researcher Dr. Vadim Axelrod - participants were treated with transcranial direct current stimulation (tDCS), a non-invasive and painless procedure that uses low-level electricity to stimulate specific brain regions. During treatment, the participants were asked to track and respond to numerals flashed on a computer screen. They were also periodically asked to respond to an on-screen "thought probe" in which they reported - on a scale of one to four - the extent to which they were experiencing spontaneous thoughts unrelated to the numeric task they had been given.

The Brain-Daydream Connection

According to Prof. Bar - a long-time faculty member at Harvard Medical School who has authored several studies exploring the link between associative thinking, memory and predictive ability - the specific brain area targeted for stimulation in this study was anything but random.

"We focused tDCS stimulation on the frontal lobes because this brain region has been previously implicated in mind wandering, and also because it is a central locus of the executive control network that allows us to organize and plan for the future," Bar explains, adding that he suspected that there might be a connection between the two.

As a point of comparison and in separate experiments, the researchers used tDCS to stimulate the occipital cortex - the visual processing center in the back of the brain. They also conducted sham-control studies in which no tDCS was applied.

While the self-reported incidence of mind wandering was unchanged in the case of occipital and sham stimulation, it rose considerably when tDCS was applied to the frontal lobes. "Our results go beyond what was achieved in earlier, fMRI-based studies," Bar states. "They demonstrate that the frontal lobes play a causal role in the production of mind wandering behavior."

Improved "Cognitive Capacity" of the Wandering Mind

In an unanticipated finding, the present study demonstrated that the increased mind wandering produced by external stimulation not only did not harm subjects' ability to succeed at an appointed task, it actually helped. Bar believes that this surprising result might stem from the convergence, within a single brain region, of both the "thought controlling" mechanisms of executive function and the "thought freeing" activity of spontaneous, self-directed daydreams.

"Over the last 15 or 20 years, scientists have shown that - unlike the localized neural activity associated with specific tasks - mind wandering involves the activation of a gigantic default network involving many parts of the brain," Bar says. "This cross-brain involvement may be involved in behavioral outcomes such as creativity and mood, and may also contribute to the ability to stay successfully on-task while the mind goes off on its merry mental way."

While it is commonly assumed that people have a finite cognitive capacity for paying attention, Bar says that the present study suggests that the truth may be more complicated.

"Interestingly, while our study's external stimulation increased the incidence of mind wandering, rather than reducing the subjects' ability to complete the task, it caused task performance to become slightly improved. The external stimulation actually enhanced the subjects' cognitive capacity."

Toward A Less-Mysterious Mind

Bar says that, in the future, he would be interested in studying how external stimulation might affect other cognitive behaviors, such as the ability to focus or perform multiple tasks in parallel. And while any therapeutic application of this technique is speculative at best, he believes that it might someday help neuroscientists understand the behavior of people suffering from low or abnormal neural activity.

In the meantime, Bar's team at the Bar-Ilan University Lab for Cognitive Neuroscience is pleased to note that in their work on mind wandering - probably the most omnipresent internal cognitive function - they have made the human brain just a little less mysterious.

The research described above was funded, in part, by the Israeli Center of Research Excellence in Cognition (ICORE).

Oxytocin, sometimes referred to as the 'love' or 'cuddle' hormone, has a legendary status in popular culture due to its vital role in social and sexual behaviour and long-term bonding.

Now researchers from the University of Sydney and the University of Regensburg have discovered it also has a remarkable influence on the intoxicating effect of alcohol, which they report in the scientific journal Proceedings of the National Academy of Sciences on 24 February.

Dr Michael Bowen, School of Psychology, University of Sydney is lead author on a PNAS paper showing oxytocin counteracts the intoxicating effect of alcohol in rats.

Credit: University of Sydney

When the researchers infused oxytocin into the brains of rats that were then given alcohol, it prevented the drunken lack of coordination the alcohol would otherwise cause.

"In the rat equivalent of a sobriety test, the rats given alcohol and oxytocin passed with flying colours, while those given alcohol without oxytocin were seriously impaired," Dr Bowen said.

The researchers demonstrated that oxytocin prevents alcohol from accessing specific sites in the brain that cause alcohol's intoxicating effects, sites known as delta-subunit GABA-A receptors.

"Alcohol impairs your coordination by inhibiting the activity of brain regions that provide fine motor control. Oxytocin prevents this effect to the point where we can't tell from their behaviour that the rats are actually drunk. It's a truly remarkable effect," Dr Bowen said.

This 'sobering-up' effect of oxytocin has yet to be shown in humans but the researchers plan to conduct these studies in the near future.

"The first step will be to ensure we have a method of drug delivery for humans that allows sufficient amounts of oxytocin to reach the brain. If we can do that, we suspect that oxytocin could also leave speech and cognition much less impaired after relatively high levels of alcohol consumption," Dr Bowen said.

It's worth noting that oxytocin can't save you from being arrested while driving home from the pub.

"While oxytocin might reduce your level of intoxication, it won't actually change your blood alcohol level," Dr Bowen said. "This is because the oxytocin is preventing the alcohol from accessing the sites in the brain that make you intoxicated, it is not causing the alcohol to leave your system any faster".

Some people might worry a drug which decreases your level of intoxication could encourage you to drink more. As it turns out, separate experiments conducted by the researchers and other groups have shown that taking oxytocin actually reduces alcohol consumption and craving in both rats and humans.

"We believe that the effects of oxytocin on alcohol consumption and craving act through a similar mechanism in the brain to the one identified in our research," said Dr Bowen.

Their findings could see the development of new oxytocin-based treatments for alcohol-use disorders that target this mechanism.

Coastal communities in 15 states that depend on the $1 billion shelled mollusk industry (primarily oysters and clams) are at long-term economic risk from the increasing threat of ocean acidification, a new report concludes.

This first nationwide vulnerability analysis, which was funded through the National Science Foundation's National Socio-Environmental Synthesis Center, was published today in the journal Nature Climate Change.

Oysters at hatcheries in Oregon and Washington are showing the effects of ocean acidification.Credit: Oregon State University

The Pacific Northwest has been the most frequently cited region with vulnerable shellfish populations, the authors say, but the report notes that newly identified areas of risk from acidification range from Maine to the Chesapeake Bay, to the bayous of Louisiana.

"Ocean acidification has already cost the oyster industry in the Pacific Northwest nearly $110 million and jeopardized about 3,200 jobs," said Julie Ekstrom, who was lead author on the study while with the Natural Resources Defense Council. She is now at the University of California at Davis.

George Waldbusser, an Oregon State University marine ecologist and biogeochemist, said the spreading impact of ocean acidification is due primarily to increases in greenhouse gases.

"This clearly illustrates the vulnerability of communities dependent on shellfish to ocean acidification," said Waldbusser, a researcher in OSU's College of Earth, Ocean, and Atmospheric Sciences and co-author on the paper. "We are still finding ways to increase the adaptive capacity of these communities and industries to cope, and refining our understanding of various species' specific responses to acidification.

"Ultimately, however, without curbing carbon emissions, we will eventually run out of tools to address the short-term and we will be stuck with a much larger long-term problem," Waldbusser added.

The analysis identified several "hot zones" facing a number of risk factors. These include:

The Pacific Northwest: Oregon and Washington coasts and estuaries have a "potent combination" of risk factors, including cold waters, upwelling currents that bring corrosive waters closer to the surface, corrosive rivers, and nutrient pollution from land runoff;

New England: The productive ports of Maine and southern New Hampshire feature poorly buffered rivers running into cold New England waters, which are especially enriched with acidifying carbon dioxide;

Mid-Atlantic: East coast estuaries including Narragansett Bay, Chesapeake Bay, and Long Island Sound have an abundance of nitrogen pollution, which exacerbates ocean acidification in waters that are shellfish-rich;

Gulf of Mexico: Terrebonne and Plaquemines Parishes of Louisiana, and other communities in the region, have shellfish economies based almost solely on oysters, giving this region fewer options for alternative - and possibly more resilient - mollusk fisheries.

The project team has also developed an interactive map to explore the vulnerability factors regionally.

One concern, the authors say, is that many of the most economically dependent regions - including Massachusetts, New Jersey, Virginia and Louisiana - are least prepared to respond, with minimal research and monitoring assets for ocean acidification.

The Pacific Northwest, on the other hand, has a robust research effort led by Oregon State University researchers, who already have helped oyster hatcheries rebound from near-disastrous larval die-offs over the past decade. The university recently announced plans to launch a Marine Studies Initiative that would help address complex, multidisciplinary problems such as ocean acidification.

"The power of this project is the collaboration of natural and social scientists focused on a problem that has and will continue to impact industries dependent on the sea," Waldbusser said.

Waldbusser recently led a study that documented how larval oysters are sensitive to a change in the "saturation state" of ocean water - which ultimately is triggered by an increase in carbon dioxide. The inability of ecosystems to provide enough alkalinity to buffer the increase in CO2 is what kills young oysters in the environment.
