Thursday, 31 May 2012

Odd orbits of remote objects hint at an unseen world, new calculations suggest. An as yet undiscovered planet might be orbiting at the dark fringes of the solar system, according to new research. Too far out to be easily spotted by telescopes, the potential unseen planet appears to be making its presence felt by disturbing the orbits of so-called Kuiper belt objects, said Rodney Gomes, an astronomer at the National Observatory of Brazil in Rio de Janeiro. Kuiper belt objects are small icy bodies, including some dwarf planets, that lie beyond the orbit of Neptune.

Once considered the ninth planet in our system, the dwarf planet Pluto, for example, is one of the largest Kuiper belt objects, at about 1,400 miles (2,300 kilometers) wide. Dozens of other known objects are hundreds of miles across, and more are being discovered every year.

What's intriguing, Gomes said, is that, according to his new calculations, about a half dozen Kuiper belt objects, including the remote body known as Sedna, are in strange orbits compared to where they should be, based on existing solar system models.

The objects' unexpected orbits have a few possible explanations, said Gomes, who presented his findings Tuesday at a meeting of the American Astronomical Society at Timberline Lodge, Oregon.

"But I think the easiest one is a planetary-mass solar companion": a planet that orbits very far out from the sun but that's massive enough to be having gravitational effects on Kuiper belt objects.

Mystery Planet a Captured Rogue?

For the new work, Gomes analyzed the orbits of 92 Kuiper belt objects, then compared his results to computer models of how the bodies should be distributed, with and without an additional planet. If there's no distant world, Gomes concludes, the models don't produce the highly elongated orbits we see for six of the objects.

How big the planetary body might be isn't clear, but there are a lot of possibilities, Gomes added. Based on his calculations, Gomes thinks a Neptune-size world, about four times bigger than Earth, orbiting 140 billion miles (225 billion kilometers) away from the sun, about 1,500 times farther than Earth, would do the trick. But so would a Mars-size object, roughly half Earth's size, in a highly elongated orbit that would occasionally bring the body sweeping to within 5 billion miles (8 billion kilometers) of the sun.

Gomes speculates that the mystery object could be a rogue planet that was kicked out of its own star system and later captured by the sun's gravity.

Or the putative planet could have formed closer to our sun, only to be cast outward by gravitational encounters with other planets.
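The distances Gomes invokes are easier to picture in astronomical units. A minimal sketch of the conversion, using the standard value of the astronomical unit in miles and the orbit figures quoted above:

```python
# A quick unit-conversion check of the distances above. AU_MILES is the
# standard astronomical unit in miles; the orbit figures are from the article.
AU_MILES = 92_955_807.3  # one astronomical unit, in miles

neptune_like_orbit = 140e9 / AU_MILES   # Gomes's Neptune-size candidate
mars_like_perihelion = 5e9 / AU_MILES   # closest approach of the Mars-size alternative

print(round(neptune_like_orbit))    # ~1506 AU, matching "about 1,500 times farther than Earth"
print(round(mars_like_perihelion))  # ~54 AU, just beyond the main Kuiper belt
```

The second figure shows why the Mars-size alternative is plausible too: its closest approach would skim the outer edge of the Kuiper belt, where the disturbed objects live.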

However, actually finding such a world would be a challenge.

To begin with, the planet might be pretty dim. Also, Gomes's simulations don't give astronomers any clue as to where to point their telescopes—"it can be anywhere," he said.

No Smoking Gun

Other astronomers are intrigued but say they'll want a lot more proof before they're willing to agree that the solar system, again, has nine planets.

Hal Levison, an astronomer at the Southwest Research Institute in Boulder, Colorado, says he isn't sure what to make of Gomes's finding. "It seems surprising to me that a [solar] companion as small as Neptune could have the effect he sees," Levison said. But, he added, "I don't think he really has any evidence that suggests it is out there."

"Obviously, finding another planet in the solar system is a big deal," said Rory Barnes, an astronomer at the University of Washington. But "I know Rodney, and I'm sure he did the calculations right," Barnes added. "So while, yes, the evidence doesn't exist yet, I thought the bigger point was that he showed us that there are ways to find that evidence."

Instead, he added, Gomes "has laid out a way to determine how such a planet could sculpt parts of our solar system."

Douglas Hamilton, an astronomer from the University of Maryland, agrees that the new findings are far from definitive. "What he showed in his probability arguments is that it's slightly more likely" that such a planet exists, Hamilton said. "He doesn't have a smoking gun yet."

Tuesday, 29 May 2012

Our galaxy could have as many as a hundred thousand billion life-carrying Earth-sized planets floating between the stars, according to a new study. Recent estimates have suggested that our galaxy has as many planets as stars - approximately 200 billion - with most of those planets not orbiting a star. But this latest study dramatically increases the number of 'free-floating' planets. An international team of scientists led by Professor Chandra Wickramasinghe, of the University of Buckingham, publish their findings in the journal Astrophysics and Space Science.
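To get a sense of how dramatic the increase is, compare the two figures quoted above; both numbers are taken directly from the article:

```python
# "A hundred thousand billion" free-floating planets against roughly
# 200 billion stars -- both figures as quoted in the article.
free_floating_planets = 1e14
stars_in_galaxy = 200e9

print(free_floating_planets / stars_in_galaxy)  # 500.0 -- about 500 wanderers per star
```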

Wickramasinghe and colleagues propose that these planets originated in the early universe a few million years after the Big Bang, and that they make up most of the so-called "missing mass" of galaxies, known as dark matter. They calculate that on average our solar system would be visited by a free-floating planet once every 26 million years. As each one passes by our solar system, it accumulates up to 1000 tonnes of interplanetary dust onto its surface. "If the dust included microbial material that originated on Earth... this process offers a way by which evolved genes from Earth life could become dispersed through the galaxy," they write.

Dr Simon O'Toole, an astronomer at the Australian Astronomical Observatory, isn't convinced the number of free-floating planets is so high. "It's a fascinating idea, but involves too many assumptions to say for sure that it's going to be real," says O'Toole.

He also questions the rate at which such planets would encounter our solar system.

"Any planet that came near our solar system would cause all sorts of gravitational disturbances among the planets," says O'Toole. "[Therefore any encounters] would only be around the very edges, beyond Neptune's orbit."

Despite some scepticism in astronomical circles, he says it is possible for free-floating planets to collect material from our solar system.

"I like the idea of these wandering planets picking up microbial matter, a kind of panspermia idea. People are very interested in it, but aren't 100 per cent behind it."

Sunday, 27 May 2012

Hydrogen gas offers one of the most promising sustainable energy alternatives to limited fossil fuels. But traditional methods of producing pure hydrogen face significant challenges in unlocking its full potential, either by releasing harmful carbon dioxide into the atmosphere or requiring rare and expensive chemical elements such as platinum.

Now, scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory have developed a new electrocatalyst that addresses one of these problems by generating hydrogen gas from water cleanly and with much more affordable materials. The novel form of catalytic nickel-molybdenum-nitride -- described in a paper published online May 8, 2012 in the journal Angewandte Chemie International Edition -- surprised scientists with its high-performing nanosheet structure, introducing a new model for effective hydrogen catalysis.

"We wanted to design an optimal catalyst with high activity and low costs that could generate hydrogen as a high-density, clean energy source," said Brookhaven Lab chemist Kotaro Sasaki, who first conceived the idea for this research. "We discovered this exciting compound that actually outperformed our expectations."

Goldilocks chemistry

Water provides an ideal source of pure hydrogen -- abundant and free of harmful greenhouse gas byproducts. The electrolysis of water, or splitting water (H2O) into oxygen (O2) and hydrogen (H2), requires external electricity and an efficient catalyst to break chemical bonds while shifting around protons and electrons. To justify the effort, the amount of energy put into the reaction must be as small as possible while still exceeding the minimum required by thermodynamics, a figure associated with what is called overpotential.
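The thermodynamic minimum for water splitting follows from the standard Gibbs free energy of the reaction via E = ΔG / (nF). A minimal sketch, using the standard textbook values (not figures from the Brookhaven paper):

```python
# Thermodynamic minimum voltage for water electrolysis at standard conditions.
# Values assumed here: dG = 237.13 kJ/mol for H2O -> H2 + 1/2 O2,
# n = 2 electrons transferred per H2 molecule, Faraday constant F = 96485 C/mol.
DELTA_G = 237.13e3   # J/mol, standard Gibbs free energy of water splitting
N_ELECTRONS = 2      # electrons per H2 molecule
FARADAY = 96485      # C/mol

e_min = DELTA_G / (N_ELECTRONS * FARADAY)  # ~1.23 V

def overpotential(applied_volts):
    """Extra voltage a real cell needs beyond the thermodynamic minimum."""
    return applied_volts - e_min

print(round(e_min, 3))              # 1.229
print(round(overpotential(1.5), 3)) # 0.271
```

A good catalyst is one that keeps that second number, the overpotential, as close to zero as possible.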

For a catalyst to facilitate an efficient reaction, it must combine high durability, high catalytic activity, and high surface area. The strength of an element's bond to hydrogen determines its reaction level -- too weak, and there's no activity; too strong, and the initial activity poisons the catalyst.

"We needed to create high, stable activity by combining one non-noble element that binds hydrogen too weakly with another that binds too strongly," said James Muckerman, the senior chemist who led the project. "The result becomes this well-balanced Goldilocks compound -- just right."

Unfortunately, the strongest traditional candidate for an electrocatalytic Goldilocks comes with a prohibitive price tag.

Problems with platinum

Platinum is the gold standard for electrocatalysis, combining low overpotential with high activity for the chemical reactions in water-splitting. But with rapidly rising costs -- already hovering around $50,000 per kilogram -- platinum and other noble metals discourage widespread investment.

"People love platinum, but the limited global supply not only drives up price, but casts doubts on its long-term viability," Muckerman said. "There may not be enough of it to support a global hydrogen economy."

In contrast, the principal metals in the new compound developed by the Brookhaven team are both abundant and cheap: $20 per kilogram for nickel and $32 per kilogram for molybdenum. Combined, that's 1000 times less expensive than platinum. But with energy sources, performance is often a more important consideration than price.
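The cost comparison is simple arithmetic on the per-kilogram prices quoted above:

```python
# Per-kilogram prices as quoted in the article (approximate, in USD).
platinum = 50_000
nickel = 20
molybdenum = 32

ratio = platinum / (nickel + molybdenum)
print(round(ratio))  # 962 -- roughly the "1000 times less expensive" quoted
```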

Turning nickel into platinum

In this new catalyst, nickel takes the reactive place of platinum, but it lacks a comparable electron density. The scientists needed to identify complementary elements to make nickel a viable substitute, and they introduced metallic molybdenum to enhance its reactivity. While effective, it still couldn't match the performance levels of platinum.

"We needed to introduce another element to alter the electronic states of the nickel-molybdenum, and we knew that nitrogen had been used for bulk materials, or objects larger than one micrometer," said research associate Wei-Fu Chen, the paper's lead author. "But this was difficult for nanoscale materials, with dimensions measuring billionths of a meter."

The scientists expected the applied nitrogen to modify the structure of the nickel-molybdenum, producing discrete, sphere-like nanoparticles. But they discovered something else.

Subjecting the compound to a high-temperature ammonia environment infused the nickel-molybdenum with nitrogen, but it also transformed the particles into unexpected two-dimensional nanosheets. The nanosheet structures offer highly accessible reactive sites -- consider the surface area difference between bed sheets laid out flat and those crumpled up into balls -- and therefore more reaction potential.

Using a high-resolution transmission microscope in Brookhaven Lab's Condensed Matter Physics and Materials Science Department, as well as x-ray probes at the National Synchrotron Light Source, the scientists determined the material's 2D structure and probed its local electronic configurations.

"Despite the fact that metal nitrides have been extensively used, this is the first example of one forming a nanosheet," Chen said. "Nitrogen made a huge difference -- it expanded the lattice of nickel-molybdenum, increased its electron density, made an electronic structure approaching that of noble metals, and prevented corrosion."

Hydrogen future

The new catalyst performs nearly as well as platinum, achieving electrocatalytic activity and stability unmatched by any other non-noble metal compounds. "The production process is both simple and scalable," Muckerman said, "making nickel-molybdenum-nitride appropriate for wide industrial applications."

While this catalyst does not represent a complete solution to the challenge of creating affordable hydrogen gas, it does offer a major reduction in the cost of essential equipment. The team emphasized that the breakthrough emerged through fundamental exploration, which allowed for the surprising discovery of the nanosheet structure.

"Brookhaven Lab has a very active fuel cell and electrocatalysis group," Muckerman said. "We needed to figure out fundamental approaches that could potentially be game-changing, and that's the spirit in which we're doing this work. It's about coming up with a new paradigm that will guide future research."

Friday, 25 May 2012

A team that includes NASA and the U.S. Air Force Research Laboratory (AFRL) is celebrating the successful launch of an experimental hypersonic scramjet research flight from the Pacific Missile Range Facility on the island of Kauai, Hawaii. NASA, AFRL and Australia's Defence Science and Technology Organisation (DSTO) are working with a number of partners on the HIFiRE (Hypersonic International Flight Research Experimentation) program to advance hypersonic flight -- normally defined as beginning at Mach 5, five times the speed of sound. The research program is aimed at exploring the fundamental technologies needed to achieve practical hypersonic flight. Being able to fly at hypersonic speeds could revolutionize high-speed, long-distance flight and provide more cost-effective access to space.

It was the fourth of a planned series of up to 10 flights under HIFiRE and the second focused on scramjet engine research. The payload was developed under a partnership between the AFRL and NASA, with contributions from the Navy's detachment at White Sands Missile Range, N.M., and ATK GASL, located in Ronkonkoma, N.Y.

During the experiment the scramjet -- aboard its sounding rocket -- climbed to about 100,000 feet (30,480 meters) in altitude, accelerated from Mach 6 to Mach 8 (4,567 to 6,090 miles per hour; 7,350 to 9,800 kilometers per hour) and operated for about 12 seconds, a big accomplishment for flight at hypersonic speeds. More than 700 instruments on board recorded and transmitted data to researchers on the ground.

"This is the first time we have flight tested a hydrocarbon-fueled scramjet accelerating from Mach 6 to Mach 8," said NASA Hypersonics Project Scientist Ken Rock, based at NASA's Langley Research Center in Hampton, Va. "At Mach 6 the inlet compression and combustion process was designed to reduce the flow to below Mach 1 -- subsonic combustion. But at Mach 8 flight the flow remained greater than Mach 1, or supersonic, throughout the engine. So this test will give us unique scientific data about scramjets transitioning from subsonic to supersonic combustion -- something we can't simulate in wind tunnels."

The HIFiRE 2 scramjet research payload included a hypersonic inward-turning inlet, followed by a scramjet combustor and dual-exhaust nozzle.
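The speed figures quoted for Mach 6 and Mach 8 can be reproduced from the Mach numbers. A rough sketch, assuming the sea-level standard speed of sound; at 100,000 feet the true speed of sound is lower, so treat these as nominal equivalents rather than airspeeds:

```python
# Sanity-checking the article's speed figures. The sea-level standard speed
# of sound (~761.2 mph, ~1225 km/h) is an assumption here, not flight data.
SOUND_MPH = 761.2
SOUND_KMH = 1225.0

for mach in (6, 8):
    print(f"Mach {mach}: {round(mach * SOUND_MPH)} mph, {round(mach * SOUND_KMH)} km/h")
    # Mach 6: 4567 mph, 7350 km/h
    # Mach 8: 6090 mph, 9800 km/h
```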

The success of the three-stage launch system, consisting of two Terrier boost motors and an Oriole sustainer motor, is another important achievement of the HIFiRE 2 mission. The mission, the first flight of this sounding rocket configuration, opens the door for a new high-performance flight configuration to support future Air Force, Navy, and NASA flight research. The data collected during the HIFiRE experiments is expected to make a significant contribution to the development of future high-speed air-breathing engine concepts and help improve design, modeling, and simulation tools.

Demonstrating supersonic combustion in flight with a hydrocarbon-fueled scramjet, compared to a hydrogen-fueled scramjet, is significant, according to researchers. While hydrogen fuel is more reactive, hydrocarbon fuel offers many benefits, including operational simplicity and higher fuel density, so a hypersonic vehicle can carry more fuel. The HIFiRE team has already achieved other milestones, such as the design, assembly and extensive pre-flight testing of the hypersonic vehicles and the design of complex avionics and flight systems. This represents yet another noteworthy achievement for the program, with additional test flights scheduled in the coming months and years.

Wednesday, 23 May 2012

A stream of highly charged particles from the Sun is headed straight toward Earth, threatening to plunge cities around the world into darkness and bring the global economy screeching to a halt. This isn't the premise of the latest doomsday thriller. Massive solar storms have happened before, and another one is likely to occur soon, according to Mike Hapgood, a space weather scientist at the Rutherford Appleton Laboratory near Oxford, England.

Much of the planet's electronic equipment, as well as orbiting satellites, has been built to withstand these periodic geomagnetic storms. But the world is still not prepared for a truly damaging solar storm, Hapgood argues in a recent commentary published in the journal Nature. Hapgood talked with The Times about the potential effects of such a storm and how the world should prepare for it.

There can be a whole range of effects, he says. The classic one everyone quotes is the effect on the power grid. A big geomagnetic storm can essentially put extra electric currents into the grid. If it gets bad enough, you can have a complete failure of the power grid; it happened in Quebec back in 1989. If you've got that, then you've just got to get it back on again. But you could also damage the transformers, which would make it much harder to get the electric power back.

The storms can also disrupt communications on transoceanic flights. Sometimes when that happens, airlines will either divert or cancel flights. So that would be like the disruption we had in Europe from the volcano two years ago, where they had to close down airspace for safety reasons.

Monday, 21 May 2012

Astronomers searching for Earth-like planets can strike off stars that have a 'hot Jupiter' orbiting around them, according to a new study. Hot Jupiters are large gaseous planets that take less than a week to complete one orbit. Using data collected by NASA's Kepler space telescope, a team of researchers led by Dr Jason Steffen, of the Fermilab Center for Particle Astrophysics in Illinois, examined 63 stars that are known to have hot Jupiters to see if they had smaller planets orbiting alongside. Their analysis failed to find any.

The team also looked at a number of stars with 'warm Jupiters' - large planets that orbit their star once every two weeks - and 'hot Neptunes' - close-orbiting mid-sized planets. They detected the presence of Earth-like planets around approximately 10 per cent of the warm Jupiter systems, and 30 per cent of the hot Neptune systems.

The finding, published in the Proceedings of the National Academy of Sciences, also confirms the chaotic path these planets take to end up close to their parent star.

Chaotic beginnings

Until recently, most scientists believed that these large planets slowly migrated in towards their star, pushing smaller planets in with them. A competing theory suggests a close encounter between two large gaseous planets early in a solar system's formation, forcing one to be flung out of the system, and the other into a highly elliptical orbit. Eventually, the star pulls the remaining planet into a tight circular orbit. If that planet passes close to any smaller planets that have formed around the star, it also flings them out of the system.

"If you have a [previously] measured planet, which in this case is a hot Jupiter, you can look for deviations that result from gravitational interactions with other planets in the same system," says Steffen.
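The "less than a week" criterion implies a very tight orbit, via Kepler's third law. A minimal sketch for a Sun-like star, using standard constants (G, the solar mass, and the astronomical unit; the 7-day period is just the threshold from the definition above):

```python
import math

# Kepler's third law: a^3 = G * M * T^2 / (4 * pi^2).
# How close must a planet be to a Sun-like star to orbit in under a week?
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
AU = 1.496e11        # astronomical unit, m

def orbital_radius_au(period_days):
    t = period_days * 86400.0
    a = (G * M_SUN * t**2 / (4 * math.pi**2)) ** (1 / 3)
    return a / AU

print(round(orbital_radius_au(7), 3))  # ~0.072 AU -- well inside Mercury's orbit
```

Anything with a one-week orbit sits within about a tenth of Mercury's distance from its star, which is why these worlds are scorching hot.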

Steffen says the absence of smaller rocky planets alongside hot Jupiters confirms the planet-planet interaction theory and "puts the nails into the coffin" of the migration theory.

He says it also narrows down the number of solar systems astronomers need to focus on when looking for Earth-like planets.

"You're not going to find habitable Earths in these hot Jupiter systems," says Steffen. "There's a lot of value in continuing to monitoring them, but it's for other reasons now."

Saturday, 19 May 2012

Scientists have figured out how to stop brain cell death in mice with brain disease, which could provide a deeper understanding of neurodegenerative diseases such as Alzheimer's and Parkinson's. British researchers writing in the journal Nature say they have found a major pathway leading to brain cell death in mice with prion disease, the mouse equivalent of Creutzfeldt-Jakob Disease (CJD). They then worked out how to block it, and were able to prevent brain cells from dying, helping the mice live longer.

The finding, described by one expert as "a major breakthrough in understanding what kills neurons", points to a common mechanism by which brain diseases such as Alzheimer's, Parkinson's and CJD damage nerve cells. In neurodegenerative diseases, proteins "mis-fold" in various ways, leading to a build-up of misshapen proteins, the researchers explain in the study. These misshapen proteins form the plaques found in the brains of patients with Alzheimer's and the Lewy bodies found in Parkinson's disease.

"What's exciting is the emergence of a common mechanism of brain cell death, across a range of different neurodegenerative disorders, activated by the different mis-folded proteins in each disease," says Professor Giovanna Mallucci, who led the research at the University of Leicester's toxicology unit.

In these diseases, neurons in the brain die, destroying the brain from the inside. But why the neurons die has remained an unsolved mystery, presenting an obstacle to developing effective treatments and to diagnosing the illnesses at early stages when medicines might work better. An estimated 18 million people worldwide have Alzheimer's, and Parkinson's is thought to affect around one in 100 people over the age of 60.

Switching off

Mallucci's team found that the build-up of mis-folded proteins in the brains of mice with prion disease activated a natural defence mechanism in cells, which switches off the production of new proteins. This would normally switch back on again, the researchers explain, but in these ill mice the continued build-up of misshapen proteins keeps the switch turned off. This is the trigger point leading to brain cell death, because key proteins essential for cell survival are not made.

By injecting a protein that blocks the "off" switch, the scientists were able to restore the production of the survival proteins and halt the neurodegeneration. They found the brain cells were protected, protein levels were restored and synaptic transmission - the way brain cells signal to each other - was re-established. The mice also lived longer, even though only a very small part of their brains had been treated.

"The fact that in mice with prion disease we were able to manipulate this mechanism and protect the brain cells means we may have a way forward in how we treat other disorders," says Mallucci.

Eric Karran, director of research at the charity Alzheimer's Research UK, says that while the research is still at an early stage, the results are exciting.

"While neurodegenerative diseases can have many different triggers, this study suggests that they may act through a common mechanism to damage nerve cells. The findings present the appealing concept that one treatment could have benefits for a range of different diseases," he says.

Professor Roger Morris, a molecular neurobiologist at King's College London who was not involved in the work, says the finding is "a major breakthrough in understanding what kills neurons in neurodegenerative disease".

"There are good reasons for believing this response, identified with prion disease, applies also to Alzheimer's and other neurodegenerative diseases," he says.

Thursday, 17 May 2012

Scientists have used sound waves to determine the composition of the Earth's lower mantle, possibly solving the mystery of our planet's missing silicon. The work, reported in the journal Nature, answers a riddle which has blighted models of the Earth's composition for years.

These models rely on the composition of chondritic meteorites, which are thought to have formed in the same region as the Earth more than four billion years ago. However, samples of the upper mantle, which have been brought to the surface mostly through tectonic or volcanic activity, show the Earth has much less silicon than the chondritic meteorites.

Pushing sound waves

To try and solve the problem, a team including Dr Motohiko Murakami from Japan's Tohoku University performed a series of laboratory measurements replicating the lower mantle's environment. They pushed sound waves through different minerals at densities, pressures and temperatures matching the conditions that exist more than 660 kilometres below the surface.

After comparing their results with seismic data, they conclude that the lower mantle is far more highly enriched in silicon than the upper mantle, with 93 per cent of it composed of a silicate perovskite mineral. Murakami and colleagues say their new data is consistent with the chondritic Earth model. They speculate the enrichment may have been caused by fractional crystallisation of the magma ocean extending into the lower mantle during Earth's early history. Their findings also show that there is little movement of materials between the lower and upper mantle.

Murakami and colleagues conclude that this primordial chemical stratification may have been preserved until the present day.
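The logic of the measurement rests on the basic relation between elasticity, density, and sound speed: v = sqrt(K / rho). A minimal sketch, using rough ambient-condition values for MgSiO3 perovskite for illustration only; the actual study measured velocities at lower-mantle pressures and temperatures:

```python
import math

# Illustrative bulk sound speed, v = sqrt(K / rho). The modulus and density
# below are approximate ambient values for MgSiO3 perovskite, assumed here
# for illustration -- not the study's high-pressure data.
K = 250e9      # bulk modulus, Pa
RHO = 4100.0   # density, kg/m^3

v = math.sqrt(K / RHO)
print(round(v))  # ~7809 m/s
```

Matching speeds like this one, measured mineral by mineral in the lab, against the speeds of real seismic waves is what lets the composition of an inaccessible layer be inferred.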

Ground breaking

In an accompanying commentary in the same issue of Nature, Professor Ian Jackson from the Australian National University, says Murakami and colleagues' work breaks new ground.

But he warns that the story is unlikely to be over yet.

"Inevitably in such pioneering work, there are still a few technical issues that need to be resolved," says Jackson.

"These tests are done over extreme pressures to more than a million atmospheres, and so the issue of how the pressure is calibrated becomes a key factor."

"What's needed now is more follow up work that would allow these very high pressure and temperature results to be compared with the same tests carried out under more precisely known conditions found at lower to moderate pressures," says Jackson.

"One line of attack would involve using new computer modelling of mineral properties. This is an increasingly powerful way of assessing the behaviour of rocks under extreme conditions."

But Jackson says there is still the possibility the entire chondritic Earth model is wrong.

Some scientists believe the movement of planets during the early formation of the solar system, known as planetary migration, could have changed the distribution and mix of asteroids and meteors.

Tuesday, 15 May 2012

Scientists have witnessed the rare spectacle of a supermassive black hole devouring a star that had ventured too close - an event that occurs about once in 10,000 years. Supermassive black holes are giant gravity wells that usually lurk dormant and undetected at the centre of galaxies, but can occasionally be tracked by the scraps left over from their stellar feasts.

"Black holes, like sharks, suffer from a popular misconception that they are perpetual killing machines," says researcher Ryan Chornock from the Harvard-Smithsonian Centre for Astrophysics. "Actually, they're quiet for most of their lives. Occasionally a star wanders too close, and that's when a feeding frenzy begins."

If a star passes too close, the black hole's gravitational pull can rip it apart before sucking in its gases, which are heated by the friction and start to glow - giving away the silent killer's hiding place.

Smoking gun

Chornock and colleagues observed such a glow in May 2010 through a telescope mounted on Mount Haleakala in Hawaii, as well as a NASA satellite. Their report in the journal Nature describes how the flare brightened to a peak that July, before fading away over the course of a year.

"Initially we didn't know exactly what this flare was because it was so bright that when we looked at the galaxy we couldn't see the stars to determine how far away the galaxy was," says study co-leader Suvi Gezari of Johns Hopkins University in Baltimore. Observations over several months allowed the team to conclude that the black hole was at the centre of a galaxy 2.7 billion light years away, and about three million times the mass of our Sun, making it about the same size as the Milky Way's central black hole. Its victim was probably a star in its late, red giant phase which had tempted fate by wandering to within about 150 million kilometres of the black hole, about the same distance as the Earth is from the Sun.

"This is the first time where we actually have enough detailed information that we can actually determine what kind of star was torn apart by a black hole and how big the black hole was that did it," says Gezari. She says this was the first such space feast observed from beginning to end, and "that is very exciting because that time scale is how we determine how big the black hole is".
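The 150-million-kilometre figure is consistent with the standard order-of-magnitude estimate for tidal disruption, r_t ~ R_star * (M_bh / M_star)^(1/3). A rough sketch, assuming a Sun-like star for simplicity (the actual victim was a larger red giant) and the three-million-solar-mass black hole quoted above:

```python
# Order-of-magnitude tidal disruption radius. Assumptions: a star with the
# Sun's radius and mass, and the article's 3-million-solar-mass black hole.
R_SUN_KM = 6.96e5   # solar radius, km
M_RATIO = 3e6       # black hole mass / stellar mass

r_tidal_km = R_SUN_KM * M_RATIO ** (1 / 3)
print(round(r_tidal_km / 1e6))  # ~100 million km -- comparable to the 150 million km quoted
```

A puffier red giant has a larger radius and hence a larger disruption radius, which is why it could be shredded at that distance while a compact star might survive.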

Eaten alive

The scientists concluded that the star had lost its hydrogen outer shell in a previous pass by the black hole, leaving just its helium core to be consumed in round two.

"It was really spectacular to have so much info and have all the pieces of evidence come together to form a consistent picture of what happened," says Gezari.

Black holes are very dense regions in space-time with a gravitational force so strong that even light cannot escape. Scientists who study them hope to learn more about the evolution of galaxies.

Stars in our own Milky Way galaxy, including the Sun, are too far away to be at risk of being consumed, says Gezari.

"We would have to wait at least 10,000 years before we would be able to see a star being gobbled by our own black hole," she says.

"So the best way to find these events is not to wait around for our own Milky Way galaxy to gobble a star, we actually have to look at hundreds of thousands of galaxies in the sky to catch one in the act of shredding up a star."

Dr David Floyd from the Monash Centre for Astrophysics in Melbourne says scientists normally notice a flare in a distant galaxy and then turn their telescopes towards the object after the process has already begun.

"What makes this so special was the fact that they actually caught the black hole as it was ripping the stellar core apart," says Floyd.

"Now increasingly with new telescopes such as Pan-STARRS, which is the one that was used here, and the LSST or Large Synoptic Survey Telescope, which will be built in Chile, we will be able to scan the skies continuously letting us catch these events as they happen."

Sunday, 13 May 2012

German scientists have used optical fibres to send a timing signal from one of the world's most accurate clocks over a distance of more than 900 kilometres. The breakthrough paves the way for a network of super-precise time-keeping devices that could revolutionise technology and fundamental physics experiments.

Optical clocks are similar to atomic clocks, but their 'tick' mirrors the vibrations of atoms at optical frequencies, rather than microwave frequencies, making it possible to measure times as small as a quadrillionth of a second. The clocks are so accurate they can only be set if they're linked together, something that can't be done using the noisy satellite technology that connects today's atomic clocks. And the clocks often take up entire labs spread out across the globe, with signals that become weak and distorted when travelling over long distances.

The new research, reported in the journal Science, demonstrates for the first time that optical fibre networks can allow optical clocks to be synchronised over long distances. "The ability to compare multiple optical clocks will allow us to perform high precision experiments on very large scales … and is a prerequisite to any redefinition of the second based on an optical standard," says lead author Katharina Predehl from the Max Planck Institute of Quantum Optics (MPQ) in Germany.

Bridging the distance

Researchers have been trying to create a fibre link between optical clocks for several years.

The previous record was 146 kilometres.

In this study, the team staggered nine amplifiers along the optical fibres in order to keep the signal strong without introducing noise. A feedback loop was also used to minimise distortion.

These techniques allowed the researchers to send a clear signal along 920 kilometres of optical fibre from Germany's national metrology institute in Braunschweig to the MPQ with extremely high accuracy.

Revolutionising technology

Extending fibre connections further could allow researchers to build a global network of optical clocks, which could one day change the way the world keeps time, says Dr Bruce Warrington of Australia's National Measurement Institute in Sydney.

"In the short term it won't mean much. But in the same way we depend on today's atomic clocks for telecommunications and navigation devices, we will come to depend on optical clocks."

According to Warrington, an optical clock network would also greatly benefit radio astronomy, such as the Square Kilometre Array project, and would allow researchers to test fundamental physical concepts, such as the speed of light and the theory of relativity.

Warrington is part of Australia's National Time and Frequency Network (NTFN), a collaboration that aims to build an optical clock in South Australia and develop optical fibre technology that will send its signal to researchers around the country. "We can use exactly the same piece of fibre to communicate an optical clock signal, as long as we can overcome the challenge of co-existing with normal data," he says.

According to Professor Andre Luiten, the leader of the NTFN at the University of Western Australia, Australia has a head start, as the implementation of the National Broadband Network will ensure the country is well connected with optical fibres.

Friday, 11 May 2012

Finding planets outside our solar system that can sustain life should be made a top priority, say Australian astronomers.

Understanding habitability and using that knowledge to locate the nearest habitable planet may be crucial for our survival as a species, write Dr Charley Lineweaver and PhD student Aditya Chopra of the Australian National University in the Annual Reviews of Earth and Planetary Sciences.

Since the first discovery of a planet orbiting another star was made in 1995, the number of known exoplanets has skyrocketed to more than 750. While a small handful of these planets are known to be 'Earth-like', astronomers are a long way from knowing whether they can sustain life.

"Determining whether these planets are habitable has become the new holy grail of astronomy," says Lineweaver. "It's probably one of the biggest, most confusing, and important issues that planetary scientists are going to have to deal with in the next 10 to 20 years."

Lineweaver and Chopra reviewed current research examining environments where life is found on Earth and the environments thought to exist on other planets.

Lineweaver says one of the reasons why humans should search for habitable planets is to place future human colonies. He dismisses the idea that humans should stay on Earth, comparing it to the attitude of some towards Columbus' proposed trip across the Atlantic Ocean. "It's a bit like the Europeans in 1450 saying 'Hey, what does it matter whether we go exploring the rest of the world?'"

Comparing planets

Lineweaver and Chopra's review of the literature found that the presence of water and a temperature range of between -20°C and 122°C are the two most important parameters for harbouring life. They argue that the conditions for life to form, called the abiogenesis habitable zone, are much narrower than the conditions needed for life to survive.

"Over the past few decades our exploration of the Earth has turned up life in all kinds of weird environments where we didn't think life could be in, and we're finding all types of extraterrestrial environments that we didn't know about before," says Lineweaver.
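As a minimal sketch, the two survival parameters from the review (liquid water present, temperature between -20°C and 122°C) can be written as a simple check. The function and argument names are illustrative placeholders, not from the paper:

```python
# Sketch of the two survival parameters from the review:
# presence of water, and temperature between -20 and 122 degrees C.
def could_harbour_life(has_water: bool, temperature_c: float) -> bool:
    return has_water and -20.0 <= temperature_c <= 122.0

print(could_harbour_life(True, 15.0))    # Earth-like surface: True
print(could_harbour_life(True, 460.0))   # Venus-like surface: False
print(could_harbour_life(False, 20.0))   # mild but bone-dry world: False
```

The narrower abiogenesis zone the authors describe would be a tighter version of the same test, but the paper's bounds for it aren't given in the article, so they're left out here.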

The report also raises the possibility of habitable planets that don't contain life. "As these two groups expand they start to overlap in big ways, and that's where habitable planets will be found."

"Life, by managing its own environment, makes a planet habitable. It has produced adaptive features as a result of Darwinian evolution to live in colder and warmer environments," says Lineweaver. "It's kind of like an adult can live in a higher range of temperatures than a baby can."

Lineweaver believes observation programs such as the Kepler telescope, which has been extended to at least 2015, will continue to discover more Earth-like planets.

"The next step will be to develop a satellite that can look at the atmospheres of these planets, which will be able to give us some information about whether there is life there or not," he says.

If we find an Earth-like planet, Lineweaver says the next step is to send an interstellar probe to explore it.

Wednesday, 9 May 2012

Researchers say they have designed a tiny crystal that acts like a quantum computer so powerful it would take a computer the size of the known universe to match it.

Details of the crystal, which is made up of just 300 atoms, are published today in the journal Nature.

"Quantum computing is a kind of information science that is based on the notion that if one performs computations in a fundamentally different way than the way your classical desktop computer works, there's a huge potential to solve a variety of problems that are very, very hard or near impossible for standard computers," says study co-author Dr Michael Biercuk of the University of Sydney.

The crystal simulator uses a property of quantum mechanics called superposition, where a quantum particle appears to be in two distinct states at the same time. This means the particle, known as a qubit, can be used to solve two equations simultaneously. As the number of qubits increases, the number of states increases exponentially. For example, 2 qubits can simultaneously be in 4 states, 3 qubits in 8 states: 2 to the power of n states for n qubits.

"The central element is something like a millimetre in diameter, 300 atoms that are suspended in space," says Biercuk. "But of course everything depends on a huge amount of technical infrastructure around it. There are vacuum chambers and pumps and lasers, and all of that takes up something like a room."

Outstrips classical computers

According to Biercuk, the computing power of the 300-atom crystal simulator far outstrips the capacity of today's classical computers. "It turns out that that computer would need to be the size of the known universe - which is clearly something that's not possible to achieve," he says.

Experts believe quantum computing is moving to a stage where it is so far out in front and performing such complex tasks it will be difficult to check if it is working accurately. "They're not easily checked by a classical computer, which opens a whole variety of problems," says Biercuk. And he adds that there is still plenty of work to be done before quantum computers start appearing on desks in homes and offices.
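The exponential scaling Biercuk describes is easy to sketch. The comparison constant of roughly 10^80 atoms in the observable universe is a commonly quoted order-of-magnitude estimate, not a figure from the article:

```python
# A general superposition of n qubits requires 2**n complex amplitudes
# to represent classically: each added qubit doubles the state count.
def num_states(n_qubits: int) -> int:
    return 2 ** n_qubits

print(num_states(2))    # 4 simultaneous states
print(num_states(3))    # 8 simultaneous states

# For the 300-atom simulator, the state count dwarfs the roughly
# 10**80 atoms commonly estimated for the observable universe.
print(num_states(300) > 10 ** 80)  # True
```

This doubling per qubit is why a classical machine matching the 300-atom crystal would, as Biercuk puts it, need to be the size of the known universe.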

Saturday, 5 May 2012

The sun is angry and may be dying. Earth is beginning to feel its wrath.

More than any other factor it is the turbulent, unpredictable sun that drives the planet's meteorological and geophysical events. In the years leading up to the beginning of the next Ice Age, the sun is going through remarkable changes, changes that are having major impacts on Earth's weather, climate, volcanoes, and seismic activity.

As the sun's explosions reach a crescendo during 2013, its gloom may spell Earth's doom, or at least modern civilization's doom. Then the sun will cool and plunge the world into cold. Many scientists agree the Ice Age cometh; the only question is: will the encroaching freeze last 100 years, or 100,000?

The intensifying of geomagnetic storms is even having a significant impact on the human brain, igniting more violence, more uprisings, and more threats of war.

The list of the sun's expected Doomsday events is long and reads like a Hollywood disaster movie: the U.S. government collapsing, Yellowstone, or the approaching Ice Age.

Megaquakes increase and intensify

Imagine the megathrust quake that struck Fukushima, Japan during March 2011 occurring all over the globe. Seismologists have imagined it, and some of them are very worried, and with good cause.

The Pacific plate that three continents sit on or near has ruptured, tearing open the seafloor and slowing the Earth's rotation. Frantic geophysicists are now wringing their hands over similar megaquakes and destruction happening in high-density populated areas like the Pacific Northwest of the U.S. and Canada. Such a quake at a magnitude of 9.0 or higher would obliterate Seattle as well as the rest of Washington state as far east as Idaho. Oregon would be devastated and Vancouver, B.C. decimated.

Another giant fault also seems to be awakening: the New Madrid fault in America's heartland. The last time the New Madrid quaked, in the mid-1800s, church bells rang in Pennsylvania, the Mississippi River ran backward, and destruction was widespread. The population was much smaller back then. Geophysicists are warning that the fault might slip sooner rather than later. Not-so-gentle nudges from the sun might help it along.

Volcanic venting is already occurring in two states. At least a dozen other potential megaquake "hotspots" have been identified across the world and are being keenly watched. When those faults rupture, no technology can halt the cascading catastrophe.

Mankind goes on crazed killing spree

Two researchers, decades apart, studied the sun and its magnetic effect on the human brain. Both had disciplines in different fields of knowledge: one was an astrophysicist, the other an historian. The historian's research analyzed data going back thousands of years.

What they found was shocking. During solar storm peaks and periods of no sunspot activity, the human race seemed to go nuts.

During periods of intense solar activity, such as Earth is experiencing now, more humans have a tendency to commit bizarre acts, violence and murders increase, and wars, rebellions and revolutions break out.

Sound familiar?

Coronal Mass Ejection wipes out 21st Century technology

A large solar flare of a class known as an X-flare of 40 or higher can easily wipe out all 21st Century technology and catapult the nations struck back to the 19th Century in a matter of minutes.
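For context, solar X-ray flares are classified logarithmically by peak flux: an X-class flare corresponds to at least 10⁻⁴ W/m² in the GOES 1-8 Å band, and the number after the letter scales that baseline linearly. The mapping below is the conventional GOES scheme, not something from this article:

```python
# GOES soft X-ray flare classes: baseline peak flux in W/m^2 (1-8 Angstrom band).
# Each letter step is a factor of ten; the trailing number scales linearly.
CLASS_BASE_FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

def peak_flux(flare_class: str) -> float:
    """Convert a class string like 'X40' to approximate peak flux in W/m^2."""
    letter, number = flare_class[0].upper(), float(flare_class[1:])
    return CLASS_BASE_FLUX[letter] * number

print(peak_flux("M5"))    # five times an M1 flare
print(peak_flux("X40"))   # forty times an X1 flare
```

On this scale, the "X-flare of 40 or higher" described above would carry roughly forty times the peak X-ray flux of a bare X-class event.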

Thursday, 3 May 2012

Researchers searching for the source of cosmic rays are going back to the drawing board after ruling out gamma ray bursts as the most likely source.

A report in the journal Nature by scientists from the IceCube collaboration concluded there wasn't enough evidence to link gamma ray bursts and cosmic rays.

Cosmic rays are highly energetic charged particles, such as protons and heavier atomic nuclei, that strike the Earth from deep space with 100,000,000 times more energy than can be created in particle accelerators like CERN's Large Hadron Collider. Various models predict gamma ray bursts are the only events capable of producing enough energy to accelerate cosmic rays.

One of the paper's authors, Associate Professor Jenny Adams from New Zealand's University of Canterbury, says tracing cosmic rays back to their source is impossible because they're deflected by magnetic fields and galaxies.
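Why charged cosmic rays can't be traced while neutrinos can comes down to magnetic deflection. As a rough illustration (the proton energy and galactic field strength below are assumed, order-of-magnitude values, not figures from the article), the gyroradius of an ultra-high-energy proton is comparable to the size of a galaxy, so its arrival direction is thoroughly scrambled; a neutral particle flies straight:

```python
# Gyroradius r = E / (q * c * B) for an ultrarelativistic proton.
# Assumed illustrative values: E = 1e20 eV, galactic field B = 3 microgauss.
E_eV = 1e20                      # proton energy in eV (assumed)
q = 1.602e-19                    # elementary charge, C
c = 3.0e8                        # speed of light, m/s
B = 3e-10                        # 3 microgauss expressed in tesla (assumed)

E_joules = E_eV * q
r_metres = E_joules / (q * c * B)
r_kpc = r_metres / 3.086e19      # metres per kiloparsec

print(f"gyroradius ~ {r_kpc:.0f} kpc")  # tens of kpc, comparable to the Milky Way
```

With a bending radius that large over a galaxy-sized path, the particle's original direction is lost, which is exactly why the search turned to neutrinos.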

"Wherever cosmic rays are produced, neutrinos should also be produced. Because they're neutral they'll point back to their source. That's why the IceCube neutrino detector was built."

Buried up to two kilometres under the Antarctic ice at the South Pole, IceCube is the only instrument big enough to detect neutrinos generated by gamma ray bursts.

"The problem is, we found no neutrinos coming from gamma ray bursts," says Adams. "And if there aren't neutrinos, there can't be cosmic rays. I guess in some ways it's disappointing."

"But it's also deepened the mystery, because we now have to look for new possible sources to solve the riddle."

"The IceCube data we published in Nature came when the detector was only between half and three quarters complete," says Adams. Now, with IceCube finished and up to full capacity, Adams believes the search can focus on active galactic nuclei.

Adams says active galactic nuclei such as quasars and blazars, powerful energy beams produced by supermassive black holes, may be possible candidates. "Hopefully we'll find neutrinos or rule out some more models as to where cosmic rays can be produced," says Adams.

But one of the paper's other authors, Dr Gary Hill from the University of Adelaide, isn't so confident. "The statistical significance between cosmic rays and active galactic nuclei hasn't grown over time despite more observations," says Hill.

"Gamma ray bursts were our big shot. They were a good candidate. It really is a mystery because we can't figure out where cosmic rays could come from."

Tuesday, 1 May 2012

A new study of quasars has provided further evidence that dark energy is accelerating the expansion of the universe.

The study, to be published in the Astronomical Journal, supports earlier work by scientists including Nobel Laureates Professor Brian Schmidt from the Australian National University, Saul Perlmutter and Adam Riess. Their discovery was based on the measurement of exploding stars called Type 1a supernovae.

The new work, by scientists led by Professor Masamune Oguri from Japan's Kavli Institute for the Physics and Mathematics of the Universe, reached its findings by examining gravitationally lensed quasars. Gravitational lensing uses the mass of a foreground object, such as a galaxy, to act as a lens to bend light coming towards us from a more distant object.

Oguri and colleagues spent ten years examining 100,000 quasars charted by the Sloan Digital Sky Survey. These powerful jets of particles and energy are produced by supermassive black holes in the heart of galaxies from the early universe.

Within this set, the researchers identified 50 gravitationally lensed quasars, allowing them to determine their likely frequency over a given area. Because accelerated expansion would increase the distance to each quasar and the chances of gravitational lensing, they were able to infer the expansion speed of the universe.

"Our new result using gravitational lensing not only provides additional strong evidence for the accelerated cosmic expansion, but also is useful for accurate measurements of the expansion speed, which is essential for investigating the nature of dark energy," says Oguri.
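The inference is statistical: the more the expansion accelerates, the longer the light path to each quasar and the higher its chance of being lensed. A toy version of that comparison, using the survey's observed counts but entirely hypothetical model predictions (the two fractions below are invented for illustration, not from the paper):

```python
# Observed lensing fraction from the Sloan quasar sample described above.
n_quasars = 100_000
n_lensed = 50
observed_fraction = n_lensed / n_quasars   # 5 lensed quasars per 10,000

# Hypothetical predicted lensing fractions for two cosmologies
# (illustrative numbers only, not real model outputs):
predicted = {"no dark energy": 2e-4, "accelerating expansion": 5e-4}

# Favour whichever model predicts a fraction closest to the observation.
best = min(predicted, key=lambda m: abs(predicted[m] - observed_fraction))
print(best)   # accelerating expansion
```

The real analysis fits the lensing statistics against full cosmological models, but the logic is the same: the observed lensing rate discriminates between expansion histories.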

New approach

Professor Charlie Lineweaver, a cosmologist from the Australian National University, says this study demonstrates a new way to confirm the accelerated expansion of the universe.

"There are five ways to determine how fast the universe's expansion rate is accelerating," says Lineweaver. "Schmidt used supernovae, which is still the most accurate method. There's also baryonic acoustic oscillations, which are density waves that propagate through the universe, there's the cosmic microwave background radiation, and you can also count the numbers of star clusters at given distances."

"Having the same conclusions reached by different methods is a tremendously important tool in any scientific research. All science is built on that. You can imagine if one method gave you one answer and another method gave you a different answer, so what you're looking for is consistency."