Dark Matter Not So Clumpy After All
[12/7/2016]
Dark matter, the mysterious, invisible substance that makes up about 27 percent of the mass-energy content of the universe, may not be as clumpy as scientists previously thought.

In 2013, researchers with Europe's Planck mission, which studied the oldest light in the universe, found that dark matter has lumped together over time through gravitational attraction: what started out as a smooth, even distribution slowly formed dense chunks.

But new research at the European Southern Observatory's (ESO) Very Large Telescope (VLT) at the Paranal Observatory in Chile suggests that dark matter is not quite as clumpy as the Planck mission previously found.
"This latest result indicates that dark matter in the cosmic web, which accounts for about one-quarter of the content of the universe, is less clumpy than we previously believed," Massimo Viola, a researcher at the Leiden Observatory in the Netherlands who co-led the study, said in a statement.

To see how dark matter is distributed in the universe, the international team of researchers used data from the Kilo Degree Survey (KiDS) at the VLT Survey Telescope. This deep-sky survey looked at about 15 million galaxies in five patches of the southern sky, covering an area as big as 2,200 full moons (or 450 square degrees).

Because dark matter's gravity can bend light — a process called gravitational lensing — the light coming from these 15 million galaxies could reveal information about the structure and distribution of dark matter, the researchers suggest. In this study, they looked for a variation of this phenomenon known as weak gravitational lensing, or cosmic shear.
Weak gravitational lensing is a subtle effect that has to be measured with precision. When large-scale structures like galaxy clusters cause weak gravitational lensing, the light-warping effect is subtler and more difficult to detect than gravitational lensing around smaller objects like stars. But with high-resolution images taken by the VLT Survey Telescope, the researchers were able to detect this subtle effect. This study is the first to use this imaging method on such a large portion of the sky to map the invisible matter in the universe, the authors wrote.
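The statistical idea behind cosmic shear can be sketched in a few lines: the lensing signal is a tiny coherent distortion buried in much larger random galaxy shapes, and it only emerges when millions of galaxies are averaged. The numbers below (a 1 percent shear, 30 percent shape noise) are illustrative assumptions, not the survey's actual values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not KiDS's actual values): a coherent shear
# of 1 percent imprinted on galaxies whose intrinsic ellipticities
# scatter at the ~30 percent level.
true_shear = 0.01
n_galaxies = 15_000_000

intrinsic = rng.normal(0.0, 0.3, size=n_galaxies)  # random intrinsic shapes
observed = intrinsic + true_shear                  # lensing adds a tiny coherent distortion

# Averaging beats the shape noise down by sqrt(N), revealing the shear.
estimate = observed.mean()
error = observed.std() / np.sqrt(n_galaxies)
print(f"recovered shear: {estimate:.5f} +/- {error:.5f}")
```

With 15 million galaxies the statistical error on the mean is tens of times smaller than the signal, which is why such surveys need so many objects.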

When the researchers then used this data to calculate how clumpy dark matter is, they discovered that it is significantly smoother than the Planck satellite data had previously determined. This means that dark matter may be more evenly distributed than scientists have thought.
How dark matter has spread and clumped together since the Big Bang, 13.8 billion years ago, can provide insights into the evolution of the universe, according to co-author Hendrik Hildebrandt of the Argelander Institute for Astronomy in Bonn, Germany. "Our findings will help to refine our theoretical models of how the universe has grown from its inception up to the present day," Hildebrandt said in the same statement.

"We see an intriguing discrepancy with Planck cosmology at the moment," co-author Konrad Kuijken of the Leiden Observatory in the Netherlands, who is principal investigator of the KiDS survey, said in the statement. "Future missions such as the Euclid satellite and the Large Synoptic Survey Telescope will allow us to repeat these measurements and better understand what the universe is really telling us."
(FULL STORY)

Scientists Catch "Virtual Particles" Hopping In and Out of Existence
[11/30/2016]
About 400 light-years from here, in the area surrounding a neutron star, the electromagnetic field of this unbelievably dense object appears to be creating an area where matter spontaneously appears and then vanishes.

Quantum electrodynamics (QED) describes the relationships between particles of light, or photons, and electrically charged particles such as electrons and protons. The theories of QED suggest that the universe is full of "virtual particles," which are not really particles at all. They are fluctuations in quantum fields that have most of the same properties as particles, except that they appear and vanish all the time. Scientists predicted the existence of virtual particles some 80 years ago, but until now there has been no experimental evidence of the process.

SEEING THE INVISIBLE

How can we possibly see such a thing? One of the properties virtual particles have in common with actual particles is that they both affect light. In addition, intense magnetic fields are thought to excite the activity of virtual particles, affecting any light that passes through that space more dramatically.

So a team of astronomers pointed our most advanced ground-based telescope, the European Southern Observatory's Very Large Telescope (VLT), at one of the densest objects we know of: a neutron star.
Neutron stars have magnetic fields that are billions of times stronger than our sun's. Using the VLT, Roberto Mignani from the Italian National Institute for Astrophysics (INAF) and his team observed visible light around the neutron star RX J1856.5-3754 and detected linear polarization—or the alignment of light waves according to external electromagnetic influences—in the empty space around the star. This is rather odd, because conventional relativity says that light should pass freely through a vacuum, such as space, without being altered. The degree of linear polarization was so high (around 16 percent, to be precise) that the only known explanations are the theories of QED and the influence of virtual particles.

"According to QED, a highly magnetized vacuum behaves as a prism for the propagation of light, an effect known as vacuum birefringence," Mignani says. "The high linear polarization that we measured with the VLT can't be easily explained by our models unless the vacuum birefringence effects predicted by QED are included."

HOW DO YOU MEASURE SOMETHING THAT DOESN'T ALWAYS EXIST?

Vacuum birefringence was first predicted in the 1930s by Werner Heisenberg and Hans Heinrich Euler. It was an exciting time for the development of quantum mechanics, when many of the advanced theories still studied today were developed.

In the quantum realm, matter behaves very strangely, to say the least. It violates both Newton's classical laws of physics and Einstein's theories of relativity and gravity. Matter can exist in two separate places at once. Entangled particles, separated by miles, can influence each other instantaneously. As far as we can tell, the smallest building blocks of matter exist in multiple, or even infinitely many, possible configurations, known as quantum states, until they are observed or measured.

Fortunately, we can model and even predict some quantum phenomena, and we do this using wave functions. A wave, such as a sine curve, is described by an equation that many values satisfy at once. The same basic principle applies to physical models of particles that exist in different locations, or with different properties, or sometimes don't exist at all. When the particles are measured, the wave function collapses, and the matter exists with only one set of properties, as you would expect. The researchers were able to measure the virtual particles around a neutron star indirectly, by measuring the light that passes through them.
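As a minimal illustration of the measurement rule described above, the sketch below (plain NumPy, with an arbitrarily chosen two-level state) computes outcome probabilities from a wave function's squared amplitudes and then "collapses" it to a single definite state:

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical two-level system in an equal superposition of |0> and |1>.
psi = np.array([1.0, 1.0]) / np.sqrt(2)   # wave function amplitudes
probs = np.abs(psi) ** 2                  # Born rule: squared amplitudes give probabilities

# Before measurement, both outcomes coexist, each with probability 0.5.
print(probs)

# Measurement collapses the wave function: a single definite outcome is
# drawn, and afterwards the system has just one set of properties.
outcome = rng.choice([0, 1], p=probs)
collapsed = np.zeros(2)
collapsed[outcome] = 1.0
print(outcome, collapsed)
```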

These concepts are so profound that Einstein and Niels Bohr famously debated, at length, whether the universe even exists as a tangible smattering of matter across the void, or if it is a fluid conglomerate of infinite possible realities until we observe it. The first experimental evidence of vacuum birefringence—absurdly strong electromagnetic forces tugging at the very foundations of matter—reminds us that this is still an open-ended question.
(FULL STORY)

New theory of gravity might explain dark matter
[11/8/2016]
A new theory of gravity might explain the curious motions of stars in galaxies. Emergent gravity, as the new theory is called, predicts the exact same deviation of motions that is usually explained by invoking dark matter. Prof. Erik Verlinde, renowned expert in string theory at the University of Amsterdam and the Delta Institute for Theoretical Physics, published a new research paper today in which he expands his groundbreaking views on the nature of gravity.

In 2010, Erik Verlinde surprised the world with a completely new theory of gravity. According to Verlinde, gravity is not a fundamental force of nature, but an emergent phenomenon. In the same way that temperature arises from the movement of microscopic particles, gravity emerges from the changes of fundamental bits of information, stored in the very structure of spacetime.

Newton's law from information

In his 2010 article (On the origin of gravity and the laws of Newton), Verlinde showed how Newton's famous second law, which describes how apples fall from trees and satellites stay in orbit, can be derived from these underlying microscopic building blocks. Extending his previous work and work done by others, Verlinde now shows how to understand the curious behaviour of stars in galaxies without adding the puzzling dark matter.

The outer regions of galaxies, like our own Milky Way, rotate much faster around the centre than can be accounted for by the quantity of ordinary matter like stars, planets and interstellar gases. Something else has to produce the required amount of gravitational force, so physicists proposed the existence of dark matter. Dark matter seems to dominate our universe, comprising more than 80 percent of all matter. Hitherto, the alleged dark matter particles have never been observed, despite many efforts to detect them.
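The mismatch can be illustrated with the Newtonian prediction v = sqrt(G*M/r): if only the visible mass pulled on the stars, orbital speeds in the outskirts would fall off with radius, whereas measured rotation curves stay roughly flat. The mass value below is a rough, illustrative figure for the Milky Way's visible matter, not a measured quantity.

```python
import numpy as np

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19       # one kiloparsec in metres
M_VISIBLE = 2e41     # rough illustrative mass of the Milky Way's visible matter, kg

# Newtonian prediction for circular orbits around the visible mass:
# v = sqrt(G * M / r), which falls off as 1/sqrt(r) in the outskirts.
radii_kpc = np.array([5.0, 10.0, 20.0, 40.0])
v_kms = np.sqrt(G * M_VISIBLE / (radii_kpc * KPC)) / 1000.0

# Predicted speeds decline with radius; measured rotation curves instead
# stay roughly flat, which is the gap that dark matter (or, in Verlinde's
# proposal, emergent gravity) is invoked to fill.
for r, v in zip(radii_kpc, v_kms):
    print(f"r = {r:4.0f} kpc  ->  v = {v:5.0f} km/s")
```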

No need for dark matter

According to Erik Verlinde, there is no need to add a mysterious dark matter particle to the theory. In a new paper, which appeared today on the arXiv preprint server, Verlinde shows how his theory of gravity accurately predicts the velocities by which the stars rotate around the center of the Milky Way, as well as the motion of stars inside other galaxies.

"We have evidence that this new view of gravity actually agrees with the observations," says Verlinde. "At large scales, it seems, gravity just doesn't behave the way Einstein's theory predicts."

At first glance, Verlinde's theory presents features similar to modified theories of gravity like MOND (Modified Newtonian Dynamics, proposed by Mordehai Milgrom in 1983). However, where MOND tunes the theory to match the observations, Verlinde's theory starts from first principles. "A totally different starting point," according to Verlinde.

Adapting the holographic principle

One of the ingredients in Verlinde's theory is an adaptation of the holographic principle, introduced by his tutor Gerard 't Hooft (Nobel Prize 1999, Utrecht University) and Leonard Susskind (Stanford University). According to the holographic principle, all the information in the entire universe can be described on a giant imaginary sphere around it. Verlinde now shows that this idea is not quite correct—part of the information in our universe is contained in space itself.

This extra information is required to describe that other dark component of the universe: Dark energy, which is believed to be responsible for the accelerated expansion of the universe. Investigating the effects of this additional information on ordinary matter, Verlinde comes to a stunning conclusion. Whereas ordinary gravity can be encoded using the information on the imaginary sphere around the universe, as he showed in his 2010 work, the result of the additional information in the bulk of space is a force that nicely matches that attributed to dark matter.

On the brink of a scientific revolution

Gravity is in dire need of new approaches like the one by Verlinde, since it doesn't combine well with quantum physics. The two theories, crown jewels of 20th-century physics, cannot both be true at the same time. The problems arise in extreme conditions: near black holes, or during the Big Bang. Verlinde says, "Many theoretical physicists like me are working on a revision of the theory, and some major advancements have been made. We might be standing on the brink of a new scientific revolution that will radically change our views on the very nature of space, time and gravity."
(FULL STORY)

Supersolids produced in exotic state of quantum matter
[11/7/2016]
A mind-bogglingly strange state of matter may have finally made its appearance. Two teams of scientists report the creation of supersolids, which are both liquid and solid at the same time. Supersolids have a crystalline structure like a solid, but can simultaneously flow like a superfluid, a liquid that flows without friction.

Research teams from MIT and ETH Zurich both produced supersolids in an exotic form of matter known as a Bose-Einstein condensate. Reports of the work were published online at arXiv.org on October 26 (by the MIT group) and September 28 (by the Zurich group).

Bose-Einstein condensates are created when a group of atoms, chilled to near absolute zero, huddle up into the same quantum state and begin behaving like a single entity. The scientists’ trick for creating a supersolid was to nudge the condensate, which is already a superfluid, into simultaneously behaving like a solid. To do so, the MIT and Zurich teams created regular density variations in the atoms — like the repeating crystal structure of a more typical solid — in the system. That density variation stays put, even though the fluid can still flow.

The new results may be the first supersolids ever created — at least by some definitions. “It’s certainly the first case where you can unambiguously look at a system and say this is both a superfluid and a solid,” says Sarang Gopalakrishnan of the College of Staten Island of the City University of New York. But the systems are far from what physicists predicted when they first dreamt up the strange materials.

Scientists originally expected supersolids to appear in helium-4 — an isotope of the element helium and the same gas that fills balloons at children’s birthday parties. Helium-4 can be chilled and pressurized to produce a superfluid or a solid. Supersolid helium would have been a mixture of these two states.

Previous claims of detecting supersolid helium-4, however, didn’t hold up to scrutiny (SN Online: 10/12/2012). So, says Nikolay Prokof’ev of the University of Massachusetts Amherst, “now we have to go to the artificial quantum matter.” Unlike helium-4, Bose-Einstein condensates can be precisely controlled with lasers, and tuned to behave as scientists wish.

The two groups of scientists formed their supersolids in different ways. By zapping their condensate with lasers, the MIT group induced an interaction that gave some of the atoms a shove. This motion caused an interference between the pushed and the motionless atoms that’s similar to the complex patterns of ripples that can occur when waves of water meet. As a result, zebralike stripes — alternating high- and low-density regions — formed in the material, indicating that it was a solid.

Applying a different method, the ETH Zurich team used two optical cavities — sets of mirrors between which light bounces back and forth repeatedly. The light waves inside the cavities caused atoms to interact and thereby arrange themselves into a crystalline pattern, with atoms separated by an integer number of wavelengths of light.

Authors of the two studies declined to comment on the research, as the papers have been submitted to embargoed journals.

“Experimentally, of course, these are absolutely fantastic achievements,” says Anatoly Kuklov of the College of Staten Island. But, he notes, the particles in the supersolid Bose-Einstein condensates do not interact as strongly as particles would in supersolid helium-4. The idea of a supersolid is so strange because superfluid and solid states compete, and in most materials atoms are forced to choose one or the other. But in Bose-Einstein condensates these two states can more easily live together in harmony, making the weird materials less counterintuitive than supersolid helium-4 would be.

Additionally, says Prokof’ev, “some people will say ‘OK, well, this does not qualify exactly for supersolid state,’” because the spacing of the density variations was set externally, rather than arising naturally as it would have in helium.

Still, such supersolids are interesting for their status as a strange and new type of material. “These are great works,” says Kuklov. “Wide attention is now being paid to supersolidity.”
(FULL STORY)

You Can 3D Print Your Own Mini Universe
[11/1/2016]
Have you ever wondered what the universe looks like in all of its entirety, or how it would feel to hold the universe in the palm of your hand? Good news: It is now possible to do both of these things — all you need is a 3D printer.

Researchers at Imperial College London have created the blueprints for 3D printing the universe, and have provided the instructions online so anyone with access to a 3D printer can print their own miniature universe.

The researchers' representation of the universe specifically depicts the cosmic microwave background (CMB), or a glowing light throughout the universe that is thought to be leftover radiation from the Big Bang, when the universe was born about 13.8 billion years ago.
(FULL STORY)

Creating Antimatter Via Lasers?
[9/27/2016]
Russian researchers develop calculations to explain the production and dynamics of positrons in the hole-boring regime of ultrahigh-intensity laser-matter interactions.
Dramatic advances in laser technologies are enabling novel studies to explore laser-matter interactions at ultrahigh intensity. By focusing high-power laser pulses, researchers routinely produce electric fields orders of magnitude greater than those found within atoms, and these fields may soon be intense enough to create matter from light.

Now, intriguing calculations from a research team at the Institute of Applied Physics of the Russian Academy of Sciences (IAP RAS), and reported this week in Physics of Plasmas, from AIP Publishing, explain the production and dynamics of electrons and positrons from ultrahigh-intensity laser-matter interactions. In other words: They’ve calculated how to create matter and antimatter via lasers.

Strong electric fields cause electrons to undergo huge radiation losses because a significant amount of their energy is converted into gamma rays -- high-energy photons, which are the particles that make up light. The high-energy photons produced by this process interact with the strong laser field and create electron-positron pairs. As a result, a new state of matter emerges: strongly interacting particles, optical fields, and gamma radiation, whose dynamics are governed by the interplay between classical physics phenomena and quantum processes.

A key concept behind the team’s work is based on the quantum electrodynamics (QED) prediction that “a strong electric field can, generally speaking, ‘boil the vacuum,’ which is full of ‘virtual particles,’ such as electron-positron pairs,” explained Igor Kostyukov of IAP RAS. “The field can convert these types of particles from a virtual state, in which the particles aren’t directly observable, to a real one.”

One impressive manifestation of this type of QED phenomenon is a self-sustained laser-driven QED cascade, which is a grand challenge yet to be observed in a laboratory.

But, what’s a QED cascade?

“Think of it as a chain reaction in which each chain link consists of sequential processes,” Kostyukov said. “It begins with acceleration of electrons and positrons within the laser field. This is followed by emission of high-energy photons by the accelerated electrons and positrons. Then, the decay of high-energy photons produces electron-positron pairs, which go on to produce new generations of cascade particles. A QED cascade leads to an avalanche-like production of electron-positron high-energy photon plasmas.”
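The avalanche character of such a cascade can be caricatured with a toy generation-by-generation model. The branching numbers below (two hard photons per charged particle per generation, a 50 percent pair-conversion probability) are arbitrary illustrative assumptions; the point is only the exponential growth.

```python
# A toy, purely illustrative chain-reaction model of a QED cascade.
# Assumed branching numbers: each charged particle emits N_PHOTONS hard
# photons per generation, and each photon converts into an
# electron-positron pair with probability P_DECAY.
N_PHOTONS = 2
P_DECAY = 0.5

pairs = 1.0            # seed electron-positron pair count (arbitrary units)
history = [pairs]
for generation in range(10):
    photons = 2 * pairs * N_PHOTONS   # both the electron and the positron radiate
    pairs += photons * P_DECAY        # each converting photon adds one new pair
    history.append(pairs)

print(history)  # grows by a constant factor each generation: avalanche-like
```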

For this work, the researchers explored the interaction of a very intense laser pulse with a foil via numerical simulations.

“We expected to produce a large number of high-energy photons, and that some portion of them would decay and produce electron-positron pairs,” Kostyukov continued. “Our first surprise was that the number of high-energy photons produced by the positrons is much greater than that produced by the electrons of the foil. This led to an exponential -- very sharp -- growth of the number of positrons, which means that if we detect a larger number of positrons in a corresponding experiment we can conclude that most of them are generated in a QED cascade.”

They were also able to observe a distinct structure of the positron distribution in the simulations -- despite some randomness of the processes of photon emission and decay.

“By analyzing the positron motion in the electromagnetic fields in front of the foil analytically, we discovered that some characteristics of the motion regulate positron distribution and led to helical-like structures being observed in the simulations,” he added.

The team’s discoveries are of fundamental importance because the phenomenon they explored can accompany the laser-matter interaction at extreme intensities within a wider range of parameters. “It offers new insights into the properties of these types of interactions,” Kostyukov said. “More practical applications may include the development of advanced ideas for the laser-plasma sources of high-energy photons and positrons whose brilliance significantly exceeds that of the modern sources.”

So far, the researchers have focused on the initial stage of interaction when the electron-positron pairs they produced don’t significantly affect the laser-target interaction.

“Next, we’re exploring the nonlinear stage when the self-generated electron-positron plasma strongly modifies the interaction,” he said. “And we’ll also try to expand our results to more general configurations of the laser–matter interactions and other regimes of interactions -- taking a wider range of parameters into consideration.”

###

The article, "Production and dynamics of positrons in ultrahigh intensity laser-foil interactions," is authored by I. Yu. Kostyukov and E. N. Nerush. The article will appear in the journal Physics of Plasmas on September 27, 2016 (DOI: 10.1063/1.4962567). After that date, it can be accessed at http://scitation.aip.org/content/aip/journal/pop/23/9/10.1063/1.4962567.
(FULL STORY)

No, Astronomers Haven't Decided Dark Energy Is Nonexistent
[10/26/2016]
This week, a number of media outlets have put out headlines like "The universe is expanding at an accelerating rate, or is it?" and "The Universe Is Expanding But Not At An Accelerating Rate New Research Debunks Nobel Prize Theory." This excitement is due to a paper just published in Nature's Scientific Reports called "Marginal evidence for cosmic acceleration from Type Ia supernovae," by Nielsen, Guffanti and Sarkar.
Once you read the article, however, it’s safe to say there is no need to revise our present understanding of the universe. All the paper does is slightly reduce our certainty in what we know—and then only by discarding most of the cosmological data on which our understanding is based. It also ignores important details in the data it does consider. And even if you leave aside these issues, the headlines are wrong anyway. The study concluded that we’re now only 99.7 percent sure that the universe is accelerating, which is hardly the same as “it’s not accelerating.”
The initial discovery that the universe is expanding at an accelerating rate was made by two teams of astronomers in 1998 using Type Ia supernovae as cosmic measuring tools. Supernovae, or exploding stars, are some of the most powerful blasts in the entire cosmos, roughly equivalent to a billion-billion-billion atomic bombs exploding at once. Type Ia's are a special kind of supernova in that, unlike other supernovae, they all explode with just about the same luminosity every time, likely due to a critical mass limit. This similarity means that the differences in their observed brightness are almost entirely due to how far away they are, which makes them ideal for measuring cosmic distances. Furthermore, these objects are relatively common, and they are so bright that we can see them billions of light-years away. This shows us how the universe appeared billions of years ago, which we can compare to how it looks today.

These supernovae are often called "standard candles" for their consistency, but they're more accurately "standardizable candles": in practice, their precision and accuracy can be improved still further by accounting for small differences in their explosions, namely how long the explosion takes to unfold and how the light of each supernova is reddened by dust between it and us. Finding a way to make these corrections robustly was what led to the discovery of the accelerating universe.
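The arithmetic that turns a standard candle's brightness into a distance is the distance modulus, m - M = 5 log10(d / 10 pc). A minimal sketch, using an illustrative peak absolute magnitude of about -19.3 for Type Ia's and ignoring the dust and light-curve corrections discussed above:

```python
# Illustrative value only: Type Ia supernovae all peak near the same
# absolute magnitude, commonly quoted as roughly M = -19.3.
M_PEAK = -19.3

def luminosity_distance_pc(apparent_mag):
    """Invert the distance modulus m - M = 5 * log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - M_PEAK + 5) / 5)

# Sanity check: a supernova observed at its absolute magnitude sits at 10 pc.
assert abs(luminosity_distance_pc(M_PEAK) - 10) < 1e-9

# One observed at apparent magnitude 21.5 is about 1.4 billion parsecs away,
# i.e. roughly 4.7 billion light-years (1 pc ~ 3.26 ly).
d_pc = luminosity_distance_pc(21.5)
print(d_pc * 3.26 / 1e9, "billion light-years")
```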
The recent paper that has generated headlines used a catalog of Type Ia supernovae collected by the community (including us) that has been analyzed numerous times before. But the authors used a different method of implementing the corrections—and we believe this undercuts the accuracy of their results. They assume that the mean properties of supernovae in each of the samples used to measure the expansion history are the same, even though these properties have been shown to differ and past analyses have accounted for the differences. Yet even ignoring this issue, the authors still find roughly a 99.7 percent chance that the universe is accelerating, which is very different from what the headlines suggest.

Furthermore, the overwhelming confidence astronomers have that the universe is expanding faster now than it was billions of years ago is based on much more than just supernova measurements. These include tiny fluctuations in the pattern of relic heat from the Big Bang (i.e., the cosmic microwave background) and the modern-day imprint of those fluctuations in the distribution of galaxies around us (called baryon acoustic oscillations). The present study also ignores the presence of a substantial amount of matter in the universe, confirmed numerous times and in numerous ways since the 1970s, which further lowers the confidence it reports. These other data show the universe to be accelerating independently of the supernovae. If we combine the other observations with the supernova data, we go from 99.99 percent sure to 99.99999 percent sure. That's pretty sure!
We now know that dark energy, which is what we believe causes the expansion of the universe to accelerate, makes up 70 percent of the universe, with matter constituting the rest. The nature of dark energy is still one of the largest mysteries of all of astrophysics. But there has been no active debate about whether dark energy exists and none about whether the universe is accelerating since this picture was cemented a decade ago.
There are now many new large surveys, both on the ground and in space, whose top priority over the next two decades is to figure out exactly what this dark energy could be. For now, we have to continue to improve our measurements and question our assumptions. While this recent paper does not disprove any theories, it is still good for everyone to pause for a second and remember how big the questions are that we are asking, how we reached the conclusions we have to date and how seriously we need to test each building block of our understanding.
(FULL STORY)

Behind This Plant's Blue Leaves Lies a Weird Trick of Quantum Mechanics
[10/24/2016]
In the fading twilight on the rainforest floor, a plant's leaves glimmer iridescent blue. And now scientists know why. These exotic blue leaves pull more energy out of dim light than ordinary leaves because of an odd trick of quantum mechanics.

A team of plant scientists led by Heather Whitney of the University of Bristol in the U.K. has just discovered the remarkable origin and purpose of the shiny cobalt leaves on the Malaysian tropical plant Begonia pavonina. The plant owes its glimmer to its peculiar machinery for photosynthesis, the process plants use to turn light into chemical energy. Strangely enough, these blue leaves can squeeze more energy out of the red-green light that reaches the eternally dim rainforest floor. Whitney and her colleagues describe the blue leaves today in the journal Nature Plants.

"It's actually quite brilliant. Plants have to cope with every obstacle that's thrown at them without running away. Here we see evidence of a plant that's actually evolved to physically manipulate the little light it receives," says Whitney. "It's quite amazing, and was an absolutely surprising discovery."
(FULL STORY)

Small entropy changes allow quantum measurements to be nearly reversed
[9/30/2016]
In 1975, Swedish physicist Göran Lindblad developed a theorem that describes the change in entropy that occurs during a quantum measurement. Today, this theorem is a foundational component of quantum information theory, underlying such important concepts as the uncertainty principle, the second law of thermodynamics, and data transmission in quantum communication systems.

Now, 40 years later, physicist Mark M. Wilde, Assistant Professor at Louisiana State University, has improved this theorem in a way that allows for understanding how quantum measurements can be approximately reversed under certain circumstances. The new results allow for understanding how quantum information that has been lost during a measurement can be nearly recovered, which has potential implications for a variety of quantum technologies.
Quantum relative entropy never increases
Most people are familiar with entropy as a measure of disorder and the law that "entropy never decreases"—it either increases or stays the same during a thermodynamic process, according to the second law of thermodynamics. However, here the focus is on "quantum relative entropy," which in some sense is the negative of entropy, so the reverse is true: quantum relative entropy never increases, but instead only decreases or stays the same.
In fact, this was the entropy inequality theorem that Lindblad proved in 1975: that the quantum relative entropy cannot increase after a measurement. In this context, quantum relative entropy is interpreted as a measure of how well one can distinguish between two quantum states, so it's this distinguishability that can never increase. (Wilde describes a proof of Lindblad's result in greater detail in his textbook Quantum Information Theory, published by Cambridge University Press.)
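Lindblad's inequality is easy to check numerically for a simple case. The sketch below (NumPy, with two arbitrarily chosen single-qubit density matrices) computes S(rho||sigma) = Tr[rho (log rho - log sigma)] before and after a dephasing measurement, i.e. one that keeps only the diagonal outcome statistics; the relative entropy never increases:

```python
import numpy as np

def mat_log(m):
    """Matrix logarithm of a positive-definite Hermitian matrix."""
    vals, vecs = np.linalg.eigh(m)
    return vecs @ np.diag(np.log(vals)) @ vecs.conj().T

def rel_entropy(rho, sigma):
    """Quantum relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]."""
    return np.trace(rho @ (mat_log(rho) - mat_log(sigma))).real

def dephase(rho):
    """A computational-basis measurement, keeping only the outcome statistics."""
    return np.diag(np.diag(rho))

# Two arbitrarily chosen single-qubit density matrices (Hermitian,
# trace 1, full rank) -- purely illustrative states.
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.1], [0.1, 0.5]])

before = rel_entropy(rho, sigma)
after = rel_entropy(dephase(rho), dephase(sigma))
print(before, after)  # Lindblad's theorem guarantees after <= before
```

The gap between `before` and `after` is exactly the quantity Wilde's result concerns: when it is small, the measurement can be approximately reversed.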
One thing that Lindblad's proof doesn't address, however, is whether it makes any difference if the quantum relative entropy decreases by a little or by a lot after a measurement.
In the new paper, Wilde has shown that, if the quantum relative entropy decreases by only a little, then the quantum measurement (or any other type of so-called "quantum physical evolution") can be approximately reversed.
"When looking at Lindblad's entropy inequality, a natural question is to wonder what we could say if the quantum relative entropy goes down only by a little when the quantum physical evolution is applied," Wilde told Phys.org. "It is quite reasonable to suspect that we might be able to approximately reverse the evolution. This was arguably open since the work of Lindblad in 1975, addressed in an important way by Denes Petz in the late 1980s (for the case in which the quantum relative entropy stays the same under the action of the evolution), and finally formulated as a conjecture around 2008 by Andreas Winter. What my work did was to prove this result as a theorem: if the quantum relative entropy goes down only by a little under a quantum physical evolution, then we can approximately reverse its action."

Wilde's improvements to Lindblad's theorem have a variety of implications, but the main one that Wilde discusses in his paper is how the new results allow for recovering quantum information.
"If the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small," he said, "then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution." So the smaller the relative entropy decrease, the better the reversal process.
The ability to recover quantum information could prove useful for quantum error correction, which aims to protect quantum information from damaging external effects. Wilde plans to address this application more in the future with his colleagues.
As Wilde explained, Lindblad's original theorem can also be used to prove the uncertainty principle of quantum mechanics in terms of entropies, as well as the second law of thermodynamics for quantum systems, so the new results have implications in these areas, as well.
"Lindblad's entropy inequality underlies many limiting statements, in some cases said to be physical laws or principles," Wilde said. "Examples are the uncertainty principle and the second law of thermodynamics. Another example is that this entropy inequality is the core step in determining limitations on how much data we can communicate over quantum communication channels. We could go as far as to say that the above entropy inequality constitutes a fundamental law of quantum information theory, which is a direct mathematical consequence of the postulates of quantum mechanics."
Regarding the uncertainty principle, Wilde and two coauthors, Mario Berta and Stephanie Wehner, discuss this angle in a forthcoming paper. They explain that the uncertainty principle involves quantum measurements, which are a type of quantum physical evolution and therefore subject to Lindblad's theorem. In one formulation of the uncertainty principle, two experiments are performed on different copies of the same quantum state, with both experimental outcomes having some uncertainty.
"The uncertainty principle is the statement that you cannot generally make the uncertainties of both experiments arbitrarily small, i.e., there is generally a limitation," Wilde said. "It is now known that a statement of the uncertainty principle in terms of entropies can be proved by using the 'decrease of quantum relative entropy inequality.' So what the new theorem allows for doing is relating the uncertainties of the measurement outcomes to how well we could try to reverse the action of one of the measurements. That is, there is now a single mathematical inequality which captures all of these notions."
In terms of the second law of thermodynamics, Wilde explains how the new results have implications for reversing thermodynamic processes in both classical and quantum systems.
"The new theorem allows for quantifying how well we can approximately reverse a thermodynamic transition from one state to another without using any energy at all," he said.
He explained that this is possible due to the connection between entropy, energy, and work. According to the second law of thermodynamics, a thermodynamic transition from one quantum state to another is allowed only if the free energy decreases from the original state to the final state. During this process, one can gain work and store energy. This law can be rewritten as a statement involving relative entropies and can be proved as a consequence of the decrease of quantum relative entropy.
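The identity behind that connection can be checked directly. With Boltzmann's constant set to 1, the free energy F(ρ) = Tr[ρH] − T·S(ρ) of any state exceeds that of the thermal Gibbs state γ = e^(−H/T)/Z by exactly T times the relative entropy D(ρ‖γ). The Hamiltonian, temperature, and states below are made-up toy values for illustration.

```python
import numpy as np

def mat_log(m):
    """Matrix logarithm of a Hermitian positive-definite matrix."""
    vals, vecs = np.linalg.eigh(m)
    return vecs @ np.diag(np.log(vals)) @ vecs.conj().T

T = 0.8                          # temperature (k_B = 1); made-up value
H = np.diag([0.0, 1.0, 2.5])     # toy three-level Hamiltonian; made-up energies

gibbs = np.diag(np.exp(-np.diag(H) / T))
gibbs /= np.trace(gibbs)         # thermal (Gibbs) state e^{-H/T} / Z

rho = np.diag([0.5, 0.3, 0.2])   # an arbitrary out-of-equilibrium state

def free_energy(state):
    entropy = -np.real(np.trace(state @ mat_log(state)))  # von Neumann entropy, nats
    return np.real(np.trace(state @ H)) - T * entropy

def rel_entropy(a, b):
    return np.real(np.trace(a @ (mat_log(a) - mat_log(b))))

lhs = free_energy(rho) - free_energy(gibbs)
rhs = T * rel_entropy(rho, gibbs)
assert abs(lhs - rhs) < 1e-10    # free-energy excess = T x relative entropy to Gibbs state
```

Because the free-energy decrease in a thermodynamic transition is literally a relative-entropy decrease in disguise, Wilde's theorem about small relative-entropy drops translates directly into a statement about approximately reversible thermodynamic processes.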
"What my new work with Stephanie Wehner and Mischa Woods allows for is a refinement of this statement," Wilde said. "We can say that if the free energy does not go down by very much under a thermodynamic transition (i.e., if there is not too much work gained in the process), then it is possible to go back approximately to the original state from the final state, without investing any work at all. The key word here is that you can go back only approximately, so we are not in violation of the second law, only providing a refinement of it."
In addition to these implications, the new theorem can also be applied to other research topics in quantum information theory, including the Holevo bound, quantum discord, and multipartite information measures.
Wilde's work was funded in part by the DARPA Quiness program (now ending), which focused on quantum key distribution, or using quantum mechanics to ensure secret communication between two parties. He describes more about this application, in particular how Alice and Bob might use a quantum state to share secrets that can be kept private from an eavesdropper Eve (and help them survive being attacked by a bear), in a recent blog post.
(FULL STORY)

Did the Mysterious 'Planet Nine' Tilt the Solar System?
[10/19/2016]
The putative "Planet Nine" may have tilted the entire solar system, researchers say.

In January, astronomers revealed evidence for the potential existence of another planet in the solar system. Researchers suggest that if this world — dubbed Planet Nine — exists, it could be about 10 times Earth's mass and orbit the sun at a distance about 500 times the distance from the Earth to the sun.

Previous research suggested that Planet Nine would possess a highly tilted orbit compared with the relatively thin, flat zone in which the eight official planets circle the sun. This led scientists to investigate whether Planet Nine's slant might help explain other tilting seen elsewhere in the solar system.
Now, researchers suggest that Planet Nine's influence might have tilted the entire solar system except the sun.

"Planet Nine may have tilted the other planets over the lifetime of the solar system," said study lead author Elizabeth Bailey, an astrophysicist and planetary scientist at the California Institute of Technology in Pasadena.

Prior work found that the zone in which the eight major planets orbit the sun is tilted by about 6 degrees compared to the sun's equator. This discrepancy has long been a mystery in astronomy.
Bailey and her colleagues ran computer simulations that suggest that the tilt of the eight official planets can be explained by the gravitational influence of Planet Nine "over the 4.5-billion-years-ish lifetime of the solar system," Bailey told Space.com.

Bailey did note that there are other potential explanations for the tilt of the solar system. One alternative is that electrically charged particles influenced by the young sun's magnetic field could have interacted with the disk of gas and dust that gave rise to the planets in ways that tilted the solar system. Another possibility is that there might have been an imbalance in the mass of the nascent sun's core.

"However, all these other ways to explain why the solar system is tilted are really hard to test — they all invoke processes that were possibly present really early in the solar system," Bailey said. "Planet Nine is the first thing that has been proposed to tilt the solar system that doesn't depend on early conditions, so if we find Planet Nine, we will be able to see if it's the only thing responsible for the tilt, or if anything else may have played a role."

The scientists detailed their findings yesterday (Oct. 18) at a joint meeting of the American Astronomical Society's Division for Planetary Sciences and European Planetary Science Congress in Pasadena, California.
(FULL STORY)

Cosmological mystery solved by largest ever map of voids and superclusters
[10/12/2016]
A team of astrophysicists at the University of Portsmouth have created the largest ever map of voids and superclusters in the universe, which helps solve a long-standing cosmological mystery. The map of the positions of cosmic voids – large empty spaces which contain relatively few galaxies – and superclusters – huge regions with many more galaxies than normal – can be used to measure the effect of dark energy 'stretching' the universe.

The results confirm the predictions of Einstein's theory of gravity.
Lead author Dr Seshadri Nadathur from the University's Institute of Cosmology and Gravitation said: "We used a new technique to make a very precise measurement of the effect that these structures have on photons from the cosmic microwave background (CMB) – light left over from shortly after the Big Bang – passing through them.

"Light from the CMB travels through such voids and superclusters on its way to us. According to Einstein's General Theory of Relativity, the stretching effect of dark energy causes a tiny change in the temperature of CMB light depending on where it came from. Photons of light travelling through voids should appear slightly colder than normal and those arriving from superclusters should appear slightly hotter. This is known as the integrated Sachs-Wolfe (ISW) effect."

"When this effect was studied by astronomers at the University of Hawai'i in 2008 using an older catalogue of voids and superclusters, the effect seemed to be five times bigger than predicted. This has been puzzling scientists for a long time, so we looked at it again with new data."
To create the map of voids and superclusters, the Portsmouth team used more than three-quarters of a million galaxies identified by the Sloan Digital Sky Survey. This gave them a catalogue of structures more than 300 times bigger than the one previously used.
The scientists then used large computer simulations of the universe to predict the size of the ISW effect. Because the effect is so small, the team had to develop a powerful new statistical technique to be able to measure the CMB data.

They applied this technique to CMB data from the Planck satellite, and were able to make a very precise measurement of the ISW effect of the voids and superclusters. Unlike in the previous work, they found that the new result agreed extremely well with predictions using Einstein's gravity.
Dr Nadathur said: "Our results resolve one long-standing cosmological puzzle, but doing so has deepened the mystery of a very unusual 'Cold Spot' in the CMB.
"It has been suggested that the Cold Spot could be due to the ISW effect of a gigantic 'supervoid' which has been seen in that region of the sky. But if Einstein's gravity is correct, the supervoid isn't big enough to explain the Cold Spot.
"It was thought that there was some exotic gravitational effect contradicting Einstein which would simultaneously explain both the Cold Spot and the unusual ISW results from Hawai'i. But this possibility has been set aside by our new measurement – and so the Cold Spot mystery remains unexplained."
(FULL STORY)

The Universe Has 10 Times More Galaxies Than Scientists Thought
[10/13/2016]
More than a trillion galaxies are lurking in the depths of space, a new census of galaxies in the observable universe has found — 10 times more galaxies than were previously thought to exist.

An international team of astronomers used deep-space images and other data from the Hubble Space Telescope to create a 3D map of the known universe, which contains about 100 to 200 billion galaxies. In particular, they relied on Hubble's Deep Field images, which revealed the most distant galaxies ever seen with a telescope. [Video: Our Universe Has Trillions of Galaxies, Hubble Study]

Then, the researchers incorporated new mathematical models to calculate where other galaxies that have not yet been imaged by a telescope might exist. For the numbers to add up, the universe needs at least 10 times more galaxies than those already known to exist. But these unknown galaxies are likely either too faint or too far away to be seen with today's telescopes.
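The flavor of this extrapolation can be sketched with a Schechter luminosity function, the standard analytic form for galaxy number counts. The slope and luminosity cuts below are illustrative values, not the ones fitted in the study; the point is only that counts keep climbing steeply as a survey's detection limit drops.

```python
import numpy as np

# Schechter luminosity function: the number of galaxies per unit x,
# with x = L / L*, goes as x**alpha * exp(-x).  alpha = -1.3 is an
# illustrative faint-end slope, not the value used in the study.
alpha = -1.3

def count_above(l_min):
    """Relative number of galaxies brighter than l_min (in units of L*)."""
    x = np.logspace(np.log10(l_min), 2, 5000)       # integrate up to 100 L*
    y = x**alpha * np.exp(-x)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * (x[1:] - x[:-1])))  # trapezoid rule

shallow = count_above(0.1)    # a survey that only detects galaxies brighter than 0.1 L*
deep    = count_above(0.01)   # a survey reaching ten times fainter
assert deep > 2 * shallow     # most galaxies sit below the bright survey's limit
```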
"It boggles the mind that over 90 percent of the galaxies in the universe have yet to be studied," Christopher Conselice, a professor of astrophysics at the University of Nottingham in the U.K., who led the study, said in a statement. "Who knows what interesting properties we will find when we observe these galaxies with the next generation of telescopes."

Looking far out into deep space also means looking back in time, because light takes a long time to travel across cosmic distances. During the study, Conselice and his team looked at parts of the universe up to 13 billion light-years away. Looking this far allowed the researchers to see partial snapshots of the evolution of the universe since 13 billion years ago, or roughly 700 million years after the Big Bang.

They discovered that the early universe contained even more galaxies than it does today. Those distant galaxies were small and faint dwarf galaxies, they found. As the universe evolves, such galaxies merge together to form larger galaxies.
In a separate statement, Conselice said that the results are "very surprising as we know that, over the 13.7 billion years of cosmic evolution since the Big Bang, galaxies have been growing through star formation and mergers with other galaxies. Finding more galaxies in the past implies that significant evolution must have occurred to reduce their number through extensive merging of systems."

The results of the study are detailed in The Astrophysical Journal.
(FULL STORY)

Correlation between galaxy rotation and visible matter puzzles astronomers
[10/7/2016]
A new study of the rotational velocities of stars in galaxies has revealed a strong correlation between the motion of the stars and the amount of visible mass in the galaxies. This result comes as a surprise because it is not predicted by conventional models of dark matter.
Stars on the outskirts of rotating galaxies orbit just as fast as those nearer the centre. This appears to be in violation of Newton's laws, which predict that these outer stars would be flung away from their galaxies. The extra gravitational glue provided by dark matter is the conventional explanation for why these galaxies stay together. Today, our most cherished models of galaxy formation and cosmology rely entirely on the presence of dark matter, even though the substance has never been detected directly.
These new findings, from Stacy McGaugh and Federico Lelli of Case Western Reserve University, and James Schombert of the University of Oregon, threaten to shake things up. They measured the gravitational acceleration of stars in 153 galaxies with varying sizes, rotations and brightness, and found that the measured accelerations can be expressed as a relatively simple function of the visible matter within the galaxies. Such a correlation does not emerge from conventional dark-matter models.
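The fitted function reported by McGaugh and colleagues is strikingly simple: the observed acceleration g_obs depends only on the acceleration g_bar that the visible baryons alone would produce, via g_obs = g_bar / (1 − e^(−√(g_bar/g†))), with a single acceleration scale g† ≈ 1.2 × 10⁻¹⁰ m/s². A quick numerical check of its two limits:

```python
import numpy as np

g_dagger = 1.2e-10  # m/s^2, the single fitted acceleration scale

def g_obs(g_bar):
    """Observed acceleration as a function of the acceleration from visible matter alone."""
    return g_bar / (1.0 - np.exp(-np.sqrt(g_bar / g_dagger)))

# High accelerations (inner regions): the relation reduces to plain Newtonian dynamics.
assert abs(g_obs(1e-8) / 1e-8 - 1.0) < 0.01

# Low accelerations (outskirts): g_obs flattens toward sqrt(g_bar * g_dagger),
# which is what yields flat rotation curves without invoking unseen mass.
g = 1e-13
assert abs(g_obs(g) / np.sqrt(g * g_dagger) - 1.0) < 0.05
```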
(FULL STORY)

The Spooky Secret Behind Artificial Intelligence's Incredible Power
[10/7/2016]
Spookily powerful artificial intelligence (AI) systems may work so well because their structure exploits the fundamental laws of the universe, new research suggests.

The new findings may help answer a longstanding mystery about a class of artificial intelligence that employs a strategy called deep learning. These deep learning or deep neural network programs, as they're called, are algorithms that have many layers in which lower-level calculations feed into higher ones. Deep neural networks often perform astonishingly well at solving problems as complex as beating the world's best player of the strategy board game Go or classifying cat photos, yet no one fully understands why.

It turns out, one reason may be that they are tapping into the very special properties of the physical world, said Max Tegmark, a physicist at the Massachusetts Institute of Technology (MIT) and a co-author of the new research.
The laws of physics only present this "very special class of problems" — the problems that AI shines at solving, Tegmark told Live Science. "This tiny fraction of the problems that physics makes us care about and the tiny fraction of problems that neural networks can solve are more or less the same," he said. [Super-Intelligent Machines: 7 Robotic Futures]

Deep learning

Last year, AI accomplished a task many people thought impossible: AlphaGo, Google DeepMind's deep-learning system, defeated the world's best Go player after trouncing the European Go champion. The feat stunned the world because the number of potential Go moves exceeds the number of atoms in the universe, and past Go-playing programs performed only as well as a mediocre human player.

But even more astonishing than DeepMind's utter rout of its opponents was how it accomplished the task.

"The big mystery behind neural networks is why they work so well," said study co-author Henry Lin, a physicist at Harvard University. "Almost every problem we throw at them, they crack."

For instance, DeepMind was not explicitly taught Go strategy and was not trained to recognize classic sequences of moves. Instead, it simply "watched" millions of games, and then played many, many more against itself and other players.

Like newborn babies, these deep-learning algorithms start out "clueless," yet typically outperform other AI algorithms that are given some of the rules of the game in advance, Tegmark said.

Another long-held mystery is why these deep networks are so much better than so-called shallow ones, which contain as few as one layer, Tegmark said. Deep networks have a hierarchy and look a bit like connections between neurons in the brain, with lower-level data from many neurons feeding into another "higher" group of neurons, repeated over many layers. In a similar way, deep layers of these neural networks make some calculations, and then feed those results to a higher layer of the program, and so on, he said.

Magical keys or magical locks?

To understand why this process works, Tegmark and Lin decided to flip the question on its head.

"Suppose somebody gave you a key. Every lock you try, it seems to open. One might assume that the key has some magic properties. But another possibility is that all the locks are magical. In the case of neural nets, I suspect it's a bit of both," Lin said.

One possibility could be that the "real world" problems have special properties because the real world is very special, Tegmark said.

Take one of the biggest neural-network mysteries: These networks often take what seem to be computationally hairy problems, like the Go game, and somehow find solutions using far fewer calculations than expected.

It turns out that the math employed by neural networks is simplified thanks to a few special properties of the universe. The first is that the equations that govern many laws of physics, from quantum mechanics to gravity to special relativity, are essentially simple math problems, Tegmark said. The equations involve variables raised to a low power (for instance, 4 or less). [The 11 Most Beautiful Equations]

What's more, objects in the universe are governed by locality, meaning they are limited by the speed of light. Practically speaking, that means neighboring objects in the universe are more likely to influence each other than things that are far from each other, Tegmark said.

Many things in the universe also obey what's called a normal or Gaussian distribution. This is the classic "bell curve" that governs everything from traits such as human height to the speed of gas molecules zooming around in the atmosphere.
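The ubiquity of the bell curve is largely the central limit theorem at work: add up enough independent contributions and the total is approximately Gaussian, whatever the ingredients looked like. A quick numerical illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each sample is the sum of 50 independent uniform draws.  By the central limit
# theorem the sums are approximately Gaussian even though the inputs are not.
sums = rng.uniform(0.0, 1.0, size=(100_000, 50)).sum(axis=1)

mean, std = sums.mean(), sums.std()
frac = np.mean(np.abs(sums - mean) < std)   # fraction within one standard deviation
assert 0.66 < frac < 0.70                   # matches the Gaussian 68 percent rule
```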

Finally, symmetry is woven into the fabric of physics. Think of the veiny pattern on a leaf, or the two arms, eyes and ears of the average human. At the galactic scale, if one travels a light-year to the left or right, or waits a year, the laws of physics are the same, Tegmark said.

Tougher problems to crack

All of these special traits of the universe mean that the problems facing neural networks are actually special math problems that can be radically simplified.

"If you look at the class of data sets that we actually come across in nature, they're way simpler than the sort of worst-case scenario you might imagine," Tegmark said.

There are also problems that would be much tougher for neural networks to crack, including encryption schemes that secure information on the web; such schemes just look like random noise.

"If you feed that into a neural network, it's going to fail just as badly as I am; it's not going to find any patterns," Tegmark said.

While the subatomic laws of nature are simple, the equations describing a bumblebee's flight are incredibly complicated, whereas those governing gas molecules remain simple, Lin added. It's not yet clear whether deep learning will perform just as well describing those complicated bumblebee flights as it does describing gas molecules, he said.

"The point is that some 'emergent' laws of physics, like those governing an ideal gas, remain quite simple, whereas some become quite complicated. So there is a lot of additional work that needs to be done if one is going to answer in detail why deep learning works so well," Lin said. "I think the paper raises a lot more questions than it answers!"
(FULL STORY)

Science of Disbelief: When Did Climate Change Become All About Politics?
[10/7/2016]
Barely over a quarter of Americans know that almost all climate scientists agree that climate change is happening and that humans are to blame, a new Pew Research Center survey finds.

The survey also reveals a strong split between political liberals and political conservatives on the issue. While 55 percent of liberal Democrats say climate scientists are trustworthy, only 15 percent of conservative Republicans say the same.

The findings are in line with the results of other surveys of the politics of climate change, said Anthony Leiserowitz, director of the Yale Program on Climate Change Communication. Leiserowitz was not involved in the Pew study, but he and his colleagues conduct their own surveys on climate attitudes.

"The overwhelming finding that they see here is that there's a strong partisan divide on climate change, and that is a pattern we first saw emerge in 1997," Leiserowitz told Live Science.

The partisan gap isn't necessarily set in stone, however, Leiserowitz said. It's actually been narrowing recently — but it remains to be seen how the result of this year's presidential election may affect the divide.

Prior to 1997, the two major parties held similar beliefs on the occurrence of human-caused climate change, Leiserowitz said. Right around that time, then-President Bill Clinton and then-Vice President Al Gore took on the issue and pushed for the Kyoto Protocol, an international climate treaty meant to reduce greenhouse gas emissions.

"That's the moment when they come back and say, 'This is a global problem, and the U.S. needs to be part of the solution,' that the two parties begin to diverge," Leiserowitz said.

Since then, the American public's belief that climate change is real has fluctuated. Belief that climate change exists and that it's human-caused began to rise around 2004 and hit a peak around 2007, driven by media coverage of California's climate initiatives under Republican Gov. Arnold Schwarzenegger and the Hollywood film "The Day After Tomorrow," released in 2004. (Really: Leiserowitz's research found that Americans who saw the blockbuster were moved to think climate change is a problem. Al Gore's film "An Inconvenient Truth" was released in 2006 but was seen by far fewer people, Leiserowitz said.)

These numbers waned during the 2008 recession, when the media abruptly stopped talking about climate change and the conservative tea-party wing of the Republican Party gained more power, Leiserowitz said. Belief in man-made climate change bottomed out in 2010 and 2011 but has been creeping upward since then, he said. [6 Unexpected Effects of Climate Change]

"That uptick is not coming from Democrats," he said. "Democrats have not really changed much at all. Independents — their belief that global warming is happening — has increased. But the real shift is happening among Republicans, and most interesting, the biggest shift — 19 percentage points — is among conservative Republicans."

But even with those increases, because the percentage of conservative Americans who believed in man-made climate change was so small, the overall proportion of conservatives who believe climate change is caused by human activity is still small. The new Pew survey, conducted between May 10 and June 6, 2016, found that 48 percent of Americans overall believe that the Earth is warming mostly because of human activity. Seventy-nine percent of liberal Democrats held that belief, compared with 63 percent of moderate Democrats, 34 percent of moderate Republicans and 15 percent of conservative Republicans.

Climate scientists have the trust of far more people on the left side of the political spectrum than the right. Only 9 percent of conservative Republicans believe that climate scientists' findings are usually influenced by the best available evidence, compared with 55 percent of liberal Democrats. Only 7 percent of conservative Republicans and 15 percent of moderate Republicans think climate scientists are motivated by concern for the public's best interest, compared with 31 percent of moderate Democrats and 41 percent of liberal Democrats.

Still, up until last spring, the trends were "moving in a more science-aligned direction," Leiserowitz said. Even members of the Republican establishment had been willing to discuss climate change as a problem, Leiserowitz said, citing former presidential candidate John McCain, who had sponsored and supported climate legislation in the U.S. Senate.

"Then, along comes Donald Trump, and he basically flips over all the card tables," Leiserowitz said. The candidate has called climate change a hoax on multiple occasions and once tweeted that "the concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive." Trump has also been consistent in calling for less regulation of fossil fuel emissions. [Election Day 2016: A Guide to the When, Why, What and How]

"It's not clear where he has taken the Republican base," Leiserowitz said. The outcome of the election alone won't be enough to determine what kind of collateral damage climate opinion will accrue. Should Trump lose, Leiserowitz said, the Republican Party will have to decide whether to move even more rightward or whether to take a more centrist tack.

However, Americans' views aren't quite as extreme as the political class would make it seem, Leiserowitz said. Yale's surveys found that about 17 percent of Americans are alarmed about climate change, and 10 percent are entirely dismissive. The other 63 percent believe in, and are worried about, climate change to differing degrees.

"Most Americans are actually in the middle, and more of those people in the middle are leaning pretty well toward the scientific consensus," Leiserowitz said.
(FULL STORY)

Eyeballing Proxima b: Probably Not a Second Earth
[10/7/2016]
In our profound quest to discover strange new worlds, we've inevitably been trying to find alien planets that possess any Earth-like similarities. Now, with the incredible find of an Earth-mass exoplanet orbiting a neighboring star at just the right distance for liquid water to persist on its surface, hopes are high that we may have discovered an "Earth 2.0" right on our galactic doorstep.

But in our rush to assign any terrestrial likeness to this small exoplanet, we often forget that just because it's in the right place and is (apparently) the right mass, it likely has very little resemblance to Earth. And even if it does possess water, it could still be a very strange world indeed.

In a new study headed by scientists at the French National Center for Scientific Research (CNRS) and Cornell University, computer simulations have been run to figure out the possible characteristics of the small rocky world that was discovered orbiting the red dwarf star Proxima Centauri. Located only 4.2 light-years from Earth, the so-called Proxima b was discovered in August, to much excitement, by astronomers of the Pale Red Dot campaign using the ESO's La Silla observatory in Chile.

By measuring the slight wobbles of Proxima Centauri, the telescope was able not only to decipher the mass of the exoplanet, it could also calculate its orbital period. With this information, the researchers realized that the world was orbiting the red dwarf within the star's "habitable zone." The habitable zone of any star is the distance at which a planet can orbit that is not too hot and not too cold for liquid water to persist on its surface.

The implications are clear: on Earth, where there's liquid water, there's life — if there's liquid water on Proxima b, perhaps there's life there too. And, if we look far enough into the future, perhaps we might one day become an interstellar species and set up home there.

But it's worth remembering that we currently have very little information about Proxima b. We know that it has an orbital period of a little over 11 days (yes, a "year" on Proxima b is only 11 days). We know it orbits within the star's habitable zone. We also know its approximate mass. However, we don't know whether or not it has an atmosphere. Also, we don't know Proxima b's physical size. If we don't know its physical size, we can't calculate its average density and therefore there's ambiguity as to what materials it contains. So, in an effort to confront this ambiguity, the researchers ran some simulations of a 1.3 Earth-mass world (the approximate mass of Proxima b) in orbit around a red dwarf star to see what form it might take.

Assuming the rocky world has the smallest physical size allowed for its mass (94 percent of Earth's diameter), planetary formation models suggest it would consist of a metal core making up 65 percent of the mass of the entire planet. The outer layers would consist of rocky mantle and very little water (if any). In this scenario, Proxima b would be a rocky, barren and dry world, resembling a massive Mercury. Last time we checked in on Mercury, it didn't appear very "habitable."

But this is just one possibility. The researchers then shifted the scale to the other extreme. What would happen if the physical size of the planet was pushed to the maximum? Well, the mass of Proxima b could support a world that is 40 percent bigger than Earth. Now things get interesting.

In this scenario, Proxima b would be a lot less dense, meaning there would be less rock and metal. A huge proportion of the planet's mass would consist of water. In fact, 50 percent of the entire planet's mass would be water. This would be a "water world" in the strongest possible sense.
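The two limiting cases differ most obviously in bulk density, which follows directly from mass over volume. The sketch below scales Earth's mean density (about 5,514 kg/m³) using the study's approximate 1.3-Earth-mass figure and the two bracketing radii:

```python
EARTH_DENSITY = 5514.0   # kg/m^3, Earth's mean density
MASS = 1.3               # Earth masses: the approximate mass of Proxima b

def bulk_density(radius_in_earth_radii):
    """Mean density in kg/m^3, scaling Earth's value by mass / radius^3."""
    return EARTH_DENSITY * MASS / radius_in_earth_radii**3

rocky  = bulk_density(0.94)   # smallest allowed size: 94 percent of Earth's diameter
watery = bulk_density(1.40)   # largest allowed size: 40 percent bigger than Earth

assert rocky > EARTH_DENSITY       # denser than Earth: a metal-dominated "super Mercury"
assert watery < EARTH_DENSITY / 2  # far lighter: consistent with a huge water fraction
```

The compact case comes out denser than Earth, pointing to a big metal core, while the bloated case is less than half Earth's density, which is what forces the water-world interpretation.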

Somewhere between these two scenarios — either a dense and barren rock or bloated water world — is the highly sought-after "Earth 2.0"; basically a world with a small metal core, rocky mantle and plentiful oceans flooding the surface. It's this exoplanetary compromise that you regularly see in artistic impressions of Proxima b, the temperate alien world that looks like Earth.

Alas, this version of Proxima b is just one possibility over a huge range of scenarios. So, yeah, from this study alone, Proxima b is probably not very Earth-like. But wait, there's more.

Just because a planet orbits its star in the habitable zone, it doesn't mean it has the same life-giving qualities as Earth (keep in mind that both Mars and Venus also orbit the sun within our solar system's habitable zone).

Proxima b orbits very close to its star. It's the nature of the beast; red dwarf stars are small and therefore cooler than sun-like stars. Proxima Centauri's habitable zone is therefore one hell of a lot more compact than our sun's. The Proxima Centauri habitable zone is well within the orbit of Mercury. If a planet got that close to our hot sun, it would be burnt to a crisp; for a planet in orbit around Proxima Centauri, this location is an oasis.
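The compactness of that habitable zone follows from a simple flux argument: the distance at which a planet receives Earth-like stellar flux scales as the square root of the star's luminosity, and Proxima Centauri shines at only about 0.17 percent of the sun's output (an approximate figure, used here for illustration).

```python
import math

L_PROXIMA = 0.0017   # Proxima Centauri's luminosity in solar units (approximate)

# Stellar flux falls as 1/d^2, so the distance receiving Earth-like flux
# scales as sqrt(L).  For the sun that distance is 1 AU by definition.
hz_distance = math.sqrt(L_PROXIMA)   # AU
MERCURY = 0.387                      # Mercury's orbital distance, AU

assert hz_distance < MERCURY / 5     # the whole zone sits far inside Mercury's orbit
# Proxima b's reported ~0.05 AU orbit lands close to this Earth-equivalent distance.
assert abs(hz_distance - 0.05) < 0.02
```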

But when you orbit so close to a red dwarf, a planet starts to succumb to some tidal difficulties. One face of an orbiting planet around a red dwarf will be constantly facing the star, meaning the planet's spin matches its orbital period. One hemisphere of the planet is in constant light while the other hemisphere is in constant darkness — a situation called "tidal locking."

So, in this case, let's imagine the orbiting exoplanet really is a textbook "Earth-like" world with just the right composition. A world with an iron core, rocky mantle and enough water on the surface to create liquid water oceans that could support life. But this world is tidally locked with its star — that's got to cause some problems, right?

Let's assume that this planet somehow possesses an atmosphere (more on that later). Having one hemisphere constantly heated while the other is constantly frozen certainly doesn't sound like a good time. Many simulations have been run in an attempt to model the complexities of the atmospheric conditions in this situation, and most outcomes aren't good. Some scenarios predict planet-wide hurricanes that act like a blast oven; other scenarios predict a dry wasteland on the star-facing hemisphere and a frozen-solid dark hemisphere.

There are, however, some planetary models that could save the day for these unfortunate wannabe "second Earths." One fun prediction is the possible existence of "Eyeball Earths." These peculiar planets would still be tidally locked to their star, with one hemisphere a constantly baked desert and the other in deep freeze, but there would be a region between day and night where conditions are just right for a liquid water ocean to circle the world between the darkness and light. Oh, and it would look like an eyeball. Seriously.

Other research into the atmospheric dynamics of tidally locked exoplanets suggests there could be situations where the world has efficient "air conditioning": hot air from the star-facing hemisphere is distributed around the planet in a way that balances global temperatures. But this assumes a high degree of friction between the lower atmosphere and a craggy, rocky surface, as well as efficient high-altitude airflow.

But the ultimate kicker when considering "Earth-like" exoplanets around red dwarf stars is that just because red dwarfs are small, it doesn't mean they are docile. In fact, red dwarf stars can be downright violent, frequently erupting with powerful flares, flooding any nearby planets with ionizing radiation. This radiation, plus inevitably powerful stellar winds, would likely blow any atmosphere away from our hypothetical burgeoning Earth 2.0. Without an atmosphere, the only vaguely habitable location on that planet would be under the surface, perhaps in a sub-surface ocean protected by an icy crust like Jupiter's moon Europa.

But if these planets, like Earth, have a powerful global magnetosphere, perhaps the worst of the stellar storms could be deflected and an atmosphere could form. Who knows?

Though there are many challenges facing our search for "Earth 2.0," we are only just beginning our quest to seek out alien worlds orbiting other stars. Yes, it is an incredible stroke of luck to find a small world orbiting a neighboring star, but as red dwarfs are the most populous type of star in our galaxy, the odds are that a handful may well have just the right ingredients to support a habitable atmosphere. But is Proxima b one of those diamonds in the rough?

For now, with the tools at our disposal, we simply do not know. Perhaps with the launch of NASA's James Webb Space Telescope in 2018 we might be able to tease out the spectroscopic fingerprint of an atmosphere, though even that may prove beyond its capabilities. So we might just have to send an interstellar probe there to find out whether Proxima b really is the habitable exoplanet everyone hopes it will be.
(FULL STORY)

Does the Drake Equation Confirm There Is Intelligent Alien Life in the Galaxy?
[10/6/2016]
The Drake Equation, written by astrophysicist Frank Drake in 1961, is a probabilistic formula for estimating the number of intelligent, technological civilizations that should exist in the Milky Way—and, by extension, the universe. It is the foundation for a number of statistical models suggesting that intelligent alien life should be widespread throughout the galaxy. Drake's original 1961 estimate for the number of intelligent civilizations in our galaxy was between 20 and 50,000,000. As a new episode of PBS's Space Time points out, we have since significantly refined our estimates for the number of potentially habitable planets in the Milky Way thanks to the Kepler planet-hunting mission. (We think there are around 40 billion rocky planets orbiting within the habitable zones of their parent stars.)

What we still struggle with is pinning down the probability that life will spring from organic compounds, a process known as abiogenesis, and the probability that basic microbial life will eventually evolve into an intelligent species. To help dial in this estimate, astrophysicists Adam Frank and Woodruff Sullivan asked how small the intelligent-life probability would need to be if we are in fact the only technologically advanced species in the entire universe.

They concluded that if only one intelligent civilization ever existed in the history of the known universe (humans, and nothing else ever before), then the probability that a habitable planet produces intelligent life would have to be less than 1 in 400 billion trillion, or 2.5 x 10^-24—and, even if this were the case, there would still only be a 1 percent chance that no technological civilization ever existed other than humans. This is such a ludicrously small probability that astrophysicists are forced to conclude that we are not the only intelligent civilization to ever exist.

If we narrow the focus to just the Milky Way, then there is still only a 1 in 60 billion chance that a habitable planet produces an advanced civilization, assuming that we are the only such civilization to ever exist in the galaxy. Most people therefore conclude that there must be other intelligent civilizations in our galaxy, if not now then at some point in the past. But we have never detected any, and therein lies the Fermi paradox.
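The arithmetic behind these figures is easy to sketch. In the snippet below, `p` is the "1 in 400 billion trillion" threshold from the study, while the planet count `N` is an assumed number of habitable-zone planets in the observable universe, chosen only to be consistent with the article's quoted 1 percent figure:

```python
import math

# Sketch of the Frank & Sullivan argument. N is an assumed count of
# habitable-zone planets in the observable universe (illustrative only);
# p is the per-planet probability of producing a technological civilization.
N = 1.8e24
p = 2.5e-24  # the "1 in 400 billion trillion" threshold

# If each planet independently yields a civilization with probability p,
# the chance that no planet besides ours ever did is (1 - p)^N ~ exp(-N * p).
p_alone = math.exp(-N * p)
print(f"chance humanity is the only civilization ever: {p_alone:.1%}")  # ~1%
```

Even at that vanishingly small per-planet probability, the sheer number of planets makes "we are alone" a roughly 1-in-100 proposition, which is the study's central point.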

Check out the Space Time video above to learn more about the likelihood that we are not alone in our galaxy, and be sure to stay tuned for the bonus question at the end of the episode. If you get it correct, PBS will send you a Space Time t-shirt free of charge.
(FULL STORY)

Scientists build world's smallest transistor
[10/6/2016]
Silicon transistors have been getting smaller and smaller, packing more computing power into smaller dimensions all while using less energy. But silicon transistors can't get much smaller.

To keep the trend going, scientists have turned to silicon alternatives. Recently, a team of scientists set a new record for the world's smallest transistor using a pair of novel materials: carbon nanotubes and molybdenum disulfide. The latter belongs to a class of materials called transition metal dichalcogenides, or TMDs.

Molybdenum disulfide, or MoS2, is an engine lubricant that scientists believe has tremendous potential in the field of electronics. Like silicon, MoS2 boasts a crystalline lattice structure. But electrons don't move as easily through MoS2 as they do through silicon.

Transistors rely on a gate to control the flow of electricity through its terminals. But because silicon allows for such a free flow of electrons, the particles barge through the doors when the gate becomes too small.

"This means we can't turn off the transistors," Sujay Desai, a graduate student at the Department of Energy's Lawrence Berkeley National Laboratory, explained in a news release. "The electrons are out of control."

When electrons are out of control, transistors leak energy.

With MoS2, scientists were able to make the gate -- and the transistor -- much smaller without making it susceptible to gate-crashing electrons. In fact, Desai and his research partners built a transistor with a 1-nanometer gate. For comparison, a single strand of human hair measures roughly 50,000 nanometers across.

While the feat is impressive, and the technology promising, researchers say there is much work to do.

"This work demonstrated the shortest transistor ever," said Ali Javey, a professor of electrical engineering and computer sciences at the University of California, Berkeley. "However, it's a proof of concept. We have not yet packed these transistors onto a chip, and we haven't done this billions of times over."

If the technology is going to make it in the electronics industry, researchers will need to find new ways to produce the materials at scale.

"Large-scale processing and manufacturing of TMD devices down to such small gate lengths will require future innovations," said Moon Kim, professor of materials science and engineering at the University of Texas, Dallas.

Still, researchers are hopeful the breakthrough will translate to smaller, more efficient computer chips, and ultimately, smaller, more efficient electronics.

"A cellphone with this technology built in would not have to be recharged as often," Kim said.
(FULL STORY)

Newly analyzed observations by NASA's planet-hunting Kepler space telescope show that the star KIC 8462852 — whose occasional, dramatic dips in brightness still have astronomers scratching their heads — has also dimmed overall during the last few years.

"The steady brightness change in KIC 8462852 is pretty astounding," study lead author Ben Montet, of the California Institute of Technology in Pasadena, said in a statement.

"Our highly accurate measurements over four years demonstrate that the star really is getting fainter with time," Montet added. "It is unprecedented for this type of star to slowly fade for years, and we don't see anything else like it in the Kepler data."

KIC 8462852 hit the headlines last September, when a team of astronomers led by Tabetha Boyajian of Yale University announced that the star had dimmed dramatically several times over the past few years — in one case, by a whopping 22 percent.

These brightness dips are too significant to be caused by an orbiting planet, so scientists began suggesting alternative explanations. Perhaps a planet or a family of orbiting comets broke up, for example, and the ensuing cloud of dust and fragments periodically blocks the star's light. Or maybe some unknown object in the depths of space between the star and Earth is causing the dimming.

The brightness dips are even consistent with a gigantic energy-collecting structure built by an intelligent civilization — though researchers have been keen to stress that this "alien megastructure" scenario is quite unlikely.

The weirdness increased in January 2016, when astronomer Bradley Schaefer of Louisiana State University reported that KIC 8462852 also seems to have dimmed overall by 14 percent between 1890 and 1989.

This conclusion is based on Schaefer's analysis of photographic plates of the night sky that managed to capture KIC 8462852 (nicknamed "Tabby's Star" after Boyajian), which lies about 1,500 light-years from Earth. Some other astronomers questioned this interpretation, however, suggesting that differences in the instruments used to photograph the sky over that time span may be responsible for the apparent long-term dimming.

So Montet and co-author Joshua Simon, of the Observatories of the Carnegie Institution of Washington, decided to scour the Kepler data for any hint of the trend Schaefer spotted. And they found more than just a hint.

Kepler observed KIC 8462852, along with about 150,000 other stars, from 2009 through 2013. During the first three years of that time span, KIC 8462852 got nearly 1 percent dimmer, Montet and Simon found. The star's brightness dropped by a surprising 2 percent over the next six months, and stayed level for the final six months of the observation period. (Kepler has since moved on to a new mission called K2, during which the telescope is hunting for exoplanets on a more limited basis and performing a variety of other observations.)

"This star was already completely unique because of its sporadic dimming episodes," Simon said in the same statement. "But now we see that it has other features that are just as strange, both slowly dimming for almost three years and then suddenly getting fainter much more rapidly."

Montet and Simon said they don't know what's behind the weird behavior of Tabby's Star, but they hope their results, which have been accepted for publication in The Astrophysical Journal, help crack the case eventually.

"It's a big challenge to come up with a good explanation for a star doing three different things that have never been seen before," Montet said. "But these observations will provide an important clue to solving the mystery of KIC 8462852."
(FULL STORY)

What's Out There? 'Star Men' Doc Tackles Life Questions Through Science
[10/5/2016]
The documentary "Star Men," which has just begun to play in select theatres in the United States, uses the life stories of four prominent astronomers to take a compassionate look at aging, death and humanity's search for meaning.

Following a screening of "Star Men" at the California Institute of Technology (Caltech) in Pasadena last month, one of the film's subjects, astronomer Neville (Nick) Woolf, said that when the project began he thought it would be a science documentary set against the backdrop of the American Southwest.

Instead, he was surprised to see that the film is actually centered on the 50-year friendship among himself and three colleagues — Roger Griffin, Donald Lynden-Bell and Wallace (Wal) Sargent — who worked together at Caltech in the early 1960s.
(FULL STORY)

Evidence for new form of matter-antimatter asymmetry observed
[10/4/2016]
Like two siblings with divergent personalities, a type of particle has shown signs of behaving differently than its antimatter partner. It's the first time evidence of matter-antimatter differences has been detected in decays of a baryon — a category of particle that includes protons and neutrons. Such matter-antimatter discrepancies are key to explaining how the universe came to be made mostly of matter, scientists believe.

The result is “the first measurement of its kind,” says theoretical physicist Yuval Grossman of Cornell University. “Wow, we can actually see something that we’ve never seen before.”

Evidence of matter-antimatter differences in decays of baryons — particles which are composed of three smaller particles known as quarks — has eluded scientists until now. Previous experiments have found differences between matter and antimatter varieties of mesons, which are made up of one quark and one antiquark, but never in baryons.

For most processes, the laws of physics would be the same if matter were swapped with antimatter and the universe’s directions were flipped, as if reflected in a mirror. But when this principle, known as CP symmetry (for “charge parity”), is violated, matter and antimatter act differently. Now, scientists have found hints of CP violation in the decays of a particle known as a lambda-b baryon.

Scientists with the LHCb experiment, located at the Large Hadron Collider near Geneva, reported the result online September 16 at arXiv.org. They found that when the lambda-b baryon decays, the particles produced by the decay speed away at different angles and momenta for matter and antimatter versions of the baryon. (LHCb scientists declined to comment for this article, citing the embargo policy of Nature Physics, the journal to which the paper was submitted.)

After the Big Bang, the universe initially held equal parts antimatter and matter. But as the universe evolved, the laws of physics favored matter through CP violation, and antimatter became a rarity. Scientists’ well-tested theory of particle physics, the standard model, includes some CP violation, but not enough to explain the current imbalance. So physicists are searching for additional sources of the discrepancy.

It’s not surprising that differences in matter and antimatter appeared in baryons as well as mesons, says theoretical physicist David London of the University of Montreal. But precise measurements of baryons might eventually reveal deviations from the predictions of the standard model. Such a result could point the way to additional asymmetry that allowed the universe as we know it to form. “It's just the first step, and hopefully there will be more such measurements,” says London.
(FULL STORY)

Giant hidden Jupiters may explain lonely planet systems
[10/3/2016]
Lonely planets can blame big, pushy bullies. Giant planets may bump off most of their smaller brethren, partly explaining why the Kepler space telescope has seen so many single-planet systems.

Of the thousands of planetary systems Kepler has discovered, about 80 per cent appear as single planets passing in front of their stars. The rest feature as many as seven planets – a distinction dubbed the Kepler dichotomy.

Recent studies suggest even starker differences. While multiple-planet systems tend to have circular orbits that all lie in the same plane – like our solar system – the orbits of singletons tend to be more elliptical and are often misaligned with the spins of their stars.

Now, a pair of computer simulations suggests that hidden giants may lurk in these single systems. We wouldn't be able to see them: big, Jupiter-like planets in wide orbits take too long to circle their stars for Kepler to catch them, and their orbits may never carry them in front of their stars along our line of sight. But if these unseen bullies are there, they may have removed many of the smaller planets in closer orbits, leaving behind the solitary worlds that Kepler sees.

The simulations show that gravitational interactions involving giants in outer orbits can eject smaller planets from the system, nudge them into their stars or send them crashing into each other.

Pushy planets
“There are bigger things out there trying to pull you around,” says Chelsea Huang at the University of Toronto, Canada. She and her team also showed the giants pull the few remaining inner planets into more elliptical and inclined orbits – the same kind seen in many of the single systems Kepler has spotted.

Alex Mustill at Lund Observatory in Sweden and his colleagues mimicked more general scenarios, including planets orbiting a binary star system, and got similar results. The studies complement each other, say Huang and Mustill.

“We know these configurations have to occur in some fraction of exoplanet systems,” Mustill says.

But that doesn’t mean they’re universal. “They don’t occur all the time, and this is one reason why you can’t explain the large number of single planets purely through this mechanism,” Mustill says. According to his analysis, bullying giants can only account for about 18 per cent of Kepler’s singles.

To confirm their proposed mechanism, the researchers must wait until next year for the launch of the Transiting Exoplanet Survey Satellite (TESS), which will target closer and brighter systems – and thus be easier for follow-up observations to uncover the bully planets.
(FULL STORY)

Rarest nucleus reluctant to decay
[10/3/2016]
Nature’s rarest type of atomic nucleus is not giving up its secrets easily.

Scientists looking for the decay of an unusual form of the element tantalum, known as tantalum-180m, have come up empty-handed. Tantalum-180m’s hesitance to decay indicates that it has a half-life of at least 45 million billion years, Bjoern Lehnert and colleagues report online September 13 at arXiv.org. “The half-life is longer than a million times the age of the universe,” says Lehnert, a nuclear physicist at Carleton University in Ottawa. (Scientists estimate the universe’s age at 13.8 billion years.)
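The comparison in that quote is simple enough to check directly. A short sketch, using the 45-million-billion-year lower bound and the 13.8-billion-year age of the universe quoted above:

```python
# Checking the half-life comparison from the tantalum-180m result.
half_life_lower_yr = 45e15   # lower bound: 45 million billion years
universe_age_yr = 13.8e9     # estimated age of the universe

# The bound is indeed more than a million times the universe's age.
ratio = half_life_lower_yr / universe_age_yr  # ~3.3 million

# Fraction of tantalum-180m expected to have decayed over one universe age
# if the half-life sat exactly at the lower bound: N(t) = N0 * 2^(-t/T_half).
decayed = 1 - 0.5 ** (universe_age_yr / half_life_lower_yr)
```

The decayed fraction works out to roughly two parts in ten million, which is why catching even a single decay is so difficult.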

Making up less than two ten-thousandths of a percent of the mass of the Earth’s crust, the metal tantalum is uncommon. And tantalum-180m is even harder to find. Only 0.01 percent of tantalum is found in this state, making it the rarest known long-lived nuclide, or variety of atom.

Tantalum-180m is a bit of an oddball. It is what’s known as an isomer — its nucleus exists in an “excited,” or high-energy, configuration. Normally, an excited nucleus would quickly drop to a lower energy state, emitting a photon — a particle of light — in the process. But tantalum-180m is “metastable” (hence the “m” in its name), meaning that it gets stuck in its high-energy state.

Tantalum-180m is thought to decay by emitting or capturing an electron, morphing into another element — either tungsten or hafnium — in the process. But this decay has never been observed. Other unusual nuclides, such as those that decay by emitting two electrons simultaneously, can have even longer half-lives than tantalum-180m. But tantalum-180m is unique — it is the longest-lived isomer found in nature.

“It’s a very interesting nucleus,” says nuclear physicist Eric Norman of the University of California, Berkeley, who was not involved with the study. Scientists don’t have a good understanding of such unusual decays, and a measurement of the half-life would help scientists pin down the details of the process and the nucleus’ structure.

Lehnert and colleagues observed a sample of tantalum with a detector designed to catch photons emitted in the decay process. After running the experiment for 176 days, and adding in data from previous incarnations of the experiment, the team saw no evidence of decay. The half-life couldn’t be shorter than 45 million billion years, the scientists determined, or they would have seen some hint of the process. “They did a state-of-the-art measurement,” says Norman. “It's a very difficult thing to see.”

The presence of tantalum-180m in nature is itself a bit of a mystery, too. The element-forging processes that occur in stars and supernovas seem to bypass the nuclide. “People don’t really understand how it is created at all,” says Lehnert.

Tantalum-180m is interesting as a potential energy source, says Norman, although “it’s kind of a crazy idea.” If scientists could find a way to tap the energy stored in the excited nucleus by causing it to decay, it might be useful for applications like nuclear lasers, he says.
(FULL STORY)

Weird Science: 3 Win Nobel for Unusual States of Matter
[10/3/2016]
How is a doughnut like a coffee cup? The answer helped three British-born scientists win the Nobel prize in physics Tuesday.

Their work could help lead to more powerful computers and improved materials for electronics.

David Thouless, Duncan Haldane and Michael Kosterlitz, who are now affiliated with universities in the United States, were honored for work in the 1970s and '80s that shed light on strange states of matter.

"Their discoveries have brought about breakthroughs in the theoretical understanding of matter's mysteries and created new perspectives on the development of innovative materials," the Royal Swedish Academy of Sciences said.

Thouless, 82, is a professor emeritus at the University of Washington. Haldane, 65, is a physics professor at Princeton University in New Jersey. Kosterlitz, 73, is a physics professor at Brown University in Providence, Rhode Island, and currently a visiting lecturer at Aalto University in Helsinki.

The 8 million kronor ($930,000) award was divided with one half going to Thouless and the other to Haldane and Kosterlitz.

They investigated strange states of matter like superconductivity, the ability of a material to conduct electricity without resistance.

Their work called on an abstract mathematical field called topology, which presents a particular way to describe some properties of matter. In this realm, a doughnut and a coffee cup are basically the same thing because each contains precisely one hole. Topology describes properties that can only change in full steps; you can't have half a hole.

"Using topology as a tool, they were able to astound the experts," the academy said.

For example, in the 1970s, Kosterlitz and Thouless showed that very thin layers of material — essentially containing only two dimensions rather than three — could undergo fundamental changes known as phase transitions. One example is when a material is chilled enough that it can start showing superconductivity.

Scientists had thought phase changes were impossible in just two dimensions, but the two men showed that changes do occur and that they were rooted in topology.

"This was a radically new way of looking at phases of matter," said Sankar Das Sarma, a physicist at the University of Maryland in College Park.

"Now everywhere we look we find that topology affects the physical world," he said.

Haldane was cited for theoretical studies of chains of magnetic atoms that appear in some materials. He said he found out about the prize through an early morning telephone call.

"My first thought was someone had died," he told The Associated Press. "But then a lady with a Swedish accent was on the line. It was pretty unexpected."

Kosterlitz, a dual U.K.-U.S. citizen, said he got the news in a parking garage while heading to lunch in Helsinki.

"I'm a little bit dazzled. I'm still trying to take it in," he told AP.

Nobel committee member David Haviland said this year's prize was more about theoretical discoveries even though they may result in practical applications.

"These theoreticians have come up with a description of these materials using topological ideas, which have proven very fruitful and has led to a lot of ongoing research about material properties," he said.

Haldane said the award-winning research is just starting to have practical applications.

"The big hope is that some of these new materials could lead to quantum computers and other new technology," he said.

Quantum computers could be powerful tools, but Kosterlitz was not so sure about the prospects for developing them.

"I've been waiting for my desktop quantum computer for years, but it's still showing no signs of appearing," he said. "At the risk of making a bad mistake, I would say that this quantum computation stuff is a long way from being practical."

This year's Nobel Prize announcements started Monday with the medicine award going to Japanese biologist Yoshinori Ohsumi for discoveries on autophagy, the process by which a cell breaks down and recycles content.

The chemistry prize will be announced on Wednesday and the Nobel Peace Prize on Friday. The economics and literature awards will be announced next week.

Besides the prize money, the winners get a medal and a diploma at the award ceremonies on Dec. 10, the anniversary of prize founder Alfred Nobel's death in 1896.
(FULL STORY)

Methane didn’t warm ancient Earth, new simulations suggest
[9/27/2016]
Methane wasn’t the cozy blanket that kept Earth warm hundreds of millions of years ago when the sun was dim, new research suggests.

By simulating the ancient environment, researchers found that abundant sulfate and scant oxygen created conditions that kept down levels of methane — a potent greenhouse gas — around 1.8 billion to 800 million years ago (SN: 11/14/15, p. 18). So something other than methane kept Earth from becoming a snowball during this dim phase in the sun’s life. Researchers report on this new wrinkle in the so-called faint young sun paradox (SN: 5/4/13, p. 30) the week of September 26 in the Proceedings of the National Academy of Sciences.

Limited oxygen increases the production of microbe-made methane in the oceans. With low oxygen early in Earth’s history, many scientists suspected that methane was abundant enough to keep temperatures toasty. Oxygen may have been too sparse, though. Recent work suggests that oxygen concentrations at the time were as low as a thousandth their present-day levels (SN: 11/28/14, p. 14).

Stephanie Olson of the University of California, Riverside and colleagues propose that such low oxygen concentrations thinned the ozone layer that blocks methane-destroying ultraviolet rays. They also estimate that high concentrations of sulfate in seawater at the time helped sustain methane-eating microbes. Together, these processes severely limited methane to levels similar to those seen today — far too low to keep Earth defrosted.
(FULL STORY)

New 'Artificial Synapses' Pave Way for Brain-Like Computers
[9/27/2016]
A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say.

The so-called memristor, an electrical component whose resistance depends on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse. The researchers said the new device could lead to significant advances in brain-inspired — or neuromorphic — computers, which could be much better at perceptual and learning tasks than traditional computers, as well as far more energy efficient.
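To get a feel for how a memristor's "memory" works, here is a minimal sketch of the classic linear ion-drift model (in the spirit of the well-known HP/Strukov memristor model; every parameter value is illustrative and not taken from the study above):

```python
# Minimal sketch of a linear ion-drift memristor (illustrative parameters).
R_ON, R_OFF = 100.0, 16000.0   # resistance when fully doped / undoped (ohms)
D = 10e-9                      # device thickness (m)
MU = 1e-14                     # dopant mobility (m^2 per volt-second)

def simulate(voltage, t_end, dt=1e-4, w0=0.5):
    """Evolve the normalized doped-region width w under a constant voltage."""
    w = w0
    for _ in range(int(t_end / dt)):
        resistance = R_ON * w + R_OFF * (1.0 - w)  # two regions in series
        current = voltage / resistance              # Ohm's law
        # The dopant front drifts in proportion to the current, so the total
        # charge that has passed determines the resistance -- the "memory."
        w += MU * R_ON / D**2 * current * dt
        w = min(max(w, 0.0), 1.0)
    return R_ON * w + R_OFF * (1.0 - w)

# A positive bias drives charge through and lowers the resistance;
# a negative bias raises it again.
r_after_set = simulate(+1.0, 0.5)
r_after_reset = simulate(-1.0, 0.5)
```

The gradual, history-dependent change in resistance is what makes the device a plausible stand-in for a synapse whose connection strength is tuned by past activity.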
(FULL STORY)

Stephen Hawking Is Still Afraid of Aliens
[9/25/2016]
Humanity should be wary of seeking out contact with alien civilizations, Stephen Hawking has warned once again.

In 2010, the famed astrophysicist said that intelligent aliens may be rapacious marauders, roaming the cosmos in search of resources to plunder and planets to conquer and colonize. He reiterates that basic concern in "Stephen Hawking's Favorite Places," a new documentary streaming now on the CuriosityStream video service.

"One day, we might receive a signal from a planet like this," Hawking says in the documentary, referring to a potentially habitable alien world known as Gliese 832c. "But we should be wary of answering back. Meeting an advanced civilization could be like Native Americans encountering Columbus. That didn't turn out so well."
(FULL STORY)

The Ig Nobel Prize Winners of 2016
[9/23/2016]
The 2016 Ig Nobel Prizes were announced on Sept. 22, revealing the honorees who were deemed to have made achievements that make people laugh and then make them think. In the 26th year of the ceremony, those honored did not disappoint. From rats wearing polyester pants and rock personalities to the science of BS and the satisfaction of mirror scratching, here's a look at this year's winners.
(FULL STORY)

Teleported Laser Pulses? Quantum Teleportation Approaches Sci-Fi Level
[9/23/2016]
Crewmembers aboard the starship Enterprise on the iconic TV series "Star Trek" could "beam up" from planets to starships, making travel between great distances look easy. While these capabilities are clearly fictional, researchers have now performed "quantum teleportation" of laser pulses over several miles within two city networks of fiber optics.

Although the method described in the research will not replace city subways or buses with transporter booths, it could help lead to hack-proof telecommunications networks, as well as a "quantum internet" to help extraordinarily powerful quantum computers talk to one another.

Teleporting an object from one point in the universe to another without it moving through the space in between may sound like science fiction, but quantum physicists have actually been experimenting with quantum teleportation since 1998. The current distance record for quantum teleportation — a feat announced in 2012 — is about 89 miles (143 kilometers), between the two Canary Islands of La Palma and Tenerife, off the northwest coast of Africa.

Quantum teleportation relies on the bizarre nature of quantum physics, which finds that the fundamental building blocks of the universe, such as subatomic particles, can essentially exist in two or more places at once. Specifically, quantum teleportation depends on a strange phenomenon known as "quantum entanglement," in which objects can become linked and influence each other instantaneously, no matter how far apart they are.

Currently, researchers cannot teleport matter (say, a human) across space, but they can use quantum teleportation to beam information from one place to another. The quantum teleportation of an electron, for example, would first involve entangling a pair of electrons. Next, one of the two electrons — the one to be teleported — would stay in one place while the other electron would be physically transported to whatever destination is desired.

Then, the fundamental details or "quantum state" of the electron to be teleported are analyzed — an act that also destroys its quantum state. Finally, that data is sent to the destination, where it can be used on the other electron to recreate the first one, so that it is indistinguishable from the original. For all intents and purposes, that electron has teleported. (Because the data is sent using regular signals such as light pulses or electrons, quantum teleportation can proceed no faster than the speed of light.)
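The recipe above — entangle a pair, Bell-measure the message qubit alongside one half, send two classical bits, apply corrections — can be sketched for a single qubit. This is a minimal, noiseless NumPy simulation for illustration, not the fiber-optic setups used by the Hefei or Calgary teams.

```python
import numpy as np

# Single-qubit basis states and the two correction gates
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip correction
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase-flip correction

# The four Bell states on Alice's two qubits, keyed by the two classical
# bits she will send to Bob.
BELLS = {
    (0, 0): (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2),  # Phi+
    (0, 1): (np.kron(zero, one) + np.kron(one, zero)) / np.sqrt(2),  # Psi+
    (1, 0): (np.kron(zero, zero) - np.kron(one, one)) / np.sqrt(2),  # Phi-
    (1, 1): (np.kron(zero, one) - np.kron(one, zero)) / np.sqrt(2),  # Psi-
}

def teleport(psi):
    """Teleport the qubit state psi from Alice to Bob (ideal, noiseless)."""
    # Step 1: entangle a pair; Alice keeps one half, Bob takes the other.
    state = np.kron(psi, BELLS[(0, 0)])  # order: message, Alice's half, Bob's half
    # Step 2: Alice measures her two qubits in the Bell basis, which
    # destroys the original state; the Born rule picks the outcome.
    outcomes = list(BELLS.items())
    probs, posts = [], []
    for _, b in outcomes:
        proj = np.kron(np.outer(b, b.conj()), np.eye(2))
        post = proj @ state
        probs.append(np.vdot(post, post).real)
        posts.append(post)
    k = np.random.choice(4, p=np.array(probs) / sum(probs))
    (m1, m2), _ = outcomes[k]
    # Step 3: extract Bob's conditional qubit and normalize it.
    bob = BELLS[(m1, m2)].conj() @ posts[k].reshape(4, 2)
    bob /= np.linalg.norm(bob)
    # Step 4: Bob applies corrections based on the two classical bits.
    if m2:
        bob = X @ bob
    if m1:
        bob = Z @ bob
    return bob

psi = np.array([0.6, 0.8j])  # an arbitrary qubit state to teleport
out = teleport(psi)
print(abs(np.vdot(psi, out)))  # 1.0 up to floating-point error: state recreated
```

Note that the two classical bits travel by ordinary signals, which is why the protocol cannot outrun light.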

Now, two research groups independently report quantum teleportation over several miles of fiber-optic networks in the cities of Hefei, China, and Calgary, Alberta. The scientists detailed their findings online Sept. 19 in separate papers in the journal Nature Photonics.
(FULL STORY)

China Claims It Developed "Quantum" Radar To See Stealth Planes
[9/23/2016]
Beijing's state media has made the bold claim that a Chinese defense contractor successfully developed the world's first quantum radar system. The radar can allegedly detect objects at a range of up to 62 miles. If true, this would greatly diminish the value of so-called "stealth" aircraft, including the B-2 and F-22 Raptor fighter. But it's a pretty far-out claim.

Quantum radar is based on the theory of quantum entanglement and the idea that two different particles can share a relationship with one another to the point that, by studying one particle, you can learn things about the other particle—which could be miles away. These two particles are said to be "entangled."

In quantum radars, a photon is split by a crystal into two entangled photons, a process known as "parametric down-conversion." The radar splits multiple photons into entangled pairs—an A and a B, so to speak. The radar system sends one half of the pairs—the As—via microwave beam into the air. The other set, the Bs, remains at the radar base. By studying the photons retained at the radar base, the radar operators can tell what happens to the photons broadcast outward. Did they run into an object? How large was it? How fast was it traveling and in what direction? What does it look like?

Quantum radars defeat stealth by using subatomic particles, not radio waves. Subatomic particles don't care if an object's shape was designed to reduce a traditional, radio wave-based radar signature. Quantum radar would also ignore traditional radar jamming and spoofing methods such as radio-wave radar jammers and chaff.

According to Global Times, the 14th Institute of China Electronics Technology Group Corporation (CETC) developed the radar system last month. The subdivision's website describes the "14th Institute" as "the birthplace of Radar industry (sic) in China", employing 9,000 workers on a 2,000-acre research campus.

China isn't the only country working on quantum radar: Lockheed Martin was granted a patent on a theoretical design in 2008. Lockheed's plans were more far-reaching, including the ability to "visualize useful target details through background and/or camouflaging clutter, through plasma shrouds around hypersonic air vehicles, through the layers of concealment hiding underground facilities, IEDs, mines, and other threats." In many ways, Lockheed's concept of quantum radar resembles the spaceship and handheld sensors on "Star Trek."

Since the 2008 patent, Lockheed's been silent on the subject of quantum radars. Given what a technological leap such a system would be, it's quite possible the research has gone "black"—highly classified and subject to a high level of secrecy.
(FULL STORY)

Earth Wobbles May Have Driven Ancient Humans Out of Africa
[9/22/2016]
Ancient human migrations out of Africa may have been driven by wobbles in Earth's orbit and tilt that led to dramatic swings in climate, a new study finds.

Modern humans first appeared in Africa about 150,000 to 200,000 years ago. It remains a mystery as to why it then took many millennia for people to disperse across the globe. Recent archaeological and genetic findings suggest that migrations of modern humans out of Africa began at least 100,000 years ago, but most humans outside of Africa most likely descended from groups who left the continent more recently — between 40,000 and 70,000 years ago.

Previous research suggested that shifts in climate might help explain why modern human migrations out of Africa happened when they did. For instance, about every 21,000 years, Earth experiences slight changes to its orbit and tilt. These series of wobbles, known as Milankovitch cycles, alter how much sunlight hits different parts of the planet, which in turn influences rainfall levels and the number of people any given region can support.
(FULL STORY)

Alien Planet Has 2 Suns Instead of 1, Hubble Telescope Reveals
[9/22/2016]
Imagine looking up and seeing more than one sun in the sky. Astronomers have done just that: using the Hubble Space Telescope, they announced today (Sept. 22) that a planet previously thought to orbit a single star in fact circles two.

Several planets that revolve around two, three or more stars are known to exist. But this is the first time astronomers have confirmed such a discovery of a so-called "circumbinary planet" by observing a natural phenomenon called gravitational microlensing, or the bending of light caused by strong gravity around objects in space.

In binary-star systems, the two stars orbit a common center of mass. When one star passes in front of the other from our perspective on Earth, gravity from the closer star bends and magnifies the light coming from the star in the background. Astronomers can study this distorted light to find clues about the star in the foreground and any potential planets orbiting the star system.
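The brightening described above has a simple closed form for a single point lens. The sketch below uses the standard textbook point-lens magnification formula, not the specific multi-body analysis used in this discovery.

```python
import numpy as np

# Standard point-lens microlensing magnification: for a background source
# at angular separation u from the lens (in units of the lens's Einstein
# radius), the source is brightened by a factor
#     A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4))
def magnification(u):
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

print(magnification(1.0))  # ~1.34: modest brightening at one Einstein radius
print(magnification(0.1))  # ~10: strong brightening near perfect alignment
```

The sharp rise in magnification as the alignment tightens is what makes microlensing events stand out in survey data, even when the lens itself is too faint to see.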
(FULL STORY)

Glider Will Attempt Record-Breaking Flight to Edge of Space
[9/14/2016]
In a spot in South America known for its powerful winds, scientists and engineers are gearing up to attempt a record-breaking feat: to fly a human-carrying glider to the edge of space.

The expedition, known as Perlan Mission II, aims to take the glider up to an elevation of 90,000 feet (27,000 meters). The project is more than an attempt at aviation history; it's designed to study the layers of Earth's atmosphere. The researchers plan to fly the glider on a series of flights to measure electromagnetic fields, pressure, ozone and methane levels, and more.

To reach such great heights, the glider was built to take advantage of an atmospheric phenomenon called stratospheric mountain waves. Normal mountain waves form between cold and warm air masses as they move across mountain ranges and create high-altitude winds. Stratospheric mountain waves, which the researchers plan to ride, form when the polar vortex — a large, low-pressure and cold air system — reaches peak strength, giving the high-altitude winds more energy.

"The strong winds will be perpendicular to the Andes, and as they come over the mountains, they cause a wave in the air that's invisible unless there are clouds present," Jim Payne, chief pilot for the Perlan Mission II project, told Avionics. "We fly in the area where the air is rising and propagates all the way up to 90,000 feet, although meteorologists say it may go up to 130,000 feet [40,000 m]."

Stratospheric mountain waves occur at peak strength in the Southern Hemisphere's winter months [summer in the Northern Hemisphere], so the Perlan Project team members recently traveled to Patagonia, in South America, where they will await ideal conditions for their first attempt at flying to the edge of space.

"Typically, the polar vortex, which causes the high-altitude wave, is best in August and September," Payne said. "So far, August has been disappointing; we haven't had the high-altitude winds. The one downside of this is that we're totally at the mercy of the weather."

If conditions are right and the flight is successful, Perlan would surpass the world altitude record for a fixed-wing aircraft. The current record of 85,068 feet (25,929 m) was set in 1976 by the SR-71 Blackbird, a jet-powered spy plane, National Geographic reported. Unlike the Blackbird, the Perlan glider would achieve the feat without a drop of fuel.

Earlier this year, another fuel-free aviation record was set: the Solar Impulse 2, a plane powered entirely by the sun, became the first solar-powered aircraft to circumnavigate the globe.
(FULL STORY)

Entangled Particles Reveal Even Spookier Action Than Thought
[9/13/2016]
Sorry, Einstein: It looks like the world is spooky — even when your most famous theory is tossed out. This finding comes from a close look at quantum entanglement, in which two particles that are "entangled" affect each other even when separated by a large distance. Einstein found that his theory of special relativity meant that this weird behavior was impossible, calling it "spooky."

Now, researchers have found that even if they were to scrap this theory, allowing entangled particles to communicate with each other faster than the speed of light or even instantaneously, that couldn't explain the odd behavior. The findings rule out certain "realist" interpretations of spooky quantum behavior.

"What that tells us is that we have to look a little bit deeper," said study co-author Martin Ringbauer, a doctoral candidate in physics at the University of Queensland in Australia. "This kind of action-at-a-distance is not enough to explain quantum correlations" seen between entangled particles, Ringbauer said.

Most of the time, the world seems — if not precisely orderly — then at least governed by fixed rules. At the macroscale, cause and effect rule the behavior of the universe, time always marches forward, and objects in the universe have objective, measurable properties.

But zoom in enough, and those common-sense notions seem to evaporate. At the subatomic scale, particles can become entangled, meaning their fates are bizarrely linked. For instance, if two photons are sent from a laser through a crystal, after they fly off in separate directions, their spin will be linked the moment one of the particles is measured. Several studies have now confirmed that, no matter how far apart entangled particles are, how fast one particle is measured, or how many times particles are measured, their states become inextricably linked once they are measured.

For nearly a century, physicists have tried to understand what this means about the universe. The dominant interpretation was that entangled particles have no fixed position or orientation until they are measured. Instead, both particles travel as the sum of the probability of all their potential positions, and both only "choose" one state at the moment of measurement. This behavior seems to defy Einstein's theory of special relativity, which argues that no information can be transmitted faster than the speed of light. It was so frustrating to Einstein that he famously called it "spooky action at a distance."

To get around this notion, in 1935, Einstein and colleagues Boris Podolsky and Nathan Rosen laid out a paradox that could test the alternate hypothesis that some hidden variable affected the fate of both objects as they traveled. If the hidden variable model were true, that would mean "there's some description of reality which is objective," Ringbauer told Live Science. [Spooky! The Top 10 Unexplained Phenomena]

Then in 1964, Irish physicist John Stewart Bell came up with a mathematical expression, now known as Bell's Inequality, that could experimentally prove Einstein wrong by proving the act of measuring a particle affects its state.

In hundreds of tests since, Einstein's basic explanation for entanglement has failed: Hidden variables can't seem to explain the correlations between entangled particles. But there was still some wiggle room: Bell's Inequality didn't address the situation in which two entangled photons travel faster than light.
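Bell's Inequality can be made concrete in its CHSH form. The sketch below compares the quantum prediction for a singlet pair against the classical bound; the measurement angles are the textbook values that maximize the violation, not the settings used in the study discussed here.

```python
import numpy as np

# CHSH form of Bell's Inequality: any local hidden-variable theory obeys
#   S = |E(a, b) - E(a, b') + E(a', b) + E(a', b')| <= 2,
# where E(a, b) is the correlation between spin measurements at angles
# a and b on the two entangled particles. Quantum mechanics predicts
# E(a, b) = -cos(a - b) for a singlet pair.
def E(a, b):
    return -np.cos(a - b)

# Textbook angles that maximize the quantum violation
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # 2*sqrt(2) ~= 2.83, comfortably above the classical bound of 2
```

Experiments that measure S above 2 are the "hundreds of tests" in which hidden-variable explanations have failed.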

In the new study, however, Ringbauer and his colleagues took a little bit more of that wiggle room away. In a combination of experiments and theoretical calculations, they show that even if a hidden variable were to travel from entangled photon "A" to entangled photon "B" instantaneously, that would not explain the correlations found between the two particles.

The findings may bolster the traditional interpretation of quantum mechanics, but that leaves physicists with other headaches, Ringbauer said. For one, it lays waste to our conventional notions of cause and effect, he said.

For another, it means that measurements and observations are subjective, Ognyan Oreshkov, a theoretical physicist at the Free University of Brussels in Belgium, told Live Science.

If the state of a particle depends on being measured or observed, then who or what is the observer when, for instance, subatomic particles in a distant supernova interact? What is the measurement? Who is "inside" the entangled system and who is on the outside observing it? Depending on how the system is defined, for instance, to include more and more objects and things, the "state" of any given particle may then be different, Ringbauer said.

"You can always draw a bigger box," Ringbauer said.

Still, realists should take heart. The new findings are not a complete death knell for faster-than-light interpretations of entanglement, said Oreshkov, who was not involved in the current study.

The new study "rules out only one specific model where the influence goes from the outcome of one measurement to the outcome of the other measurement," Oreshkov said. In other words, that photon A is talking to photon B at faster-than-light speeds.

Another possibility, however, is that the influence starts earlier, with the correlation in states somehow going from the point at which the photons became entangled (or at some point earlier in the experiment) to the measured photons at the end of the experiment, Oreshkov added. That, however, wasn't tested in the current research, he said. [10 Effects of Faster-Than-Light Travel]

Most physicists who were holding out for a nonlocal interpretation, meaning one not constrained by the speed of light, believe this latter scenario is more likely, said Jacques Pienaar, a physicist who was recently at the University of Vienna in Austria.

"There won't be anybody reading this paper saying, 'Oh, my God, I've been wrong my whole life,'" Pienaar, who was not involved in the current study, told Live Science. "Everybody is going to find it maybe surprising but not challenging, they'll very easily incorporate it into their theories."The new study suggests it may be time to retire Bell's Inequality, Pienaar said.

"I think that people are too focused on, too obsessed with Bell Inequalities," Pienaar said. "I think it's an idea which was really amazing and changed the whole field, but it's run its course."

Instead, a tangential idea laid out in the paper may be more intriguing: the development of a definition of causality on the quantum scale, he said. If people focus on cracking quantum entanglement from these new perspectives, "I think lots of cool discoveries could be made," Pienaar said.
(FULL STORY)

Dark Matter Just Got Murkier
[9/9/2016]
They say that love makes the world go around, and that may well be true. But when you look at things on a much larger scale — say the size of galaxies — love just isn't enough. And, for that matter, neither are the stars of the galaxies themselves. In fact, what makes galaxies go around is a kind of matter that has never been directly observed. That undiscovered "stuff" is called dark matter, and an amazing new measurement was recently announced that is causing the scientific world to rethink long-held ideas.
(FULL STORY)

New 'Gel' May Be Step Toward Clothing That Computes
[9/6/2016]
A gel-like material that can carry out pattern recognition could be a major step toward "materials that compute," with possible applications for "smart" clothing or sensing skins for robots, according to a new study.

Recent advances in both materials and computer science have prompted researchers to look beyond standard silicon-based electronics and exploit the inherent properties of materials to create systems where the material itself is the computer.

Now, a team from the University of Pittsburgh has designed a material that can solve pattern-recognition problems using changes in the oscillations of a chemically powered gel that pulsates like a heart.
(FULL STORY)

3.7-Billion-Year-Old Rock May Hold Earth's Oldest Fossils
[8/31/2016]
Tiny ripples of sediment on ancient seafloor, captured inside a 3.7-billion-year-old rock in Greenland, may be the oldest fossils of living organisms ever found on Earth, according to a new study.

The research, led by Allen Nutman, head of the School of Earth and Environmental Sciences at the University of Wollongong in Australia, described the discovery of what look like tiny waves, 0.4 to 1.5 inches (1 to 4 centimeters) high, frozen in a cross section of an outcrop of rock in the Isua Greenstone Belt in southwestern Greenland, a formation made up of what geologists regard as the oldest rocks on the Earth's surface.

The researchers said the ripples are the fossilized remains of cone-shaped stromatolites, layered mounds of sediment and carbonates that build up around colonies of microbes that grow on the floor of shallow seas or lakes.
(FULL STORY)

Planck: First Stars Formed Later Than We Thought
[9/3/2016]
ESA's Planck satellite has revealed that the first stars in the Universe started forming later than previous observations of the Cosmic Microwave Background indicated. This new analysis also shows that these stars were the only sources needed to account for reionising atoms in the cosmos, having completed half of this process when the Universe had reached an age of 700 million years.
(FULL STORY)

Galaxy Cluster 11.1 Billion Light-Years from Earth Is Most Distant Ever Seen
[8/31/2016]
NASA has just discovered a group of galaxies far, far away — so far, in fact, that it set a new record for the most distant ever discovered. The cluster of galaxies, named CL J1001+0220 (or CL J1001 for short), resides a whopping 11.1 billion light-years from Earth. Astronomers found the distant cluster of galaxies using a combination of observations from NASA's Chandra X-ray Observatory and several other space telescopes.

Of the 11 galaxies in the cluster, nine appear to be experiencing a firestorm of new star births. "This galaxy cluster isn't just remarkable for its distance, it's also going through an amazing growth spurt unlike any we've ever seen," Tao Wang of the French Alternative Energies and Atomic Energy Commission (CEA) and lead investigator in the discovery, said in a statement.
(FULL STORY)

What Earth's Oldest Fossils Mean for Finding Life on Mars
[8/31/2016]
If recent findings on Earth are any guide, the oldest rocks on Mars may have signs of ancient life locked up inside.

In a new study, a team of geologists led by Allen Nutman, of the University of Wollongong in Australia, discovered 3.7-billion-year-old rocks that may contain the oldest fossils of living organisms yet found on Earth, beating the previous record by 220 million years. The discovery suggests that life on Earth appeared relatively quickly, less than 1 billion years after the planet formed, according to the new research, published online today (Aug. 31) in the journal Nature.

If that's the case, then it's possible that Martian rocks of the same age could also have evidence of microbial life in them, said Abigail Allwood, a research scientist at NASA's Jet Propulsion Laboratory in Pasadena, California. Allwood was not involved with the new study but authored an opinion piece about the discovery, which was also published today in Nature.
(FULL STORY)

Earth Just Narrowly Missed Getting Hit by an Asteroid
[8/30/2016]
On Saturday, astronomers discovered a new asteroid, just a few hours before it almost hit us. The asteroid is called 2016 QA2, and it missed the Earth by less than a quarter of the distance to the moon. That puts it about three times as far away from Earth as our farthest satellites. And we never saw it coming.
(FULL STORY)

Astrobiology Primer v2.0 Released
[8/22/2016]
The long-awaited second edition of the Astrobiology Primer is now published in the journal Astrobiology.

This version is an update of the Primer originally published in 2006, written by graduate students and postdoctoral researchers to provide a comprehensive introduction to the field. Redone from scratch, the 2016 version contains updated content that addresses the definition of life in scientific research, the origins of planets and planetary systems, the evolution and interactions of life on Earth, habitability on worlds beyond Earth, the search for life, and the overall implications of the research.

The Primer is intended to be a resource for early-career scientists, especially graduate students, who are new to astrobiology.
(FULL STORY)

A new class of galaxy has been discovered, one made almost entirely of dark matter
[8/25/2016]
Much of the universe is made of dark matter, the mysterious, as-yet-undetected stuff that barely interacts with the "normal" matter around it. In the Milky Way, dark matter outweighs regular matter by about 5 to 1, and very tiny dwarf galaxies are known to contain even more of the stuff.

But now scientists have found something entirely new: a galaxy with the same mass as the Milky Way but with only 1 percent of our galaxy's star power. About 99.99 percent of this other galaxy is made up of dark matter, and scientists believe it may be one of many.

The galaxy Dragonfly 44, described in a study published Thursday in the Astrophysical Journal Letters, is 300 million light-years away. If scientists can track down a similar galaxy closer to home, however, they may be able to use it to make the first direct detection of dark matter.
(FULL STORY)

How We Could Visit the Possibly Earth-Like Planet Proxima b
[8/24/2016]
A potentially Earth-like planet has been discovered orbiting a star located right next door to the sun. Should humanity try to send a probe there as soon as possible?

The newly discovered planet, known as Proxima b, orbits the star Proxima Centauri, the closest star to the sun. Proxima Centauri is about 4.22 light-years — or 25 trillion miles (40 trillion kilometers) — from Earth.

That's a daunting distance. But an initiative announced earlier this year aims to send superfast miniature probes to Proxima Centauri, on a journey that would take about 20 years. With the discovery of Proxima b, the founders of that initiative are even more eager to get going.

In 2015, NASA's New Horizons probe completed its 3-billion-mile (4.8 billion km) journey to Pluto after traveling for about 9.5 years. The spacecraft traveled at speeds topping 52,000 mph (84,000 km/h). At that rate, it would take New Horizons about 54,400 years to reach Proxima Centauri.

Last month, NASA's Juno probe reached speeds of about 165,000 mph (265,000 km/h) as it entered into orbit around Jupiter. At that rate, a probe could reach Proxima Centauri in about 17,157 years. (It should also be noted that there is currently no feasible way to accelerate a craft large enough to carry humans to those speeds.)

In other words, sending a probe to the nearest star system would not be easy.

The founders of the Breakthrough Starshot initiative want to send wafer-thin probes to Proxima Centauri at very high speeds. The plan calls for equipping these probes with thin sails, which would capture the energy imparted by a powerful Earth-based laser.

This laser would accelerate the probes to 20 percent the speed of light (about 134.12 million mph, or 215.85 million km/h), according to the program scientists. At that rate, the probes could reach Proxima Centauri in 20 to 25 years.
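The travel times quoted above follow from straightforward arithmetic. The sketch below uses the article's round figure of 25 trillion miles; small differences from the quoted years come from how the distance is rounded.

```python
# Back-of-the-envelope travel times to Proxima Centauri at the speeds
# quoted above, using the article's distance of about 25 trillion miles.
MILES_TO_PROXIMA = 25e12
HOURS_PER_YEAR = 24 * 365.25

def years_at(mph):
    return MILES_TO_PROXIMA / mph / HOURS_PER_YEAR

print(years_at(52_000))    # New Horizons' pace: tens of thousands of years
print(years_at(165_000))   # Juno's peak speed: still well over 17,000 years
print(years_at(134.12e6))  # 20 percent of light speed: a roughly 21-year trip
```

The gulf between the first two figures and the last is the entire case for laser-driven sails: chemical rockets are off by three orders of magnitude.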

But first, scientists and engineers have to build the apparatus that will launch the tiny probes on their journey. In a news conference today (Aug. 24), Pete Worden, chairman of the Breakthrough Prize Foundation, said that a group of experts had convened earlier this week and discussed plans to build a prototype of the Starshot system. However, he added that the full-scale apparatus is at least 20 years off.

"We certainly hope that, within a generation, we can launch these nanoprobes," Worden said. "And so perhaps 20, 25 years from now, we could begin to launch them, and then they would travel for 25 years to get there."

He added that building the full-scale apparatus would likely cost about the same as building the Large Hadron Collider, the largest particle accelerator in the world; that project is estimated to have cost about $10 billion.

"Over the next decade, we will work with experts here at ESO [the European Southern Observatory] and elsewhere to get as much information as possible about the Proxima Centauri planet … even including whether it might bear life, prior to launching mankind's first probe towards the star," Worden said.

Worden said the Breakthrough Prize Foundation also hopes to "obtain similar data about the other nearby stars, Alpha Centauri A and B." (The two Alpha Centauri stars lie about 4.37 light-years from Earth; some astronomers think Proxima Centauri and the Alpha Centauri stars are part of the same system.)

The New Horizons mission to Pluto was a good demonstration of the benefits of sending a probe to study a planet (or dwarf planet). Images of Pluto captured by the world's most powerful telescopes could barely resolve any surface features on the icy world. During its 2015 flyby, New Horizons provided an incredibly detailed view of Pluto's surface and a boatload of new information about its history.

Could a wafer-thin probe sent to Proxima Centauri b reveal similar details about the planet, or perhaps even reveal the presence of life?

There would be some significant limitations to how much information the probes proposed by Breakthrough Starshot would be able to send back to Earth. First and foremost, the data would take 4.22 years to travel back to Earth, on top of the 20 to 25 years it would take the probe to get to Proxima Centauri.

Seth Shostak, a senior astronomer at the SETI Institute (SETI stands for "search for extraterrestrial intelligence"), told Space.com that the prospect of sending a miniature probe to Proxima Centauri is "even more interesting now than it was ... six months ago because now we know there is a planet there."

"I think [the discovery of Proxima b] has real implications for sending something physical to the star system because now there's a target of interest," Shostak said.

But he also brought up some of the unknown variables that people will have to consider when investing in Breakthrough Starshot, including what kind of information the probes could send back from the planet. Those wafer-thin probes would have to carry very small instruments, and thus might be able to do only a very rudimentary study of a planet or star.

It's difficult to predict the exact technology that would be on board, because electrical components and other technical gear will likely continue to shrink in size over the next 20 years. Scientists and engineers would have to consider whether, in the time it would take for information to come back from a probe sent to Proxima Centauri, they could build a telescope capable of gathering the same information.

Penelope Boston, director of NASA's Astrobiology Institute, thinks the continuing trend of hardware miniaturization will make it possible to equip a wafer-thin probe with instrumentation that would make a trip to Proxima Centauri well worth the investment. Boston said the intricate details of a planet's surface can create a huge variety of specific habitats, and resolving the details of those environments on a planet outside Earth's solar system is "certainly beyond the resolution of any conceivable telescope."

"I see the trends in all different kinds of instrumentation going in a kind of ['Star Trek'] tricorder direction, where you have more and more capability packaged into ever-small physical space," Boston told Space.com.
(FULL STORY)

'Virtual' Particles Are Just 'Wiggles' in the Electromagnetic Field
[8/22/2016]
There are a few physics terms floating around in the world that are deceptive little buggers. These jargon phrases seem to succinctly describe a topic, encapsulating a complex process or interaction into a tidy, easily digestible nugget of information. But they're liars. The concepts they're intended to communicate are actually radically different from what the jargon would suggest.

Take, for example, "virtual particles." The term is supposed to answer a very old question: How, exactly, do particles interact? Let's say we have two charged particles, and let's call them Charles and Charlene. Let's continue to say that both Charles and Charlene are negatively charged. Maybe they're electrons; maybe they're muons. Doesn't matter. What matters is that if Charlene comes racing toward Charles, they bounce off each other and end up going their separate ways.

How did that bounce happen? What made it possible for Charles and Charlene to communicate with each other so that they knew to head in a new direction when the collision was all said and done? This is a fantastically basic question, so it seems that if we could satisfactorily answer it, we could unlock Deep and Important Mysteries of the Universe.

The modern perspective of quantum field theory recognizes photons — bits of light — as the carriers of the electromagnetic force. Charles and Charlene are charged particles, so they interact with light. But obviously, Charles and Charlene aren't shooting lasers at each other, so the trite explanation for their brief dalliance is that "they exchange virtual photons."
What in the name of Feynman's ghost does that mean?

Let's take a step back. In the olden-days view of physics (i.e., the 19th century), each charged particle generates an electric field, which is basically an instruction sheet for how other particles can interact with it. This field is strong near the particle and weaker farther out, and it points outward in every direction away from the particle.

So our Charles particle produces a field that permeates all of space. Other particles, like Charlene, can read this field and move accordingly. If Charlene is super-duper far away from Charles, the field she reads has very, very small numbers, so she barely notices any effect from Charles. But when she gets close, her field reader goes off the charts. Charles' electric field is very clearly saying "GO AWAY," and she obliges.

In this view, the field is just as real and important as the particle. The universe is full of stuff, and the fields tell that stuff how to interact with other stuff.

In the early to mid-20th century, physicists realized that the universe is a much, much stranger place than we had imagined. Marrying special relativity with quantum mechanics, they developed quantum field theory, and let's just say the results weren't what anybody expected.

As the name suggests, the field got a promotion. Instead of just being the bookkeeping device that showed how one particle should interact with another, it became — and here come some italics for emphasis — the primary physical object. In this modern, sophisticated view of the universe, the electron isn't just a lonely particle. Oh no. Instead, there's an electron field, permeating all of space and time like milk in French toast.

This field is it — it's the thing. Particles? They're just pinched-off bits of that field. Or, more accurately, they're excitations (like, wiggles) of the field that can travel freely. That's important, and I'll get back to it soon.

Here's where things start to get fuzzy. A particle traveling from one spot to another doesn't exactly stay a particle, or at least not the same kind of particle.
Let's go back to Charles, the charged particle. Since he's charged, by definition he interacts with light, which is the electromagnetic field. So wiggles in the electron field (a field made up of electrons) can affect wiggles in the electromagnetic field. So, literally, as Charles zips around, he spends some of his time as an electron-field wiggle and some of his time as an electromagnetic-field wiggle. Sometimes he's an electron, and sometimes he's a photon — a bit of the electromagnetic (EM) field!

It gets worse. Way worse. Charles-turned-EM-wiggle can become other wiggles, like muon wiggles. For every fundamental particle in the universe, there's a corresponding field, and they all talk to one another and wiggle back and forth constantly.

All the wiggles and sub-wiggles and sub-sub-wiggles add up to what we call "an electron traveling from one spot to another." The math becomes really nasty very quickly, but folks like physicist Richard Feynman came up with handy tricks to get some science work done.
Now, after tons of backstory, we can get to the main question. The fields wiggle to and fro (and sometimes fro and to). If the wiggles persist and travel, we call them "particles." If they die off quickly, we call them "virtual particles." But fundamentally, they're both wiggles of fields.

When Charles encounters Charlene, they're not like two little bullets ready to slam into each other. Instead, they're complicated sets of wiggles in all sorts of fields, phasing in and out from one type of field to another.

When they do get close enough to interact, it's … messy. Very messy. Wiggles and counter-wiggles, a frenzied mishmash of intermingling. The machinery of quantum field theory — after many tedious calculations — does indeed provide the correct answer (Charles and Charlene bounce off each other), but the details are headache-inducing.

So, the shorthand — "they exchange virtual particles" — rolls off the tongue quite easily, a little slip of jargon to package up a very complicated process. But, unfortunately, it's not very accurate.
(FULL STORY)

Are tiny BLACK HOLES hitting Earth once every 1,000 years? Experts claim primordial phenomenon could explain dark matter
[8/19/2016]
Earlier this year, experts predicted that dark matter may be made of black holes formed during the first second of our universe's existence.
Known as primordial black holes, they could be hitting our own planet every 1,000 years, the professor behind the theory has now revealed.
The Nasa study claimed this interpretation aligns with our knowledge of cosmic infrared and X-ray background glows and may explain the unexpectedly high masses of merging black holes.
(FULL STORY)

‘Largest structure in the universe’ undermines fundamental cosmic principles
[8/16/2016]
Just in time for the hype surrounding No Man’s Sky, the game that takes cosmic scale to the extreme, a team of astronomers say they’ve discovered what might be the largest structure in the observable universe. The tremendous feature consists of nine gamma-ray bursts (GRB), forming a ring that is streaking across some 5 billion light years through space, according to a paper published in Monthly Notices of the Royal Astronomical Society.

The ring's diameter stretches more than 70 times that of the full moon as seen from Earth. And, as the GRBs each appear to be about 7 billion light years away, the probability that these features are positioned in this way by chance is just one in 20,000, according to lead author Professor Lajos Balazs of the Konkoly Observatory in Budapest.

Amazingly, the team of astronomers discovered the cosmic ring by accident. “Originally, we studied the space distribution of gamma ray bursts,” Balazs told Digital Trends. “GRBs are the most energetic transients in the universe and the only observed objects sampling the observable universe as a whole. In general, we were interested to conclude whether the universe is homogeneous and isotropic on large scale.
“We were totally surprised,” he added, “because we did not expect to find it.”
However, there are reasons to step back and reconsider the discovery — it seems to undermine our established understanding of how the universe developed.

According to the cosmological principle, the structure of the universe is uniform at its largest scale and its largest structures are theoretically limited to 1.2 billion light years across. This new discovery pushes that limit nearly five-fold.

Balazs and his team used telescopes in space and observatories on Earth to identify the structure. They will now investigate whether the cosmological principle and other processes of galaxy formation can account for the ring structure. If not, theories about the formation of the cosmos may need to be rewritten.

“If we are right,” Balazs commented in a press release, “this structure contradicts the current models of the universe. It was a huge surprise to find something this big – and we still don’t quite understand how it came to exist at all.”
(FULL STORY)

"Kitchen Smoke" in nebula offer clues to the building blocks of life
[8/17/2016]
Using data collected by NASA's Stratospheric Observatory for Infrared Astronomy (SOFIA) and other observatories, an international team of researchers has studied how a particular type of organic molecule - a raw material for life - could develop in space. This information could help scientists better understand how life could have developed on Earth.

Bavo Croiset of Leiden University in the Netherlands and his collaborators focused on a type of molecule called polycyclic aromatic hydrocarbons (PAHs), which are flat molecules consisting of carbon atoms arranged in a honeycomb pattern, surrounded by hydrogen. PAHs make up about 10 percent of the carbon in the universe, and are found on Earth, where they are released by the burning of organic material such as meat, sugarcane and wood.
Croiset's team determined that when PAHs in the nebula NGC 7023, also known as the Iris Nebula, are hit by ultraviolet radiation from the nebula's central star, they evolve into larger, more complex molecules. Scientists hypothesize that the growth of complex organic molecules like PAHs is one of the steps leading to the emergence of life.

Some existing models predicted that the radiation from a newborn, nearby massive star would tend to break down large organic molecules into smaller ones, rather than build them up. To test these models, researchers wanted to estimate the size of the molecules at various locations relative to the central star.

Croiset's team used SOFIA to observe Nebula NGC 7023 with two instruments, the FLITECAM near-infrared camera and the FORCAST mid-infrared camera. SOFIA's instruments are sensitive to two wavelengths that are produced by these particular molecules, which can be used to estimate their size.

The team analyzed the SOFIA images in combination with data previously obtained by the Spitzer infrared space observatory, the Hubble Space Telescope and the Canada-France-Hawaii Telescope on the Big Island of Hawaii.

The analysis indicates that the sizes of the PAH molecules in this nebula vary by location in a clear pattern. The average size of the molecules in the nebula's central cavity, surrounding the illuminating star, is larger than on the surface of the cloud at the outer edge of the cavity.

In a paper published in Astronomy and Astrophysics, the team concluded that this molecular size variation is due both to some of the smallest molecules being destroyed by the harsh ultraviolet radiation field of the star, and to medium-sized molecules being irradiated so they combine into larger molecules. Researchers were surprised to find that the radiation resulted in net growth, rather than destruction.

"The success of these observations depended on both SOFIA's ability to observe wavelengths inaccessible from the ground, and the large size of its telescope, which provided a more detailed map than would have been possible with smaller telescopes," said Olivier Berne at CNRS, the National Center for Scientific Research in Toulouse, France, one of the published paper's co-authors.
(FULL STORY)

Brian Krill: Evolution of the 21st-Century Scientist
[8/12/2016]
Throughout the past year, I've struggled with how best to define myself as a scientist. At times, I even questioned whether it's appropriate to refer to myself as a scientist.

The thing is, even though I have a PhD in experimental psychology and multiple scientific publications to my name, I no longer work in a traditional scientific setting. That is to say, I don't teach or carry out my own research at a college or university, like many scientists. Indeed, according to a recent survey conducted by the National Science Foundation (NSF), roughly 45 percent of PhD recipients in science work at four-year educational institutions.
I freely chose to leave academia about a year and a half ago because, at the time, doing so was the best thing for my family. Nonetheless, the transition was difficult, especially after so much time devoted to preparation for what I thought would be my lifelong career—four years of undergraduate education followed by five years of graduate training and another two years of postdoctoral training.

After 18 months, I'm at last fully adjusted to life on the outside. But to get to this point, I’ve had to challenge my own heavily ingrained assumptions about what it means to be a scientist. For years, I had clung to the belief, likely held by so many others, that being a scientist necessarily means being an academic and a scholar. This view is wrong—more so now than ever—because the economic landscape for scientists is changing. As such, it might well be time for the entire scientific community to rethink what it means to be a scientist.
(FULL STORY)

Simulated black hole experiment backs Hawking prediction
[8/16/2016]
Prof Jeff Steinhauer simulated a black hole in a super-cooled state of matter called a Bose-Einstein condensate. In the journal Nature Physics, he describes having observed the equivalent of a phenomenon called Hawking radiation - predicted to be released by black holes. Prof Hawking first argued for its existence in 1974. "Classical" physics dictates that the gravity of a black hole is so strong that nothing, not even light, can escape. So Hawking's idea relies on quantum mechanics - the realm of physics which takes hold at very small scales. These quantum effects allow black holes to radiate particles in a process which, over vast stretches of time, would ultimately cause the black hole to evaporate.

But the amount of radiation emitted is small, so the phenomenon has never actually been observed in an astrophysical black hole. Prof Steinhauer, from the Technion - Israel Institute of Technology in Haifa, uncovered evidence that particles were spontaneously escaping his replica black hole. Furthermore, these were "entangled" (or linked) with partner particles being pulled into the hole - a key signature of Hawking radiation. The Bose-Einstein condensate used in the experiment is created when matter, in this case a cloud of rubidium atoms inside a tube, is cooled to near the temperature known as absolute zero, -273C.
In this environment, sound travels at just half a millimetre per second. By speeding up the atoms partway along the tube, to faster than that speed, Prof Steinhauer created a sort of "event horizon" for sound waves. It was packets of sound waves, called "phonons", that played the part of entangled particles on the fringe of a black hole.

The findings do not help answer one of the trickiest puzzles about black hole physics: the Information Paradox. One of the implications of Hawking's theory is that physical information - for example, about properties of a sub-atomic particle - is destroyed when black holes emit Hawking radiation.
But this violates one of the rules of quantum theory. Toby Wiseman, a theoretical physicist at Imperial College London, told BBC News: "Analogues are very interesting from an experimental and technological point of view. But I don't think we're ever going to learn anything about actual black holes [from these simulations]. What it is doing is confirming the ideas of Hawking, but in this analogue setting."
Dr Wiseman, who was not involved with the research, compared the idea of the Bose-Einstein condensate simulation to water in a bathtub. "It relies on the fact that there's a precise mathematical analogue between the physics of particles near black holes and ripples in flowing fluids... It's an elegant idea that goes back some way. "If you pull the plug in a bath, you create a flow down the plug, and the ripples on the water get dragged down the plughole. The flow gets quicker as it gets toward the plughole and if you have a system where the flow is going faster than the speed of the ripples, when those ripples flow past some point near the plughole, they can never come back out." Dr Wiseman said this point was equivalent to the event horizon - the point of no return for matter being drawn in by the gravity of a black hole.
(FULL STORY)

Deuteron joins proton as smaller than expected
[8/11/2016]
According to the international Committee on Data for Science and Technology (CODATA), the charge radius of the proton is 0.8768(69) fm. Few researchers would give that number much thought if not for measurements in 2010 and 2013 that yielded a radius 4% smaller than and 7.2 standard deviations distant from the CODATA value. Randolf Pohl of the Max Planck Institute of Quantum Optics in Garching, Germany, and colleagues obtained the curiously low radius after analyzing the energy-level shifts of muons orbiting hydrogen nuclei. With a mass 207 times that of the electron, a muon has a tighter orbital that more closely overlaps the nuclear charge distribution, which makes the negatively charged particle a useful tool for probing nuclear dimensions. The discrepancy between the results of muon-based and other experimental investigations has come to be known as the proton radius puzzle.

Now Pohl and his colleagues have used the same technique to measure the radius of the deuteron, a nucleus of one proton and one neutron. The researchers shot a beam of muons at a target of D2 gas. Lasers excited some of the atoms whose electrons were replaced by muons and probed the muons’ energy-level transitions. By combining the measurements with theory, the researchers came up with a deuteron charge radius of 2.12562(78) fm. That’s 7.5 σ smaller than the CODATA value (see graph below; the new result is in red). In addition, both the proton and deuteron sizes are in tension with the values obtained by applying the same technique to atoms with electrons rather than muons.
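The quoted 7.5 σ tension can be sanity-checked with a short error-propagation calculation. The electron-based CODATA deuteron radius is not given in the text; the sketch below assumes the commonly cited value of 2.1424(21) fm:

```python
# Back-of-envelope check of the deuteron radius tension.
# Assumed value: CODATA deuteron charge radius 2.1424(21) fm
# (not quoted in the article); the muonic result is from the text.
import math

r_codata, sig_codata = 2.1424, 0.0021    # fm, electron-based CODATA value (assumed)
r_muonic, sig_muonic = 2.12562, 0.00078  # fm, Pohl et al. muonic measurement

diff = r_codata - r_muonic
sigma = math.sqrt(sig_codata**2 + sig_muonic**2)  # combined 1-sigma uncertainty
tension = diff / sigma

print(f"difference = {diff:.5f} fm, tension = {tension:.1f} sigma")
```

With these inputs the difference of about 0.017 fm divided by the combined uncertainty reproduces the reported 7.5 σ discrepancy.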
(FULL STORY)

Scientists Identify 20 Alien Worlds Most Likely to Be Like Earth
[8/10/2016]
Astronomers are narrowing the field in their search for a "second Earth."

An international team of researchers has identified the 20 most Earth-like worlds among the more than 4,000 exoplanet candidates that NASA's Kepler space telescope has detected to date, scientists report in a new study.

All 20 potential "second Earths" lie within the habitable zones of their sun-like stars — meaning they should be able to harbor liquid water on their surfaces — and are likely rocky, the researchers said.

Identifying these Earth-like planets is important in the hunt for alien life, said study lead author Stephen Kane, an associate professor of physics and astronomy at San Francisco State University (SFSU).

"[It] means we can focus in on the planets in this paper and perform follow-up studies to learn more about them, including if they are indeed habitable," Kane said in a statement.

Kane and his team sorted through the 216 habitable-zone Kepler planets and candidates found so far. (A "candidate" is a world that has yet to be confirmed by follow-up observations or analysis. Kepler has found about 4,700 candidates to date, more than 2,300 of which have been confirmed; about 90 percent of all candidates should eventually turn out to be the real deal, mission team members have said.)

Second-Earth candidates had to be safely within the habitable zone. If a planet is too close to the inner edge, it could experience a runaway greenhouse effect like the one that occurred on Venus. And if it's too close to the outer edge, the planet could end up being a frigid world like Mars, the researchers said.
(FULL STORY)

‘Alien Megastructure’ Star Mystery Deepens After Fresh Kepler Data Confirms Erratic Dimming
[8/8/2016]
The case of Tabby’s star keeps getting curiouser and curiouser. The star — formally known by its somewhat clunky name KIC 8462852 — has been a source of endless intrigue for astronomers since September, when its bizarre behavior triggered speculations over the existence of an alien civilization around it.

The WTF (Where’s the Flux) star is now back in the news, and it's more mysterious than ever.

Before we delve into the latest findings, released online through the preprint server arXiv, here is a quick lowdown of the story so far:

Last fall, a team of scientists led by Tabetha Boyajian from Yale University, who lends the object its informal name, "Tabby's star," reported that the star was not behaving as it should. Based on observations conducted using NASA's Kepler Space Telescope between 2009 and 2013, the team witnessed two unusual incidents, in 2011 and 2013, when the star's light dimmed in dramatic, never-before-seen ways.

This dimming indicated that something had passed in front of the star — located between the constellations Cygnus and Lyra. At the time, a swarm of comets was proposed as the most likely explanation.

However, this is not when Tabby’s star captured the public’s imagination. That happened a month later, in October, when Jason Wright, an astronomer from Penn State University, put forth the idea that the swarm of objects around the star is “something you would expect an alien civilization to build.”

In other words, he suggested that the swarm may be an “alien megastructure,” or a giant Dyson sphere, built by a technologically advanced species to harness the star’s energy.

Unfortunately, two subsequent independent searches, specially tailored to detect alien radio signals and laser pulses, drew a blank. And earlier this year, a study based on analysis of photographic plates of the sky dating back to the late 19th century argued that even the comet-swarm idea, the best of the remaining proposals, cannot explain the star's erratic dimming — although that study's findings were widely disputed.

In the new paper — which is yet to be peer-reviewed — Caltech astronomer Ben Montet and Joshua Simon of the Carnegie Institution for Science detail their analysis of photometric data of the star gathered by the Kepler space telescope.

Their finding: the star was definitely dimming at a rate that defies explanation over the four years Kepler monitored it. For instance, in the first 1,000 days of Kepler's observations, the star's luminosity dipped by roughly 0.34 percent per year, before dropping dramatically by 2.5 percent in a span of just 200 days — something that suggests that the long-term dimming hypothesis may very well be true.

“We note these results are apparent in data from each individual detector, not just the combined light curve, suggesting that the decline in flux is an astrophysical effect rather than an instrumental one,” the researchers wrote in the paper. “We offer no definitive explanation that could explain the observed light curve in this work. The effect could be stellar in nature, although there are no known mechanisms that would cause a main-sequence F star to dim in brightness by 2.5 percent over a few months. The effect could also be caused by a passing dust cloud in orbit around KIC 8462852.”

Although the study seems to rule out all the possible explanations that have been put forward so far — a swarm of comets, planetary fragments, or a distorted star — it does not mean the dimming is caused by an alien megastructure. It just means we still don’t have an explanation for what is causing the star’s light pattern to dip erratically.

“The new paper states, and I agree, that we don’t have any really good models for this sort of behavior,” Wright told Gizmodo. “That’s exciting!”

Those looking for a satisfactory explanation would now pin their hopes on a team of researchers led by Boyajian, which recently ran a successful crowdfunding campaign to secure observation time at the Las Cumbres Observatory Global Telescope Network — a privately run network of telescopes set up around the globe to ensure continuous monitoring of an object.

For now, though, “the most mysterious star in our galaxy” continues to live up to its fame.
(FULL STORY)

Scientists develop new form of light
[8/7/2016]
Physicists have described a new form of light produced by binding photons to single electrons. According to their study, published Friday in Nature Communications, the mashed-up particles could be used in new photonic circuits and allow the study of quantum phenomena on the visible scale.

“The results of this research will have a huge impact on the way we conceive light,” said lead author Vincenzo Giannini in a statement. Photons are the basic particle component of light. When these particles come in contact with a material, they interact with numerous electrons on the material’s surface. Dr. Giannini, who lectures at Imperial College London’s physics department, sought to study the interaction of photons with a "recently discovered" class of materials called topological insulators.
Giannini’s team of physicists and material scientists developed digital models that would predict the interaction. Their models, which were based on a single nanoparticle made of a topological insulator, showed that light could interact with just one surface electron.

By coupling the two particles, researchers could combine certain properties of both. Light normally travels in a straight line, but when bound to a single electron, it could follow the electron’s path along a material surface. And, while electrons usually stop when they encounter a poor conductor, the addition of photons would allow the coupled particle to continue moving.
(FULL STORY)

New particle hopes fade as LHC data 'bump' disappears
[8/5/2016]
Results from the Large Hadron Collider show that a "bump" in the machine's data, previously rumoured to represent a new particle, has gone away.
The discovery of new particles, which could trigger a paradigm shift in physics, may still be years away. All the latest LHC results are being discussed at a conference in Chicago.
David Charlton of Birmingham University, leader of the Atlas experiment at the LHC, told BBC News that everyone working on the project was disappointed.
"There was a lot of excitement when we started to collect data. But in the [latest results] we see no sign of a bump, there's nothing. "It is a pity because it would have been a really fantastic thing if there had been a new particle."
Speaking to journalists in Chicago at the International Conference on High Energy Physics (ICHEP), Prof Charlton said it was a remarkable coincidence - but purely a coincidence - that two separate LHC detectors, Atlas and CMS, picked up matching "bumps". "It just seems to be a statistical fluke, that the two experiments saw something at the same mass. "Coincidences are always strange when they happen - but we've been looking very hard at our data to make sure we fully understand them, and we don't see anything in the new sample."
(FULL STORY)

Moon Express cleared for lunar landing
[8/3/2016]
Moon Express has become the first private firm to win US approval for an unmanned mission to the moon.
The two-week mission was given the go-ahead by the Federal Aviation Administration's Office of Commercial Space Transportation.
The plan is to send a suitcase-sized lander to the moon in late 2017.
The lander, which is not yet completed, will be carried on a rocket made by Rocket Lab, a start-up firm which has not launched any commercial missions.
Science experiments and some commercial cargo will be carried on the one-way trip to the lunar surface.
Moon Express also plans to beam pictures back to the Earth.
What if you could mine the moon?
"The Moon Express 2017 mission approval is a landmark decision by the US government and a pathfinder for private sector commercial missions beyond Earth's orbit," said Moon Express co-founder Bob Richards.
His partner, Naveen Jain, says the company is keen to explore the possibilities of mining on the moon.
"In the immediate future we envision bringing precious resources, metals and moon rocks back to Earth," he said.
(FULL STORY)

First Reprogrammable Quantum Computer Created
[8/3/2016]
Scientists have created the first programmable and reprogrammable quantum computer, according to a new study.

The technology could usher in a much-anticipated era of quantum computing, which researchers say could help scientists run complex simulations and produce rapid solutions to tricky calculations.
(FULL STORY)

Tiny 'Atomic Memory' Device Could Store All Books Ever Written
[7/18/2016]
A new "atomic memory" device that encodes data atom by atom can store hundreds of times more data than current hard disks can, a new study finds.

"You would need just the area of a postage stamp to write out all books ever written," said study senior author Sander Otte, a physicist at the Delft University of Technology's Kavli Institute of Nanoscience in the Netherlands.

In fact, the researchers estimated that if they created a cube 100 microns wide — about the same diameter as the average human hair — made of sheets of atomic memory separated from one another by 5 nanometers, or billionths of a meter, the cube could easily store the contents of the entire U.S. Library of Congress.
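The cube claim can be roughly checked with a back-of-envelope estimate. A minimal sketch, assuming an areal density of about 500 terabits per square inch (the figure reported for the underlying chlorine-vacancy device, not quoted in this summary):

```python
# Rough estimate of the "atomic memory" cube's capacity.
# Assumption: areal density ~500 terabits per square inch,
# the figure reported for the device in the underlying study.
IN2_TO_M2 = 0.0254**2              # square inches to square metres
density = 500e12 / IN2_TO_M2       # bits per square metre
side = 100e-6                      # cube edge: 100 microns
spacing = 5e-9                     # 5 nm between atomic sheets

sheets = side / spacing            # ~20,000 stacked sheets
bits_per_sheet = density * side**2
total_bytes = sheets * bits_per_sheet / 8

print(f"~{total_bytes / 1e12:.0f} TB")
```

This comes out to roughly 19 TB, and the Library of Congress's print collection is often estimated at around 10 TB of text, so the cube would indeed hold it comfortably.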
(FULL STORY)

Error fix for long-lived qubits brings quantum computers nearer
[7/20/2016]
Useful quantum computers are one step closer, thanks to the latest demonstration of a technique designed to stop them making mistakes.

Quantum computers store information as quantum bits, or qubits. Unlike binary bits, which store a 0 or a 1, qubits can hold a mixture of both states at the same time, boosting their computing potential for certain types of problems. But qubits are fragile – their quantum nature means they can’t hold data for long before errors creep in.

Error-correcting codes are designed to provide wiggle room, making it possible to recover from errors. A similar concept is used to handle errors in binary bits on hard drives and DVDs, but things are more difficult in the quantum realm. The rules of quantum mechanics mean you can't directly read the state of a qubit without destroying it — it's like opening the box to take a look at Schrödinger's cat. That means we need more sophisticated codes for quantum computers than for DVDs.
(FULL STORY)

2 Newfound Alien Planets May Be Capable of Supporting Life
[7/18/2016]
NASA's Kepler space telescope has spotted four possibly rocky alien planets orbiting the same star, and two of these newfound worlds might be capable of supporting life.

The four exoplanets circle a red dwarf — a star smaller and dimmer than the sun — called K2-72, which lies 181 light-years from Earth in the Aquarius constellation. All four worlds are between 20 percent and 50 percent wider than Earth, making them good candidates to be rocky, discovery team members said.
(FULL STORY)

Scientists create vast 3-D map of universe, validate Einstein's theories
[7/15/2016]
Hundreds of scientists worked together to map out one-quarter of the sky – and they weren't just plotting landmarks for a road trip to Alpha Centauri.

Through painstaking, complex measurements that reached into the earliest chapters of the universe, they charted a 3-D model of 650 billion cubic light years of space that included 1.2 million galaxies. It was created with an important goal – to measure dark energy.

When scientists recently discovered that the expansion of the universe was accelerating, dark energy was suggested as the “anti-gravity” force responsible for it, perhaps even as the “cosmological constant” that Einstein envisioned. Another theory argued that gravity itself was breaking down – not particularly encouraging in a universe thought to be ruled by its laws.

Results from this massive project, however, have affirmed the laws of gravity and general relativity on a universal level, providing evidence that dark energy is indeed responsible for the accelerated expansion of the universe. The research indicated that this energy is consistent with Albert Einstein’s suggestion of Lambda, a cosmological constant that is a repellent force countering attraction between matter.
(FULL STORY)

New Science Experiment Will Tell Us If The Universe Is a Hologram
[7/10/2016]
Do we live in a two-dimensional hologram? An experiment at the U.S. Department of Energy's Fermi National Accelerator Laboratory may find out: it will look for "holographic noise," collecting data over the next few years to test whether we live in a Matrix-like illusion.

Although our world appears three-dimensional to us, this could be entirely illusory. Scientists are theorizing that our universe may operate much like a TV screen; just like a TV screen has pixels which create seamless images, spacetime could be contained in two-dimensional packets that make reality appear three-dimensional to the human eye.

Like a hologram, a three-dimensional image would be coded onto a two-dimensional medium. If this theory is correct, these “pixels” of spacetime would be ten trillion trillion times smaller than an atom.
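"Ten trillion trillion" is of order 10^25, which is roughly the ratio between the size of an atom and the Planck length. A quick check, assuming a ~0.1 nm atom and taking the "pixel" size to be of order the Planck length (an assumption; the article does not state the scale explicitly):

```python
# Sanity check on "ten trillion trillion times smaller than an atom".
# Assumptions: the spacetime "pixel" is of order the Planck length,
# and a typical atom is ~0.1 nm across.
PLANCK_LENGTH = 1.616e-35   # metres
ATOM_SIZE = 1e-10           # metres, ~1 angstrom

ratio = ATOM_SIZE / PLANCK_LENGTH
print(f"atom size / Planck length ~ {ratio:.0e}")
```

The ratio is about 6 x 10^24, consistent with the article's "ten trillion trillion" figure.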

“We want to find out whether space-time is a quantum system just like matter is,” said Craig Hogan, director of Fermilab’s Center for Particle Astrophysics. “If we see something, it will completely change ideas about space we’ve used for thousands of years.”
If these packets of spacetime exist, then they are expected to be governed by the Heisenberg uncertainty principle. According to this principle, it is impossible to simultaneously ascertain the exact speed and exact location of a subatomic particle. If spacetime consists of two-dimensional fragments, then all of space would potentially be governed by this principle. If this were the case, then space, like matter, would have perpetual quantum vibrations regardless of its energy level.
(FULL STORY)

Three's Company: Newly Discovered Planet Orbits a Trio of Stars
[7/7/2016]
A new planet, HD 131399Ab, in a triple star system was recently discovered about 340 light-years from Earth in the constellation of Centaurus.
Credit: ESO/L. Calçada
A newly discovered planet has been spotted orbiting three stars at once, in a highly exotic celestial arrangement.

"The planet is orbiting star A — the lonely star in this scenario," Kevin Wagner, a first-year doctoral student at the University of Arizona, told Space.com. The planet and star A are then orbited by a pair of stars that the scientists call "star B" and "star C."

The strange new world, HD 131399Ab, lies 340 light-years from Earth, in the constellation Centaurus. For about half of its orbit through the system, all three stars are visible in the sky.
(FULL STORY)

A Fifth Force: Fact or Fiction?
[7/5/2016]
Science and the internet have an uneasy relationship: Science tends to move forward through a careful and tedious evaluation of data and theory, and the process can take years to complete. In contrast, the internet community generally has the attention span of Dory, the absent-minded fish of "Finding Nemo" (and now "Finding Dory") — a meme here, a celebrity picture there — oh, look … a funny cat video.

Thus people who are interested in serious science should be extremely cautious when they read an online story that purports to be a paradigm-shifting scientific discovery. A recent example is one suggesting that a new force of nature might have been discovered. If true, that would mean that we have to rewrite the textbooks.

So what has been claimed?

In an article submitted on April 7, 2015, to the arXiv repository of physics papers, a group of Hungarian researchers reported on a study in which they focused an intense beam of protons (particles found in the center of atoms) on thin lithium targets. The collisions created excited nuclei of beryllium-8, which decayed into ordinary beryllium-8 and pairs of electron-positron particles. (The positron is the antimatter equivalent of the electron.)

They claimed that their data could not be explained by known physical phenomena in the Standard Model, the reigning model governing particle physics. But, they purported, they could explain the data if a new particle existed with a mass of approximately 17 million electron volts, which is 32.7 times heavier than an electron and just shy of 2 percent the mass of a proton. The particles that emerge at this energy range, which is relatively low by modern standards, have been well studied. And so it would be very surprising if a new particle were discovered in this energy regime.
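
Those mass comparisons are easy to verify. A quick check, taking the mass reported in the Hungarian paper as 16.7 MeV (rounded to "approximately 17" above), with standard values for the electron and proton rest mass-energies:

```python
# Verifying the quoted mass comparisons for the proposed ~17 MeV particle.
boson_mev = 16.7       # mass-energy reported in the Hungarian paper
electron_mev = 0.511   # electron rest mass-energy
proton_mev = 938.3     # proton rest mass-energy

electron_ratio = boson_mev / electron_mev
proton_fraction = boson_mev / proton_mev
print(f"{electron_ratio:.1f} electron masses")    # 32.7 electron masses
print(f"{proton_fraction:.1%} of a proton mass")  # 1.8% of a proton mass
```

Both figures match the article: about 32.7 electron masses, and just shy of 2 percent of a proton's mass.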

However, the measurement survived peer review and was published on Jan. 26, 2016, in Physical Review Letters, one of the most prestigious physics journals in the world. With this publication, the researchers and their measurement cleared an impressive hurdle.

Their measurement received little attention until a group of theoretical physicists from the University of California, Irvine (UCI), turned their attention to it. As theorists commonly do with a controversial physics measurement, the team compared it with the body of work that has been assembled over the last century or so, to see if the new data are consistent or inconsistent with the existing body of knowledge. In this case, they looked at about a dozen published studies.

What they found is that though the measurement didn't conflict with any past studies, it seemed to be something never before observed — and something that couldn't be explained by the Standard Model.
(FULL STORY)

Neutrinos hint at why antimatter didn’t blow up the universe
[7/4/2016]
It could all have been so different. When matter first formed in the universe, our current theories suggest that it should have been accompanied by an equal amount of antimatter – a conclusion we know must be wrong, because we wouldn’t be here if it were true. Now the latest results from a pair of experiments designed to study the behaviour of neutrinos – particles that barely interact with the rest of the universe – could mean we’re starting to understand why.
(FULL STORY)

Relativistic codes reveal a clumpy universe
[6/28/2016]
Two international teams of physicists have independently developed new codes that, for the first time, apply Einstein's complete general theory of relativity to simulate how our universe evolved. The codes pave the way for cosmologists to confirm whether our interpretations of observations of large-scale structure and cosmic expansion are telling us the true story.
The impetus to develop codes designed to apply general relativity to cosmology stems from the limitations of traditional numerical simulations of the universe. Currently, such models invoke Newtonian gravity and assume a homogeneous universe when describing cosmic expansion, for reasons of simplicity and computing power. On the largest scales the universe is homogeneous and isotropic, meaning that matter is distributed evenly in all directions; but on smaller scales the universe is clearly inhomogeneous, with matter clumped into chains of galaxies and filaments of dark matter assembled around vast voids.
(FULL STORY)

Quantum computer makes first high-energy physics simulation
[6/22/2016]
Physicists have performed the first full simulation of a high-energy physics experiment — the creation of pairs of particles and their antiparticles — on a quantum computer. If the team can scale it up, the technique promises access to calculations that would be too complex for an ordinary computer to deal with.

To understand exactly what their theories predict, physicists routinely do computer simulations. They then compare the outcomes of the simulations with actual experimental data to test their theories.

In some situations, however, the calculations are too hard to allow predictions from first principles. This is particularly true for phenomena that involve the strong nuclear force, which governs how quarks bind together into protons and neutrons and how these particles form atomic nuclei, says Christine Muschik, a theoretical physicist at the University of Innsbruck in Austria and a member of the simulation team.

Many researchers hope that future quantum computers will help to solve this problem. These machines, which are still in the earliest stages of development, exploit the physics of objects that can be in multiple states at once, encoding information in ‘qubits’, rather than in the on/off state of classical bits. A computer made of a handful of qubits can perform many calculations simultaneously, and can complete certain tasks exponentially faster than an ordinary computer.
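
A rough illustration of that exponential scaling: an n-qubit state is described by 2ⁿ complex amplitudes, so the cost of simulating it classically doubles with every added qubit. The sketch below assumes 16 bytes per complex amplitude, a common double-precision representation:

```python
# Why qubit counts matter: an n-qubit quantum state has 2**n complex
# amplitudes, so classical simulation memory doubles per added qubit.
# 16 bytes per complex amplitude (double precision) is assumed here.
for n in (4, 30, 40):
    amplitudes = 2 ** n
    mem_gib = amplitudes * 16 / 2 ** 30
    print(f"{n:2d} qubits: 2^{n} = {amplitudes} amplitudes (~{mem_gib:g} GiB)")
```

Four qubits (the experiment below) are trivial to simulate, but by 40 qubits the state alone needs roughly 16 TiB of memory, which is why a few dozen qubits can already be interesting for physics simulations.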

Coaxing qubits
Esteban Martinez, an experimental physicist at the University of Innsbruck, and his colleagues completed a proof of concept for a simulation of a high-energy physics experiment in which energy is converted into matter, creating an electron and its antiparticle, a positron.

The team used a tried-and-tested type of quantum computer in which an electromagnetic field traps four ions in a row, each one encoding a qubit, in a vacuum. They manipulated the ions’ spins — their magnetic orientations — using laser beams. This coaxed the ions to perform logic operations, the basic steps in any computer calculation.

After sequences of about 100 steps, each lasting a few milliseconds, the team looked at the state of the ions using a digital camera. Each of the four ions represented a location, two for particles and two for antiparticles, and the orientation of the ion revealed whether or not a particle or an antiparticle had been created at that location.

The team’s quantum calculations confirmed the predictions of a simplified version of quantum electrodynamics, the established theory of the electromagnetic force. “The stronger the field, the faster we can create particles and antiparticles,” Martinez says. He and his collaborators describe their results on 22 June in Nature.

Four qubits constitute a rudimentary quantum computer; the fabled applications of future quantum computers, such as for breaking down huge numbers into prime factors, will require hundreds of qubits and complex error-correction codes. But for physical simulations, which can tolerate small margins of error, 30 to 40 qubits could already be useful, Martinez says.

John Chiaverini, a physicist who works on quantum computing at the Massachusetts Institute of Technology in Cambridge, says that the experiment might be difficult to scale up without significant modifications. The linear arrangement of ions in the trap, he says, is “particularly limiting for attacking problems of a reasonable scale”. Muschik says that her team is already making plans to use two-dimensional configurations of ions.

Are we there yet?
“We are not yet there where we can answer questions we can’t answer with classical computers,” Martinez says, “but this is a first step in that direction.” Quantum computers are not strictly necessary for understanding the electromagnetic force. However, the researchers hope to scale up their techniques so that they can simulate the strong nuclear force. This may take years, Muschik says, and will require not only breakthroughs in hardware, but also the development of new quantum algorithms.

These scaled-up quantum computers could help in understanding what happens during the high-speed collision of two atomic nuclei, for instance. Faced with such a problem, classical computer simulations just fall apart, says Andreas Kronfeld, a theoretical physicist who works on simulations of the strong nuclear force at the Fermi National Accelerator Laboratory (Fermilab) near Chicago, Illinois.

Another example, he says, is understanding neutron stars. Researchers think that these compact celestial objects consist of densely packed neutrons, but they’re not sure. They also don’t know the state of matter in which those neutrons would exist.
(FULL STORY)

E.T. Phones Earth? 1,500 Years Until Contact, Experts Estimate
[6/20/2016]
"Communicating with anybody is an incredibly slow, long-duration endeavor," said Evan Solomonides at a press conference June 14 at the American Astronomical Society's summer meeting in San Diego, California. Solomonides is an undergraduate student at Cornell University in New York, where he worked with Cornell radio astronomer Yervant Terzian to explore the mystery of the Fermi paradox: If life is abundant in the universe, the argument goes, it should have contacted Earth, yet there's no definitive sign of such an interaction.
(FULL STORY)

We could be on the brink of a shockingly big discovery in physics
[6/16/2016]
It's December 15, 2015, and an auditorium in Geneva is packed with physicists. The air is filled with tension and excitement because everybody knows that something important is about to be announced. The CERN Large Hadron Collider (LHC) has recently restarted operations at the highest energies ever achieved in a laboratory experiment, and the first new results from two enormous, complex detectors known as ATLAS and CMS are being presented. This announcement has been organized hastily because both detectors have picked up something completely unexpected. Rumors have been circulating for days about what it might be, but nobody knows for sure what is really going on, and the speculations are wild.
(FULL STORY)

Scientists have detected gravitational waves for the second time
[6/15/2016]
Scientists with the LIGO collaboration claim they have once again detected gravitational waves — the ripples in space-time produced by objects moving throughout the Universe. It’s the second time these researchers have picked up gravitational wave signals, after becoming the first team in history to do so earlier this year.
(FULL STORY)

No Escape From Black Holes? Stephen Hawking Points to a Possible Exit
[6/6/2016]
“A black hole has no hair.”

That mysterious, koan-like statement by the theorist and legendary phrasemaker John Archibald Wheeler of Princeton has stood for half a century as one of the brute pillars of modern physics.

It describes the ability of nature, according to classical gravitational equations, to obliterate most of the attributes and properties of anything that falls into a black hole, playing havoc with science’s ability to predict the future and tearing at our understanding of how the universe works.

Now it seems that statement might be wrong.

Recently Stephen Hawking, who has spent his entire career battling a form of Lou Gehrig’s disease, wheeled across the stage in Harvard’s hoary, wood-paneled Sanders Theater to do battle with the black hole. It is one of the most fearsome demons ever conjured by science, and one partly of his own making: a cosmic pit so deep and dense and endless that it was long thought that nothing — not even light, not even a thought — could ever escape.

But Dr. Hawking was there to tell us not to be so afraid.

In a paper to be published this week in Physical Review Letters, Dr. Hawking and his colleagues Andrew Strominger of Harvard and Malcolm Perry of Cambridge University in England say they have found a clue pointing the way out of black holes.

“They are not the eternal prisons they were once thought,” Dr. Hawking said in his famous robot voice, now processed through a synthesizer. “If you feel you are trapped in a black hole, don’t give up. There is a way out.”

Black holes are the most ominous prediction of Einstein’s general theory of relativity: Too much matter or energy concentrated in one place would cause space to give way, swallowing everything inside like a magician’s cloak.

An eternal prison was the only metaphor scientists had for these monsters until 40 years ago, when Dr. Hawking turned black holes upside down — or perhaps inside out. His equations showed that black holes would not last forever. Over time, they would “leak” and then explode in a fountain of radiation and particles. Ever since, the burning question in physics has been: When the black hole finally goes, does it give up the secrets of everything that fell in?

Dr. Hawking’s calculation was, and remains, hailed as a breakthrough in understanding the connection between gravity and quantum mechanics, between the fabric of space and the subatomic particles that live inside it — the large and the small in the universe.

But there was a hitch. By Dr. Hawking’s estimation, the radiation coming out of the black hole as it fell apart would be random. As a result, most of the “information” about what had fallen in — all of the attributes and properties of the things sucked in, whether elephants or donkeys, Volkswagens or Cadillacs — would be erased.

In a riposte to Einstein’s famous remark that God does not play dice, Dr. Hawking said in 1976, “God not only plays dice with the universe, but sometimes throws them where they can’t be seen.”

But his calculation violated a tenet of modern physics: that it is always possible in theory to reverse time, run the proverbial film backward and reconstruct what happened in, say, the collision of two cars or the collapse of a dead star into a black hole.

The universe, like a kind of supercomputer, is supposed to be able to keep track of whether one car was a green pickup truck and the other was a red Porsche, or whether one was made of matter and the other antimatter. These things may be destroyed, but their “information” — their essential physical attributes — should live forever.

In fact, the information seemed to be lost in the black hole, according to Dr. Hawking, as if part of the universe’s memory chip had been erased. According to the no-hair theorem, only information about the mass, charge and angular momentum of what went in would survive.

Nothing about whether it was antimatter or matter, male or female, sweet or sour.

A war of words and ideas ensued. The information paradox, as it is known, was no abstruse debate, as Dr. Hawking pointed out from the stage of the Sanders Theater in April. Rather, it challenged foundational beliefs about what reality is and how it works.

If the rules break down in black holes, they may be lost in other places as well, he warned. If foundational information disappears into a gaping maw, the notion of a “past” itself may be in jeopardy — we couldn’t even be sure of our own histories. Our memories could be illusions.

“It’s the past that tells us who we are. Without it we lose our identity,” he said.

Fortunately for historians, Dr. Hawking conceded defeat in the black hole information debate 10 years ago, admitting that advances in string theory, the so-called theory of everything, had left no room in the universe for information loss.

At least in principle, then, he agreed, information is always preserved — even in the smoke and ashes when you, say, burn a book. With the right calculations, you should be able to reconstruct the patterns of ink, the text.

Dr. Hawking paid off a bet with John Preskill, a Caltech physicist, with a baseball encyclopedia, from which information can be easily retrieved.

But neither Dr. Hawking nor anybody else was able to come up with a convincing explanation for how that happens and how all this “information” escapes from the deadly erasing clutches of a black hole.

Indeed, a group of physicists four years ago tried to figure it out and suggested controversially that there might be a firewall of energy just inside a black hole that stops anything from getting out or even into a black hole.

The new results do not address that issue. But they do undermine the famous notion that black holes have “no hair” — that they are shorn of the essential properties of the things they have consumed.

About four years ago, Dr. Strominger started noodling around with theoretical studies about gravity dating to the early 1960s. Interpreted in a modern light, the papers — published in 1962 by Hermann Bondi, M. G. J. van der Burg, A. W. K. Metzner and Rainer Sachs, and in 1965 by Steven Weinberg, later a recipient of the Nobel Prize — suggested that gravity was not as ruthless as Dr. Wheeler had said.

Looked at from the right vantage point, black holes might not be bald at all.

The right vantage point is not from a great distance in space — the normal assumption in theoretical calculations — but from a far distance in time, the far future, technically known as “null infinity.”

“Null infinity is where light rays go if they are not trapped in a black hole,” Dr. Strominger tried to explain over coffee in Harvard Square recently. From this point of view, you can think of light rays on the surface of a black hole as a bundle of straws all pointing outward, trying to fly away at the speed of, of course, light. Because of the black hole’s immense gravity, they are stuck.

But the individual straws can slide inward or outward along their futile tracks, slightly advancing or falling back, under the influence of incoming material. When a particle falls into a black hole, it slides the straws of light back and forth, a process called a supertranslation.

That leaves a telltale pattern on the horizon, the invisible boundary that is the point of no return of a black hole — a halo of “soft hair,” as Dr. Strominger and his colleagues put it. That pattern, like the pixels on your iPhone or the wavy grooves in a vinyl record, contains information about what has passed through the horizon and disappeared.

“One often hears that black holes have no hair,” Dr. Strominger and a postdoctoral researcher, Alexander Zhiboedov, wrote in a 2014 paper. Not true: “Black holes have a lush infinite head of supertranslation hair.”

Enter Dr. Hawking.

For years, he and Dr. Strominger and a few others had gotten together to work in seclusion at a Texas ranch owned by the oilman and fracking pioneer George P. Mitchell. Because Dr. Hawking was discouraged from flying, in April 2014 the retreat was in Hereford, Britain. It was there that Dr. Hawking first heard about soft hair — and was very excited. He, Dr. Strominger and Dr. Perry began working together.

In Stockholm that fall, he made a splash when he announced that a resolution to the information paradox was at hand — somewhat to the surprise of Dr. Strominger and Dr. Perry, who had been trying to maintain an understated stance.

Although information gets hopelessly scrambled, Dr. Hawking declared, it “can be recovered in principle, but it is lost for all practical purposes.”

In the paper, they are at pains to admit that knocking the pins out from under the no-hair theorem is a far cry from solving the information paradox. But it is progress.

Their work suggests that science has been missing something fundamental about how black holes evaporate, Dr. Strominger said. And now they can sharpen their questions. “I hope we have the tiger by the tail,” he said.

Whether or not soft hair is enough to resolve the information paradox, nobody really knows. Reaction from other physicists has been reserved.
(FULL STORY)

Surprise! The Universe Is Expanding Faster Than Scientists Thought
[6/3/2016]
The universe is expanding 5 to 9 percent faster than astronomers had thought, a new study suggests.

"This surprising finding may be an important clue to understanding those mysterious parts of the universe that make up 95 percent of everything and don't emit light, such as dark energy, dark matter and dark radiation," study leader Adam Riess, an astrophysicist at the Space Telescope Science Institute and Johns Hopkins University in Baltimore, said in a statement.

Riess — who shared the 2011 Nobel Prize in physics for the discovery that the universe's expansion is accelerating — and his colleagues used NASA's Hubble Space Telescope to study 2,400 Cepheid stars and 300 Type Ia supernovas.
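
The "5 to 9 percent" range can be reproduced with illustrative round numbers, assuming a locally measured Hubble constant near 73 km/s/Mpc compared against earlier estimates in the high-60s (the precise published values differ slightly):

```python
# Rough check of the "5 to 9 percent faster" figure. The Hubble-constant
# values below are illustrative round numbers, not the exact published ones:
# ~73 km/s/Mpc from the local measurement versus earlier, lower estimates.
h0_local = 73.0
for h0_prior in (67.0, 70.0):
    excess = (h0_local / h0_prior - 1) * 100
    print(f"vs {h0_prior:.0f} km/s/Mpc: {excess:.1f}% faster")
```

Against a prior value of 67 km/s/Mpc the excess is about 9 percent; against 70 km/s/Mpc it is closer to 4-5 percent, bracketing the quoted range.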
(FULL STORY)

Building Blocks of Life Found in Comet's Atmosphere
[5/27/2016]
For the first time, scientists have directly detected a crucial amino acid and a rich selection of organic molecules in the dusty atmosphere of a comet, further bolstering the hypothesis that these icy objects delivered some of life's ingredients to Earth.

The amino acid glycine, along with some of its precursor organic molecules and the essential element phosphorus, were spotted in the cloud of gas and dust surrounding Comet 67P/Churyumov-Gerasimenko by the Rosetta spacecraft, which has been orbiting the comet since 2014. While glycine had previously been extracted from cometary dust samples that were brought to Earth by NASA's Stardust mission, this is the first time that the compound has been detected in space, naturally vaporized.

The discovery of those building blocks around a comet supports the idea that comets could have played an essential role in the development of life on early Earth, researchers said.
(FULL STORY)

Quantum cats here and there
[5/27/2016]
The story of Schrödinger's cat being hidden away in a box and being both dead and alive is often invoked to illustrate how peculiar the quantum world can be. In a twist on the dead/alive behavior, Wang et al. now show that the cat can be in two separate locations at the same time. Constructing their cat from coherent microwave photons, they show that the state of the “electromagnetic cat” can be shared by two separated cavities. Going beyond common-sense absurdities of the classical world, the ability to share quantum states in different locations could be a powerful resource for quantum information processing.
(FULL STORY)

Planet 1,200 Light-years Away Is A Good Prospect For Habitability
[5/26/2016]
A distant planet known as Kepler-62f could be habitable, a team of astronomers reports.
The planet, which is about 1,200 light-years from Earth in the direction of the constellation Lyra, is approximately 40 percent larger than Earth. At that size, Kepler-62f is within the range of planets that are likely to be rocky and possibly could have oceans, said Aomawa Shields, the study's lead author and a National Science Foundation astronomy and astrophysics postdoctoral fellow in UCLA's department of physics and astronomy.

NASA's Kepler mission discovered the planetary system that includes Kepler-62f in 2013, and it identified Kepler-62f as the outermost of five planets orbiting a star that is smaller and cooler than the sun. But the mission didn't produce information about Kepler-62f's composition or atmosphere or the shape of its orbit.

Shields collaborated on the study with astronomers Rory Barnes, Eric Agol, Benjamin Charnay, Cecilia Bitz and Victoria Meadows, all of the University of Washington, where Shields earned her doctorate. To determine whether the planet could sustain life, the team came up with possible scenarios about what its atmosphere might be like and what the shape of its orbit might be.

"We found there are multiple atmospheric compositions that allow it to be warm enough to have surface liquid water," said Shields, a University of California President's Postdoctoral Program Fellow. "This makes it a strong candidate for a habitable planet."
(FULL STORY)

Has a Hungarian Physics Lab Found a Fifth Force of Nature?
[5/25/2016]
A laboratory experiment in Hungary has spotted an anomaly in radioactive decay that could be the signature of a previously unknown fifth fundamental force of nature, physicists say—if the finding holds up.
Attila Krasznahorkay at the Hungarian Academy of Sciences’ Institute for Nuclear Research in Debrecen, Hungary, and his colleagues reported their surprising result in 2015 on the arXiv preprint server, and this January in the journal Physical Review Letters. But the report, which posited the existence of a new, light boson only 34 times heavier than the electron, was largely overlooked.

Then, on April 25, a group of US theoretical physicists brought the finding to wider attention by publishing its own analysis of the result on arXiv. The theorists showed that the data didn’t conflict with any previous experiments—and concluded that it could be evidence for a fifth fundamental force. “We brought it out from relative obscurity,” says Jonathan Feng, at the University of California, Irvine, the lead author of the arXiv report.
Four days later, two of Feng's colleagues discussed the finding at a workshop at the SLAC National Accelerator Laboratory in Menlo Park, California. Researchers there were sceptical but excited about the idea, says Bogdan Wojtsekhowski, a physicist at the Thomas Jefferson National Accelerator Facility in Newport News, Virginia. “Many participants in the workshop are thinking about different ways to check it,” he says. Groups in Europe and the United States say that they should be able to confirm or rebut the Hungarian experimental results within about a year.
(FULL STORY)

Silicon quantum computers take shape in Australia
[5/24/2016]
Silicon is at the heart of the multibillion-dollar computing industry. Now, efforts to harness the element to build a quantum processor are taking off, thanks to elegant designs from an Australian collaboration.
In July, the Centre for Quantum Computation and Communication Technology, which is based at the University of New South Wales (UNSW) in Sydney, will receive the first instalment of a Aus$46-million (US$33-million) investment. The money comes from government and industry sources whose goal is to create a practical quantum computer.

At an innovation forum in London on 6 May, hosted by Nature and start-up accelerator Entrepreneur First, two physicists from a group at the UNSW pitched a plan to reach that goal. Their audience was a panel of entrepreneurs and scientists, who critiqued ideas for commercializing a range of quantum technologies, including sensors, computer security and a quantum internet as well as quantum computers.

So far, the UNSW team has demonstrated quantum bits, or qubits, in only a single atom. Useful computations will require linking qubits across multiple atoms. But the team’s silicon qubits hold their quantum state nearly a million times longer than do systems made from superconducting circuits, a leading alternative, UNSW physicist Guilherme Tosi told participants at the event. This helps the silicon qubits to perform operations with one-sixth the errors of superconducting circuits.

If the team can pull off this low error rate in a larger system, it would be “quite amazing”, said Hartmut Neven, director of engineering at Google and a member of the panel. But he cautioned that in terms of performance, the system is far behind others. The team is aiming for ten qubits in five years, but both Google and IBM are already approaching this with superconducting systems. And in five years, Google plans to have ramped up to hundreds of qubits.
(FULL STORY)

New Support for Alternative Quantum View
[5/16/2016]
An experiment claims to have invalidated a decades-old criticism against pilot-wave theory, an alternative formulation of quantum mechanics that avoids the most baffling features of the subatomic universe.

Of the many counterintuitive features of quantum mechanics, perhaps the most challenging to our notions of common sense is that particles do not have locations until they are observed. This is exactly what the standard view of quantum mechanics, often called the Copenhagen interpretation, asks us to believe. Instead of the clear-cut positions and movements of Newtonian physics, we have a cloud of probabilities described by a mathematical structure known as a wave function. The wave function, meanwhile, evolves over time, its evolution governed by precise rules codified in something called the Schrödinger equation. The mathematics are clear enough; the actual whereabouts of particles, less so. Until a particle is observed, an act that causes the wave function to “collapse,” we can say nothing about its location. Albert Einstein, among others, objected to this idea. As his biographer Abraham Pais wrote: “We often discussed his notions on objective reality. I recall that during one walk Einstein suddenly stopped, turned to me and asked whether I really believed that the moon exists only when I look at it.”

But there’s another view — one that’s been around for almost a century — in which particles really do have precise positions at all times. This alternative view, known as pilot-wave theory or Bohmian mechanics, never became as popular as the Copenhagen view, in part because Bohmian mechanics implies that the world must be strange in other ways. In particular, a 1992 study claimed to crystallize certain bizarre consequences of Bohmian mechanics and in doing so deal it a fatal conceptual blow. The authors of that paper concluded that a particle following the laws of Bohmian mechanics would end up taking a trajectory that was so unphysical — even by the warped standards of quantum theory — that they described it as “surreal.”

Nearly a quarter-century later, a group of scientists has carried out an experiment in a Toronto laboratory that aims to test this idea. And if their results, first reported earlier this year, hold up to scrutiny, the Bohmian view of quantum mechanics — less fuzzy but in some ways more strange than the traditional view — may be poised for a comeback.
(FULL STORY)

Dark matter does not include certain axion-like particles
[5/17/2016]
Scientists believe dark matter makes up about 80 percent of the matter in the universe. What exactly constitutes dark matter? Scientists still aren't sure. A new study, published this week in the journal Physical Review Letters, lengthens the list of particles ruled out as dark matter candidates. Astronomers have previously hypothesized that axion-like particles, or ALPs, might make up dark matter. Given their diminutive mass -- a billionth that of a single electron -- it was a logical guess.

But when researchers at Stockholm University used NASA's gamma-ray telescope on the Fermi satellite to look for ALPs in the Perseus galaxy cluster, they came up empty-handed. ALPs can be briefly transformed into light-emitting matter when they travel through intense electromagnetic fields. Likewise, light particles like gamma radiation can briefly transform into ALPs. No such transformations, however, were detected near the center of the Perseus cluster.

While the research didn't offer any revelations on the makeup of dark matter, scientists believe they can now exclude certain types of ALPs in the ongoing search for the elusive matter.

"The ALPs we have been able to exclude could explain a certain amount of dark matter," Manuel Meyer, a physicist at Stockholm University, said in a news release. "What is particularly interesting is that with our analysis we are reaching a sensitivity that we thought could only be obtained with dedicated future experiments on Earth."

Scientists discover new form of light
[5/17/2016]
Researchers in Ireland have discovered a new form of light. Their discovery is expected to reshape scientists' understanding of light's basic nature.

Angular momentum describes the rotation of a light beam around its axis. Until now, researchers believed a beam's angular momentum was always a whole-number multiple of the reduced Planck constant -- Planck's constant, the ratio between photon energy and frequency that sets the scale for quantum mechanics, divided by 2π.

The newly discovered form of light, however, features photons whose angular momentum is shifted by half a unit -- a half-integer value previously thought to be impossible. The difference sounds small, but researchers say the significance of the discovery is great.

"For a beam of light, although traveling in a straight line it can also be rotating around its own axis," John Donegan, a professor at Trinity College Dublin's School of Physics, explained in a news release. "So when light from the mirror hits your eye in the morning, every photon twists your eye a little, one way or another."

"Our discovery will have real impacts for the study of light waves in areas such as secure optical communications," Donegan added.

Researchers made their discovery after passing light through special crystals to create a light beam with a hollow, screw-like structure. Using quantum mechanics, the physicists predicted that the photons in such a beam should carry angular momentum shifted by a half-integer from the usual whole-number values.

The team of researchers then designed a device to measure the beam's angular momentum as it passed through the crystal. As they had predicted, they registered a shift in the flow of photons caused by quantum effects.

The researchers described their discovery in a paper published this week in the journal Science Advances.

"What I think is so exciting about this result is that even this fundamental property of light, that physicists have always thought was fixed, can be changed," concluded Paul Eastham, assistant professor of physics at Trinity.
(FULL STORY)

How light is detected affects the atom that emits it
[5/13/2016]
Flick a switch on a dark winter day and your office is flooded with bright light, one of many everyday miracles to which we are all usually oblivious. A physicist would probably describe what is happening in terms of the particle nature of light. An atom or molecule in the fluorescent tube that is in an excited state spontaneously decays to a lower energy state, releasing a particle called a photon.

When the photon enters your eye, something similar happens but in reverse. The photon is absorbed by a molecule in the retina and its energy kicks that molecule into an excited state. Light is both a particle and a wave, and this duality is fundamental to the physics that rule the Lilliputian world of atoms and molecules. Yet it would seem that in this case the wave nature of light can be safely ignored.

Kater Murch, assistant professor of physics in Arts and Sciences at Washington University in St. Louis, might give you an argument about that. His lab is one of the first in the world to look at spontaneous emission with an instrument sensitive to the wave rather than the particle nature of light, work described in the May 20th issue of Nature Communications.

His experimental instrument consists of an artificial atom (actually a superconducting circuit with two states, or energy levels) and an interferometer, in which the electromagnetic wave of the emitted light interferes with a reference wave of the same frequency.

This manner of detection turns everything upside down, he said. All that a photon detector can tell you about spontaneous emission is whether an atom is in its excited state or its ground state. But the interferometer catches the atom diffusing through a quantum "state space" made up of all the possible combinations, or superpositions, of its two energy states.

This is actually trickier than it sounds because the scientists are tracking a very faint signal (the electromagnetic field associated with one photon), and most of what they see in the interference pattern is quantum noise. But the noise carries complementary information about the state of the artificial atom that allows them to chart its evolution.

When viewed in this way, the artificial atom can move from a lower energy state to a higher energy one even as it follows the inevitable downward trajectory to the ground state. "You'd never see that if you were detecting photons," Murch said.

So different detectors see spontaneous emission very differently. "By looking at the wave nature of light, we are able see this lovely diffusive evolution between the states," Murch said.

But it gets stranger. The fact that an atom's average excitation can increase even when it decays is a sign that how we look at light might give us some control over the atoms that emitted the light, Murch said.

This might sound like a reversal of cause and effect, with the effect pushing on the cause. It is possible only because of one of the weirdest of all the quantum effects: When an atom emits light, quantum physics requires the light and the atom to become connected, or entangled, so that measuring a property of one instantly reveals the value of that property for the other, no matter how far away it is.

Or put another way, every measurement of an entangled object perturbs its entangled partner. It is this quantum back-action, Murch said, that could potentially allow a light detector to control the light emitter.

"Quantum control has been a dream for many years," Murch said. "One day, we may use it to enhance fluorescence imaging by detecting the light in a way that creates superpositions in the emitters. That's very long term, but that's the idea," he said.
(FULL STORY)

AI learns and recreates Nobel-winning physics experiment
[5/16/2016]
Australian physicists, perhaps searching for a way to shorten the work week, have created an AI that can run and even improve a complex physics experiment with little oversight. The research could eventually allow human scientists to focus on high-level problems and research design, leaving the nuts and bolts to a robotic lab assistant.

The experiment the AI performed was the creation of a Bose-Einstein condensate, a hyper-cold gas, the process for which won three physicists the Nobel Prize in 2001. It involves using directed radiation to slow a group of atoms nearly to a standstill, producing all manner of interesting effects.
The Australian National University team cooled a bit of gas down to 1 microkelvin — that’s a millionth of a degree above absolute zero — then handed over control to the AI. It then had to figure out how to apply its lasers and control other parameters to best cool the atoms down to a few hundred nanokelvin, and over dozens of repetitions, it found more and more efficient ways to do so.
“It did things a person wouldn’t guess, such as changing one laser’s power up and down, and compensating with another,” said ANU’s Paul Wigley, co-lead researcher, in a news release. “I didn’t expect the machine could learn to do the experiment itself, from scratch, in under an hour. It may be able to come up with complicated ways humans haven’t thought of to get experiments colder and make measurements more precise.”
Bose-Einstein condensates have strange and wonderful properties, and their extreme sensitivity to fluctuations in energy make them useful for other experiments and measurements. But that same sensitivity makes the process of creating and maintaining them difficult. The AI monitors many parameters at once and can adjust the process quickly and in ways that humans might not understand, but which are nevertheless effective.
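The article doesn't spell out which learning algorithm the team used, but the general closed-loop idea — propose laser settings, run the experiment, keep whatever produced the coldest gas — can be sketched as a toy. Below, `run_experiment` is a made-up stand-in for the real apparatus, and the two "laser power" parameters are hypothetical; a real online optimizer would be far more sophisticated than this random search.

```python
import random

def run_experiment(p1, p2):
    """Stand-in for the real apparatus: returns a mock final temperature
    (in nanokelvin) as a function of two laser-power settings."""
    return 300 + (p1 - 0.7) ** 2 * 1000 + (p2 - 0.4) ** 2 * 800

def optimize(trials=200, seed=0):
    """Simplest possible closed loop: propose random settings, run the
    'experiment', and remember whichever settings gave the coldest gas."""
    rng = random.Random(seed)
    best_params, best_temp = None, float("inf")
    for _ in range(trials):
        p1, p2 = rng.uniform(0, 1), rng.uniform(0, 1)
        temp = run_experiment(p1, p2)
        if temp < best_temp:
            best_params, best_temp = (p1, p2), temp
    return best_params, best_temp

params, temp = optimize()
print(params, temp)  # settings converge toward the mock 300 nK optimum
```

The appeal of handing this loop to a machine is exactly what Wigley describes: it can explore parameter combinations (raising one laser while lowering another) that a human operator would not think to try.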

The result: condensates can be created faster, under more conditions, and in greater quantities. Not to mention the AI doesn’t eat, sleep, or take vacations.

“It’s cheaper than taking a physicist everywhere with you,” said the other co-lead researcher, Michael Hush, of the University of New South Wales. “You could make a working device to measure gravity that you could take in the back of a car, and the artificial intelligence would recalibrate and fix itself no matter what.”

This AI is extremely specific in its design, of course, and can’t be applied as-is to other problems; for more flexible automation, physicists will still have to rely on the general-purpose research units called “graduate students.”
(FULL STORY)

Scientists Talk Privately About Creating a Synthetic Human Genome
[5/13/2016]
Scientists are now contemplating the fabrication of a human genome, meaning they would use chemicals to manufacture all the DNA contained in human chromosomes.

The prospect is spurring both intrigue and concern in the life sciences community because it might be possible, such as through cloning, to use a synthetic genome to create human beings without biological parents.

While the project is still in the idea phase, and also involves efforts to improve DNA synthesis in general, it was discussed at a closed-door meeting on Tuesday at Harvard Medical School in Boston. The nearly 150 attendees were told not to contact the news media or to post on Twitter during the meeting.

Organizers said the project could have a big scientific payoff and would be a follow-up to the original Human Genome Project, which was aimed at reading the sequence of the three billion chemical letters in the DNA blueprint of human life. The new project, by contrast, would involve not reading, but rather writing the human genome — synthesizing all three billion units from chemicals.

But such an attempt would raise numerous ethical issues. Could scientists create humans with certain kinds of traits, perhaps people born and bred to be soldiers? Or might it be possible to make copies of specific people?

“Would it be O.K., for example, to sequence and then synthesize Einstein’s genome?” Drew Endy, a bioengineer at Stanford, and Laurie Zoloth, a bioethicist at Northwestern University, wrote in an essay criticizing the proposed project. “If so how many Einstein genomes should be made and installed in cells, and who would get to make them?”

Dr. Endy, though invited, said he deliberately did not attend the meeting at Harvard because it was not being opened to enough people and was not giving enough thought to the ethical implications of the work.

George Church, a professor of genetics at Harvard Medical School and an organizer of the proposed project, said there had been a misunderstanding. The project was not aimed at creating people, just cells, and would not be restricted to human genomes, he said. Rather it would aim to improve the ability to synthesize DNA in general, which could be applied to various animals, plants and microbes.

“They’re painting a picture which I don’t think represents the project,” Dr. Church said in an interview.

He said the meeting was closed to the news media, and people were asked not to tweet because the project organizers, in an attempt to be transparent, had submitted a paper to a scientific journal. They were therefore not supposed to discuss the idea publicly before publication. He and other organizers said ethical aspects have been amply discussed since the beginning.

The project was initially called HGP2: The Human Genome Synthesis Project, with HGP referring to the Human Genome Project. An invitation to the meeting at Harvard said that the primary goal “would be to synthesize a complete human genome in a cell line within a period of 10 years.”
(FULL STORY)

Boiling Water May Be Cause of Martian Streaks
[5/2/2016]
The results of Earth-bound lab experiments appear to back up the theory that dark lines on Martian slopes are created by water — though in an otherworldly manner, scientists said Monday.

A team from France, Britain and the United States constructed models and simulated Mars conditions to follow up on a 2015 study which proffered “the strongest evidence yet” for liquid water — a prerequisite for life — on the Red Planet. That finding had left many scientists scratching their heads as the low pressure of Mars’ atmosphere means that water does not survive long in liquid form. It either boils or freezes.
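The "boils or freezes" point is easy to check with a back-of-envelope estimate (this sketch is our illustration, not from the study). It inverts the Antoine equation for water's vapor pressure, using standard textbook coefficients, and assumes a typical Martian surface pressure of about 610 Pa — roughly 0.6 percent of Earth's.

```python
import math

def boiling_point_c(pressure_pa):
    """Estimate water's boiling point (deg C) at a given pressure by
    inverting the Antoine equation: log10(P_mmHg) = A - B / (C + T)."""
    A, B, C = 8.07131, 1730.63, 233.426  # water; T in deg C, P in mmHg
    p_mmhg = pressure_pa / 133.322       # convert pascals to mmHg
    return B / (A - math.log10(p_mmhg)) - C

# Typical Martian surface pressure (~610 Pa) vs Earth sea level (~101325 Pa)
print(round(boiling_point_c(610), 1))     # near 0 deg C: meltwater boils as it thaws
print(round(boiling_point_c(101325), 1))  # ~100 deg C, the familiar Earth value
```

The estimate lands almost exactly at water's freezing point, which is the crux of the puzzle: at Martian pressure there is essentially no temperature range in which liquid water is stable.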
(FULL STORY)

Three Newly Discovered Planets Are the Best Bets for Life Outside the Solar System
[5/2/2016]
An international team of astronomers has discovered three Earth-like exoplanets orbiting an ultra-cool dwarf star—the smallest and dimmest stars in the Galaxy—now known as TRAPPIST-1. The discovery, made with the TRAPPIST telescope at ESO's La Silla Observatory, is significant not only because the three planets have similar properties to Earth, suggesting they could harbor life, but also because they are relatively close (just 40 light years away) and they are the first planets ever discovered orbiting such a dim star. A research paper detailing the team's findings was published today in the journal Nature.

"What is super exciting is that for the first time, we have extrasolar worlds similar in size and temperature to Earth—planets that could thus, in theory, harbor liquid water and host life on at least a part of their surfaces—for which the atmospheric composition can be studied in detail with current technology," lead researcher Michaël Gillon of the University of Liège in Belgium said in an email to Popular Mechanics.
(FULL STORY)

The real reasons nothing can go faster than the speed of light
[5/2/2016]
We are told that nothing can travel faster than light. This is how we know it is true.
(FULL STORY)

Physicists Abuzz About Possible New Particle as CERN Revs Up
[5/2/2016]
Scientists around the globe are revved up with excitement as the world's biggest atom smasher — best known for revealing the Higgs boson four years ago — starts whirring again to churn out data that may confirm cautious hints of an entirely new particle.

Such a discovery would all but upend the most basic understanding of physics, experts say.

The European Center for Nuclear Research, or CERN by its French-language acronym, has in recent months given more oomph to the machinery in a 27-kilometer (17-mile) underground circuit along the French-Swiss border known as the Large Hadron Collider.

In a surprise development in December, two separate LHC detectors each turned up faint signs that could indicate a new particle, and since then theorizing has been rife.
(FULL STORY)

Are we the only intelligent life in cosmos? Probably not, say astronomers
[5/1/2016]
Alien life: A new paper shows that the discoveries of exoplanets, plus a revised Drake equation, produce a new, empirically valid probability of whether any other advanced civilizations have ever existed. Astronomers revised the half-century-old Drake equation, which attempts to calculate the probability of the existence of advanced alien civilizations, to determine whether any such civilizations have existed at any point in the history of the universe.

They found that the chances that humanity is the only civilization ever to have arisen in the universe are less than about one in 10 billion trillion.
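The logic behind that headline number can be sketched with simple arithmetic (our illustration, not the authors' calculation). Using a round order-of-magnitude figure of 10^22 habitable-zone planets in the observable universe — an assumption, not a number from this article — humanity is unique only if the per-planet odds of a civilization ever arising fall below roughly one in that count:

```python
def expected_civilizations(n_habitable_planets, p_civ_per_planet):
    """Expected number of technological civilizations, ever:
    (habitable-zone planets) x (chance a civilization arises on one)."""
    return n_habitable_planets * p_civ_per_planet

# Assumed order-of-magnitude count of habitable-zone planets
N_PLANETS = 1e22

# Humanity is unique only if per-planet odds fall below ~1/N_PLANETS --
# the "pessimism line" of roughly one in 10 billion trillion
pessimism_line = 1 / N_PLANETS
print(f"{pessimism_line:.0e}")
```

Any per-planet probability above that vanishingly small threshold makes it likely that at least one other civilization has existed somewhere, at some time.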
(FULL STORY)

Could 'black hole' in a lab finally help Stephen Hawking win a Nobel Prize?
[4/26/2016]
One of Stephen Hawking's most brilliant and disturbing theories may have been confirmed by a scientist who created a sound “black hole” in his laboratory, potentially paving the way for a Nobel Prize.

Research by Professor Hawking, a cosmologist at Cambridge University, disputes the notion that black holes are a gravitational sinkhole, pulling in matter and never allowing anything — even light — to escape. His model, developed in the 1970s, instead suggested that black holes could actually emit tiny particles, allowing energy to escape. If true, it would mean some black holes could simply evaporate completely, with profound implications for our understanding of the universe.

But so weak is the emitted radiation, and so remote even the nearest black holes, that his mathematical discovery has yet to be verified by observation.

Instead, Jeff Steinhauer, professor of physics at the Technion university in Haifa, created something analogous to a “black hole” for sound in his laboratory.

In a paper published on the physics website arXiv, and reported by The Times, he described how he cooled helium to close to absolute zero before manipulating it in such a way that sound could not cross it, like a black hole's event horizon. He said he found evidence that phonons – the sound equivalent of light's photons – were leaking out, rather as Prof Hawking had predicted for black holes.

The results have yet to be replicated elsewhere, and scientists say they will want to check the effect is not caused by another factor.

If confirmed, it would strengthen Prof Hawking's case for science's greatest prize.

Although his theory has a lot of support, Nobel Prizes for Physics are not awarded without experimental proof.

Earlier this year, Prof Hawking used the BBC's Reith Lecture to make the case that his work was close to being proven, both in the laboratory and from echoes of the very earliest moments of our universe.

“I am resigned to the fact that I won’t see proof of Hawking radiation directly,” he said. “There are solid state analogues of black holes and other effects that the Nobel committee might accept as proof. But there’s another kind of Hawking radiation, coming from the cosmological event horizon of the early inflationary universe. I am now studying whether one might detect Hawking radiation in primordial gravitational waves . . . so I might get a Nobel prize after all.”
(FULL STORY)

Gravitational lens reveals hiding dwarf dark galaxy
[4/14/2016]
Originally, scientists were simply trying to capture an image of the gravitational lens SDP.81 using the Atacama Large Millimeter Array. Their efforts were part of a 2014 survey aimed at testing ALMA's new, high-resolution capabilities. More than a year later, however, the image revealed a surprise -- a dwarf dark galaxy hiding in the halo of a larger galaxy, positioned some 4 billion light-years from Earth.

A gravitational lens, or gravitational lensing, is a phenomenon whereby the gravity of a closer galaxy bends the light of a more distant galaxy, creating a magnifying lens-like effect. The phenomenon is often used to study galaxies that would otherwise be too far away to see.

Astronomers initially assumed SDP.81 revealed the light of two galaxies -- that of a more distant galaxy, 12 billion light-years away, and that of a closer galaxy, 4 billion light-years away.

But new analysis of the image by researchers at Stanford University has revealed evidence of a dwarf dark galaxy.

"We can find these invisible objects in the same way that you can see rain droplets on a window. You know they are there because they distort the image of the background objects," astronomer Yashar Hezaveh explained in a news release.

The gravitational influence of dark matter distorted the light bending through the gravitational lens.

Hezaveh and his colleagues recruited the power of several supercomputers to scan the radio telescope data for anomalies within the halo of SDP.81. They succeeded in identifying a unique clump of distortion, less than one-thousandth the mass of the Milky Way. The work may pave the way for the discovery of more collections of dark matter, and could also help resolve a long-standing discrepancy between the number of small satellite galaxies predicted by dark matter models and the number actually observed.
(FULL STORY)

Leonardo Da Vinci's Living Relatives Found
[4/12/2016]
Leonardo da Vinci lives on, according to two Italian researchers who have tracked down the living relatives of the Renaissance genius.

It was believed that no traces were left of the painter, engineer, mathematician, philosopher and naturalist. The remains of Leonardo, who died in 1519 in Amboise, France, were dispersed in the 16th century during religious wars. But according to historian Agnese Sabato and art historian Alessandro Vezzosi, director of the Museo Ideale in the Tuscan town of Vinci, where the artist was born in 1452, Da Vinci's family did not go extinct.
(FULL STORY)

Hawking made the prediction yesterday (April 12) during the Breakthrough Starshot announcement in New York City. At the news conference, Hawking, along with Russian billionaire investor Yuri Milner and a group of scientists, detailed a new project that aims to send a multitude of tiny, wafer-size spaceships into space to the neighboring star system Alpha Centauri.

If these tiny spaceships travel at 20 percent the speed of light, they'll be able to reach Alpha Centauri in just 20 years, Milner said. Once there, the spacecraft will be able to do a 1-hour flyby of Alpha Centauri and collect data that's impossible to gather from Earth, such as taking close-up photos of the star system, probing space dust molecules and measuring magnetic fields, said Avi Loeb, chairman of the Breakthrough Starshot Advisory Committee and a professor of science at Harvard University.
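The quoted travel time is straightforward arithmetic, sketched below using the standard 4.37-light-year distance to Alpha Centauri (a figure from general astronomy references, not this article); the cruise works out to a shade under 22 years, in line with "just over 20."

```python
# Back-of-envelope cruise time to Alpha Centauri at a fifth of light speed.
# A light-year is the distance light covers in a year, so at a fraction f
# of light speed, travel time in years is simply distance_ly / f.
DISTANCE_LY = 4.37       # distance to Alpha Centauri, light-years
SPEED_FRACTION_C = 0.20  # Breakthrough Starshot's target speed, 20% of c

travel_time_years = DISTANCE_LY / SPEED_FRACTION_C
print(round(travel_time_years, 1))  # ~21.9 years
```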
(FULL STORY)

Measurement of Universe's expansion rate creates cosmological puzzle
[4/11/2016]
The most precise measurement ever made of the current rate of expansion of the Universe has produced a value that appears incompatible with measurements of radiation left over from the Big Bang. If the findings are confirmed by independent techniques, the laws of cosmology might have to be rewritten.

This might even mean that dark energy — the unknown force that is thought to be responsible for the observed acceleration of the expansion of the Universe — has increased in strength since the dawn of time.
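The scale of the disagreement can be illustrated with the approximate figures in circulation at the time — around 73 km/s/Mpc from the local distance-ladder measurement versus around 68 km/s/Mpc from Planck's cosmic-microwave-background fit. Those numbers are assumptions supplied here for illustration, not values quoted in this article. Converting each into the naive "Hubble time" 1/H0, a rough age scale for the Universe, shows the tension:

```python
# Convert a Hubble constant (km/s/Mpc) into the Hubble time 1/H0.
KM_PER_MPC = 3.0857e19     # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e7 * 1e9  # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc):
    """Naive expansion timescale 1/H0, in billions of years."""
    return (KM_PER_MPC / h0_km_s_mpc) / SECONDS_PER_GYR

h0_local, h0_cmb = 73.2, 67.8  # assumed approximate 2016-era values
print(round(hubble_time_gyr(h0_local), 1))  # roughly 13.4 Gyr
print(round(hubble_time_gyr(h0_cmb), 1))    # roughly 14.4 Gyr
print(f"{(h0_local / h0_cmb - 1) * 100:.0f}% discrepancy")
```

A difference of several percent may look small, but it is far larger than the stated uncertainties of either measurement, which is what makes the result a genuine puzzle rather than measurement noise.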

“I think that there is something in the standard cosmological model that we don't understand,” says Adam Riess, an astrophysicist at Johns Hopkins University in Baltimore, Maryland, who co-discovered dark energy in 1998 and led the latest study. Kevork Abazajian, a cosmologist at the University of California, Irvine, who was not involved in the study, says that the results have the potential of “becoming transformational in cosmology”.
(FULL STORY)

Stephen Hawking Helps Launch Project 'Starshot' for Interstellar Space Exploration
[4/12/2016]
The famed cosmologist, along with a group of scientists and billionaire investor Yuri Milner, unveiled an ambitious new $100 million project today (April 12) called Breakthrough Starshot, which aims to build the prototype for a tiny, light-propelled robotic spacecraft that could visit the nearby star Alpha Centauri after a journey of just 20 years.

"The limit that confronts us now is the great void between us and the stars, but now we can transcend it," Hawking said today during a news conference here at One World Observatory.
(FULL STORY)

'Bizarre' Group of Distant Black Holes are Mysteriously Aligned
[4/12/2016]
A highly sensitive radio telescope has seen something peculiar in the depths of our cosmos: a group of supermassive black holes are mysteriously aligned, as if captured in a synchronized dance.

These black holes, which occupy the centers of galaxies in a region of space called ELAIS-N1, appear to have no relation to one another, separated by millions of light-years. But after studying the radio waves generated by the twin jets blasting from the black holes’ poles, astronomers using data from the Giant Metrewave Radio Telescope (GMRT) in India realized that all the jets were pointed in the same direction, like arrows on compasses all pointing “north.”

This is the first time a group of supermassive black holes in galactic cores has been seen to share this bizarre relationship and, at first glance, the occurrence should be impossible. What we are witnessing is a cluster of galaxies whose central supermassive black holes all have their axes of rotation pointed in the same direction.

“Since these black holes don’t know about each other, or have any way of exchanging information or influencing each other directly over such vast scales, this spin alignment must have occurred during the formation of the galaxies in the early universe,” said Andrew Russ Taylor, director of the Inter-University Institute for Data Intensive Astronomy in Cape Town, South Africa. Taylor is lead author of the study published in the journal Monthly Notices of the Royal Astronomical Society.
(FULL STORY)

Isaac Newton: handwritten recipe reveals fascination with alchemy
[4/9/2016]
A 17th-century recipe written by Isaac Newton is now going online, revealing more about the physicist’s relationship with the ancient science of alchemy.

Calling for ingredients such as "one part Fiery Dragon" and "at least seven Eagles of mercury," the handwritten recipe describes how to make "sophick mercury," seen at the time as an essential element in creating the "philosopher’s stone," a fabled substance with the power to turn base metals, like lead, into gold.

The manuscript, which is written in Latin and English, was acquired in February by Philadelphia-based nonprofit the Chemical Heritage Foundation, National Geographic reports. The foundation is now working to upload digital images and transcriptions of the text to an online database.
(FULL STORY)

If there is a planet beyond Neptune, what is it like?
[4/11/2016]
Scientists may not have been able to spot the proposed ninth planet in our solar system, or even confirm that it exists, but that hasn't stopped them from imagining how it looks. Astrophysicists from the University of Bern recently showed off a new model of the possible evolution of Planet Nine, a planet hypothesized to explain the movement of bodies at our solar system's edge.
Published in the journal Astronomy and Astrophysics, the model shows the possible size, temperature, and brightness of the mysterious planet.
(FULL STORY)

The "R" in RNA can easily be made in space, and that has implications for life beyond Earth
[4/11/2016]
New research suggests that the sugar ribose -- the "R" in RNA -- is probably found in comets and asteroids that zip through the solar system and may be more abundant throughout the universe than was previously thought.

The finding has implications not just for the study of the origins of life on Earth, but also for understanding how much life there might be beyond our planet.

Scientists already knew that several of the molecules necessary for life, including amino acids, nucleobases and others, can be made from the interaction of cometary ices and space radiation. But ribose, which makes up the backbone of the RNA molecule, had been elusive -- until now.

The new work, published Thursday in Science, fills in another piece of the puzzle, said Andrew Mattioda, an astrochemist at NASA Ames Research Center, who was not involved with the study.
(FULL STORY)

Surprise! Gigantic Black Hole Found in Cosmic Backwater
[4/6/2016]
One of the biggest black holes ever found sits in a cosmic backwater, like a towering skyscraper in a small town.

Astronomers have spotted a supermassive black hole containing 17 billion times the mass of the sun — only slightly smaller than the heftiest known black hole, which weighs in at a maximum of 21 billion solar masses — at the center of the galaxy NGC 1600.

That's a surprise, because NGC 1600, which lies 200 million light-years from Earth in the constellation Eridanus, belongs to an average-size galaxy group, and the monster black holes discovered to date tend to be found in dense clusters of galaxies. So researchers may have to rethink their ideas about where gigantic black holes reside, and how many of them might populate the universe, study team members said.
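For a sense of scale, the size of such a black hole's event horizon can be estimated from the Schwarzschild radius, r_s = 2GM/c² (this back-of-envelope calculation is our illustration; the constants are standard values, and the result is not a figure from the study).

```python
# Schwarzschild radius of a 17-billion-solar-mass black hole, in AU.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit (Earth-Sun distance), m

def schwarzschild_radius_au(mass_solar):
    """Event-horizon radius r_s = 2GM/c^2, expressed in AU."""
    m_kg = mass_solar * M_SUN
    return 2 * G * m_kg / C**2 / AU

print(round(schwarzschild_radius_au(17e9)))  # hundreds of AU -- several times Pluto's orbit
```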
(FULL STORY)

New Bizarre State of Matter Seems to Split Fundamental Particles
[4/6/2016]
A bizarre new state of matter has been discovered — one in which electrons that usually are indivisible seem to break apart.

The new state of matter, which had been predicted but never spotted in real life before, forms when the electrons in an exotic material enter into a type of "quantum dance," in which the spins of the electrons interact in a particular way, said Arnab Banerjee, a physicist at Oak Ridge National Laboratory in Tennessee. The findings could pave the way for better quantum computers, Banerjee said.
(FULL STORY)