August 12, 2009

This image, based on a computer simulation of a type 1a supernova, shows the turbulent and asymmetric flame of the runaway thermonuclear burning that consumes the white dwarf star. Image by F. Röpke.

(PhysOrg.com) -- The stellar explosions known as type 1a supernovae have long been used as "standard candles," their uniform brightness giving astronomers a way to measure cosmic distances and the expansion of the universe. But a new study published this week in Nature reveals sources of variability in type 1a supernovae that will have to be taken into account if astronomers are to use them for more precise measurements in the future.

The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of type 1a supernovae. But in order to probe the nature of dark energy and determine if it is constant or variable over time, scientists will have to measure cosmic distances with much greater precision than they have in the past.

"As we begin the next generation of cosmology experiments, we will want to use type 1a supernovae as very sensitive measures of distance," said lead author Daniel Kasen, a Hubble postdoctoral fellow at the University of California, Santa Cruz. "We know they are not all the same brightness, and we have ways of correcting for that, but we need to know if there are systematic differences that would bias the distance measurements. So this study explored what causes those differences in brightness."

Kasen and his coauthors--Fritz Röpke of the Max Planck Institute for Astrophysics in Garching, Germany, and Stan Woosley, professor of astronomy and astrophysics at UC Santa Cruz--used supercomputers to run dozens of simulations of type 1a supernovae. The results indicate that much of the diversity observed in these supernovae is due to the chaotic nature of the processes involved and the resulting asymmetry of the explosions.

For the most part, this variability would not produce systematic errors in measurement studies as long as researchers use large numbers of observations and apply the standard corrections, Kasen said. The study did find a small but potentially worrisome effect that could result from systematic differences in the chemical compositions of stars at different times in the history of the universe. But researchers can use the computer models to further characterize this effect and develop corrections for it.

"Since we are beginning to understand how type 1a supernovae work from first principles, these models can be used to refine our distance estimates and make measurements of the expansion rate of the universe more precise," Woosley said.

A type 1a supernova occurs when a white dwarf star acquires additional mass by siphoning matter away from a companion star. When it reaches a critical mass--1.4 times the mass of the Sun, packed into an object the size of the Earth--the heat and pressure in the center of the star spark a runaway nuclear fusion reaction, and the white dwarf explodes. Since the initial conditions are about the same in all cases, these supernovae tend to have the same luminosity, and their "light curves" (how the luminosity changes over time) are predictable.

Some are intrinsically brighter than others, but these flare and fade more slowly, and this correlation between the brightness and the width of the light curve allows astronomers to apply a correction to standardize their observations. So astronomers can measure the light curve of a type 1a supernova, calculate its intrinsic brightness, and then determine how far away it is, since the apparent brightness diminishes with distance (just as a candle appears dimmer at a distance than it does up close).
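The standardization described above can be sketched numerically. This is a toy illustration only, not the correction used in the study: the fiducial magnitude, slope, and decline-rate parameter below are illustrative placeholders for a width-luminosity relation of this kind.

```python
# Toy standardization of a type 1a supernova, assuming a simple linear
# width-luminosity relation. M_FIDUCIAL and SLOPE are illustrative
# placeholders, not fitted values from the study.
M_FIDUCIAL = -19.3   # assumed absolute magnitude of a "standard" type 1a
SLOPE = 0.7          # hypothetical correction per unit of decline rate

def standardized_absolute_mag(dm15):
    """Estimate absolute magnitude from the light-curve decline rate dm15
    (magnitudes faded in 15 days): slower decline -> intrinsically brighter."""
    return M_FIDUCIAL + SLOPE * (dm15 - 1.1)

def distance_mpc(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5) / 1e6

# A supernova with a typical light curve, observed at apparent magnitude 16.7
M = standardized_absolute_mag(dm15=1.1)
print(round(distance_mpc(16.7, M), 1))   # -> 158.5 (Mpc)
```

The dimming with distance follows the inverse-square law, which is why a fixed intrinsic brightness turns an apparent brightness directly into a distance.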

The computer models used to simulate these supernovae in the new study are based on current theoretical understanding of how and where the ignition process begins inside the white dwarf and where it makes the transition from slow-burning combustion to explosive detonation.

"Since ignition does not occur in the dead center, and since detonation occurs first at some point near the surface of the exploding white dwarf, the resulting explosions are not spherically symmetric," Woosley explained. "This could only be studied properly using multi-dimensional calculations."

Most previous studies have used one-dimensional models in which the simulated explosion is spherically symmetric. Multi-dimensional simulations require much more computing power, so Kasen's group ran most of their simulations on the powerful Jaguar supercomputer at Oak Ridge National Laboratory, and also used supercomputers at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory. The results of two-dimensional models are reported in the Nature paper, and three-dimensional studies are currently under way.

In this image from a computer simulation, debris from a type 1a supernova explosion shows the asymmetric substructures that develop from the turbulent flame that consumes the white dwarf star. Colors represent different elements synthesized in the explosion (e.g., red=nickel-56). Image by D. Kasen et al.

The simulations showed that the asymmetry of the explosions is a key factor determining the brightness of type 1a supernovae. "The reason these supernovae are not all the same brightness is closely tied to this breaking of spherical symmetry," Kasen said.

The dominant source of variability is the synthesis of new elements during the explosions, which is sensitive to differences in the geometry of the first sparks that ignite a thermonuclear runaway in the simmering core of the white dwarf. Nickel-56 is especially important, because the radioactive decay of this unstable isotope creates the afterglow that astronomers are able to observe for months or even years after the explosion.

"The decay of nickel-56 is what powers the light curve. The explosion is over in a matter of seconds, so what we see is the result of how the nickel heats the debris and how the debris radiates light," Kasen said.
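The decay chain Kasen describes, Ni-56 to Co-56 to stable Fe-56 with half-lives of roughly 6.1 and 77 days, can be followed with the standard two-step Bateman equations. A minimal sketch, not taken from the paper's radiative-transfer code:

```python
import math

# Sketch of the radioactive power source described above: Ni-56 decays to
# Co-56 (half-life ~6.1 days), which decays to stable Fe-56 (~77.3 days).
HALF_NI, HALF_CO = 6.1, 77.3          # half-lives in days (approximate)
LAM_NI = math.log(2) / HALF_NI        # decay constants
LAM_CO = math.log(2) / HALF_CO

def abundances(t):
    """Fractions of the initial Ni-56 present as Ni-56 and as Co-56 at day t
    (two-step Bateman solution; the remainder has become stable Fe-56)."""
    ni = math.exp(-LAM_NI * t)
    co = LAM_NI / (LAM_CO - LAM_NI) * (
        math.exp(-LAM_NI * t) - math.exp(-LAM_CO * t))
    return ni, co

# Weeks after the explosion most of the nickel has already become cobalt,
# whose slower decay sets the late-time fading of the light curve.
ni, co = abundances(30.0)
print(f"day 30: Ni-56 fraction {ni:.3f}, Co-56 fraction {co:.3f}")
```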

Kasen developed the computer code to simulate this radiative transfer process, using output from the simulated explosions to produce visualizations that can be compared directly to astronomical observations of supernovae.

The good news is that the variability seen in the computer models agrees with observations of type 1a supernovae. "Most importantly, the width and peak luminosity of the light curve are correlated in a way that agrees with what observers have found. So the models are consistent with the observations on which the discovery of dark energy was based," Woosley said.

Another source of variability is that these asymmetric explosions look different when viewed at different angles. This can account for differences in brightness of as much as 20 percent, Kasen said, but the effect is random and creates scatter in the measurements that can be statistically reduced by observing large numbers of supernovae.
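How observing many supernovae beats down this random scatter can be shown with a quick simulation. The uniform 20 percent error model here is a deliberate oversimplification of the viewing-angle effect:

```python
import random

# Toy illustration: each supernova's measured brightness carries a random
# viewing-angle error of up to ~20 percent, but the mean over N supernovae
# is far more stable, shrinking roughly as 1/sqrt(N).
random.seed(42)

def mean_brightness(n, scatter=0.20):
    """Average of n noisy brightness measurements of identical sources."""
    samples = [1.0 + random.uniform(-scatter, scatter) for _ in range(n)]
    return sum(samples) / n

for n in (1, 100, 10000):
    print(n, round(abs(mean_brightness(n) - 1.0), 4))
```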

The potential for systematic bias comes primarily from variation in the initial chemical composition of the white dwarf star. Heavier elements are synthesized during supernova explosions, and debris from those explosions is incorporated into new stars. As a result, stars formed recently are likely to contain more heavy elements (higher "metallicity," in astronomers' terminology) than stars formed in the distant past.

"That's the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity," Kasen said. "When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less."
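The scale of the quoted figure can be checked with the distance modulus alone; the 0.043-magnitude offset below is a hypothetical input chosen to land near 2 percent, not a number from the study:

```python
import math

# Back-of-the-envelope check: from the distance modulus
# m - M = 5*log10(d) + const, a small systematic offset dm in the inferred
# absolute magnitude shifts the inferred distance by a fractional amount
# ln(10)/5 * dm, about 0.46 per magnitude.
def distance_error_fraction(delta_mag):
    return math.log(10) / 5 * delta_mag

# A systematic brightness bias of ~0.04 magnitudes thus corresponds to
# roughly a 2 percent distance error.
print(f"{distance_error_fraction(0.043):.3f}")   # -> 0.020
```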

Further studies using computer simulations will enable researchers to characterize the effects of such variations in more detail and limit their impact on future dark-energy experiments, which might require a level of precision that would make errors of 2 percent unacceptable.


Looks like the science is less settled on Type 1A supernovae than we'd been hoping. This is now the 2nd article raising skepticism over this technique (at least from what I've seen). http://www.physor...815.html

There are a lot of skeptics when it comes to 1a's as distance devices. Some of those arguments range from the strange to the reasonable. For now I'd say it's an acceptable guide, like Newtonian physics was to relativity.

The long-held belief that SNIa explosions are standard candles dates back to the 1970s, and the controversy is still examined today. A recent post and blog comments may prove useful to those seeking additional information concerning this finding. Check out Universe Today and the following posts for more info. Access the article at: http://www.univer...-energy/ . This finding may represent a major revision of our understanding of the cosmos. But as Nereid points out in one of the posts, newer methods for detection go way beyond using SNIa supernovae as standard candles. Gravitational lensing, SZE effects, BAO measurements, and the anisotropic CMB background all point to more accurate data on the distance to these remote galaxies, independent of SNIa vagaries.

Interesting study. But, can someone please help me understand how the universe's expansion is accelerating?

I have always been told that this was because we can measure the Doppler-shift and that the data shows that the light is almost always shifted to the red (indicating the object is moving away from us) and also that the farther the shift, the greater the speed. Fine, I understand that. And I also understand that we can calculate approximate distance based on the magnitude of the intensity of the light that reaches us here on Earth (inverse-square law). Thus, it would appear that the expansion of the universe is happening at an accelerating rate.

However, I believe there is a third variable which is not accounted for correctly. If we say object A is X1 light-years away and is receding at Y1 miles per hour, and object B is X2 light-years away moving away at Y2 miles per hour, and X1 > X2 and Y1 > Y2, then it makes perfect sense to say that the universe is expanding at an accelerating rate (at least for this data set). Yet, if you take into consideration the fact that the light from A took X1 minus X2 years LONGER to reach us than the light from B (the light from A is older than the light from B), it would be more accurate to say that the universe was moving FASTER the further back in time we look. Therefore, if speed DECREASES with time (speed is increasing as we look farther back in time), then the universal expansion is slowing down, not speeding up.

I know I am probably wrong but can someone please point out my logical miscalculation?

Wow, this is a cool simulation. Amazing how these white dwarfs are basically turning into stellar hydrogen bombs. I wonder how much of the material is actually participating in the fusion. The shock wave of the initial blast obviously triggers an incredible wave of secondary fusion. With all that radiation afterglow, they are nature's ultimate dirty bombs. Neat.

Interesting study. But, can someone please help me understand how the universe's expansion is accelerating?

Well, let's assume you are watching a SN1a that is 10 billion light years away (A) and one at 1 billion light years distance (B). Obviously supernova A will have a much stronger red shift as it is speeding away from us.

Let's then assume the universe is expanding at a decreasing rate or constant rate (as it was believed only 10 years ago). According to this thesis the two supernovas should have a certain difference in redshift (the actual numbers are not important).

However, it turns out that the red shift of supernova B (the near one) is actually bigger than expected compared to the red shift of supernova A. In other words supernova B is moving too fast away from us.

During those 9 billion years something has happened and the current belief is that it is actually the expansion of the universe that has changed, supernova B is actually moving away 'too fast' because the universe is expanding faster now than it did during the explosion of supernova A.

It's really very simple to understand once you look at it like that. Just remember that near objects have a redshift that is 'too big', meaning they are moving away 'too fast' (relative to what we see in distant objects).
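The picture above can be made concrete with a deliberately crude toy model, not a real cosmological calculation: two made-up scale-factor histories, one coasting and one accelerating, both normalized to the present. In the accelerating history, recently emitted (nearby) light shows more redshift than the coasting history predicts.

```python
import math

# Two made-up scale-factor histories a(t), normalized so a(1) = 1 today
# (arbitrary time units). Light emitted at time te has redshift 1+z = 1/a(te).
def a_coasting(t):
    """Constant expansion rate: the scale factor grows linearly."""
    return t

def a_accelerating(t):
    """Expansion speeding up: slow early growth, fast recent growth."""
    return math.expm1(2 * t) / math.expm1(2)

for te in (0.9, 0.5):   # a recently emitted signal and a much older one
    z_coast = 1 / a_coasting(te) - 1
    z_acc = 1 / a_accelerating(te) - 1
    print(f"te={te}: z_coasting={z_coast:.3f}, z_accelerating={z_acc:.3f}")
# In the accelerating history, the recently emitted light picks up a larger
# redshift than the coasting history predicts for the same emission time.
```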

Although the current theory of accelerating expansion fits our observations best, we could be completely incorrect, as we still can't see the entire system.

If we assume we're correct in the origin of the Universe and the past and current rates of acceleration, we can estimate the size of the Universe. Even with that estimation to the best of our knowledge we're still unable to see the edges of our reality, if there is indeed an edge. This makes it very difficult to give any definite assurance to a theory of what's actually going on, regardless of the fit with our observations.

Although the current theory of accelerating expansion fits our observations best, we could be completely incorrect, as we still can't see the entire system.

We could be but within the limits observation it looks like acceleration is a close approximation to reality. EXCEPT for that little bit at the end of the article.

"That's the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity," Kasen said. "When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less."

2 percent is enough to make acceleration look a lot less like a good fit to reality. And the general variability means that a lot of type 1a supernovae have to be measured, a long way out as well as in close. Enough to get the precision error well below one percent, so that a one or two percent difference between near and far can be above the error rate.

Is the distance ladder out to Cepheid variables even below one percent?

After reading the site that yyz linked to, and other things as well, the key idea is to get more than one method of measurement. If they agree with each other reasonably well that increases confidence.

What I find odd in this is not the theories, which are clearly not considered solid by anyone involved; it's the vehement hatred of them by so many. Most of those posting vitriol haven't read up on the stuff.

Physics and astronomy are not about understanding everything. We are making assumptions about the world we can see and using those assumptions to make predictions and explain the known universe.

If you want to 'see the entire system' before making any assumptions I'm afraid you will have to wait a long time.

Giving assurance to a theory is about testing it to the best of our ability. A theory can never be proven right, it can only be proven wrong. That's why we call them theories and not truths.

I'm not saying that. As you stated, we make assumptions based on our observations, but an assumption is not a fact or proof.

We can reasonably say that, from observations according to our framework, the Universe is between 13.5 and 14 billion years old. However, in 400 years we might discover the big bang is wrong and see evidence for a multiple-bang universe, or, to go with the old white-hole theory, that there are engines of creation spewing our Universe out, constantly creating more energy and matter. That would completely change our knowledge as we've gained more insight into the system. Perhaps we will learn the exact rules of gravitation and find the Universe is 100 billion years old due to refinement in our observational power.

The 21st Century Dark-Realm Rush By Science
is a shuddering shameful futile waste

The 21st century Dark-Realm Rush of science in search of Dark Energy-Matter is a shuddering and shameful waste of manpower and of many other resources on a commonsensical futile chase of a grand 100-year-old virtual hallucination.

Again and again: dark energy-matter scientists keep chasing their self-made gibbering tail.

Dov Henis is right.

Radioactive nickel-56 that causes the supernova afterglow is a doubly "magic" nucleus. It consists of a closed shell of neutrons and a closed shell of protons:

28 protons + 28 neutrons => doubly "magic" Ni-56.

Ni-56 decays to Fe-56, the most abundant atom in the Earth, in ordinary meteorites, and in the interior of the Sun.

That means that the supernova explosion is driving nuclear reactions to thermodynamic equilibrium.

Scientists ignore information recorded in the rest mass data for the 3,000 different types of atoms that comprise the entire visible universe: Ni-56 decays to Fe-56, the most abundant atom in the Earth, in ordinary meteorites, and in the interior of the Sun [The Origin, Composition, and Energy Source for the Sun, 32nd Lunar & Planetary Science Conference, Houston, TX, 12-16 March 2001, http://arxiv.org/...411255v1 ].

It happens in many, but not all, threads (not on this thread, as of yet), where people go rather a bit beyond mere skepticism.

That's true. Thanks for the clarification.

And the weird part is that there is more of that for Dark Matter than for Dark Energy, which is the more speculative of the two. Or at least I remember it that way.

I agree. But this is not so weird if we acknowledge that Dark Matter is by far more popular than Dark Energy and thus attracts far more of those people for whom "scientific reasoning" is terra incognita.

That was a wonderful explanation! At least, it made good sense to me. I only wish that when people in the news talked about this phenomenon they would use the same level of detail in addressing the curious lay-person as you did. Thank you for taking the time!

