The Copernican principle holds that humans are not privileged observers of the Universe. Copernicus argued that the Earth is not at the center of the solar system or at any particularly special position in the heavens. Modern cosmology extends this idea: the Earth does not occupy any unique position in the Universe. Modern philosophy of science pushes the principle even further, concluding that every observer (even little green men) should reason as if they were a typical observer. Despite all these humble and rational thoughts, it is still tempting to explain certain finely tuned aspects of modern cosmology as consequences of observer selection effects. Namely, I am speaking of dark energy, the accelerated cosmological expansion, which supposedly could be explained if we occupy a privileged position near the center of a large, nonlinear, and nearly spherical void in mass density. The idea that the region of the cosmos around us could be such a void is colloquially known in astronomy as the Hubble bubble. Technically, a Hubble bubble is a region of space in which the observed local value of the Hubble constant departs from its cosmologically averaged value.

Let's speculate a little further on what it would be like to live in a Hubble bubble. In the standard cosmological model, the structures we see today, like galaxies and clusters of galaxies (and similarly the structures we don't see, like the massive dark matter halos the visible matter is embedded in), formed from tiny primordial quantum fluctuations in the early universe. The fluctuations were random variations in density: locations that were over-dense formed galaxies, and those that were under-dense formed voids. It is possible, in fact statistically quite acceptable, that there are voids of various sizes in the Universe. These voids would become increasingly under-dense as the Universe evolved, just as over-dense regions became increasingly over-dense. Inside the void, matter would expand outward due to the gravitational pull of matter in the surrounding dense regions, so an observer at the center of the void would see an accelerated expansion of matter outward. Now, it is also possible that our entire observable Universe is a Hubble bubble, but that flies in the face of all of cosmology. The idea of a Hubble bubble may explain dark energy, but it is unfounded and hardly a good explanation.

The Hubble bubble is wildly speculative, and precision cosmology has almost completely defeated it as a credible explanation. Since the framework of cosmology has been so successful resting on the Copernican principle, it seems odd and largely misguided to throw the principle out now. Beyond that, three arguments stand out. First, the probability of producing a void of the magnitude necessary to mimic aspects of dark energy is extremely small in standard structure formation models. Second, the probability of an observer being at the center (the only location where the expansion effect would be noticed) is extremely low. Finally, the void would need to be close to spherical to match the observed spatial smoothness (or isotropy) of the universe. These qualitative arguments, and many more quantitative arguments from precision cosmology data, are laid out in a recent paper by A. Moss, J. Zibin, and D. Scott titled Precision Cosmology Defeats Void Models for Acceleration. The abstract follows:

The suggestion that we occupy a privileged position near the center of a large, nonlinear, and nearly spherical void has recently attracted much attention as an alternative to dark energy. Putting aside the philosophical problems with this scenario, we perform the most complete and up-to-date comparison with cosmological data. We use supernovae and the full cosmic microwave background spectrum as the basis of our analysis. We also include constraints from radial baryonic acoustic oscillations, the local Hubble rate, age, big bang nucleosynthesis, the Compton y-distortion, and for the first time include the local amplitude of matter fluctuations, σ8. These all paint a consistent picture in which voids are in severe tension with the data. In particular, void models predict a very low local Hubble rate, suffer from an "old age problem", and predict much less local structure than is observed.

The paper makes several quantitative arguments against the plausibility of any kind of void model for cosmic acceleration by drawing together an impressive amount of cosmological data and technical expertise; however, it never mentions the term Hubble bubble. A 2007 paper by Conley et al. takes the Hubble bubble paradigm head on: Is There Evidence for a Hubble Bubble? The Nature of Type Ia Supernova Colors and Dust in External Galaxies. Conley et al. explore how dust affects the colors of type Ia supernovae because, they reason, if the dust can be modeled as a purely local Milky Way effect then the supernova data would actually favor the Hubble bubble. Despite difficulties in the analysis, they find that in their parametrization there is evidence for more than simply the effect of local Milky Way dust, implying doom for the Hubble bubble. So the Hubble bubble has been burst.

I very much enjoy the intersection of classical music and physics (for example, see my posts on the phenomenon of lightning and on Quantifying Goethe), so if I were in the UK I would definitely be checking out the ongoing performance lecture about the legacy of Albert Einstein: the scientist, the man, and the musician. Music was an important part of Einstein's life, and his passion for music is what has inspired me to continue to learn the viola after a fifteen-year hiatus. The show is called Einstein's Universe and it is put on by particle physicist Brian Foster and British musician Jack Liebeck. The lecture tour will be a fusion of science communication and classical music. You can read a bit more about the show over at Physics World here and see a video about it below.

You can catch Foster and Liebeck perform an arrangement of the Mozart violin sonata in C major, K. 296, here.

Cosmology not only probes the absolute mass scale of the neutrino but is a completely independent method to test against. In any case, it is imperative to include an accurate prescription for the neutrino in cosmology, as any failure to do so can bias the other cosmological parameters. A cosmological constraint on the sum of the neutrino masses is primarily a constraint on the relic big bang neutrino density Ων. One can relate this density to the sum of the mass eigenstates ∑mν as given by Ων = ∑mν/(93.14 h² eV). The direct effects of the neutrinos depend on whether they are relativistic or nonrelativistic and the scale under consideration. Neutrinos have a large thermal velocity as a result of their low mass and consequently erase their own perturbations on scales smaller than the free streaming length. This contributes to a suppression of the statistical clustering of galaxies over small scales and can be observed in a galaxy survey. The abundance of neutrinos in the Universe can also have a direct effect on the primary CMB anisotropies if they are nonrelativistic before the time of decoupling (i.e., when sufficiently massive). However, one of the clearest effects at this epoch is a displacement in the time of matter-radiation equality. All these cosmological effects can be used to impose bounds on the neutrino mass. Previous studies have capitalized on these signatures and have started to place sub-eV constraints on the absolute mass scale. We utilize the new Sloan Digital Sky Survey MegaZ luminous red galaxy (LRG) DR7 galaxy clustering data to provide the first photometric galaxy clustering constraint on the neutrino and, combining with the CMB, examine the complementarity of these early- and late-time probes. With an almost comprehensive combination of probes this renders one of the tightest constraints on the neutrinos in cosmology and therefore physics.
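The density conversion quoted above can be sketched in a few lines of Python. The value h = 0.70 for the dimensionless Hubble parameter is an illustrative assumption of mine, not a number from the paper:

```python
# Relic neutrino density parameter from the sum of the neutrino mass
# eigenstates: Omega_nu = sum(m_nu) / (93.14 * h^2 eV), where h is the
# dimensionless Hubble parameter H0 / (100 km/s/Mpc).

def omega_nu(sum_m_nu_eV, h=0.70):  # h = 0.70 is an illustrative choice
    """Neutrino density parameter for a given mass sum in eV."""
    return sum_m_nu_eV / (93.14 * h**2)

# The 0.28 eV upper bound on the mass sum translates to a density bound:
print(omega_nu(0.28))  # about 0.006 -- well under 1% of the critical density
```

Plugging in the paper's 0.28 eV bound shows just how small a fraction of the cosmic energy budget the relic neutrinos can make up.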

Cosmological observations provide independent constraints on the neutrino mass scale provided that a few assumptions (a flat universe with Gaussian and adiabatic primordial fluctuations and a constant spectral index, for example) can be made. Compared to the prospects of current-to-next generation particle neutrino experiments (like KATRIN), it may be that astronomical surveys of the cosmic microwave background anisotropies or optical surveys of the large scale structure of the Universe will place the tightest constraints on neutrino masses for some time. Continue reading the excerpt above, written by Shaun Thomas, Filipe Abdalla, and Ofer Lahav, in their invited viewpoint article in Physical Review Letters (freely available): Upper Bound of 0.28 eV on Neutrino Masses from the Largest Photometric Redshift Survey.

A little over a week ago in Lindau, Germany, Theodor Hansch hinted at new measurements of the size of the proton which may impact the fundamental theory of quantum electrodynamics. Hansch's lecture was an overview of the history of lasers, progressing from our realization of the wave-particle duality of light to new research published in Nature on the size of the proton. The new research relies on the fact that the energy levels allowed within an atom depend upon the quantum mechanical interaction of the proton and the electron (or, in the case of this recent experiment, the exotic muon). Each atom has its own energy levels and corresponding spectral lines, like a fingerprint. Understanding the spectra produced by atoms was historically very important; to stress this, Hansch called the simple hydrogen spectrum the 'Rosetta stone of atoms.' Tiny discrepancies between the spectra of atoms measured in experiment and those expected from theory have led to major advances in fundamental knowledge. The breakthrough that allowed for exploration of these discrepancies occurred exactly 50 years ago with the development of the laser.

The Hansch lecture on the heartbeat of light is available to watch on the Lindau conference website here. It is at least worth watching minute 18, where he explains the nonlinear self-organization of light pulses in pulsed lasers by analogy to mechanical pendulums. He shows a video of ten mechanical pendulums in a row with staggered frequencies ranging from 30 to 39 cycles per minute. Each pendulum corresponds to one of the frequencies present in a laser cavity. At the first moment all the pendulums are in phase, so constructive interference occurs, corresponding to a laser pulse or a large transfer of energy. Quickly the pendulums get out of phase, and although they look chaotic, smaller patterns emerge and disappear. If you wait long enough the pendulums briefly line up in phase again, and this is when the laser will emit the next big flash of light. This demonstration is lovely because the principle underlies so much of physics. If you are a physicist you can probably visualize what I am describing immediately; if not, you may have to see the video. Either way, everyone will appreciate the beauty of this simple demonstration, which was effective enough to elicit a round of applause from the audience.
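The recurrence in the demonstration is easy to reproduce numerically. In this minimal sketch (the 30-39 cycles-per-minute frequencies are the ones quoted above; the rest is my own illustration), summing ten unit oscillators gives a large total amplitude at t = 0, apparent chaos in between, and perfect realignment after exactly one minute, since adjacent frequencies differ by one cycle per minute:

```python
import math

# Ten pendulums at 30, 31, ..., 39 cycles per minute, all starting in phase.
freqs_cpm = range(30, 40)

def total_amplitude(t_seconds):
    """Coherent sum of ten unit oscillators at time t (seconds)."""
    return abs(sum(math.cos(2 * math.pi * f * t_seconds / 60.0)
                   for f in freqs_cpm))

print(total_amplitude(0.0))   # 10.0 -- fully in phase: the laser pulse
print(total_amplitude(30.0))  # ~0   -- alternating signs cancel at the half-way mark
print(total_amplitude(60.0))  # 10.0 -- back in phase: the next flash
```

The same arithmetic explains why a mode-locked laser with equally spaced cavity modes emits a regular train of pulses.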

Historically, precision measurements of the hydrogen energy levels were difficult because Doppler broadening is large for the particularly lightweight hydrogen atom. Hansch explained that with lasers you can pick out hydrogen atoms that are standing still, or at most moving sideways, using saturation spectroscopy. The development of lasers allowed physicists for the first time to see single fine structure components in atoms, particularly the Lamb shift, discerning the 2S state, where the electron comes close to the proton, from the 2P state, where it stays away. The saturation spectroscopy technique allows the Lamb splitting of energy levels to be seen plainly. The Lamb shift depends on the size of the proton, but to probe the proton size more finely, tricks are needed.

The notion of size for a particle like the proton, which resides in the realm of quantum mechanics, is tricky to define, but there are two classic ways of measuring its radius: scattering electrons off hydrogen atoms or examining the exact energy levels of the hydrogen atom. The accepted size of the proton has been based mainly on precision spectroscopy of atomic hydrogen and calculations from bound-state quantum electrodynamics. It is known that a hydrogenic atom with a smaller Bohr radius would enhance the effects related to the finite size of the proton; that is to say, a proton bound to an oppositely charged, more massive particle would show a larger contribution of the proton's size to the Lamb shift. A collaborative team of scientists led by Randolf Pohl has spectroscopically measured the Lamb shift of muonic hydrogen and found the charge radius of the proton to be 4% smaller than the previously accepted value.

Muonic hydrogen is like regular hydrogen, but the electron has been swapped for a muon. A muon is an elementary particle similar to the electron: it has the same negative charge and spin, but it is about 200 times more massive. The experiment called for muonic hydrogen, in which the muon orbits the proton with a radius about 200 times smaller than that of hydrogen built from an electron. The muon 'orbits' so close to the proton, in fact, that it actually spends some portion of its orbit within the radius of the proton. Muons decay quickly, and creating muonic hydrogen is a task that could only be undertaken at the Paul Scherrer Institute in Switzerland, the sole location in the world where a muon beam of sufficient intensity can be generated.
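The factor of "about 200" can be checked with the Bohr model, where the orbital radius scales inversely with the reduced mass of the orbiting particle. A quick sketch using standard mass ratios (values I am supplying for illustration, not numbers from the lecture):

```python
# Bohr radius ~ 1/(reduced mass), so replacing the electron with a muon
# shrinks the orbit.  Masses below are in units of the electron mass.
m_e, m_mu, m_p = 1.0, 206.768, 1836.153

def reduced_mass(m1, m2):
    """Reduced mass of a two-body system."""
    return m1 * m2 / (m1 + m2)

# Ratio of the ordinary-hydrogen Bohr radius to the muonic-hydrogen one:
shrink_factor = reduced_mass(m_mu, m_p) / reduced_mass(m_e, m_p)
print(round(shrink_factor))  # 186 -- the "about 200" quoted above
```

The ratio comes out slightly below the muon-electron mass ratio because the muon is heavy enough that the proton's recoil can no longer be neglected.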

The Lamb shift is the small splitting between the 2S and 2P energy levels of the atom, a quantum electrodynamic effect that Dirac theory alone does not predict. The 2S state of hydrogen has zero orbital angular momentum and the 2P state has an orbital angular momentum of one (in units of ħ). As mentioned earlier, the result is that in the 2S state the electron comes close to the proton and in the 2P state it stays away.

The experiment worked as follows. The researchers created muonic hydrogen at the Paul Scherrer Institute with equipment constructed especially for the experiment (it took ten years to build). Once the muonic hydrogen is created, the researchers shine in a tunable infrared laser with a frequency corresponding to the splitting between the 2S and 2P states. The laser excites some of the muons from the 2S into the 2P state (when muonic hydrogen is created, a tiny fraction of it is naturally produced in the 2S state), and the excited atoms quickly decay to the ground state, emitting an energetic x-ray in the process. The researchers measured the number of x-rays emitted at each frequency their infrared laser was tuned to, and the frequency that generated the most x-ray flux gives their measurement of the Lamb shift.
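The logic of that scan can be captured in a toy model. All the numbers here are hypothetical placeholders (a Lorentzian line shape with an arbitrary center frequency and width), not the experiment's values; the point is just the procedure: step the laser frequency, count x-rays, and take the peak.

```python
# Toy resonance scan: count x-rays at each laser frequency, then take the
# frequency with the most counts as the measured 2S-2P transition.

def line_shape(f, f0, width):
    """Lorentzian response centered at f0 (hypothetical parameters)."""
    return 1.0 / (1.0 + ((f - f0) / (width / 2.0)) ** 2)

f0_true, width = 49.88, 0.5                       # THz; illustrative numbers
scan_points = [49.0 + 0.02 * i for i in range(100)]   # stepped laser frequencies
xray_counts = [line_shape(f, f0_true, width) for f in scan_points]

# The measured resonance is the scan point with the maximum x-ray flux.
f_measured = scan_points[xray_counts.index(max(xray_counts))]
```

In the real experiment the peak position, not just a maximum count, would be extracted by fitting the full line shape, but the idea is the same.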

The new measurement is discrepant with previous results, but the team has done such a careful job (the first results indicating a discrepancy were obtained six years ago but were held back) that theoreticians are questioning the accuracy of fundamental constants like the Rydberg constant and the basic theory of quantum electrodynamics. There is some more discussion and interpretation of the experiment over at Uncertain Principles on how the proton is even smaller than we thought. I think that it is a very cool discovery. Hansch commented on the discovery during his lecture with an anecdote about Arthur Schawlow (who received the 1981 Nobel Prize for his work with laser spectroscopy). Schawlow would ask students in the hallways at Stanford, 'What have you discovered?' and, Hansch says, the message to students is, 'I am not here to learn something old, I am here to discover something new.'

The first image of the microwave sky was released today by the Planck collaboration. The image is the result of a year of observations from the Planck satellite. How far we have come since the first image of the cosmic microwave background from COBE! The most prominent feature of the image is the bright band across the sky caused by diffuse gas and dust emission from our own Milky Way. Also visible are local clouds of gas, nearby galaxies such as Andromeda, and more distant galaxies hosting supermassive black holes at their centers. The more subtle variations that will be visible once the foregrounds are removed are tiny temperature fluctuations which carry information about the cosmic microwave background and the primordial density fluctuations seeded by the Big Bang. However, scientists are waiting to dive into detailed analysis of the multi-frequency data, ranging from 30 GHz to 857 GHz, until all of the foregrounds and telescope systematics are understood. Ultimately the Planck data will give us the most precise constraints humans have ever had on the parameters of our cosmos.

Planck is a major step forward in cosmic microwave background (CMB) observations because it measures the polarization of microwave photons. The polarization of photons may carry information about the universe from inflation, or from when the CMB was generated 400,000 years after the Big Bang. Generally, when an electromagnetic wave or photon is incident upon a free electron, the scattered photon is polarized perpendicularly to the incident direction. Different regions of the CMB may have a net linear polarization, generated when radiation from perpendicular directions in the sky has different intensities. Different directions in the sky have different intensities depending upon perturbations, of which there are three kinds: 1) scalar perturbations due to density fluctuations, 2) vector perturbations due to vorticity (like cosmic strings or defects, although these are not likely to be detected), and 3) tensor perturbations due to gravity waves. The Planck mission will be the first CMB space satellite able to measure the as-yet-unseen gravity wave or "B-mode" polarization, which would reveal the physics of primordial gravity waves from when the Universe was just 10⁻³⁶ seconds old.