A galaxy cluster is, as the name suggests, a cluster of galaxies… Galaxies are swarms of stars held together in a common gravitational potential; in an analogous way, galaxy clusters are swarms of galaxies held together in a larger gravitational potential structure.

“Richness” is a bit of astronomical jargon that refers to the number of bright galaxies in a cluster. A cluster is “rich” if it has many massive galaxies and not rich if it has few. In fact, in a way that sounds quite PC, a galaxy cluster is never referred to as “poor”, but some clusters have “very low richness”. The term was first defined in the 1960s.

[the richness of a cluster] is defined to be the number of member galaxies brighter than absolute magnitude Mi ≤ −19.35, which is chosen to match the limiting magnitude at the furthest cluster redshift that we probe

The clusters were detected using a 3D matched-filter method, which allowed a very large number of clusters to be found: 18,056 cluster candidates in total, enough to measure the statistics of this population of clusters.

The total significance of the shear measurement behind the clusters amounts to 54σ, which corresponds to a (frequentist) probability of 1 − 4.645×10^−636 that we have detected a weak lensing signal behind these clusters and groups!
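To see where such an absurdly small number comes from, one can evaluate the Gaussian tail probability in log space (the direct value underflows ordinary floating point). This is a sketch using the standard asymptotic approximation for the normal tail, not the exact calculation from the paper:

```python
import math

sigma = 54.0
# One-sided Gaussian tail, asymptotic form: Q(x) ≈ exp(-x²/2) / (x·√(2π)).
# Work in log10 because Q(54) is far below the smallest float (~1e-308).
log10_q = -sigma**2 / (2 * math.log(10)) - math.log10(sigma * math.sqrt(2 * math.pi))
print(f"Q(54σ) ≈ 10^{log10_q:.1f}")  # ≈ 10^-635.3, i.e. a few ×10^-636
```

The result agrees with the quoted 4.645×10^−636 at the level this approximation allows.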

The main result of the paper was the measurement that the mass of clusters increases with richness, following the relation M200 = M0(N200/20)^β. This may be expected: clusters that are more massive have more bright galaxies; after all, a cluster is defined as a collection of galaxies. We found a normalization M0 ∼ (2.7 ± 0.5) × 10^13 solar masses and a logarithmic slope of β ∼ 1.4 ± 0.1.
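As an illustration, the best-fit scaling relation can be evaluated directly. This is a sketch using only the quoted central values (the real analysis of course propagates the uncertainties on M0 and β):

```python
def cluster_mass(n200, m0=2.7e13, beta=1.4):
    """Mean mass-richness relation: M200 = M0 * (N200 / 20)**beta,
    with M200 in solar masses and N200 the cluster richness."""
    return m0 * (n200 / 20.0) ** beta

print(cluster_mass(20))   # 2.7e13 by construction (20 is the pivot richness)
print(cluster_mass(100))  # richer clusters are considerably more massive
```

Because β > 1, mass grows slightly faster than linearly with the number of bright galaxies.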

Curiously, no redshift dependence of the normalization was found. This suggests that there is a mechanism regulating the number of bright galaxies in clusters that is not affected by the evolution of cluster properties over time. We do not know why this relationship should not change over time, or why it has the values it does, but we hope to find out soon.

Cosmic shear is the effect whereby the images of galaxies that are (relatively!) nearby – a mere few billion light years away – are distorted by gravitational lensing caused by the local matter in the Universe. We can measure this distortion and use the data to learn about how the distribution of matter evolved over that time.

The Cosmic Microwave Background (CMB) is the ubiquitous glow of microwaves that comes from every part of the sky and that was emitted nearly 14 billion years ago. Analysis of the CMB allows us to learn about the early Universe, but also about the nearby Universe, because the local matter also gravitationally lenses the microwave photons.

In a recent paper we have shown how to combine cosmic shear information and the CMB in a single all-encompassing statistic. Because we see the Universe in three dimensions (two on the sky and one in distance, or look-back time), this new statistic needed to work in three dimensions too.

What we found was that when the galaxy and microwave data are combined properly, the resulting statistic is more powerful than the sum of the two separate statistics, because of the extra information that comes from the “cross-correlation” between them. In particular, we found that the extra information helps in measuring systematic effects in the cosmic shear data.

What is a “cross correlation”?

A correlation is a relationship between two things. The definition (that the Apple dictionary on my computer gives) is

noun

a mutual relationship or connection between two or more things

In the recent cosmological literature we use this term somewhat colloquially to refer to a relationship between data points in a single data set. For example, one could correlate the positions of galaxies at a particular separation – to determine whether they are clustered together – or one could correlate the temperature of microwave emission from different parts of the sky (both of these have been done with much success).

The word “cross” in “cross-correlation” refers to taking correlations of quantities observed in different data sets. The addition of the word “cross” is in fact somewhat superfluous: if we have experiments A and B, one can correlate the data points from A, correlate the data points from B, or correlate data points between A and B.
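A toy illustration of the difference, assuming two hypothetical noisy experiments that both measure the same underlying signal (all names here are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(42)
signal = rng.normal(size=100_000)                  # the shared underlying quantity
expt_a = signal + 0.5 * rng.normal(size=100_000)   # experiment A: signal + its own noise
expt_b = signal + 0.5 * rng.normal(size=100_000)   # experiment B: independent noise

auto_a = np.corrcoef(expt_a, expt_a)[0, 1]  # correlation within A: trivially 1
cross = np.corrcoef(expt_a, expt_b)[0, 1]   # correlation between A and B: ≈ 0.8 here
print(auto_a, cross)
```

The cross-correlation recovers the shared signal while the independent noise in each experiment averages away, which is exactly why combining data sets can beat analysing each alone.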

In the new paper we instead used a more descriptive nomenclature that refers to the inter- and intra-datum aspects of the analysis. Intra-datum means calculating statistics within a single data set, and inter-datum means calculating statistics between data sets; compare, for example, plotting a histogram of the points within one data set with plotting the points from two data sets on one graph.

When should one attempt to find inter-datum correlations between data points? There seem to be two modes of investigation one could take; following a Popper-inspired categorisation, one can define the following modes:

Deductive mode. In this approach one has a clearly defined scientific hypothesis, for example that some parameter (or model) is predicted to have a given value (or values). One can then find a statistic that maximises the signal-to-noise (or expected evidence ratio) for that parameter or model. That statistic may or may not include inter-datum aspects.

Inductive mode. Alternatively, one may simply wish to correlate everything with everything, with no regard to a hypothesis or model. In this approach the motivation is just to explore the space of possibilities, trying to find something new. If a positive correlation is found then this may, or may not, indicate an underlying physical process or causal relation between the quantities.

The danger of the inductive approach, of course, is that one can find correlations for which the underlying physical process is much more complicated than it appears at face value. To illustrate this point one can look on Google Correlate and find some entertaining spurious correlations.

In a recent paper led by collaborator Dr Liping Fu, the CFHTLenS survey was used to measure the “3-point correlation function” from weak lensing data. This is one of the first times this statistic has been measured, and certainly one of the clearest detections.

A “2-point” statistic, in cosmology jargon, is one that uses two data points in some way. Usually an average over many pairs of objects (galaxies or stars) is used to extract information. In this case what is being measured is called the “two-point [weak lensing] correlation function”, and it measures the excess probability that any pair of galaxies (separated by a particular angular distance) are aligned. This is slightly different to a similar statistic used in galaxy clustering analysis. The two-point correlation function is related to the Fourier transform of the matter power spectrum and can be used to measure cosmological parameters, which is why we are interested in it. In a sense the two-point correlation function is like a scale-dependent measure of the variance of the gravitational lensing in the data: the mean orientation of galaxies is assumed to be zero (when averaged over a large enough number), because there is no preferred direction in the Universe, but the variance is non-zero.

The measurement of the 2-point statistic is represented above: “sticks” (of various [angular] lengths) are virtually laid down on the data, and for each stick length the ellipticity (or “ovalness”) of the galaxies along the direction of the stick is measured. If the two galaxies are aligned then the product of these ellipticities (e * e) will be positive; if not, it will sometimes be positive and sometimes negative.

Ellipticity can be expressed as two numbers, e1 and e2, that lie on a Cartesian graph where positive and negative values represent different alignments. When multiplied together, aligned ellipticities therefore produce a positive number and anti-aligned ellipticities a negative number. Averaged over all galaxies in a purely random field with no preferred alignment (equal positive and negative), the product averages to zero. If there is alignment, the average is positive.

Galaxies will align if there is some common material that is causing the gravitational lensing to be coherent. So, averaged over many galaxies, the product of the ellipticities <e*e> (the angular brackets represent taking an average) for a particular stick length tells us whether there is lensing material on a scale the same as the stick's length: a positive result means there is alignment on average, a zero result means there is no alignment on average, and a negative result would mean there is anti-alignment on average.
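A minimal sketch of this kind of estimator, assuming galaxies with 2D positions and complex ellipticities e = e1 + i·e2. A real pipeline rotates the ellipticities into tangential/cross components relative to each pair and weights the galaxies; this toy version just averages Re(e_i · e_j*) over pairs in a separation bin:

```python
import numpy as np

def xi_ee(pos, e, r_lo, r_hi):
    """Toy two-point ellipticity correlation <e*e> for all pairs whose
    separation lies in [r_lo, r_hi): positive means net alignment."""
    sep = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    i, j = np.triu_indices(len(pos), k=1)            # count each pair once
    in_bin = (sep[i, j] >= r_lo) & (sep[i, j] < r_hi)
    return np.mean((e[i][in_bin] * np.conj(e[j][in_bin])).real)

# Perfectly aligned toy field: every pairwise product is positive.
rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 10.0, size=(50, 2))
e = np.full(50, 0.1 + 0.0j)
print(xi_ee(pos, e, 0.0, 10.0))  # ≈ 0.01: alignment detected
```

For a field with random orientations the same average would scatter around zero, which is the "no lensing material on this scale" case described above.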

Figure from Kitching et al. (Annals of Applied Statistics 2011, Vol. 5, No. 3, 2231-2263). Gravitational lensing from the large-scale structure in the Universe makes preferred alignment occur around clusters of dark matter or around voids, what we call “E-mode”. Anti-alignment is not normally caused by gravitational lensing, what we call “B-mode”.

In this new paper we not only measured the two-point correlation function but also the 3-point correlation function! This is an extension of the idea, now measuring the excess probability that any 3 galaxies have a preferred alignment. Instead of a single angle and pairs of galaxies, the measurement uses triangle configurations of galaxies and results in a measurement that depends on two angles.

This is a much more demanding computational task, because there are many more possible ways that a triangle can be drawn than a stick (for every given length of stick, the other two sides of the triangle can take many different lengths). The amplitude of the 3-point correlation function tells us whether there is any coherent structure on multiple scales, and in particular allows us to test whether the simple description of large-scale structure using only the 2-point correlation function – and the matter power spectrum – is sufficient or not.
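The combinatorial cost is easy to see: with N galaxies there are N-choose-2 pairs but N-choose-3 triangles, a factor of (N−2)/3 more configurations before one even bins them by shape. A quick check for a hypothetical (and, by survey standards, small) catalogue:

```python
import math

n = 10_000  # a hypothetical galaxy catalogue; real surveys hold millions
pairs = math.comb(n, 2)      # configurations for the 2-point function
triangles = math.comb(n, 3)  # configurations for the 3-point function
print(f"{pairs:.3e} pairs vs {triangles:.3e} triangles "
      f"(roughly {triangles // pairs}x more)")
```

In practice clever binning and tree methods avoid enumerating every triangle, but the scaling is why the 3-point measurement is so much harder.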

This is one of the first measurements of this statistic and paves the way for extracting much more information from lensing data sets than could be done using 2-point statistics alone.

The full article can be found at this link: http://arxiv.org/abs/1404.5469

Today a paper of mine that I have been working on for the last few years was posted to the arXiv.

The subject of the paper is a method called “3D cosmic shear”. It’s a particularly important subject because several new telescopes and surveys have been designed, or are being built, that intend to use such methods to measure the properties of dark energy. What we found in this paper is that some of the assumptions that have been made about how dark matter and ordinary matter interact in the cosmic web could be incorrect, which means the interpretation of the data from the new experiments will be a bit more complicated than cosmologists previously thought…

What is 3D cosmic shear?

Let’s start with the 3D part of the name. The Universe is filled with galaxies, which are sprinkled throughout the cosmic web of dark matter (one may say sprinkled like raisins in the infinite hot-cross-bun that is the Universe, but let’s not push the analogy too far).

Plot credit: S. Columbi and Y. Mellier. Matter is distributed as a cosmic web (orange) and galaxies are sprinkled within this web (blue dots). Light from the galaxies (yellow lines) is distorted by the gravitational field of the web and causes a small change in the ellipticity of the galaxy images.

We live in one of those galaxies, the Milky Way, so as we observe the Universe what we see are galaxies scattered across the sky (two dimensions: north-south and east-west, or “right ascension” and “declination”). We also see galaxies in a third dimension, redshift. This third dimension is a little more complicated to understand: because the speed of light is finite, the galaxies we see that are a long way away in the third dimension are not only distant in space but also distant in time. The 3D distribution of galaxies we see consists of galaxies that are scattered across space and time.

But what else happens to light as it propagates towards us? Well, all of the matter from which the cosmic web is made, like all matter everywhere, has a gravitational field, and gravity is a force that pulls things together. You are being pulled to the ground by the Earth’s gravity, the Earth (and you) are being pulled by the Sun’s gravity, the Sun (and you, and the Earth) are being pulled by all of the other stars in the Milky Way… and so on and so forth. Light doesn’t escape the universal attraction of gravity; it too is pulled towards objects that have mass. The effect of gravity on a light ray is known as gravitational lensing: what happens (when the gravitational field is weak) is that an additional ellipticity (or “oval-ness”; technically, the third flattening or third eccentricity of the galaxy’s observed shape) is imprinted on the image of a distant galaxy. We call this extra ellipticity “shear”.

Plot credit: T. Tyson. The light from distant galaxies is gravitationally lensed by the cosmic web of dark matter, inducing a small change in the ellipticity of the images of galaxies called cosmic shear.

The additional ellipticity caused by the cosmic web is known as cosmic shear.

3D cosmic shear combines information from the distances and spatial distribution of galaxies with the gravitational lensing effect of cosmic shear in a single analysis, and in the paper this was applied to a large survey for the first time. Because 3D cosmic shear maps the cosmic web and how it grows with time, we can potentially learn about how the Universe evolved, and what its ultimate fate will be.

What did we find?

The paper used the method of 3D cosmic shear to analyse the largest and deepest optical-wavelength survey of the sky to date, CFHTLenS.

What we found is that matter in the Universe is clumped together less tightly than it should be if dark matter alone were responsible for the structure of the cosmic web.

Previously it had been assumed that the cosmic web of dark matter is not affected by the sprinkling of galaxies on top of it. However, what we found is evidence that the cosmic web of dark matter may actually be affected by the galaxies after all, in particular by the supermassive black holes that reside in the centres of the most massive galaxies, the so-called Active Galactic Nuclei (AGN).

AGN may be impacting the distribution of dark matter in the Universe. Hubble Space Telescope image of a 5000-light-year-long (1.5-kiloparsec) jet being ejected from the active nucleus of the radio galaxy M87. The blue synchrotron radiation of the jet contrasts with the yellow starlight from the host galaxy.

Alternatively, things may be even weirder: another possibility is that what we are seeing is the effect of massive neutrinos, or even a combination of effects. But a new mystery, and challenge for cosmologists, now exists. The normal matter and the dark matter seem to have an effect on each other: we may live in a more complicated Universe than we previously imagined. To resolve this mystery we now need more data, and more work to make the data analysis using 3D cosmic shear even more sensitive to the structure of the cosmic web within which we reside.