A lumpy model for a lumpy universe

Computer models are used for all sorts of applications, from designing cars to predicting the weather. They are even used to simulate the whole universe, to work out how it was born and what might happen next. Until now, the universe has been modelled using numerical simulations that are quick and simple, but limited both by the assumptions they make and by their use of Newtonian gravity instead of Einstein’s general relativity.

A typical numerical simulation assumes that the universe is homogeneous and isotropic – that its matter is spread evenly and looks the same in every direction. This is true on large scales, but on smaller scales matter is gathered into clusters of galaxies and dark matter, with mostly empty space in between. This means that the expansion of the universe proceeds at different rates in different places: regions dense with matter are pulled together by their own gravity and expand more slowly, while empty regions expand unhindered – up to 28% faster than the average rate of expansion!
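To get a feel for why density matters, here is a back-of-envelope sketch (idealized textbook models with made-up function names, not the simulations described here): a matter-dominated region grows like a(t) ∝ t^(2/3), giving a Hubble rate H = 2/(3t), while a completely empty region grows like a(t) ∝ t, giving H = 1/t.

```python
# Crude comparison of expansion rates in a matter-filled region versus
# an empty one, each treated as its own idealized mini-universe.

def hubble_matter(t):
    """Hubble rate of a matter-dominated region at cosmic time t (H = 2/3t)."""
    return 2.0 / (3.0 * t)

def hubble_empty(t):
    """Hubble rate of a completely empty region at cosmic time t (H = 1/t)."""
    return 1.0 / t

t = 13.8  # any time, in arbitrary units; the ratio is time-independent
excess = hubble_empty(t) / hubble_matter(t) - 1.0
print(f"Empty region expands {excess:.0%} faster in this crude picture")
```

In this idealized comparison the void expands 50% faster than the matter-filled region; the fact that the full general-relativistic simulations give a smaller figure is exactly why this kind of code is needed.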

Two codes have now been written independently that use this more realistic mass distribution, and the full machinery of Einstein’s general relativity, to simulate how the universe evolved. The first was written by Eloisa Bentivegna of the University of Catania in Italy and Marco Bruni of the Institute of Cosmology and Gravitation at the University of Portsmouth. It is already available to the public, and builds on a collection of free software called the ‘Einstein Toolkit’: a backbone of software called ‘Cactus’, to which modules called ‘thorns’ are added to perform specific tasks, such as solving Einstein’s equations or calculating gravitational waves. Adding modules creates new applications, and by writing a thorn that prepares initial conditions for matter which is inhomogeneous on small scales but homogeneous on larger ones, the code could be used to make predictions about the whole universe. The second code was written by James Mertens and Glenn Starkman of Case Western Reserve University in Ohio and John T Giblin of Kenyon College. They will be releasing their code soon, and say that it performs better than the ‘Cactus’-based code.
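The backbone-plus-modules design can be sketched in a few lines of Python. This is only an illustration of the pattern – the class and thorn names are hypothetical, not the real Cactus API: a driver that knows nothing about physics holds a list of plug-in modules and simply calls each one on every step.

```python
class Thorn:
    """Base class for a plug-in module (a 'thorn')."""
    name = "unnamed"

    def step(self, state):
        raise NotImplementedError

class SolveEinsteinEquations(Thorn):
    """Hypothetical thorn: stands in for evolving the spacetime metric."""
    name = "solve_einstein_equations"

    def step(self, state):
        state["metric_updated"] = True  # placeholder for real field evolution
        return state

class ExtractGravitationalWaves(Thorn):
    """Hypothetical thorn: stands in for computing a waveform each step."""
    name = "extract_gravitational_waves"

    def step(self, state):
        state.setdefault("waveform", []).append(0.0)
        return state

class Cactus:
    """Backbone: knows no physics, only runs registered thorns in order."""
    def __init__(self):
        self.thorns = []

    def register(self, thorn):
        self.thorns.append(thorn)

    def evolve(self, state, n_steps):
        for _ in range(n_steps):
            for thorn in self.thorns:
                state = thorn.step(state)
        return state

driver = Cactus()
driver.register(SolveEinsteinEquations())
driver.register(ExtractGravitationalWaves())
final = driver.evolve({}, n_steps=3)
print(final["metric_updated"], len(final["waveform"]))  # True 3
```

The point of the design is that new applications come from new combinations of thorns – the backbone never needs to change.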

The codes can test whether our interpretation of the structure of the universe and of cosmic expansion is correct. They are likely to be used side by side, so that they can independently verify each other’s solutions when making important predictions. Numerical simulations won’t become obsolete for a while either – they are less accurate, but need far less computing power, meaning they can capture much more detail for the same cost.

So how much does this change our picture of the universe? Many of our current assumptions might not hold in this ‘lumpy’ model. For example, cosmic distances are measured using ‘standard candles’ such as supernovae: because we know their intrinsic luminosities, the light we receive from them tells us how far away they are. But light may propagate differently through different parts of a lumpy space, so the distances we think we know may all be wrong. Next, the calculated rate of expansion of the universe, described by the Hubble parameter, is also worked out assuming a homogeneous universe, and we now know that its value depends on the density of the region of space. Finally, the code rules out the ‘back reaction’ phenomenon, which was thought to explain the effects of dark energy. Time to rethink everything we think we know…
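The standard-candle idea itself is just the inverse-square law. A minimal sketch (textbook formula, with an illustrative function name of my own – nothing specific to the new codes): if an object’s intrinsic luminosity L is known and we measure the flux F reaching us, then F = L / (4πd²) can be solved for the distance d.

```python
import math

def luminosity_distance(luminosity, flux):
    """Distance to a standard candle from the inverse-square law,
    F = L / (4 * pi * d**2), solved for d (arbitrary consistent units)."""
    return math.sqrt(luminosity / (4.0 * math.pi * flux))

# e.g. an object emitting 4*pi units of power, seen with flux 1e-4:
d = luminosity_distance(4.0 * math.pi, 1.0e-4)
print(d)  # 100.0
```

The lumpy-universe worry is that light bent and delayed by intervening structure no longer obeys this simple relation exactly, so the inferred d would be biased.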
