Simulating the Universe on Supercomputers

The evolution of cosmic structure

The evolution of structure in the Universe is one of the hottest topics in cosmology and astrophysics. In recent years the so-called $\Lambda$-CDM model has been firmly established, in large part with the help of very large computer simulations. This model describes a Universe dominated by dark components: about 96% consists of dark energy and dark matter, while ordinary baryonic matter contributes only about 4% of the total content. The talk will present recent results, with the main focus on computational methods and challenges in this field. A state-of-the-art computer code for running these calculations will be presented in detail.

The talk will describe recent progress in the field of cosmic structure formation and will focus mainly on the computational problems and methods involved in carrying out such large simulations on the fastest supercomputers available today. At the end of the talk I will also briefly discuss a new method we developed to probe the dark matter structure of the Milky Way down to scales that were out of reach only months ago, even with current supercomputers.

Describing the evolution of the Universe from the Big Bang to what we see today is a hard task. It took many years to arrive at an understanding and a model that fit the observations made by telescopes and satellites. Major steps in validating this model were made possible by computer simulations, which calculate the evolution of the Universe from shortly after the Big Bang to the present day.
Comparing the results of these simulations with observations provided strong evidence for the $\Lambda$-CDM model. The simulations themselves are highly demanding because the main force driving the evolution of the Universe is gravitation in an expanding space, and gravity acts over extremely large distances. This makes the computations very expensive, so large computers and efficient algorithms are needed. The biggest simulation ever carried out in this field is the Millennium Run, performed by our institute.
It took one month of computation on a 512-CPU cluster at the Max Planck Society Computing Center. Even with this computing power it is only possible to simulate the dark part of the Universe, so the simulation includes only dark matter and dark energy: there are no galaxies, stars, planets, or smaller objects. The dynamic range that would be needed to simulate all of these together is so large that no computer, today or in the foreseeable future, could handle it. To nevertheless obtain realistic galaxies, they are added to the Millennium simulation in a post-processing step by so-called galaxy-formation algorithms. These take the dark matter distribution produced by the simulation and populate it with galaxies based on astrophysical models. In this way one can create a universe with luminous galaxies, a so-called mock universe.
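To give a flavour of why these computations are so expensive: the gravitational force on every particle depends on every other particle, so a naive direct summation costs $O(N^2)$ operations per timestep. The following minimal Python sketch is purely illustrative (production codes like the ones discussed in the talk use far more sophisticated algorithms); the function name and softening value are choices made here, not taken from any real code:

```python
# Toy illustration of why N-body gravity is expensive: direct summation
# over all particle pairs costs O(N^2) operations per timestep.
# Pedagogical sketch only, not a production algorithm.

def direct_forces(pos, masses, G=1.0, softening=0.01):
    """Compute accelerations by summing over all particle pairs."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            # Softened squared distance avoids a singularity at r -> 0
            r2 = sum(d * d for d in dx) + softening ** 2
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += G * masses[j] * dx[k] * inv_r3
    return acc

# Two equal point masses attract each other symmetrically:
pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
acc = direct_forces(pos, [1.0, 1.0])
```

For the billions of particles used in runs like the Millennium simulation, this quadratic scaling is hopeless, which is why hierarchical force algorithms are essential.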
From the dark matter distribution itself one can also learn a lot, compute statistics, and compare them with observational data. It is quite impressive how closely the structures forming in the computer simulation resemble the structures of galaxies and galaxy clusters in the real Universe.
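A standard statistic for such comparisons is the two-point correlation function, which measures the excess probability of finding pairs of objects at a given separation relative to a purely random distribution. The sketch below implements the simple "natural" estimator $\xi(r) = DD/RR - 1$ by brute-force pair counting; the estimator, bin choices, and point counts here are illustrative assumptions, not the pipeline actually used for the Millennium analysis:

```python
import random

# Toy two-point correlation estimator: xi(r) = DD(r)/RR(r) - 1,
# where DD counts data-data pairs and RR random-random pairs per
# separation bin. Brute-force O(N^2) pair counting for clarity.

def pair_counts(points, bins):
    counts = [0] * (len(bins) - 1)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = sum((points[i][k] - points[j][k]) ** 2 for k in range(3)) ** 0.5
            for b in range(len(bins) - 1):
                if bins[b] <= d < bins[b + 1]:
                    counts[b] += 1
                    break
    return counts

def correlation(data, randoms, bins):
    dd = pair_counts(data, bins)
    rr = pair_counts(randoms, bins)
    # Normalise for different numbers of data and random points
    norm = (len(randoms) * (len(randoms) - 1)) / (len(data) * (len(data) - 1))
    return [norm * d / r - 1.0 if r > 0 else 0.0 for d, r in zip(dd, rr)]

random.seed(42)
data = [[random.random() for _ in range(3)] for _ in range(200)]
randoms = [[random.random() for _ in range(3)] for _ in range(200)]
bins = [0.05 * i for i in range(1, 8)]
xi = correlation(data, randoms, bins)  # near zero for an unclustered sample
```

Because the "data" here are themselves uniform random points, the resulting $\xi$ scatters around zero; clustered simulation output would show a clear positive signal at small separations.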
Recently a research group simulated a Milky-Way-like dark matter structure at extremely high resolution to gain more insight into the dark matter around us. This so-called Via Lactea simulation ran for about one month on NASA's Columbia supercomputer, one of the fastest machines available, to complete the calculation.

The talk will mainly focus on the numerical techniques needed to run such a simulation. As an example, the state-of-the-art code Gadget (Springel, 2005) will be presented. This code was used to calculate the Millennium Simulation and is currently the leading code for cosmological simulations.
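The central idea behind tree codes such as Gadget can be conveyed in a few lines: a distant, compact group of particles is replaced by its total mass placed at its centre of mass (the monopole approximation), accepted whenever the group's size-to-distance ratio falls below an opening angle. The sketch below is a 2D toy comparing this approximation with the exact pairwise sum for a single distant group; it is not Gadget's actual octree implementation, and all particle positions are made up for illustration:

```python
# Monopole approximation behind tree codes: replace a distant group of
# particles by its total mass at the centre of mass. A real tree code
# applies this recursively in an octree whenever s/d < theta; here we
# just check the approximation for one compact, distant group (2D).

def accel_exact(target, sources, G=1.0):
    ax = ay = 0.0
    for (x, y, m) in sources:
        dx, dy = x - target[0], y - target[1]
        r3 = (dx * dx + dy * dy) ** 1.5
        ax += G * m * dx / r3
        ay += G * m * dy / r3
    return ax, ay

def accel_monopole(target, sources, G=1.0):
    mtot = sum(m for _, _, m in sources)
    cx = sum(x * m for x, y, m in sources) / mtot
    cy = sum(y * m for x, y, m in sources) / mtot
    dx, dy = cx - target[0], cy - target[1]
    r3 = (dx * dx + dy * dy) ** 1.5
    return G * mtot * dx / r3, G * mtot * dy / r3

# Compact group (size ~0.3) far from the target (distance ~10):
group = [(10.0, 0.1, 1.0), (10.2, -0.1, 1.0), (9.9, 0.0, 1.0)]
target = (0.0, 0.0)
exact = accel_exact(target, group)
approx = accel_monopole(target, group)
# The relative error is tiny because s/d << 1 for this configuration.
```

Applied recursively over a hierarchy of nodes, this trick reduces the force calculation from $O(N^2)$ to roughly $O(N \log N)$, which is what makes runs with billions of particles feasible.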

At the end of the talk a new method will be presented that improves on the resolution of all simulations carried out so far by an enormous factor.