Simulating Everything

New work led by a Carnegie Mellon professor reports on results of the first …

My time in graduate school was spent developing algorithms for, and
running simulations of, at most a few hundred to a few thousand atoms
or molecules. During a 'big' calculation, I might have simulated something
a few hundred nanometers across. All the while I was working on
simulating the very small, others were working on simulating the very
big—the very very big—in fact, they were trying to simulate
everything—the Universe itself. Even though the two
systems—zeolites in my case, the Universe in the other—are
vastly different, they are both the same type of computer problem,
namely the N-body problem. In simulations of this type of problem, you set up a
large number of particles with a given initial condition, define how they
will interact with one another and with any background, and allow them
to evolve over time in a dynamic fashion. While this may seem
a simple setup, it can be extremely taxing for a computer, especially
when you want to simulate on the order of a billion or more particles. The problem is so demanding for general-purpose computers that work is underway to create custom hardware that specializes in solving gravitational N-body problems.
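The recipe described above—set up particles, define their interactions, integrate forward in time—can be sketched in a few lines. The toy code below is not the researchers' method (production cosmology codes use far more sophisticated tree and particle-mesh algorithms); it is a minimal direct-summation gravitational N-body sketch with a leapfrog integrator, and all names and parameter values are illustrative.

```python
import numpy as np

def accelerations(pos, mass, G=1.0, soft=0.01):
    """Direct O(N^2) pairwise gravitational accelerations.

    pos:  (N, 3) particle positions
    mass: (N,)   particle masses
    soft: gravitational softening length, which prevents the force
          from diverging when two particles pass very close together.
    """
    # diff[i, j] = r_j - r_i, shape (N, N, 3)
    diff = pos[np.newaxis, :, :] - pos[:, np.newaxis, :]
    dist2 = (diff ** 2).sum(axis=-1) + soft ** 2
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)  # a particle exerts no force on itself
    # a_i = G * sum_j m_j (r_j - r_i) / (|r_j - r_i|^2 + soft^2)^(3/2)
    return G * (diff * (mass[np.newaxis, :, None] * inv_d3[:, :, None])).sum(axis=1)

def nbody_step(pos, vel, mass, dt, G=1.0, soft=0.01):
    """Advance the system one kick-drift-kick (leapfrog) step."""
    acc = accelerations(pos, mass, G, soft)
    vel_half = vel + 0.5 * dt * acc           # half kick
    pos_new = pos + dt * vel_half             # drift
    acc_new = accelerations(pos_new, mass, G, soft)
    vel_new = vel_half + 0.5 * dt * acc_new   # half kick
    return pos_new, vel_new
```

The direct sum above costs O(N^2) per step, which is exactly why a billion-particle run is infeasible without cleverer algorithms or specialized hardware: every particle interacts with every other particle at every time step.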

Using state-of-the-art computers—2,000 processors of a Cray
XT3—and algorithms, Tiziana Di Matteo, associate
professor of physics at Carnegie Mellon University, along with fellow
researchers at the Harvard-Smithsonian Center for Astrophysics in
Cambridge, MA, and the Max Planck Institute for Astrophysics
in Germany, simulated the formation of the universe, starting from initial
conditions derived from the first-year results of the Wilkinson
Microwave Anisotropy Probe. While they are not the first to perform this
feat—the Millennium
Simulation modeled the formation of the universe using over 10^10
particles in 2005—they are the first to account for a key
ingredient in the formation of the universe: black holes. According to
paper co-author Volker Springel of the Max Planck Institute, who was
also the lead author on the Millennium simulation work, "Including black
holes in computer simulations is critical. The galaxies
we see today look the way they do because of black hole physics."

The results of the simulation, named the BHCosmo
run, are impressive, as they contain length scales that range form a single
galaxy to the astronomically large filaments of gasses and galaxies that run
throughout our universe. The image shows a snapshot of the universe at
a 'relatively' recent time, with a redshift of 1.0; the circles are
black holes, and the colored sections illustrate the gas distribution
throughout the universe. A series of such images showing the evolution
of the gas filaments and the formation of the black holes—along
with some interesting movies—can be found at the BHCosmo
homepage.

In addition to generating statistical data about the relation
of the black hole formation rate to the star formation rate, and a host of
other cosmological factors, the level of detail present allows the
researchers to follow the formation and evolution of a single black
hole. As stated in the conclusions of the paper, "Our cosmological
approach to black hole formation enables us to follow the fuel for
black hole activity from its ultimate source, the early intergalactic
medium, and so link the large scale structure and environments of black
holes directly with their growth." This unprecedented amount
of information will allow for many new lines of research to be carried
out, and hopefully provide answers to some of the outstanding questions
in cosmology today. The work has been submitted for publication in an
upcoming edition of The Astrophysical Journal;
a preprint of the article can be read for free on arXiv.

Matt Ford / Matt is a contributing writer at Ars Technica, focusing on physics, astronomy, chemistry, mathematics, and engineering. When he's not writing, he works on real-time models of large-scale engineering systems.