Wednesday, January 19, 2011

The Humongous Computing Task of the LHC

We should not forget that beyond the scientific achievement of the LHC, the task of handling the humongous amount of data coming in at such a rapid rate in itself pushes computing knowledge and capability to a new level. It forces innovations in data handling and computing techniques that many commercial companies simply cannot match. Yet this is exactly the kind of capability we will need if our technological demands are to continue to grow.

This news article (the link may be open for free only for a limited time) examines the astounding effort of piping out the data gathered from the ATLAS detector, and how many groups around the world are working together to handle such a volume of data and analysis.

Even after rejecting 199,999 of every 200,000 collisions, the detector churns out 19 gigabytes of data in the first minute. In total, ATLAS and the three other main detectors at the LHC produced 13 petabytes (13 × 10^15 bytes) of data in 2010, which would fill a stack of CDs around 14 kilometres high. That rate outstrips any other scientific effort going on today, even in data-rich fields such as genomics and climate science (see Nature 455, 16–21; 2008). And the analyses are more complex too. Particle physicists must study millions of collisions at once to find the signals buried in them — information on dark matter, extra dimensions and new particles that could plug holes in current models of the Universe. Their primary quarry is the Higgs boson, a particle thought to have a central role in determining the mass of all other known particles.
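To get a feel for the scale, here is a quick back-of-envelope sketch in Python based on the figures quoted above. The sustained data rate is my own derived number, assuming decimal gigabytes; only the 1-in-200,000 trigger rate, the 19 GB first minute, and the 13 petabyte total come from the article.

```python
# Back-of-envelope arithmetic on the quoted ATLAS numbers
# (illustrative only; unit conventions are assumptions).

kept_fraction = 1 / 200_000                    # collisions the trigger keeps
rejected_fraction = 1 - kept_fraction          # ~99.9995% discarded on the fly

bytes_first_minute = 19e9                      # 19 GB written in the first 60 s
rate_mb_per_s = bytes_first_minute / 60 / 1e6  # sustained rate, ~317 MB/s

total_2010_bytes = 13e15                       # 13 PB across the four detectors

print(f"trigger rejects {rejected_fraction:.4%} of collisions")
print(f"first-minute data rate: ~{rate_mb_per_s:.0f} MB/s")
```

Even the heavily filtered stream works out to hundreds of megabytes per second, which is why distributing the analysis across computing centres worldwide is unavoidable.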