The European Organisation for Nuclear Research (CERN) on Friday unveiled a massive computing grid aimed at analysing millions of gigabytes of data set to be generated by the world’s largest atom-smasher.
The LHC's data manager, the Worldwide LHC Computing Grid, combines the computing power of over 140 computer centres and will analyse each year over a thousand times more data than is contained in all the printed material in the world.

The LHC itself, a scientific research instrument housed in a 27-kilometre (17-mile) tunnel buried 100 metres underground in the suburbs of Geneva, is expected to generate enormous volumes of data from subatomic collisions.

Given the amount of data to be analysed and managed, the Computing Grid is “an absolute necessity,” said Jos Engelen, chief scientific officer of the LHC project.


The grid will handle over 15 million gigabytes of data every year, and is “the result of a ‘silent revolution’ in large scale computing over the last five years,” added Engelen.

In fact, the amount of information to be handled each year is “equivalent to more than a thousand times the amount of information contained in all the books printed on the planet,” said Gaelle Shifrin, spokeswoman for the nuclear physics centre and institute of Lyon, one of the 11 main centres collaborating on the Grid project.


Grid computing is a system connecting computers over a wide geographic area. Just as the World Wide Web allows users around the world access to information, grid computing allows access to computing resources such as data storage and processing power.
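The idea can be sketched in miniature: a coordinator splits a large job into pieces and farms them out to whatever computing resources are available, then combines the partial results. The snippet below is an illustrative toy only, not part of the actual LHC grid software; local processes stand in for the geographically distributed computer centres.

```python
# Toy sketch of the grid-computing idea: split a big job into chunks,
# process them on a pool of workers, then combine the results.
from multiprocessing import Pool

def analyse(chunk):
    # Stand-in for real physics analysis: just sum the "events" in a chunk.
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))  # stand-in for detector data
    # Coordinator splits the dataset into 100,000-event chunks.
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with Pool(4) as pool:                    # four local workers stand in for centres
        partials = pool.map(analyse, chunks) # each worker analyses its own chunk
    total = sum(partials)                    # coordinator merges the partial results
    print(total == sum(data))                # the distributed answer matches
```

The real grid works at a vastly larger scale and over wide-area networks, but the pattern is the same: divide the data, compute in parallel, merge the answers.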

While the grid is already active, the LHC itself is down until next year following a technical fault.