High-energy physics has a case of the Higgs

Three talks at Physics@FOM show just how excited the high energy physics community is.

One thing that was abundantly clear in the high energy physics sessions at Physics@FOM is that everyone is very excited. The LHC is ready to roll later this winter, the Tevatron is putting out data like... well, a machine, and there is just so much stuff waiting around the corner. It almost feels as if you're among kids who are about to be given the keys to a candy store.

I only attended three talks in this session: an update on the ATLAS detector at the LHC, preliminary results from the Tevatron's search for the Higgs boson, and a talk introducing a new way to go from a theoretician's idea to experimental data. Individually, these talks are fairly unimportant, but together they are pretty exciting.

First, ATLAS, one of the very large detectors at the LHC, has come through testing with flying colors. It is the size of a small building, yet it must be positioned with micrometer accuracy, a feat that has now been accomplished. The LHC operators used cosmic rays and the low energy collisions from the LHC start-up late last year to calibrate ATLAS. All three layers—two particle tracking layers and one absorbing layer for energy accounting—are functioning as expected. The presenter showed various detection tests and compared them to instrument simulations: ATLAS works as predicted.

This summer the LHC can start taking data in earnest, but we shouldn't expect anything spectacular until the year after that at the earliest. This highlights one of the areas that is still being worked on. The ATLAS detector is expected to generate a huge amount of raw data, and the researchers are still not sure whether their pre-selection will winnow the data down enough to make it useful in a timely manner.

Fermilab races for the Higgs

Of course, while the LHC is getting warmed up, the Tevatron is pumping out data. One of the scientists associated with the D0 detector presented preliminary results in their search for the Higgs boson. The actual conclusion is that we haven't found it yet. However, the good folk at the Tevatron are pinning their hopes on a small energy range where the data is consistently—but not statistically significantly—to one side of the average.

Basically, before a signal crawls out of the noise, there is often a run of data that hints at something real but statistically still resides within the noise. This is what the D0 collaboration thinks it has found. If they are right, more time and more data should strengthen the signal, and the Higgs will emerge, blinking, into the light of day.

Unfortunately, as the D0 people are aware, runs like this also occur naturally in pure noise. In any case, by the time the LHC has gone through re-finding the entire list of particles we know about and is ready to start searching for the Higgs boson in earnest, the Tevatron may have already found it.
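To see why the D0 folks are being cautious, here is a toy sketch (in no way their actual analysis) of how often pure noise produces a run of bins that sits consistently above the average—exactly the kind of "excess" that can masquerade as an emerging signal:

```python
import random

random.seed(42)

def longest_run_above_mean(values):
    """Length of the longest consecutive run of values above the mean."""
    mean = sum(values) / len(values)
    best = current = 0
    for v in values:
        current = current + 1 if v > mean else 0
        best = max(best, current)
    return best

# 1000 pseudo-experiments, each with 50 bins of pure Gaussian noise
# and no signal anywhere.
runs = [longest_run_above_mean([random.gauss(0, 1) for _ in range(50)])
        for _ in range(1000)]

# Count how often noise alone yields a 5-bin run above the average.
frac = sum(r >= 5 for r in runs) / len(runs)
print(f"fraction of noise-only experiments with a 5-bin 'excess': {frac:.2f}")
```

Even with no signal at all, multi-bin excesses turn up in a substantial fraction of pseudo-experiments, which is why only more data can settle the question.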

Software brings theoreticians closer to results

Possibly the most fun and most informative talk was an intriguingly titled presentation called "From brains to detectors." In a very imaginative nod to the TV series 24, Jack Bauer is recast as a theoretical physicist with an idea, and his mission is to test that idea against data in 24 hours. The presenter then used this format to compare the traditional approach with some new software developments.

Traditionally, Jack spends an awful lot of time writing his own custom code, and perhaps after two years of coding, he can test his idea against data. The first step is that Jack uses the Standard Model to extract what is called the Lagrangian, which is basically the energy landscape of an interaction. From this, Jack can generate a list of particles and masses. The Lagrangian, which is the actual physics, is a few hours' work, while the particle list takes a few days to generate. But the particles on the list are typically not the ones directly detected by instruments.

In his next move, Jack simulates the behavior of his list of particles. He uses this simulation to track what each particle will decay to and with what probability. From this, he can figure out what sorts of particle combinations would indicate he's on the right track. Unfortunately, this still isn't good enough. Instruments don't behave as ideal detectors and the details of the beams influence the rate at which certain particles are produced.

Jack can't get around this problem without the aid of an experimental physicist—if he really is a theoretician, he hasn't been on speaking terms with experimental physicists for years. After hunting down an experimental physicist, they put together a simulation that includes Jack's model and his experimentalist friend's detectors. This model allows them to figure out how to filter the data appropriately to look for the set of events that are important to Jack. Now, finally, he has a prediction of what the experimental data should look like and can compare that with actual experimental data.

Normally this process takes at least two years, because everyone writes their own code. However, now there is a tool chain that takes the Lagrangian—this is the physically important bit—and generates the data for every step in the process just described. Even better, apart from the first step, which generates the initial list of particles, every step in the chain has multiple tools that should generate the same results. This allows researchers to compare predictions and help detect bugs in each tool.
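The bug-detection idea is the same one used in software cross-validation generally: run independent implementations on the same input and flag any disagreement. A minimal sketch (the function name and tolerance are my own, not from the toolchain) might look like:

```python
def cross_check(pred_a, pred_b, rel_tol=1e-6):
    """Compare two tools' predictions bin by bin; return indices of
    bins where they disagree beyond a relative tolerance."""
    assert len(pred_a) == len(pred_b), "tools must predict the same bins"
    bad = []
    for i, (a, b) in enumerate(zip(pred_a, pred_b)):
        scale = max(abs(a), abs(b), 1e-12)   # avoid division by zero
        if abs(a - b) / scale > rel_tol:
            bad.append(i)
    return bad

# Identical predictions pass; a discrepancy in one bin is flagged.
print(cross_check([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))   # []
print(cross_check([1.0, 2.0, 3.0], [1.0, 2.5, 3.0]))   # [1]
```

A flagged bin doesn't say which tool is wrong, only that at least one of them is—which is exactly why having multiple independent tools for each step matters.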

These tools are open source, so should an important bug be found—an important bug being one that changes the physics—it can be fixed very quickly. What's more, many of the tools can be operated via a Web interface, so they are really easy to use.

But the most fantastic part of this is that, using these tools, it only takes Jack 24 hours—did you expect any other time?—to go from the Lagrangian to being able to compare a theoretical prediction to data. It was a fun presentation that advertised a series of really important developments, ones that will help physicists cope with the huge amount of data expected to come out of the LHC.

Chris Lee / Chris writes for Ars Technica's science section. A physicist by day and science writer by night, he specializes in quantum physics and optics. He lives and works in Eindhoven, the Netherlands.