Tag Archives: LHCb

This summer there was a blog post from Sabine Hossenfelder claiming that “The LHC ‘nightmare scenario’ has come true” — implying that the Large Hadron Collider [LHC] has found nothing but a Standard Model Higgs particle (the simplest possible type), and will find nothing more of great importance. With all due respect for the considerable intelligence and technical ability of the author of that post, I could not disagree more; not only are we not in a nightmare, it isn’t even night-time yet, and hardly time for sleep or even daydreaming. There’s a tremendous amount of work to do, and there may be many hidden discoveries yet to be made, lurking in existing LHC data. Or elsewhere.

I can defend this claim (and have done so as recently as this month; here are my slides). But there’s evidence from another quarter that it is far too early for such pessimism. It has appeared in a new paper (a preprint, so not yet peer-reviewed) by an experimentalist named Arno Heister, who is evaluating 20-year-old data from the experiment known as ALEPH.

In the early 1990s the Large Electron-Positron (LEP) collider at CERN, in the same tunnel that now houses the LHC, produced nearly 4 million Z particles at the center of ALEPH; the Z’s decayed immediately into other particles, and ALEPH was used to observe those decays. Of course the data was studied in great detail, and you might think there couldn’t possibly be anything still left to find in that data, after over 20 years. But a hidden gem wouldn’t surprise those of us who have worked in this subject for a long time — especially those of us who have worked on hidden valleys. (Hidden Valleys are theories with a set of new forces and low-mass particles, which, because they aren’t affected by the known forces excepting gravity, interact very weakly with the known particles. They are also often called “dark sectors” if they have something to do with dark matter.)

For some reason most experimenters in particle physics don’t tend to look for things just because they can; they stick to signals that theorists have already predicted. Since hidden valleys only hit the market in a 2006 paper I wrote with then-student Kathryn Zurek, long after the experimenters at ALEPH had moved on to other experiments, nobody went back to look in ALEPH or other LEP data for hidden valley phenomena (with one exception). I didn’t expect anyone to ever do so; it’s a lot of work to dig up and recommission old computer files.

This wouldn’t have been a problem if the big LHC experiments (ATLAS, CMS and LHCb) had looked extensively for the sorts of particles expected in hidden valleys. ATLAS and CMS especially have many advantages; for instance, the LHC has made over a hundred times more Z particles than LEP ever did. But despite specific proposals for what to look for (and a decade of pleading), only a few limited searches have been carried out, mostly for very long-lived particles, for particles with mass of a few GeV/c² or less, and for particles produced in unexpected Higgs decays. And that means that, yes, hidden physics could certainly still be found in old ALEPH data, and in other old experiments. Kudos to Dr. Heister for taking a look.

Greetings from Geneva, and CERN, the laboratory that hosts the Large Hadron Collider [LHC], where the Higgs particle was found by the physicists at the ATLAS and CMS experiments. Between jet lag, preparing a talk for Wednesday, and talking to many experimental and theoretical particle physicists from morning til night, it will be a pretty exhausting week.

The initial purpose of this trip is to participate in a conference held by the LHCb experiment, entitled “Implications of LHCb measurements and future prospects.” Its goal is to bring theoretical particle physicists and LHCb experimenters together, to exchange information about what has been and what can be measured at LHCb.

On this website I’ve mostly written about ATLAS and CMS, partly because LHCb’s measurements are often quite subtle to explain, and partly because the Higgs particle search, the highlight of the early stage of the LHC, was really ATLAS’s and CMS’s task. But this week’s activities give me a nice opportunity to put the focus on this very interesting experiment, which is quite different from ATLAS and CMS both in its design and in its goals, and to explain its important role.

ATLAS and CMS were built as general purpose detectors, whose first goal was to find the Higgs particle and whose second was to find (potentially rare) signs of any other high-energy processes that are not predicted by the Standard Model, the equations we use to describe all the known particles and forces of nature. Crudely speaking, ATLAS and CMS are ideal for looking for new phenomena in the 100 to 5000 GeV energy range (though we won’t reach the upper end of the range until 2015 and beyond).

LHCb, by contrast, was built to study in great detail the bottom and charm quarks, and the hadrons (particles made from quarks, anti-quarks and gluons) that contain them. These quarks and their antiquarks are produced in enormous abundance at the LHC. They and the hadrons that contain them have masses in the 1.5 to 10 GeV/c² range… not much heavier than protons, and much lower than what ATLAS and CMS are geared to study. And this is why LHCb has been making crucial high-precision tests of the Standard Model using bottom- and charm-containing hadrons. (Crucial, but not, despite repeated claims by the LHCb press office, capable of ruling out supersymmetry, which no single measurement can possibly do.)

Although this is the rough division of labor among these experiments, it’s too simplistic to describe the experiments this way. ATLAS and CMS can do quite a lot of physics at the low mass range, and in some measurements can compete well with LHCb. Less well-known is that LHCb may be able to do a small but critical set of measurements involving higher energies than their usual target.

LHCb is very different from ATLAS and CMS in many ways, and the most obvious is its shape. ATLAS and CMS look like giant barrels centered on the location of the proton-proton collisions, and are designed to measure as many particles as possible that are produced in the collision of two protons. LHCb’s shape is more like a wedge, with one end surrounding the collision point.

Left: Cut-away drawing of CMS, which is shaped like a barrel with proton-proton collisions occurring at its center. ATLAS’s shape is similar. Right: Cut-away drawing of LHCb, which is shaped something like a wedge, with collisions occurring at one end.

This shape only allows it to measure those particles that go in the “forward” direction — close to the direction of one of the proton beams. (“Backward” would be near the other beam; the distinction between forward and backward is arbitrary, because the two proton beams have the same properties. “Central” would be far from either beam.) Unlike ATLAS and CMS, LHCb is not used to reconstruct the whole collision; many of the particles produced in the collision go into backward or central regions which LHCb can’t observe. This has some disadvantages, and in particular it put LHCb out of the running for the Higgs discovery. But a significant fraction of the bottom and charm quarks produced in proton-proton collisions go “forward” or “backward”, so a forward-looking design is fine if it’s bottom and charm quarks you’re interested in. And such a design is a lot cheaper, too. It also means that LHCb is well positioned to make some other measurements where the forward direction is important. I’ll give you one or two examples later in the week.
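For readers who like to make “forward” quantitative: physicists usually describe the angle from the beam with pseudorapidity, η = −ln tan(θ/2), where θ is measured from the beam axis; LHCb’s coverage is roughly 2 < η < 5. A minimal sketch in Python (the specific angles below are my own illustrative choices, not taken from this post):

```python
import math

def pseudorapidity(theta_rad):
    """eta = -ln(tan(theta/2)), where theta is the angle from the proton beam axis."""
    return -math.log(math.tan(theta_rad / 2.0))

# A particle emerging 15 degrees from the beam is "forward" (eta near 2,
# roughly the edge of LHCb's coverage); one at 90 degrees is "central"
# (eta = 0, well outside LHCb's coverage).
eta_forward = pseudorapidity(math.radians(15))
eta_central = pseudorapidity(math.radians(90))
```

Large η means close to the beam; η = 0 means perpendicular to it, the region ATLAS and CMS cover best.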

To make their measurements of bottom and charm quarks, LHCb makes use of the fact that these quarks decay after about a trillionth of a second (a picosecond) [or longer if, as is commonly the case, there is significant time dilation due to Einstein’s relativity effects on very fast particles]. This is long enough for them to travel a measurable distance — typically a millimeter or more. LHCb is designed to make measurements of charged particles with terrific precision, allowing them to infer a slight difference between the proton-proton collision point, from which most low-energy charged particles will emerge, and the location where some other charged particles may have been produced in the decay of a bottom hadron or some other particle that travels a millimeter or more before decaying. The ability to do precision “tracking” of the charged particles makes LHCb sensitive to the presence of any as-yet unknown particles that might be produced and then decay after traveling a small or moderate distance. More on that later in the week.
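To see why a millimeter or more is the right ballpark: the mean decay distance is L = γβcτ, where the boost factor γβ = p/(mc) grows with the particle’s momentum (that’s the time-dilation effect mentioned above). A quick sketch, using illustrative B-meson numbers (mass about 5.3 GeV/c², lifetime about 1.5 ps, momentum 50 GeV/c) that are my own assumptions rather than figures from this post:

```python
# Mean decay length L = (gamma * beta) * c * tau, with gamma * beta = p / (m c).
C_MM_PER_PS = 0.2998  # speed of light, in millimeters per picosecond

def decay_length_mm(p_gev, m_gev, tau_ps):
    """Mean distance (mm) a particle travels before decaying."""
    gamma_beta = p_gev / m_gev  # relativistic boost factor, p/(m c) in natural units
    return gamma_beta * C_MM_PER_PS * tau_ps

# A hadron containing a bottom quark, carrying 50 GeV/c of momentum:
L = decay_length_mm(p_gev=50.0, m_gev=5.3, tau_ps=1.5)  # a few millimeters
```

Without the boost, cτ alone would be under half a millimeter; time dilation stretches that to several millimeters for a fast-moving hadron, which is comfortably within reach of LHCb’s precision tracking.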

A computer reconstruction of the tracks in a proton-proton collision, as measured by LHCb. Most tracks start at the proton-proton collision point at left, but the two tracks drawn in purple emerge from a different point about 15 millimeters away, the apparent location of the decay of a hadron, whose inferred trajectory is the blue line, and whose mass (measured from the purple tracks) indicates that it contained a bottom quark.

One other thing to know about LHCb: in order to make their precise measurements possible, and to deal with the fact that they don’t observe a whole collision, they can’t afford to have too many collisions going on at once. ATLAS and CMS have been coping with ten to twenty simultaneous proton-proton collisions; this is part of what is known as “pile-up”. But near LHCb the LHC beams are adjusted so that the number of collisions at LHCb is often limited to just one or two or three simultaneous collisions. This has the downside that the amount of data LHCb collected in 2011 was about 1/5 of what ATLAS and CMS each collected, while for 2012 the number was more like 1/10. But LHCb can do a number of things to make up for this lower rate; in particular their trigger system is more forgiving than that of ATLAS or CMS, so there are certain things they can measure using data of a sort that ATLAS and CMS have no choice but to throw away.

Over the weekend, someone said to me, breathlessly, that they’d read that “Results from the Large Hadron Collider [LHC] have blown string theory out of the water.”

Good Heavens! I replied. Who fed you that line of rubbish?!

Well, I’m not sure how this silliness got started, but it’s completely wrong. Just in case some of you or your friends have heard the same thing, let me explain why it’s wrong.

First, a distinction — one that is rarely made, especially by the more rabid bloggers, both those who are string lovers and those who are string haters. [Both types mystify me.] String theory has several applications, and you need to keep them straight. Let me mention two.

Application number 1: this is the one you’ve heard about. String theory is a candidate (and only a candidate) for a “theory of everything” — a silly term, if you ask me, for what it really means is “a theory of all of nature’s particles, forces and space-time”. It’s not a theory of genetics or a theory of cooking or a theory of how to write a good blog post. But it’s still a pretty cool thing. This is the theory (i.e. a set of consistent equations and methods that describes relativistic quantum strings) that’s supposed to explain quantum gravity and all of particle physics, and if it succeeded, that would be fantastic.

Application number 2: String theory can serve as a tool. You can use its mathematics, and/or the physical insights that you can gain by thinking about and calculating how strings behave, to solve or partially solve problems in other subjects. (Here’s an example.) These subjects include quantum field theory and advanced mathematics, and if you work in these areas, you may really not care much about application number 1. Even if application number 1 were ruled out by data, we’d still continue to use string theory as a tool. Consider this: if you grew up learning that a hammer was a religious idol to be worshipped, and later you decided you didn’t believe that anymore, would you throw out all your hammers? No. They’re still useful even if you don’t worship them.

BUT: today we are talking about Application Number 1: string theory as a candidate theory of all particles, etc.

In my last post, I promised you some comments on a couple of other news stories you may have seen. Promise kept! see below.

But before I go there, I should mention (after questions from readers) an important distinction. Wednesday’s post was about the simple process by which a Bs meson (a hadron containing a bottom quark and a strange anti-quark, or vice versa, along with the usual crowd of gluons and quark/antiquark pairs) decays to a muon and an anti-muon. The data currently shows nothing out of the ordinary there. This is not to be confused with another story, loosely related but with crucially different details. There are some apparent discrepancies (as much as 3.7 standard deviations, but only 2.8 after accounting for the look-elsewhere effect) cropping up in details of the intricate process by which a Bd meson (a hadron containing a bottom quark and a down antiquark, or vice versa, plus the usual crowd) decays to a muon, an anti-muon, and a spin-one Kaon (a hadron containing a strange quark and a down anti-quark, or vice versa, plus the usual crowd). The measurements made by the LHCb experiment at the Large Hadron Collider disagree, in some but not all features, with the (technically difficult) predictions made using the Standard Model (the equations used to describe the known particles and forces).
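To put those significance figures in perspective: an n-sigma excess corresponds to a one-sided Gaussian tail probability of ½·erfc(n/√2). Here is a minimal sketch (standard statistics, nothing specific to this measurement; the “one in N” comments are my own arithmetic):

```python
import math

def p_value(n_sigma):
    """One-sided Gaussian tail probability for an n-sigma excess."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

p_local = p_value(3.7)   # before the look-elsewhere correction: roughly 1 in 9000
p_global = p_value(2.8)  # after the correction: roughly 1 in 400
```

A “1 in 400” fluctuation sounds unlikely, but with hundreds of measurements being made across the LHC experiments, a few such discrepancies are expected every year by chance alone, which is part of why caution is warranted.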

Don’t confuse these two processes! (Top) The process B_s –> muon + anti-muon, covered in Wednesday’s post, agrees with Standard Model predictions. (Bottom) The process B_d –> muon + anti-muon + K* is claimed to deviate by nearly 3 standard deviations from the Standard Model, but (as far as I am aware) the prediction and associated claim has not yet been verified by multiple groups of people, nor has the measurement been repeated.

A few theorists have even gone so far as to claim this discrepancy is clearly a new phenomenon — the end of the Standard Model’s hegemony — and have gotten some press people to write (very poorly and inaccurately) about their claim. Well, aside from the fact that every year we see several 3 standard deviation discrepancies turn out to be nothing, let’s remember to be cautious when a few scientists try to convince journalists before they’ve convinced their colleagues… (remember this example that went nowhere? …) And in this case we have them serving as judge and jury as well as press office: these same theorists did the calculation which disagrees with the data. So maybe the Standard Model is wrong, or maybe their calculation is wrong. In any case, you certainly mustn’t believe the news article as currently written, because it has so many misleading statements and overstatements as to be completely beyond repair. [For one thing, it’s a case study in how to misuse the word “prove”.] I’ll try to get you the real story, but I have to study the data and the various Standard Model predictions more carefully first before I can do that with complete confidence.

Ok, back to the promised comments: on twists and turns for neutrinos and for muons…

The American Physical Society’s Division of Particles and Fields is pursuing a long-term planning exercise for the high-energy physics community. Its goal is to develop the community’s long-term physics aspirations. Its narrative will communicate the opportunities for discovery in high-energy physics to the broader scientific community and to the government.

They are doing so in perhaps the worst of times, when political attacks on science are growing, government cuts to science research are severe, budgets to fund the research programs of particle physicists like me have been chopped by jaw-dropping amounts (think 25% or worse, from last year’s budget to this year’s — you can thank the sequester)… and all this at a moment when the data from the Large Hadron Collider and other experiments are not yet able to point us in an obvious direction for our future research program. Intelligent particle physicists disagree on what to do next, there’s no easy way to come to consensus, and in any case Congress is likely to ignore anything we suggest. But at least I hear Minneapolis is lovely in July and August! This is the first Snowmass workshop that I have missed in a very long time, especially embarrassing since my Ph.D. thesis advisor is one of the conveners. What can I say? I wish my colleagues well…!

Meanwhile, I’d like to comment briefly on a few particle physics stories that you’ve perhaps seen in the press over recent days. I’ll cover one of them today — a measurement of a rare process which has now been officially “discovered”, though evidence for it was quite strong already last fall — and address a couple of others later in the week. After that I’ll tell you about a couple of other stories that haven’t made the popular press…

Today I’m attending the first day of a short workshop of particle theorists and experimentalists at the Princeton Center for Theoretical Science, a sort of “Where are we now and where are we going?” meeting. It’s entitled “Higgs Physics After Discovery”, but discussion will surely range more widely.

What, indeed, are the big questions facing particle physics in the short-term, meaning the next few months? Well, here are a few key ones:

A Higgs particle of some type has been discovered by the ATLAS and CMS experiments at the Large Hadron Collider [LHC] (with some contributions from the Tevatron experiments DZero and CDF); is it the simplest possible type of Higgs particle (the “Standard Model Higgs“) or is it more complex? What data analysis can be done on the LHC’s data from 2011-2012 to shed more light on this question?

More generally, from the LHC’s huge data set from 2011-2012 — specifically, from the data analysis that has been done so far — what precisely have we learned? (It’s increasingly important to go beyond the rougher estimates that were appropriate last year when the data was still pouring in.) What types of new phenomena have been excluded, and to what extent?

What other types of data analysis should be done on the 2011-2012 data, in order to look for other new phenomena that could still be lurking there? (There’s still a lot to be done on this question!) And what types of work should theoretical particle physicists do to help the experimentalists address this issue?

Several experiments from the Tevatron and the LHC, notably the LHCb experiment, have learned that newly measured decays of certain mesons (hadrons with equal numbers of quarks and anti-quarks) that contain heavy quarks are roughly consistent with the Standard Model (the equations we use to describe the known elementary particles and forces, and the simplest type of Higgs field and Higgs particle). How do these findings constrain the possibility of other new phenomena?

Looking ahead to 2015, when the LHC will begin running again at a higher energy per proton-proton collision, what preparations need to be made? Especially, what needs to be done to refine the triggering systems at ATLAS, CMS and LHCb, so that the maximum information can be extracted from the new data, and no important information is unnecessarily discarded?

Which, if any, of the multiple (but mostly mutually inconsistent) experimental hints of dark matter should be taken seriously? Which possibilities do the various dark matter experiments, and the LHC’s data, actually exclude or favor?

That might be it for the very near term. There are lots of other questions in the medium- to long-term, among which is the big question of what types of experiments should be done over the next 10 – 20 years. One challenge is that the LHC’s data hasn’t yet given us a clear target other than the Higgs particle itself. An obvious possible experiment to do is to study the Higgs in more detail, using an electron/anti-electron collider — historically this has been a successful strategy that has been used on almost every new apparently-elementary particle. But there are a lot of other possibilities, including raising the LHC’s collisions to even higher energy than we’ll see in 2015, using more powerful magnets currently under development.

If there are other near-term questions I’ve forgotten about, I’m sure I’ll be reminded at the workshop, and I’ll add them in.

By almost all measures, the Higgs Symposium at the University of Edinburgh, as part of the new Higgs Centre for Theoretical Physics, was a great success. The only negative was that Professor Peter Higgs himself had a bad cold this week, and had to cancel his talk, as well as missing the majority of the talks by others. Obviously all of us in attendance were very disappointed not to hear directly from him, and we wish him a speedy recovery.

Other than this big hole in the schedule, the talks given at the symposium seemed to me to form a coherent summary of where we are right now in our understanding of the Higgs field and particle. They were full of interesting material, and wonderfully complementary to one another. This motivates me to try to provide, for non-experts, some future articles on what the conference attendees had to say. But to write such articles well takes time. So for now, here’s the quick version summarizing the last few talks, along the lines of the summaries I wrote (here and here) of the earlier talks. The slides from all the talks are posted here.

A lot of people do put a lot of stock in prophecy, including prophecies of the end of the world that nobody ever made (such as the one not made for today by the Mayans, through their calendar) and others that people made but were wrong (such as those made by Harold Camping last year and by many throughout history who preceded him.) If anyone were any good at prophecy they’d be able to use their special knowledge to become billionaires, so maybe we should be watching Bill Gates and Michael Bloomberg and the Koch brothers and people like that. I haven’t heard any rumors of them building bunkers or spaceships yet. Of course at the end of the year they may get a small tax hike, but that wouldn’t be the end of the world.

The Large Hadron Collider [LHC], meanwhile, has triumphantly reached the end of its first run of proton-proton collisions. Goal #1 of the LHC was to allow physicists at the ATLAS and CMS experiments to discover the Higgs particle, or particles, or whatever took their place in nature; and it would appear that, in a smashing success, they have co-discovered one. But no Higgs particles, or anything like them, will be produced again until 2015. Although the LHC will run for a short while in early 2013, it will do so in a different mode, smashing not protons but the nuclei of lead atoms together, in order to study the properties of extremely hot and dense matter, under conditions the universe hasn’t seen since the earliest stages of the Big Bang that launched the current era of our universe. Then it will be closed down for repairs and upgrades. So until 2015, any additional information we’re going to learn about the Higgs particle, or any other unknown particle that might have been produced at the LHC, is going to be obtained by analyzing the data that has been collected in 2011 and 2012. The total amount of data is huge; what was collected in 2012 was about 4.5 times as much as in 2011, and it was taken at 8 TeV of energy per proton-proton collision rather than 7 TeV as in 2011. I can assure you there will be many new things learned from analyzing that data throughout 2013 and 2014.

Of course a lot of people prophesied confidently that we’d discover supersymmetry, or something else dramatic, very early on at the LHC. Boy, were they wrong! Those of us who were cautioning against such optimistic statements are not sure whether to laugh or cry, because of course it would have been great to have such a discovery early in the LHC program. But there was ample reason to believe (despite what other bloggers sometimes say) that even if supersymmetry exists and is accessible to the LHC experiments, discovering it could take a lot longer than just two years! For instance, see this paper written in 2006 pointing out that the search strategies being planned for seeking supersymmetry might fail in the presence of a few extra lightweight particles not predicted in the minimal variants of supersymmetry. As far as I can tell at present, this very big loophole has only partly been closed by the LHC studies done up to now. The same loophole applies for other speculative ideas, including certain variants of LHC-accessible extra dimensions. I am hopeful that these loopholes can be closed in 2013 and 2014, with additional analysis on the current data, but until they are, you should be very cautious believing those who claim that reasonable variants of LHC-accessible supersymmetry (meaning “natural variants of supersymmetry that resolve the hierarchy problem”) are ruled out by the LHC experiments. It’s just not true. Not yet. The only classes of theories that have been almost thoroughly ruled out by LHC data are those that predict on general grounds that there should be no observable Higgs particle at all (e.g. classic technicolor).

Now, the prophecy I’d like to make, but cannot — because I do not have any special insight into the answer — is on the question of whether the LHC will make great new discoveries in the future, or whether the LHC has already made its last discovery: a Higgs particle of Standard Model type. Either way, we will need years of data from the LHC in order to distinguish these two possibilities; there’s no way for us to guess in advance. It’s clear that Nature’s holding secrets from us. We know the Standard Model (the equations we use to describe all the known particles and forces) is not a complete theory of nature, because it doesn’t explain things like dark matter (hey, were dark matter particles perhaps discovered in 2012?), and it doesn’t tell us why, for example, there are six types of quarks, or why the heaviest quark has a mass that is more than 10,000 times larger than the mass of the lightest quarks, etc. What we don’t know is whether the answers to those secrets are accessible to the LHC; does it have enough energy per collision, and enough collisions, for the job? The only way to find out is to run the LHC, and to dig thoroughly through its data for any sign of anything amiss with the predictions of the Standard Model. This is very hard work, and it will take the rest of the decade (but not until the end of the world.)

In the meantime, please do not fret about the quiet in the tunnel outside Geneva, Switzerland. The LHC will be back, bigger and better (well, at least with more energy per collision) in 2015. And while we wait during the two year shutdown, the experimentalists at ATLAS, CMS, and LHCb will be hard at work, producing many new results from the 2011 and 2012 proton collision data! Even the experiments CDF and DZero from the terminated Tevatron are still writing new papers. In short, fear not: not only isn’t the December solstice of 2012 the end of the world, it doesn’t even signal a temporary stop to the news about the Higgs particle!
