A new result from the BaBar experiment also means trouble for the main alternative, supersymmetry.

The Standard Model (SM) of particles and interactions provides a successful description of most of the matter we know of. However, physicists have known for many years that it is not complete: the SM predicts massless neutrinos and has no place for dark matter. A new result from the BaBar experiment at the Stanford Linear Accelerator Center (SLAC) could possibly provide another problem for the SM—and would place severe constraints on a popular alternative theory, supersymmetry (SUSY).

As described in a new paper in Physical Review Letters, the BaBar collaboration measured a decay process of the bottom (b) quark, the second-heaviest such particle. This decay process produces leptons, the class of particles including electrons, neutrinos, muons (a common product of cosmic rays), and taus. The latest BaBar results indicate more taus were produced than the SM predicted. However, the results were also inconsistent with the predictions of the simplest form of SUSY. While the uncertainties on these results are still large, they are similar to earlier data from the Belle Collaboration in Japan.

Although SLAC's main accelerator is no longer used for collisions, it can inject electrons and positrons into two storage rings that cross paths at the BaBar detector. The energies of these collisions are tuned to produce B mesons, which contain a bottom quark.

The BaBar experiment was designed for b quark physics. (The name even refers to b quarks and their antiquarks, written as b̄ and pronounced "B-bar.") The current results focused on a particular decay of a B meson into a D or D* meson, a charged lepton, and a neutrino.

Muons and taus (also known as the tauon or tau lepton) are leptons with the same electric charge as electrons, but are much more massive: the muon is about 200 times the mass of an electron, while the tauon has about 3500 times the electron's mass. (Neutrinos, on the other hand, are neutral leptons with very tiny masses.) While muons and electrons are common decay products in particle experiments, taus are produced only rarely, and they quickly decay into their lighter lepton cousins.

Even though taus are rare, when they are produced in the decays of particles containing b quarks, they provide sensitive probes of those decays, and thus can be used to test theories on the frontiers of the SM and its alternatives.

According to current theories, ordinary matter (as opposed to dark matter) falls into two categories: leptons, which are fundamental, and hadrons, which are made of quarks. Hadrons made of two quarks (actually a quark and an antiquark) are the mesons, while those with three quarks are the baryons. Protons and neutrons are baryons, and their quarks are the stable "up" (u) and "down" (d) varieties.

There are three generations of both leptons and quarks. The lightest generation includes the electron along with the u and d quarks; the heaviest generation contains the tau, the b quark, and the top (t) quark. In between lies the generation with the muon, the strange quark, and the charm quark. Particle physics is trippy.

The SM predicts the frequency at which charged leptons produced in this decay process will be taus: roughly 20 percent for the decays that produce a D meson, and 23 percent for the D* mode. In contrast, the researchers at BaBar found 31 percent taus for the D and 25 percent for the D*—a significant excess. One possible way to explain this difference is to introduce an additional Higgs boson beyond the one predicted by the SM (which we've only just found). A common version of SUSY introduces four Higgs particles, one of which could produce the extra tauons seen in an experiment like this. However, these particular BaBar data did not agree with the predictions of the SUSY model, either.
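As a back-of-the-envelope illustration of how such an excess is quantified, the sketch below computes the "pull" (deviation in standard deviations) between a measured tau fraction and its prediction. The uncertainty values here are placeholders chosen for illustration, not BaBar's published figures.

```python
# Illustrative significance check: compare a measured tau fraction against
# a prediction, given an assumed combined uncertainty.

def pull(measured, predicted, sigma):
    """Number of standard deviations between measurement and prediction."""
    return (measured - predicted) / sigma

# Tau fractions quoted in the article (SM prediction vs. BaBar measurement).
# The sigma values below are hypothetical, purely for demonstration.
d_pull = pull(0.31, 0.20, 0.04)      # D mode, assumed sigma = 0.04
dstar_pull = pull(0.25, 0.23, 0.01)  # D* mode, assumed sigma = 0.01

print(f"D mode deviation:  {d_pull:.1f} sigma")
print(f"D* mode deviation: {dstar_pull:.1f} sigma")
```

The real analysis combines both channels with correlated systematic uncertainties, which is why the quoted overall significance differs from any single-channel pull.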

The BaBar results were in the range of three standard deviations, or "3 σ," which in particle physics means they are significant, but not yet definite. Thus, it's too soon to sign the death certificate for the popular version of SUSY. (The SM, of course, is already known to be incomplete, so its overthrow by these data isn't anything radical.) We'll have to watch future experiments carefully—not least since superstring theory, which currently lies well beyond the realm of experimental tests, depends upon SUSY.
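For readers unfamiliar with sigma-speak: a significance level maps onto a Gaussian tail probability, which is why 3 σ is "interesting" while 5 σ is "discovery." A quick sketch of that conversion, using only the standard library:

```python
import math

def one_sided_p(n_sigma):
    """One-sided tail probability of a Gaussian beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# 3 sigma -> roughly 1 in 740; 5 sigma -> roughly 1 in 3.5 million.
for n in (3, 5):
    print(f"{n} sigma -> p = {one_sided_p(n):.2e}")
```

The small 5 σ threshold exists partly to absorb look-elsewhere effects, a point raised later in this thread.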

Once more, with feeling: SUSY is a symmetry (a super one!), not a theory, at least not in the sense that the standard model is a theory. It is a component of a wide range of theories. BaBar certainly didn't exclude SUSY as a symmetry in nature -- what particular theory did they constrain? The MSSM? The NMSSM? Something else?

Seems like Higgs particles are popping up everywhere. Since it seems as if the only way to "see" a Higgs (or any other high energy particle) is to look at its decay products, what's preventing an observation of an actual decay of a Higgs?

In general, seems as if all of the investigations are more of a forensic nature, ie after the fact. The data seems to be second or third generations of decay after the initial particle whatever it is. What's preventing actually observing the primary event itself? Even the machine at CERN is a forensic/historical tool.

I know little about high energy physics, so... If we determine there are more heavy particles than we thought, would this go towards accounting for all the "missing mass" in the universe?

No. What's happening is that they're looking at processes that can result in any one of several outcomes happening and that one of them (tau production) is happening significantly more often than expected.

Due to look-elsewhere effects these finds amount to data fishing, and you need 5 sigma to make an observation, I take it. 3-sigma effects come and go, and I'm frankly not sure that they merit an article. But the general area does, so it's still gud for more than the lolz.

By the same token, supersymmetry models won't be complete either. They don't predict neutrino masses, I think, and in any case I hear they need fine-tuning to better than 1%. You need more particle sectors or eternal inflation to complete this.

Conversely, eternal inflation needs a supersymmetry sector to have string theory to complete a basis for the landscape. That is why eternally inflated lolcats seem to want this. Or is it "HEPcats" (High Energy Physics)?

Quote:

There are three generations of both leptons and quarks. ... Particle physics is trippy.

Possibly no more. A recent supernova model showed that a usually ignored low-rate reflection of neutrinos has a huge effect on element production, due to the neutrino flavor change of reflected neutrinos affecting the neutrino shell.

So to get the observed nucleosynthesis rates we need these flavor generations. It may test anthropic theory of eternal inflation but the jury is out on that. Other types of elementary particle physics may have a harder go at predicting generations for all I know.

So the model of our Universe is about to bust? Well, I have to say it was a good run, despite the flaws this one had. I still wish I could bend physics and take the shape of other humans for a peek in dressing rooms of the opposite sex and such entertaining things, but hey, we can't be winners in all respects. Humanity achieved quite a few things, ranging from manipulating sound waves for musical masterpieces to drawing paintings for exciting arrangements of photons to reach our minds.

I'm going to pack my stuff now and I thank you all for the ride! See you soon in another universe!

If we determine there are more heavy particles than we thought, would this go towards accounting for all the "missing mass" in the universe?

No, this would make the problem more acute.

It is not missing mass anymore; it has been observed as dark matter for over a decade now, I believe. The properties of dark matter are definitely different from those of baryons. Among known particles, only neutrinos interact that weakly electromagnetically, and they are mostly too energetic (hot) to account for the observed cold dark matter.

Seems like Higgs particles are popping up everywhere. Since it seems as if the only way to "see" a Higgs (or any other high energy particle) is to look at its decay products, what's preventing an observation of an actual decay of a Higgs?

In general, seems as if all of the investigations are more of a forensic nature, ie after the fact. The data seems to be second or third generations of decay after the initial particle whatever it is. What's preventing actually observing the primary event itself? Even the machine at CERN is a forensic/historical tool.

No question is stupid unless witlessly put. =D

The reason is basically the same reason why we never observe systems directly but by probes: we can't possibly *be* the system, and even if we could, we wouldn't make direct observations of our parts.

We can observe a system's interactions after the fact by looking at probes (often visible light in our daily life). By honing experiments and theory, i.e. making good constraints, we can often make unambiguous observations and so know what is going on.

By relativity all our observations are "forensic", "historical", "indirect", yadda yadda, taking place outside and after the local event volume by analyzing the probes. They are still the same observations, so no problem there; the problem is in trying to make a qualitative distinction when there is no ground for it. "Forensic" is a legal technical distinction and "history" is a science technical distinction, so they are good for what they denote there. "Direct" has no good, testable definition that I know of; it is unfounded handwaving.

The problems with high energy particle observations are too numerous to describe, and I don't know them well. But it starts out simply enough: high energy particles are short-lived and often ejected in high energy jets. Those are directed at small angles along the beam lines and so travel a long distance until hitting the detectors. Combined with the short lifetime, it is a killer for observations in the early stages of the jets.

It would be nice to include a graph about the Standard Model (such as the one on Wikipedia: http://en.wikipedia.org/wiki/File:Stand ... ticles.svg) to make it easier to get an overview of the topic at hand. Personally I find that it quickly becomes a jumble of similar sounding terms otherwise. (The insert topic on SM was nice though.)

A suggestion for a future topic could be to collect a bunch of Youtube videos explaining some of these things. (Eg I've found quite a few interesting ones at minutephysics: http://www.youtube.com/user/minutephysics.)

Heck, I'd like to see an article that contained an errata for everything I learned as a kid. I'm thinking there is quite a lot of knowledge to throw out. :-)

I have been remarking since proton sigma-r experiments in the 90's that the Standard Model is the next Kuhnian paradigm shift. There are so many physicists so beholden to it, that they would rather bolt on another pile of variables than put in any effort at replacing it. What's it got now? 15 variables?

It's not elegant enough to be true. Either we need new math that makes it pretty or we need something else.

It's not elegant enough to be true. Either we need new math that makes it pretty or we need something else.

There are problems with the standard model; this is not one of them. The standard model may be overturned, it will certainly be revised. However, 'elegance' is an invented thing, not a property of the universe. I am glad to be alive at a time when the experiments are outpacing the theory quickly enough for that bit of religious thinking to be shown up so clearly.

I know little about high energy physics, so... If we determine there are more heavy particles than we thought, would this go towards accounting for all the "missing mass" in the universe?

Definite maybe.

In order to be a candidate for DM, the particle also needs to meet a bunch of other properties too. So it's not like "any old massive particle" would do. The key is that they have to be weakly interacting, otherwise they would clump up more than we can see in the universe. This eliminates lots and lots of candidates.

However, 'elegance' is an invented thing, not a property of the universe

e^(iπ) + 1 = 0

That is a good invention, yes. I'm glad to be alive when discoveries outpace those who synthesize explanations for them. Otherwise I might fall into the old fallacy of confusing human models for phenomena with the phenomena themselves.

I have been remarking since proton sigma-r experiments in the 90's that the Standard Model is the next Kuhnian paradigm shift. There are so many physicists so beholden to it, that they would rather bolt on another pile of variables than put in any effort at replacing it. What's it got now? 15 variables?

We've known the SM is incomplete, and thus ripe for a "paradigm shift," since it was created, but the fact that it has so many variables is not one of the reasons. I'm not sure what your point is?

Quote:

It's not elegant enough to be true. Either we need new math that makes it pretty or we need something else.

Or, the universe is just messy. That answer may be less aesthetically pleasing but it doesn't mean it's not true.

I have been remarking since proton sigma-r experiments in the 90's that the Standard Model is the next Kuhnian paradigm shift. There are so many physicists so beholden to it, that they would rather bolt on another pile of variables than put in any effort at replacing it. What's it got now? 15 variables?

It's not elegant enough to be true. Either we need new math that makes it pretty or we need something else.

[sarcasm]And this shoots down my "faith" in science a little more - makes it look a little more like "religion" - Wait. Wha....

The problems with high energy particle observations are too numerous to describe and I don't know them well. But it starts out simple enough with high energy particles being shortlived and often ejected in high energy jets. Those are directed at small angles along the beam lines and so traveling a long distance until hitting the detectors. Combined with the short lifetime it is a killer for observations in the early stages of the jets.

Here's a possibility, then, for addressing the 'detectors far down range' issue - build a collider with the beamline intersection happening at an angle, rather than head-on, so that the collision products have inherent momentum away from the beamline. The higher the angle (up to 90°), the closer the detectors can go, but also the more expensive the system will be to build (I expect). The one ring (to rule them all!) that we have at CERN is decades old, and has just been re-equipped once or twice to get the current generation of experiments. Maybe the next collider can be a green-field design with separate rings that intersect at a couple points.

LHCb will certainly be weighing in at some point, but as of yet has not (to my knowledge). These kinds of precision measurements are not the LHC's bread and butter. The LHC is a massive sword, where BaBar and the like are scalpels. They won't be discovering any new particles, but they can make more careful measurements of those already discovered and check the Standard Model's numerical predictions.

Jugalator wrote:

So the model of our Universe is about to bust?

It has been known since the Standard Model was written down that it was at the very least unsatisfying. It has been known with certainty since the 90s that it is seriously flawed, but that it is so spectacularly successful at passing every numerical test that it must be the low energy limit of the true theory (analogously, it is the Newtonian mechanics of the as-yet undiscovered special relativity). It has no dark matter, gets the cosmological constant hilariously wrong, does not include gravity, and gets some subtle points about neutrinos wrong (and some of its predictions about the properties of neutrinos are yet to be overturned, but could be soon, or they might be verified). Physicists look forward to these kinds of discrepancies because they hint at where the unknown error(s) lie(s). So even if this result holds up, the SM is not about to bust, but to be tweaked and improved, perhaps even radically tweaked. Newtonian mechanics didn't implode when SR was discovered, it was simply recognized as a useful approximation.

I agree with what I believe is Stephen Hawking's POV. I'd dearly love for the Standard Model to fail. The Standard Model leaves us stuck on our little planet, with travel outside our solar system virtually impossible. We need a physics with loopholes big enough to allow us to roam the galaxy at will.

I'm not sure what that physics would be like, whether it would involve instantly created wormholes to almost anywhere or what. But there's got to be a way to get out there.

So the model of our Universe is about to bust? Well, I have to say it was a good run, despite the flaws this one had.

If this does turn out to be a problem for the Standard Model, what would most likely happen is a modification to the standard model, sort of like the way Einstein invalidated Newtonian physics, but only when you're approaching the speed of light.

I agree with what I believe is Stephen Hawking's POV. I'd dearly love for the Standard Model to fail. The Standard Model leaves us stuck on our little planet, with travel outside our solar system virtually impossible. We need a physics with loopholes big enough to allow us to roam the galaxy at will.

I'm not sure what that physics would be like, whether it would involve instantly created wormholes to almost anywhere or what. But there's got to be a way to get out there.

Relativity is the main roadblock for ftl travel, not the Standard Model. The Standard Model is sorta relevant in that it describes possible sources of energy, but it's not like a new model of particle physics will allow a ship to accelerate past c.

Basically, relativity says that going c or faster is impossible and that going close to c is energetically very expensive, and the Standard Model says that as a practical matter it's difficult to generate large amounts of energy with engine-quality devices. It follows that accelerating to close to c is difficult. Perhaps a new model would allow for a revolutionary means of energy production, but it's not going to get rid of the fundamental limits on what you can do with that energy. There's simply no getting around the fact that no one is ever going to travel a light year in less than 1 year (as measured in the rest frame of points A and B). Of course, if relativity falls, so does the Standard Model, but I wouldn't hold my breath.
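To put "energetically very expensive" in numbers, here is a rough sketch of the relativistic kinetic energy, K = (γ − 1)mc², needed to push a hypothetical 1,000 kg probe to a given fraction of c, assuming a perfectly efficient drive (both the mass and the efficiency are illustrative assumptions):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def kinetic_energy(mass_kg, beta):
    """Relativistic kinetic energy K = (gamma - 1) * m * c^2 for beta = v/c."""
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

# Energy cost rises steeply as beta approaches 1.
for beta in (0.5, 0.9, 0.99):
    print(f"v = {beta}c -> K = {kinetic_energy(1000.0, beta):.2e} J")
```

At 0.9c the cost is already on the order of 10²⁰ joules, roughly comparable to annual world energy consumption, which is the commenter's point about fundamental limits.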

So the model of our Universe is about to bust? Well, I have to say it was a good run, despite the flaws this one had.

If this does turn out to be a problem for the Standard Model, what would most likely happen is a modification to the standard model, sort of like the way Einstein invalidated Newtonian physics, but only when you're approaching the speed of light.

This hurts my brain. It's like a special case of a special case of a special case...

I know little about high energy physics, so... If we determine there are more heavy particles than we thought, would this go towards accounting for all the "missing mass" in the universe?

No. What's happening is that they're looking at processes that can result in any one of several outcomes happening and that one of them (tau production) is happening significantly more often than expected.

Yep. Though another series of runs (of the same or similar experiment) could produce significantly less tau production, thus reducing the significance.

As an example, say that you have a six sided die, which is believed to be "statistically perfect", that is, no side will come up more often than the others.

If you roll it only 6 times, the actual likelihood of getting one each of 1, 2, 3, 4, 5, 6 is fairly low: just over 1.5%.

Instead, you might get results of 1, 2, 4, 4, 4, 6. Based on the limited number of rolls, this would look like evidence that the die isn't perfect (i.e., the model has issues), but in reality not enough tests have been done.

If you instead did 6 million tests, the percentages would average out. Certainly, there might be limited stretches where a lot of a certain number appeared, but the overall values would tend to settle out.

In this case, they've got a 3-sigma result. In physics, 3-sigma is "interesting; we should take another look, but remember it often washes away with further testing".

Particle physicists consider 5-sigma to be the point where they really believe something has happened.
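The die analogy above is easy to check numerically. This sketch compares the exact probability of getting one of each face in six rolls (6!/6⁶, just over 1.5 percent) against a quick Monte Carlo estimate:

```python
import math
import random

def all_faces_once(rng):
    """True if six rolls of a fair die produce each face exactly once."""
    return sorted(rng.choices(range(1, 7), k=6)) == [1, 2, 3, 4, 5, 6]

exact = math.factorial(6) / 6 ** 6  # 720 / 46656

rng = random.Random(42)  # fixed seed for reproducibility
trials = 200_000
hits = sum(all_faces_once(rng) for _ in range(trials))

print(f"exact:     {exact:.4f}")   # 0.0154
print(f"simulated: {hits / trials:.4f}")
```

With only a handful of trials the simulated rate fluctuates wildly, which is exactly the "3-sigma effects come and go" phenomenon; with 200,000 trials it settles near the true value.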

I have been remarking since proton sigma-r experiments in the 90's that the Standard Model is the next Kuhnian paradigm shift. There are so many physicists so beholden to it, that they would rather bolt on another pile of variables than put in any effort at replacing it. What's it got now? 15 variables?

It's not elegant enough to be true. Either we need new math that makes it pretty or we need something else.

[sarcasm]And this shoots down my "faith" in science a little more - makes it look a little more like "religion" - Wait. Wha....

Scientific Model = has holes filled in by "variables"

Religion = made up shit to explain stuff we can't explain

Scientific Model = Religion

Hmm....

[/sarcasm]

Extremely simplistic and inaccurate, besides.

Modern science is a subset of philosophy: empiricism. Science cannot prove things to be right, wrong, real, or unreal. It's about creating models that have explanatory and predictive power. Whether those models represent "reality" is itself not falsifiable and therefore not science.

While primitive religions create explanations of things we CAN'T EXPLAIN, less simplistic religions provide meaning and explanations for things that are, in principle, UNEXPLAINABLE--or, more accurately unknowable in an empirical sense. That is, not every truth is empirically determinable. The dismissive "made up shit" doesn't deserve a detailed response.

Bottom line: science, in the purest sense, is not religion and religion is not what you think it is.

Superstring theory is not entirely beyond the reach of observation. It predicts a large negative cosmological constant and astronomers are pretty certain that the value is small positive. That's one failure for string theory already on the books.

Superstring theory is not entirely beyond the reach of observation. It predicts a large negative cosmological constant and astronomers are pretty certain that the value is small positive. That's one failure for string theory already on the books.

String theory doesn't make predictions unless you make some typically unfounded assumptions, and I'm guessing the result you're quoting is something similar. By that I mean, you can say for example "in the regime where strings are weakly coupled, it follows that..." but there is no reason to say strings aren't strongly coupled, and so on and so forth, so that you can find a string theory model consistent with (almost?) any universe. It's the granddaddy of all fine tuning problems.

There are true believers who claim to have extracted predictions, and I don't doubt that whoever came up with this result hyped it as being a generic prediction of string models, but not once in reality has anyone gotten something out of string theory that could lead to the falsification of all possible string models. That said, I'd be happy to be shown wrong, as that would make string theory slightly less disappointing.

sort of like the way Einstein invalidated Newtonian physics, but only when you're approaching the speed of light.

Nothing could be further from the truth.

Under the Newtonian model, gravity is a force that is created by mass and acts at a distance on other massive objects. The force falls off as an inverse-square term. Under the field versions of the same, it is a field that is created by mass, and other mass interacts with that field.

Under GR, "gravity" is the result of the local stress tensor caused by the 4-dimensional shape of spacetime. The entire universe interacts with this on both the large and small scale through the Einstein equation, in a non-linear fashion (i.e., gravity causes gravity, sort of).

The two images of how the universe works are utterly unlike each other. The fact that they both predict the same results for the low-energy limit here on the Earth doesn't change this.

jandrese wrote:

If this does turn out to be a problem for the Standard Model, what would most likely happen is a modification to the standard model.

This is almost certainly not true.

The basic model of the SM and its Lagrangian have certain in-built limits, many of which are close to the edge now. Adding neutrino mass was one thing, but adding other particles is another thing entirely. This means it is not easy to modify the SM to produce a WIMP, for instance. So in effect, the SM does not predict 95% of the universe, and likely can't.

Superstring theory is not entirely beyond the reach of observation. It predicts a large negative cosmological constant and astronomers are pretty certain that the value is small positive. That's one failure for string theory already on the books.

It's not so much that the visible universe is expanding at an accelerating rate; it's just that the rest of the universe sucks.

Having walked all around (and under) one of these detectors (J-Lab), I can honestly say they are some of the most amazing constructions I can imagine. Immense things. (The image on this article looks photoshopped, but it's not; and even if it were, there are detectors out there for which it wouldn't be.)

I've gotten to see one of the detectors open (imagine an orange, with the core aligned with the beam axis), I've gotten to see the miles of cables, the Cherenkov detectors, the incredibly immense lead beam stop suspended above the floor.

The problems with high energy particle observations are too numerous to describe and I don't know them well. But it starts out simple enough with high energy particles being shortlived and often ejected in high energy jets. Those are directed at small angles along the beam lines and so traveling a long distance until hitting the detectors. Combined with the short lifetime it is a killer for observations in the early stages of the jets.

Here's a possibility, then, for addressing the 'detectors far down range' issue - build a collider with the beamline intersection happening at an angle, rather than head-on, so that the collision products have inherent momentum away from the beamline.

Good idea - perhaps. You get cosine losses of momentum, so squared-cosine losses for energy. But it would start to do what you say.
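For equal-energy, ultrarelativistic beams, the cost can be made concrete: crossing at an angle φ reduces the center-of-mass energy to roughly √s ≈ 2E cos(φ/2), with φ = 0 being head-on. A quick sketch, using an LHC-like 7 TeV beam energy purely as an illustrative number:

```python
import math

def sqrt_s(beam_energy_gev, crossing_angle_deg):
    """Approximate center-of-mass energy for two equal, ultrarelativistic
    beams crossing at the given angle (0 degrees = head-on)."""
    phi = math.radians(crossing_angle_deg)
    return 2.0 * beam_energy_gev * math.cos(phi / 2.0)

for angle in (0, 30, 60, 90):
    print(f"crossing angle {angle:2d} deg -> sqrt(s) = {sqrt_s(7000, angle):.0f} GeV")
```

Even a 90-degree crossing keeps about 71 percent of the head-on collision energy, so the trade-off against detector placement is real but not necessarily fatal.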

You know, even the Standard Model is not a theory. It's just a heuristic that conjoins all known observations through some exquisite mathematics. SUSY is just an extension. New observations that conflict are good, as they will help bring on a real and more complete true particle theory. It's not surprising that new observations would conflict with those predicted by a heuristic. Simple fact.

Modern science is a subset of philosophy: empiricism. Science cannot prove things to be right, wrong, real, or unreal. It's about creating models that have explanatory and predictive power. Whether those models represent "reality" is itself not falsifiable and therefore not science.

While primitive religions create explanations of things we CAN'T EXPLAIN, less simplistic religions provide meaning and explanations for things that are, in principle, UNEXPLAINABLE--or, more accurately unknowable in an empirical sense. That is, not every truth is empirically determinable. The dismissive "made up shit" doesn't deserve a detailed response.

Bottom line: science, in the purest sense, is not religion and religion is not what you think it is.

Creationists shouldn't comment on science; it is hilarious to see.

Through accidents of history, science became commingled with philosophy and religion. However, first religion, then philosophy, parted ways with science when it became evident that science can adjudicate on facts while they can't. The basic reason is that testing can show something is erroneous.

And for some not well understood reason the process of competing alternatives converge on unambiguous facts despite us having finite resources in a finite universe. As theoretical physicist Sean Carroll notes, "The Laws Underlying The Physics of Everyday Life Are Completely Understood". [ http://blogs.discovermagazine.com/cosmi ... ariance%29 ]

Thus facts are right, and when fully tested, the simplistic model of relative "truths" is appended in an absolute and robust way. Moreover, realism is built into every mechanics from classical to quantum, in the form of testing "constrained reaction to constrained action". ("Hit a rock and it hits back.") Classically that is action - reaction; quantum mechanics describes it testably as observation - observables.

Theories, of which testable models are a useful subset, are testable and hence amenable to becoming observed facts. As they are correlated sets of hypotheses, they are better than isolated observations or ad hoc hypotheses - they are "superfacts".

Take evolution. It is famously an observed fact of a process, an observed fact of a robust and valid theory, and a set of observed pathways. That is one large set of facts!

All religions are simplistic - they impute the "meaning" of an agency where we know there is none. It is the proverbial "made up shit".