Discovering the Quantum Universe

HEPAP today released a new publication designed to convey to the general public excitement about prospects for particle physics in the coming years. It’s entitled Discovering the Quantum Universe, and it has a companion web-site. Both the web-site and the document itself are beautiful and impressive productions. The web-site also contains the earlier 2004 HEPAP report Quantum Universe, which was mentioned in one of the earliest posts of this blog. The newer document is based on an earlier version from last summer, and one of its main goals is to make the case for a linear collider. In some sense this is promotional material for the conclusions recently reached by the EPP2010 panel. Also part of the promotional activity today is a briefing for members of Congress that will include a talk by my colleague Brian Greene.

While I hope that this all has the intended effect of getting the public, the Congress and the Administration excited about particle physics and willing to support it at the level necessary to fund a new generation of machines and experiments, as you might guess I have my doubts about the wisdom of some of the material included in this report. Unlike the EPP2010 report, which oversold string theory a bit, this report oversells it a lot, with language like:

… preliminary studies have looked at the ability of linear collider experiments to detect the telltale harmonies of strings. Here linear collider precision is essential, since the string effects appear as small differences in the extrapolated values of the superpartner parameters. A combined analysis of simulated LHC and ILC data shows it may be possible to match the fundamental parameters of the underlying string vibrations.

The inclusion of this kind of language seems to me to be misleading and irresponsible. Ten years from now when we have real LHC data, know that the ILC can’t tell us anything about string theory, and are asking the US government to put up large sums to finish the ILC, we’ll have to hope that the relevant decision makers didn’t get convinced by this report that the ILC is a machine designed to get information about string theory.

42 Responses to Discovering the Quantum Universe

… preliminary studies have looked at the ability of linear collider experiments to detect particle interactions that probably have nothing to do with the telltale harmonies of strings. Here linear collider precision is essential, in spite of the fact that string effects somebody probably once guessed may appear as small differences in the extrapolated values of the superpartner parameters, have nothing to do with it. A combined analysis of simulated LHC and ILC data shows it may be possible to advance understanding of the underlying physics even though there are probably no underlying string vibrations with which we may match the fundamental parameters.

Most successful particle accelerators + detectors achieved their fame for reasons having nothing to do with their originally intended mission. The AGS at BNL produced three Nobel Prize-winning experiments, none of which were foreseen: the J/psi, CP violation, and the two neutrino species.
Sam Ting’s experiment was proposed to discover vector mesons to test Vector Meson Dominance, not to discover the charmed quark. While the Bevatron was built to produce the antiproton (which it did, leading to a Nobel Prize), it really became famous for the resonances (“bump hunting”) discovered with the Alvarez bubble chamber. Gargamelle discovered weak neutral currents, which were not even mentioned in the proposal (if I recall correctly from The Second Creation). The proton-decay underground water tanks have turned out to be better suited for neutrino astronomy.

For that matter Columbus proposed to sail to China, discovered America (Erik the Viking might disagree), claimed to have landed in India, and after all that people still call him a hero.

PEP, PETRA, TRISTAN were all built to find the top quark (which IS part of the SM), and all failed. This did admittedly lead to subsequent funding difficulties. If indeed LHC/ILC produce new physics beyond SM, the string theory stuff will be forgotten/ignored. The real hope is that the new machines will find *something* never mind why they were proposed.

But I guess Peter is afraid that too much hype may create a backlash. In all those cited cases they were looking, with much less hype, for something, and they found something else that was at least as interesting. Now they have to find something that goes beyond all this messianic hype, a pretty difficult task.

The real backlash problem is that so many machines have been built and have found no new physics beyond SM. If yet another generation of machines fails to produce physics beyond SM, then it will get even harder to justify further machines. At a minimum the next generation machines really MUST find the Higgs. Although the Higgs is part of SM, it will still be a major finding and will serve as a boost to obtain funding for future machines. If LHC/ILC do not find the Higgs, it will get really hard to get more money. (Unfortunately nobody can guarantee where/how the Higgs will appear, just as with the top quark.) String theory or any other theoretical motivation (“hype”) is not the critical issue.

One different aspect of this situation is that the ILC will not really be exploring a completely new energy range, since the LHC actually will have higher energy. By the time the ILC construction decision is ready to be made, I suspect the LHC data will have given strong indications that there is no supersymmetry/higher dimensions/string theory at energies accessible to the ILC. That’s why using these ideas to sell the ILC now may be a mistake.

Hopefully the LHC will find some new and unexpected physics that explains electroweak symmetry breaking and the ILC will have enough energy to study it. If so, there will be a very good case to be made to build the ILC, but much of this report will have to be explained away.

Historically, e+e- colliders have not had the energy to go beyond the mass reach of hadron machines. As powerful as LEP was, the W and Z were first produced in hadron machines. LEP was designed for precision studies of particles of known mass. Even when SPEAR produced the psi, the J had already been produced at the AGS. If, as one hopes, the LHC produces the Higgs, the ILC will promptly become a Higgs factory. Its design parameters will be adjusted (if necessary) to optimize Higgs production. It will simply be unnecessary to explain away the string theory or any other prior motivation/hype. It is in the nature of scientific discovery that this is so.

I hope you’re right about not needing to explain away the hype. But I personally wouldn’t want to be the physicist testifying before Congress in 2012 trying to answer questions from congressmen about why the US should spend billions on something that’s not even going to help discover those strings and extra dimensions.

If the LHC produces the Higgs, the physicists will testify “the Higgs has been found, the ILC will be a unique precision tool to study it, anything else will be a bonus”. It was not so different for LEP, the W and Z had been found, anything else would be a bonus (except that LEP was proposed AFTER the W and Z were found). Unfortunately the design and construction timescales are now so long that ILC planning must start now, before LHC data is in. If LHC finds nothing, ILC will have to be sold as a pure search and discovery machine. This is difficult, and not the fault of ST. PEP, PETRA and TRISTAN were all built to find a particle which really does exist, but the SM gave no clue as to its mass. Physics can be and is unkind.

Really, the difficulty is that in the good old days (those wonderful times before the internet, cell phones and blogs … one hankers for the simple life) each new generation of machines produced new physics discoveries beyond the theories of the day. That has not happened for ~30 years. One simply does not know the mass scale of the new physics. And that is not the fault of ST. No theory — including QED — has given the mass scale of the next generation of particles. The expts must find something, then one can say “these particles are part of a family, search for the next one (like the Omega-)”.

On page 13 of the pdf, “Discovering the Quantum Universe” says:
“… According to our present understanding, the Higgs particle itself should have a mass a trillion times beyond the Terascale. …”.
I thought that the Standard Model would be happy with a Higgs in the 110 to 200 GeV range, and that was such a generally accepted view that Fermilab thought that it might have seen the Higgs at around 115 GeV. To what “present understanding” are they referring ?

On page 18 they list the following potential LHC discoveries:
A Higgs particle
Superpartner particles
Evidence for extra dimensions
Missing energy from a weakly interacting heavy particle
Heavy charged particles that appear to be stable
A Z-prime particle, representing a previously unknown force of nature
Superpartner particles matching the predictions of supergravity

If the Standard Model continues to hold up, as it has so far,
so that there are
no superstring-type superpartners,
no superstring-type extra dimensions,
no WIMP,
no heavy charged stable particle,
no Z-prime, and
no supergravity-type superpartners
then
the LHC might see nothing much beyond a 115-200 GeV Higgs,
plus better luminosity of the sort of signals already seen at Fermilab.

Does the USA high energy physics community have a plan for such a contingency ?

If not, is such a contingency really so remote that there is no need to plan for it?

Does the USA high energy physics community have a plan for such a contingency ?

Yes. We all give up and go home. Not just the USA, pretty much everyone. People have nightmares about the LHC only seeing a Higgs. Maybe you can justify the ILC to probe the electroweak symmetry breaking sector, but if everything fits a single Higgs doublet and nothing else, I’m not sure I see the point.

Aaron Bergman said that “… if the LHC only see[s] a Higgs … We all give up and go home …”.

That sounds sad to me, when such things as the values of the Standard Model parameters are not explained by conventional models (unless you accept anthropic landscape models as being both conventional and explanatory).

If it is a contingency that is sufficiently likely to cause nightmares, wouldn’t it be nice to have a contingency plan other than “give up and go home”? Has much effort been put in trying to develop such a plan, or is that contingency being ignored as being incompatible with a united USA lobby for ILC ?

Has much effort been put in trying to develop such a plan, or is that contingency being ignored as being incompatible with a united USA lobby for ILC ?

You don’t seem to understand. At that point, there’s pretty much nothing left to do. You can measure the standard model parameters to a ridiculously high precision, I suppose, and hope that you can get the theory down well enough to look for small discrepancies, but that’s not particularly sexy. You’re just going to have a damned hard time convincing anyone to spend ten or so billion dollars on an accelerator after the previous one bored us all to tears.

Maybe someone will have some clever idea for accelerating subatomic particles cheaply, but until then, I’d guess the contingency plan for a lot of people would be to start contemplating the Black-Scholes equation.

IIRC, you once guaranteed that there must be new physics at the LHC, because of the Landau pole. So if LHC sees nothing new, except perhaps for a standard Higgs, there is something seriously flawed about our understanding of QFT?

Aaron Bergman said, replying to me about the contingency in which the LHC only sees a 110 to 200 GeV Higgs, plus events similar to those already seen at Fermilab:
“… You don’t seem to understand. At that point, there’s pretty much nothing left to do. …”.

I can see that might be the case for superstring theoretical physics, which would be invalidated by lack of conventional supersymmetry,
but
is it really futile to search for a non-superstring theory that might explain the parameters of the Standard Model etc,
in which case further detailed study of Fermilab/LHC type data might be useful in distinguishing among such types of models, and in examining their consequences ?

Has high energy theoretical physics become such a monoculture of superstrings that the community feels that it is preferable to “give up and go home” rather than work on non-superstring models ?

From another point of view, if LHC sees only non-supersymmetric Higgs plus similar-to-Fermilab events, thus invalidating superstring theory,
would the USA superstring community (with 90% of USA theoretical hep funding and jobs) prefer to
take their ball and go home (giving up that funding and jobs to non-hep things)
rather than
see non-superstring theoretical hep have a chance to succeed where they failed ?

It is not only a matter of strings. Selling theoretical speculations in this way, a few years before LHC, seems either useless or suicidal.

More generally, this and other similar presentations look like a show for a big bureaucratic enterprise (“we must discover the Higgs / extra dimensions / …”). In my opinion, emphasizing the points made by sunderpeeche (#2) would not only be more honest, but also more attractive.
But I am only a physicist, not an expert of outreach.

There’s a Landau pole problem for non-asymptotically free couplings. It’s there for the U(1) gauge coupling, but only occurs at exponentially high energies (not at the TeV scale). All it says is that at short enough distances the effective coupling will get large and perturbation theory will break down.

The Higgs quartic coupling also has this problem, and it is this coupling that determines the mass of the physical Higgs particle. If the Higgs particle is low enough mass, the quartic coupling at the TeV scale is small, and again its blowing up only is a problem of principle at short distances. If you try and push the Higgs mass up in the standard model, you are pushing up this quartic coupling. At some point it gets large and perturbation theory breaks down.

The reason you get a guarantee at the TeV scale isn’t really the Landau pole issue: it’s just that either you’ll see a Higgs particle (small enough quartic coupling), or whatever is causing the Higgs phenomenon is something inherently non-perturbative and completely new physics.
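A one-loop sketch of why this is so (normalizations here are schematic; the exact coefficients depend on conventions and on which Standard Model couplings are included):

```latex
% One-loop running of a non-asymptotically-free quartic coupling (schematic):
%   mu dlambda/dmu = 3 lambda^2 / (16 pi^2)
\[
\lambda(\mu) \;=\; \frac{\lambda(\mu_0)}{\,1 - \frac{3\lambda(\mu_0)}{16\pi^2}\ln(\mu/\mu_0)\,},
\qquad
\Lambda_{\text{pole}} \;\simeq\; \mu_0\,
\exp\!\left(\frac{16\pi^2}{3\,\lambda(\mu_0)}\right).
\]
% Since m_H^2 = 2 lambda v^2 with v ~ 246 GeV, a small lambda at the TeV scale
% (a light Higgs) puts the pole exponentially far away; raising m_H pulls the
% pole down toward experimentally relevant energies.
```

The exponential dependence on $1/\lambda(\mu_0)$ is the whole point: a weakly coupled (light) Higgs pushes the breakdown of perturbation theory far beyond any conceivable accelerator, while a heavy Higgs does not.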

Has high energy theoretical physics become such a monoculture of superstrings that the community feels that it is preferable to “give up and go home” rather than work on non-superstring models ?

Must you see everything through these silly string wars? This has nothing to do with that. Theorists can go on speculating about whatever they want. It won’t matter. If you’ve been reading what I’ve been saying, this is about experiment.

The reason you have to see something is that otherwise WW scattering (I think) becomes nonunitary. If you believe in unitarity, then something better fix that. A simple doublet Higgs solves that problem, however, and it’s perfectly consistent for that to be all there is until the Planck scale.
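The unitarity argument can be made quantitative along the lines of the standard Lee-Quigg-Thacker estimate; the numerical coefficients below depend on which channels are included, so treat them as order-of-magnitude:

```latex
% Without a Higgs, the J=0 partial wave for longitudinal W scattering grows with s:
\[
a_0(W_L W_L) \;\sim\; \frac{G_F\, s}{16\sqrt{2}\,\pi},
\qquad |a_0| \le \tfrac{1}{2}
\;\Rightarrow\; \sqrt{s} \lesssim 1\text{--}2\ \text{TeV}.
\]
% Higgs exchange cancels the growing term, leaving a constant amplitude:
\[
a_0 \;\to\; -\,\frac{G_F\, m_H^2}{8\sqrt{2}\,\pi},
\qquad |a_0| \le \tfrac{1}{2}
\;\Rightarrow\; m_H \lesssim 1\ \text{TeV}.
\]
```

So either a Higgs below roughly a TeV appears, or longitudinal gauge boson scattering becomes strongly interacting at the TeV scale; both outcomes are in principle within LHC reach.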

Aaron Bergman said that “… if the LHC only see[s] a Higgs … We all give up and go home …”.

I may agree that in these circumstances there will be no compelling reason to build even bigger accelerators. However, I do not agree that theorists should give up. There are quite a few fundamental problems that are not addressed by current theories at all. Some of them were mentioned in Chris Oakley’s posts:

1. What about divergences in QFT? Will we ever get them out from “under the rug” and honestly examine them? Or is the current consensus that divergences are ultimately related to the (currently unknown) Planck-scale physics, and that we can do nothing about them?

2. There seems to be no satisfactory resolution of the bound state problem in QFT. We learned how to calculate the S-matrix to high precision. So, we can get the energies of bound states as poles of the S-matrix (e.g., the Lamb shifts). However, there is no well-defined Hamiltonian in QFT, so the radiative corrections to the wave functions of bound states cannot be calculated within QFT.

3. Without a well-defined Hamiltonian, I don’t see how one can address the time evolution of wave functions in QFT.

The experimental support for problems 2 and 3 above does not require smashing particles at TeV energies. It would rather need precise time-resolved observations of the dynamics of low-energy systems, such as excited states of atoms. This may not be as sexy as building multi-billion dollar monsters, but the insight gained from these low-energy experiments could be just as fundamental.

Talking about searching for the unexpected, have a look at the bets made at the Richard Arnowitt festschrift almost 10 years ago: http://faculty.physics.tamu.edu/allen/fest/bets.html
Unfortunately, still none of the above have been discovered. Would people here want to take stabs at the same bet?

Is this wise? The only major linear collider ever built, the Stanford Linear Collider (SLC), experienced extensive startup problems lasting for many years. For example, the initial scheduled run in the summer of 1988 yielded not a single Z boson. This was followed by several years of very disappointing performance, numerous technical problems, and considerable acrimony. The accelerator and associated experiments (Mark II and SLD) were largely eclipsed by LEP. As far as I know, no one has built a similar machine since. Does the knowledge and expertise really exist to build an even more ambitious linear collider?

“As far as I know, no one has built a similar machine since.” The SLC is the only linear collider ever built. For the pedants, it was not a true linear collider because it used only one linac (for both e+ and e-) and routed them through 2 arcs to a detector (Mk II, later SLD). It experienced all of the teething problems of an untested technology (“path breaking” … you choose the description). Its luminosity was far below expectations. It should have produced Z’s before LEP and it did not. I don’t know about the acrimony.

The ILC will be a true linear collider (2 linacs). I have no doubt it will also experience teething problems. The proponents admit there are significant R+D issues to solve. I have heard several people say it is unwise to put all the eggs in such a basket.

The SSC got cancelled. A muon collider is also untested technology, even more far future than ILC. Have you any better ideas?

Futuristic techniques could increase the acceleration gradient by up to 3 orders of magnitude. This would allow a linear collider to be shortened by 3 orders of magnitude. It would also have true practical applications, giving cheap sources of synchrotron radiation.
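As a back-of-envelope check of that scaling (the gradient numbers here are illustrative assumptions, not official design figures), the energy gained in a linac is just the accelerating gradient times the active length:

```python
# Energy gained in a linac ~ accelerating gradient x active length.
# Gradients below are illustrative assumptions, not official design figures.
def linac_length_m(energy_gev, gradient_mv_per_m):
    """Active length in metres needed to reach energy_gev at the given gradient."""
    return energy_gev * 1e3 / gradient_mv_per_m  # GeV -> MeV, then divide by MV/m

rf_m = linac_length_m(250, 31.5)         # ~30 MV/m superconducting RF: kilometres
plasma_m = linac_length_m(250, 31500.0)  # 1000x gradient (plasma schemes): metres
print(rf_m, plasma_m)
```

At a real facility the RF packing fraction, the second linac arm, and the final-focus systems add considerable length, so these figures are lower bounds; but the thousandfold shrinkage of the active accelerating structure is just this arithmetic.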

Unfortunately this kind of activity does not receive the attention and manpower that it deserves; proposing theoretical speculations is more rewarding, and with some propaganda we can maybe get the 5 or 10 Giga$ needed for an ILC.

I just thought I would add a few comments to this. Before it was called the ILC it was known as the NLC (Next Linear Collider), and a tremendous amount of work has been done on trying to realize such a big project. Experimentally there was the NLCTA (Next Linear Collider Test Accelerator) that ran at SLAC, as well as much research into higher-frequency klystrons. It was basically a normal-conducting X-band (11.424 GHz) accelerator. Going to higher frequencies and higher energies is a tremendous effort because of the impediments to reaching higher gradients due to effects like RF breakdown. When I was a grad student there I actually worked on the pulsed heating that occurs on the interior copper surfaces from the eddy currents due to high magnetic fields. The heating happens so quickly that the copper has no time to expand, which causes stress on the surface. This repeated cyclic stress can be enough to cause microcracking over some period of time. This degrades the Q of the cavity and also causes thermal runaway (not to mention leading to RF breakdown). All this means that if you are not careful and actually build an accelerator that can deliver the gradients you want, you might destroy it over a not-so-long time. This leads into material science and surface science. So there is just an incredible number of things to look at to realize something like the ILC. Superconducting accelerators do not have the pulsed-heating issue, but they do suffer from wakefields due to the high Q’s of the cavities. The ILC has not gone unrealized for lack of trying, but it will still require great effort moving forward.

Something better? We could return to the idea of the VLHC (>100 TeV @ 10^34 luminosity). The Fermilab design study of a few years ago shows that we do know how to build such a machine technically. Managerially the jury is out; but the LHC construction trials are valuable and relevant experience.

I mentioned in an earlier post that after the SLC there was the TLC (1 TeV x 1 TeV); this was too demanding, so it became the TLC (1 TeV c-o-m), eventually the NLC (achieve whatever is feasible, ~250 GeV x 250 GeV), which evolved into an international project = the ILC. Very demanding technically.

VLHC? The technology + knowhow already exists to build SSC, VLHC and other conventional storage rings (a super LEP if desired). But the size and cost is very large. SSC did not survive the politics. (ILC may not either.)

The 27 April 2006 issue of Nature said some interesting things about the ILC.

1 – The lead editorial (unsigned) (page 1089) had a favorable subhead, saying “There is broad backing for a US bid to build the International Linear Collider”.

2 – That same editorial also said “… Germany, Japan, and CERN itself may also bid to host the project … the insistence of CERN … that decisions about the siting of the ILC be delayed until an accelerator technology … is ready, strikes some in the community as unnecessary and self-serving …”.

3 – A News article by Geoff Brumfiel (pages 1094-1095) said “… Russia, Japan, and China have all expressed interest in hosting it …”.

4 – That same News article also said “… Mike Lubell, head of public affairs for the American Physical Society in Washington DC, cautions that the collider – estimated to cost at least $6 billion – still faces an uphill battle …”.

5 – The above-mentioned editorial also said “… political backing for the deal is likely to come from friends of Fermilab … Fermilab … is looking to its supporters, such as House Speaker Dennis Hastert (Republican, Illinois) …”.

B – Rushing to get commitments before technology is demonstrated (maybe this is related to remarks by G. William Foster in his 19 February 2006 resignation letter “… it has been impossible to elicit any honest public discussion of a technically defensible schedule for the ILC from the leadership of HEP. This is not an academic exercise or an empty political game, since unrealistic schedules for the ILC are being used to destroy prospects for any feasible mid-scale near term program in U.S. HEP. … our steamship seems to have a committee of captains who believe that the ductility of carbon steel rivets at freezing temperatures is just a political problem …”).

C – The $6 billion figure for the ILC (when the DOE estimate is $12 billion) may be an effort, a la SSC, to lock in the project with an unrealistically low-ball cost estimate, which is bound to unravel sooner or later.

D – If the Democrats win the House in 2006, then Illinois Republican Dennis Hastert will not be Speaker of the House. The current Democrat Minority Leader is Nancy Pelosi of California. Recall that the straw that broke the SSC’s back was the defeat of Republican Texas-SSC supporter Bush I by Democrat Clinton, who, although he had paid lip service to the SSC, refused to go to the mat to fight for it, and was quite OK to let it die.
If Pelosi succeeds Hastert, and follows the Clinton pattern, she will tell all the ILC lobbyists/physicists how she is strongly in favor of the ILC at Fermilab, but when it comes time to cast the votes, she won’t crack the Speaker’s disciplinary whip, and will watch the vote kill the ILC, and say something like “Gee, I tried, but times are tough and too many people wanted to use that money for what they see as more pressing things” – (her web page emphasizes Energy, Medicare, Gas Prices, etc).

The ILC is far from a done deal. The technology to build it, the siting (will the US pull out if ILC not in USA?), a timeline and cost estimate — all are unknown. Not to mention a change of administration and/or party control of Congress. It isn’t always Democrats who kill Republican-sponsored programs. (BTW do the Reps really support ILC?) The Apollo project was initiated by Kennedy, killed off by Nixon.

sunderpeeche is correct.
It is just accidental/coincidental that Texas’s Bush I who supported SSC and Illinois’s Hastert who supports ILC are both Republicans,
and
that they were (or may be) succeeded in power by Democrats Clinton (no big allegiance to Texas) and Pelosi (no big allegiance to Illinois).

sunderpeeche also asks “BTW do the Reps really support ILC?”.
No, not in general.
It is just an accident of geography that Hastert of Illinois is Republican Speaker of the House, and as Republican Speaker he has a lot of power to whip all House Republicans (now a majority) into line.
If a non-Illinois Republican were Speaker, the Republican Majority would probably not be a coherent block in favor of the ILC.

PS – If Dirksen had not been an influential Senator years ago, Fermilab might not be in Illinois. Dirksen’s party was much less relevant than his influence used on behalf of his constituents in Illinois.

PPS – As to my personal political party bias, in the last 4 USA elections, I voted Perot, Perot, Libertarian, Libertarian. I don’t have much of any preference between the Republican and Democrat parties.

An ILC operating up to sqrt(s) = 500 GeV should be able to “… effectively perform top quark measurements in an almost background free environment …” and study the “… The top-Higgs Yukawa coupling …”, as stated in a Fermilab document at http://www.fnal.gov/pub/news05/TTTPosterSession/TTT-ILC.pdf.

Since it is very likely that LHC will see Higgs at 115 – 200 GeV, it is also very likely that ILC could be very useful to study the Higgs – T-quark system.

I am puzzled why the USA HEP community fails to emphasize such a very likely useful symbiotic relation of ILC and LHC,
but instead
emphasizes speculative things like superstring-related supersymmetry, which might well be ruled out by LHC before the ILC is completed.

Why take a Russian Roulette chance of getting ILC killed by LHC seeing only Higgs at 115 – 200 GeV (plus stuff similar to that already seen at Fermilab),
when
a case can be made that the ILC and LHC are both needed to fully understand how the T-quark and Higgs work within the Standard Model ?

For example, in “Discovering the Quantum Universe” the only mention I recall of studying quark interaction with Higgs is on page 14 (page 18 of the pdf) with respect to the possibility that LHC might find a Z-prime.

Why not play up T-quark interactions with Higgs ?

Not only is it almost certain to be seen at LHC, and therefore real,
but also
there are interesting possibilities that could be publicized.
For one example, look at the T-quark condensate work of Yamawaki et al ( hep-ph/0508065 and hep-ph/0311165 etc ). In addition to being interesting, it might be nice for the USA HEP people to put some emphasis on non-USA work such as that at Nagoya.

Indeed it would make sense to emphasize/advertise the Higgs and T-quark sectors. The T has already been found; tentative evidence for a Higgs at 114 GeV was found at CERN, and a more comprehensive search can be made at the ILC, including precision studies of its properties. Certainly in this regard the extra dimensions stuff is hype.

That sounds sad to me, when such things as the values of the Standard Model parameters are not explained by conventional models

This happened already from the instant we accepted the view of QFTs as a chain of effective theories, because then we also bought the assumption that any unnatural mass (in the sense of ’t Hooft naturality) must come via the renormalization group from the GUT scale, and then we will never be able to predict its mass with experimental certainty, because we will not have enough loops to run the renormalization group downwards with high precision. Of course this is not the whole story, because even with effective theories we can exploit some formulae coming from fixed points; but in general it is true that one has surrendered to the string-GUT-Planck scale.

Let us not get carried away with silliness, just because of the overabundance of (presumed to be) vacuous theories. What did Max Planck predict when he fitted the blackbody spectrum in 1900? To Planck the energy partition into finite cells was a book-keeping device which he could not avoid. It was Einstein who took the existence of light quanta as particles seriously, in 1905. When Dirac first published his equation, his paper had no predictions of new phenomena. It gave a “natural” explanation of spin 1/2 (and a value g=2), but it was already known (but not explained) that an electron has s=1/2 and g=2. Dirac had no immediate explanation for the negative energy states. The realization that the union of relativity and QM requires the existence of antimatter came later (as did the realization that the Dirac eq must be formulated as a second-quantized field operator equation).

Pauli did indeed hew to the principles 1 and 2 above, and dismissed just about any new idea as rubbish. He initially opposed the notion that an electron could have an intrinsic angular momentum, he later rejected Yang-Mills theories (because a nonzero bare mass for the gauge bosons destroys renormalizability). He opposed many other good ideas (including the work of Stuckelberg). Not really a good role model to follow.

Wolfgang Pauli’s letter of Dec 4, 1930 to a meeting of beta radiation specialists in Tubingen:

‘Dear Radioactive Ladies and Gentlemen, I have hit upon a desperate remedy regarding … the continous beta-spectrum … I admit that my way out may seem rather improbable a priori … Nevertheless, if you don’t play you can’t win … Therefore, Dear Radioactives, test and judge.’

Note that the events related to spin happened very fast, during the years 1921-1929, or even within a narrower interval (1923-25?). From the empirical work of Lande on J^2 terms, Pauli believed that angular momentum had an intrinsic bi-valuedness of mathematical origin, and so he rejected the need for a new physical angular momentum of the electron; initially he was looking for the answer in total angular momentum itself. He suggested it was possible that angular momentum had a minimum interval to be integrated over (so that \int j^{-2} dj produces 1/j - 1/(j-1)), or even that any derivative of a function f(j) was forced to be discrete (so that d/dj of 1/j gives the term 1/j - 1/(j-1)). This was in 1923. Then Born-Jordan mechanics showed that quantisation of angular momentum was in multiples of h/2, while Schroedinger mechanics showed that it is in multiples of h. Dirac came to the rescue, giving a wave equation (thus Schroedinger-like) that provides the h/2. And from both ways it became clear that the j(j+1) terms in angular momentum come from the indeterminacy principle and not from the bivaluation; thus Pauli accepted the electron’s 1/2 as a separate idea: the bivaluation of the electron (which in the end comes about because SO(3) is covered by SU(2)), separated from the fact that J^2 has eigenvalue j(j+1).

I am much edified. My previous post was however in response to a preceding utterly foolish comment (now deleted). The fact remains that a new idea frequently offers no clue as to the consequences it will lead to. And one cannot demand that it do so.