Friday, December 27, 2013

The finetuned cube

The kids are almost three years old now and I spend a lot of time picking up wooden building blocks. That’s good for your health in many ways, for example through the following bit of brain gymnastics.

When I scan the floor under the couch for that missing cube, I don’t expect to find it balancing on a corner - would you? And in the strange event that you found it delicately balanced on a corner, would you not expect to also find something, or somebody, that explains this?

When physicists scanned the LHC data for that particle, that particle you’re not supposed to call the god-particle, they knew it would be balancing on a corner. The Higgs is too light, much too light, that much we knew already. And so, before the LHC most physicists expected that once they’d be able to see the Higgs, they’d also catch a glimpse of whatever it was that explained this delicate balance. But they didn’t.

It goes under the name ‘naturalness,’ the belief that a finely tuned balance requires additional explanation. “Naturally” is the physicist’s way of saying “of course”. Supersymmetry, neatified to Susy, was supposed to be the explanation for finetuning, but Susy has not shown up, and neither has anything else. The cube stands balanced on the corner, seemingly all by itself.

Of course those who built their careers on Susy cross-sections are not happy. They are now about to discard naturalness, for without it Susy could hide anywhere, or nowhere, as long as it’s not within reach of the LHC. And beyond the LHC there are 16 orders of magnitude of space for more papers. Peter Woit tells this tale of changing minds on his blog. The denial of pre-LHC arguments is so bold it deserves a book (hint, hint), but that’s a people-story and not mine to tell. Let me thus leave aside the psychological morass and the mud-throwing, and just look at the issue at hand: naturalness, or its absence, respectively.

I don’t believe in naturalness, the idea that finetuned parameter values require additional explanation. I recognize that it can be a useful guiding principle, and that apparent finetuning deserves a search for its cause, but it’s a suggestion rather than a requirement.

I don’t believe in naturalness because the definition of finetuning itself is unnatural in its focus on numerical parameters. The reason physicists focus on numbers is that numbers are easy to quantify - they are already quantified. The cosmological constant is 120 orders of magnitude too large, which is bad with countably many zeros. But the theories that we use are finetuned to describe our universe in many other ways. It’s just that physicists tend to forget how weird mathematics can be.
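For the record, those 120 orders of magnitude can be reproduced with a back-of-the-envelope estimate: take the Planck energy density as the naive quantum field theory expectation and compare it to the observed dark energy density. A Python sketch, with rounded values for the Hubble rate and Omega_Lambda, so take the exact exponent with a grain of salt:

```python
import math

# Physical constants (SI units, rounded)
hbar = 1.054571817e-34   # J*s
c    = 2.99792458e8      # m/s
G    = 6.674e-11         # m^3 / (kg * s^2)

# Naive QFT estimate: one Planck energy per Planck volume
l_planck = math.sqrt(hbar * G / c**3)       # ~1.6e-35 m
E_planck = math.sqrt(hbar * c**5 / G)       # ~2e9 J
rho_qft  = E_planck / l_planck**3           # J/m^3

# Observed dark energy density: Omega_Lambda times the critical density
H0       = 70e3 / 3.086e22                  # ~70 km/s/Mpc, in 1/s
rho_crit = 3 * H0**2 * c**2 / (8 * math.pi * G)
rho_obs  = 0.7 * rho_crit                   # Omega_Lambda ~ 0.7

print(f"discrepancy: 10^{math.log10(rho_qft / rho_obs):.0f}")  # ~10^123
```

With these inputs the exponent comes out near 123; “120” is the conventionally quoted round number, and the precise value depends on where exactly you put the cutoff.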

We work with manifolds of integer dimension that allow for a metric and a causal structure, we work with smooth and differentiable functions, we work with bounded Hamiltonians and hermitian operators and our fibre bundles are principal bundles. There is absolutely no reason why this has to be, other than that evidence shows it describes nature. That’s the difference between math and physics: In physics you take that part of math that is useful to explain what you observe.
Differentiable functions, to pick my favorite example because it can be quantified, have measure zero in the space of all functions. That’s infinite finetuning. It’s just that nobody ever talks about it. Be wary whenever you meet the phrase “of course” in a scientific publication – infinity might hide behind it.
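To make the measure-zero point a little more tangible: a “generic” random path, such as a sample of Brownian motion, is continuous but nowhere differentiable, and its difference quotients blow up like 1/sqrt(dt) as the step size shrinks, while for a smooth function they converge. An illustrative Python sketch, my aside rather than anything in the post:

```python
import math
import random

random.seed(0)

def brownian_quotient(dt, samples=2000):
    """Average |dW/dt| for Brownian increments dW ~ N(0, sqrt(dt))."""
    return sum(abs(random.gauss(0.0, math.sqrt(dt))) / dt
               for _ in range(samples)) / samples

for dt in (1e-1, 1e-3, 1e-5):
    # A differentiable comparison: the quotient of sin at t=1 converges to cos(1)
    smooth = abs(math.sin(1.0 + dt) - math.sin(1.0)) / dt
    print(f"dt={dt:.0e}  Brownian ~ {brownian_quotient(dt):8.1f}  "
          f"smooth ~ {smooth:.4f}")
```

The Brownian column grows without bound as dt shrinks, while the smooth column settles at cos(1) ≈ 0.54.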

This finetuning of mathematical requirements appears in form of axioms of the theory – it’s a finetuning in theory space, and a selection is made based on evidence: differentiable manifolds with Lorentzian metric and hermitian operators work. But selecting the value of numerical parameters based on observational evidence is no different from selecting any other axiom. The existence of ‘multiverses’ in various areas of physics is similarly a consequence of the need to select axioms. Mathematical consistency is simply insufficient as a requirement to describe nature. Whenever you push your theory too far and ties to observation loosen too much, you get a multiverse.

My disbelief in naturalness used to be a fringe opinion and it’s gotten me funny looks on more than one occasion. But the world refused to be as particle physicists expected, naturalness is rapidly losing popularity, and now it’s my turn to practice funny looks. The cube, it’s balancing on a tip and nobody knows why. In desperation they throw up their hands and say “anthropic principle”. Then they continue to produce scatter plots. But that’s a logical fallacy called ‘false dichotomy’, the claim that if it’s not natural it must be anthropic.

That I don’t believe in naturalness as a requirement doesn’t mean I think it a useless principle. If you have finetuned parameters, it will generally be fruitful to figure out the mechanism of finetuning. This mechanism will inevitably constitute another instance of finetuning in one way or the other, either in parameter space or in theory space. But along the way you can learn something, while falling back on the anthropic principle doesn’t teach us anything. (In fact, we already know it doesn’t work.) So if you encounter finetuning, it’s a good idea to look for a mechanism. But don’t expect that mechanism to work without finetuning itself - because it won’t.

If that was too many words, watch this video:

It’s a cube that balances on a tip. If your resolution scale is the size of the cube, all you will find is that it’s mysteriously finetuned. The explanation for that finetuned balance you can only find if you look into the details, on scales much below the size of the cube. If you do, you’ll find an elaborate mechanism that keeps the cube balanced. So now you have an explanation for the balance. But that mechanism is finetuned itself, and you’ll wonder then just why that mechanism was there in the first place. That’s the finetuning in theory space.

Now in the example with the above video we know where the mechanism originated. Metaphors all have their shortcomings, so please don’t mistake me for advocating intelligent design. Let me just say that the origin of the mechanism was a complex multi-scale phenomenon that you’d not be able to extract in an effective field theory approach. In a similar way, it seems plausible to me that the unexplained values of parameters in the standard model can’t be derived from any UV completion by way of an effective field theory, at least not without finetuning. An often-used example is that hundreds of years ago it was believed that the orbits of the planets had to be explained by some fundamental principle (regular polyhedra stacked inside each other, etc). Today nobody would assign these numbers fundamental relevance.

Of course I didn’t find a cube balancing on a tip under the couch. I didn’t find the cube until I stepped on it the next morning. I did however quite literally find a missing puzzle piece – and that’s as much as a theoretical physicist can ask for.

The film on the cube was very clear and suggestive of practical uses too.

This idea of motion viewed this way as "natural" in your take, as in an axiomatic sense. I suggested such a simple idea of motion long ago in an internet relay chat room and got raised eyebrows. A few days later a young physicist came back to apologize, saying he was mad at his teacher, had made A's, and said I was right. I replied I did not know what was taught in the schools.

My intuition still sees something wrong with this picture. What would be its analog in four and greater dimensions? (For that matter, what is wrong with Bode's Law or Kepler's model in this range of searching for refinements and parameters?)

What would four and one center such gyroscope tell us at the points of a simplex?

I keep expecting that cube to rise within a certain range based on differences of centripetal force or spin with additional outside energy as if balanced only on one point. Not the two involved here. A spectrum that ignores all but the six in compactification models shows orthogonality outside the six D space so perhaps a photon spectrum.

I know this may show me naive, and it is OK provided it is allowed informally here and does not detract from your time and work.

In binary terms:
1. A change of motion in no coordinates describes rest, up to a given dimension.
2. But a change in all coordinates results in a linear (diagonal) motion.

Great post! :-) I especially like the example of differentiable functions, and the cube video is fun!

Just a couple of small comments:

"The cosmological constant is 120 orders of magnitude too large"

I am yet to see any formulation of the "cosmological constant problem" that stands up to serious scrutiny. Those 120 orders of magnitude are just folklore, calculated from some outrageous conjectures.

"hundreds of years ago it was believed that the orbits of planets have to be explained by some fundamental principles"

Are you referring to the Titius-Bode law? All planets bar Neptune follow it to within a few percent error (even if you count Pluto among the planets), which is remarkable. Granted, it can be an accident of the initial conditions of the solar system's creation, but the fit between theory and data is just soooooo good that it is tantalizing to try and find a more serious account for this type of fine-tuning.
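The commenter's claim is easy to check. The Titius-Bode rule predicts semi-major axes a_n = 0.4 + 0.3·2^n AU, with Mercury conventionally assigned the n → −∞ slot; a Python sketch with rounded observed values:

```python
# Titius-Bode rule: a_n = 0.4 + 0.3 * 2**n AU, with Mercury assigned
# the n -> -infinity slot (here n=None). Observed semi-major axes rounded.
bodies = [("Mercury", 0.39, None), ("Venus",   0.72, 0), ("Earth",   1.00, 1),
          ("Mars",    1.52, 2),    ("Ceres",   2.77, 3), ("Jupiter", 5.20, 4),
          ("Saturn",  9.55, 5),    ("Uranus", 19.19, 6), ("Neptune", 30.07, 7),
          ("Pluto",  39.48, 7)]    # Pluto lands in the slot Neptune misses

def bode(n):
    return 0.4 if n is None else 0.4 + 0.3 * 2**n

for name, a_obs, n in bodies:
    a_pred = bode(n)
    print(f"{name:8s} observed {a_obs:6.2f} AU   predicted {a_pred:6.2f} AU   "
          f"error {100 * (a_pred - a_obs) / a_obs:+6.1f}%")
```

The errors come out below about 5-6% for everything except Neptune, which misses its slot by roughly 30% while Pluto lands almost exactly in it, consistent with what the comment says.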

Finally, you didn't mention the too-small entropy at the Big Bang, the arrow of time, and related stuff. That can also be regarded as a form of some freak fine-tuning. :-)

I would like to see whomever made this cube work out the dodecahedron model and its physics balanced on a point of singularity stability expressed as 2D step motion. The 10-fold Newton symmetry as a group shadow of higher spaces (or alternative ideas of the nature of substrates) applies to fine structure constants and powers (thus 60 or 120) with the permutations of 5 things. Two such dodecs could teach each other to mirror the path data, or align many of them to do the same path. Bernoulli's spirals and all that. But not just Lie groups but Klein's The Icosahedron and Equations of the Fifth Degree. Defects intelligibly mapped into a plane (or branes) of some sort.

This Christmas the grandson had toys strewn everywhere and I found an isolated cube of six colors which his Dad said was a puzzle to form a 3x3x3 Rubik's cube in surface appearance. But it also had parts, not just one cube, and all those single ones were only one of 30 combinations. The problem to solve here is such a cube with all the combos (some superimposed on higher space). That, and seeking unity of the physics that combines adjacent discrete and analog views. Sorry if this was not addressed to me.

"I am yet to see any formulation of the "cosmological constant problem" that stands up to serious scrutiny. Those 120 orders of magnitude are just folklore, calculated from some outrageous conjectures."

Giotis, on the contrary. If infinity is potential infinity, signs just simplify equations, and double or half infinity makes perfect sense. Things may sound complicated, as a Baroque instrument plucked by a murder of crow quills, or simple, like a hammered piano soft in the overtones in our fine-tuned microtones. To each other we may seem out of tune. Now, what are we to do with dead strings? Some dreams are equally bizarre, while a few who hear can experience perfect pitch.

"So if you encounter finetuning, it’s a good idea to look for a mechanism"

Eight cylindrical gold-plated test masses define a cube, one vertical face vs. the opposite. The right side would be four solid single crystals of space group P3(1)21 alpha-quartz; the left side space group P3(2)21 alpha-quartz. This opposes 6.68×10^22 pairs of opposite shoes (pairs of enantiomorphic unit cells) in a geometric Eötvös experiment. Observe whether spacetime is more than 5×10^(-14) difference/average chiral anisotropic toward fermion quarks (not boson photons). Open image in a new tab to see the (meter-long) 20 micron suspending filament.

@Phillip Helbig, I read the paper by Carlo Rovelli. I thought it was interesting that not once in his paper did he cite the principal problem in calculating the cosmological constant. That is, it must be true by pure logic that at the time of the genesis of particles, most notably nucleons, the vacuum energy density has to approximate the energy density of those particles.

Today the energy density that we call lambda is many, many orders of magnitude less than that. That change in measured energy density is incontrovertible. Why do physicists avoid spelling this out as the two main parameters of the problem, and a function that allows for transformation of that energy density in space and time between those two points?

It's not that hard a problem. The move to make lambda equal zero was always as idiotic as making it 10^120 greater than we measure it now. You can't just add extra dimensions, or negative energy, or any other ad hoc invention. You have to accept that there is something in particle physics relating to kinetic energy that changes the energy of particles as they were spun out from the hot dense past. That energy is now an added self energy that all massive particles possess now that they didn't before they were accelerated outward.

My hunch, and it's a pretty good one, is that it exists as increased angular momentum of those particles as they were flung outward. I think Bee's former female cohort at Perimeter is on the right track when she thinks gluons include photons. I agree, and think they possess much more energy now than at their genesis. She's a smart cookie. Look up asymptotically safe gravity on the Pirsa website.

The reason that physics seems to be sinking into a morass of multiverses, anthropic arguments, and endless philosophical discourse is that it is built on quicksand. The current ontology--everything is a field--is false. A field is something that assigns a value to every point of spacetime. You have to have a spacetime before you can have a field. What's spacetime? Nobody knows. There is no generally accepted model. As soon as there is a good model for spacetime, all of the current problems will be solved easily.

Based on recent physics the following possibilities need to be examined:

1. The explanation for why the cube stands balanced on its corner is that it is perfectly supported by dark matter on all sides. It is like immersing a cube in a heap of sand: it will hold any position, the sand being the equivalent of dark matter.

2. Another explanation is extra-dimensions. A 3D cube balanced on a corner seems extraordinary ONLY if you restrict to 3D space or 4D spacetime. With extra dimensions things like this can easily happen. Close your eyes, move on from what the world seems to be, keep adding dimensions one by one until it all explains itself.

3. Yet another explanation is the multiverse: you have cubes resting on corners getting support from other cubes. It happens through gravity; they are effectively all leaning on each other, corner to corner (without touching). A universe that exists independent of other universes is nonsense ... ever hear the saying "No man is an island"? It extends to universes too!

Of course in the marketplace of naysayers and wild but entertaining metaphors, we can debate what is outside the standard theory box as well as the box itself. Or we can see a little beyond current theory and observation from experiments, without radically undermining what progress in science has been achieved.

Sabine and Uncle Al in these comments seem to me to take such a higher stance of practical views. If we catch up to the level of distinctions these theorists have sensibly seen, the answers will seem so simple, even while formulas like the one defining the fine structure constant hold such truths, staring us right in the face for a long time now.

If no one, subjective or not, is an island, how can there be an equivalence idea, and gravity a force or rubber-sheet geometry?

This work of theory is very hard. Looking in a flat puzzle for a piece could be more than a theorist can be asked to do for a unified theory that has to deal with singularities as if they were defects. A small fragment of an eggshell may describe the shape of the whole, or the whole, in some other geometry of excluded perpendiculars or parallels, can find a defect and thus the shape of a missing piece.

I make a distinction like this for "natural" (my term) and representative dimensions. But this topic concerns our unities and measures as physics. (To which, as our host is away, I hope I have not added to the chaos.)

Is the electron perfectly spherical, or for that matter the commercial belts in star systems? Do we gain insight merely by making nuclei or planet orbits spirals explained by ellipses, say for bare charges?

If we insist the ground of our pet models is complete, to the exclusion of a dynamic mirror of the alternatives, we are defective in the total picture, and so do not even suspect that we do not understand.

Our gods can load the dice as well as compute the universe as a crap game.

There can be, in the simplicity of duration or minimum distance, divided so halved or doubled by two of nature's constants, c or h as unity, an apparent hierarchy wherein outside the zero or 120 scales we find in these excluded stances 1024 and some null singularity, and so on for 12D embedded in 16D up to 9 natural dimensions.

What I tried to express in my blogpost is this. If you encounter a situation in which something is stable that you expect to be unstable (balancing on a tip), look for a mechanism but don't expect and don't require that to be possible without finetuning, because we already know it will not be possible anyway.

Forget about the Higgs and instabilities for a moment and recall the idea of ADD large (flat) extra dimensions. That doesn't solve the Hierarchy problem, but it reformulates it, and that made it a very useful approach. Similarly, the idea behind the split-fermion scenario was "Hierarchies without Symmetries". That doesn't do away with the finetuning (you're left to wonder why the potential looks as it looks), but now you have a mechanism and something to test and something to work with. If you instead are faced with an 'unnatural' finetuning and just say 'anthropic principle' you don't learn anything. Best,

I was referring to Kepler's Platonic solids. You're right in that the 120 orders of magnitude are typically based on an estimate that just puts the Planck scale there as cutoff, but there do exist rigorous calculations. Will try to dig out the reference at some point. Best,

Regarding the link, I didn't read it too carefully, I just thought it would be better than linking to a paper that nobody would look at anyway. If you have a better reference, I'll be happy to add a link. Best,

In General Relativity, spacetime is a differentiable, orientable manifold of dimension four with a metric of Lorentzian signature. I agree that we do not know fundamentally what spacetime is, which is why I work on what I work on, but your claim that the statement "everything is a field" is false is itself false. It is a very good theory for everything that we have observed so far, it's just that it fails us in certain extreme limits. Science isn't about truth, it's about usefulness. Best,

Re Unknown's comment, see Einstein's 1929 presentation on field theory at http://www.rain.org/~karpeles/einsteindis.html . See this bit where he's talking about the electromagnetic field and the gravitational field:

"The two types of field are causally linked in this theory, but still not fused to an identity. It can, however, scarcely be imagined that empty space has conditions or states of two essentially different kinds, and it is natural to suspect that this only appears to be so because the structure of the physical continuum is not completely described by the Riemannian metric."

According to Einstein, a field is a state of space. Not spacetime. Space.

Why do you ask me a question if you then forbid me to answer it? General Relativity is widely believed to fail if the curvature comes into the Planckian regime, R^2 \sim m_p^4, pls distribute indices suitably. In that regime, quantum corrections become important, but since we don't have a theory of quantum gravity, etc etc. Best,

Physics can be testably 5×10^(-14) wrong toward mass at the starting line. Newton assumed 3×10^8 m/s was infinity, 6.626×10^(-34) (kg-m^2)/s was zero, plus Big G. GR and QFT do not assume two at a time, yet all three together are irreconcilable. Something is wrong where physics need not look, since it cannot be there by (defective) postulate.

Phillip, the spirit of Carlo's paper (which I just now read) is OK, but I do not see that it resolves any of the confusions it feels it has found in a neutral balance of concepts of unity as antinomies (Kant), let alone read the thoughts of Einstein on this matter.

If 120 was the greatest wrong prediction, then my 1024 is greater. (If anyone cares why I did not say 4096, you can get back to me on that.)

As far as Pirsa (which I have not read), there are complements more general than the models in the high level of generalities discussed here.

Such a simple device, like the magic of a gyroscope balanced on our fingertip. One would believe Back to the Future hoverboards are not far away.

But what of Piet Hein's superegg, balanced on any point? That is determined by absolute x, y values squared = 1. Why does it roll down an inclined plane? Does an egg really stand vertical at the equinox, and other myths?

Symmetry in a cube of unlimited density or symmetry unfolded is after all how our brain works as unity.

Ashish, the non-small-scale as a paradox (Tarski) may apply. For it seems you can take a sphere's volume, divide it into parts, reassemble them, and get a bigger sphere. From one universe might we not make many, and is that the question here you seem to demand an answer to? Moreover, at the not-so-large scale we can divide it into five parts but do not know how on this scale to fit them back together. This seems to say that at that scale abstract mathematical space is fundamentally discrete in measure, like it stands on a point in sand, yet seems to have an average limiting grain size for which we need the right aperture at the now of a light cone, so that it does not clog the falling sands of time. So we make better clocks to deal with navigating issues of cyclic latitude and longitude. But even the Polynesians could map the oceans and follow the stars.

/* Whenever you push your theory too far and ties to observation loosen too much, you get a multiverse. */

Nope, what you'll get is just a disagreement of observation with theory. That's nothing strange; we have already met this many times in the past. The disagreement of the epicycle model with observations didn't imply a new universe was there - just that the epicycle model was wrong/incomplete.

So why should the failure of some other theory be an exception? The multiverse is just a hyped, politically correct denomination of the disagreement of theory with experiments, which helps contemporary theorists pretend infallibility before the lay public.

I can assure you, many people (including me) already know why it's so. For example, Burkhard Heim derived the masses of most particles forty years ago with a precision that is still unachievable with contemporary methods by many orders of magnitude. Contemporary physicists just pretend that the better solution doesn't exist, as it helps them in their never-ending requests for new jobs and salaries from taxpayers, as R. Wilson recognized and named pregnantly many years ago.

datapacrat.com/False/TESLA/X/PHYSICS/PHYSIC~1.HTM

Why can you not understand that contemporary scientists are a culture of parasites on human society? Even parasites can be frenetically active, but this doesn't make them less harmful to the host that is feeding them. This video may help you with it.

youtube.com/watch?v=eNU3MLqyzPk

Thomas Bearden dedicated his life to the actual research of quantum gravity phenomena - not just your illusion of it.

ZEPHIR, you have a most interesting blog, but if you think galaxies are getting smaller and taking up less space, and that this is the only reality by which to judge which physicists are confused, you are going to have to find them a hotter place.

On what can we ground physics if it is left to those in confusion to resolve the role of science in this world, for other core stances and ideal points there or not really in the distance? To make it a better world, or to bask in our universal higher sense of its mystery?

Opposition criticism may or may not be a useful tool for inquiry. Costly big science, or the lonely work of individuals in the search for enlightenment and useful objective fact. Just look back at the articles archived and the predictions made before the "discovery" of the Higgs in the blogs, as our exploration has evolved along with the feeling that our conversation has come closer to entropy, with less of something meaningful to say.

I could give it a try (but not to Sabine, because I would not be surprised if she replied with better theory to some commenters' confused rhetorical question-answers).

So, Uncle Al. If a snub cube has an exact mirror image structurally, could it not really be one semiregular solid hard for us to see as one? What sort of magic is this that tells us there is more than meets our mind's eye about the Higgs as such a theoretical and constructible geometric structure? Your long-ignored experiment essentially applies beyond energy differences observed in large organic chiral molecules to the very foundations of space-spacetime.

But if we made a snub cube and set it "fine tuned" on a point, would it be confused as to which way to spin or to move?

Where in my question (still unanswered by you) was there a request that you "make up" a wrong answer? You have twice replied without answering my question below.

Question stands: Please specify non-small-scale failures of GR.

See, I am very interested in theories vs. observation (call me old-fashioned!). Why won't you explicitly say that GR has a perfect record (outside the Planckian regime)? Just answer and say that GR matches observation 100% on non-small scales. Does it?

If we were constructing our model with Lego blocks, then using a bit of copper wire to make an awkward connection would seem “unnatural.” Perhaps the open question is whether using a different sort of Lego would eliminate the need for wire.

Our universe seemingly arises from the accretion and interplay of distinctions. So, is it most naturally to be treated as a collage or is it still somehow of a single piece entirely?

I remember a movie in which the hip young physics teacher quotes Einstein to his class, something to the effect that energy is never still. I have not been able to locate the statement, but perhaps this is sufficient reference.

Following along those lines, potential energy is one of the two principal elements of the Lagrangian, and the search for the grand equation has largely been a search for some adequate and encompassing Lagrangian (loose quotes on this). So it is not as if a discussion of the nature of potential energy is peripheral.

While the notion of potential has clearly been formalized, it has always seemed a bit of artifice to me, something like a cube balanced on its corner needing further explanation. Is there some active principle that keeps restless energy poised in so many manifestations?

@Don Foster, I know it wasn't addressed to me, but you made very good points. It seems we might (can?) still implement the Lagrangian on a cosmic scale. I like to think of Lambda as the potential energy of the cosmos. Rather than thinking of energy as always restless, it might be better to think of it as always changing form.

At the beginning the potential, Lambda, was big but not infinite. There is a tendency in physics to equate that restless, fluctuating quality of energy with infinite potential. I think that's wrong, because if it were true we would not be living in a low-Lambda potential at the present time, with an equivalent energy of about 3.5 degrees Kelvin. Better to stick to the Buddhist idea of energy as always changing form. And the overarching picture of the changing form is from potential to kinetic energy as the universe ages.

What you said about a cube balancing on its corner in a high potential energy environment was really interesting to me. If kinetic energy were related to spin (aka angular momentum), then it seems appropriate that the first particle to make its appearance in a high potential energy environment is a spin-zero particle.

It may be like Peter Woit envisions - there is no hierarchy problem. That is, there is no matter that exists past the energy of the Higgs and nothing that would require it. In other words, once the Higgs manifested itself as mass and became part of massive particles, the kinetic part of the cosmos came into existence, because the Higgs allows particles to exhibit kinetic energy as they were accelerated outward. And, of course, the potential energy, Lambda, was vastly depleted by exerting that acceleration.

I agree that, at this point, if there is no SUSY, we are at a stage where we see the cube balanced but do not see the gyroscopic parts inside. But once we see the gyroscopic parts, I am not sure why you call it fine tuning. After all, every phenomenon in science requires some specific mechanism. So once we find the mechanism, something else if not SUSY, our problem would be solved. We would not worry about fine tuning, or at least not discuss it. Have I misunderstood your point?

I realize that no one will listen at this point, but the fine tuning controversy, the hierarchy problems and the vacuum energy density crisis are all swept away forever in a natural manner and with a natural resolution if one dares to question how the Planck mass, length and time are calculated.

http://arxiv.org/abs/0901.3381

The answer is simple and obvious, but totally unacceptable to those who treat the conventional Planck scale as sacrosanct.

Robert, you have posted many times how important prediction is for science in a model. If in dimensional analysis the dimensions are seen as one unity, and that as, say, cubes divided by cubes, then that part of your equations is unity. These are the sort of equations with answers long staring us in the face. Your arXiv PDF shows me how difficult seeing the obvious, or beyond it, is in this matter. This leaves your Planck mass a unity, but not necessarily independent of the rest of the equation; the same for the concepts here debated as non-issues or not, as to whether they are real or vanished.

The same goes for the numerical parts of the equations, 2 or 1/2 in particular, so important at least classically for our distinction (the virial) between potential and kinetic energy. So in the elements in three-space, of say 120 possible in four-space, could we not predict that element 48 (Ag) is where fusion is balanced with fission at a point of stability? That is, if this were not already the case?

Yes, you missed my point. Please re-read the paragraph that starts with "We work with manifolds of integer dimension..." and the following paragraph to understand what I mean with finetuning in theory space. Best,

Hi Bee, I see your point. But I still disagree with your statement "that apparent fine tuning deserves a search for its cause, but it’s a suggestion rather than a requirement." I think explanations should be requirements, and this is how science progresses. To give a possibly absurd example from classical physics: airplanes take off at a certain speed, and there is a maximum speed at which a train should go around a curve. These examples look funny now because the physics was known long before the technological devices were made. In the absence of physical theories of lift and centripetal force, someone would have called it fine tuning!! But once the physics was known, nobody called it fine tuning!
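The train example indeed reduces to one line once the physics is known: on a curve of radius r banked at angle theta, the speed at which no sideways force acts on the passengers is v = sqrt(g·r·tan(theta)). A minimal Python sketch; the radius and bank angle are made-up illustrative values:

```python
import math

def balanced_speed(radius_m, bank_deg, g=9.81):
    """Speed at which a curve banked at bank_deg needs no lateral force:
    v = sqrt(g * r * tan(theta))."""
    return math.sqrt(g * radius_m * math.tan(math.radians(bank_deg)))

# Made-up illustrative values: a 1 km radius curve banked at 5 degrees
v = balanced_speed(radius_m=1000.0, bank_deg=5.0)
print(f"balanced at {v:.1f} m/s = {3.6 * v:.0f} km/h")  # ~29 m/s
```

Nobody calls the result "finetuned", because the mechanism relating the numbers is understood.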

Kashyap, I think your comment brings us back to older links from commenters in a lively discussion of just how far mathematics can cover the physics. Beyond consistency, perhaps we should be fine tuning the mathematics. This fine question of theory did not stop the builders of railroads from using arc tan to design safe curves for tracks. Imagine this cube as a tethered helium balloon in a passenger car on such a curve. Considering the forces, instead of being vertical, why does it lean the unexpected way? Fine tuning or not, we can think outside the box (or cattle) cars if we want. I hope I have not missed her point.

The referees always first say the Planck scale is defined for all time and cannot be defined in a different manner - end of story.

When I counter with: "But what if G has a different value in the microcosm, and the conventional definition of the Planck scale 'illegally' mixes an atomic-scale constant (h-bar), a completely scale-invariant constant (c) and a stellar-scale constant (Newtonian G)?"

The referee then says: "You would have to show empirical evidence for the putative very large value of G for the microcosm."

Then I say: "No one can measure this constant WITHIN a microcosm system, and so the value of G is an empirically-based AXIOM (see an entire website and 50 published papers for the empirical evidence)."
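For reference, the conventional Planck scale being argued about is built from exactly those three constants. A minimal sketch, using the standard textbook definitions and approximate measured values, makes the "mixing" of quantum, relativistic and gravitational constants explicit:

```python
import math

# Conventional Planck scale: combines h-bar (atomic scale), c (scale
# invariant) and Newton's G (measured at laboratory/stellar scales).
# Standard definitions with approximate SI values.

hbar = 1.055e-34   # reduced Planck constant, J s
c    = 2.998e8     # speed of light, m / s
G    = 6.674e-11   # Newton's constant, m^3 / (kg s^2)

planck_mass   = math.sqrt(hbar * c / G)       # ~ 2.2e-8 kg
planck_length = math.sqrt(hbar * G / c**3)    # ~ 1.6e-35 m

print(f"Planck mass:   {planck_mass:.2e} kg")
print(f"Planck length: {planck_length:.2e} m")
```

The commenter's point is that G enters under the square root, so any rescaling of G in the microcosm would shift these numbers directly.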

I am just asking people to consider a different assumption about G and seeing where it leads. And it leads to a resolution of the VED crisis, among many other things.

At this point it is game over for the paper, because the referee can refuse to entertain any idea that conflicts with one's most cherished assumptions, like an absolute G.

We are fortunate that there was once more willingness to consider alternative sets of assumptions (think special relativity, for example) to see where they would lead, regardless of the status of the author.

Would Helbig like to discuss the scientific aspects of the paper, without repeating the hackneyed red herrings of recent memory?

Ashish, I had a detailed reply, if you mean the book by that Fermilab author on the LHC. What do you have in mind that could be more advanced than the discussions here? In the chat science forum he gave us some of his time. The moderators deleted my comment where I bested him on two arguments. But being an open-minded scientist, he went from saying the Casimir force had nothing to do with the cosmological constant to urging young physicists, when asked what area to pursue, to research this very area for their careers. The second argument got me banned. So what is it in us that, instead of working out the science objectively and the philosophy involved, we waste time and effort not playing nicely? That superconnected supercomputer at room temperature inside our skulls just might be worthy of a little more study, yes?

@L. Edgar Otto: First, why a helium balloon tilts the opposite way is well understood. Simply put, since it is lighter than air and is surrounded by air, it behaves like a negative-mass object. If you do not like negative mass, you can work out the details of buoyancy, weight and centrifugal force to get the same result. As for fine tuning math, I do not know. Today we have to use the math we know today, which has worked in so many different cases until now. If someone like Bee comes up with a new explanation with new math then, of course, we would have to accept it.
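The tilt angle in that explanation is easy to work out. A minimal sketch, with illustrative (made-up) numbers for the car's speed and the curve's radius: in the car frame the effective gravity is the vector sum of true gravity and the centrifugal acceleration; a pendulum hangs along that vector (leaning outward), while a helium balloon, being lighter than the surrounding air, points opposite to it (leaning inward) by the same angle:

```python
import math

# Illustrative numbers (assumptions, not from the discussion above):
g = 9.81     # gravitational acceleration, m/s^2
v = 20.0     # car speed, m/s
r = 100.0    # curve radius, m

# Centrifugal acceleration in the car frame, and the resulting tilt of
# the effective gravity vector away from the vertical.
a_centrifugal = v**2 / r
tilt = math.degrees(math.atan(a_centrifugal / g))

print(f"tilt from vertical: {tilt:.1f} deg "
      "(pendulum leans outward, balloon leans inward)")
```

At 20 m/s on a 100 m curve the tilt is a very noticeable 20-odd degrees, which is why the effect is so striking in practice.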

Kashyap, that sounds to me an informative, sensible reply, and perhaps a deeper question of if and where we should take things like negative mass or distance seriously. The helium balloon was an explored mystery not that long ago (here Ashish is right, my wisdom is out of date), but there are so many such mysteries still not resolved as I Google my past reading of science in my brain. Einstein would have been 50 years behind in electromagnetism if not for Ingles. Einstein was 150 years ahead of his time. I cannot say the same for our era. Now how do you explain this :-) . Imagine a bare naked singularity hanging on an infinite pendulum, projected out by a negative index of refraction from inside a cube such as this, one with a surface we cannot tell is a mirror ball or a wrecking ball. The searching for defects paintshopped over for continuity on the large scale is a powerful new method of enquiry. I have some reasonably useful math of an assembly-code nature. But must I accept it as a complete description of physics? What happens if such a helium balloon appeared in the phase-shift universe when there were only two elements in all the stars? And what about the next shift?

So much has been said about the "fine tuning" of the cosmological constant, but anyway... It is well known that the naive calculations of QFT lead to a way too high vacuum energy density. "The universe would not even reach to the moon," as Pauli expressed it. But the striking thing is that if one takes the square root of this number, one pretty much gets it right. Does that look like fine tuning or an accident? To me, NO. Therefore, instead of endless philosophical discussions, I think it would be more helpful to understand what that square root means. Interestingly, the relevance of taking the square root of the "naive" cosmological constant was already realized by Sorkin a long time ago: http://arxiv.org/abs/gr-qc/9706002

In my opinion the most pressing problem of modern (fundamental) physics is to understand what dark energy is. As long as we do not know this, I guess we are "kinda stuck".
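To put a number on the mismatch: a rough order-of-magnitude check with standard values (the square-root reading of it is the commenter's interpretation, not established physics) recovers the famous ~120-orders-of-magnitude discrepancy between the Planck-cutoff vacuum energy density and the observed dark energy density:

```python
import math

# Standard constants (SI):
hbar = 1.055e-34   # J s
c    = 2.998e8     # m / s
G    = 6.674e-11   # m^3 / (kg s^2)
H0   = 2.2e-18     # Hubble constant, 1/s (~68 km/s/Mpc)

# "Naive" QFT vacuum energy density with a Planck-scale cutoff:
rho_planck = c**7 / (hbar * G**2)                  # ~ 5e113 J/m^3

# Observed dark energy density: ~0.7 of the critical density.
rho_crit = 3 * H0**2 * c**2 / (8 * math.pi * G)
rho_obs  = 0.7 * rho_crit                          # ~ 5e-10 J/m^3

ratio = rho_planck / rho_obs
print(f"naive / observed ~ 10^{math.log10(ratio):.0f}")
```

The ratio comes out near 10^123; the square root of the corresponding count of Planck-scale degrees of freedom is what the 10^244 versus 10^122 numerology in the reply below this comment refers to.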

MarkusM, From a philosophy overview what you say makes clear sense. The universe as represented in a plane (brane), or its xy boundary factors, is the issue of what is a fractal or holographic interpretation of the data. As I was saying.

Edgar, "fractal of holographic interpretation of the data". Well, CDT for instance predicts that spacetime is a fractal at small scales, having a spectral dimension of 2. If I naively assign harmonic oscillators to such a spacetime, instead of the 10^244 of QFT in 4 dimensions I get 10^122 of them all in all, and the right C.C. (and the holographic principle for free). Pure numerology, I know, but the point is that there seems to be room for explaining the C.C. - no fine tuning needed here. Best

L. Edgar Otto, Physicists are a highly subjective and emotional group and, yes, as you mentioned, physicists are particularly adept at deleting opposing views. But as Einstein hinted when using the term "violent opposition," physicists can become pretty violent when confronted with reality. I was at one dark matter conference and asked a gentle question: "Suppose, hypothetically only, that there is no dark matter. If this were true -- hypothetically only -- where does physics go then?" All hell broke loose. Many shouted: "How dare he? This is blasphemy!" They surrounded me and shattered my laptop against the wall. A bunch at the back were screaming to each other: "What is the number for 911?" The physicist in the cowboy hat at the back said "Everyone down," and started shooting. Luckily I followed his instructions too, but the blackboard became Swiss cheese. I escaped through the window, but not before the persons shouting "Get his funding" took my wallet away.

This is a delightful post. Yes, we particle physicists did expect to see hints of what we imagined would explain the fine-tuning of the Higgs mass. Some people believed in naturalness more than others, but no one argued that the LHC should not be built because they did not believe in naturalness.

Your examples are excellent. The Platonic-solids basis for the planets occupied bright people for many years - yet we know today that it was a complete fallacy. Our notions of electroweak symmetry breaking probably have a better basis in fact, but they are still just notions. As you have pointed out in other posts, human beings like to find patterns and will investigate any anomaly in order to get a clue as to what is going on. Sometimes the anomaly really does bring insight (for example, the near equality of the proton and neutron masses), and sometimes it does not (for example, the closeness of the mass difference of the D* and D mesons to the pion mass). We can’t know until we investigate.

We expect the value of the Higgs mass (as the difference between two large numbers) to bring insight. I believe you are pointing out that it might turn out to be more like the D*-D mass difference than the p-n mass difference, i.e., there is no deep symmetry here but only a feature of a richer panorama of particles.

To risk another analogy: the solid angle subtended by the moon is almost exactly the same as the solid angle subtended by the sun. One could imagine a contemplative type of person thinking hard about why that should be the case, some time before the modern era. We know, however, that this is just a coincidence that does not serve to enlighten us about our solar system, exoplanets or about bigger questions of planetary science.
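For the record, the coincidence is easy to check with standard mean radii and distances for the Moon and the Sun:

```python
import math

# Standard mean values (km); angular diameters as seen from Earth.
moon_radius, moon_dist = 1737.4, 384400.0
sun_radius,  sun_dist  = 695700.0, 1.496e8

moon_diam = 2 * math.degrees(math.atan(moon_radius / moon_dist))
sun_diam  = 2 * math.degrees(math.atan(sun_radius / sun_dist))

print(f"Moon: {moon_diam:.3f} deg, Sun: {sun_diam:.3f} deg")
print(f"ratio: {moon_diam / sun_diam:.2f}")
```

Both come out near half a degree, equal to within a few percent; close enough to give us total solar eclipses, yet, as noted, it teaches us nothing deep about planetary science.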

Again, I like this post. It helps me think in a slightly different way about where particle physics might go, if indeed the LHC more or less disproves theories meant to resolve the hierarchy problem.

BTW, I see that I chose two examples in particle physics (p-n and D*-D-pi mass differences) that are striking only in a quantitative way; I am looking at small differences of large numbers and so I am focusing on numbers as physicists almost always do.

The “cubli” metaphor is especially apt. A cube balanced on a corner is a striking example of fine-tuning, yes. If we manage to look inside the cubli, we will learn the reason why it does so. But the explanation will not involve the symmetries of the cube or of gravity, at least not in any direct manner. Rather, a carefully crafted mechanism enforces a stable equilibrium where an unstable one is expected. Upon understanding that mechanism completely, we see there is a kind of fine-tuning in the way the parts and pieces are put together and the way that they perform their functions (it is not just a box of random mechanical junk), but that “fine-tuned-ness” has nothing to do with the striking behavior of the cubli; rather, it reflects the craftsmanship and engineering skills of the people who made it.

Returning to the smallness of the Higgs mass, we might one day understand that it arises as a feature of a theory, similar to the way that the D*-D mass difference arises from non-perturbative QCD (as explained by lattice calculations), with only an indirect connection with Lambda_QCD, alpha_S, SU(3)c and the charm quark mass.

BTW, there is also a gentle pun with the cubli example: what better place to find a fine-tuned mechanism than in the land of the makers of exquisite watches and clocks, with finely-tuned gears, springs and counterweights…

You distinguish between naturalness or fine tuning in the values of parameters versus naturalness in the structure of theories, and claim that people are too focused on the former and tend to forget about the latter. But I would argue that this is only "natural". The reason is that we have some hope of understanding the consequences of adjusting the values of parameters. (An example is Lambda - it's simple to use linear cosmological perturbation theory to see what would happen if it had a different value. Of course the effects of all parameters wouldn't be so easy to determine.) But the effects of changes to the mathematical structures are, I would expect, generally much harder to decide.

The problem is this: to *know* that the restriction to, e.g., differentiable functions is unnatural or involves fine tuning, we must first demonstrate that viable alternative theories exist based on non-differentiable functions. Then the fact that the differentiable functions actually present in our world are measure zero in the larger space would be relevant. But no one has demonstrated this. Indeed, I would not be surprised if no consistent, viable theory could be constructed using non-differentiable functions.

(Above, "viable" presumably must mean capable of supporting observers able to ask about naturalness. Of course, deciding whether such observers radically different from us might exist only adds to the difficulty I described above.)

You wrote "There is absolutely no reason why this has to be" about such theoretical restrictions. But the lack of a consistent, viable theory would be a reason. Of course we don't know (yet) that we lack such theories. But we don't know that they exist, either. I'd expect that it's much easier to obtain a consistent, viable theory by varying a parameter from our actual values than by performing theory-structural changes. So your argument that we already have examples of unnaturalness doesn't seem to hold water.

PS - for a non-particle physicist, what's plotted vs what in these "scatter plots", and how does this relate to your point about anthropics?

Jim et al, Conceptually we may ask if the matrix or wave versions of QM are ultimately different, as well as the various models of relativity. In this sense the anthropic idea itself is not relevant, or needs to be itself fine tuned. So we ask, with the best tool we have, if some physical or abstract space is differentiable, or if an integration gives us an answer as stability. We can discuss haunting coincidences for what we can construct, even if we do not see the same patterns apply to smaller scales. For a simple version of this sort of primitive geometric numerology, consider the deltahedra symmetries. Defects appear in the list of them, in the sense of what is flat versus spherical tops, as well as, in the simple constructable range of integers, what is even or odd. Between the tetrahedron and the icosahedron (4 to 20 triangular faces), 6 triangles would be flat, or nearly so if our model were a little less Euclidean. So how is it there is no 18-faced deltahedron? I find it remarkable how, when these objects are taken together and we add up the parts, the numbers work out and self fine tune. Eddington's monomarks are really "polymarks" of hyperdimensional volumes. But this does not mean I am a Platonist, or that proofs of undecidability are the last word. We can find echoes of how we apply chiral models as central to physics herein too. Nature doubles her surprising generational hierarchies intelligibly. Sabine's "Scientific Phenomenology" has proven most helpful to bring a lot of models together. But while the standard forces make more sense, what gravity is still eludes us. Happy New Year!

It seems this post followed a previous one, and that you were on a train of thought here? I get a sense of the balance regarding the cube, in fact maybe even gadgets in the cube, for 3D printing? :)

In 1998, observations of type Ia supernovae also suggested that the expansion of the universe has been accelerating since around a redshift of z ~ 0.5 (see "Accelerating universe").

The question of the flatness problem and an accelerating universe? What is natural to this expansion of space? So through corroboration, the naturalness of the universe produces opinion on what the universe is doing, and asks for phenomenological approaches.

Dear Bee, While the Standard Model is almost certainly an effective theory only, with new Planck-scale physics if not new physics at a lower energy, is the mathematical significance of it being a perturbatively renormalizable theory understated?

What I mean is: if the divergences in the perturbation expansion miraculously cancel, is it totally implausible that the quadratically-divergent-with-cutoff-mass terms also miraculously balance out?