Pages

Sunday, January 13, 2019

Good Problems in the Foundations of Physics

Look at the history of physics, and you will find that breakthroughs come in two different types. Either observations run into conflict with predictions and a new theory must be developed. Or physicists solve a theoretical problem, resulting in new predictions which are then confirmed by experiment. In both cases, problems that give rise to breakthroughs are inconsistencies: Either theory does not agree with data (experiment-led), or the theories have internal disagreements that require resolution (theory-led).

That’s an oversimplification, of course, and leaves aside the myriad twists and turns and personal tragedies that make scientific history interesting. But it captures the essence.

Unfortunately, in the past decades it has become fashionable among physicists to present the theory-led breakthroughs as a success of beautiful mathematics.

Now, it is certainly correct that in some cases the theorists making such breakthroughs were inspired by math they considered beautiful. This is well-documented, eg, for both Dirac and Einstein. However, as I lay out in my book, arguments from beauty have not always been successful. They worked in cases when the underlying problem was one of consistency. They failed in other cases. As the philosopher Radin Dardashti put it aptly, scientists sometimes work on the right problem for the wrong reason.

That breakthrough problems were those which harbored an inconsistency is true even for the often-told story of the prediction of the charm quark. The charm quark, so they will tell you, was a prediction based on naturalness, which is an argument from beauty. However, we also know that the theories which particle physicists used at the time were not renormalizable and therefore would break down at some energy. Once electro-weak unification removes this problem, the requirement of gauge-anomaly cancellation will tell you that a fourth quark is necessary. But this isn’t a prediction based on beauty. It’s a prediction based on consistency.

This, I must emphasize, is not what historically happened. Weinberg’s theory of the electro-weak unification came after the prediction of the charm quark. But in hindsight we can see that the reason this prediction worked was that it was indeed a problem of consistency. Physicists worked on the right problem, if for the wrong reasons.
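The counting behind the anomaly-cancellation argument can be illustrated with a toy check. The charge assignments below are the standard ones, but reducing cancellation to a sum of doublet charges (quarks weighted by three colors) is a simplification of the full triangle-anomaly conditions, so take this as a sketch rather than the historical argument:

```python
from fractions import Fraction as F

def anomaly_sum(doublets):
    """Sum of electric charges over weak doublets, weighted by color."""
    return sum(colors * (q_up + q_down) for q_up, q_down, colors in doublets)

# Before charm: one quark doublet (u, d') and two lepton doublets.
pre_charm = [(F(2, 3), F(-1, 3), 3),   # (u, d'), 3 colors
             (F(0), F(-1), 1),         # (nu_e, e)
             (F(0), F(-1), 1)]         # (nu_mu, mu)

# A second quark doublet (c, s') restores the quark-lepton balance.
with_charm = pre_charm + [(F(2, 3), F(-1, 3), 3)]

print(anomaly_sum(pre_charm))   # -1: the sum does not cancel
print(anomaly_sum(with_charm))  #  0: consistent with a fourth quark
```

The point of the toy version is only that the cancellation is a consistency condition, not an aesthetic preference.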

What can we learn from this?

Well, one thing we learn is that if you rely on beauty you may get lucky. Sometimes it works. Feyerabend, I think, had it basically right when he argued “anything goes.” Or, as the late German chancellor Kohl put it, “What matters is what comes out in the end.”

But we also see that if you happen to insist on the wrong ideal of beauty, you will not make it into history books. Worse, since our conception of what counts as a beautiful theory is based on what worked in the past, it may actually get in the way if a breakthrough requires new notions of beauty.

The more useful lesson to learn, therefore, is that the big theory-led breakthroughs could have been based on sound mathematical arguments, even if in practice they came about by trial and error.

The “anything goes” approach is fine if you can test a large number of hypotheses and then continue with the ones that work. But in the foundations of physics we can no longer afford “anything goes”. Experiments are now so expensive and take such a long time to build that we have to be very careful when deciding which theories to test. And if we take a clue from history, then the most promising route to progress is to focus on problems that are either inconsistencies with data or internal inconsistencies of the theories.

At least that’s my conclusion.

It is far from my intention to tell anyone what to do. Indeed, if there is any message I tried to get across in my book it’s that I wish physicists would think more for themselves and listen less to their colleagues.

Having said this, I have gotten a lot of emails from students asking me for advice, and I recall how difficult it was for me as a student to make sense of the recent research trends. For this reason I append below my assessment of some of the currently most popular problems in the foundations of physics. Not because I want you to listen to me, but because I hope that the argument I offered will help you come to your own conclusion.

Dark Matter
Is an inconsistency between theory and experiment and therefore a good problem. (The issue with dark matter isn’t whether it’s a good problem or not, but when to consider the problem solved.)

Dark Energy
There are different aspects of this problem, some of which are good problems, others not. The question why the cosmological constant is small compared to (powers of) the Planck mass is not a good problem, because there is nothing wrong with just choosing it to be a certain constant. The question why the cosmological constant is presently comparable to the density of dark matter is likewise a bad problem, because it isn’t associated with any inconsistency. On the other hand, the absence of observable fluctuations around the vacuum energy (what Afshordi calls the “cosmological non-constant problem”) and the question why the zero-point energy gravitates in atoms but not in the vacuum (details here) are good problems.

The Hierarchy Problem
The hierarchy problem is the big difference between the strength of gravity and that of the other forces in the standard model. There is nothing contradictory about this, hence it is not a good problem.

Grand Unification
A lot of physicists would rather have one unified force in the standard model than three different ones. There is, however, nothing wrong with the three different forces. I am undecided as to whether the almost-prediction of the Weinberg angle from breaking a large symmetry group does or does not require an explanation.

Quantum Gravity
Quantum gravity removes an inconsistency and is hence a solution to a good problem. However, I must add that there may be other ways to resolve the problem besides quantizing gravity.

Black Hole Information Loss
A good problem in principle. Unfortunately, there are many different ways to fix the problem and no way to experimentally distinguish between them. So while it’s a good problem, I don’t consider it a promising research direction.

Particle Masses
It would be nice to have a way to derive the masses of the particles in the standard model from a theory with fewer parameters, but there is nothing wrong with these masses just being what they are. Thus, not a good problem.

Quantum Field Theory
There are various problems with quantum field theories where we lack a good understanding of how the theory works and that require a solution. The UV Landau pole in the standard model is one of them. It must be resolved somehow, but just exactly how is not clear. We also do not have a good understanding of the non-perturbative formulation of the theory, and the infrared behavior turns out to be not as well understood as we thought until only a few years ago (see eg here).

The Measurement Problem
The measurement problem in quantum mechanics is typically thought of as a problem of interpretation and then left to philosophers to discuss. I think that’s a mistake; it is an actual inconsistency. The inconsistency comes from the need to postulate the behavior of macroscopic objects when that behavior should instead follow from the theory of the constituents. The measurement postulate, hence, is inconsistent with reductionism.

The Flatness Problem
Is an argument from finetuning and not well-defined without a probability distribution. There is nothing wrong with the (initial value of the) curvature density just being what it is. Thus, not a good problem.
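For context, the fine-tuning claim behind the flatness problem can be sketched with order-of-magnitude arithmetic. The temperatures below are rough illustrative inputs of my own choosing, and nothing in the exercise supplies the probability distribution that would make "unlikely" well-defined:

```python
# |Omega - 1| grows roughly like a^2 during radiation domination
# (with a ~ 1/T) and like a during matter domination. Rough inputs:
T_planck = 1e19   # GeV, Planck-era temperature
T_eq     = 1e-9   # GeV (~1 eV), matter-radiation equality
T_now    = 1e-13  # GeV, today (order of magnitude only)

growth = (T_planck / T_eq) ** 2 * (T_eq / T_now)
print(f"{growth:.0e}")  # ~1e+60
```

So observing |Omega - 1| of order one today requires an initial value of order 10^{-60} at the Planck era; whether that counts as a problem is exactly the question of the missing probability distribution.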

The Monopole Problem
That’s the question why we haven’t seen magnetic monopoles. It is quite plausibly solved by them not existing. Also not a good problem.

Baryon Asymmetry and The Horizon Problem
These are both finetuning problems that rely on the choice of an initial condition, which is considered to be likely. However, there is no way to quantify how likely the initial condition is, so the problem is not well-defined.

Further, there is always a variety of anomalies where data disagrees with theory. Those can linger at low significance for a long time and it’s difficult to decide how seriously to take them. For those I can only give you the general advice that you listen to experimentalists (preferably some who are not working on the experiment in question) before you listen to theorists. Experimentalists often have an intuition for how seriously to take a result. That intuition, however, usually doesn’t make it into publications because it’s impossible to quantify. Measures of statistical significance don’t always tell the full story.

I thought Feyerabend's "anything goes" was more referring to the means used to establish a scientific theory, as opposed to Popper's falsification paradigm, and not so much to the fact that you can try any random scientific theory.

Do you consider the large correlations between distant regions of the microwave background a good problem? Inflation may present a solution, but as far as I know, there has been a lot of criticism directed toward the inflation hypothesis in the last couple of years. So might it be a good research direction to find alternative explanations for the correlations that do not rely on inflation?

It seems to me that measuring the gravity field of light would be a great problem in the foundations of physics. The field of a "Pencil of light" was calculated nearly 90 years ago by Tolman, and laser technology seems to approach the point at which such a measurement becomes realistic. If the results turned out as unexpectedly as the Michelson-Morley experiment, they could overthrow the quantum gravity paradigm.

I have the impression that the significance of such an experiment is dismissed rather casually. Do you have a similar impression, and if so, what do you think of that?

I'd be surprised if such a measurement becomes possible any time soon. In any case, it would certainly be interesting, but I am not sure what you want to learn about quantum gravity from measuring a classical field.

I would disagree about the measurement problem. The need to postulate the behavior of an IDEAL macroscopic object - that is, a measurement device - cannot possibly lead to any inconsistency in this sense. The question why REAL devices behave (almost) exactly the same is a legitimate problem, but it has nothing to do with foundations, I believe.

Ah, but quantum gravity relies on an assumption that no one ever bothers to spell out, probably because it seems too obvious: that gravity and quantum theory share a common domain of validity. Do we know this for sure?

If any quantum object were shown to be a gravity source, then that would settle the matter, but I am unaware of any such case. Measuring the gravity field of electrons, protons or even buckminsterfullerenes might be hopeless, but perhaps not a sufficiently high-energy density laser beam.

If this measurement could be undertaken and the predicted result was attained, then this would be ironclad evidence that gravity must be quantized. On the other hand, a highly unexpected null result would mean that we discovered a limit to the domain of applicability of general relativity, namely where quantum theory begins.

I do not regard the gravitational bending of light, Einstein rings etc. as ironclad evidence for the obvious assumption because they can also be accounted for by QFT on curved spacetime. You need the quantum object to be a gravity source to test it.

"Atomic Masses: It would be nice to have a way to derive the masses of the atoms from a standard model with fewer parameters, but there is nothing wrong with these masses just being what they are. Thus, not a good problem."

Young researchers choosing their field should be warned that rankings into good and bad problems are of the same incomplete usefulness as, say, a ranking of good and bad stocks for financial investing. Supply and demand, and unforeseen developments, have as much influence on the eventual success as analytic projections. So, diversify!

You are misunderstanding my intent. When I say "not a good problem" I do not mean "I don't want to see an answer"; I mean "not a promising route to progress." I have explained very clearly what my reasoning is. Sure, you can get lucky trying to find a way to calculate the masses in the standard model.

Yes, thanks for pointing this out; I forgot about it. Not a good problem. There are upper consistency bounds on the number of particles, but for all I know 4 or 5 generations would be okay theoretically; it's just in conflict with experiment.

you wrote "Feyerabend, I think, had it basically right when he argued “anything goes.”"

for many years, i took the catchphrase 'anything goes' (or as Mao Zedong put it, "let a thousand flowers bloom") to mean that in developing a theory, there is no one methodology or criterion that should be used (e.g. conceptual simplicity, mathematical simplicity, conceptual beauty and mathematical beauty have all been used by various theoreticians), i.e. that Feyerabend was supporting 'methodological anarchism'.

But the philosopher of science, James Owen Weatherall, pointed out to me that Feyerabend was not referring only to methodology, but also to epistemology.

This 'epistemological anarchism' view is in accordance with Boltzmann's advocacy of 'theoretical pluralism' (which follows from his view of theory as a representation of nature; see http://cds.cern.ch/record/1013890/files/0701308.pdf).

So, are you in favor of methodological anarchism or epistemological anarchism, or both?

I attended a lecture by John Baez last year, after which he was asked which fields of research might be considered by a grad student. He answered "Not fundamental physics. Maybe biological physics or environmental physics".

Applying machine learning to the dark matter problem sounds super exciting. Now I know more about machine learning than about dark matter. Could you maybe give a general idea of what you have in mind? Machine learning is not a single technique but more of a collection of tools. Are you thinking of detecting patterns in a large body of astronomical observations, or rather searching through a space of possible theories/solutions? Or something else?

I am a little surprised by the argument that the strength of gravity is not a good problem. Leaving aside whether we can solve it, I would have thought that understanding why gravity has the strength it has would lead to a better understanding of gravity itself, and I don't see that as a bad thing. Similarly with the masses of particles. It is not just a case of reducing the number of parameters - we might get a better understanding and correct the standard model. After all, Einstein's relativity made, at first, a minor change to Newtonian mechanics in most domains we can measure, but it is profoundly different.

A few comments on the questions about the strengths of the various forces and the hierarchy problem. The hierarchy problem is serious if one thinks the interactions are unified into a single symmetry for a gauge or gauge-like field. Of course gravitation sticks out prominently. The reason gravitation is so weak is that the masses of elementary particles are so small. The Planck mass is m_p = sqrt{ħc/G} and this can be compared to the mass of some elementary particle M so that

(M/m_p)^2 = GM^2/ħc,

which looks suspiciously similar to the fine structure constant α_e = e^2/(4πε_0ħc) ~ 1/137, and we can call the above ratio α_g. My temptation is to put in the mass of the Higgs boson, M = 125 GeV; with the Planck mass at 1.22×10^{19} GeV we then get α_g ≈ 1.05×10^{-34}.
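The arithmetic in the ratio can be checked quickly (masses as quoted above; only the ratio matters):

```python
import math

m_higgs  = 125.0    # GeV, Higgs boson mass
m_planck = 1.22e19  # GeV, Planck mass

alpha_g = (m_higgs / m_planck) ** 2  # dimensionless gravitational coupling
alpha_e = 1 / 137.0                  # fine structure constant

print(f"alpha_g = {alpha_g:.2e}")            # ~1.05e-34
print(round(math.log10(alpha_e / alpha_g)))  # ~32 orders of magnitude apart
```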

I chose the Higgs particle mass because I think the Higgs field is a scalar field connected to gravity. Also, the masses of all fundamental particles we know (not complex bound systems like hadrons with an induced mass gap) are given by the Higgs field. The masses of the other particles are related to this through various Yukawa coupling terms. The Higgs field, along with the inflaton and related scalar fields (dilatons, axions etc.), is potentially a singlet entanglement state of a gauge-like field that in a triplet entanglement is equivalent to a graviton. Bern and Dixon have proposed something similar.

We then have this huge difference in the scale of α_e and α_g, in fact over 30 orders of magnitude. Two words I like together are “what if.” I read a paper on E10, http://arxiv.org/abs/hep-th/9411188v1, and I might ponder: what if? What would be the spectrum of Higgs scalar fields in this setting? Is the Higgs field we know just the lightest of a whole spectrum of these? This spectrum might then be an RG flow ~ E^2 for gravitation. The Virasoro ladder extension on E8 would potentially mean all the elementary particles are just the lowest mass in such a sequence. As Mark Twain put it, “In falling down the elevator shaft, once you hit bottom you go no further.” As a result the masses of particles we observe are very small. On the other hand, with α_g ~ E^2/E_p^2, near the top there is a scalar field mass near the Planck mass, so that α_g ~ 1 at that scale. For the gauge fields, the running coupling in the demonstration equation g^{-2} ~ g_0^{-2} - (1/4π)g_0^{-2} ln(Λ^2/E^2), where Λ is the cut-off, reaches unity at the cut-off Λ, which we may take to be a Planck or string scale. I am ignoring various technical details with supersymmetry of course.

The alternative is to say that gravitation is really not at all unified with quantum fields. Instead maybe they have some equivalency. General relativity may be categorically equivalent to quantum mechanics, particularly if spacetime is an emergent property of large N entanglements. So it is then possible we have been barking up the wrong tree on that. Quantum systems tend to increase entanglements, and the growth of black holes, and to some measure the equivalence principle, is a manifestation of how entanglement entropy increases. Under these conditions the hierarchy problem is seen in a different light.

This list really is just a spin-off from my book, an application of the conclusions I draw there. But yes, sometimes I think I may have been better off in philosophy than in physics. Then again, I prefer equations over words, so that may not have worked either.

I am not in favor of anarchism of either which way. Anarchism is unstable. People will develop rules and they will form groups and they will develop hierarchies whether you like that or not. If you ignore human nature, you'll end up envisioning a utopia that will never come to pass.

Having said this, I think it's a mistake to consider anything as a final answer and you have to remain open to amending and improving it. That includes epistemology.

‘Maxwell discussed … in terms of a model in which the vacuum was like an elastic … what counts are the equations themselves and not the model used to get them. We may only question whether the equations are true or false … If we take away the model he used to build it, Maxwell’s beautiful edifice stands…’ – Richard P. Feynman, Feynman Lectures on Physics, v3, c18, p2.

Beauty is described here, then, as the mathematics that survives? That is what is fundamental, and why Sabine is at an advantage in understanding what is involved as more than a philosophical adventure. :-) Lee Smolin never liked the Platonic view either. :-)

"Where a dictionary proceeds in a circular manner, defining a word by reference to another, the basic concepts of mathematics are infinitely closer to an indecomposable element, a kind of 'elementary particle' of thought with a minimal amount of ambiguity in their definition." - Alain Connes

This "indecomposable element" is the beauty described. Advancements in relativity can be made from that element?

you're referring to political anarchism and of course it's unstable (but it is not necessarily utopian), if you mean its structure is non-permanent. the world we live in is in fact already in a state of political anarchy, since it consists of a number of states (or countries), each having its own political system, who get along, more or less, without there being a central authority creating and enforcing the rules of conduct between them. you have an anarchy in any system other than a one-world government. The ONLY difference between this and a pure anarchist system is that in a pure anarchist system, the basic unit is the individual rather than the country. (within the U.S., there is the issue of the supremacy of the state over the federal government, i.e. states' rights over federal rights. in the anarchist position, there are no federal rights and no state rights, only individual rights. in the U.S., we have all three and there are ongoing conflicts between all of them under the supervision of the Supreme Court, at least so far, although the Court's over-riding authority may be challenged in the very near future.)

note: Einstein was a methodological anarchist. he referred to himself as a 'methodological opportunist' who chose whatever theoretical tools were useful. some scientists study a specific subject using whatever theoretical tools 'work', and some scientists select a tool and then look for subjects to apply that tool to (this is also referred to as the 'law of the instrument': when a child is given a hammer, he/she discovers that everything they encounter requires pounding).

btw - here's a thought - if one takes Boltzmann's theoretical pluralism seriously (and i do since i encounter it in my own research) then perhaps, the failure of both string theory and LQG indicates that they share a common feature in their representations of nature and that common feature is incorrect (it is not an essential feature of reality).

In 1839 a young Frenchman, Alexandre Edmond Becquerel, experimented with electricity in his father’s lab. He was passionate about phenomena of magnetism, electricity and optics, which scientists had only started to understand. He noticed a strange occurrence: an electrolytic cell generated more energy when it was exposed to sunlight. He called it the photovoltaic effect.

Forty years had to pass before another two scientists, William Grylls Adams and Richard Evans Day, discovered the photovoltaic effect in a solid substance. Then, in 1905, Albert Einstein explained the fundamental physics of it, which ultimately led to the quantum revolution in physics. Yet even eight years later, great physicists such as Max Planck considered this explanation foolish. With an apparent lack of practical applications, none of these breakthroughs was taken forward until a US company, Bell Labs, made the world’s first useful solar cell in the 1950s. The rest is history.

Science is a passion driven by curiosity. It is not a job, it is an obsession driven by the need to know.

Alfred Lothar Wegener was a German polar researcher, geophysicist and meteorologist.

During his lifetime he was primarily known for his achievements in meteorology and as a pioneer of polar research, but today he is most remembered as the originator of the theory of continental drift by hypothesizing in 1912 that the continents are slowly drifting around the Earth. His hypothesis was controversial and not widely accepted until the 1950s.

Ludwig Boltzmann faced massive ridicule for his work on thermodynamics, eventually committing suicide in 1906. His work was largely carried on and extended by Paul Ehrenfest, who faced similar ridicule, committing suicide in 1933. Their work laid the foundation for modern statistical mechanics.

In 1927, Georges Lemaître put together data about the redshift and distance measurements of galaxies to infer the expanding Universe, writing to Einstein about his findings. Einstein responded, "Your calculations are correct, but your physics are abominable." Yet Lemaître was correct, with his conclusions predating Hubble's identical ones by two years.

Fritz Zwicky, who first inferred the existence of dark matter in the 1930s, had his results dismissed based on the absurdity that such a significant fraction of the Universe could be hitherto undetected. The work of Vera Rubin and Kent Ford in the 1970s led to dark matter being seriously considered, but the work of Zwicky could have given us a 40 year head-start on the puzzle.

The really major advances in science are not recognized for decades after they have been made by those obsessed with knowing and discovery. These discoveries are just too hard to accept by the current paradigm of the day. As it has been, it remains so today. There are great advances in science that have been made but are currently being ridiculed. It just takes decades for these advances to pass the test of time and to come of age.

Or, as the late German chancellor Kohl put it, “What matters is what comes out in the end.”

This misses the humour of Kohl's "entscheidend ist, was hinten herauskommt". You have a good translation for the normal meaning of the phrase. But "hinten" is the backside, often associated with defecation. :-|

I'll comment on only those points where I disagree (which is a subset of those points about which I know enough to comment), in other words no comment means "I agree" or "I don't know enough to comment". I've commented on most of these in other threads, so I won't repeat myself too much.

"Dark Energy: The question why the cosmological constant is presently comparable to the density of dark matter is likewise a bad problem because it isn’t associated with any inconsistency."

This is indeed not a problem, but not for the reason you think.

"Particle Masses: It would be nice to have a way to derive the masses of the particles in the standard model from a theory with fewer parameters, but there is nothing wrong with these masses just being what they are. Thus, not a good problem."

It is unclear whether this is a problem. It might be related to fine-tuning in a specific sense of the term ("the universe is fine-tuned for life").

"The Flatness Problem: Is an argument from finetuning and not well-defined without a probability distribution. There is nothing wrong with the (initial value of the) curvature density just being what it is. Thus, not a good problem."

This is indeed not a problem, but not for the reason you think. See the paper by Marc Holman (now published in Foundations of Physics).

"Baryon Asymmetry and The Horizon Problem: These are both finetuning problems that rely on the choice of an initial condition, which is considered to be likely. However, there is no way to quantify how likely the initial condition is, so the problem is not well-defined."

I think that the first "likely" should be "unlikely". The flatness problem is also in this category, though it is "worse" in the sense that the horizon problem can be solved by the assumption of a Robertson-Walker metric, while the flatness problem claims that fine-tuning exists even given the Robertson-Walker metric.

It could be that it is not well defined, but nevertheless one can say that it is a problem. I think that this is a fundamental difference between our views. You seem to say that if we can't quantify the problem it is not a problem, while I think that some things are obviously problems even if they are not completely quantified. Suppose you wake up and see a large pink elephant in your bedroom. Would you say that it is not a problem because you don't know the probability distribution?

IMO the black hole information loss is not a problem yet. It's an argument from ignorance. We don't have a complete theory of the interior of a black hole. We assume information is lost. But we don't know that because we don't have a complete theory. We only assume it is a problem.

Just because a group (or collection of groups) does not have "a central authority creating and enforcing the rules of conduct between them" doesn't mean it's anarchic. There are many ways to generate hierarchies and rules without central authorities. My point was that this virtually always happens because anarchism is hugely ineffective. People work better together if they have shared goals and procedures.

This isn't any different in science than in society in general. Sociologists like to refer to a scientific community as a "community of practice". It means, roughly, that the rules aren't written down and they aren't enforced, but they are passed on as tacit knowledge. While everybody is free to do whatever they want, there is usually a large coherence in the group. This is normally a good thing, but it can sometimes get in the way of progress.

In any case, as I already said above, things can change and do change and should change.

As I said a few times before, you can't just disagree with a conclusion without pointing out what you think is wrong with the argument. It's like saying you agree that addition is commutative and 3+1=4, but don't want to agree that 1+3=4.

I actually meant likely when I wrote likely. The sentence says that if you take an initial condition that you assume is likely you will get a problem. You say that it would take an unlikely condition to not get a problem, which is the same thing but doesn't make sense in the context of the sentence.

"It could be that it is not well defined, but nevertheless one can say that it is a problem. I think that...

You can say whatever you like. I have explained very clearly why I think spending time on ill-defined problems is not a promising avenue.

Do not keep saying to yourself, if you can possibly avoid it, "But how can it be like that?" because you will get 'down the drain', into a blind alley from which nobody has escaped. Nobody knows how it can be like that.

so while he seems to have agreed that it is a problem, he didn't think it was one that you should give students to work on. I would agree that this is still the case.

Let me also note that Feynman did work on it. This was his motivation for looking at negative probabilities. I saw him give a talk on it when I was an undergrad, around 1980, when he explained that the measurement problem was his motivation for this work. He wrote up what he did in a paper which appeared in 1987, and which doesn't mention the measurement problem, presumably because he concluded that negative probabilities don't solve the measurement problem.

Second, I think the "solution" to the measurement problem will come through the theory of decoherence, which has made great advances over the last 30 years, but where there is still room for work. The theory of decoherence tries to show how quantum behavior at the microscopic level looks like classical behavior at the macroscopic level. And once we understand that, while people will keep on talking about the measurement problem, it will effectively be a non-problem.

I think you missed my point. I agree that anarchists reject central authority, I am saying that this is necessary but not sufficient. In any case, quite possibly we use the word in different meanings. For what I am concerned, the way Wikipedia phrases it captures it well: "Anarchy is a society, entity, group of people, or a single person that rejects hierarchy."

(Though, upon further inspection, I think a single person can't "reject hierarchy" any more than you can reject being taller than yourself - doesn't make a terrible lot of sense.)

In any case, I still fail to see what this has to do with my blogpost and would appreciate if you could get back to the topic or drop this discussion.

@Sabine, by accident I saw Roger Scruton’s movie “Why Beauty Matters” a short time ago, so I use it as an inspiration. He basically says that when we remove Beauty we will be left with Ugliness only, and “she” will show us no direction out of chaos…

Actually I do agree with almost all your feelings about the current situation in physics, except your fight with beauty. To me, what is wrong is fake beauty (= ugliness). The items on your list are truly important, but your rationale, full of statements like “There is nothing contradictory about this” (I don’t care) and “There is nothing wrong with …” (I can live with it), indicates a defeatist attitude and an underestimation of the available empirical data. In my view the final goal of any theory shall be to leave no question unanswered and no empirical fact unexplained. Therefore all unanswered questions and all unexplained empirical facts are challenges worth tackling.

Let me comment briefly and give a different view of your list:

Dark Matter: DM searches are the experimentalist version of what you call "Lost in Math" ("Lost in Experiment"): there are no limits on what, where and how one can try to find it, yet the typical experiments are approaching neutrino background levels empty-handed… => ignored empirical fact. And at the same time the viable alternative (the MOND-based point of view) is considered a heresy => ignored empirical fact.

Dark Energy (the cosmological constant), The Flatness Problem, Baryon Asymmetry and The Horizon Problem => not well enough understood empirical facts, resulting in theoretical inconsistencies, to be handled with maximum care because, as Lev Landau once put it: "Cosmologists are often in error but never in doubt."

SH wrote: "Clearly you still haven't read my book because you make the exact mistake I point out there. What you say is correct, but you fail to explain what's problematic about it."

Well, my point is that if one thinks there exists some grand gauge symmetry, then the hierarchy problem is of relevance. At this point I am somewhat agnostic about this. The E8xE8 --> SU(3)xE6xE8 breaking looks interesting in ways, but E6 does not recover the standard model well. I think there may be another way to think about this, but I will not go into that. The biggest problem I see with this is connecting with the standard model.

I started to read a borrowed copy of your book back in November. I intend to purchase it and finish it, but I just incurred some major expenses, so my budget is a bit tight. I so far agree with you that beauty can't be a criterion for the veracity of a theory. I also think that if we work through this we may all be slapping our foreheads saying, "Of course, it is so obvious, simple and elegant." It may not be us who do this, but our grandchildren.

You say "The really major advances in science are not recognized for decades after they have been made by those obsessed with knowing and discovery. These discoveries are just too hard to accept by the current paradigm of the day. As it has been, it remains so today. There are great advances in science that have been made but are currently being ridiculed. It just takes decades for these advances to pass the test of time and to come of age."

However, there are many examples of advances in science which have had an 'immediate' impact: Newton, Marie Curie, Roentgen, Rutherford, Einstein, Fleming, Michelson-Morley and many others. It's a very long list. In many cases these discoveries have not just shaken the 'current paradigm', they have given it a real kicking. Indeed, in some cases the current paradigm was consigned to the history of science - an interesting curio but no longer relevant.

I also take issue with your last couple of sentences. What is a 'great advance in science'? Sometimes it's obvious (Einstein), sometimes not (Wegener). But it is always the scientific community which ultimately decides what a 'great advance' is. Reaching that decision takes time, but not always decades.

I read Sabine's blog and list as being along the lines of "advice to a young physicist" and her classification of good and bad problems as her take on what topics could be a useful starting point for the new researcher to attack and possibly kickstart and / or progress a career.

I would imagine there's agreement that problems which relate to fine-tuning need to be addressed, but that they are also the problems a young researcher should initially avoid. This seems to me to make sense. If there is an obvious inconsistency between experiment and theory, then there is a clear starting point for the research. Resolving just why a fine-tuned constant has a particular value is a much greater problem, and addressing it requires a level of experience, knowledge and intuition which can only be achieved over time.

Peter Shor: Decoherence can get you part-way there, with the density matrix diagonalizing in open systems as the off-diagonal terms get radiated away, and you probably end up with Zurek-style einselection. But I don't see how you can get from there to a single macroscopic state being selected without having to introduce something completely novel. If one subscribes to the many-worlds idea, then one has to contend with general relativity, since many worlds in a single spacetime would mean some sort of unique combined metric, resulting in gravitational interaction between worlds. And one cannot simply state that each branch of the wave function exists in its own spacetime without explaining how exactly these spacetimes split from a single one. Thus any significant progress on the measurement problem cannot avoid pulling gravity into it, no pun intended.

Maxwell (and Faraday and Hertz etc) studied electric and magnetic fields. These fields were already known at the time. The point is that the electric and magnetic interactions are quite easy to observe and their discovery was definitely not a success of theory-development. (In contrast to, say, the prediction of anti-particles by Dirac.)

"Second, I think the "solution" to the measurement problem will come through the theory of decoherence, which has made great advances over the last 30 years, but where there is still room for work. The theory of decoherence tries to show how quantum behavior at the microscopic level looks like classical behavior at the macroscopic level. And once we understand that, while people will keep on talking about the measurement problem, it will effectively be a non-problem."

Decoherence can't solve the measurement problem because in that case the time-evolution still remains linear. Any process that solves the measurement problem must by necessity be non-linear, because you have to "collapse" a large space with all possible coherent superpositions into a few eigenstates (of whatever it is that the detector measures). Decoherence can (and pretty much already does) tell you what selects the eigenstates. It does not (and cannot) tell you how you end up in an eigenstate (at best you end up in a mixture).
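To make the mixture-versus-eigenstate distinction concrete, here is a minimal numerical sketch (my own toy example, not from the discussion; the decay rate `gamma` is an arbitrary choice): a linear dephasing map kills the off-diagonal terms of a qubit's density matrix, but what remains is a 50/50 mixture, never a single eigenstate.

```python
import numpy as np

# Toy qubit in the equal superposition (|0> + |1>)/sqrt(2),
# written as a density matrix rho = |psi><psi|.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def dephase(rho, gamma, t):
    """Toy dephasing channel: off-diagonal terms decay as exp(-gamma*t).
    Note the map is linear in rho, as decoherence always is."""
    out = rho.copy()
    decay = np.exp(-gamma * t)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

rho_late = dephase(rho, gamma=1.0, t=50.0)

# The off-diagonal (interference) terms are essentially gone...
print(abs(rho_late[0, 1]))
# ...but the diagonal still describes a 50/50 *mixture* of outcomes,
# not a single eigenstate:
print(np.diag(rho_late).real)
```

The point of the sketch is exactly the one made above: no amount of running this (linear) map ever turns the diagonal [0.5, 0.5] into [1, 0] or [0, 1].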

I agree that part of the problem is that we don't really understand well how to model a detector bottom-up, but this is only part of the story.

It is of course an assumption a) that the universe has been through several (or an infinite) number of iterations and b) that few (or none) of the previous iterations had life. But perhaps that was the case, and with each iteration the cards are mixed anew. This is a multiverse explanation for fine-tuning. There are other possibilities, of course.

Note that being fine-tuned for life does not mean that life must arise.

The Horizon problem is a real problem. While the flatness problem might be dismissed by anthropic arguments, there is no reason why the CMB should have such a perfect black-body spectrum. Whenever you encounter such a perfect spectrum you infer that the body emitting it had existed, prior to your observation, for a sufficiently long time, and the early universe should not be an exception. Ignoring this point reminds me of a religious friend of mine who, when confronted with the evidence that the universe is more than five thousand years old, replies: "It was created old! Like Adam and Eve were". The measurement problem is circumvented altogether here: arXiv:1804.00509

I agree that the horizon problem is a real problem (which doesn't mean that there is no solution to it). I don't see how anthropic arguments can solve the flatness problem, though, at least not in its strong form (i.e., not the original formulation that Omega is between, say, 0.001 and 100, which was the puzzle at the time, but rather why the universe is flat to high precision, i.e. Omega+lambda=1 to better than a percent).

You seem to be misunderstanding the horizon problem, though. The puzzle is not why there is a black-body spectrum, but rather why the temperature is the same in regions which, at least according to classical cosmology, were never causally connected.
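For concreteness, a back-of-the-envelope statement of the puzzle (standard textbook numbers, approximate, added here for readers following along):

```latex
% The causally connected patch at recombination (z ~ 1100) subtends only
% about a degree or two on today's sky, yet the CMB temperature is uniform
% to one part in 10^5 across the whole sky:
\theta_{\mathrm{hor}} \sim
\frac{d_{\mathrm{hor}}(t_{\mathrm{rec}})}{d_A(z_{\mathrm{rec}})}
\approx 1^\circ\text{--}2^\circ,
\qquad
\frac{\Delta T}{T} \sim 10^{-5}.
```

So patches separated by more than a couple of degrees on the sky were, in a purely classical Friedmann history, never in causal contact, and yet they have the same temperature.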

Thank you for this list and your efforts to make these problems (and non-problems) understood.

Why is the measurement problem still widely viewed as an open question? What is wrong with Everett's analysis? As far as I can understand, he never introduced any new postulate, he just used standard quantum mechanics to show that wave function collapse is not needed to explain measurements. The fact that many people do not like the consequences of his analysis does not detract from its logical and mathematical consistency. To me this seems like another example of someone's subjective judgement of "beauty" and "likelihood" against good science.

Hi Sabine, In your comment "You are misunderstanding my intent. When I say "not a good problem" I do not mean "I don't want to see an answer", I mean "not a promising route to progress." I have explained very clearly what my reasoning is. Sure you can get lucky trying to find a way to calculate the masses in the standard model."

I feel like getting lucky is a legitimate way to find new physics. It is kind of like being at the right place at the right time. One has to make their own luck. I would like to figure out how the neutron gets its mass, what the mechanism is. From there I think the proton-neutron mass ratio and all the other ratios can be found.

As Phillip says, the horizon problem is not about the blackbody spectrum being so good, but about it having the same temperature wherever we look. It's not a problem because if you roll back in time long enough you will eventually get into a regime where we do not know what equations to even use, and statements about what initial conditions are likely or unlikely to come out of this are entirely unsubstantiated.

As has been noted many times before, many worlds doesn't so much solve the measurement problem as obfuscate it. The price you have to pay is that you end up with an interpretation of probability that is so murky as to be meaningless. You may find this helpful.

I see the uniformity (in temperature) of the CMB and its near perfect BB spectrum as two facets of the same problem. Had classical cosmology allowed the hot plasma allegedly generating the CMB to thermalize for a sufficiently long time, it would have become both uniform in temperature, and locally with a BB spectrum. The fact that classical cosmology does not allow for such thermalization is the problem. A solution to the problem (which is distinct from my religious friend's solution to the age-of-the-universe problem) would be some consistent, well justified modification of classical cosmology at early times which allows for such thermalization to take place (e.g. https://arxiv.org/pdf/1201.5281.pdf). It has nothing to do with initial conditions in the same sense that explaining the BB spectrum of a hot potato does not require mentioning initial conditions.

A great deal of the time when I see physicists discuss quantum mechanics in public venues, I can’t help think those people drank the Kool-Aid. I don’t mean their admiration for the usefulness, accuracy, and predictability of the theory (that is well established). I am referring to their interpretations and enigmatic descriptions of how unreal reality is. I don’t see much difference between their belief and faith in god. They tie the empirical with loosely correlated affiliations and reasoning, then strongly magnify its importance due to a plethora of emotional reasons such as group think, established preexisting beliefs, self-interest, self-importance, etc.

When one’s first reaction to a well-reasoned challenge is to figure out and argue how the challenge is wrong, rather than contemplate why the challenge may be right, I see the practice of undeniable faith rather than good science. I think this is why we still believe in god, and got stuck on Newtonian mechanics for 300 years. Humans need to better understand how much our thinking is unknowingly influenced by our nature, and how incredibly difficult it is to see this objectively in ourselves or in those we agree with.

I don't understand how "multi-universe" and "cyclic universe" can be taken seriously by physicists; to me they sound like Obi-Wan waving his hand and saying "These are not the droids you are looking for."

Or, "Yes, it is strange, but you don't have to worry about it, we have a solution you can't test and we can't prove."

I understand Sabine's argument that there is not necessarily any explanation for the values of the constants (I read the book!) but that is different from saying a reason exists that we can't prove.

What am I missing? Why do physicists talk seriously about things you cannot hope to observe, like the values of constants in a parallel universe?

I have not posted your earlier comment because your insistence that dark matter has been disproved is wrong and I don't currently have the time to get into this. I have now posted your most recent comment but will not post further comments on the matter. Please discuss this elsewhere. Thanks,

Sabine, the article you link to as explaining why Many Worlds "doesn't so much solve the measurement problem as obfuscate it" is itself a cauldron of confusion. As one comment put it, "here I see a person who still thinks about selfhood as essentialistically as a benedictine monk in the Middle Ages." What are we to make of Ball's insistence that "one of the most serious difficulties with the MWI is what it does to the notion of self"? Seriously?

The whole piece reeks of dualism; whether or not the author is consciously a dualist, he seems unable at some level to really accept that minds are physical systems, and nothing more. For example, Ball writes,

"The 'I' at each moment of time, [Lev Vaidman] says, is defined by a complete classical description of the state of his body and brain. But such an 'I' could never be conscious of its existence.

"Consciousness relies on experience, and experience is not an instantaneous property."

Beyond the question of why Ball thinks that he understands what consciousness is and how it works -- an extravagant boast when we barely understand how human brains work at all -- is the question of just *where* he thinks those experiences reside, if not in the *instantaneous* physical state of a brain?

I suspect that dualism, whether acknowledged or not, lies at the heart of the debate over MWI. Those who accept at a gut level that minds are just physical systems can be OK with MWI and the idea of a universal wave function that never collapses. Those who harbor dualistic notions reject MWI vociferously.

Thank you (I think). I thought maybe I had broken a rule. The context of the point was that you believe further research into Dark Matter to be fruitful. I do not agree if it involves a myriad of new species of undiscovered phenomena that are not necessarily beautiful mathematically (e.g. SIDM, superfluid dark matter, WIMPs, MACHOs, as examples postulated with no detected particle, and no independent way to confirm outside opaque (computer-assisted) models).

It would be dishonest with regards to the point if I did not specify my reasoning behind why I believe Dark Matter research would be fruitless in the very context you explain it in. I am in no way pushing my own alternative explanation of phenomena, which I know is something you disallow in these comments.

I am not sure what I have done wrong, and if I have not broken any rule, publish the comment and let others spend the time to rake me over the coals for being "wrong". It's possible everyone else will ignore it also.

Well, sorry that you didn't like the piece; I thought it got the point across well. The reference to self is, in my eyes, a rhetorical device to get across that in many worlds it's difficult to figure out whose or what's probabilities you should be calculating in the first place.

I never said anything like that. The point I made above is a very simple one: We have a disagreement between theory and data and that makes it a good problem. What are good ways to try and *solve* the problem is another matter entirely. I have been very clear in my book that I don't think inventing new particles en masse is a good procedure, so I don't know why you bring this up. In any case, stating that "dark matter has been disproved" is so wrong I don't have the patience to even deal with it. Best,

"It is also an assumption that the cosmos we observe constitutes a 'universe'."

I'm not sure if you're making the same point that I sometimes want to include as a caveat to my usage of the term "the Universe", but...

Whenever I refer to the Universe, I am referring to something which, I am convinced, is far greater than the portion we can observe... perhaps infinite.

It has always been the case that the limits of the observable have fallen by the wayside; why should it be any different now? Yes, the particle horizon may be a hard limit to observation, but that in no way mandates a boundary otherwise.

I'm tempted to post an extrapolation I've come up with based on a visualization I've found which details the motion of galaxies in the Local Supercluster, tying it in with a recent comment by Dr. Castaldo. I'll refrain only because I want to first make an effort to confirm that my idea is not duplicative (Dr. Bee's book is the first stop in that quest... see below). I'll link the visualization regardless, because it is so interesting; use the 3D-tool at the bottom of the page.

http://www.ifa.hawaii.edu/info/press-releases/galaxy_orbits/

I received... and started reading... Lost In Math in hardcover today. For me (a frustrated intergalactic traveler... and this was no pipe dream), this is right up there with Abell's Exploration of the Universe which, if I recall correctly, I first read in the second grade (1967/68). I still have, right behind me, a Second Edition (1969) copy of Abell's textbook and, should I live to the age of 110, Dr. Bee's book will still be right next to it! It's the crash-course I always needed : )

Thank you for engaging, and I know you have explained your position on *new* particles. It is the *old* particles, already in dozens of peer-reviewed journals, where I think you may be encouraging the continued "beating of a dead horse".

"I have been very clear in my book that I don't think inventing new particles en masse is a good procedure."

I did list some of the particles I think are in the "fruitless" category, but I am not sure of your position on "particle dark matter" in general. I think fruitless. You? Other commenters? Is superfluid Dark Matter in the same boat?

"Whenever I refer to the Universe, I am referring to something which, I am convinced, is far greater than the portion we can observe... perhaps infinite.

It has always been the case that the limits of the observable have fallen by the wayside; why should it be any different now? Yes, the particle horizon may be a hard limit to observation, but that in no way mandates a boundary otherwise."

What I meant, by drawing a distinction between the cosmos that we observe and the "Universe" model which is the de facto framework for all of our current cosmological discussions, is inclusive of your comments (excluding the infinite reference).

However, while the Universe model can be amended, as you do, to address any boundary limitation issue, other problems are more intractable. The assumption underlying the FLRW metric is that a general metric can be ascribed to the cosmos, and then the field equations of GR can be piggy-backed onto what is essentially a universal reference frame to derive a scale factor.

The resultant model is a study in incoherence (the inexplicable original condition), contradiction (the alleged 13.8 billion year age of the Universe, a statement of universal simultaneity that is antithetical to relativity theory), and absurdity (95% of the Universe is composed of some invisible matter-energy, the only salient characteristics of which are that they are necessary to make the model agree with observations).

Arguing, some years ago, in an online forum that cosmologists seemed unwilling to reconsider the Universe assumption underlying FLRW, someone responded that it had been reconsidered and found wanting. When asked for citations to support that claim he provided exactly one: a paper from 1917. Such is the state of modern cosmology.

"It's not a problem because if you roll back in time long enough you will eventually get into a regime where we do not know what equations to even use, and statements about what initial conditions are likely or unlikely to come out of this are entirely unsubstantiated."

This is, of course, true. However, it also demonstrates a difference between Sabine on one side and me (and, I would argue, many if not most others) on the other side. Sabine tends to say that once we reach a region where we are treading on uncertain ground, there is nothing to worry about, because we don't know anything. I would say that some problems can be identified even if we don't know all the details. In the case of the horizon problem, it is a problem because in the early universe areas have the same temperature which were not in causal contact. Unless we are willing to give up relativity, this cannot happen, so it is a problem if the universe is described by a Robertson-Walker model extrapolated back into the past at all times. We don't know what the solution is. Inflation is a possible solution to this problem (whether inflation introduces more problems than it solves is another question). However, just because we are in a region where we don't know what is going on does not mean we shouldn't worry about it. That would be like someone who does numerical simulations of galaxy formation replying to critics with "we haven't included full baryonic physics, so there is nothing to worry about", or like MOND enthusiasts replying to criticism with "we don't have a relativistic theory, so there is nothing to worry about".

Sabine said: "What I am saying is that if you do not have a well-defined problem you are unlikely to make progress."

So here is a well defined problem for you. Formulate the laws of physics in a way which is both consistent with what we believe is true, and allows for extrapolation of the universe backwards in time so that thermalization of the hot plasma generating the CMB becomes possible (thermalization being a generic process, mandating only the existence of an arrow-of-time).

If you try to solve this problem you will soon find out that most of the problems you listed in your post are just different facets of a single big problem in the foundations of physics. In fact, treating these different facets as distinct problems is perhaps the most severe problem in the foundations of physics.

Physics, as I see it, took a 'wrong turn' many decades ago, and before it is back on track no combination of money plus size and number of brains will end the current phase of stagnation. "Crank" I hear - that's ok. I've been called worse :)

"Sabine tends to say that once we reach a region where we are treading on uncertain ground, there is nothing to worry about, because we don't know anything."

No, this is not what I am saying. What I am saying is that if you do not have a well-defined problem you are unlikely to make progress.

That makes more sense. However, with reference to the horizon problem, you wrote above, and I quote: "It's not a problem".

"As Phillip says, the horizon problem is not about the blackbody spectrum being so good, but about it having the same temperature wherever we look. It's not a problem because if you roll back in time long enough you will eventually get into a regime where we do not know what equations to even use, and statements about what initial conditions are likely or unlikely to come out of this are entirely unsubstantiated." (emphasis added)

If you do not have a well-defined problem you have no problem because there is nothing to solve. You don't know what you are even talking about. Look, go and formulate the problem in equations and you'll see that you cannot derive any disagreement, neither internally nor with data. You can worry about this, if you want to. I am simply telling you that that's not a promising route to progress.

For all I can tell, your major argument is "Phillip Helbig thinks it's a big problem, therefore it's a problem." Excuse me for not finding this convincing.

"If you do not have a well-defined problem you have no problem because there is nothing to solve."

OK, that makes your position clear.

"I am simply telling you that that's not a promising route to progress."

Of course. Things which people have been pondering over for decades are probably not easy to solve.

"For all I can tell, your major argument is 'Phillip Helbig thinks it's a big problem, therefore it's a problem.' Excuse me for not finding this convincing."

No. Forget me. Most cosmologists think that it is a real problem. Perhaps not well defined, but nevertheless real. (Again, they disagree with you.) Feel free to dismiss most cosmologists. :-|

Of course, most people, even most cosmologists, can be wrong. (Don't anyone quote Landau!) I and a few other people have written papers pointing out that the flatness problem is not a problem, so we don't necessarily believe things just because others do. But we showed that it is not a problem, which is different from saying that it is not well defined. Someone should do something similar for the horizon problem; that would be interesting.

I think you are putting too much emphasis on the non-causal connection between different regions in the universe. A causal connection is just a necessary condition. To achieve the near perfect uniformity of the CMB's temperature and its near perfect BB spectrum, the universe must have existed for a very long time prior to the big bang. Of course, one can argue (with Sabine) that the BB spectrum could have been created by a process other than thermalization, depending on the (yet unknown) physics in the early universe. Yet, the BB spectrum is not a ubiquitous distribution like the normal, Poisson etc. It appears in nature in one circumstance only: after a long thermalization process. Extrapolating the universe backwards in time is therefore the rational way to proceed (Inflation, in this regard, is probably the worst way to do so).

As for your earlier comment regarding the flatness problem not being solved by anthropic arguments - I confess that I didn't delve into the details (because I'm not fond of anthropic arguments and because my proposal also solves the flatness problem). Do you disagree with the statement that, had the current ratio between the critical density and the observed density not been nearly 1, then life as we know it could not have been formed? (...since this ratio in the early universe would have been gigantic)

"As has been noted many times before, many worlds doesn't so much solve the measurement problem as obfuscate it. The price you have to pay is that you end up with an interpretation of probability that is so murky as to be meaningless."

I looked at the quanta piece that you pointed out, and like Kevin van Horn I find it extremely confused in its arguments about the role of consciousness. I am not sure, though, that Kevin is right about the role of dualism (after all, if one accepts a dualist picture and mind somehow exists separately from the brain, one could contemplate quantum superpositions of minds...).

Also, I note that Philip Ball agrees with my previous comment that decoherence supports MWI (even if he ultimately rejects MWI for very strange reasons).

As to the interpretation of probabilities, it has been shown by proponents of MWI that the Born rule as we know it holds in Everett's multiverse except in branches of vanishingly small amplitude. It seems all right to neglect a vanishingly small portion of the universal wave function, so I would rate that as a 99.999% satisfactory explanation of the appearance of probabilities!
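The "vanishingly small amplitude" claim can be illustrated with a small counting exercise (a toy model I'm adding for illustration; the Born probability `p` and the deviation threshold `eps` are arbitrary choices): after n repeated measurements, the branch with k "up" outcomes carries squared amplitude C(n,k) p^k (1-p)^(n-k), and the total weight of "maverick" branches, whose observed frequency deviates noticeably from p, shrinks as n grows.

```python
import math

def maverick_weight(n, p=0.2, eps=0.05):
    """Total squared amplitude (Born weight) of branches whose observed
    frequency of 'up' deviates from p by more than eps after n runs.
    The branch with k 'up' outcomes has weight C(n,k) p^k (1-p)^(n-k)."""
    return sum(
        math.comb(n, k) * p**k * (1 - p) ** (n - k)
        for k in range(n + 1)
        if abs(k / n - p) > eps
    )

# The weight carried by deviant branches shrinks as n grows:
for n in (10, 100, 1000):
    print(n, maverick_weight(n))
```

Of course this only shows that branches violating Born-rule frequencies have small total weight; whether "small weight" licenses "may be neglected" is precisely the point under dispute in the thread.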

"I think you are putting too much emphasis on the non-causal connection between different regions in the universe. A causal connection is just a necessary condition."

I guess you mean "but not sufficient".

"To achieve the near perfect uniformity of the CMB's temperature and its near perfect BB spectrum, the universe must have existed for a very long time prior to the big-bang."

Why are you the first person to claim this?

Others have pointed out the possibility that the universe might have existed long before the big bang. Tegmark says that one should think of the big bang as when something (inflation) stopped, not when something started. But these ideas are not based on a perceived need for thermalization.

"Of course, one can argue (with Sabine) that the BB spectrum could have been created by a process other than thermalization, depending on the (yet unknown) physics in the early universe."

I hope I've made it clear that I differ from Sabine here; I think we should worry about some things even if we don't understand them.

"Yet, the BB spectrum is not a ubiquitous distribution like the normal, Poisson etc. It appears in nature in one circumstance only: after a long thermalization process."

I think that most people will agree that thermalization is needed in any sensible scenario, but why does it have to be long?

"As for your earlier comment regarding the flatness problem not being solved by anthropic arguments - I confess that I didn't delve into the details (because I'm not fond of anthropic arguments and because my proposal solves also the flatness problem). Do you disagree with the statement that, had the current ratio between the critical density and the observed density not been nearly 1, then life as we know it could not have been formed? (...since this ratio in the early universe would have been gigantic)."

Yes, it would have been gigantic, but this is not our universe. In other words, the Friedmann equation is called the Friedmann equation because it is an equation. Hence, one cannot change just one term ("suppose the density at the time of the electroweak phase transition were just one part in a million higher"); one has to change at least two. If one does so then, yes, that's true, but the total mass of such a universe is two kilograms or whatever.
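For reference, the textbook version of the flatness argument (the one disputed above), in equations with standard orders of magnitude:

```latex
% From the Friedmann equation H^2 = \tfrac{8\pi G}{3}\rho - \tfrac{k}{a^2}
% one gets
\Omega(t) - 1 = \frac{k}{a^2 H^2} .
% During radiation and matter domination a^2 H^2 decreases with time, so
% |Omega - 1| grows. Flatness today, |Omega_tot - 1| < 0.01, then requires
% |Omega - 1| of order 10^{-16} at the time of nucleosynthesis.
```

The argument above is that because it is a single equation, one cannot tweak the early-time density alone and leave everything else fixed.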

I don't see any anthropic connection to fundamental cosmological parameters other than that the universe be old enough, that early expansion was slow enough to allow galaxy formation, and so on, but these are pretty weak constraints.

There is a huge literature on the flatness problem, much of it confused. I recommend reading Marc Holman's review on this topic (now published in Foundations of Physics) and the references therein (including those to my work).

Thank you for your reply and the link to the article by Philip Ball, which I had read before and read again at your suggestion.

His arguments about probability in Everettian quantum mechanics seem fallacious to me. As far as I can understand, the probabilities are the same as in the standard theory and experiment, the difference being that the wavefunction collapse postulate is eliminated.

Wavefunction collapse does not conserve energy, and is unnecessary to explain experiments. How is that an acceptable scientific postulate?

Interpretation can perhaps be left open to personal taste, but not the postulates of the theory.

MACHOs are not, solely, “theoretical particles”. They exist, and have been observed, in various micro-lensing (of the gravitational kind) programs, such as OGLE. Per current observations, they include faint stars, exoplanets, even failed stars and perhaps rogue planets. Rogue small planets, and other solid bodies as small as small asteroids, have yet to be observed in this way. However, there cannot be many much smaller objects, because meteors from beyond the solar system are rare (some micrometeorites like this have been observed). I don’t recall, off hand, what the status is of results for black holes, of a wide range of masses, from this technique.

The “problem” with the MACHOs observed so far is that they can comprise, at most, a small fraction of the observed/inferred CDM within the smallish part of the MW that has been probed by this technique so far.

I may have expressed this unclearly. The probabilities that you calculate using the standard procedure are the same, of course, but that's "shut up and calculate". The problem with many worlds is to figure out the probabilities of whom or what it is that you are even calculating.

Thanks again. I accept that everybody will have their own interpretation of what the theory means. I accept that you see problems of interpretation that I don't, probably because you have thought and studied more about these issues than I have, but shouldn't we all at least agree on a common set of postulates?

"The problem with many worlds is to figure out the probabilities of whom or what it is that you are even calculating."

It seems to me that there is a clear frequentist interpretation of probabilities if you look at the *past* results of measurements performed by an observer. If you look at future results, the situation is different because the results are not uniquely defined. The difference between the past and the future is that branches that have decohered will not recohere, so the multiverse looks like a tree to a very good approximation.

Sabine: you can think of a physicist that performs the same experiment repeatedly, with say two possible results each time ("spin up" or "spin down", or whatever it is that the experimenters measure). Then you can look at the sequence of past results obtained by the experimenter. Note that this is a well-defined sequence of 0's and 1's (but, as I pointed out earlier, the future results are not well defined). Then you can ask whether the frequencies of 0/1 obtained agree with the Born rule. Does that make sense to you?

Sabine does not have a problem with you (or me for that matter) but with what I am saying about the falsification status of Dark Matter.

Yes, I do realise that MACHOs cover a lot of different possibilities, including real observable matter, and presumably a lot that is real but outside our ability to directly detect. It is within the subset that is beyond our ability to detect that there is the potential for a poor investment of our science dollar, by an argument similar to Sabine's regarding a new circular collider. This matches Sabine's view in total: her support for DM research is not based on continued searches even for MACHOs (or WIMPs, etc.), for those searches have no horizon (i.e., when do we give up looking if we don't find anything?).

Sabine's positive view on Dark Matter research is based on the match with the accepted science of the Big Bang theory and the other cosmological evidence that goes along with it. I am sure that all current researchers outside of the particle view of Dark Matter would concur with Sabine's view as well: a pause on high-energy experiments trying to find particle Dark Matter, but lots of cosmological observations and space probes to map and quantify the cosmological distribution.

Sabine has no need to "debunk" my ideas on Dark Matter. There are thousands of papers that have passed peer review that show continued evidence for Dark Matter that do that for her. No particle yet, so we should stop trying that avenue, but everything else should be fruitful.

Even Stacy McGaugh, hardly a fan of Dark Matter, sees how neatly an 80% or so level of dark matter over total matter fits the established narrative of the Big Bang.

I agreed with all of this until I saw Sabine's video regarding Dark Matter.

I implore everybody to watch the video, buy the book, read the peer reviewed papers, but make up your own mind about the falsification status of Dark Matter.

I think we are talking past each other. Calling an "observer" a "physicist" doesn't answer my question. You have a wavefunction that contains infinitely many universes. How many of those are contained in your "physicist"? How do you decide?

Sabine, I think you're asking how to chop up the universal wavefunction into a bunch of separate universes. That is a valid question which can hopefully be addressed by decoherence and/or MWI theorists. I don't know how much has been done already, and to what extent this is still work in progress. You'd have to ask the experts for that.

Nonetheless, I maintain that the main issue regarding probabilities in MWI is how to derive the Born rule, and that it is useful to think about that in the idealized setting that I described in my previous comments. In traditional QM the Born rule talks about the possible results of measurements. Defining "measurement" is possibly delicate (and, I think, unaddressed in traditional QM), but it is only fair to think in the same terms in MWI.

So, to reiterate: at day 1 a physicist performs a measurement, and after the measurement we have 2 branches, one for each result. At day 2 each physicist in the 2 branches performs another measurement and that yields 4 branches. And so forth until we get 2^n branches after experiment n. Now one can ask what is the sum of squared amplitudes of the branches with some given deviation from the Born rule.

If you think that the terms "physicist" and "measurement" in the above thought experiment are not defined precisely enough, my answer is: give me your definition of a measurement and I'll give you mine!
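The branch-counting thought experiment above can be made quantitative. A minimal sketch in Python (the probability value and the deviation threshold below are illustrative choices of mine, not anything from the comment): after n binary measurements with Born probability p for one outcome, the total squared amplitude of all branches whose observed outcome frequency deviates from p by more than eps is a simple binomial sum, and it shrinks rapidly as n grows:

```python
import math

# Total squared amplitude of all branches whose observed frequency of
# "up" deviates from the Born-rule probability p by more than eps,
# after n repeated binary measurements. A branch with k "up" results
# has squared amplitude p**k * (1-p)**(n-k), and there are C(n, k)
# such branches.
def deviant_weight(p, n, eps):
    total = 0.0
    for k in range(n + 1):
        if abs(k / n - p) > eps:
            total += math.comb(n, k) * p**k * (1 - p)**(n - k)
    return total

# The weight of "Born-rule-violating" branches vanishes as n grows:
for n in (10, 100, 1000):
    print(n, deviant_weight(0.5, n, 0.1))
```

This is just the law of large numbers restated in amplitude language: almost all of the squared-amplitude weight concentrates on branches whose frequencies agree with the Born rule, which is the observation the derivation programs (Everett's original argument, decision-theoretic approaches, etc.) try to build on.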

"You have a wavefunction that contains infinitely many universes. How many of those are contained in your "physicist"? How do you decide?"

Infinitely many, but for all practical purposes just one.

Sorry, I fail to see why this is not obvious. Perhaps I am only thinking superficially about the matter.

Two quantum states of a macroscopic object (like an observer) only have a significant overlap when they only differ in a few subatomic details, so for practical purposes you can in many cases treat an infinite ensemble of objects/observers as a single object/observer. Those states that are macroscopically different to the one we are experiencing can be safely ignored, as our overlap with those quantum states is negligible.
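That claim about overlaps can be illustrated with a back-of-the-envelope estimate (the per-particle overlap value below is a hypothetical number chosen for illustration): if two states of a macroscopic object differ slightly in each of N constituents, the total overlap is a product of per-particle overlaps, which vanishes absurdly fast:

```python
import math

# If two many-body product states differ slightly in each of N
# constituents, with per-particle overlap c < 1, the total overlap is
# c**N. Working with logarithms avoids floating-point underflow.
c = 0.999999  # hypothetical per-particle overlap, very close to 1
for N in (1e3, 1e12, 1e23):  # up to roughly Avogadro-scale counts
    log10_overlap = N * math.log10(c)
    print(f"N = {N:.0e}: overlap ~ 10^{log10_overlap:.3g}")
```

Even with a per-particle overlap this close to 1, at Avogadro-scale particle numbers the total overlap is of order ten to the minus tens of quadrillions, which is the quantitative sense in which macroscopically distinct states can be "safely ignored".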

It is often said that all this discussion is metaphysical and of no consequence. But Eugene Koonin (a biologist with 165429 citations) has argued that the complexity of biological systems is such that their evolution can only be scientifically understood as the result of multiple attempts in multiple parallel realities, and that our observations are thus inevitably subject to statistical post-selection bias (AKA the weak anthropic principle). Koonin seems to be more influenced by his Russian compatriots working on eternal inflation than by Everett, but his arguments could be used in either case. https://doi.org/10.1186/1745-6150-2-15

"The “problem” with the MACHOs observed so far is that they can comprise, at most, a small fraction of the observed/inferred CDM within the smallish part of the MW that has been probed by this technique so far."

@Sabine: Here is a fun idea. What is the *actual* distribution of dimensionless numbers in the existing physics that we know works? I imagine we can generate all kinds of dimensionless ratios so we at least have a testable sample.

We could find at least the median, and see how close it is to 1.0.

We could also test the mode (maybe the half-sample mode, another robust statistic) to see if it is close to 1.0. At least you'd have a talking point about a real distribution that can be analyzed!
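For what it's worth, a toy version of this exercise is easy to set up. A sketch in Python using a handful of well-known dimensionless numbers (the sample selection here is ad hoc and purely illustrative; a serious analysis would need a principled way of choosing which ratios to include):

```python
import statistics

# A small, ad hoc sample of dimensionless numbers from established
# physics (approximate values).
ratios = {
    "fine-structure constant": 1 / 137.036,
    "strong coupling alpha_s(M_Z)": 0.118,
    "weak mixing angle sin^2(theta_W)": 0.231,
    "neutron/proton mass ratio": 1.00138,
    "muon/electron mass ratio": 206.77,
    "proton/electron mass ratio": 1836.15,
    "tau/electron mass ratio": 3477.2,
    "top/up quark mass ratio": 78000.0,  # rough
}

def half_sample_mode(xs):
    """Robust mode estimate: repeatedly keep the half of the sorted
    sample with the smallest range, until at most 2 points remain."""
    xs = sorted(xs)
    while len(xs) > 2:
        h = (len(xs) + 1) // 2
        # window of h consecutive points with minimal spread
        i = min(range(len(xs) - h + 1), key=lambda i: xs[i + h - 1] - xs[i])
        xs = xs[i:i + h]
    return sum(xs) / len(xs)

values = list(ratios.values())
print("median:", statistics.median(values))
print("half-sample mode:", half_sample_mode(values))
```

Even this tiny sample spans some seven orders of magnitude, which hints at the real difficulty: the answer will depend heavily on which ratios one decides to count.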

At the risk of failing comment moderation: a couple of people have repeated the phrase "observed/inferred CDM".

Now, for those of you who have seen Sabine's video: the amount of Dark Matter that is "observed/inferred" in galaxies, at least, is the amount of mass required in an assumed halo to balance Einstein's (and Newton's) equations of motion when the velocity of the stars (from Doppler/redshift) is taken into account.
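The inference described above reduces, in the Newtonian limit, to a one-line estimate: a star on a circular orbit of radius r with speed v requires an enclosed mass M = v^2 r / G. A sketch with round, Milky-Way-like numbers (the specific v and r are illustrative values of mine, not from the comment):

```python
# Newtonian enclosed mass implied by a rotation curve: M(r) = v^2 r / G.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30     # solar mass, kg
kpc = 3.0857e19      # kiloparsec in meters

v = 220e3            # orbital speed, m/s (roughly solar-neighborhood value)
r = 8 * kpc          # galactocentric radius (illustrative)

M_enc = v**2 * r / G  # mass enclosed within radius r
print(f"enclosed mass ~ {M_enc / M_sun:.2e} solar masses")
```

Comparing this dynamical mass with the luminous mass counted inside the same radius is what produces the inferred dark halo: where the rotation curve stays flat at large r, M_enc keeps growing linearly while the starlight does not.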

That places the inference/observation at the same level as that of the planet Vulcan. A planet was a hypothesis that could have saved Newtonian dynamics. It could have been a MACHO.

Now, the planet Vulcan would have comprised only a tiny fraction of a percent of the mass of the solar system. Had considerations involving the speed of light not required the rethink that became Relativity, there is no way Newtonian dynamics should have been overturned for such a tiny anomaly.

Therefore, in the same way with Dark Matter: in the absence of rock-solid galactic rotation relations, we should accept Dark Matter as a yet-to-be-discovered tangible entity, lest we doubt Einstein's equations outside of our station. Or else get ready to re-write our grant proposals with respect to Dark Matter.

Because the parameter in question is not a mass. It's parameterized as a fluid which has the properties of matter. We do not know if it is made of particles which have a mass. Indeed the word "dark mass" would be much more misleading.

I think there is a misleading conflation of energy and mass here. Signals at speed c that change the gravitational field can be modelled as a change of the mass distribution when they are emitted or absorbed; while in flight, they have no mass or gravitational interaction in the usual sense.

"You have a wavefunction that contains infinitely many universes. How many of those are contained in your "physicist"? How do you decide?"

As a partial answer to Sabine's question: it may or may not be possible depending on the wavefunction. In the idealized setting described in my previous comment (a physicist performing n consecutive measurements), Sabine will surely agree (?) that there is a clear way of chopping up the wavefunction into 2^n classical-looking universes.

But if we imagine a multiverse that is in a completely random quantum state, there will be no way to cut it up nicely into classical-looking universes. What this means is that our actual multiverse has very special properties; it is very different from a giant quantum mess. Or at least that we live in a corner of the multiverse with very special properties (after all, there could be some giant quantum mess existing in superposition with our own classical-looking universe, but we have no way to find out).

"Particle Masses: It would be nice to have a way to derive the masses of the particles in the standard model from a theory with fewer parameters, but there is nothing wrong with these masses just being what they are. Thus, not a good problem."

Isn't this explained in terms of the Yukawa couplings of the fermions with the Higgs field?

But then that leads to the question of why the Yukawa coupling of, say, a top quark is so much larger than that of an up quark, or a tau's than an electron's.

Sabine, I should think the zero-point energy (vacuum energy, cosmological constant) problem is worth investigating. Experiments can be done with it (Casimir effect, Lamb-Retherford), and the measured and theoretical values are wildly different. Perhaps other experiments can be devised to just mess with it, and see if it varies under different conditions.
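The size of that mismatch can be reproduced in a few lines of arithmetic. A sketch assuming the usual naive estimate, one Planck energy per Planck volume, compared against the approximate observed dark-energy density:

```python
import math

# Naive QFT vacuum energy density with a Planck-scale cutoff,
# compared to the observed dark-energy density.
hbar = 1.0545718e-34   # reduced Planck constant, J s
c = 2.99792458e8       # speed of light, m/s
G = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2

E_planck = (hbar * c**5 / G) ** 0.5   # Planck energy, ~2e9 J
l_planck = (hbar * G / c**3) ** 0.5   # Planck length, ~1.6e-35 m
rho_theory = E_planck / l_planck**3   # one Planck energy per Planck volume

rho_obs = 5.4e-10  # J/m^3, observed dark-energy density (approximate)

print(f"discrepancy ~ 10^{math.log10(rho_theory / rho_obs):.0f}")
```

This is the origin of the often-quoted "worst prediction in physics": the naive cutoff estimate overshoots the observed value by roughly 120 orders of magnitude.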

Perhaps theory can be modified to account for it, without resorting to supersymmetry, in ways that can be tested. Any kind of quantization of space seems likely to influence the computations; perhaps there is a quantization that comports better with experimental measurement.