This site is the blogging component for my main site Crank Astronomy (formerly "Dealing with Creationism in Astronomy"). It will provide a more interactive component for discussion of the main site content. I will also use this blog to comment on work in progress for the main site, news events, and other pseudoscience-related issues.

Sunday, July 27, 2014

Or How I Stopped Worrying and Learned to Love Infinities…

It's a common whine of pseudo-scientists that mainstream science has lost its way, introducing near-mystical entities like dark matter, dark energy, infinities, etc. to explain what many might want to view as 'common-sense' phenomena.

Of course, they conveniently forget that such 'common-sense' interpretations of the natural world ruled the human race for thousands of years of recorded history. Our modern world of space flight, communications satellites, microelectronics, etc. came about only over the past 400 years or so, as we abandoned the view that Nature operates according to our notions and supernaturalism. Instead, we developed techniques which allowed mathematics to make predictions that could be tested by experiments, paving the way to precision engineering on scales from the very large down to the atomic.

While pseudo-scientists like to complain about modern theories with features they don't like, they conveniently forget (or most likely never actually knew) that many older theories (which they are not complaining about) had annoying infinities and other 'common-sense'-defying characteristics that challenged and perplexed researchers of the day. Some of these issues are still unresolved, but largely ignored because they occur in regimes not yet accessible by experiment. Even with these problems, the theories are still very useful in areas we can reach with experiment.

Older theories made many 'nonsense' predictions, a number of which were even verified by experiments. Electromagnetism is loaded with them.

Poisson's Spot or Arago's Spot (Wikipedia): A small bright spot in the center of a shadow, a consequence of the wave theory of light. Poisson predicted, using the wave theory of light, that a bright spot would exist in the center of a circular shadow, and claimed that such a nonsense result was evidence against the wave theory. Then Arago set up an experiment and found it. This is actually a simple experiment for a reasonably equipped optics lab; I have even demonstrated it myself.

Maxwell's equations predicted that atoms composed of negative charges occupying a large space around a central positive charge were unstable and would collapse in less than a millionth of a second, because the negative charge in orbit around the positive center would radiate away its energy. We know that atoms do not collapse.
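
The collapse-time estimate is a textbook classical-electrodynamics result, t = a0^3 / (4 r_e^2 c), and takes one line to evaluate:

```python
# Classical estimate of how long an orbiting electron would take to
# spiral into the nucleus by radiating its orbital energy away.
a0  = 5.29e-11      # Bohr radius, m (starting orbit)
r_e = 2.82e-15      # classical electron radius, m
c   = 3.0e8         # speed of light, m/s

# Standard classical result: t = a0^3 / (4 r_e^2 c)
t_collapse = a0**3 / (4 * r_e**2 * c)
print(f"classical collapse time ~ {t_collapse:.1e} s")
```

About 1.6e-11 seconds, comfortably 'less than a millionth of a second'.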

Maxwell's electromagnetism predicted that heat from a fireplace would emit a dangerous amount (essentially an infinite amount) of high-energy radiation, yet that is not observed (Wikipedia: Ultraviolet Catastrophe).
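
Here's a quick comparison of the classical (Rayleigh-Jeans) and quantum (Planck) spectral radiance for a fire-like temperature I chose for illustration. The classical formula agrees with Planck at low frequency but runs away toward the ultraviolet:

```python
import math

# Compare Rayleigh-Jeans (classical) and Planck spectral radiance.
h, k, c = 6.626e-34, 1.381e-23, 3.0e8
T = 1000.0  # kelvin, roughly a hot fire (illustrative)

def rayleigh_jeans(nu):
    # Classical prediction: grows without bound as nu increases.
    return 2 * nu**2 * k * T / c**2

def planck(nu):
    # Planck's law: the exponential cutoff removes the 'catastrophe'.
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

for nu in (1e12, 1e14, 1e16):  # IR -> near-visible -> far UV
    print(f"nu={nu:.0e}: RJ={rayleigh_jeans(nu):.3e}, Planck={planck(nu):.3e}")
```

At far-ultraviolet frequencies the classical value is enormous while the Planck value is essentially zero, which is the 'catastrophe' in miniature.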

Electromagnetism has a similar problem to Newtonian gravity at r=0. If we compute the energy of an electron of a finite radius, we get a diverging result as the radius approaches zero. We can choose a finite radius where the internal energy matches the electron mass, and obtain the Classical Electron Radius of 2.82e-15 meters (Wikipedia: Classical Electron Radius). But when we slam electrons together in accelerators, we find the electrons penetrate significantly closer than this. Today, we still only have an 'upper limit' of 1e-22 meters, which means that is the LARGEST the electron could be; it places no constraints on how much smaller it could be (Wikipedia: Electron-Fundamental Properties). The Standard Model assumes that electrons, neutrinos, and quarks are all structureless point-particles. Does the electron have structure? The honest answer is scientists don't know, but are trying to find out.
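
The Classical Electron Radius quoted above follows from equating the electrostatic self-energy scale e^2/(4 pi eps0 r) to the rest energy m c^2; a quick check:

```python
import math

# Classical electron radius: the radius at which the electrostatic
# self-energy scale equals the electron rest energy m c^2.
e    = 1.602e-19    # elementary charge, C
eps0 = 8.854e-12    # vacuum permittivity, F/m
m_e  = 9.109e-31    # electron mass, kg
c    = 2.998e8      # speed of light, m/s

r_e = e**2 / (4 * math.pi * eps0 * m_e * c**2)
print(f"classical electron radius = {r_e:.3e} m")
```

The result, ~2.82e-15 m, is the figure in the text, and accelerator experiments have long since probed far below it.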

The Standard Model of particle physics treats all particles as point-masses and charges. While some theories proposed beyond the Standard Model hypothesize a finite size for the electron, none of those theories have yet been verified. Apart from a few anomalies (neutrino mass, electron magnetic moment, etc.), this treatment works quite well, but it is clearly not complete (Wikipedia: Point Particle, Standard Model).

With such bizarre, nonsense, illogical predictions, why aren't many cranks advocating total abandonment of Maxwell's equations?

Even circuit analysis was not immune from infinities. LC-resonance circuits have an infinity at their resonance frequency (Wikipedia: Electrical Resonance, LC Circuit), which is only resolved with the recognition that real circuits have some resistance, which eliminates the infinity (Wikipedia: RLC Circuit). Here's an interesting question to which maybe readers can find the answer: has anyone built an LC circuit using superconductors, which ostensibly have zero resistance? If so, what happens at the resonance frequency?
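
To see how resistance tames the infinity, here's a minimal sketch of the driven series RLC current at the resonance frequency (component values are arbitrary illustrations):

```python
import math

# Current amplitude in a series RLC circuit driven at resonance:
# with finite resistance the 'infinity' disappears.
L = 1e-3      # inductance, henries (illustrative)
C = 1e-6      # capacitance, farads (illustrative)
V = 1.0       # drive amplitude, volts

w0 = 1 / math.sqrt(L * C)   # resonance angular frequency, rad/s

def current(R, w):
    # Series impedance Z = R + j(wL - 1/(wC)); current = V/|Z|.
    Z = complex(R, w * L - 1 / (w * C))
    return V / abs(Z)

for R in (10.0, 1.0, 1e-3):
    print(f"R={R}: I at resonance = {current(R, w0):.3e} A")
```

At resonance the reactances cancel and the current is just V/R, so as R goes to zero (the idealized, or superconducting, case) the predicted current grows without bound.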

Fluid dynamical equations, such as the Navier-Stokes equations (Wikipedia), treat liquids as a continuum at all scales. In the math, fluids are not ultimately made of atoms and molecules. As a result, the mathematics can generate singularities (Backreaction: Singularities in your Kitchen).

Then there's all the counter-intuitive phenomena in quantum mechanics, such as tunneling (Wikipedia), which are fundamental to the operation of the integrated circuits in the computer on which you are reading this. Quantum mechanics also generates its own set of infinities. So far, they are dealt with by a number of techniques that can at best be called 'hacks' (Wikipedia: Renormalization), which allow us to obtain experimentally testable numbers but still indicate the theory is not complete.

Why don't we discard these theories since they generate such nonsensical results?

New Knowledge from Counter-Intuitive Behavior of Physical Theories

Almost invariably, the odd solutions to physical theories generate useful insights. Sometimes these odd solutions are simply ignorable; often the researcher has just made an error or oversight in applying the theory. That's why physical science is a social endeavor, as researchers provide checks on each other's work.

Consider the Einstein-Podolsky-Rosen (EPR) argument (Wikipedia). It took many years before we could actually test it, and when we did, we found that the bizarre 'spooky action-at-a-distance' which Einstein said was proof of the failure of quantum mechanics turned out to be exactly how the universe works! Even so, some researchers continue to explore whether this strange behavior has a more 'logical' origin.

Or the negative-root prediction of the positron: negative-energy roots of the Dirac equation (Wikipedia) suggested the existence of a positive counterpart of the electron. It was initially misidentified with the proton; the actual particle was identified a few years later (Wikipedia: positron).
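
The two roots come straight out of the relativistic energy-momentum relation, E^2 = (pc)^2 + (mc^2)^2. A trivial illustration (the example momentum is mine):

```python
import math

# The relativistic energy-momentum relation has two roots; Dirac took
# the negative one seriously, presaging the positron.
m_e_c2 = 0.511    # electron rest energy, MeV
pc     = 1.0      # example momentum times c, MeV (illustrative)

E = math.sqrt(pc**2 + m_e_c2**2)
roots = (+E, -E)
print(f"energy roots: {roots[0]:.3f} MeV and {roots[1]:.3f} MeV")
```

The 'nonsense' negative root was not discarded; it was interpreted, and a new particle was found.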

One reason we keep these models, in spite of their problems, is that they still work better than their competitors in generating numerical values of measurements which we can compare to experiments. This ability also makes them easier to use as the basis for new technologies. All these 'problem theories' are the same ones that have made all modern technologies possible. You reject them at your peril.

The other reason is these theories work quite well everywhere else, while exhibiting strange behavior under very limited conditions.

Cranks want to replace these theories with bizarre notions that generate few or no testable predictions. When we investigate the alternatives presented by the cranks, we encounter:

no math to provide predictive capability; or

presentation of the same math as the mainstream theory, but surrounded by different terminology; or

replacing a single theory that makes many different predictions with numerous ad hoc explanations. The big problem with these alternatives is that they can only arise after the original theory makes a successful prediction, at which point the pseudo-scientists adopt the result as yet another ad hoc explanation.

So, confronted with these infinities and nonsensical behaviors, we have two options:

Admit the answer is unknown, then actually do the REAL work needed to solve it: continue to use the theory where it has good predictive power, and work to test it closer to the limits where the strange behavior occurs.

The pseudo-scientist 'solution' is to claim the problem doesn't exist or is made up, and to ignore the theory which has it, often without replacement. Of course, such individuals will miss out on the discoveries and inventions possible with even an incomplete theory.

Cranks routinely miss the fact that all models are an approximation to reality (see Crank Science: Worse than Wrong), but an approximation that produces useful results is still valuable, and better than models that produce no objective predictions beyond, say, "it looks like an electric filament!" As I've noted here many times, crank theories produce no useful results yet are repeatedly claimed as successful, which is what makes them crank theories.

What Infinities Tell Us

Science popularizers often dwell on some of the odder predictions of various theories, such as speculating about the meaning of any infinities a theory might have. This is in part to emphasize that there are still discoveries to be made, that science is not complete.

But anyone claiming to do serious SCIENTIFIC critiques would rightfully have their judgement questioned if they based their arguments on the statements made by popularizers. It's legitimate to argue about the best analogies, over-sensationalizing, or even outright inaccuracies in statements made by popularizers who are aiming to encourage excitement and enthusiasm about the science. But once one gets into describing what happens when we encounter infinities in a scientific theory, popularizers should probably include a red warning banner that reads "SPECULATION" or "HYPOTHESIS".

In reality, history suggests that infinities are signposts:

"Beyond Here May Be NEW PHYSICS".

It's an area on the frontier, seeking solutions, and therefore the most exciting. Sometimes it's just telling us that the approximations break down: atomic-scale processes become important in a fluid, resistance becomes important in a circuit, etc. Sometimes there really is new physics. The Pioneer anomaly was a real observation that could have been a hint of new physics, but detailed examinations of laboratory-scale phenomena, such as radiation recoil, revealed that over the 40+ years of the mission even these tiny effects could accumulate sufficiently to be detectable (Wikipedia: Pioneer Anomaly).
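
The radiation-recoil arithmetic is simple enough to do on one line. The spacecraft mass and the amount of anisotropically radiated heat below are round numbers of my choosing, just to show that tens of watts is enough to matter:

```python
# Order-of-magnitude check: can anisotropically radiated heat account
# for an acceleration of order 1e-9 m/s^2?  Numbers are illustrative.
c = 3.0e8        # speed of light, m/s
m = 250.0        # spacecraft mass, kg (roughly Pioneer-class)
P_aniso = 65.0   # watts of heat radiated preferentially in one direction

# Photons carry momentum: force = P/c, so acceleration = P/(m c).
a = P_aniso / (m * c)
print(f"recoil acceleration ~ {a:.1e} m/s^2")
```

That lands in the 1e-10 to 1e-9 m/s^2 range, the same scale as the reported anomaly, which is why the thermal-recoil explanation had to be taken seriously.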

So what's up with black holes?

Many cranks try to make a big deal about general relativity and black holes, wanting to rant about singularities and such. These singularities in General Relativity are the modern version of these historical scientific problems, and as such get a lot of attention in popularizations.

Is there a singularity at 'center' of black hole?

Many physicists (myself included) suspect you can never actually cross the event horizon, so what lies at the center, while perhaps interesting, is largely irrelevant.

However, some solutions suggest you can actually cross the horizon and reach the center, but it would be a one-way trip.

Note that even the recent 'Cosmos' series didn't make any claims about passing THROUGH the event horizon of a black hole.

Was there anything prior to the 'singularity' at the beginning of the Big Bang?

Either way, trying to find a conclusive answer to the question can provide new insights, but in the process generates a horde of speculations. Most of this interaction takes place in the literature and places like the Cornell Preprint server. All the PROFESSIONAL literature I've read on this controversial topic is pretty civil. Some of it receives responses, but things are usually quiet unless some new experimental result suggests a connection, or a university press office desperately needs something to publish.

Not so with the cranks, who take being dismissed, corrected, or ignored as personal attacks and try to use this as a way to get attention, but that's another post...

Many of the cranks who scream about the nonsensical theories seem utterly clueless about the history of the topic, ignoring how many of those 'problem theories' are not only still in use today, but in many cases driving technologies that they use.

But the cranks will find some excuse to ignore these inconvenient facts...

"But those infinities are not as bad as this one in relativity..."

Baloney.

Many of these historical problems were sources of serious controversy in their day, just as the modern infinities are. Some, such as the infinite self-energy electromagnetic mass issue, are still only solved by methods that aren't much different from hacks. Such workarounds suggest that our theory is not truly complete.

But does that mean the theory is useless?

No, it is quite usable in the broader realms, but the problems provide guides for future experiments which may outline the requirements of a more complete theory.

We were getting into the second hour of the interview, and it was beginning to show: I was starting to tire, so I had a lot of 'you knows' and tended to diverge from the topic. However, I'm generally pleased with the result.

Sunday, July 13, 2014

This post covers a set of general issues about falsification of scientific models and is also meant to be a follow-up to the claims by Bruno Suric in this comment.

I've written on the lame claim about the original Eddington observations of gravitational deflection (see Relativity Denial: The 1919 Solar Eclipse). Supporters of this claim seem oblivious to the fact that this was far from the only time in history these observations could be done. The observations have been repeated, and improved upon, in the 90+ years since. The Hipparcos data has measured the deflection to an accuracy of 0.3%, far more than enough to exclude the Newtonian model of light deflection. The fact that there were some possible conflicts in the observations suggests future tests must improve the experimental controls to remove the ambiguity.

The history of science is filled with examples where Mr. Suric's interpretation of falsification of a scientific model would lead one down an erroneous path.

Consider another example. In the early 1800s it was found that the orbit of Uranus did not match that expected from Newtonian gravitation, even when including the perturbations due to the other known planets. Did that falsify Newtonian gravitation?

No.

While it was certainly possible that Newtonian gravitation did not apply in the outer reaches of the solar system, that was not the only possible solution to the problem.

Because one of the other assumptions in the calculation was that the only perturbing bodies were the other (known) planets in the solar system. What if THAT assumption were incorrect? It was: the discrepancy led Le Verrier and Adams to predict a new planet, and Neptune was found in 1846 almost exactly where the calculations placed it.

This happens quite often in science as many complex models are built with many underlying assumptions. When the model undergoes rigorous testing, those assumptions are also being tested.

Falsifying the Standard Solar Model With the Solar Neutrino Problem?

Few modern models are the result of only one assumption. They are usually the result of a combination of many assumptions, all tested to some finite level of precision. You might make the assumption that a component of the model is still valid beyond the tested range of precision, and that is often not an unreasonable assumption. Nature is surprisingly consistent in many of these cases, but Nature is under no obligation to agree with extreme extensions of empirically-tested natural 'laws'.

Mr. Suric claims that the solar neutrino problem should have falsified the entire model of the Sun. But this demonstrates a lack of understanding of how complex scientific models are made and tested.

What we call the Standard Solar Model has evolved over time as our understanding of the underlying fundamental physics and computational techniques improved (Wikipedia). The first solar models were computed with pencil-and-paper math and slide rules, integrating the structure equations with large distance steps, which risked computational as well as transcription errors.

The ability of computers to do many of the computations repeatedly and tirelessly, with smaller distance steps and more accuracy, enabled more accurate physics to be included. Today, some researchers write their own stellar structure codes or run one of the publicly available versions (see references below) that can run on modern personal computers. I've run some of these models on my laptop.
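
Real structure codes solve far more physics, but to give a flavor of 'integrating the structure equations with distance steps', here is a toy integration of the Lane-Emden equation for an n=3 polytrope, a classic rough stand-in for a star like the Sun. The step size and starting values are my own choices:

```python
# Toy stellar-structure integration: the Lane-Emden equation for a
# polytrope of index n=3.  Only a flavor of what real codes do.
n = 3.0
dxi = 1e-4
xi, theta, dtheta = 1e-6, 1.0, 0.0   # start just off the center

while theta > 0:
    # Lane-Emden: theta'' = -theta^n - (2/xi) theta'
    d2theta = -theta**n - (2.0 / xi) * dtheta
    dtheta += d2theta * dxi          # update slope first,
    theta  += dtheta * dxi           # then the function (semi-implicit)
    xi     += dxi

print(f"surface (first zero of theta) at xi ~ {xi:.3f}")
```

The integration should land near the known first zero for n=3, xi ~ 6.897; shrinking the step size improves the answer, which is exactly the game the early modelers played by hand.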

In the 1960s, at the time the first solar neutrino flux estimates were made, there were a number of other assumptions that went into the standard solar model (see Helium Content and Neutrino Fluxes in Solar Models). These assumptions were ALSO being tested.

ALL of these assumptions, and more, were being tested in the Standard Solar Model. After the first solar neutrino measurements revealed a neutrino count significantly less than expected (about 1/3 of the expected value), the other assumptions in the Standard Solar Model were re-examined (see Search for Neutrinos from the Sun).

An examination of all the alternatives eventually narrowed the problem down to the possibility that the neutrino oscillated between the different 'flavors' requiring the neutrino to have a mass, albeit very small. We still cannot measure the mass of the neutrino directly, but there are experiments that can put limits on the mass.

In my early undergraduate days, in the late 1970s, this idea as a solution to the solar neutrino problem was under discussion. I worked with a professor who did weak-interaction theory (which often involves neutrinos) and who, at the time, regarded the neutrino as essentially massless. That was a fundamental assumption in the Standard Model of Particle Physics.

What a difference twenty+ years of technology improvements makes...

Eventually, neutrino detectors were built with sufficient sensitivity that they could detect the other 'flavors' of neutrinos, the mu and tau neutrinos. With these, we successfully detected the extra neutrinos from the Sun. Experiments were also conducted using nuclear reactors with known neutrino production rates, sending neutrinos through the Earth to be received by a remote detector. This provided an Earth-based test of neutrino oscillations where we had a controlled source (T2K Experiment). Neutrinos with mass are not part of the Standard Model of Particle Physics (Wikipedia, Neutrinos in the Standard Model).
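
As a sketch of the oscillation mechanism, here is the standard two-flavor survival probability, P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E), with parameter values I picked to be of the order of the measured solar-sector ones (they are illustrative, not a fit):

```python
import math

# Two-flavor neutrino oscillation sketch: probability that an electron
# neutrino is still an electron neutrino after traveling a distance L.
sin2_2theta = 0.85      # mixing amplitude (illustrative)
dm2 = 7.5e-5            # mass-squared splitting, eV^2 (illustrative)
E = 0.005               # neutrino energy, GeV (5 MeV, solar-like)

def survival(L_km):
    # Standard convention: dm2 in eV^2, L in km, E in GeV.
    phase = 1.27 * dm2 * L_km / E
    return 1 - sin2_2theta * math.sin(phase)**2

for L in (0.0, 100.0, 1000.0):
    print(f"L={L:7.1f} km: P(nu_e -> nu_e) = {survival(L):.3f}")
```

The key point: the oscillation only happens at all if dm2 is nonzero, i.e., if at least one neutrino has mass.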

So a Nobel Prize was awarded for this work (NobelPrize: Bahcall). This is ridiculed by Mr. Suric, but have we seen a prediction of the neutrino flux, as well as the stellar structure, from the model HE advocates? If so, where? How well does his model hold up under examination with newer experimental techniques like helioseismology (Wikipedia)?

With these new experiments, a number of cranks have had to resort to handwaving attempts to discredit the experiments (see Tim Thompson's Rebuttal to Donald Scott on the topic). Yet legitimate researchers continue to use the results of these experiments to build even more refined experiments that not only verify but improve on the precision and extend the results.

That is what REAL science does: the results of previous experiments are subject to repeated testing and retesting.

Meanwhile, pseudoscience is still making excuses. After all, where is THEIR computation of the predicted neutrino flux of the Sun, computed from first principles in THEIR model? We've seen NO publication of THEIR predictions, nor of how they were obtained, yet they continue to scream that their model is more successful!

As I note above, the problem with the Standard Solar Model was solved by requiring a revision of the Standard Model of particle physics. But does that mean all physicists must stop using the Standard Model for nuclear and particle physics calculations?

No!

Because for many other calculations, of reaction rates, etc., the Standard Model works just fine. It only becomes an issue near the limits of applicability of the model, which we now know is where the neutrino mass may be a factor.

Galileo said that the acceleration of gravity was constant.

When Newton suggested that it varied with distance as 1/r^2, did we stop using a constant for the acceleration of gravity?

No!

Because even Newton's theory demonstrated that gravity deviated from a constant at the surface of the Earth by an amount so small as to be irrelevant for most practical applications near the surface of the Earth.
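
The size of that deviation is easy to quantify with Newton's own formula, g(h) = g0 * (R / (R + h))^2:

```python
# How much does g deviate from a constant between the ground and a
# modest altitude?  Newton: g(h) = g0 * (R / (R + h))^2.
R  = 6.371e6    # Earth radius, m
g0 = 9.81       # surface gravitational acceleration, m/s^2

def g(h):
    return g0 * (R / (R + h))**2

h = 1000.0  # a 1 km climb
fractional_change = (g0 - g(h)) / g0
print(f"fractional change in g over {h:.0f} m: {fractional_change:.2e}")
```

A few parts in ten thousand over a kilometer of altitude: irrelevant for dropping cannonballs, essential for satellite orbits.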

Similarly, we still use the Newtonian model of gravitation for planning rocket launches and travel through most of the solar system, because even Einstein's theory gives the same predictions as the Newtonian theory to a precision smaller than the errors produced by aerodynamic and other engineering uncertainties.
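
A quick way to see how small the relativistic correction is for interplanetary work is the dimensionless ratio 2GM/(r c^2), the scale of general-relativistic corrections to Newtonian orbits:

```python
# Rough size of the general-relativistic correction to Newtonian
# gravity at Earth's distance from the Sun.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30   # solar mass, kg
r = 1.496e11       # 1 astronomical unit, m
c = 3.0e8          # speed of light, m/s

correction = 2 * G * M_sun / (r * c**2)
print(f"GR correction scale at 1 AU ~ {correction:.1e}")
```

A few parts in a hundred million, far below typical engineering uncertainties in a rocket launch, which is why Newtonian gravity remains the working tool.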

Will the Standard Model of Particle Physics eventually be replaced?

Almost certainly!

But you can bet that whatever theory replaces it will generate predictions close, or identical to the current Standard Model in the areas where the Standard Model works well today.

Ad Absurdum

"The theory of the round Earth cannot explain the existence of mountains. Therefore the model must be discarded. This is evidence that the Earth is flat."

Most people would regard this statement as ridiculous (except perhaps the late Charles Johnson and his supporters).

Yet this is exactly the pattern of 'logic' invoked by many pseudo-scientists.

The Electric Universe Theory is very intriguing to me for the simple fact that it’s elegant, easy to grasp and can explain many anomalies that occur in Nature. In fact, I consider it an equal to Einstein’s Gravitic theories that is the mainstream thought today. Do I think it’s THE theory? No, but I think […]

Unfortunately for Dad2059, Nature is under no obligation to conform to human prejudices and colloquial notions of 'logic', 'elegance' or 'simplicity'. But time after time, mathematics has demonstrated its power not only to explain Nature, but allow us to apply those same discovered rules in building technologies (see The Cosmos In Your Pocket: Expanded & Revised).

Now to Mr. Reeve's claims...

Re: “What you’re really saying is EU ‘theorists’ STILL have no model that provides numerical values we can compare against measurements from spacecraft.”

Reeve: Did I say that?

This sounds like an attempt to stall. Actually, Mr. Reeve dances all around this fact, because it is a 'third rail' of Electric Universe (EU) claims (Wikipedia: The Third Rail). It is the glaring fact that Electric Universe advocates fear to bring to attention, or to admit even to themselves.

Re: “EU ‘theories’ are useless for doing anything real.”

Reeve: Have you considered that most people do not yet understand what the idea actually IS? How will quantifying a web of concepts which few people understand to this day add clarity?

Really, where do Electric Universe advocates get off claiming their 'theory' is a better description of reality, when, according to Mr. Reeve, they can't even define it in a coherent, objective, TESTABLE fashion? Electromagnetism is pretty definite on these things - define a current and charge distribution and the electric and magnetic fields have definite values from Maxwell's equations.

So what IS so difficult about this to EU supporters?

Re: “Any claims by EU advocates that they have a theory that *works* better than the standard models are, to put it kindly, FALSE.”

Reeve: Not every debate hinges strictly on mathematics.

But scientific debates do.

Scientific debates hinge on the ability of the mathematical rules to successfully explain the operation of Nature by generating numbers that can be compared to actual measurements. THAT is the criterion of a successful theory, particularly in the physical sciences.

If Marklund convection is a better 'explanation', as he claims, then there should be a good match between the predictions of the mathematical model AND the observations. THAT is the DEFINITION of a 'better explanation' in the standards of science.

Let's examine the issues Mr. Reeve's claim raises, but which he evaded, with his 'better explanation':

A Marklund current requires a large scale electric field along the length of the current. Electric fields require charge separation. Since opposite charges always attract and cosmic plasmas are generally neutral or quasi-neutral, you need energy to separate the charges to create the electric field. Where did this energy come from and how did it separate the charges along the length of the current in such an organized way?

What's the strength of the electric field? Is it strong enough to create Stark splitting (wikipedia) of the ionic spectral lines in the plasma? If so, then we should be able to detect this splitting with spectroscopes on Earth (visible, IR or radio wavelengths) and determine the electric field. The values measured should be in reasonable agreement (within an order of magnitude or better) with the prediction.

We have pretty good methods for measuring cosmic magnetic fields, even with limitations imposed by 'line-of-sight'. Given reasonable assumptions about the size scales of these structures, one should be able to estimate the magnetic field strength along the structure and, from that, determine what field would be measured by Earth-based instruments via the Zeeman effect. Note this effect is used all the time to determine the magnetic fields of the Sun (wikipedia) and other stars.
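
For a sense of scale, the normal Zeeman splitting in frequency is just the Bohr magneton times the field, divided by Planck's constant. The field value below is mine, chosen to be roughly sunspot-like:

```python
# Rough Zeeman splitting for a strong, sunspot-scale magnetic field,
# to show the effect lands in a spectroscopically measurable range.
mu_B = 9.274e-24    # Bohr magneton, J/T
h = 6.626e-34       # Planck constant, J s
B = 0.3             # tesla (illustrative, sunspot-scale)

delta_nu = mu_B * B / h    # normal Zeeman splitting, Hz
print(f"Zeeman splitting ~ {delta_nu:.2e} Hz")
```

A few gigahertz of splitting on a strong field: readily measurable, which is why solar magnetographs work at all. Cosmic fields are far weaker, so the measurements are harder but the principle is identical.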

With a model of the magnetic field, and electric field, around the filament, it is simple to solve for the convective velocity V = crossproduct(E, B)/B^2 and determine the spectral profile of the ions with the Doppler effect. Note that because Doppler, Zeeman, and Stark effects have different dependencies on wavelength, they can be separated to determine velocities, magnetic fields, and electric fields.
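
The drift-velocity computation mentioned above is a one-line cross product. The field values below are mine, chosen only to show the arithmetic, not to represent any claimed astrophysical configuration:

```python
import numpy as np

# E x B drift velocity, v = (E x B) / |B|^2 in SI units, for an
# illustrative field configuration.
E = np.array([1e-3, 0.0, 0.0])   # electric field, V/m (illustrative)
B = np.array([0.0, 0.0, 1e-9])   # magnetic field, tesla (illustrative)

v = np.cross(E, B) / np.dot(B, B)
print(f"drift velocity = {v} m/s, speed = {np.linalg.norm(v):.2e} m/s")
```

Note the drift is perpendicular to both fields, so the predicted Doppler shifts have a definite geometry that observations can check.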

Using the velocity and ionization states of various elements, compute chemical distributions. While Marklund, Peratt and Alfven DESCRIBE this process, I have yet to see actual predictions of chemical distributions for a given plasma, field, and temperature configuration. This will also impact the spectra of the object.

You've got charged particles moving in a magnetic field. What's the synchrotron flux predicted for this configuration? This should be detectable by radio telescopes and should exhibit some correlation structures visible at other wavelengths.

Until you can show a direct match between the mathematically-determined predictions of the model AND ACTUAL OBSERVATIONS, these claims are, at BEST, a hypothesis.
By that criterion alone, the Marklund convection idea is in a weaker position than Dark Matter (wikipedia), since we can at least add the hypothesized particles to simulations and obtain better agreement with observations, which can be used to constrain the various searches for what Dark Matter actually IS. See recent cosmological simulations (The Millennium Simulation Project & Illustris (Phys.org)) and compare to skymaps from SDSS. In addition to just comparing the 'look' of the models to the data, researchers also compare other parameters, such as the size and mass distributions of the galaxies, between the models and the data. Are they a perfect match? No. But they are still in better agreement than anything presented by Electric Universe supporters.

Reeve: "This is not a commentary on the legitimacy of the idea itself; it’s an attempt to get people to skip over the process of actually learning the idea, in order to interfere with the process of building awareness sufficient to match the quantitative appeal of the conventional theories."

So again, Mr. Reeve wants to evade the fact that EU theories are useless for doing anything in the real world, like estimating the radiation fluxes needed to protect astronauts and satellites in space.

So how do we protect our astronauts and other space assets with 'awareness' of Electric Universe claims? I mean beyond the obvious benefits obtained by recognizing that Electric Universe claims give no useful information on the radiation environment and are therefore ignorable? I assume that is NOT what Mr. Reeve means.

As for Peratt's supercomputer access, did Peratt actually have a grant authorizing his time on the resource for that use? I've had access to a few supercomputers for various parts of my job, but I can't just run any project I desire on them. Loads of other people compete for time on those machines. Besides, today you can build supercomputers with off-the-shelf hardware quite cheaply (Wikipedia: Beowulf Cluster) and the TRISTAN plasma code which Peratt used is publicly available (TRISTAN). With modern hardware, a smaller machine would still probably be faster than the system Peratt had available in the 1980s (see Electric Universe: Real Plasma Physicists BUILD Mathematical Models. Note the comments: Whatever happened to that simulation on modern hardware that Siggy_G was going to do?)

Yet EU supporters, instead of rolling up their sleeves and actually doing the work, choose a position, apparently encouraged by Mr. Reeve, of regarding their 'wishful thinking' as having the same value as actual scientific evidence.

We did not go to the Moon, or send spacecraft to distant planets, by 'wishful thinking.' We did it by doing the math, which verified the physics, which guided the engineering. Wishful thinking might have provided inspiration, but, as Thomas Edison has noted (Wikiquote), it takes much more perspiration to actually get the job done.

More and more, Electric Universe 'theorists' look like posers or wannabes, who want recognition for work they have not actually done.


About Me

I obtained my doctorate in physics and astronomy in 1994. I currently work in scientific data visualization for the media and public outreach. For more information on how I became involved in the creationism issue, visit my main page.