Many physicists, Prof. Stephen Hawking being the most recognizable, interpret Gödel’s incompleteness theorems as forbidding even the possibility of a theory of everything (read Prof. Hawking’s thoughts here).

Prof. Hawking writes:

“What is the relation between Godel’s theorem and whether we can formulate the theory of the universe in terms of a finite number of principles? One connection is obvious. According to the positivist philosophy of science, a physical theory is a mathematical model. So if there are mathematical results that can not be proved, there are physical problems that can not be predicted.”

But mathematics and physics are not comparable. There are no restrictions on the formulation of a mathematical proposition, so it is always possible to formulate a proposition that cannot be derived from a given axiom set or that is inconsistent with it. If a proposition cannot be derived from the axiom set, then the axiom set is considered incomplete. And if the proposition is inconsistent with the axiom set, then any axiom added to make the proposition derivable would render the set internally inconsistent. Therefore, all axiom sets are either incomplete, inconsistent or both.

Things are very different in physics. We can safely assume that the Universe is composed of a finite number of types of fundamental objects which interact through a finite number of fundamental forces, and that together these define a finite number of fundamental laws, each of which may be represented by an axiom. From the set of all such axioms, the axiom set of the Universe, the behaviour of any physical system at all scales can be derived. It is not possible to derive from this axiom set a prediction that would be inconsistent with it or that would not be part of the Universe at some point in its evolution. The Universe’s axiom set being complete (all that is can be derived from it) and consistent (nothing derived from it is inconsistent with it), the theory derived from the Universe’s axiom set must be the theory of everything. A theory of everything is therefore possible.

But though a theory of everything is possible, how can we find it, and how will we know we have found it when we do?

A physical prediction can be tested without knowing the Universe’s axiom set. We don’t have to reduce a prediction to the axioms of the Universe’s axiom set to prove it; we don’t need to know the axiom set at all. All we have to do is devise observations or experiments that test the prediction. If the prediction is not consistent with observations, and if the prediction is correctly derived from a physics theory, then we know that the axiom set of the theory is incomplete and/or inconsistent. If all predictions at all scales are observationally confirmed, then the theory is complete and consistent and we have a theory of everything.

Quantum mechanics and general relativity both make predictions that are consistent with reality at the microscopic and macroscopic scales respectively, but when applied across all scales, both theories make predictions that are inconsistent with observations or experiments. This proves that both theories are incomplete and/or inconsistent, or that their axioms do not correspond to fundamental aspects of reality. It also proves that their unification is impossible, since their axiom sets are mutually exclusive. If we are to make progress towards the theory of everything (I use “the” because there can only be one), then we will have to work from axiom sets different from those of general relativity and quantum mechanics. A good place to start may be the minimal axiom sets necessary to describe dynamic systems.

Two Ways to do Science

From an axiomatic standpoint, there are only two ways to do theoretical physics. The first aims to extend, expand and deepen an existing theory, which is what the overwhelming majority of theorists do. This approach assumes that the theory is fundamentally correct, that is, that its axioms correspond to fundamental aspects of reality.

The second way of doing theoretical physics is to create a new axiom set and derive a theory from it. Distinct axiom sets will lead to distinct theories which, even if they are mutually exclusive, may still describe and explain phenomena in ways that are consistent with observations. There can be a multiplicity of such “correct” theories if the axioms are made to correspond to observed aspects of physical reality that are not fundamental but emergent. For instance, theories have been built in which one axiom states that the fundamental component of matter is the atom. Such theories, though they may describe some phenomena at the molecular scale very well, fail to explain a number of phenomena at smaller scales. In the strict sense, premises based on emergent aspects of reality are not axioms in the physical sense; they are better understood as theorems. And just as theorems in mathematics can explain the behaviour of mathematical objects belonging to a certain class but cannot be generalized to others, physical theorems can explain the behaviour of objects belonging to a certain scale, but these explanations cannot be extended to other scales, or even to other classes of objects at the same scale.

But axiom sets are not inherently wrong or right. By definition, since axioms are the starting point, they cannot be reduced or broken down. Hence, we cannot directly prove whether they correspond to fundamental aspects of reality. However, if the models that emerge from an axiom set explain and describe reality and, most importantly, allow predictions that can be tested, then confirmation of the predictions becomes evidence supporting the axiom set.

The Axiomatic Approach

It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience. — Albert Einstein

The dominant approach in science (and a hugely successful one for that matter) is the empirical approach. That is, the approach by which science accumulates data from which it extracts relationships and assumptions that better our understanding of the Universe.

The empirical approach is an essential part of an approach we might call deconstructive. By that I mean that we take pieces or segments of reality from which, through experiments and observations, we extract data from which we hope to deduce the governing laws of the Universe. But though the deconstructive approach works well with observable phenomena, it has so far failed to provide us with a complete and consistent understanding of fundamental reality.

Of course, when a theory is formulated that is in agreement with a data set, it must be tested against future data sets for which it makes predictions. And if the data disagrees with predictions, the theory may be adjusted so as to make it consistent with the data. Then the theory is tested against a new or expanded data set to see if it holds. If it doesn’t, the trial and error process may be repeated so as to make the theory applicable to an increasingly wider domain of reality.

The amount of data accumulated from experiments and observations is astronomical, but we have yet to find the key to decipher it and unlock the fundamental laws governing the Universe.

Also, data is subject to countless interpretations and the number of mutually exclusive models and theories increases as a function of the quantity of accumulated data.

About the Source of Incompatibilities between Theories

Reality can be thought of as an axiomatic system in which fundamental aspects correspond to axioms and non-fundamental aspects correspond to theorems.

The empirical method is essentially a method by which we try to deduce the axiom set of reality, the fundamental components and forces, from theorems (non-fundamental interactions). There lies the problem. Even though reality is a complete and consistent system, the laws extracted from observations at different scales of reality and which form the basis of physics theories do not together form a complete and consistent axiomatic system.

The predictions of current theories may agree with observations at the scale from which their premises were extracted, but they fail, often catastrophically, when it comes to making predictions at different scales of reality.

This may indicate that current theories are not axiomatic; they are not based on true physical axioms, that is, the founding propositions of the theories do not correspond to fundamental aspects of reality. If they did, then the axioms of distinct theories could be merged into a consistent (but not necessarily complete) axiomatic set. There would be no incompatibilities.

Also, if theories were axiomatic systems in the way we described above, their axioms would be similar or complementary. Physical axioms can never be in contradiction.

This raises important questions regarding the empirical method and its potential to extract physical axioms from the theorems it deduces from observations. The fact that even theories derived from observations of phenomena at the microscopic scale have failed to produce physical axioms (if they had, they would explain interactions at larger scales as well) suggests that there is a distinction between the microscopic scale, which is microscopic only relative to our own, and the fundamental scale, which may be many orders of magnitude smaller.

There is nothing that allows us to infer that the microscopic scale is the fundamental scale or that what we observe at the microscopic scale is fundamental. It may very well be that everything we hold as fundamental, the particles, the forces, etc., are not.

Also, theories founded on theorems related to different scales, rather than on axioms, cannot be unified. It follows that the grand unification of the reigning theories, which has been the dream of generations of physicists, is mathematically impossible. A theory of everything cannot result from the unification of the standard model and relativity, for instance, since they are based on mutually exclusive axiom sets. This is why it was so essential to rigorously derive quantum-geometry dynamics (QGD) from its initial axiom set and to avoid at all times the temptation of contriving it into agreement with other theories.

So even though, as we will see later, Newton’s law of universal gravity, the laws of motion, the universality of free fall and the relation between matter and energy have all been derived from QGD’s axiom set, deriving them was never the goal when the axiomatic set was chosen. These laws just followed naturally from QGD’s axiom set.

However, an axiomatic approach as we have described poses two important obstacles.

The first is choosing a set of axioms in which each axiom corresponds to a fundamental aspect of reality when fundamental reality is inaccessible and thus immeasurable.

The second obstacle is testing the predictions of an axiomatically derived theory when the scale of fundamental reality makes it immeasurable.

In the following chapters, we will see that even in the likely scenario that fundamental reality is unobservable, if the axioms of our chosen set correspond to fundamental aspects of reality, then there must be inevitable and observable consequences at larger scales which will allow us to derive unique, testable predictions. We will show that it is possible to choose a complete and consistent set of axioms, that is, one to which interactions at all scales of reality can be reduced. In other words, even if the fundamental scale of reality remains unobservable, an axiomatic theory would make precise predictions at scales that are.

Internal Consistency and Validity of a Theory

Any theory that is rigorously developed from a given consistent set of axioms will itself be internally consistent. That said, since any number of such axiom sets can be constructed, an equal number of internally consistent theories can be derived. To be a valid axiomatic physics theory, a theory must answer the following questions positively.

Do its axioms form an internally consistent set?

Is the theory rigorously derived from the axiom set?

Are all descriptions derived from the theory consistent with observations?

Can we derive explanations from the axiom set that are consistent with observations?

Can we derive from the axiom set unique and testable predictions?

And if an axiom set is consistent and complete, then:

Does the theory derived from the axiom set describe physical reality at all scales?

In the following chapters, we will see how quantum-geometry dynamics answers these questions.

Posted in Quantum-Geometry Dynamics | Comments Off on Do Gödel’s Incompleteness Theorems Exclude the Possibility of a Theory of Everything?

To be clear, quantum-geometry dynamics (QGD) does not solve the unsolved problems of current physics theories. Generations of the best minds in science have been working on the problems that arise from our best current theories, and it would be presumptuous to claim to have solved them; these problems are too big and complex for any one person to tackle. Now, you may ask: what does QGD have to do with these problems? The response is: it simply does not solve them.

The idea behind it was simple: to develop a theory from a minimal axiom set necessary to describe dynamic systems. QGD was never meant to take on the problems facing dominant theoretical physics. As I explored the consequences of QGD’s axiom set, I found equations that describe gravity, the electromagnetic effects, the laws of motion and the laws that govern optics, for example, and all were direct consequences of the chosen axiom set. Because it was based on a different axiom set, none of the theory-specific problems that arise in current theories came up in QGD.

QGD’s derived equation for gravity predicts that gravity is not fundamental but the effect of two fundamental forces. It also predicts that beyond a threshold distance, gravity becomes negative. Therefore, QGD’s description of gravitationally interacting systems does not require dark energy (see Effects Attributed to Dark Matter and Dark Energy). QGD also reproduces the predictions of our best theories of gravity (see here) and its equations are found to describe a number of physical phenomena.

QGD proposes that there exists only one fundamental material particle, which we call the preon. QGD predicts that all other particles and their antiparticles are made from preons and that the difference between a particle and its antiparticle is due to their dynamic structural properties. All particles being made from the same matter, the problem of matter/antimatter asymmetry does not arise.

QGD proposes that most preons are free, though they interact too weakly to be detected individually; their mass over large regions of space interacts gravitationally with bounded preons (visible matter) and produces the effects we attribute to dark matter. Also, what we call magnetic fields are predicted to be polarized preons. So dark matter, far from being an exotic form of matter, is really made of the most common and most commonly observed particles.

According to QGD, time is nothing more than a purely relational concept which allows us to compare events to periodic systems (clocks). Time does not correspond to a physical aspect of reality, and the universe being strictly causal, the problem of the arrow of time does not arise in QGD.

Preons are also the fundamental unit of mass: we find that the mass of a particle, of a structure, or of the matter contained in a region of space is simply the number of preons it contains. All equations describing the evolution of a system need only use this definition of mass. Thus, mass being an intrinsic property of matter, we find that no other mechanism is necessary to generate it.

QGD has only two physical constants: the fundamental momentum of the preon and the units of the two fundamental forces it predicts exist. All other constants in nature can be derived from them.

The unsolved problems of physics are theory-dependent; they are by-products of the theories themselves, and none of them emerges from QGD’s axioms. QGD does not resolve the unsolved problems of our current physics theories because it doesn’t need to.

The image on the left is that of a primitive dark matter detector. Really, it is. Okay, if you know anything about physics (especially if you happen to be a physicist), the reasonable assumption is that the author of the opening sentence is a crank, a crackpot or simply crazy. The claim that a magnetic compass is a dark matter detector should raise your red flags. If it didn’t, then I would suspect that you are either a crank yourself or that you don’t have a sufficient understanding of physics to distinguish between a crank and a scientist. If your red flags are up, high as they should be, then I invite you to read further.

Quantum-geometry dynamics is a theory derived from a minimal axiom set necessary to describe the evolution of dynamic systems. It proposes that all particles are made from one and only one type of fundamental particle, which we call the preon. Preons, which QGD predicts were the only particles that existed in the initial state of the universe, still permeate space (QGD assumes space to be discrete rather than continuous). Preons were distributed uniformly throughout space, and a fraction of them combined to form the lightest detectable particles, low-momentum photons (the cosmic microwave background radiation), then progressively larger particles and structures that eventually gave birth to the present universe. I have discussed the cosmology derived from QGD elsewhere, so I’ll focus here on magnetic fields.

QGD predicts that magnetic fields are made from polarized preons and that the effects we attribute to dark matter are due to gravitational interactions with the preons populating large regions of space. Objects which we call electrically charged interact directly with free preons, absorbing and emitting them, polarizing the preonic field and producing the effects of electromagnetic attraction and repulsion. The needle of a magnetic compass reacts to the polarized preonic field, that is, it moves due to polarized preons, which, as we have suggested, are the particles causing the dark matter effects.

If preons exist, then why haven’t we detected them? Well, we have. We do every day, with anything that creates, detects or uses magnetic fields. It is true that we haven’t detected individual preons, but that is because they never decay into other particles, annihilate into photons (they are the components of photons) or transmute into other particles. Preons are eternally stable, which would explain why experiments that hope to observe their decay haven’t been successful.

Also, preons, being the fundamental unit of mass, are orders of magnitude less massive than photons and, possessing the fundamental unit of momentum, itself orders of magnitude smaller than that of even the least energetic photon, cannot be individually detected. But we can measure the gravitational effect a large number of them can have. That is, we have observed the summed momentum polarized preons can impart as a magnetic field. This is why a magnetic compass is essentially a device that detects the polarization of the preonic field around the Earth; it interacts with what has come to be known as dark matter.

Thus, according to QGD, dark matter is not a mysterious and exotic type of matter. We see its effects and use it every day. Dark matter was hiding in plain sight. We just failed to recognize it.

Contrary to what I first thought, a multi-messenger event such as GW170817 does not confirm the predicted existence of gravitational waves; it merely supports it. In fact, the observed electromagnetic counterpart may very well refute it (I propose a simple experiment at the end of this post that would do just that).

As you know, QGD predicts that gravity is not fundamental but a composite effect of the only two fundamental forces it predicts exist. QGD’s description of gravity follows naturally from a minimal axiom set necessary to describe the evolution of dynamic systems (see New Equation for Gravity as Derived from QGD’s Axiom Set). It correctly describes observations and reproduces the predictions of special and general relativity (see Special and General Relativity Axiomatic Derivations), yet allows for new predictions that distinguish it from other theories. QGD excludes all possibility of gravitational-electromagnetic multi-messenger events, that is, the possibility of simultaneously detecting a gravitational signal and an electromagnetic signal from the same event. In the case of GW170817, if the estimate of the distance of the source is correct, QGD predicts that the electromagnetic counterparts of the binary star merger would arrive 130 million years after the gravitational signal. So GW170817, having an electromagnetic counterpart in the form of the gamma-ray burst GRB 170817A, must falsify QGD’s prediction, right?
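A back-of-the-envelope check of the 130-million-year figure, sketched in Python. The assumption (mine, implied by the text) is that under QGD a gravitational influence reaches us effectively instantaneously, so the gap between the two signals is simply the light travel time over the estimated ~130 million light years:

```python
# Rough check of the 130-million-year delay claimed above.
# Assumption (mine, implied by the text): the gravitational influence is
# effectively instantaneous, so the delay is the light travel time alone.

LIGHT_YEAR_M = 9.4607e15      # metres per light year
C = 2.99792458e8              # speed of light, m/s
SECONDS_PER_YEAR = 3.1557e7   # Julian year, in seconds

distance_ly = 130e6           # estimated distance to the GW170817 source

travel_time_s = distance_ly * LIGHT_YEAR_M / C
travel_time_yr = travel_time_s / SECONDS_PER_YEAR
print(f"light travel time: {travel_time_yr:.2e} years")  # ~1.3e8 years
```

As expected, a source 130 million light years away gives a light travel time of 130 million years, which is where the figure in the text comes from.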

That is what I thought, but then I realized that I hadn’t considered that though the GW170817 signal may be real, it may not be gravitational in nature. Most importantly, I realized that QGD offers an alternative explanation as to the nature of the signal that follows naturally from the theory without any modifications or addition whatsoever. QGD being rigorously derived from its axiom set forbids modifications or ad hoc explanations. In other words, it cannot be changed to fit contradictory observations.

Yes, the observed electromagnetic counterpart to GW170817 supports the existence of gravitational waves, but there is an important distinction between support for a prediction and its confirmation. Support leaves one or several significant questions unanswered, questions about the certainty of the nature of what was observed. Confirmation, on the other hand, leaves only minor questions and does not put the nature of the observed phenomenon in doubt.

The only thing that the electromagnetic counterpart confirms is that the GW170817 signal travelled at the speed of light. The assumption that it must be gravitational, to the exclusion of all other explanations, is the result of the dominant theoretical bias. However, as I explained in my earlier post, the nature of GW170817 may be electromagnetic rather than gravitational (see here for an explanation). I have also proposed a simple experiment that could falsify QGD’s prediction that GW170817 and all previous detections by LIGO are electromagnetic and caused by intense polarization and modulation of the preonic field (if you are not familiar with QGD, see here for an explanation). If QGD is correct, signals detected by the LIGO-VIRGO detectors would be exactly mirrored by fluctuations in the magnetic moment of a reference magnet. If that were the case, then the prediction of the existence of gravitational waves would be falsified.

Posted in Quantum-Geometry Dynamics | Comments Off on Multi-messenger Events such as GW170807 May Falsify Gravitational Waves

Say what? Dark matter? LIGO is designed to detect gravitational waves, right, not dark matter. Well not exactly.

First, LIGO detects a lot of signals which it considers noise because they interfere with the type of signal it attempts to detect. But we must remember that noise is made of signals generated by a number of different physical phenomena, the sources of which are often unknown. I have discussed a bit of what such noise can reveal (the data it contains). Today, however, I want to discuss the actual signals that were detected by LIGO and, most recently, by the LIGO-VIRGO collaboration.

The signals that were detected look like what theory expects gravitational waves to look like (although the validity of the signals is being disputed). As interpreted through general relativity, the signals can’t be anything other than gravitational waves. The fact is that the observations fit so well with theoretical predictions that very few people feel there is any need to even look for alternative explanations. Alternative explanations of the observations are not considered, even when such explanations are not only consistent with the observations but in some cases consistent with a wider spectrum of physical phenomena than general relativity is. I will examine here one such alternative explanation and derive a prediction that distinguishes it from general relativity.

Before we do so, we need to look at QGD’s explanation of dark matter.

Dark Matter

Quantum-geometry dynamics is derived from a minimal set of axioms necessary to describe dynamic systems. One of its axioms is that there is only one fundamental particle of matter, the preon, and everything else, including particles we consider elementary like photons, electrons and neutrinos, is composed of preons. In its initial state, the universe contained only free preons, which were distributed uniformly throughout space (itself composed of discrete units). Over the universe’s evolution, some preons combined to form photons and neutrinos (note that the isotropy of the cosmic microwave background radiation, CMBR for short, is more consistent with an initially isotropic state of the universe than with a singularity). Following the formation of the CMBR, progressively more massive particles formed, which led to the formation of more massive structures, eventually giving birth to stars, galaxies and large-scale structures. But most preons would still be free today and account for the effects we attribute to dark matter; dark matter being the gravitational effect of the mass of the preons contained in large regions of space.

Preons never decay or transmute into any other particle; because of that, and because their momentum is orders of magnitude smaller than that of even the least energetic photon, they have and will always escape any effort to detect them directly as particles. Preons travel at only one speed, which is fundamental and equal to c. Note that the constancy of the speed of light is not an axiom of QGD but rather a consequence of its axioms (see Why can’t anything move faster than the speed of light?).

Large Scale Effect

On large scales, the total mass of the preons contained in large regions of space is such that it exceeds the mass of visible matter. The effect of the gravitational interaction between dark matter and visible matter has been observed, which made possible the estimation of the amount of dark matter in the universe.

Small Scale Effect

When the preons of even a small region of space are polarized (their motion is made to go in the same general direction), and the density of polarized preons is large enough, they constitute a field whose momentum can be detected. Polarized preons can interact with and be absorbed by material structures, imparting their momentum to those structures (for a detailed explanation, see the sections on the laws of momentum in Introduction to Quantum-Geometry Dynamics). Essentially, according to QGD, polarized preons are the fundamental constituents of magnetic fields.

What does it have to do with the LIGO detections?

The GW170817 event has electromagnetic counterparts. One in particular was a gamma-ray burst detected about two seconds after the GW170817 signal. This tells us that the signal that caused the GW170817 detection travelled at the speed of light. QGD predicts that only three types of particles can move at the fundamental speed c: preons, photons and neutrinos. Since neutrinos were not detected, and since photons would not have affected the detectors, the only possibility consistent with QGD is that the signal, the wave, was composed of preons. Note that in the context of QGD, waves are distribution curves of discrete particles in discrete space. They are not continuous, as is assumed by most physics theories.
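For context, here is the standard arithmetic the two-second gap supports: if both signals left the source together and travelled roughly 130 million light years, the fractional difference between their speeds is bounded by the arrival gap divided by the total travel time. This is a generic calculation, not something derived from QGD:

```python
# Generic speed bound from the 2 s arrival gap over ~130 million light
# years: |dv|/c <= delta_t / travel_time.

SECONDS_PER_YEAR = 3.1557e7
travel_time_s = 130e6 * SECONDS_PER_YEAR  # light travel time for ~130 Mly
delta_t_s = 2.0                           # observed arrival gap in seconds

fractional_speed_diff = delta_t_s / travel_time_s
print(f"|dv|/c <= {fractional_speed_diff:.1e}")  # ~5e-16
```

The two signals therefore travelled at the same speed to within about one part in 10^15, which is why the text says the GW170817 signal must have moved at the speed of light.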

Also important to keep in mind is that according to QGD, there is no such thing as pure energy. The mass energy relation is a direct consequence of the axioms of QGD, but the relation is not one of equivalence but one of proportionality. This means that energy is an intrinsic property of matter, therefore it cannot exist in a pure form. For a quick explanation of the relation between mass and energy see Mass, Energy and Momentum or better yet An Axiomatic Approach to Physics.

Free preons can become polarized when they interact with an object which is itself polarized. Basically, a polarized object is one whose component particles move in the same direction. Polarized objects absorb and emit preons which intersect with them along the direction of rotation, through a mechanism described here.

Whether the polarized object is as small as an electron or as large as a neutron star, a black hole or even a supermassive black hole, the mechanism by which it interacts with free preons is governed by the same laws of momentum. What varies is the size and density of the polarized preonic field, which in turn determines how much momentum it carries and can impart at a distance.

If the shape of the object relative to its rotation plane is spherical, the amount of preons reflected or emitted is constant, hence undetectable. But binary systems form a non-spherical structure, which causes fluctuations in the polarization of the preonic field. The flow of preons is modulated by the orbital motion of the objects of the binary system, creating a wave of preons whose frequency corresponds to the time it takes to complete half an orbit, and whose amplitude is proportional to the speed of the objects and inversely proportional to the distance between them. The shape of what we could call a preonic wave would be exactly that of the predicted gravitational waves. Most importantly, the preonic wave would interact with the LIGO detectors, imparting its momentum to them and inducing a signal whose form would be indistinguishable from that of gravitational waves.
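The frequency relation described above, one wave cycle per half orbit, can be sketched as follows; the function name and the example period are my own, chosen for illustration:

```python
# One full cycle of the modulated wave per half orbit means the wave
# frequency is twice the orbital frequency (hypothetical helper, for
# illustration only).

def wave_frequency_hz(orbital_period_s: float) -> float:
    """Frequency of a wave completing one cycle per half orbit."""
    return 2.0 / orbital_period_s

# A compact binary orbiting once every 0.02 s would produce a 100 Hz wave.
print(wave_frequency_hz(0.02))  # 100.0
```

This doubling is the same relation that holds for the quadrupole signal of a binary in the standard gravitational-wave picture, which is why the two waveforms would share their frequency structure.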

How to distinguish gravitational waves from preonic waves?

If gravitational interferometers cannot distinguish between gravitational and preonic waves, how can we know which of the two LIGO-VIRGO detected?

A preonic wave is a periodic fluctuation of the polarized preonic field. If, as QGD predicts, magnetic fields are composed of polarized preons, then the momentum of a magnetic field is proportional to the preonic density, and a preonic wave will affect the magnetic moment of a reference magnet. A gravitational wave will not. So in order to distinguish between a gravitational wave and a preonic wave, all we need to do is measure fluctuations in the magnetic moment of a reference magnet. If such fluctuations are detected and correlated with a wave detected by the LIGO-VIRGO interferometers, then the wave would not be gravitational but preonic.
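The proposed test reduces to checking whether two time series, the interferometer strain and the magnet's moment readings, carry the same waveform. A minimal sketch of such a check, with synthetic stand-in data entirely of my own construction:

```python
# Sketch of the proposed comparison: if a reference magnet's moment and
# the interferometer strain carry the same waveform, their normalized
# correlation at zero lag should be high. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
shared_wave = np.sin(2 * np.pi * (30 + 40 * t) * t)   # stand-in chirp

strain = shared_wave + 0.3 * rng.normal(size=t.size)  # interferometer channel
magnet = shared_wave + 0.3 * rng.normal(size=t.size)  # magnetometer channel

def normalized_correlation(a, b):
    """Pearson-style correlation of two equal-length series at zero lag."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.dot(a, b) / a.size)

# A value near 1 means the two instruments recorded the same wave;
# a value near 0 means the magnet saw nothing correlated with the strain.
print(normalized_correlation(strain, magnet))
```

A real analysis would of course correlate over a range of lags and account for instrument response, but the decision criterion is the same: a significant shared waveform in the magnet channel would point to a preonic rather than gravitational origin.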

The LIGO detections and their consequences for QGD

QGD forbids the very existence of gravitational waves, so the detection of gravitational waves would falsify QGD. However, if the experiment suggested above is performed and fluctuations in the magnetic moment of a reference magnet are found, then, though the prediction of gravitational waves would not be falsified (only their detection would be), the result would provide support for QGD’s prediction of preonic waves.

The significance of the LIGO detections would in no way be diminished if the waves are found to be preonic. Quite the opposite, since it would help answer questions about the nature of dark matter, magnetic fields and the evolution of the universe. Most importantly, it would force us to question our best theories of gravity. The discovery might even deserve a second Nobel prize, for the detection of elusive dark matter.

Everyone who is familiar with quantum-geometry dynamics knows that, since it precludes the very existence of gravitational waves, it also prohibits simultaneous gravitational and electromagnetic signals from a single event. Obviously, if LIGO-VIRGO detected gravitational waves, then quantum-geometry dynamics would be falsified. That would be the end of what some people consider a promising theory. Nature is an implacable judge and its decisions can never be appealed. However, though QGD prohibits the existence of gravitational waves, it does not exclude the possibility that the LIGO-VIRGO observatories have detected something else. If that were the case, then GW170817 would indeed be a sort of multi-messenger event, just not one of the GW kind. So the question follows: what could the detection be that is both consistent with the GW170817 observations and with quantum-geometry dynamics?

Whatever the LIGO-VIRGO observatories detected, if it is linked to the gamma-ray burst detected two seconds later, must travel at the speed of light. Now, according to QGD, the only things that can travel at the speed of light are preons(+), photons and neutrinos. If the signals had been composed of photons or neutrinos, they would have been simultaneously detected by telescopes and neutrino detectors. Since they weren't, that leaves us with only one possibility: LIGO-VIRGO detected preons(+).

Preons(+) travel at the speed of light and cannot be detected by telescopes. But to impart sufficient momentum for the LIGO-VIRGO detectors to see them, we would need a massive number of polarized preons(+). Considering this, I remembered a prediction I wrote years ago: that rotating black holes and neutron stars would polarize the preonic field around them. We're talking about massive amounts of preons(+). For single black holes or neutron stars, the polarization would be uniform (thus undetectable), but for binary systems, due to their orbital motions, the polarization of the preonic field would be modulated, creating what we could call waves of preons(+), or preonic waves. Such preonic waves, modulated by the inward spiralling of merging massive structures such as black holes and neutron stars, would look like gravitational waves to the LIGO-VIRGO detectors. The question is then: how can we distinguish between preonic waves and gravitational waves?

QGD provides a simple answer that follows naturally from its axioms. We have seen that according to QGD, magnetic fields are made of the polarized preonic field. Since the momentum of magnetic fields is proportional to the preonic density, preonic waves would cause fluctuations in a reference magnetic field. So if the signals detected by LIGO-VIRGO are polarized preons(+), then fluctuations in the momentum of a reference magnetic field should mirror exactly the signals detected by the LIGO-VIRGO observatories. Such fluctuations in the momentum of a reference magnetic field are a prediction specific to QGD.

Note: preonic waves are distributions of preons(+), similar to electromagnetic waves, which are distributions of photons.


On February 12, 2016, LIGO made the extraordinary announcement that they had detected gravitational waves for the first time. The day following the announcement, I posted an article predicting that the announcement was premature and that the signal was probably due to noise.

Then on June 13, 2017, just a few days ago, a group of researchers from the Niels Bohr Institute in Copenhagen (see Forbes article here) published a new study. After analysing the data from LIGO, they found correlations in the noise detected by the two LIGO detectors. This casts serious doubt on the LIGO discovery and supports my prediction.

That QGD excludes the existence of gravitational waves does not diminish the importance of the LIGO-VIRGO observatories. They may not detect gravitational waves, but they could detect variations in the gravitational interactions between the Earth and all massive structures in the Universe, which cause a gravitational tidal effect. In fact, if QGD is correct, the noise is the continuous fluctuation in the sum of the gravitational interactions between the Earth and the rest of the universe, with the peak fluctuations due to events involving the most distant massive structures. So what is discarded as noise (aside from the systematic and local sources) is more revealing than any individual gravitational signal.

A New Prediction

The VIRGO observatory will soon join the two LIGO detectors, and it is expected that together they will answer the question of whether or not gravitational waves have been detected. I do believe they will, but the answer may not be the one the researchers expect. If the signals are due to stellar gravitational tidal forces, then QGD predicts correlations between the noise of all three detectors similar to those found by the Copenhagen group.

Newton's theory of universal gravity also fails to describe the orbital decay of binary systems such as the Hulse-Taylor binary, whose observed decay was consistent with general relativity. Favoring general relativity as the theory that correctly describes gravity is a clear-cut decision considering its successes. General relativity succeeded where Newton's theory of gravity had failed. But is the matter really settled? Let's take a closer look at how Newton's theory of gravity has been applied to the observations cited above.

In order to describe the evolution of two gravitationally interacting bodies of masses $m_1$ and $m_2$, the magnitude of the gravitational force is calculated using Newton's equation for gravity, $F = G m_1 m_2 / r^2$, where $r$ is the distance between the bodies, and then substituted into the equation for Newton's second law of motion, the familiar $F = ma$, where $a$ is the acceleration of the body. This is as straightforward a calculation as can be, but therein lies the problem.

Gravity, according to Newton's law, is instantaneous, and it follows that its action must be instantaneous as well. So applying the second law of motion (which is time dependent) to describe the effect of Newtonian gravity introduces a lag in the action that is incompatible with instantaneous gravity. This lag in the action of gravity, introduced by using the second law of motion, is precisely what caused the predictive errors in the Newtonian description of the precession of the perihelion of Mercury, of the bending of starlight, and of the orbital decay of binary systems. In fact, once the time dependency, and consequently the time lag, is eliminated from the gravitational action, we find that Newtonian gravity is in perfect agreement with observations (see Special and General Relativity Axiomatic Derivations).
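The standard time-dependent application described above, computing the force from Newton's equation and feeding it into the second law at each time step, can be sketched as a toy two-body integration (all names and values here are illustrative, not from QGD):

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def newtonian_acceleration(pos, other_pos, other_mass):
    """a = F/m with F = G*m1*m2/r^2; the accelerated body's mass cancels."""
    d = other_pos - pos
    r = np.linalg.norm(d)
    return G * other_mass / r**2 * (d / r)

# Toy circular orbit around an Earth-mass body (illustrative values).
M = 5.972e24                                # kg
r0 = 7.0e6                                  # m
pos = np.array([r0, 0.0])
vel = np.array([0.0, np.sqrt(G * M / r0)])  # circular orbital speed
dt = 1.0                                    # s, the time step of the update
for _ in range(1000):
    vel += newtonian_acceleration(pos, np.zeros(2), M) * dt
    pos += vel * dt
print(np.linalg.norm(pos) / r0)             # stays near 1 for a circular orbit
```

Each pass through the loop applies the force computed one time step earlier, which is exactly the kind of lag the passage above attributes to combining instantaneous gravity with a time-dependent law of motion.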

The fact that Newtonian gravity (when correctly applied) and general relativity can predict, with equal precision, the behaviour of gravitationally interacting bodies for the above phenomena is problematic. It forces us to find other ways to answer the question of whether gravity is a force that acts instantaneously between bodies or the effect of the curvature of space due to the presence of matter. Clearly, the two explanations of the nature of gravity are foundationally incompatible.

It follows from QGD's equation for gravity that gravity becomes repulsive when bodies are separated by a sufficiently great distance. That is, there is a threshold distance (estimated from observations) beyond which gravity becomes repulsive and increases proportionally to the square of the distance. The effect of repulsive gravity as described by QGD is consistent with the observed expansion of the universe, which is currently attributed to dark energy. This allows for new predictions that are distinct from those of general relativity.
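QGD's actual equation for gravity is not reproduced in this excerpt, so the following is only a toy signed-magnitude model encoding the qualitative behaviour just described: attraction below a threshold distance, repulsion growing as the square of the distance beyond it (the 10 Mpc threshold value is the one quoted later in the text):

```python
# Toy model only: not QGD's equation, just the stated qualitative behaviour.
D_THRESHOLD_MPC = 10.0  # threshold distance quoted in the text, in Mpc

def toy_gravity(d_mpc):
    """Relative magnitude only. Positive: attractive; negative: repulsive.
    Repulsion grows as the square of the distance past the threshold."""
    if d_mpc < D_THRESHOLD_MPC:
        return 1.0
    return -((d_mpc / D_THRESHOLD_MPC) ** 2)
```

Under this sketch, a source at twice the distance of another (both past the threshold) repels four times as strongly, matching the GW150914/GW170104 comparison made below.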

If QGD is correct, the magnitude of the gravitational repulsion between the Earth and the black holes that caused the GW150914 event must be greater than the magnitude of the attractive gravitational force in close proximity to the binary system that caused the event. Such a gravitational effect is astronomically greater than the signal detected by LIGO in 2015. In fact, the repulsive force would be enough to tear our galaxy apart through the gravitational tidal force and accelerate it to speeds approaching the speed of light. And the repulsive force between the Earth and the source of the recently observed GW170104 event, presumed to be at twice the distance, would be four times as great. The reason our galaxy (and others) is not torn apart is that the distribution of matter in the universe is nearly homogeneous, so that the repulsive gravitational forces from distant massive systems acting on each individual particle that composes our galaxy are nearly cancelled out by the repulsions from systems in the opposite directions, resulting in a weak net gravitational effect. So, if the GW150914 and GW170104 events are gravitational, the detected signals would be tidal effects of the net gravitational forces acting on the detectors. That is, the signals are not gravitational waves but measurements of the instantaneous gravitational tidal effect on the detector of all the massive structures forming the universe. So LIGO may be thought of as measuring the fluctuations of the gravitational tidal effect of the universe on its instruments.
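The near-cancellation argument can be illustrated numerically: for a roughly isotropic distribution of sources, the vector sum of many equal pulls is far smaller than their scalar sum. A minimal sketch with hypothetical equal-strength sources:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical illustration: N sources spread isotropically over the sky,
# each acting on the detector with equal strength.
N = 100_000
v = rng.normal(size=(N, 3))
directions = v / np.linalg.norm(v, axis=1, keepdims=True)  # unit vectors
magnitudes = np.ones(N)

net = np.linalg.norm((directions * magnitudes[:, None]).sum(axis=0))
total = magnitudes.sum()
print(net / total)  # << 1: near-isotropic contributions largely cancel
```

The ratio scales roughly as 1/sqrt(N), which is why a nearly homogeneous universe would leave only a weak net effect, with fluctuations in that residual being what the detectors would register.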

Some Distinctive Predictions of QGD that Are Now Being Tested (or will be in the near Future)

If gravity is instantaneous as predicted by QGD and Newton’s law of universal gravity, then

we will never detect multi-messenger signals from events predicted to simultaneously generate gravitational and electromagnetic signals. The electromagnetic signal from, for example, the merging of neutron stars would arrive up to billions of years after the gravitational signal.

The gravitational signal from the merging of massive objects at distances close to the threshold distance would be undetectable.

There will be no loss of mass from the merging massive objects in the form of gravitational waves (in fact, there is no mechanism that could account for the conversion of mass into gravitational waves). The mass of the object resulting from the merger will be equal to the sum of the masses of the merged objects.

The angular radius of the shadow of Sagittarius A* should be 10 times larger than predicted by general relativity.

Last year was all about Advanced LIGO's announcement that it had, for the first time, detected the gravitational waves predicted to exist a hundred years earlier. Understandably, the press coverage was proportional to the importance of the discovery. The press conference, broadcast around the world, received, to my knowledge, some of the widest coverage ever given to a scientific discovery.

In the field of astrophysics, the only comparable event was probably the detection of primordial gravitational waves by the BICEP2 experiment announced with great fanfare in 2014.

Immediately after the BICEP2 announcement, I predicted that the results would be refuted by further observations. It was not mere skepticism or a random opinion, but a direct consequence of quantum-geometry dynamics. The level of confidence in the BICEP2 discovery was so high that very few doubted the validity of the results. I was one of the few people who immediately predicted that the results would not hold, and as we all know, the BICEP2 discovery was refuted later that year.

I made a similar prediction for the LIGO detections in the days prior to and following the announcement in February 2016. Since the announcement, the sensitivity of LIGO has been increased, and the second run of observations started in November 2016. Tomorrow, the results of the second run of observations will be released, but this time there is no press coverage except from two minor local news sources. The release is not even mentioned on the Facebook page of the LIGO collaboration. Why is the release so hush-hush? One would think, after last year's announcement of the detection of gravitational waves (and the unrelenting news coverage since then), that any news from LIGO would be treated as a highest priority by the media, had the LIGO collaboration made the slightest effort to publicize it. But the lack of any attempt to draw attention to the results is probably, as I predicted, because the earlier detections have not been corroborated by new detections.

Good science requires that, before being considered a discovery, the results of any observation or experiment must be reproducible. Considering its higher sensitivity, the duration of the second run and the theoretical probability of more detections, Advanced LIGO should have made more detections in its second run than it had in its first. Because of that, null results are even more significant than the detection announced last year, as they cast doubt on the validity of the discovery.

My prediction is that no new detections of black hole mergers will be announced tomorrow. But not to worry: that only provides new constraints on the frequency of events capable of producing detectable gravitational waves, right?

[UPDATE] It seems that they are announcing the detection of one black hole merger (see article here).

From the article:

“Normally, an event like this would trigger an alert to the astronomy community, which could then attempt observations in the area of the sky where the event took place. But, in this case, a recent period of maintenance had left one of the two detectors set in a calibration mode.”

That is disappointing, since simultaneous independent detections of the non-gravitational signals would have tested the predicted speed of propagation of gravitational waves and would have put to rest the question of whether, as QGD predicts, gravity is instantaneous and the signals detected by LIGO are due to the tidal effect of gravity.

If QGD's equation for gravity is correct, gravity becomes repulsive at distances greater than 10 Mpc, and the magnitude of the repulsion increases as a function of distance (this would account for the expansion of the universe we attribute to dark energy). That means that the greater the distance, the greater the tidal effect of gravity.

QGD predicts that black holes are extremely dense but not infinitely so. Considering that preons(+) are strictly kinetic and that no two preons(+) can simultaneously occupy any given fundamental unit of space, there is a maximum density of matter, and hence a minimum radius for a structure of a given mass.

For the radius of the black hole predicted to be at the center of our galaxy, the mass and radius are expressed in QGD's fundamental units. Though converting this into conventional units requires observations to determine the values of the QGD constants, using the relation between QGD and Newtonian gravity we can also predict the radius within which light cannot escape a massive structure, and compare it with the Schwarzschild radius $r_s = 2Gm/c^2$ for a black hole of mass $m$, where $G$ is the gravitational constant.

Using this radius to calculate the angular radius of the shadow of Sagittarius A*, the black hole at the center of our galaxy, we get a minimum value that is about 10 times the angular radius calculated using the Schwarzschild radius. This prediction will be tested in the near future by upcoming observations from the Event Horizon Telescope.
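For reference, the general-relativistic side of this comparison can be checked in a few lines: the small-angle angular radius corresponding to the Schwarzschild radius of Sagittarius A* as seen from Earth. The mass and distance are assumed round values, not taken from the original text:

```python
import math

# Known physical constants; Sgr A* mass and distance are assumed values.
G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
M_SUN = 1.989e30       # kg
PARSEC = 3.086e16      # m
ARCSEC_PER_RAD = math.degrees(1.0) * 3600.0  # ~206265 arcsec per radian

m_sgr = 4.15e6 * M_SUN     # assumed mass of Sagittarius A*
d_sgr = 8.18e3 * PARSEC    # assumed distance to Sagittarius A*

r_s = 2.0 * G * m_sgr / c**2              # Schwarzschild radius
theta_s = (r_s / d_sgr) * ARCSEC_PER_RAD  # small-angle angular radius, arcsec
print(theta_s)  # ~1e-5 arcsec, i.e. about 10 microarcseconds
```

A radius 10 times larger, as the QGD prediction would have it, would subtend an angular radius 10 times this value, comfortably within the resolving power claimed for the Event Horizon Telescope.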