One of the key problems of modern theoretical physics is the quantization of gravity.

Or, as I want to explain here, it is the quantization of general relativity which is problematic, not the quantization of gravity as such. A particularly simple example of a quantum theory of gravity would be the quantization of Newtonian gravity. This is trivial: one can simply follow the standard methods for quantizing a multi-particle theory with a Newtonian interaction potential. Unfortunately, Newtonian gravity fails empirically and has to be replaced by a relativistic theory of gravity. And the quantization of GR is, unfortunately, highly problematic.

It appears that most of these problems are a consequence of the background independence of GR. GLET reverses this move and reintroduces a Newtonian background of absolute space and time into the theory, so that problems caused by background independence do not appear in GLET.

Topological foam: GR allows, in principle, arbitrary topologies. In classical GR, this does not cause serious problems, because a change of topology is almost impossible in agreement with the classical equations of motion and reasonable assumptions about matter. But these restrictions are no longer valid in quantum gravity. In quantum theory, one has to consider all possible configurations, and minor violations of the classical equations have to be allowed. If a non-trivial topology is sufficiently localized, the associated violation is small, and thus allowed. So at very small distances one would obtain small distortions from trivial topology, something named "topological foam". How such a topological foam could be handled mathematically is completely unknown. In GLET, the background fixes the topology, so no such topological foam appears.

Uncertainty of causality: Once the gravitational field is quantized, the metric is quantized and, as a consequence, becomes uncertain. But Einstein causality is defined by the metric. With an uncertain metric, we obtain uncertain Einstein causality. Causality is already problematic in GR solutions with closed causal loops, but such solutions seem to be rare exceptions of no practical importance. The uncertainty of causality would, instead, be a general property. In GLET, by contrast, causality is defined by the background and connected with absolute time, and thus remains fixed.

No local energy conservation law: GR does not have a simple local energy and momentum conservation law which could be integrated to define a global energy. There is a replacement - the so-called energy-momentum pseudotensors - but they are not covariant and therefore cannot be nicely integrated into the philosophy of GR: to take them seriously, as physical entities, one would have to break diffeomorphism symmetry and give up background independence. In GLET, by contrast, the background has translational invariance and thus gives, via the Noether theorem, standard local energy and momentum conservation laws.
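Schematically (a standard field-theoretic sketch, with a generic field \(\phi\) standing in for the concrete fields of GLET), translational invariance of a Lagrangian density \(\mathcal{L}(\phi,\partial_\mu\phi)\) on the fixed background gives, via the Noether theorem, the canonical energy-momentum tensor

\[ T^\mu{}_\nu = \frac{\partial\mathcal{L}}{\partial(\partial_\mu\phi)}\,\partial_\nu\phi - \delta^\mu_\nu\,\mathcal{L}, \qquad \partial_\mu T^\mu{}_\nu = 0, \]

so that \(E = \int T^0{}_0\, d^3x\) and \(P_i = \int T^0{}_i\, d^3x\) are conserved global quantities.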

Degenerate Hamiltonian formalism: Closely related to the non-existence of a local energy conservation law is the problem of defining a Hamiltonian formalism. There is a replacement - the ADM formalism - but it is a quite weak one, which requires the consideration of constraints: the Hamiltonian constraint and the momentum constraints. The reason these problems appear is the diffeomorphism symmetry, which assigns the same energy to a lot of metrics which differ in their coordinate expressions but are physically equivalent. This additional diffeomorphism symmetry does not exist in GLET, so the Hamiltonian formalism of GLET does not lead to such complications: the standard construction works and gives a standard Hamiltonian formalism.
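For comparison, in the ADM formalism (standard notation, with lapse \(N\) and shift \(N^i\)) the GR Hamiltonian is a pure sum of constraints,

\[ H = \int d^3x \left( N\,\mathcal{H} + N^i\,\mathcal{H}_i \right), \qquad \mathcal{H}\approx 0, \quad \mathcal{H}_i\approx 0, \]

so the Hamiltonian vanishes on solutions - exactly the degeneracy which, by the argument above, is absent in GLET.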

The problem of time: Technically, it is a consequence of the problems created by the Hamiltonian constraint for the quantization of gravity. But the relation between the notions of time in GR and in quantum theory is problematic anyway, for conceptual reasons too. GLET, by contrast, has a classical absolute time, which is identical to the notion of time used in quantum theory, and there is no Hamiltonian constraint in its Lagrange formalism; thus, there is also no problem of time.

So we see that a large number of the problems of GR quantization are simply a consequence of its background independence - and, since GLET is not background independent, all these problems do not even appear in GLET as problems to be solved. They simply do not exist there.

There are also other problems. A minor technical problem is related to the quantization of the conservation laws which have been introduced via the harmonic coordinates: the conservation laws are first-order equations, and such equations may cause problems in some quantization approaches. This is not the case in condensed matter theory, where it is better for quantization to switch from the Eulerian, local specification of the flow field to the Lagrangian, material one. In this specification, the state of the ether is described by the position \(x(x_0)\) of a point of the ether characterized by its position \(x_0\) in some initial reference state. This description is also the appropriate one for an atomic description of the ether. In such a description, the atoms of the ether would be identified by their position \(x_0(n)\) in an undeformed, regular lattice, with their actual position described by \(x(n)=x(x_0(n))\). The density in the large-distance limit would simply be the number of nodes in some volume, and the continuity equation would become an automatic consequence, which holds exactly, independent of any quantum fluctuations.
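In the Lagrangian specification sketched above, the relation between deformation and density can be made explicit (with \(\rho_0\) an illustrative reference density, not a symbol used elsewhere in the text): the density is

\[ \rho(x,t) = \rho_0 \left|\det\frac{\partial x}{\partial x_0}\right|^{-1}, \]

and the continuity equation \(\partial_t\rho + \partial_i(\rho v^i) = 0\) then holds identically, whatever the trajectories \(x(x_0,t)\) are - which is why the first-order conservation law ceases to be an independent equation in this description.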

The much more famous problem is, instead, the non-renormalizability of GR. Here GLET changes nothing: it is non-renormalizable too. Fortunately, the modern understanding of renormalization, following Wilson, shows that a non-renormalizable theory can nonetheless be understood as an effective field theory - a theory which is a good approximation only at large distances, but becomes invalid below some critical distance. An example would be a lattice regularization of the theory, with the lattice spacing as the critical distance.
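In the Wilsonian picture (a generic sketch, with \(\Lambda\) a cutoff scale corresponding to the inverse critical distance and \(c_n\) unknown coefficients), the effective Lagrangian is organized as an expansion

\[ \mathcal{L}_{\rm eff} = \mathcal{L}_0 + \sum_{n\geq 1} \frac{c_n}{\Lambda^{n}}\,\mathcal{O}_{4+n}, \]

where the \(\mathcal{O}_{4+n}\) are operators of mass dimension \(4+n\); their effects on large-distance observables are suppressed by powers of \(E/\Lambda\), which is why non-renormalizable terms are harmless in the effective-theory regime.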

But such a lattice discretization is essentially the same as what one would expect from a simple atomic model of the ether. And an atomic ether is what one would expect anyway from the future development of a continuous ether theory. So, while non-renormalizability is a serious problem if one thinks the theory remains valid at arbitrarily small distances, it is not a problem at all for a theory which, from the start, presupposes that it has to be replaced, below some critical "atomic" distance, by a different, atomic theory.

To summarize: the two things introduced by ether theory, namely the classical fixed background and the atomic hypothesis for the ether, lead to the disappearance of all the serious problems of GR quantization. Most of them disappear simply because they are caused by the non-existence of a fixed background in GR. And non-renormalizability is also unproblematic for a theory which, because of the atomic hypothesis, is never supposed to be more than an effective large-distance approximation.

Essentially, the quantization of an ether theory is trivial, because we already know how to quantize condensed matter theories. So, once the gravitational field is presented in the form of a classical condensed matter theory, all we have to do is follow the prescriptions for the quantization of condensed matter theory.