On the scale of galaxies, gravity appears stronger than we can account for using only the particles that are able to emit light. So we add dark matter: particles that have never been directly detected, making up some 25% of the mass-energy of the Universe.

On the larger scales on which the Universe is expanding, gravity appears weaker than expected in a universe containing only particles – whether ordinary or dark matter. So we add “dark energy”: a weak anti-gravity force that acts independently of matter.

Brief history of “dark energy”

The idea of dark energy is as old as general relativity itself. Albert Einstein included it when he first applied relativity to cosmology exactly 100 years ago.

Einstein mistakenly tried to exactly balance the self-attraction of matter with anti-gravity on the largest scales. He could not imagine that the Universe had a beginning, and did not want it to change in time.

Almost nothing was known about the Universe in 1917. The very idea that galaxies were objects at vast distances was debated.

In general relativity, space naturally wants to expand or contract, bending together with the matter in it. It never stands still.

This was realised by Alexander Friedmann, who in 1922 kept the same ingredients as Einstein but did not try to balance the amounts of matter and dark energy. The result was a model of universes that could expand or contract.

Further, the expansion would always slow down if only matter were present. But it could speed up if anti-gravitating dark energy were included.
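Friedmann's conclusion can be read off from his equations, written here in modern notation (the article does not spell them out): $a$ is the cosmic scale factor, $\rho$ the matter density, $k$ the spatial curvature, and $\Lambda$ the dark-energy (cosmological constant) term.

```latex
% Friedmann equation (expansion rate):
\left(\frac{\dot a}{a}\right)^{2}
  = \frac{8\pi G}{3}\,\rho \;-\; \frac{kc^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3}

% Acceleration equation (pressureless matter):
\frac{\ddot a}{a}
  = -\,\frac{4\pi G}{3}\,\rho \;+\; \frac{\Lambda c^{2}}{3}
```

With $\Lambda = 0$ the right-hand side of the acceleration equation is negative, so matter alone always decelerates the expansion; a sufficiently large $\Lambda$ makes $\ddot a > 0$ and the expansion speeds up.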

Since the late 1990s many independent observations have seemed to demand such accelerating expansion, in a Universe with 70% dark energy. But this conclusion is based on the old model of expansion that has not changed since the 1920s.
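As a quick sanity check on that 70% figure, one can evaluate the sign of $\ddot a/a$ from the Friedmann acceleration equation for pressureless matter plus a cosmological constant. This is an illustrative sketch (the function name and the unit choice $H_0 = 1$ are mine, not from the article):

```python
def acceleration_term(a, omega_m, omega_lambda, H0=1.0):
    """Sign of a''/a from the Friedmann acceleration equation,
    for pressureless matter plus a cosmological constant,
    with densities expressed as fractions of the critical density today."""
    return H0**2 * (-0.5 * omega_m / a**3 + omega_lambda)

# Matter-only universe: expansion always decelerates.
print(acceleration_term(1.0, omega_m=1.0, omega_lambda=0.0) < 0)   # True

# 30% matter, 70% dark energy (the standard budget): accelerates today.
print(acceleration_term(1.0, omega_m=0.3, omega_lambda=0.7) > 0)   # True
```

Evaluating the same expression at earlier times (smaller $a$) shows the matter term dominating, so even a dark-energy universe decelerates early on and only starts accelerating relatively recently.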

Standard cosmological model

Einstein’s equations are fiendishly difficult. And not simply because there are more of them than in Isaac Newton’s theory of gravity.

Unfortunately, Einstein left some basic questions unanswered: On what scales does matter tell space how to curve? What is the largest object that moves as an individual particle in response? And what is the correct picture on other scales?

These issues are conveniently avoided by the 100-year-old approximation — introduced by Einstein and Friedmann — that, on average, the Universe expands uniformly. Just as if all cosmic structures could be put through a blender to make a featureless soup.

This homogenising approximation was justified early in cosmic history. We know from the cosmic microwave background — the relic radiation of the Big Bang — that variations in matter density were tiny when the Universe was less than a million years old.

But the Universe is not homogeneous today. Gravitational instability led to the growth of stars, galaxies, clusters of galaxies, and eventually a vast “cosmic web”, dominated in volume by voids surrounded by sheets of galaxies and threaded by wispy filaments.

In standard cosmology, we assume a background expanding as if there were no cosmic structures, then run computer simulations using only Newton’s 330-year-old theory of gravity. This produces structure resembling the observed cosmic web in a reasonably compelling fashion, but only if dark energy and dark matter are included as ingredients.
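At their core, such Newtonian simulations integrate nothing more exotic than the inverse-square law for many particles. A toy sketch of the basic machinery (leapfrog integration with gravitational softening; all parameters here are illustrative, and real cosmological codes add the expanding background and far more particles):

```python
import numpy as np

G = 1.0          # illustrative units
SOFTENING = 0.05  # avoids singular forces at zero separation

def accelerations(pos, mass):
    """Pairwise Newtonian accelerations on each particle."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i != j:
                d = pos[j] - pos[i]
                r2 = d @ d + SOFTENING**2
                acc[i] += G * mass[j] * d / r2**1.5
    return acc

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc   # half kick
        pos += dt * vel         # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc   # half kick
    return pos, vel

# Two equal masses released from rest fall toward each other:
# the seed of gravitational instability.
pos = np.array([[-1.0, 0.0], [1.0, 0.0]])
vel = np.zeros_like(pos)
mass = np.array([1.0, 1.0])
pos, vel = leapfrog(pos, vel, mass, dt=0.01, steps=100)
print(abs(pos[0, 0]))  # separation has shrunk from the initial value
```

Scaled up to billions of particles, this same Newtonian collapse is what grows the simulated cosmic web out of nearly uniform initial conditions.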

Matter and curvature distributions start out nearly uniform when the Universe is young. But as the cosmic web emerges and becomes more complex, the variations in small-scale curvature grow large, and the average expansion can differ from that of standard cosmology.
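One widely used way of quantifying this difference — not spelled out in the article, so take this as a sketch of the standard averaging formalism for pressureless matter — is Buchert's scheme, where $a_{\mathcal D}$ is the effective scale factor of an averaging domain $\mathcal D$ and angle brackets denote spatial averages over it:

```latex
% Averaged acceleration equation over a domain D:
3\,\frac{\ddot a_{\mathcal D}}{a_{\mathcal D}}
  = -\,4\pi G\,\langle \rho \rangle_{\mathcal D} + \mathcal{Q}_{\mathcal D}

% Kinematical backreaction: the variance of the local expansion
% rate \theta minus a shear term; it vanishes for exactly
% uniform expansion.
\mathcal{Q}_{\mathcal D}
  = \frac{2}{3}\Bigl( \langle \theta^{2} \rangle_{\mathcal D}
      - \langle \theta \rangle_{\mathcal D}^{2} \Bigr)
  - 2\,\langle \sigma^{2} \rangle_{\mathcal D}
```

In a perfectly homogeneous universe $\mathcal{Q}_{\mathcal D} = 0$ and Friedmann's equations are recovered; if the variance in expansion rates between voids and walls grows large enough, the average expansion can mimic acceleration without any dark-energy term.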

Recent numerical results from a team in Budapest and Hawaii that claim to dispense with dark energy used standard Newtonian simulations, but evolved them forward in time by a non-standard method designed to model this backreaction effect.

In the next decade, experiments such as the Euclid satellite and the CODEX spectrograph will have the power to test whether cosmic expansion follows the homogeneous law of Friedmann or an alternative backreaction model.

An artist’s impression of the European Extremely Large Telescope (E-ELT), which will host CODEX, a very stable, high-spectral-resolution optical instrument. ESO/L. Calçada, CC BY-SA

To avoid stagnation and nurture a vibrant scientific culture, a research frontier should always maintain at least two ways of interpreting data, so that new experiments aim to select the correct one. A healthy dialogue between different points of view should be fostered through conferences that discuss conceptual issues, not just experimental results and phenomenology, as is too often the case at present.

What can general relativity teach us?

While most researchers accept that backreaction effects exist, the real debate is over whether they can produce more than a 1% or 2% difference in the mass-energy budget of standard cosmology.

Any backreaction solution that eliminates dark energy must explain why the law of average expansion appears so uniform despite the inhomogeneity of the cosmic web, something standard cosmology assumes without explanation.

Since Einstein’s equations can in principle make space expand in extremely complicated ways, some simplifying principle is required for their large-scale average. This is the approach of the timescape cosmology.

Any simplifying principle for cosmological averages is likely to have its origins in the very early Universe, which was much simpler than the Universe today. For the past 38 years, inflationary universe models have been invoked to explain that simplicity.

Many physicists still view the Universe as a fixed continuum that comes into existence independently of the matter fields that live in it. But, in the spirit of relativity – that space and time only have meaning when they are relational – we may need to rethink basic ideas.

Since time itself is only measured by particles with a non-zero rest mass, maybe spacetime as we know it only emerges as the first massive particles condense.

Whatever the final theory, it will likely embody the key innovation of general relativity, namely the dynamical coupling of matter and geometry, at the quantum level.