Is the universe homogeneous and isotropic at large enough scales, as claimed by the cosmological principle and assumed by all models that use the Friedmann–Lemaître–Robertson–Walker metric, including the current version of the ΛCDM model, or is the universe inhomogeneous or anisotropic?
An inhomogeneous cosmology is a physical cosmological theory (an astronomical model of the physical universe's origin and evolution) which, unlike the currently widely accepted cosmological concordance model, assumes that inhomogeneities in the distribution of matter across the universe affect local gravitational forces (i.e., at the galactic level) enough to skew our view of the universe. When the universe began, matter was distributed homogeneously, but over billions of years, galaxies, clusters of galaxies, and superclusters have coalesced and must, according to Einstein's theory of general relativity, warp the space-time around them. While the concordance model acknowledges this fact, it assumes that such inhomogeneities are not sufficient to affect large-scale averages of gravity in our observations. When two separate studies claimed in 1998–1999 that high-redshift supernovae were further away than our calculations showed they should be, it was suggested that the expansion of the universe is accelerating, and dark energy, a repulsive energy inherent in space, was proposed to explain the acceleration. Dark energy has since become widely accepted, but it remains unexplained. Accordingly, some scientists continue to work on models that might not require dark energy. Inhomogeneous cosmology falls into this class.
Inhomogeneous cosmologies assume that the backreactions of denser structures, as well as those of very empty voids, on space-time are significant enough that when not taken into account, they distort our understanding of time and our observations of distant objects. Following Thomas Buchert's publication of equations in 1997 and 2000 that derive from general relativity but also allow for the inclusion of local gravitational variations, a number of cosmological models were proposed under which the acceleration of the universe is in fact a misinterpretation of our astronomical observations and in which dark energy is unnecessary to explain them. For example, in 2007, David Wiltshire proposed a model (timescape cosmology) in which backreactions have caused time to run more slowly or, in voids, more quickly, thus giving the supernovae observed in 1998 the illusion of being further away than they were. Timescape cosmology may also imply that the expansion of the universe is in fact slowing.
Main article: Lambda-CDM model
The conflict between the two cosmologies derives from the inflexibility of Einstein's theory of general relativity, which describes how gravity arises from the interaction of matter, space, and time. Physicist John Wheeler famously summed up the theory's essence as "Matter tells space how to curve; space tells matter how to move." However, in order to build a workable cosmological model, all of the terms on both sides of Einstein's equations must be balanced: on one side, matter (i.e., all the things that warp time and space); on the other, the curvature of the universe and the speed at which space-time is expanding. In short, a model requires a particular amount of matter in order to produce particular curvatures and expansion rates.
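In their most compact textbook form (with the cosmological constant Λ included), the field equations make this balance explicit, with geometry on the left-hand side and matter and energy on the right:

G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu},

where G_{\mu\nu} is the Einstein curvature tensor, g_{\mu\nu} the metric, T_{\mu\nu} the stress-energy tensor of matter and radiation, and G and c are Newton's constant and the speed of light.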
In terms of matter, all modern cosmologies are founded on the cosmological principle, which states that whichever direction we look from Earth, the universe is basically the same: homogeneous and isotropic (the same at every location and in every direction). This principle grew out of Copernicus's assertion that there were no special observers in the universe and nothing special about the Earth's location in the universe (i.e., Earth was not the center of the universe, as previously thought). Since the publication of general relativity in 1915, this homogeneity and isotropy have greatly simplified the process of devising cosmological models.
In terms of the curvature of space-time and the shape of the universe, it can theoretically be closed (positive curvature, with space curving back on itself like the three-dimensional analogue of a sphere's surface), open (negative curvature, curving away from itself like a saddle), or flat (zero curvature, obeying ordinary Euclidean geometry).
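In the relativistic framework described below, which of the three cases holds depends on the universe's mean density relative to a critical density; as a standard illustration,

\rho_{c} = \frac{3H^{2}}{8\pi G}, \qquad \Omega \equiv \frac{\rho}{\rho_{c}},

where H is the expansion (Hubble) rate: Ω > 1 corresponds to a closed universe, Ω = 1 to a flat one, and Ω < 1 to an open one.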
The first real difficulty came with regard to expansion, for in 1915, as before, the universe was assumed to be static, neither expanding nor contracting. Solutions of the equations of general relativity, however, predicted a dynamic universe. Therefore, in order to make his equations consistent with the apparently static universe, Einstein added a cosmological constant, a term representing some unexplained extra energy. But when in the late 1920s Georges Lemaître's and Edwin Hubble's observations proved Alexander Friedmann's notion (derived from general relativity) that the universe was expanding, the cosmological constant became unnecessary, and Einstein later called it "my greatest blunder."
With this term gone from the equation, others derived the Friedmann–Lemaître–Robertson–Walker (FLRW) solution to describe such an expanding universe, a solution built on the assumption of an isotropic, homogeneous universe. The FLRW model became the foundation of the standard model of a universe created by the Big Bang, and further observational evidence has helped to refine it. For example, a smooth, mostly homogeneous, and (at least when it was almost 400,000 years old) flat universe seemed to be confirmed by data from the cosmic microwave background (CMB). And after galaxies were found in the 1970s to be rotating, and galaxies within clusters to be moving, faster than they should without flying apart, the existence of dark matter seemed all but confirmed, supporting its earlier inference by Jacobus Kapteyn, Jan Oort, and Fritz Zwicky in the 1920s and 1930s and demonstrating the flexibility of the standard model. Dark matter is believed to make up roughly 23% of the energy density of the universe.
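As a reference point for the inhomogeneous models discussed below, the FLRW metric is conventionally written (in suitable coordinates and units) as

ds^{2} = -c^{2}\,dt^{2} + a(t)^{2}\left[\frac{dr^{2}}{1-kr^{2}} + r^{2}\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right)\right],

where a(t) is the scale factor describing the expansion and k = +1, 0, -1 selects a closed, flat, or open spatial geometry. Inserting this metric into Einstein's equations yields the Friedmann equation,

\left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\rho - \frac{kc^{2}}{a^{2}} + \frac{\Lambda c^{2}}{3},

which links the expansion rate to the matter density, the curvature, and (when included) the cosmological constant.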
Main article: Dark energy
Another observation in 1998 seemed to complicate the situation further: two separate studies found distant supernovae to be fainter than expected in a steadily expanding universe; that is, they appeared farther away than a steady expansion could account for, implying that the expansion was not merely continuing but accelerating. The universe's expansion was calculated to have been accelerating since approximately 5 billion years ago. Given the gravitational braking effect that all the matter of the universe should have had on this expansion, a variation of Einstein's cosmological constant was reintroduced to represent an energy inherent in space, balancing the equations for a flat, accelerating universe. This also gave Einstein's cosmological constant new meaning: reintroduced into the equations to represent dark energy, it allows a flat universe that expands ever faster to be reproduced.
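The logic can be seen in the standard acceleration equation that follows from the FLRW framework above (a textbook result, quoted here for illustration):

\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right) + \frac{\Lambda c^{2}}{3},

where ρ and p are the density and pressure of the universe's contents. Ordinary matter makes the first term negative (the gravitational braking described above), so a positive Λ term, interpreted as dark energy, is the conventional way to obtain \ddot{a} > 0, i.e. an accelerating expansion.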
Although the nature of this energy has yet to be adequately explained, it makes up almost 70% of the energy density of the universe in the concordance model. Thus, including dark matter, almost 95% of the universe's energy density is attributed to phenomena that have been inferred but neither fully explained nor directly observed. Most cosmologists still accept the concordance model, although science journalist Anil Ananthaswamy calls this agreement a "wobbly orthodoxy."
While the universe began with homogeneously distributed matter, enormous structures have since coalesced over billions of years: hundreds of billions of stars inside galaxies, clusters of galaxies, superclusters, and vast filaments of matter. These denser regions and the voids between them must, under general relativity, have some effect, as matter dictates how space-time curves. So the extra mass of galaxies and galaxy clusters (and dark matter, should particles of it ever be directly detected) must cause nearby space-time to curve more positively, while voids should have the opposite effect, giving the space-time around them negative curvature. The question is whether these effects, called backreactions, are negligible or together add up to enough to change the universe's large-scale geometry. Most scientists have assumed that they are negligible, but this has partly been because there has been no way to average space-time geometry in Einstein's equations.
In 2000, cosmologist Thomas Buchert of the École Normale Supérieure in Lyon, France, published a set of new equations, now referred to as the Buchert equations, based on general relativity, which take the effects of a non-uniform distribution of matter into account while still allowing the behavior of the universe to be averaged. Thus, models based on a lumpy, inhomogeneous distribution of matter could now be devised. "There is no dark energy, as far as I'm concerned," Buchert told New Scientist in 2016. "In ten years' time, dark energy is gone." In the same article, cosmologist Syksy Räsänen said, "It's not been established beyond reasonable doubt that dark energy exists. But I'd never say that it has been established that dark energy does not exist." He also told the magazine that the question of whether backreactions are negligible in cosmology "has not been satisfactorily answered."
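The central technical ingredient, sketched here in the form commonly used for irrotational dust, is a spatial average taken over a comoving domain D of the universe:

\langle f \rangle_{\mathcal{D}}(t) = \frac{1}{V_{\mathcal{D}}(t)} \int_{\mathcal{D}} f(t, X)\, \sqrt{\det g_{ij}}\; d^{3}X, \qquad a_{\mathcal{D}}(t) \equiv \left(\frac{V_{\mathcal{D}}(t)}{V_{\mathcal{D}_{0}}}\right)^{1/3},

where V_D(t) is the volume of the domain and a_D is an effective, volume-based scale factor. Averaging Einstein's equations in this way produces extra terms that are absent in an exactly homogeneous model (the Buchert equations, given explicitly further below).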
Inhomogeneous cosmology in the most general sense (assuming a totally inhomogeneous universe) models the universe as a whole with a spacetime that does not possess any spacetime symmetries. Cosmological spacetimes typically considered have either maximal symmetry, comprising three translational and three rotational symmetries (homogeneity and isotropy with respect to every point of spacetime); translational symmetry only (homogeneous models); or rotational symmetry only (spherically symmetric models). Models with fewer symmetries (e.g., axisymmetric models) are also considered symmetric. However, it is common to refer to spherically symmetric or non-homogeneous models as inhomogeneous. In inhomogeneous cosmology, the large-scale structure of the universe is modeled by exact solutions of the Einstein field equations (i.e., non-perturbatively), unlike cosmological perturbation theory, which studies the universe while taking structure formation (galaxies, galaxy clusters, the cosmic web) into account only perturbatively.
Inhomogeneous cosmology usually includes the study of structure in the Universe by means of exact solutions of Einstein's field equations (i.e. metrics) or by spatial or spacetime averaging methods. Such models are not homogeneous, but may allow effects which can be interpreted as dark energy, or can lead to cosmological structures such as voids or galaxy clusters.
Perturbation theory, which deals with small perturbations of, for example, a homogeneous metric, holds only as long as the perturbations are not too large, and N-body simulations use Newtonian gravity, which is a good approximation only when speeds are low and gravitational fields are weak.
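For illustration, the perturbed metric used in such calculations is commonly written, in the conformal Newtonian gauge and in units with c = 1 (and assuming negligible anisotropic stress), as

ds^{2} = a^{2}(\tau)\left[-(1+2\Phi)\,d\tau^{2} + (1-2\Phi)\,\delta_{ij}\,dx^{i}\,dx^{j}\right],

where Φ is the gravitational potential; the treatment is valid only while |\Phi| \ll 1, which is the sense in which the perturbations must remain small.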
Work towards a non-perturbative approach includes the Relativistic Zel'dovich Approximation. As of 2016, Thomas Buchert, George Ellis, Edward Kolb, and their colleagues judged that if the universe is described by cosmic variables in a backreaction scheme that includes coarse-graining and averaging, then whether dark energy is an artifact of the traditional way of using the Einstein equation remains an unanswered question.
The first historical example of an inhomogeneous (though spherically symmetric) solution is the Lemaître–Tolman metric (also known as the Lemaître–Tolman–Bondi, or LTB, model). The Stephani metric can be spherically symmetric or totally inhomogeneous. Other examples are the Szekeres metric, Szafron metric, Barnes metric, Kustaanheimo–Qvist metric, and Senovilla metric. The Bianchi metrics, as given in the Bianchi classification, and the Kantowski–Sachs metrics are homogeneous.
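As an illustration of the kind of exact solution involved, the Lemaître–Tolman–Bondi metric for a spherically symmetric dust universe is commonly written (in units with c = 1) as

ds^{2} = -dt^{2} + \frac{\left[R'(t,r)\right]^{2}}{1+2E(r)}\,dr^{2} + R^{2}(t,r)\left(d\theta^{2} + \sin^{2}\theta\,d\phi^{2}\right),

where R(t,r) is an areal radius, R' denotes \partial R/\partial r, and E(r) is a free function playing the role of a position-dependent curvature (or energy) term; choosing E(r) and the initial density profile allows giant voids or overdensities to be modeled exactly rather than perturbatively.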
The best-known averaging approach is the scalar averaging approach, leading to the kinematical backreaction and mean 3-Ricci curvature functionals. Buchert's equations are the main equations of such averaging methods.
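For irrotational dust, and in units with c = 1, Buchert's equations for the volume scale factor a_D of a spatial domain D are usually quoted as

3\frac{\ddot{a}_{\mathcal{D}}}{a_{\mathcal{D}}} = -4\pi G\,\langle \varrho \rangle_{\mathcal{D}} + \mathcal{Q}_{\mathcal{D}} + \Lambda,
\qquad
3\left(\frac{\dot{a}_{\mathcal{D}}}{a_{\mathcal{D}}}\right)^{2} = 8\pi G\,\langle \varrho \rangle_{\mathcal{D}} - \frac{1}{2}\langle \mathcal{R} \rangle_{\mathcal{D}} - \frac{1}{2}\mathcal{Q}_{\mathcal{D}} + \Lambda,

with the kinematical backreaction defined as

\mathcal{Q}_{\mathcal{D}} = \frac{2}{3}\left(\langle \theta^{2} \rangle_{\mathcal{D}} - \langle \theta \rangle_{\mathcal{D}}^{2}\right) - 2\langle \sigma^{2} \rangle_{\mathcal{D}},

where ⟨ϱ⟩_D is the averaged matter density, ⟨R⟩_D the mean 3-Ricci curvature, θ the local expansion rate, and σ² the squared shear. When Q_D is positive and large enough, it enters the first equation with the same sign as Λ, which is why a sufficiently strong backreaction can mimic dark energy.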
In 2007, David Wiltshire, a professor of theoretical physics at the University of Canterbury in New Zealand, argued in the New Journal of Physics that quasilocal variations in gravitational energy had led in 1998 to the false conclusion that the expansion of the universe is accelerating. Moreover, because of the equivalence principle, which holds that gravitational and inertial energy are equivalent and thus prevents aspects of gravitational energy from being differentiated at a local level, scientists misidentified these aspects as dark energy. This misidentification was the result of presuming an essentially homogeneous universe, as the standard cosmological model does, and not accounting for temporal differences between matter-dense areas and voids. Wiltshire and others argued that if the universe is assumed to be neither homogeneous nor flat, models could be devised in which the apparent acceleration of the universe's expansion could be explained otherwise.
Another important factor left out of the standard model, Wiltshire claimed, is the observationally established fact that gravity slows time. A clock will therefore run faster in empty space, where gravitation is weak, than inside a galaxy, where gravity is much stronger, and he argued that the difference between clocks in the Milky Way and clocks in a galaxy floating in a void could be as large as 38%. Unless we can correct for these "timescapes", each with a different time, our observations of the expansion of space will be, and are, incorrect. Wiltshire claims that the 1998 supernova observations that led to the conclusion of an accelerating expansion and of dark energy can instead be explained by Buchert's equations if certain counterintuitive aspects of general relativity are taken into account.
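The underlying effect is ordinary gravitational time dilation. In the weak-field limit, a standard textbook result relates the proper time τ of a clock sitting in a gravitational potential Φ (with Φ ≤ 0 inside matter concentrations) to the time t of a distant reference clock:

d\tau \approx \left(1 + \frac{\Phi}{c^{2}}\right)dt,

so clocks deep inside galaxies and clusters tick slightly more slowly than clocks in voids. The 38% figure quoted above is not given by this simple formula; it is the cumulative difference Wiltshire derives in his model from billions of years of such differential ageing between matter-dense regions and voids.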