Despite being the most successful theory of particle physics to date, the Standard Model is not perfect. A large share of the published output of theoretical physicists consists of proposals for various forms of "Beyond the Standard Model" new physics that would modify the Standard Model in ways subtle enough to be consistent with existing data, yet address its imperfections materially enough to predict non-Standard Model outcomes of new experiments that can be proposed.
Phenomena not explained
The Standard Model is inherently an incomplete theory. There are fundamental physical phenomena in nature that the Standard Model does not adequately explain:
Gravity. The standard model does not explain gravity. The approach of simply adding a graviton to the Standard Model does not recreate what is observed experimentally without other modifications, as yet undiscovered, to the Standard Model. Moreover, the Standard Model is widely considered to be incompatible with the most successful theory of gravity to date, general relativity.
Dark matter. Cosmological observations tell us the standard model explains about 5% of the mass-energy present in the universe. About 26% should be dark matter (the remaining 69% being dark energy) which would behave just like other matter, but which only interacts weakly (if at all) with the Standard Model fields. Yet, the Standard Model does not supply any fundamental particles that are good dark matter candidates.
Dark energy. As mentioned, the remaining 69% of the universe's energy should consist of the so-called dark energy, a constant energy density for the vacuum. Attempts to explain dark energy in terms of vacuum energy of the standard model lead to a mismatch of 120 orders of magnitude.
Neutrino masses. According to the standard model, neutrinos are massless particles. However, neutrino oscillation experiments have shown that neutrinos do have mass. Mass terms for the neutrinos can be added to the standard model by hand, but these lead to new theoretical problems. For example, the mass terms need to be extraordinarily small and it is not clear if the neutrino masses would arise in the same way that the masses of other fundamental particles do in the Standard Model.
Matter–antimatter asymmetry. The universe is made mostly of matter. However, the standard model predicts that matter and antimatter should have been created in (almost) equal amounts unless the initial conditions of the universe involved disproportionate matter relative to antimatter. Yet, there is no mechanism in the Standard Model to sufficiently explain this asymmetry.
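The dark-energy mismatch quoted above can be reproduced with a rough dimensional-analysis sketch. The Planck-scale cutoff and the observed dark-energy scale of roughly 2.3 meV are approximate inputs chosen for illustration; the exact number of orders of magnitude depends on the cutoff one assumes, which is why figures between roughly 120 and 123 are quoted in the literature.

```python
import math

# Naive vacuum energy density from quantum field theory, cut off at the
# Planck scale: rho ~ M_Planck^4 (natural units, GeV^4).
m_planck = 1.22e19          # Planck mass in GeV (approximate)
rho_qft = m_planck ** 4     # ~2e76 GeV^4

# Observed dark-energy density corresponds to an energy scale of
# roughly 2.3 meV, i.e. rho ~ (2.3e-12 GeV)^4.
rho_observed = (2.3e-12) ** 4   # ~3e-47 GeV^4

mismatch = math.log10(rho_qft / rho_observed)
print(f"mismatch: about {mismatch:.0f} orders of magnitude")
```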
Experimental results not explained
No experimental result is accepted as definitively contradicting the Standard Model at the 5 σ level, widely considered to be the threshold of a discovery in particle physics. Because every experiment contains some degree of statistical and systematic uncertainty, and the theoretical predictions are themselves almost never calculated exactly and are subject to uncertainties in measurements of the fundamental constants of the Standard Model (some of which are tiny and others of which are substantial), it is to be expected that some of the hundreds of experimental tests of the Standard Model will deviate from it to some extent, even if there were no new physics to be discovered.
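The 5 σ convention can be made concrete: assuming a Gaussian sampling distribution (which real analyses must verify), it corresponds to a one-sided tail probability of about 3 × 10⁻⁷, while the 3 σ "evidence" level corresponds to about 1.3 × 10⁻³. A minimal sketch:

```python
import math

def one_sided_p_value(n_sigma: float) -> float:
    """One-sided tail probability of a standard normal beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# The conventional particle-physics thresholds:
print(f"3 sigma ('evidence'):  p = {one_sided_p_value(3):.2e}")
print(f"5 sigma ('discovery'): p = {one_sided_p_value(5):.2e}")
```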
At any given moment there are several standing experimental results that differ significantly from a Standard Model-based prediction. In the past, many of these discrepancies have turned out to be statistical flukes or experimental errors that vanish as more data are collected, or when the same experiments are conducted more carefully. On the other hand, any physics beyond the Standard Model would necessarily first appear in experiments as a statistically significant difference between an experiment and the theoretical prediction. The task is to determine which is the case.
In each case, physicists seek to determine if a result is merely a statistical fluke or experimental error on the one hand, or a sign of new physics on the other. More statistically significant results cannot be mere statistical flukes but can still result from experimental error or inaccurate estimates of experimental precision. Frequently, experiments are tailored to be more sensitive to experimental results that would distinguish the Standard Model from theoretical alternatives.
Some of the most notable examples include the following:
B meson decay etc. – results from the BaBar experiment may suggest a surplus over Standard Model predictions of a type of particle decay (B → D(*) τ− ν̄τ). In this decay, an electron and positron collide, resulting in a B meson and an anti-B meson, one of which then decays into a D meson, a tau lepton, and a tau antineutrino. While the statistical significance of the excess (3.4 σ) is not enough to declare a break from the Standard Model, the results are a potential sign of something amiss and are likely to affect existing theories, including those attempting to deduce the properties of Higgs bosons. In 2015, LHCb reported a 2.1 σ excess in the same ratio of branching fractions, and the Belle experiment also reported an excess. In 2017, a meta-analysis of all available data reported a cumulative 5 σ deviation from the Standard Model.
Anomalous mass of the W boson – results from the CDF Collaboration, reported in April 2022, indicate that the mass of the W boson exceeds the Standard Model prediction with a significance of 7 σ. In 2023, the ATLAS experiment released an improved measurement of the W boson mass, 80,360 ± 16 MeV, which aligned with predictions from the Standard Model.
Theoretical predictions not observed
Observation at particle colliders of all of the fundamental particles predicted by the Standard Model has been confirmed. The Higgs boson is predicted by the Standard Model's explanation of the Higgs mechanism, which describes how the weak SU(2) gauge symmetry is broken and how fundamental particles obtain mass; it was the last particle predicted by the Standard Model to be observed. On July 4, 2012, CERN scientists using the Large Hadron Collider announced the discovery of a particle consistent with the Higgs boson, with a mass of about 126 GeV/c². A Higgs boson was confirmed to exist on March 14, 2013, although efforts to confirm that it has all of the properties predicted by the Standard Model are ongoing.
A few hadrons (i.e. composite particles made of quarks) whose existence is predicted by the Standard Model, but which can be produced only at very high energies and at very low rates, have not yet been definitively observed, and "glueballs" (i.e. composite particles made of gluons) have also not yet been definitively observed. Some very rare particle decays predicted by the Standard Model have also not yet been definitively observed, because insufficient data are available to make a statistically significant observation.
Koide formula – an unexplained empirical equation remarked upon by Yoshio Koide in 1981, and later by others. It relates the masses of the three charged leptons: (m_e + m_μ + m_τ) / (√m_e + √m_μ + √m_τ)² = 2/3. The Standard Model does not predict lepton masses (they are free parameters of the theory). However, the value of the Koide formula being equal to 2/3 within experimental errors of the measured lepton masses suggests the existence of a theory which is able to predict lepton masses.
The CKM matrix, if interpreted as a rotation matrix in a 3-dimensional vector space, "rotates" a vector composed of the square roots of the down-type quark masses into a vector of the square roots of the up-type quark masses, up to vector lengths; a result due to Kohzo Nishida.
The sum of squares of the Yukawa couplings of all Standard Model fermions is approximately 0.984, which is very close to 1. To put it another way, the sum of squares of fermion masses is very close to half of squared Higgs vacuum expectation value.
The sum of squares of boson masses (that is, the W, Z, and Higgs bosons) is also very close to half of the squared Higgs vacuum expectation value; the ratio is approximately 1.004.
Consequently, the sum of squared masses of all Standard Model particles is very close to the squared Higgs vacuum expectation value; the ratio is approximately 0.994.
It is unclear if these empirical relationships represent any underlying physics; according to Koide, the rule he discovered "may be an accidental coincidence".
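These near-coincidences are straightforward to check numerically. The sketch below uses rounded PDG-style mass values in GeV and v ≈ 246.22 GeV; because the inputs are rounded, the last digit of each ratio shifts slightly with the values chosen.

```python
import math

# Approximate particle masses in GeV (rounded PDG-style values).
m_e, m_mu, m_tau = 0.000511, 0.10566, 1.77686   # charged leptons
m_u, m_d, m_s = 0.0022, 0.0047, 0.095           # light quarks
m_c, m_b, m_t = 1.27, 4.18, 172.7               # heavy quarks
m_w, m_z, m_h = 80.38, 91.19, 125.25            # bosons
v = 246.22                                      # Higgs vacuum expectation value

# Koide formula: Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2
Q = (m_e + m_mu + m_tau) / (math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau)) ** 2
print(f"Koide Q       = {Q:.5f}  (2/3 = {2/3:.5f})")

# Sum of squared fermion masses vs. v^2 / 2 (the top quark dominates the sum).
fermions = [m_e, m_mu, m_tau, m_u, m_d, m_s, m_c, m_b, m_t]
fermion_ratio = sum(m ** 2 for m in fermions) / (v ** 2 / 2)
print(f"fermion ratio = {fermion_ratio:.3f}")   # close to 1

# Sum of squared boson masses vs. v^2 / 2.
boson_ratio = (m_w ** 2 + m_z ** 2 + m_h ** 2) / (v ** 2 / 2)
print(f"boson ratio   = {boson_ratio:.3f}")     # close to 1
```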
Some features of the standard model are added in an ad hoc way. These are not problems per se (i.e. the theory works fine with these ad hoc features), but they imply a lack of understanding. These ad hoc features have motivated theorists to look for more fundamental theories with fewer parameters. Some of the ad hoc features are:
Hierarchy problem – the standard model introduces particle masses through a process known as spontaneous symmetry breaking caused by the Higgs field. Within the standard model, the mass of the Higgs gets some very large quantum corrections due to the presence of virtual particles (mostly virtual top quarks). These corrections are much larger than the actual mass of the Higgs. This means that the bare mass parameter of the Higgs in the standard model must be fine-tuned in such a way that it almost completely cancels the quantum corrections. This level of fine-tuning is deemed unnatural by many theorists.
Number of parameters – the standard model depends on 19 numerical parameters. Their values are known from experiment, but the origin of the values is unknown. Some theorists have tried to find relations between different parameters, for example between the masses of particles in different generations, or to calculate particle masses, such as in asymptotic safety scenarios.
Quantum triviality – suggests that it may not be possible to create a consistent quantum field theory involving elementary scalar Higgs particles. This is sometimes called the Landau pole problem.
Strong CP problem – theoretically it can be argued that the standard model should contain a term that breaks CP symmetry—relating matter to antimatter—in the strong interaction sector. Experimentally, however, no such violation has been found, implying that the coefficient of this term is very close to zero.
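The fine-tuning in the hierarchy problem above can be illustrated with the standard one-loop estimate, in which the top-quark loop shifts the squared Higgs mass by roughly 3 y_t² Λ² / (8π²) for a cutoff Λ. The choice of cutoff is an assumption; the sketch below uses the GUT scale, giving the often-quoted cancellation to roughly 26 decimal places.

```python
import math

y_top = 0.94            # top-quark Yukawa coupling (approximate)
m_higgs = 125.25        # Higgs mass in GeV
cutoff = 1e16           # assumed cutoff: the GUT scale, in GeV

# Magnitude of the one-loop top-quark correction to the squared Higgs mass.
delta_m2 = 3 * y_top ** 2 * cutoff ** 2 / (8 * math.pi ** 2)

# How precisely must the bare mass parameter cancel this correction
# to leave the observed Higgs mass?
tuning = delta_m2 / m_higgs ** 2
print(f"correction / observed m_H^2 ~ 10^{math.log10(tuning):.0f}")
```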
The standard model has three gauge symmetries: the colour SU(3), the weak isospin SU(2), and the weak hypercharge U(1) symmetry, corresponding to the three fundamental forces. Due to renormalization the coupling constants of each of these symmetries vary with the energy at which they are measured. Around 10^16 GeV these couplings become approximately equal. This has led to speculation that above this energy the three gauge symmetries of the standard model are unified in one single gauge symmetry with a simple gauge group and just one coupling constant. Below this energy the symmetry is spontaneously broken to the standard model symmetries. Popular choices for the unifying group are the special unitary group in five dimensions SU(5) and the special orthogonal group in ten dimensions SO(10).
Theories that unify the standard model symmetries in this way are called Grand Unified Theories (or GUTs), and the energy scale at which the unified symmetry is broken is called the GUT scale. Generically, grand unified theories predict the creation of magnetic monopoles in the early universe, and instability of the proton. Neither of these has been observed, and this absence of observation puts limits on the possible GUTs.
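The near-meeting of the couplings can be sketched with the one-loop renormalization group equations, in which the inverse couplings run linearly in ln μ. The sketch below uses the standard one-loop Standard Model coefficients and approximate measured values at the Z mass; note that with Standard Model particle content alone the three lines only nearly cross, a near-miss that is one of the motivations for supersymmetric GUTs.

```python
import math

# Approximate inverse gauge couplings at the Z mass (GUT-normalized U(1)).
alpha_inv_mz = {"U(1)": 59.0, "SU(2)": 29.6, "SU(3)": 8.45}
# One-loop Standard Model beta coefficients b_i:
# d(alpha_i^-1)/d(ln mu) = -b_i / (2*pi)
b = {"U(1)": 41 / 10, "SU(2)": -19 / 6, "SU(3)": -7}
m_z = 91.19  # GeV

def alpha_inv(group: str, mu: float) -> float:
    """One-loop running of the inverse coupling from M_Z up to scale mu (GeV)."""
    return alpha_inv_mz[group] - b[group] / (2 * math.pi) * math.log(mu / m_z)

# The spread between the three couplings shrinks as the scale rises.
for mu in (1e2, 1e8, 1e15):
    values = {g: round(alpha_inv(g, mu), 1) for g in b}
    print(f"mu = {mu:.0e} GeV: {values}")
```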
Supersymmetry extends the Standard Model by adding another class of symmetries to the Lagrangian. These symmetries exchange fermionic particles with bosonic ones. Such a symmetry predicts the existence of supersymmetric particles, abbreviated as sparticles, which include the sleptons, squarks, neutralinos and charginos. Each particle in the Standard Model would have a superpartner whose spin differs by 1/2 from the ordinary particle. Due to the breaking of supersymmetry, the sparticles are much heavier than their ordinary counterparts; they are so heavy that existing particle colliders may not be powerful enough to produce them.
In the standard model, neutrinos have exactly zero mass. This is a consequence of the standard model containing only left-handed neutrinos. With no suitable right-handed partner, it is impossible to add a renormalizable mass term to the standard model. Measurements, however, indicate that neutrinos spontaneously change flavour, which implies that neutrinos have mass. These measurements only give the mass differences between the different flavours. The best constraint on the absolute mass of the neutrinos comes from precision measurements of tritium decay, providing an upper limit of 2 eV, which makes them at least five orders of magnitude lighter than the other particles in the standard model. This necessitates an extension of the standard model, which needs to explain not only how neutrinos get their mass but also why the mass is so small.
One approach to add masses to the neutrinos, the so-called seesaw mechanism, is to add right-handed neutrinos and have these couple to left-handed neutrinos with a Dirac mass term. The right-handed neutrinos have to be sterile, meaning that they do not participate in any of the standard model interactions. Because they have no charges, the right-handed neutrinos can act as their own antiparticles and have a Majorana mass term. Like the other Dirac masses in the standard model, the neutrino Dirac mass is expected to be generated through the Higgs mechanism, and is therefore unpredictable. The standard model fermion masses differ by many orders of magnitude; the Dirac neutrino mass has at least the same uncertainty. On the other hand, the Majorana mass for the right-handed neutrinos does not arise from the Higgs mechanism, and is therefore expected to be tied to some energy scale of new physics beyond the standard model, for example the Planck scale. Therefore, any process involving right-handed neutrinos will be suppressed at low energies. The correction due to these suppressed processes effectively gives the left-handed neutrinos a mass that is inversely proportional to the right-handed Majorana mass, a mechanism known as the seesaw. The presence of heavy right-handed neutrinos thereby explains both the small mass of the left-handed neutrinos and the absence of right-handed neutrinos in observations.
However, due to the uncertainty in the Dirac neutrino masses, the right-handed neutrino masses can lie almost anywhere. For example, they could be as light as a keV and serve as dark matter, they could have a mass in the LHC energy range and lead to observable lepton number violation, or they could be near the GUT scale, linking the right-handed neutrinos to the possibility of a grand unified theory.
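The seesaw scaling described above, m_ν ≈ m_D² / M_R, can be illustrated numerically. Assuming, purely for illustration, a Dirac mass comparable to the top-quark mass and a right-handed Majorana mass near the GUT scale, the light neutrino mass comes out in the sub-eV range observed in oscillation experiments.

```python
# Seesaw estimate of the light neutrino mass: m_nu ~ m_dirac^2 / m_majorana.
# Both inputs below are illustrative assumptions, not measured values.
GEV_TO_EV = 1e9

m_dirac = 173.0 * GEV_TO_EV     # assume a top-like Dirac mass, in eV
m_majorana = 1e15 * GEV_TO_EV   # assume a GUT-scale Majorana mass, in eV

m_nu = m_dirac ** 2 / m_majorana
# Comes out comparable to the ~0.05 eV atmospheric mass-splitting scale.
print(f"light neutrino mass ~ {m_nu:.3f} eV")
```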
The mass terms mix neutrinos of different generations. This mixing is parameterized by the PMNS matrix, which is the neutrino analogue of the CKM quark mixing matrix. Unlike the quark mixing, which is almost minimal, the mixing of the neutrinos appears to be almost maximal. This has led to various speculations of symmetries between the various generations that could explain the mixing patterns. The mixing matrix could also contain several complex phases that break CP invariance, although these have not yet been probed experimentally. These phases could potentially create a surplus of leptons over anti-leptons in the early universe, a process known as leptogenesis. This asymmetry could then at a later stage be converted into an excess of baryons over anti-baryons, explaining the matter–antimatter asymmetry in the universe.
The light neutrinos are disfavored as an explanation for the observation of dark matter, due to considerations of large-scale structure formation in the early universe. Simulations of structure formation show that they are too hot—i.e. their kinetic energy is large compared to their mass—while formation of structures similar to the galaxies in our universe requires cold dark matter. The simulations show that neutrinos can at best explain a few percent of the missing dark matter. However, the heavy sterile right-handed neutrinos are a possible candidate for a dark matter WIMP.
Several preon models have been proposed to address the unsolved problem concerning the fact that there are three generations of quarks and leptons. Preon models generally postulate some additional new particles which are further postulated to be able to combine to form the quarks and leptons of the standard model. One of the earliest preon models was the Rishon model.
To date, no preon model is widely accepted or fully verified.
Theoretical physics continues to strive toward a theory of everything, a theory that fully explains and links together all known physical phenomena, and predicts the outcome of any experiment that could be carried out in principle.
In practical terms the immediate goal in this regard is to develop a theory which would unify the Standard Model with General Relativity in a theory of quantum gravity. Additional features, such as overcoming conceptual flaws in either theory or accurate prediction of particle masses, would be desired.
The challenges in putting together such a theory are not just conceptual; they include the experimental aspects of the very high energies needed to probe exotic realms.
Theories of quantum gravity such as loop quantum gravity and others are thought by some to be promising candidates for the mathematical unification of quantum field theory and general relativity, requiring less drastic changes to existing theories. However, recent work places stringent limits on the putative effects of quantum gravity on the speed of light, and disfavours some current models of quantum gravity.
Extensions, revisions, replacements, and reorganizations of the Standard Model exist in an attempt to correct for these and other issues. String theory is one such reinvention, and many theoretical physicists think that such theories are the next theoretical step toward a true Theory of Everything.
Among the numerous variants of string theory, M-theory, whose mathematical existence was first proposed at a string theory conference in 1995 by Edward Witten, is believed by many to be a proper "ToE" candidate, notably by physicists Brian Greene and Stephen Hawking. Though a full mathematical description is not yet known, solutions to the theory exist for specific cases. Recent works have also proposed alternate string models, some of which lack the various harder-to-test features of M-theory (e.g. the existence of Calabi–Yau manifolds, many extra dimensions, etc.), including works by well-published physicists such as Lisa Randall.
^ Sushkov, A. O.; Kim, W. J.; Dalvit, D. A. R.; Lamoreaux, S. K. (2011). "New Experimental Limits on Non-Newtonian Forces in the Micrometer Range". Physical Review Letters. 107 (17): 171101. arXiv:1108.2547. Bibcode:2011PhRvL.107q1101S. doi:10.1103/PhysRevLett.107.171101. PMID 22107498. S2CID 46596924. It is remarkable that two of the greatest successes of 20th century physics, general relativity and the standard model, appear to be fundamentally incompatible. But see also Donoghue, John F. (2012). "The effective field theory treatment of quantum gravity". AIP Conference Proceedings. 1473 (1): 73. arXiv:1209.3511. Bibcode:2012AIPC.1483...73D. doi:10.1063/1.4756964. S2CID 119238707. "One can find thousands of statements in the literature to the effect that 'general relativity and quantum mechanics are incompatible'. These are completely outdated and no longer relevant. Effective field theory shows that general relativity and quantum mechanics work together perfectly normally over a range of scales and curvatures, including those relevant for the world that we see around us. However, effective field theories are only valid over some range of scales. General relativity certainly does have problematic issues at extreme scales. There are important problems which the effective field theory does not solve because they are beyond its range of validity. However, this means that the issue of quantum gravity is not what we thought it to be. Rather than a fundamental incompatibility of quantum mechanics and gravity, we are in the more familiar situation of needing a more complete theory beyond the range of their combined applicability. The usual marriage of general relativity and quantum mechanics is fine at ordinary energies, but we now seek to uncover the modifications that must be present in more extreme conditions. This is the modern view of the problem of quantum gravity, and it represents progress over the outdated view of the past."
^ Nishida, Kohzo (2017-10-14). "Phenomenological formula for CKM matrix and its physical interpretation". Progress of Theoretical and Experimental Physics. 2017 (10). arXiv:1708.01110. doi:10.1093/ptep/ptx138.