There is a strong scientific consensus that the greenhouse effect due to carbon dioxide is a main driver of climate change. The following is an illustrative model intended for pedagogical purposes, showing the main physical determinants of the effect.
Under this understanding, global warming is determined by a simple energy budget: in the long run, Earth emits as much radiation as it receives from the Sun. However, the amount emitted depends both on Earth's temperature and on its albedo: the more reflective Earth is at a given wavelength, the less radiation it both receives and emits at that wavelength; and the warmer Earth is, the more radiation it emits. Thus changes in the albedo may affect Earth's temperature, and the effect can be calculated by assuming that a new steady state will be reached.
In most of the electromagnetic spectrum, atmospheric carbon dioxide either blocks the radiation emitted from the ground almost completely or is almost transparent, so that increasing the amount of carbon dioxide in the atmosphere, e.g. doubling it, has negligible effect. However, in some narrow parts of the spectrum this is not so; doubling the amount of atmospheric carbon dioxide will make Earth's atmosphere relatively opaque at these wavelengths, so that Earth emits light at these wavelengths from the upper layers of the atmosphere rather than from lower layers or from the ground. Since the upper layers are colder, less radiation is emitted, leading to warming of Earth until the reduction in emission is compensated by the rise in temperature.
Furthermore, such warming may cause a feedback mechanism due to other changes in Earth's albedo, e.g. due to ice melting.
Most of the air - including ~88% of the CO2 - is located in the lower part of the atmosphere known as the troposphere. The troposphere is thicker at the equator and thinner at the poles, but the global mean of its thickness is around 11 km.
Inside the troposphere, the temperature drops approximately linearly at a rate of 6.5 °C per km, from a global mean of 288 K (15 °C) at the ground to about 220 K (−53 °C) at its top. At higher altitudes, up to ~20 km, the temperature is approximately constant; this layer is called the tropopause.
The troposphere and tropopause together contain ~99% of the atmospheric CO2. Inside the troposphere, the CO2 density drops with altitude approximately exponentially, with a scale height of 6.3 km; this means that the density at height y is approximately proportional to exp(−y/6.3 km), so it drops to 37% of its ground value at 6.3 km and to 17% at 11 km. Higher up, through the tropopause, the density continues to drop exponentially, albeit faster, with a scale height of 4.2 km.
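As a quick numeric illustration of this profile (a sketch using the 6.3 km scale height from the text):

```python
import math

H_tropo = 6.3  # CO2 density scale height in the troposphere, km (from the text)

def co2_density_fraction(y_km):
    """Fraction of the ground-level CO2 number density remaining at altitude y (troposphere)."""
    return math.exp(-y_km / H_tropo)

print(round(co2_density_fraction(6.3), 2))   # 0.37 -> ~37% at one scale height
print(round(co2_density_fraction(11.0), 2))  # 0.17 -> ~17% at the tropopause
```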
Earth constantly absorbs energy from sunlight and emits thermal radiation as infrared light. In the long run, Earth radiates the same amount of energy per second as it absorbs, because the amount of thermal radiation emitted depends upon temperature: If Earth absorbs more energy per second than it radiates, Earth heats up and the thermal radiation will increase, until balance is restored; if Earth absorbs less energy than it radiates, it cools down and the thermal radiation will decrease, again until balance is restored.
Atmospheric CO2 absorbs some of the energy radiated by the ground, but it itself emits thermal radiation: for example, at some wavelengths the atmosphere is totally opaque due to absorption by CO2; at these wavelengths, looking at Earth from outer space one would not see the ground, but the atmospheric CO2, and hence its thermal radiation rather than the ground's. Had the atmosphere been at the same temperature as the ground, this would not change Earth's energy budget; but since the radiation is emitted from atmospheric layers that are cooler than the ground, less radiation is emitted.
As CO2 content of the atmosphere increases due to human activity, this process intensifies, and the total radiation emitted by Earth diminishes; therefore, Earth heats up until the balance is restored.
CO2 absorbs the ground's thermal radiation mainly at wavelengths between 13 and 17 microns. In this wavelength range, it is almost solely responsible for the attenuation of radiation from the ground. The fraction of ground radiation that is transmitted through the atmosphere at each wavelength is related to the optical depth of the atmosphere at this wavelength, OD, by:

f = exp(−OD)
The optical depth itself is given by the Beer–Lambert law:

OD = σ · ∫₀^∞ n(y) · dy
where σ is the absorption cross section of a single CO2 molecule, and n(y) is the number density of these molecules at altitude y. Due to the strong dependence of the cross section on wavelength, the OD changes from around 0.1 at 13 microns to ~10 at 14 microns, rising beyond 100 at 15 microns, then dropping to ~10 at 16 microns, ~1 at 17 microns and below 0.1 at 18 microns. Note that the OD depends on the total number of molecules per unit area in the atmosphere, and therefore rises linearly with its CO2 content.
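A small sketch of the transmission relation, using the illustrative OD values above; note that doubling the CO2 content doubles the OD at every wavelength, and hence squares the transmitted fraction:

```python
import math

def transmitted_fraction(od):
    """Fraction of ground radiation transmitted through an atmosphere of optical depth od."""
    return math.exp(-od)

# OD ~0.1 (e.g. at 13 microns): nearly transparent; OD ~100 (15 microns): totally opaque
for od in (0.1, 1.0, 10.0, 100.0):
    print(od, transmitted_fraction(od))

# Doubling the CO2 content doubles the optical depth, squaring the transmitted fraction
od_13micron = 0.1
assert math.isclose(transmitted_fraction(2 * od_13micron),
                    transmitted_fraction(od_13micron) ** 2)
```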
Looking from outer space into the atmosphere at a specific wavelength, one sees different layers of the atmosphere to different degrees, but on average one sees down to the altitude at which the part of the atmosphere from that altitude upward has an optical depth of ~1. Earth therefore radiates at this wavelength approximately according to the temperature at that altitude. Increasing the atmospheric CO2 content increases the optical depth, so that the altitude seen from outer space rises; as long as it rises within the troposphere, the radiation temperature drops and the radiation decreases. Once it reaches the tropopause, any further increase in CO2 levels has no noticeable effect, since there the temperature no longer depends on altitude.
At wavelengths of 14 to 16 microns, even the tropopause, holding ~0.12 of the CO2 of the whole atmosphere, has OD > 1. Therefore, at these wavelengths Earth radiates mainly at the tropopause temperature, and addition of CO2 does not change this. At wavelengths smaller than 13 microns or larger than 18 microns, the atmospheric absorption is negligible, and addition of CO2 hardly changes this. Therefore, the effect of a CO2 increase on radiation is relevant at wavelengths of 13–14 and 16–18 microns, and added CO2 mainly contributes to the opacity of the troposphere, changing the altitude that is effectively seen from outer space within the troposphere.
We now turn to calculating the effect of CO2 on radiation, using a one-layer model, i.e. we treat the whole troposphere as a single layer:
Looking at a particular wavelength λ up to λ+dλ, the whole atmosphere has an optical depth OD, while the tropopause alone has an optical depth of 0.12·OD and the troposphere an optical depth of 0.88·OD. Thus, a fraction exp(−0.12·OD) of the radiation from below the tropopause is transmitted out, but this includes a fraction exp(−OD) of the radiation that originates from the ground. Thus, the weight of the troposphere in determining the radiation that is emitted to outer space is:

exp(−0.12·OD) − exp(−OD)
A relative increase in the CO2 concentration means an equal relative increase in the total CO2 content of the atmosphere, dN/N, where N is the total number of CO2 molecules. Adding a minute number of such molecules dN will increase the troposphere's weight in determining the radiation at the relevant wavelengths approximately by the relative amount dN/N, and thus by:

(dN/N) · (exp(−0.12·OD) − exp(−OD))
Since CO2 hardly influences sunlight absorption by Earth, the radiative forcing due to an increase in CO2 content is equal to the resulting difference in the flux radiated by Earth. To calculate this, one must multiply the above by the difference in radiation due to the difference in temperature between the troposphere and the ground. According to Planck's law, the flux emitted per unit area per unit wavelength at temperature T is:

B(λ, T) = (2πhc²/λ⁵) · 1/(exp(hc/λkBT) − 1)

and the relevant difference is B(λ, T1) − B(λ, T0).
The ground is at temperature T0 = 288 K, and for the troposphere we take a typical temperature, the one at the average height of the CO2 molecules, 6.3 km, where the temperature is T1 ≈ 247 K.
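As a numeric illustration (a sketch, not part of the original derivation), the Planck flux at 15 microns can be compared at these two temperatures:

```python
import math

h = 6.62607015e-34   # Planck constant, J s
c = 2.99792458e8     # speed of light, m/s
kB = 1.380649e-23    # Boltzmann constant, J/K

def planck_flux(lam_m, T):
    """Planck spectral flux 2*pi*h*c^2/lam^5 / (exp(hc/(lam kB T)) - 1), W/m^2 per m."""
    return math.pi * (2 * h * c**2 / lam_m**5) / math.expm1(h * c / (lam_m * kB * T))

lam = 15e-6  # 15 microns, inside the CO2 absorption band
f_ground = planck_flux(lam, 288.0) * 1e-6  # W/m^2 per micron, at ground temperature
f_tropo = planck_flux(lam, 247.0) * 1e-6   # same, at the tropospheric temperature
print(f_ground, f_tropo)  # roughly 18 vs 10 W/m^2 per micron: the colder layer emits less
```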
Therefore, dI, the change in Earth's emitted radiation per unit wavelength, is, in a rough approximation:

dI = (dN/N) · (exp(−0.12·OD) − exp(−OD)) · (B(λ, T1) − B(λ, T0))
Since dN/N = d(ln N), this can be written as:

dI/d(ln N) = (exp(−0.12·OD) − exp(−OD)) · (B(λ, T1) − B(λ, T0))
The function exp(−0.12·OD) − exp(−OD) is maximal at OD = 2.41, with a maximal value of 0.66, and it drops to half this value at OD = 0.5 and OD = 9.2. Thus we look at wavelengths for which the OD is between 0.5 and 9.2: this gives a wavelength band approximately 1 micron wide around 17 microns, and one less than 1 micron wide around 13.5 microns. We therefore take, for each band, approximately:

dI/d(ln N) ≈ 0.66 · (B(λ, T1) − B(λ, T0)) · Δλ

where Δλ is the width of the band.
This gives −2.3 W/m² for the 13.5 micron band and −2.7 W/m² for the 17 micron band, for a total of −5 W/m².
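The quoted properties of the weight function exp(−0.12·OD) − exp(−OD) — maximum at OD ≈ 2.41 with value ≈ 0.66, half-maximum near OD = 0.5 and OD = 9.2 — can be verified numerically:

```python
import math

def weight(od):
    """Weight of the troposphere in the emitted radiation (one-layer model)."""
    return math.exp(-0.12 * od) - math.exp(-od)

# scan a fine grid for the maximum
ods = [i * 0.001 for i in range(1, 20000)]
best = max(ods, key=weight)
print(round(best, 2), round(weight(best), 2))  # ~2.41, ~0.66

# the half-maximum points quoted in the text
half = weight(best) / 2
print(round(weight(0.5), 2), round(weight(9.2), 2))  # ~0.34 and ~0.33, both close to half
```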
A 2-fold increase in CO2 content changes the wavelength ranges only slightly, so this derivative is approximately constant over such an increase. Thus, a 2-fold increase in CO2 content will reduce the radiation emitted by Earth by approximately:

ΔI ≈ ln 2 · dI/d(ln N)
More generally, an increase by a factor c/c0 gives:

ΔI ≈ ln(c/c0) · dI/d(ln N)
These results are close to the approximation of a more elaborate yet still simplified model, which gives a radiative forcing of ΔF = 5.35 W/m² · ln(c/c0), i.e. approximately 3.7 W/m² for a doubling of CO2.
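A sketch of this logarithmic dependence, using the widely cited simplified-model coefficient of 5.35 W/m² (an external fitted value, not derived in the text); each doubling contributes the same forcing:

```python
import math

def forcing(c_ratio, coeff=5.35):
    """Radiative forcing in W/m^2 for a CO2 increase by a factor c_ratio (simplified-model fit)."""
    return coeff * math.log(c_ratio)

print(round(forcing(2.0), 2))  # ~3.71 W/m^2 for one doubling
print(round(forcing(4.0), 2))  # ~7.42 -> two doublings give twice the forcing
```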
We may make a more elaborate calculation by treating the atmosphere as composed of many thin layers. For each such layer, at height y and of thickness dy, the weight of the layer in determining the radiation temperature seen from outer space is a generalization of the expression arrived at earlier for the troposphere. It is:

w(y) · dy = d/dy[exp(−OD(y))] · dy = σ · n(y) · exp(−OD(y)) · dy
where OD(y) is the optical depth of the part of the atmosphere from y upwards.
The total radiation emitted at wavelengths λ to λ+dλ is therefore:

I(λ) · dλ = ( B(λ, T0) · exp(−OD(0)) + ∫₀^∞ B(λ, T(y)) · d/dy[exp(−OD(y))] · dy ) · dλ
where B is the expression for radiation according to Planck's law presented above:

B(λ, T) = (2πhc²/λ⁵) · 1/(exp(hc/λkBT) − 1)

and the infinity here can actually be taken as the top of the tropopause.
Thus the effect of a relative change in the CO2 concentration, dN/N = dn0/n0 (where n0 is the number density near the ground), is (noting that dN/N = d(ln N) = d(ln n0)):

dI/d(ln N) = d/d(ln N) [ B(λ, T(∞)) − ∫₀^∞ exp(−OD(y)) · (dB/dy) · dy ]
where we have used integration by parts.
Because B does not depend on N, and because dB/dy = (dB/dT) · (dT/dy), we have:

dI/d(ln N) = −∫₀^∞ (dB/dT) · (dT/dy) · d/d(ln N)[exp(−OD(y))] · dy
Now, dT/dy is constant in the troposphere and zero in the tropopause. We denote the height of the border between them as U. So:

dI/d(ln N) = −(dT/dy) · ∫₀^U (dB/dT) · d/d(ln N)[exp(−OD(y))] · dy
The optical depth is proportional to the integral of the number density from y upwards, as is the pressure. Therefore, OD(y) is proportional to the pressure p(y), which within the troposphere (heights 0 to U) falls exponentially with scale height Hp (Hp ≈ 5.6 km for CO2), thus:

OD(y) = OD(0) · exp(−y/Hp)
Since ln OD(y) = ln N − y/Hp + constant, viewed as a function of both y and N, we have:

∂OD/∂(ln N) = −Hp · ∂OD/∂y
And therefore differentiating with respect to ln N is the same as differentiating with respect to y, times a factor of −Hp.
We arrive at:

dI/d(ln N) = Hp · (dT/dy) · ∫₀^U (dB/dT) · d/dy[exp(−OD(y))] · dy
Since the temperature only changes by ~25% within the troposphere, one may take a (rough) linear approximation of B in T at the relevant wavelengths, i.e. treat dB/dT as constant, and get:

dI/d(ln N) ≈ Hp · (dB/dT) · (dT/dy) · (exp(−OD(U)) − exp(−OD(0))) = Hp · (dB/dT) · (dT/dy) · (exp(−0.12·OD) − exp(−OD))
Due to the linear approximation of B we have Hp · (dB/dT) · (dT/dy) ≈ B(λ, T1) − B(λ, T0), with T1 taken at height Hp, so that in total:

dI/d(ln N) ≈ (B(λ, T1) − B(λ, T0)) · (exp(−0.12·OD) − exp(−OD))
giving the same result as the one-layer model presented above, including the logarithmic dependence on N, except that now we see that T1 is taken at 5.6 km (the pressure scale height), rather than at 6.3 km (the density scale height).
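The claim that the multi-layer integral collapses to the one-layer form when B is linear in T can be checked numerically. In this sketch the total optical depth OD(0) and the Planck slope dB/dT are arbitrary illustrative values (not taken from the text); the discretized integral should then reproduce the closed form:

```python
import math

Hp = 5.6       # pressure scale height, km (from the text)
U = 11.0       # tropopause height, km (from the text)
lapse = -6.5   # dT/dy, K/km (from the text)
OD0 = 3.0      # total optical depth at this wavelength (illustrative assumption)
b = 0.1        # dB/dT, linearized Planck slope (illustrative assumption)

def od(y):
    """Optical depth of the part of the atmosphere above height y."""
    return OD0 * math.exp(-y / Hp)

# dI/dlnN = Hp * (dT/dy) * integral_0^U (dB/dT) * d/dy[exp(-od(y))] dy, by midpoint rule
n = 200000
dy = U / n
integral = 0.0
for i in range(n):
    y = (i + 0.5) * dy
    d_exp = math.exp(-od(y)) * od(y) / Hp  # d/dy exp(-od(y))
    integral += b * d_exp * dy
numeric = Hp * lapse * integral

# closed form: (B(T1) - B(T0)) * (exp(-od(U)) - exp(-OD0)), with B(T1) - B(T0) = b*lapse*Hp
closed = (b * lapse * Hp) * (math.exp(-od(U)) - math.exp(-OD0))
print(numeric, closed)  # both negative: emission drops as CO2 rises
```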
The total average energy per unit time radiated by Earth is equal to the average energy flux j times the surface area 4πR², where R is Earth's radius. On the other hand, the average energy flux absorbed from sunlight is the solar constant S0 times Earth's cross-section πR², times the fraction absorbed by Earth, which is one minus Earth's albedo a.
The average energy per unit time radiated out is equal to the average energy per unit time absorbed from sunlight, so:

j · 4πR² = S0 · πR² · (1 − a), i.e. j = S0 · (1 − a)/4
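A minimal numeric sketch of this budget, using standard literature values for the solar constant and albedo (S0 ≈ 1361 W/m², a ≈ 0.3; these are typical values, not quantities stated in the text):

```python
S0 = 1361.0  # solar constant, W/m^2 (typical literature value)
a = 0.3      # Earth's albedo (typical literature value)

j = S0 * (1 - a) / 4  # average emitted flux balancing the absorbed sunlight
print(round(j))  # ~238 W/m^2

forcing = 3.1  # W/m^2 for CO2 doubling, the one-layer-model value quoted below
print(round(100 * forcing / j, 1))  # ~1.3% relative radiative forcing
```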
Based on the value of 3.1 W/m² obtained above in the section on the one-layer model, the radiative forcing due to CO2 doubling relative to the average radiated flux j ≈ 238 W/m² is therefore:

ΔF/j ≈ 3.1/238 ≈ 1.3%
An exact calculation using the MODTRAN model over all wavelengths, including the methane and ozone greenhouse gases, as shown in the plot above, gives for tropical latitudes an outgoing flux of 298.645 W/m² at current CO2 levels and 295.286 W/m² after CO2 doubling, i.e. a radiative forcing of 1.1%, under clear-sky conditions, with a ground temperature of 299.7 K (26.6 °C). The radiative forcing is largely similar at different latitudes and under different weather conditions.
On average, the total power of the thermal radiation emitted by Earth is equal to the power absorbed from sunlight. As CO2 levels rise, this equilibrium can be maintained only if the temperature increases, so that the total emitted radiation is unchanged (averaged over a long enough time, on the order of a few years, so that diurnal and annual cycles are averaged over).
According to the Stefan–Boltzmann law, the total power emitted by Earth per unit area is:

j = ε · σB · T⁴
where σB is Stefan–Boltzmann constant and ε is the emissivity in the relevant wavelengths. T is some average temperature representing the effective radiation temperature.
The CO2 content changes the effective T, but instead one may take T to be a typical ground or lower-atmosphere temperature (T0 or close to it) and treat the CO2 content as changing the emissivity ε. We thus re-interpret ε in the above equation as an effective emissivity that includes the CO2 effect, and take T = T0. A change in CO2 content then causes a change dε in this effective emissivity, such that dε/ε is the radiative forcing divided by the total energy flux radiated by Earth.
The relative change in the total radiated energy flux due to changes in emissivity and temperature is:

dj/j = dε/ε + 4 · dT/T
Thus, if the total emitted power is to remain unchanged, a radiative forcing that is a given fraction of the total energy flux radiated by Earth causes a relative change in temperature of one quarter of that fraction:

dT/T = −(1/4) · dε/ε
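The resulting direct (no-feedback) warming for the ~1.3% relative forcing computed earlier can be sketched as:

```python
T0 = 288.0                # ground temperature, K (from the text)
relative_forcing = 0.013  # ~1.3% of the emitted flux, from the section above

dT = T0 / 4 * relative_forcing  # temperature rise restoring the energy balance
print(round(dT, 1))  # ~0.9 K, i.e. roughly 1 degree per CO2 doubling without feedback
```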
Main article: Ice–albedo feedback
Since warming of Earth means less ice on the ground on average, it lowers the albedo, so more sunlight is absorbed, further increasing Earth's temperature.
As a rough estimate: average temperatures over most of Earth's surface are between −20 and +30 °C, so a reasonable guess is that ~2% of its surface is at temperatures between −1 and 0 °C; with 1 °C of warming, an equivalent area would therefore change from ice-covered (or snow-covered) to either ocean or forest.
For comparison, in the northern hemisphere, the Arctic sea ice shrank between 1979 and 2015 by 1.43×10¹² m² at the annual maxima and 2.52×10¹² m² at the minima, an average of almost 2×10¹² m², which is 0.4% of Earth's total surface of 510×10¹² m². During this time the global temperature rose by ~0.6 °C. The combined areas of inland glaciers (not including the Antarctic ice sheet), the Antarctic sea ice, and the Arctic sea ice are all comparable, so one may expect the change in the Arctic sea ice to be roughly a third of the total change, giving 1.2% of Earth's surface turned from ice to ocean or bare ground per 0.6 °C, or equivalently 2% per 1 °C. The Antarctic ice cap size oscillates, and it is hard to predict its future course, with factors such as relative thermal insulation and constraints due to the Antarctic Circumpolar Current probably playing a part.
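The arithmetic of this comparison can be laid out explicitly, using the figures quoted above:

```python
max_loss = 1.43e12      # m^2, Arctic sea-ice shrinkage at annual maxima, 1979-2015 (from the text)
min_loss = 2.52e12      # m^2, shrinkage at annual minima (from the text)
earth_surface = 510e12  # m^2, Earth's total surface area

avg_loss = (max_loss + min_loss) / 2
arctic_fraction = avg_loss / earth_surface
print(round(100 * arctic_fraction, 1))  # ~0.4% of Earth's surface

# Arctic sea ice is taken as ~1/3 of the total ice change, observed over ~0.6 C of warming
total_fraction_per_degree = 3 * arctic_fraction / 0.6
print(round(100 * total_fraction_per_degree, 1))  # ~1.9%, i.e. roughly 2% of the surface per degree C
```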
As the difference in albedo between ice and, e.g., ocean is around 2/3, a 1 °C rise would naively reduce the global albedo by 2% · 2/3 = 4/3%. However, this change happens mainly at northern and southern latitudes, around 60 degrees away from the equator, so the effective area is actually 2% · cos(60°) = 1%, and the global albedo drop would be 2/3%.
Since a change in radiation of 1.3% causes a direct change of 1 °C (without feedback), as calculated above, and this causes a further change of 2/3% in radiation due to the positive feedback, which is half the original change, the total factor caused by this feedback mechanism would be:

1 + 1/2 + 1/4 + ... = 1/(1 − 1/2) = 2
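The total feedback factor is the sum of a geometric series with ratio 1/2, which a short sketch confirms converges to 2:

```python
ratio = 0.5  # each round of warming adds half the previous radiative change

# partial sums of 1 + 1/2 + 1/4 + ... converge to 1 / (1 - ratio) = 2
total = sum(ratio**n for n in range(60))
print(round(total, 6))  # ~2.0
```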
Thus, this feedback roughly doubles the effect of a change in radiation, giving a change of ~2 K in the global temperature, which is indeed the commonly accepted short-term value. For the long-term value, including further feedback mechanisms, ~3 K is considered more probable.