A sequence of distributions corresponds to a sequence of random variables Z_{i} for i = 1, 2, ..., I. In the simplest case, an asymptotic distribution exists if the probability distribution of Z_{i} converges to a probability distribution (the asymptotic distribution) as i increases: see convergence in distribution. A special case of an asymptotic distribution arises when the sequence of random variables approaches zero, that is, Z_{i} goes to 0 as i approaches infinity. Here the asymptotic distribution is a degenerate distribution, corresponding to the value zero.

However, the most usual sense in which the term asymptotic distribution is used arises where the random variables Z_{i} are modified by two sequences of non-random values. Thus if

$Y_{i}={\frac {Z_{i}-a_{i}}{b_{i}}}$

converges in distribution to a non-degenerate distribution for two sequences {a_{i}} and {b_{i}}, then Z_{i} is said to have that distribution as its asymptotic distribution. If the distribution function of the asymptotic distribution is F, then for large i the following approximations hold:

$P\left({\frac {Z_{i}-a_{i}}{b_{i}}}\leq x\right)\approx F(x),\qquad P(Z_{i}\leq z)\approx F\left({\frac {z-a_{i}}{b_{i}}}\right).$
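As a sketch of this construction (the uniform example is an illustration chosen here, not one given in the text): the maximum M_{n} of n i.i.d. Uniform(0,1) variables converges to the degenerate value 1, but after centring by a_{n} = 1 and scaling by b_{n} = 1/n, the rescaled variable n(1 − M_{n}) has a non-degenerate asymptotic distribution, since P(n(1 − M_{n}) > x) = (1 − x/n)^{n} → e^{−x}, the Exponential(1) tail.

```python
import random
import statistics

# Illustrative sketch (an assumed example, not from the text): the maximum
# M_n of n i.i.d. Uniform(0,1) variables converges to 1 (a degenerate
# limit), but n*(1 - M_n) has Exponential(1) as its asymptotic
# distribution, because P(n*(1 - M_n) > x) = (1 - x/n)^n -> exp(-x).
random.seed(0)

def rescaled_max(n):
    """Draw n Uniform(0,1) variables and return n*(1 - max)."""
    m_n = max(random.random() for _ in range(n))
    return n * (1.0 - m_n)

samples = [rescaled_max(500) for _ in range(5000)]
# Exponential(1) has mean 1 and standard deviation 1; the Monte Carlo
# estimates below should be close to those values.
print(round(statistics.mean(samples), 2))
print(round(statistics.stdev(samples), 2))
```

The choice a_{n} = 1, b_{n} = 1/n plays exactly the role of the sequences {a_{i}} and {b_{i}} above; without the rescaling, the limit would be the degenerate distribution at 1.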

If an asymptotic distribution exists, it is not necessarily true that any one outcome of the sequence of random variables is a convergent sequence of numbers. It is the sequence of probability distributions that converges.

Suppose {X_{1}, X_{2}, ...} is a sequence of i.i.d. random variables with E[X_{i}] = µ and Var[X_{i}] = σ^{2} < ∞. Let S_{n} be the average of {X_{1}, ..., X_{n}}. Then as n approaches infinity, the random variables √n(S_{n} − µ) converge in distribution to a normal N(0, σ^{2}):^{[1]}

${\sqrt {n}}\left(S_{n}-\mu \right)\ {\xrightarrow {d}}\ N(0,\sigma ^{2}).$
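The statement can be checked empirically by Monte Carlo simulation. The sketch below (the Uniform(0,1) choice for the X_{i} is an assumption made here for illustration) draws many samples of size n, computes √n(S_{n} − µ) for each, and confirms that the resulting values have mean near 0 and standard deviation near σ.

```python
import random
import statistics

# Illustrative sketch (assumed example): Monte Carlo check of the CLT for
# X_i ~ Uniform(0,1), which has mu = 1/2 and sigma^2 = 1/12.
random.seed(0)

MU = 0.5
SIGMA = (1.0 / 12.0) ** 0.5  # about 0.2887

def scaled_deviation(n):
    """Return sqrt(n) * (S_n - mu) for one sample of size n."""
    s_n = sum(random.random() for _ in range(n)) / n
    return (n ** 0.5) * (s_n - MU)

samples = [scaled_deviation(500) for _ in range(5000)]
# By the CLT, these values are approximately N(0, sigma^2):
print(round(statistics.mean(samples), 3))   # near 0
print(round(statistics.stdev(samples), 3))  # near sigma
```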

The central limit theorem gives only an asymptotic distribution. As an approximation for a finite number of observations, it is reasonable only near the peak of the normal distribution; a very large number of observations is needed before the approximation becomes accurate out in the tails.
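This center-versus-tails behaviour can be made concrete. In the sketch below (the Exponential(1) choice for the X_{i} and the sample size n = 50 are assumptions made here for illustration), the exact tail probability P(√n(S_{n} − µ) > x) is computable in closed form, because nS_{n} is then Gamma(n, 1) and the Gamma tail equals a Poisson partial sum. Comparing it with the normal tail 1 − Φ(x) shows a small relative error near the center (x = 0.5) and a much larger one in the tail (x = 3).

```python
import math
from statistics import NormalDist

def exact_tail(n, x):
    """Exact P(sqrt(n) * (S_n - 1) > x) for X_i ~ Exponential(1).

    Here n*S_n ~ Gamma(n, 1), and the Gamma-Poisson identity gives
    P(Gamma(n, 1) > t) = P(Poisson(t) <= n - 1).
    """
    t = n + math.sqrt(n) * x
    # Sum e^{-t} t^k / k! for k = 0, ..., n-1, computed in log space.
    return sum(math.exp(-t + k * math.log(t) - math.lgamma(k + 1))
               for k in range(n))

n = 50  # assumed sample size for the illustration
for x in (0.5, 1.0, 2.0, 3.0):
    normal_tail = 1.0 - NormalDist().cdf(x)
    print(f"x={x}: exact={exact_tail(n, x):.5f}  normal={normal_tail:.5f}")
```

Because the exponential distribution is right-skewed, the exact tail exceeds the normal tail, and the discrepancy grows with x even though the central approximation is already quite good at n = 50.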

Local asymptotic normality is a generalization of the central limit theorem. It is a property of a sequence of statistical models that allows the sequence to be asymptotically approximated by a normal location model, after a rescaling of the parameter. An important example in which local asymptotic normality holds is the case of independent and identically distributed sampling from a regular parametric model; this is just the central limit theorem.

Barndorff-Nielsen & Cox provide a direct definition of asymptotic normality.^{[2]}