As the positive integer ${\textstyle n}$ becomes larger and larger, the value ${\textstyle n\times \sin \left({\tfrac {1}{n}}\right)}$ becomes arbitrarily close to ${\textstyle 1}$. We say that "the limit of the sequence ${\textstyle n\times \sin \left({\tfrac {1}{n}}\right)}$ equals ${\textstyle 1}$."
In mathematics, the limit of a sequence is the value that the terms of a sequence "tend to", and is often denoted using the $\lim$ symbol (e.g., $\lim _{n\to \infty }a_{n}$).^{[1]} If such a limit exists and is finite, the sequence is called convergent.^{[2]} A sequence that does not converge is said to be divergent.^{[3]} The limit of a sequence is said to be the fundamental notion on which the whole of mathematical analysis ultimately rests.^{[1]}
Grégoire de Saint-Vincent gave the first definition of limit (terminus) of a geometric series in his work Opus Geometricum (1647): "The terminus of a progression is the end of the series, which none progression can reach, even not if she is continued in infinity, but which she can approach nearer than a given segment."^{[4]}
Pietro Mengoli anticipated the modern idea of limit of a sequence with his study of quasi-proportions in Geometriae speciosae elementa (1659). He used the term quasi-infinite for unbounded and quasi-null for vanishing.
Newton dealt with series in his works on Analysis with infinite series (written in 1669, circulated in manuscript, published in 1711), Method of fluxions and infinite series (written in 1671, published in English translation in 1736, Latin original published much later) and Tractatus de Quadratura Curvarum (written in 1693, published in 1704 as an Appendix to his Opticks). In this last work, Newton considers the binomial expansion of ${\textstyle (x+o)^{n}}$, which he then linearizes by taking the limit as ${\textstyle o}$ tends to ${\textstyle 0}$.
In the 18th century, mathematicians such as Euler succeeded in summing some divergent series by stopping at the right moment; they did not much care whether a limit existed, as long as it could be calculated. At the end of the century, Lagrange in his Théorie des fonctions analytiques (1797) opined that the lack of rigour precluded further development in calculus. Gauss in his study of hypergeometric series (1813) for the first time rigorously investigated the conditions under which a series converged to a limit.
The modern definition of a limit (for any ${\textstyle \varepsilon }$ there exists an index ${\textstyle N}$ so that ...) was given by Bernard Bolzano (Der binomische Lehrsatz, Prague 1816, which was little noticed at the time), and by Karl Weierstrass in the 1870s.
In the real numbers, a number $L$ is the limit of the sequence $(x_{n})$, if the numbers in the sequence become closer and closer to $L$, and not to any other number.
If $x_{n}=c$ for constant ${\textstyle c}$, then $x_{n}\to c$.^{[proof 1]}^{[5]}
If $x_{n}={\frac {1}{n}}$, then $x_{n}\to 0$.^{[proof 2]}^{[5]}
If $x_{n}={\frac {1}{n}}$ when $n$ is even, and $x_{n}={\frac {1}{n^{2}}}$ when $n$ is odd, then $x_{n}\to 0$. (The fact that $x_{n+1}>x_{n}$ whenever $n$ is odd is irrelevant.)
Given any real number, one may easily construct a sequence that converges to that number by taking decimal approximations. For example, the sequence ${\textstyle 0.3,0.33,0.333,0.3333,\dots }$ converges to ${\textstyle {\frac {1}{3}}}$. The decimal representation ${\textstyle 0.3333\dots }$ is the limit of the previous sequence, defined by $0.3333\ldots :=\lim _{n\to \infty }\sum _{k=1}^{n}{\frac {3}{10^{k}}}$
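This convergence is easy to check numerically; a minimal Python sketch (the helper name `partial_sum` is ours, chosen for illustration):

```python
# Partial sums s_n = sum_{k=1}^{n} 3/10^k give the truncated decimals
# 0.3, 0.33, 0.333, ... and approach 1/3.
def partial_sum(n):
    return sum(3 / 10**k for k in range(1, n + 1))

for n in (1, 2, 4, 8):
    print(n, partial_sum(n), abs(partial_sum(n) - 1 / 3))
```

Each doubling of $n$ shrinks the gap to $1/3$ by several orders of magnitude, as the geometric tail $\sum_{k>n}3/10^{k}=10^{-n}/3$ predicts.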
Finding the limit of a sequence is not always obvious. Two examples are $\lim _{n\to \infty }\left(1+{\tfrac {1}{n}}\right)^{n}$ (the limit of which is the number e) and the arithmetic–geometric mean. The squeeze theorem is often useful in the establishment of such limits.
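The first of these limits can be explored numerically, though its slow convergence means floating-point experiments only suggest the value rather than prove it:

```python
import math

# (1 + 1/n)^n creeps toward e; the error shrinks roughly like e/(2n).
def a(n):
    return (1 + 1 / n) ** n

for n in (10, 1000, 100000):
    print(n, a(n), abs(a(n) - math.e))
```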
We call $x$ the limit of the sequence $(x_{n})$, which is written
$x_{n}\to x$, or
$\lim _{n\to \infty }x_{n}=x$,
if the following condition holds:
For each real number $\varepsilon >0$, there exists a natural number $N$ such that, for every natural number $n\geq N$, we have $|x_{n}-x|<\varepsilon$.^{[6]}
In other words, for every measure of closeness $\varepsilon$, the sequence's terms are eventually that close to the limit. The sequence $(x_{n})$ is said to converge to or tend to the limit $x$.
If a sequence $(x_{n})$ converges to some limit $x$, then it is convergent and $x$ is the only limit; otherwise $(x_{n})$ is divergent. A sequence that has zero as its limit is sometimes called a null sequence.
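The definition can be checked mechanically for a concrete sequence. A minimal Python sketch for the null sequence $x_{n}=1/n$ (the helper name `witness_N` is ours, not standard):

```python
import math

# For x_n = 1/n with limit 0: given eps > 0, any N > 1/eps works, since
# n >= N implies |1/n - 0| = 1/n <= 1/N < eps.  The +1 also guards
# against floating-point rounding in 1/eps.
def witness_N(eps):
    return math.ceil(1 / eps) + 1

for eps in (0.1, 0.01, 0.001):
    N = witness_N(eps)
    # spot-check the tail: every sampled term past N lies within eps of 0
    assert all(abs(1 / n - 0) < eps for n in range(N, N + 1000))
    print(eps, N)
```

Shrinking $\varepsilon$ forces a larger index $N$, which is exactly the quantifier order in the definition: $N$ may depend on $\varepsilon$.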
Example of a sequence which converges to the limit $a$.
No matter which $\varepsilon >0$ we choose, there is an index $N_{0}$ beyond which the sequence lies entirely within the epsilon tube $(a-\varepsilon ,a+\varepsilon )$.
For a smaller $\varepsilon _{1}>0$ there is likewise an index $N_{1}$ beyond which the sequence lies inside the epsilon tube $(a-\varepsilon _{1},a+\varepsilon _{1})$.
For each $\varepsilon >0$ there are only finitely many sequence members outside the epsilon tube.
Some other important properties of limits of real sequences include the following:
When it exists, the limit of a sequence is unique.^{[5]}
Limits of sequences behave well with respect to the usual arithmetic operations. If $\lim _{n\to \infty }a_{n}$ and $\lim _{n\to \infty }b_{n}$ exist, then
$\lim _{n\to \infty }(a_{n}\pm b_{n})=\lim _{n\to \infty }a_{n}\pm \lim _{n\to \infty }b_{n}$, $\lim _{n\to \infty }(a_{n}b_{n})=\left(\lim _{n\to \infty }a_{n}\right)\left(\lim _{n\to \infty }b_{n}\right)$, and, provided $\lim _{n\to \infty }b_{n}\neq 0$, $\lim _{n\to \infty }{\frac {a_{n}}{b_{n}}}={\frac {\lim _{n\to \infty }a_{n}}{\lim _{n\to \infty }b_{n}}}$.
For any continuous function ${\textstyle f}$, if $\lim _{n\to \infty }x_{n}$ exists, then $\lim _{n\to \infty }f\left(x_{n}\right)$ exists too. In fact, any real-valued function ${\textstyle f}$ is continuous if and only if it preserves the limits of sequences (though this is not necessarily true when using more general notions of continuity).
If $a_{n}\leq b_{n}$ for all $n$ greater than some $N$, then $\lim _{n\to \infty }a_{n}\leq \lim _{n\to \infty }b_{n}$.
(Squeeze theorem) If $a_{n}\leq c_{n}\leq b_{n}$ for all $n$ greater than some $N$, and $\lim _{n\to \infty }a_{n}=\lim _{n\to \infty }b_{n}=L$, then $\lim _{n\to \infty }c_{n}=L$.
A sequence is convergent if and only if every subsequence is convergent.
If every subsequence of a sequence has its own subsequence which converges to the same point, then the original sequence converges to that point.
These properties are extensively used to prove limits, without the need to directly use the cumbersome formal definition. For example, once it is proven that $1/n\to 0$, it becomes easy to show—using the properties above—that ${\frac {a}{b+{\frac {c}{n}}}}\to {\frac {a}{b}}$ (assuming that $b\neq 0$).
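As a numerical illustration (with arbitrary sample constants $a=2$, $b=3$, $c=5$ chosen for this sketch), the terms indeed settle at $a/b$:

```python
# a/(b + c/n) -> a/b as n grows, by the sum and quotient limit laws,
# since c/n -> 0.  The constants are arbitrary sample values.
a, b, c = 2.0, 3.0, 5.0

def x(n):
    return a / (b + c / n)

for n in (10, 1000, 100000):
    print(n, x(n), abs(x(n) - a / b))
```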
A sequence $(x_{n})$ is said to tend to infinity, written
$x_{n}\to \infty$, or
$\lim _{n\to \infty }x_{n}=\infty$,
if the following holds:
For every real number $K$, there is a natural number $N$ such that for every natural number $n\geq N$, we have $x_{n}>K$; that is, the sequence terms are eventually larger than any fixed $K$.
Similarly, we say a sequence tends to minus infinity, written
$x_{n}\to -\infty$, or
$\lim _{n\to \infty }x_{n}=-\infty$,
if the following holds:
For every real number $K$, there is a natural number $N$ such that for every natural number $n\geq N$, we have $x_{n}<K$; that is, the sequence terms are eventually smaller than any fixed $K$.
If a sequence tends to infinity or minus infinity, then it is divergent. However, a divergent sequence need not tend to plus or minus infinity, and the sequence $x_{n}=(-1)^{n}$ provides one such example.
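A small Python sketch (the helper name `witness_N` is ours) makes the definition of tending to infinity concrete for $x_{n}=n^{2}$:

```python
import math

# x_n = n^2 tends to infinity: for any bound K, every term from
# N = floor(sqrt(K)) + 1 onward exceeds K.  By contrast, x_n = (-1)^n
# stays bounded yet never settles, so it diverges without tending to
# plus or minus infinity.
def witness_N(K):
    return math.floor(math.sqrt(max(K, 0))) + 1

for K in (100, 10**6):
    N = witness_N(K)
    # spot-check: sampled terms past N all exceed the bound K
    assert all(n**2 > K for n in range(N, N + 1000))
    print(K, N)
```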
When it exists, the limit of a sequence is unique, as distinct points are separated by some positive distance, so for $\varepsilon$ less than half this distance, sequence terms cannot be within a distance $\varepsilon$ of both points.
For any continuous functionf, if $\lim _{n\to \infty }x_{n))$ exists, then $\lim _{n\to \infty }f(x_{n})=f\left(\lim _{n\to \infty }x_{n}\right)$. In fact, a functionf is continuous if and only if it preserves the limits of sequences.
A Cauchy sequence is a sequence whose terms ultimately become arbitrarily close together, after sufficiently many initial terms have been discarded. The notion of a Cauchy sequence is important in the study of sequences in metric spaces, and, in particular, in real analysis. One particularly important result in real analysis is the Cauchy criterion for convergence of sequences: a sequence of real numbers is convergent if and only if it is a Cauchy sequence. This remains true in other complete metric spaces.
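The contrast between a Cauchy and a non-Cauchy sequence of partial sums can be seen numerically; a sketch (the helper `tail_spread` is our own) comparing $\sum 1/k^{2}$ with the harmonic series:

```python
# Partial sums of 1/k^2 form a Cauchy sequence (they converge, to pi^2/6),
# whereas the harmonic partial sums sum(1/k) do not.
def tail_spread(terms, N, M):
    """Max |s_i - s_j| over N <= i, j <= M for running sums of `terms`."""
    s, sums = 0.0, []
    for k in range(1, M + 1):
        s += terms(k)
        if k >= N:
            sums.append(s)
    return max(sums) - min(sums)

print(tail_spread(lambda k: 1 / k**2, 1000, 2000))  # tiny: terms cluster
print(tail_spread(lambda k: 1 / k, 1000, 2000))     # ~log 2: no clustering
```

Pushing $N$ further out shrinks the first spread toward $0$ but leaves the second near $\log 2$, which is exactly the Cauchy criterion failing for the harmonic series.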
A point $x\in X$ of the topological space $(X,\tau )$ is a limit or limit point^{[7]}^{[8]} of the sequence $\left(x_{n}\right)_{n\in \mathbb {N} }$ if:
For every neighbourhood $U$ of $x$, there exists some $N\in \mathbb {N}$ such that for every $n\geq N$, we have $x_{n}\in U$.^{[9]}
This coincides with the definition given for metric spaces, if $(X,d)$ is a metric space and $\tau$ is the topology generated by $d$.
A limit of a sequence of points $\left(x_{n}\right)_{n\in \mathbb {N} }$ in a topological space $T$ is a special case of a limit of a function: the domain is $\mathbb {N}$ in the space $\mathbb {N} \cup \lbrace +\infty \rbrace$, with the induced topology of the affinely extended real number system, the range is $T$, and the function argument $n$ tends to $+\infty$, which in this space is a limit point of $\mathbb {N}$.
In a Hausdorff space, limits of sequences are unique whenever they exist. This need not be the case in non-Hausdorff spaces; in particular, if two points $x$ and $y$ are topologically indistinguishable, then any sequence that converges to $x$ must converge to $y$ and vice versa.
The definition of the limit using the hyperreal numbers formalizes the intuition that for a "very large" value of the index, the corresponding term is "very close" to the limit. More precisely, a real sequence $(x_{n})$ tends to L if for every infinite hypernatural ${\textstyle H}$, the term $x_{H}$ is infinitely close to ${\textstyle L}$ (i.e., the difference $x_{H}-L$ is infinitesimal). Equivalently, L is the standard part of $x_{H}$:
$L=\operatorname {st} (x_{H})$.
Thus, the limit can be defined by the formula
$\lim _{n\to \infty }x_{n}=\operatorname {st} (x_{H})$,
where the limit exists if and only if the right-hand side is independent of the choice of an infinite ${\textstyle H}$.
Sometimes one may also consider a sequence with more than one index, for example, a double sequence $(x_{n,m})$. This sequence has a limit $L$ if it becomes closer and closer to $L$ when both $n$ and $m$ become very large.
If $x_{n,m}=c$ for constant ${\textstyle c}$, then $x_{n,m}\to c$.
If $x_{n,m}={\frac {1}{n+m}}$, then $x_{n,m}\to 0$.
If $x_{n,m}={\frac {n}{n+m}}$, then the limit does not exist. Depending on the relative "growing speed" of ${\textstyle n}$ and ${\textstyle m}$, this sequence can get closer to any value between ${\textstyle 0}$ and ${\textstyle 1}$.
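This path dependence can be seen numerically; a brief Python sketch:

```python
# x_{n,m} = n/(n+m): along m = n the terms equal 1/2; along m = n^2 they
# approach 0; along n = m^2 they approach 1 -- so no double limit exists.
def x(n, m):
    return n / (n + m)

for n in (10, 1000, 100000):
    print(x(n, n), x(n, n**2), x(n**2, n))
```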
For each real number $\varepsilon >0$, there exists a natural number $N$ such that, for every pair of natural numbers $n,m\geq N$, we have $|x_{n,m}-x|<\varepsilon$.^{[10]}
In other words, for every measure of closeness $\varepsilon$, the sequence's terms are eventually that close to the limit. The sequence $(x_{n,m})$ is said to converge to or tend to the limit $x$.
The double limit is different from taking the limit in $n$ first, and then in $m$. The latter is known as an iterated limit. If both the double limit and the iterated limit exist, they have the same value. However, it is possible that one of them exists but the other does not.
For every real number $K$, there is a natural number $N$ such that for every pair of natural numbers $n,m\geq N$, we have $x_{n,m}>K$; that is, the sequence terms are eventually larger than any fixed $K$.
For every real number $K$, there is a natural number $N$ such that for every pair of natural numbers $n,m\geq N$, we have $x_{n,m}<K$; that is, the sequence terms are eventually smaller than any fixed $K$.
If a sequence tends to infinity or minus infinity, then it is divergent. However, a divergent sequence need not tend to plus or minus infinity, and the sequence $x_{n,m}=(-1)^{n+m}$ provides one such example.
For a double sequence $(x_{n,m})$, we may take the limit in one of the indices, say, $n\to \infty$, to obtain a single sequence $(y_{m})$. In fact, there are two possible meanings when taking this limit. The first one is called the pointwise limit, denoted $\lim _{n\to \infty }x_{n,m}=y_{m}$ pointwise, which means:
For each real number $\varepsilon >0$ and each fixed natural number $m$, there exists a natural number $N(\varepsilon ,m)>0$ such that, for every natural number $n\geq N$, we have $|x_{n,m}-y_{m}|<\varepsilon$.^{[11]}
The second one is called the uniform limit, which means:
For each real number $\varepsilon >0$, there exists a natural number $N(\varepsilon )>0$ such that, for every natural number $m$ and for every natural number $n\geq N$, we have $|x_{n,m}-y_{m}|<\varepsilon$.^{[11]}
In this definition, the choice of $N$ is independent of $m$. In other words, the choice of $N$ is uniformly applicable to all natural numbers $m$. Hence, one can easily see that uniform convergence is a stronger property than pointwise convergence: the existence of a uniform limit implies the existence and equality of the pointwise limit:
If $x_{n,m}\to y_{m))$ uniformly, then $x_{n,m}\to y_{m))$ pointwise.
When such a limit exists, we say the sequence $(x_{n,m})$ converges uniformly to $(y_{m})$.
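A standard example separating the two notions is $x_{n,m}=m/(n+m)$, sketched here in Python:

```python
# x_{n,m} = m/(n+m) tends to y_m = 0 pointwise in n for every fixed m,
# but not uniformly: whatever n is, choosing m much larger than n pushes
# the term close to 1, so no single N(eps) can serve all m at once.
def x(n, m):
    return m / (n + m)

n = 10**6
print(x(n, 5))      # small: pointwise convergence at the fixed index m = 5
print(x(n, 10**9))  # close to 1: uniform convergence fails
```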
For a double sequence $(x_{n,m})$, we may take the limit in one of the indices, say, $n\to \infty$, to obtain a single sequence $(y_{m})$, and then take the limit in the other index, namely $m\to \infty$, to get a number $y$. Symbolically, $\lim _{m\to \infty }\lim _{n\to \infty }x_{n,m}=y$.
A sufficient condition for equality is given by the Moore–Osgood theorem, which requires the limit $\lim _{n\to \infty }x_{n,m}=y_{m}$ to be uniform in ${\textstyle m}$.^{[10]}
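The hypothesis matters: for $x_{n,m}=n/(n+m)$ the inner limit is not uniform, and the two iterated limits disagree. A short sketch:

```python
# lim_n n/(n+m) = 1 for each fixed m, so lim_m lim_n = 1; but
# lim_m n/(n+m) = 0 for each fixed n, so lim_n lim_m = 0.  The two
# orders disagree, so the inner limit cannot be uniform (Moore-Osgood).
def x(n, m):
    return n / (n + m)

big = 10**8
print(x(big, 7))  # near 1: inner limit in n at the fixed index m = 7
print(x(7, big))  # near 0: inner limit in m at the fixed index n = 7
```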
^ Van Looy, H. (1984). "A chronology and historical analysis of the mathematical manuscripts of Gregorius a Sancto Vincentio (1584–1667)". Historia Mathematica. 11 (1): 57–75.
^ Zeidler, Eberhard (1995). Applied Functional Analysis: Main Principles and Their Applications (1st ed.). New York: Springer-Verlag. p. 29. ISBN 978-0-387-94422-7.
^ ^{a} ^{b} Zakon, Elias (2011). "Chapter 4. Function Limits and Continuity". Mathematical Analysis, Volume I. p. 223. ISBN 9781617386473.