In mathematics, differential refers to several related notions[1] derived from the early days of calculus and later put on a rigorous footing, such as infinitesimal differences and the derivatives of functions.[2]

The term is used in various branches of mathematics such as calculus, differential geometry, algebraic geometry and algebraic topology.

## Introduction

The term differential is used nonrigorously in calculus to refer to an infinitesimal ("infinitely small") change in some varying quantity. For example, if x is a variable, then a change in the value of x is often denoted Δx (pronounced delta x). The differential dx represents an infinitely small change in the variable x. The idea of an infinitely small or infinitely slow change is, intuitively, extremely useful, and there are a number of ways to make the notion mathematically precise.

Using calculus, it is possible to relate the infinitely small changes of various variables to each other mathematically using derivatives. If y is a function of x, then the differential dy of y is related to dx by the formula

${\displaystyle dy={\frac {dy}{dx}}\,dx,}$
where ${\displaystyle {\frac {dy}{dx}}}$ denotes the derivative of y with respect to x. This formula summarizes the intuitive idea that the derivative of y with respect to x is the limit of the ratio of differences Δy/Δx as Δx becomes infinitesimal.
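The limiting behaviour of the ratio of differences can be illustrated numerically; a minimal sketch (the function y = x², the point x = 3 and the step sizes are illustrative choices, not from the article):

```python
# Hypothetical example: for y = x**2 the derivative at x = 3 is 6, so the
# ratio of differences Δy/Δx (here `dq`) should approach 6 as Δx shrinks.

def f(x):
    return x * x

def difference_quotient(f, x, dx):
    """The ratio of differences (f(x + dx) - f(x)) / dx for a finite step dx."""
    return (f(x + dx) - f(x)) / dx

for dx in (0.1, 0.01, 0.001):
    dq = difference_quotient(f, 3.0, dx)
    print(dx, dq)   # dq moves closer to the derivative 6 as dx shrinks
```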

## History and usage

Infinitesimal quantities played a significant role in the development of calculus. Archimedes used them, although he did not regard arguments involving infinitesimals as rigorous.[3] Isaac Newton referred to them as fluxions. However, it was Gottfried Leibniz who coined the term differentials for infinitesimal quantities and introduced the notation for them which is still used today.

In Leibniz's notation, if x is a variable quantity, then dx denotes an infinitesimal change in the variable x. Thus, if y is a function of x, then the derivative of y with respect to x is often denoted dy/dx, which would otherwise be denoted ẏ (in the notation of Newton) or y′ (in the notation of Lagrange). The use of differentials in this form attracted much criticism, for instance in the famous pamphlet The Analyst by Bishop Berkeley. Nevertheless, the notation has remained popular because it suggests strongly the idea that the derivative of y at x is its instantaneous rate of change (the slope of the graph's tangent line), which may be obtained by taking the limit of the ratio Δy/Δx as Δx becomes arbitrarily small. Differentials are also compatible with dimensional analysis, where a differential such as dx has the same dimensions as the variable x.

Calculus evolved into a distinct branch of mathematics during the 17th century CE, although there were antecedents going back to antiquity. The presentations of Newton and Leibniz, for example, were marked by non-rigorous definitions of terms like differential, fluent and "infinitely small". While many of the arguments in Bishop Berkeley's 1734 The Analyst are theological in nature, modern mathematicians acknowledge the validity of his argument against "the Ghosts of departed Quantities"; however, the modern approaches do not have the same technical issues. Despite the lack of rigor, immense progress was made in the 17th and 18th centuries. In the 19th century, Cauchy and others gradually developed the epsilon-delta approach to continuity, limits and derivatives, giving a solid conceptual foundation for calculus.

In the 20th century, several new concepts in multivariable calculus and differential geometry, among other fields, came to encapsulate the intent of the old terms, especially differential; both differential and infinitesimal are now used with new, more rigorous meanings.

Differentials are also used in the notation for integrals because an integral can be regarded as an infinite sum of infinitesimal quantities: the area under a graph is obtained by subdividing the graph into infinitely thin strips and summing their areas. In an expression such as

${\displaystyle \int f(x)\,dx,}$
the integral sign (which is a modified long s) denotes the infinite sum, f(x) denotes the "height" of a thin strip, and the differential dx denotes its infinitely thin width.
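The strip picture can be made concrete with a finite number of strips; a small sketch (the integrand x², the interval [0, 1] and the strip count are illustrative assumptions, not from the article):

```python
# Hypothetical example: approximate the integral of f(x) = x**2 over [0, 1]
# (exact value 1/3) as a sum of n thin strips of height f(x) and width dx.

def riemann_sum(f, a, b, n):
    """Left-endpoint Riemann sum: n strips of width dx = (b - a) / n."""
    dx = (b - a) / n                     # the "infinitely thin" width, made finite
    return sum(f(a + i * dx) * dx for i in range(n))

approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 100_000)
# approx differs from 1/3 only by a small discretization error
```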

## Approaches

There are several approaches for making the notion of differentials mathematically precise.

1. Differentials as linear maps. This approach underlies the definition of the derivative and the exterior derivative in differential geometry.[4]
2. Differentials as nilpotent elements of commutative rings. This approach is popular in algebraic geometry.[5]
3. Differentials in smooth models of set theory. This approach is known as synthetic differential geometry or smooth infinitesimal analysis and is closely related to the algebraic geometric approach, except that ideas from topos theory are used to hide the mechanisms by which nilpotent infinitesimals are introduced.[6]
4. Differentials as infinitesimals in hyperreal number systems, which are extensions of the real numbers that contain invertible infinitesimals and infinitely large numbers. This is the approach of nonstandard analysis pioneered by Abraham Robinson.[7]

These approaches are very different from each other, but they have in common the idea of being quantitative, i.e., saying not just that a differential is infinitely small, but how small it is.

### Differentials as linear maps

There is a simple way to make precise sense of differentials by regarding them as linear maps, an approach first used on the real line. It can be used on ${\displaystyle \mathbb {R} }$, ${\displaystyle \mathbb {R} ^{n}}$, a Hilbert space, a Banach space, or more generally, a topological vector space. The case of the real line is the easiest to explain. This type of differential is also known as a covariant vector or cotangent vector, depending on context.

#### Differentials as linear maps on R

Suppose ${\displaystyle f(x)}$ is a real-valued function on ${\displaystyle \mathbb {R} }$. We can reinterpret the variable ${\displaystyle x}$ in ${\displaystyle f(x)}$ as being a function rather than a number, namely the identity map on the real line, which takes a real number ${\displaystyle p}$ to itself: ${\displaystyle x(p)=p}$. Then ${\displaystyle f(x)}$ is the composite of ${\displaystyle f}$ with ${\displaystyle x}$, whose value at ${\displaystyle p}$ is ${\displaystyle f(x(p))=f(p)}$. The differential ${\displaystyle \operatorname {d} f}$ (which of course depends on ${\displaystyle f}$) is then a function whose value at ${\displaystyle p}$ (usually denoted ${\displaystyle df_{p}}$) is not a number, but a linear map from ${\displaystyle \mathbb {R} }$ to ${\displaystyle \mathbb {R} }$. Since a linear map from ${\displaystyle \mathbb {R} }$ to ${\displaystyle \mathbb {R} }$ is given by a ${\displaystyle 1\times 1}$ matrix, it is essentially the same thing as a number, but the change in the point of view allows us to think of ${\displaystyle df_{p}}$ as an infinitesimal and compare it with the standard infinitesimal ${\displaystyle dx_{p}}$, which is again just the identity map from ${\displaystyle \mathbb {R} }$ to ${\displaystyle \mathbb {R} }$ (a ${\displaystyle 1\times 1}$ matrix with entry ${\displaystyle 1}$). The identity map has the property that if ${\displaystyle \varepsilon }$ is very small, then ${\displaystyle dx_{p}(\varepsilon )}$ is very small, which enables us to regard it as infinitesimal. The differential ${\displaystyle df_{p}}$ has the same property, because it is just a multiple of ${\displaystyle dx_{p}}$, and this multiple is the derivative ${\displaystyle f'(p)}$ by definition. We therefore obtain that ${\displaystyle df_{p}=f'(p)\,dx_{p}}$, and hence ${\displaystyle df=f'\,dx}$. Thus we recover the idea that ${\displaystyle f'}$ is the ratio of the differentials ${\displaystyle df}$ and ${\displaystyle dx}$.
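This construction can be mimicked in code; a minimal sketch in which f′(p) is approximated by a central difference (the sample function f(x) = x³ and the point p = 2 are hypothetical choices, not from the article):

```python
# df_p is the linear map eps -> f'(p) * eps: the "1 x 1 matrix" of the text,
# realized as multiplication by a numerically approximated derivative f'(p).

def differential_at(f, p, h=1e-6):
    """Return df_p as a linear map R -> R, using a central-difference f'(p)."""
    fprime = (f(p + h) - f(p - h)) / (2 * h)
    return lambda eps: fprime * eps

df_p = differential_at(lambda x: x ** 3, 2.0)   # f'(2) = 12 for f(x) = x**3
small = df_p(0.001)   # a small eps gives a small df_p(eps), as in the text
```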

This would just be a trick were it not for the fact that:

1. it captures the idea of the derivative of ${\displaystyle f}$ at ${\displaystyle p}$ as the best linear approximation to ${\displaystyle f}$ at ${\displaystyle p}$;
2. it has many generalizations.

#### Differentials as linear maps on Rn

If ${\displaystyle f}$ is a function from ${\displaystyle \mathbb {R} ^{n}}$ to ${\displaystyle \mathbb {R} }$, then we say that ${\displaystyle f}$ is differentiable[8] at ${\displaystyle p\in \mathbb {R} ^{n}}$ if there is a linear map ${\displaystyle df_{p}}$ from ${\displaystyle \mathbb {R} ^{n}}$ to ${\displaystyle \mathbb {R} }$ such that for any ${\displaystyle \varepsilon >0}$, there is a neighbourhood ${\displaystyle N}$ of ${\displaystyle p}$ such that for ${\displaystyle x\in N}$,

${\displaystyle \left|f(x)-f(p)-df_{p}(x-p)\right|<\varepsilon \left|x-p\right|.}$

We can now use the same trick as in the one-dimensional case and think of the expression ${\displaystyle f(x_{1},x_{2},\ldots ,x_{n})}$ as the composite of ${\displaystyle f}$ with the standard coordinates ${\displaystyle x_{1},x_{2},\ldots ,x_{n}}$ on ${\displaystyle \mathbb {R} ^{n}}$ (so that ${\displaystyle x_{j}(p)}$ is the ${\displaystyle j}$-th component of ${\displaystyle p\in \mathbb {R} ^{n}}$). Then the differentials ${\displaystyle \left(dx_{1}\right)_{p},\left(dx_{2}\right)_{p},\ldots ,\left(dx_{n}\right)_{p}}$ at a point ${\displaystyle p}$ form a basis for the vector space of linear maps from ${\displaystyle \mathbb {R} ^{n}}$ to ${\displaystyle \mathbb {R} }$ and therefore, if ${\displaystyle f}$ is differentiable at ${\displaystyle p}$, we can write ${\displaystyle \operatorname {d} f_{p}}$ as a linear combination of these basis elements:

${\displaystyle df_{p}=\sum _{j=1}^{n}D_{j}f(p)\,(dx_{j})_{p}.}$

The coefficients ${\displaystyle D_{j}f(p)}$ are (by definition) the partial derivatives of ${\displaystyle f}$ at ${\displaystyle p}$ with respect to ${\displaystyle x_{1},x_{2},\ldots ,x_{n}}$. Hence, if ${\displaystyle f}$ is differentiable on all of ${\displaystyle \mathbb {R} ^{n}}$, we can write, more concisely:

${\displaystyle \operatorname {d} f={\frac {\partial f}{\partial x_{1}}}\,dx_{1}+{\frac {\partial f}{\partial x_{2}}}\,dx_{2}+\cdots +{\frac {\partial f}{\partial x_{n}}}\,dx_{n}.}$

In the one-dimensional case this becomes

${\displaystyle df={\frac {df}{dx}}\,dx}$
as before.

This idea generalizes straightforwardly to functions from ${\displaystyle \mathbb {R} ^{n}}$ to ${\displaystyle \mathbb {R} ^{m}}$. Furthermore, it has the decisive advantage over other definitions of the derivative that it is invariant under changes of coordinates. This means that the same idea can be used to define the differential of smooth maps between smooth manifolds.
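The expansion of df_p in the basis differentials can be sketched numerically as well; a minimal illustration (f(x, y) = x·y, the point (2, 3) and the forward-difference step are hypothetical choices):

```python
# Hypothetical example: df_p on R^n as the linear combination
# sum_j D_j f(p) (dx_j)_p, with the partial derivatives D_j f(p)
# approximated by forward differences.

def differential_at(f, p, h=1e-6):
    """Return df_p : R^n -> R, v -> sum_j D_j f(p) * v[j]."""
    partials = []
    for j in range(len(p)):
        shifted = list(p)
        shifted[j] += h
        partials.append((f(shifted) - f(p)) / h)   # approximates D_j f(p)
    return lambda v: sum(d * vj for d, vj in zip(partials, v))

# For f(x, y) = x*y at p = (2, 3) the text's formula gives df_p = 3 dx + 2 dy
df_p = differential_at(lambda q: q[0] * q[1], [2.0, 3.0])
```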

Aside: Note that the existence of all the partial derivatives of ${\displaystyle f(x)}$ at ${\displaystyle x}$ is a necessary condition for the existence of a differential at ${\displaystyle x}$. However, it is not a sufficient condition. For counterexamples, see Gateaux derivative.

#### Differentials as linear maps on a vector space

The same procedure works on a vector space with enough additional structure to reasonably talk about continuity. The most concrete case is a Hilbert space, also known as a complete inner product space, where the inner product and its associated norm define a suitable concept of distance. The same procedure works for a Banach space, also known as a complete normed vector space. However, for a more general topological vector space, some of the details are more abstract because there is no concept of distance.

In the important finite-dimensional case, any inner product space is a Hilbert space, any normed vector space is a Banach space, and any topological vector space is complete. As a result, one can define a coordinate system from an arbitrary basis and use the same technique as for ${\displaystyle \mathbb {R} ^{n}}$.

### Differentials as germs of functions

This approach works on any differentiable manifold. If

1. U and V are open sets containing p
2. ${\displaystyle f\colon U\to \mathbb {R} }$ is continuous
3. ${\displaystyle g\colon V\to \mathbb {R} }$ is continuous

then f is equivalent to g at p, denoted ${\displaystyle f\sim _{p}g}$, if and only if there is an open ${\displaystyle W\subseteq U\cap V}$ containing p such that ${\displaystyle f(x)=g(x)}$ for every x in W. The germ of f at p, denoted ${\displaystyle [f]_{p}}$, is the set of all real continuous functions equivalent to f at p; if f is smooth at p then ${\displaystyle [f]_{p}}$ is a smooth germ. If

1. ${\displaystyle U_{1}}$, ${\displaystyle U_{2}}$, ${\displaystyle V_{1}}$ and ${\displaystyle V_{2}}$ are open sets containing p
2. ${\displaystyle f_{1}\colon U_{1}\to \mathbb {R} }$, ${\displaystyle f_{2}\colon U_{2}\to \mathbb {R} }$, ${\displaystyle g_{1}\colon V_{1}\to \mathbb {R} }$ and ${\displaystyle g_{2}\colon V_{2}\to \mathbb {R} }$ are smooth functions
3. ${\displaystyle f_{1}\sim _{p}g_{1}}$
4. ${\displaystyle f_{2}\sim _{p}g_{2}}$
5. r is a real number

then

1. ${\displaystyle r*f_{1}\sim _{p}r*g_{1}}$
2. ${\displaystyle f_{1}+f_{2}\colon U_{1}\cap U_{2}\to \mathbb {R} \sim _{p}g_{1}+g_{2}\colon V_{1}\cap V_{2}\to \mathbb {R} }$
3. ${\displaystyle f_{1}*f_{2}\colon U_{1}\cap U_{2}\to \mathbb {R} \sim _{p}g_{1}*g_{2}\colon V_{1}\cap V_{2}\to \mathbb {R} }$

This shows that the germs at p form an algebra.

Define ${\displaystyle {\mathcal {I}}_{p}}$ to be the set of all smooth germs vanishing at p and ${\displaystyle {\mathcal {I}}_{p}^{2}}$ to be the product of ideals ${\displaystyle {\mathcal {I}}_{p}{\mathcal {I}}_{p}}$. Then a differential at p (cotangent vector at p) is an element of ${\displaystyle {\mathcal {I}}_{p}/{\mathcal {I}}_{p}^{2}}$. The differential of a smooth function f at p, denoted ${\displaystyle \mathrm {d} f_{p}}$, is the image of ${\displaystyle [f-f(p)]_{p}}$ in ${\displaystyle {\mathcal {I}}_{p}/{\mathcal {I}}_{p}^{2}}$.

A similar approach is to define differential equivalence of first order in terms of derivatives in an arbitrary coordinate patch. Then the differential of f at p is the set of all functions differentially equivalent to ${\displaystyle f-f(p)}$ at p.

### Algebraic geometry

In algebraic geometry, differentials and other infinitesimal notions are handled in a very explicit way by accepting that the coordinate ring or structure sheaf of a space may contain nilpotent elements. The simplest example is the ring of dual numbers R[ε], where ε² = 0.

This can be motivated by the algebro-geometric point of view on the derivative of a function f from R to R at a point p. For this, note first that f − f(p) belongs to the ideal Ip of functions on R which vanish at p. If the derivative of f vanishes at p, then f − f(p) belongs to the square Ip² of this ideal. Hence the derivative of f at p may be captured by the equivalence class [f − f(p)] in the quotient space Ip/Ip², and the 1-jet of f (which encodes its value and its first derivative) is the equivalence class of f in the space of all functions modulo Ip². Algebraic geometers regard this equivalence class as the restriction of f to a thickened version of the point p whose coordinate ring is not R (which is the quotient space of functions on R modulo Ip) but R[ε], which is the quotient space of functions on R modulo Ip². Such a thickened point is a simple example of a scheme.[5]
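The arithmetic of R[ε] can be sketched directly; a minimal illustration (the `Dual` class and the sample function f(x) = x³ are hypothetical, and this is forward-mode automatic differentiation in disguise rather than algebraic geometry proper):

```python
# Arithmetic in the dual numbers R[eps] with eps**2 = 0. Writing a + b*eps
# as the pair (a, b), the eps-coefficient of f(p + eps) recovers f'(p).

class Dual:
    """A dual number a + b*eps, with eps**2 = 0."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 = 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

    __rmul__ = __mul__

# f(x) = x**3 evaluated at 2 + eps: value f(2) = 8, eps-coefficient f'(2) = 12
x = Dual(2.0, 1.0)
y = x * x * x
```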

#### Algebraic geometry notions

Differentials also give rise to several significant notions in algebraic geometry.

### Synthetic differential geometry

A fifth approach to infinitesimals is the method of synthetic differential geometry[9] or smooth infinitesimal analysis.[10] This is closely related to the algebraic-geometric approach, except that the infinitesimals are more implicit and intuitive. The main idea of this approach is to replace the category of sets with another category of smoothly varying sets which is a topos. In this category, one can define the real numbers, smooth functions, and so on, but the real numbers automatically contain nilpotent infinitesimals, so these do not need to be introduced by hand as in the algebraic-geometric approach. However, the logic in this new category is not identical to the familiar logic of the category of sets: in particular, the law of the excluded middle does not hold. This means that set-theoretic mathematical arguments only extend to smooth infinitesimal analysis if they are constructive (e.g., do not use proof by contradiction). Some[who?] regard this disadvantage as a positive thing, since it forces one to find constructive arguments wherever they are available.

### Nonstandard analysis

The final approach to infinitesimals again involves extending the real numbers, but in a less drastic way. In the nonstandard analysis approach there are no nilpotent infinitesimals, only invertible ones, which may be viewed as the reciprocals of infinitely large numbers.[7] Such extensions of the real numbers may be constructed explicitly using equivalence classes of sequences of real numbers, so that, for example, the sequence (1, 1/2, 1/3, ..., 1/n, ...) represents an infinitesimal. The first-order logic of this new set of hyperreal numbers is the same as the logic for the usual real numbers, but the completeness axiom (which involves second-order logic) does not hold. Nevertheless, this suffices to develop an elementary and quite intuitive approach to calculus using infinitesimals; see transfer principle.
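The sequence picture can be sketched very roughly in code (a toy check only: it inspects finitely many terms and omits the ultrafilter quotient that the actual hyperreal construction requires):

```python
# The sequence (1, 1/2, 1/3, ...) is "eventually smaller" than every
# positive real, which is the sense in which it represents an infinitesimal,
# yet none of its terms is actually zero.

def eventually_less(seq, r, terms=10000):
    """True if seq(n) < r from some index onward (checked up to `terms`)."""
    tail_start = next((n for n in range(1, terms) if seq(n) < r), None)
    if tail_start is None:
        return False
    return all(seq(n) < r for n in range(tail_start, terms))

infinitesimal = lambda n: 1.0 / n
checks = [eventually_less(infinitesimal, r) for r in (0.5, 0.01, 0.001)]
# every check succeeds: the sequence dips below each positive threshold tried
```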

## Differential geometry

The notion of a differential motivates several concepts in differential geometry (and differential topology).

## Other meanings

The term differential has also been adopted in homological algebra and algebraic topology, because of the role the exterior derivative plays in de Rham cohomology: in a cochain complex ${\displaystyle (C_{\bullet },d_{\bullet }),}$ the maps ${\displaystyle d_{i}}$ (or coboundary operators) are often called differentials. Dually, the boundary operators in a chain complex are sometimes called codifferentials.

The properties of the differential also motivate the algebraic notions of a derivation and a differential algebra.

## Citations

1. ^ "Differential". Wolfram MathWorld. Retrieved February 24, 2022. The word differential has several related meaning in mathematics. In the most common context, it means "related to derivatives." So, for example, the portion of calculus dealing with taking derivatives (i.e., differentiation), is known as differential calculus. The word "differential" also has a more technical meaning in the theory of differential k-forms as a so-called one-form.
2. ^ "differential - Definition of differential in US English by Oxford Dictionaries". Oxford Dictionaries - English. Archived from the original on January 3, 2014. Retrieved 13 April 2018.
3. ^
4. ^
5. ^ a b
6. ^
7. ^ a b See Robinson 1996 and Keisler 1986.
8. ^ See, for instance, Apostol 1967.
9. ^ See Kock 2006 and Lawvere 1968.
10. ^