Reversed orientation corresponds to negating the exterior product.
Geometric interpretation of grade n elements in a real exterior algebra for n = 0 (signed point), 1 (directed line segment, or vector), 2 (oriented plane element), 3 (oriented volume). The exterior product of n vectors can be visualized as any n-dimensional shape (e.g. n-parallelotope, n-ellipsoid); with magnitude (hypervolume), and orientation defined by that of its (n − 1)-dimensional boundary and on which side the interior is.[1][2]
In mathematics, the exterior algebra, or Grassmann algebra, named after Hermann Grassmann,[3] is an algebra that uses the exterior product or wedge product as its multiplication. The exterior product of vectors is an algebraic construction used in geometry to study areas, volumes, and their higher-dimensional analogues. The exterior product of two vectors $u$ and $v$, denoted by $u \wedge v$, is called a bivector and lives in a space called the exterior square, a vector space that is distinct from the original space of vectors. The magnitude[4] of $u \wedge v$ can be interpreted as the area of the parallelogram with sides $u$ and $v$, which in three dimensions can also be computed using the cross product of the two vectors. More generally, all parallel plane surfaces with the same orientation and area have the same bivector as a measure of their oriented area. Like the cross product, the exterior product is anticommutative, meaning that $u \wedge v = -(v \wedge u)$ for all vectors $u$ and $v$, but, unlike the cross product, the exterior product is associative (after introducing the exterior cube, that is, oriented volume).
When regarded in this manner, the exterior product of two vectors is called a 2-blade. More generally, the exterior product of any number $k$ of vectors can be defined and is sometimes called a $k$-blade (or called decomposable, or simple, by some authors). It lives in a space known as the $k$-th exterior power (generalizing the exterior square and exterior cube). Blades are basic objects in projective geometry, where no measure of length or angle is assumed and the underlying structure is linearity alone. If a Euclidean inner product is given for the vectors, the magnitude (a scalar) of the resulting $k$-blade is the hypervolume of the $k$-dimensional parallelotope whose edges are the given vectors, just as the magnitude of the scalar triple product of vectors in three dimensions gives the volume of the parallelepiped generated by those vectors.
The exterior algebra provides an algebraic setting in which to answer geometric questions of this type. For instance, blades have a concrete geometric interpretation, and objects in the exterior algebra can be manipulated according to a set of unambiguous rules. The exterior algebra contains objects that are not only $k$-blades but also sums of $k$-blades; such a sum is called a k-vector.[5] Combining $k$-blades in a linear structure, by addition and scalar multiplication, is the core of projective geometry (see Plücker coordinates). $k$-vectors are somewhat similar to homogeneous polynomials, except that they multiply skew-commutatively (anticommutatively); naturally, $k$ is called the degree of the $k$-vector.
To any $k$-vector several associated objects can be attached: its rank is defined to be the smallest number of simple ($k$-blade) summands of which it is a sum; its support is defined as the minimal subspace in which the $k$-vector lives; and its divisor space (some authors use other names, such as kernel or factor space) consists of the set of vectors that factor out of it.
Once defined for two vectors (in a bilinear way), the exterior product extends to the full exterior algebra, so that it makes sense to multiply any two elements of the algebra. Equipped with this product, the exterior algebra is an associative algebra, which means that $\alpha \wedge (\beta \wedge \gamma) = (\alpha \wedge \beta) \wedge \gamma$ for any elements $\alpha, \beta, \gamma$. As noted, the $k$-vectors behave much like homogeneous polynomials of degree $k$: when elements of different degrees are multiplied, their degrees add to give the degree of the product. This means that the exterior algebra is a graded algebra.
The exterior algebra emerged along two paths: as an algebra of abstract skew-commuting (anticommuting) objects in a vector-space setting, and as an algebra of antisymmetric (also called alternating) tensors. This twofold approach works in almost all cases, but one exception has to be singled out: vector spaces over a field of characteristic 2. Hence, whenever antisymmetric/alternating tensors are considered in connection with the exterior algebra, the base field is assumed to have characteristic 0 or odd characteristic, but not 2. Note also that the exterior algebra has two basic ingredients: the vector spaces involved, and the exterior product. On the first, abstract path, both ingredients are clearly (if abstractly) specified, and this is its main strength. On the second path, the vector space is concrete and less abstract, but the exterior product can be defined in several (equivalent) ways, and care is needed to avoid mistakes.
The definition of the exterior algebra makes sense not just for spaces of geometric vectors, but for other vector-like objects (possibly infinite dimensional) such as vector fields or functions. Moreover, the base field need not be numeric, such as the real or complex numbers, but may be another (less usual) field, finite or not, of zero or positive characteristic. In full generality, the exterior algebra can be defined for modules over a commutative ring, and for other structures of interest in abstract algebra. It is in one of these more general constructions that the exterior algebra finds one of its most important applications, where it appears as the algebra of differential forms that is fundamental in areas that use differential geometry. The exterior algebra also has many algebraic properties that make it a convenient tool in algebra itself. The association of the exterior algebra to a vector space is a type of functor on vector spaces, which means that it is compatible in a certain way with linear transformations of vector spaces. The exterior algebra is one example of a bialgebra, meaning that its dual space also possesses a product, and this dual product is compatible with the exterior product. This dual algebra is precisely the algebra of alternating multilinear forms, and the pairing between the exterior algebra and its dual is given by the interior product.
Motivating examples
The first two examples assume a metric tensor field and an orientation; the third example does not assume either.
Areas in the plane
The area of a parallelogram in terms of the determinant of the matrix of coordinates of two of its vertices.
Suppose that
$$\mathbf{v} = a\,\mathbf{e}_1 + b\,\mathbf{e}_2, \qquad \mathbf{w} = c\,\mathbf{e}_1 + d\,\mathbf{e}_2$$
are a pair of given vectors in $\mathbf{R}^2$, written in components with respect to the standard basis $(\mathbf{e}_1, \mathbf{e}_2)$. There is a unique parallelogram having $\mathbf{v}$ and $\mathbf{w}$ as two of its sides. The area of this parallelogram is given by the standard determinant formula:
$$A = \left|\det\begin{bmatrix}\mathbf{v} & \mathbf{w}\end{bmatrix}\right| = |ad - bc|.$$
Consider now the exterior product of $\mathbf{v}$ and $\mathbf{w}$:
$$\mathbf{v} \wedge \mathbf{w} = (a\,\mathbf{e}_1 + b\,\mathbf{e}_2) \wedge (c\,\mathbf{e}_1 + d\,\mathbf{e}_2) = ad\,\mathbf{e}_1 \wedge \mathbf{e}_2 + bc\,\mathbf{e}_2 \wedge \mathbf{e}_1 = (ad - bc)\,\mathbf{e}_1 \wedge \mathbf{e}_2,$$
where the first step uses the distributive law for the exterior product, and the last uses the fact that the exterior product is alternating, and in particular $\mathbf{e}_1 \wedge \mathbf{e}_1 = \mathbf{e}_2 \wedge \mathbf{e}_2 = 0$. (The fact that the exterior product is alternating also forces $\mathbf{e}_2 \wedge \mathbf{e}_1 = -\,\mathbf{e}_1 \wedge \mathbf{e}_2$.) Note that the coefficient in this last expression is precisely the determinant of the matrix [v w]. The fact that this may be positive or negative has the intuitive meaning that v and w may be oriented in a counterclockwise or clockwise sense as the vertices of the parallelogram they define. Such an area is called the signed area of the parallelogram: the absolute value of the signed area is the ordinary area, and the sign determines its orientation.
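As a minimal numerical illustration (the specific vectors below are arbitrary choices, not taken from the text), the coefficient of $\mathbf{e}_1 \wedge \mathbf{e}_2$ can be compared directly with the $2 \times 2$ determinant:

```python
# Hypothetical example vectors; the wedge coefficient ad - bc should match det([v w]).
import numpy as np

v = np.array([3.0, 1.0])   # v = a e1 + b e2
w = np.array([1.0, 2.0])   # w = c e1 + d e2

signed_area = v[0] * w[1] - v[1] * w[0]          # ad - bc, the coefficient of e1∧e2
det = np.linalg.det(np.column_stack([v, w]))     # determinant of the matrix [v w]
print(signed_area, det)                          # both 5.0 (up to rounding)
```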
The fact that this coefficient is the signed area is not an accident. In fact, it is relatively easy to see that the exterior product should be related to the signed area if one tries to axiomatize this area as an algebraic construct. In detail, if A(v, w) denotes the signed area of the parallelogram of which the pair of vectors v and w form two adjacent sides, then A must satisfy the following properties:
A(rv, sw) = rsA(v, w) for any real numbers r and s, since rescaling either of the sides rescales the area by the same amount (and reversing the direction of one of the sides reverses the orientation of the parallelogram).
A(v, v) = 0, since the area of the degenerate parallelogram determined by v (i.e., a line segment) is zero.
A(w, v) = −A(v, w), since interchanging the roles of v and w reverses the orientation of the parallelogram.
A(v + rw, w) = A(v, w) for any real number r, since adding a multiple of w to v affects neither the base nor the height of the parallelogram and consequently preserves its area.
A(e1, e2) = 1, since the area of the unit square is one.
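For illustration (assuming, as for the exterior product itself, that $A$ is additionally bilinear), these properties already force the determinant formula:
$$A(a\mathbf{e}_1 + b\mathbf{e}_2,\; c\mathbf{e}_1 + d\mathbf{e}_2) = ac\,A(\mathbf{e}_1,\mathbf{e}_1) + ad\,A(\mathbf{e}_1,\mathbf{e}_2) + bc\,A(\mathbf{e}_2,\mathbf{e}_1) + bd\,A(\mathbf{e}_2,\mathbf{e}_2) = ad - bc.$$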
The cross product (blue vector) in relation to the exterior product (light blue parallelogram). The length of the cross product is to the length of the parallel unit vector (red) as the size of the exterior product is to the size of the reference parallelogram (light red).
With the exception of the last property, the exterior product of two vectors satisfies the same properties as the area. In a certain sense, the exterior product generalizes the final property by allowing the area of a parallelogram to be compared to that of any chosen parallelogram in a parallel plane (here, the one with sides e1 and e2). In other words, the exterior product provides a basis-independent formulation of area.[6]
Oriented areas in space
The power of the linearity of the cross product and, implicitly, of the exterior product is seen in a simple but not obvious geometric property: the total signed (oriented) area of the boundary of a non-self-overlapping polyhedron is zero. With the exterior product this is elementary linear algebra, whereas it requires quite a lot of work using metric geometry alone. For a tetrahedron with concurrent edges along vectors $u$, $v$, $w$, one has
$$u \wedge v + v \wedge w + w \wedge u + (w - u) \wedge (v - u) = 0.$$
Each blade in the sum above is (twice) the oriented surface area of one of the four faces, oriented consistently (either all inward or all outward). Iterating this property over all the tetrahedra that build up a non-self-overlapping polyhedron, we get the general statement.
Oriented areas in affine space
The natural setting for (oriented) area and the exterior algebra is, actually, an affine space. This is also the source of the intimate connection between the exterior algebra and differential forms, since to integrate we need a 'differential' object measuring 'infinitesimal' area in a higher-dimensional space. If $\mathbb{A}$ is an affine space over the vector space $V$, and $(P_0, P_1, \dots, P_k)$ is a (simplex) collection of ordered points, we can define its oriented area (up to a factor, the same for every $k$-dimensional simplex) as the exterior product of vectors $(P_1 - P_0) \wedge (P_2 - P_0) \wedge \cdots \wedge (P_k - P_0)$; if one changes the order of the points, that is, the orientation of the simplex, the oriented area changes by a sign, according to the parity of the permutation. This oriented area has interesting properties, of which we mention only one here: the total (oriented) boundary area of a simplex is zero, just as for the tetrahedron in the previous section. Note that this also holds if the simplex is 'flat', that is, contained in a $(k-1)$-dimensional affine subspace (as for 4 points in a plane).
Cross and triple products
For vectors $u = u_1 e_1 + u_2 e_2 + u_3 e_3$ and $v = v_1 e_1 + v_2 e_2 + v_3 e_3$ in $\mathbf{R}^3$, the exterior product is
$$u \wedge v = (u_1 v_2 - u_2 v_1)\, e_1 \wedge e_2 + (u_2 v_3 - u_3 v_2)\, e_2 \wedge e_3 + (u_3 v_1 - u_1 v_3)\, e_3 \wedge e_1,$$
where $(e_1 \wedge e_2, e_2 \wedge e_3, e_3 \wedge e_1)$ is a basis for the three-dimensional space $\Lambda^2(\mathbf{R}^3)$. The coefficients above are the same as those in the usual definition of the cross product of vectors in three dimensions with a given orientation, the only differences being that the exterior product is not an ordinary vector, but instead is a 2-vector, and that the exterior product does not depend on the choice of orientation.
Bringing in a third vector $w = w_1 e_1 + w_2 e_2 + w_3 e_3$, the exterior product of three vectors is
$$u \wedge v \wedge w = (u_1 v_2 w_3 + u_2 v_3 w_1 + u_3 v_1 w_2 - u_1 v_3 w_2 - u_2 v_1 w_3 - u_3 v_2 w_1)\, e_1 \wedge e_2 \wedge e_3,$$
where $e_1 \wedge e_2 \wedge e_3$ is the basis vector for the one-dimensional space $\Lambda^3(\mathbf{R}^3)$. The scalar coefficient is the triple product of the three vectors.
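As a small numerical check (the vectors are arbitrary, chosen only for illustration), the coefficient above agrees with both the scalar triple product and the $3 \times 3$ determinant:

```python
# Illustrative check: the coefficient of e1∧e2∧e3 in u∧v∧w equals u · (v × w) = det([u v w]).
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
w = np.array([2.0, 0.0, 1.0])

triple = np.dot(u, np.cross(v, w))               # scalar triple product u · (v × w)
det = np.linalg.det(np.column_stack([u, v, w]))  # determinant of the 3x3 matrix [u v w]
print(triple, det)                               # both 13.0 (up to rounding)
```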
The cross product and triple product in a three dimensional Euclidean vector space each admit both geometric and algebraic interpretations, via Hodge star duality. The cross product u × v can be interpreted as a vector which is perpendicular to both u and v and whose magnitude is equal to the area of the parallelogram determined by the two vectors. It can also be interpreted as the vector consisting of the minors of the matrix with columns u and v. The triple product of u, v, and w is a signed scalar representing a geometric oriented volume. Algebraically, it is the determinant of the matrix with columns u, v, and w. The exterior product in three dimensions allows for similar interpretations: it, too, can be identified with oriented length, areas, volumes, etc., that are spanned by one, two or more vectors. The exterior product generalizes these geometric notions to all vector spaces and to any number of dimensions, even in the absence of a scalar product.
Formal definition
The exterior algebra $\Lambda(V)$ of a vector space V over a field K is defined as the quotient algebra of the tensor algebra T(V) by the two-sided ideal I generated by all elements of the form x ⊗ x for x ∈ V (i.e. all tensors that can be expressed as the tensor product of a vector in V by itself).[7] The ideal I contains the ideal J generated by elements of the form $x \otimes y + y \otimes x$, and these ideals coincide if $\operatorname{char}(K) \neq 2$. If $\operatorname{char}(K) = 2$, these ideals are different (except for the zero vector space), and this makes the difference between the symmetric and the exterior algebra, in spite of the latter being a quotient of the former (corresponding to the inclusion J ⊆ I): the symmetric algebra is 'big', like the vector space of polynomials, while the exterior algebra is 'small', of dimension $2^{\dim V}$ (see the discussion of basis and dimension below).
So, being a quotient of the tensor algebra by a two-sided ideal,
$$\Lambda(V) := T(V)/I$$
is an associative algebra. Its multiplication is called the exterior product, and denoted ∧. This means that the product of $\Lambda(V)$ is induced by the tensor product ⊗ of T(V).
As $T^0(V) = K$ and $T^1(V) = V$, the inclusions of K and V in T(V) induce injections of K and V into $\Lambda(V)$. These injections are commonly considered as inclusions, and called natural embeddings, natural injections or natural inclusions. The word canonical is also commonly used in place of natural.
Alternating product
The exterior product is by construction alternating on elements of $V$, which means that $x \wedge x = 0$ for all $x \in V$, by the above construction. It follows that the product is also anticommutative on elements of $V$, for supposing that $x, y \in V$,
$$0 = (x + y) \wedge (x + y) = x \wedge x + x \wedge y + y \wedge x + y \wedge y = x \wedge y + y \wedge x,$$
hence
$$x \wedge y = -(y \wedge x).$$
More generally, if σ is a permutation of the integers {1, ..., k}, and x1, x2, ..., xk are elements of V, it follows that
$$x_{\sigma(1)} \wedge x_{\sigma(2)} \wedge \cdots \wedge x_{\sigma(k)} = \operatorname{sgn}(\sigma)\; x_1 \wedge x_2 \wedge \cdots \wedge x_k.$$
In particular, if xi = xj for some i ≠ j, then the following generalization of the alternating property also holds:
$$x_1 \wedge x_2 \wedge \cdots \wedge x_k = 0.$$
Together with the distributive property of the exterior product, one further generalization is that $\{x_1, x_2, \ldots, x_k\}$ is a linearly dependent set of vectors if and only if
$$x_1 \wedge x_2 \wedge \cdots \wedge x_k = 0.$$
Exterior power
The kth exterior power of V, denoted $\Lambda^k(V)$, is the vector subspace of $\Lambda(V)$ spanned by elements of the form
$$x_1 \wedge x_2 \wedge \cdots \wedge x_k, \qquad x_i \in V,\ i = 1, 2, \ldots, k.$$
If $\alpha \in \Lambda^k(V)$, then α is said to be a k-vector. If, furthermore, α can be expressed as an exterior product of k elements of V, then α is said to be decomposable (or simple, by some authors; or a blade, by others). Although decomposable k-vectors span $\Lambda^k(V)$, not every element of $\Lambda^k(V)$ is decomposable. For example, in $\mathbf{R}^4$, the following 2-vector is not decomposable:
$$\alpha = e_1 \wedge e_2 + e_3 \wedge e_4.$$
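One standard way to verify this (added here as an illustration; it uses only the alternating and graded-commutativity rules, and assumes characteristic ≠ 2): a decomposable 2-vector $\beta = u \wedge v$ always satisfies $\beta \wedge \beta = 0$, whereas
$$\alpha \wedge \alpha = (e_1 \wedge e_2 + e_3 \wedge e_4) \wedge (e_1 \wedge e_2 + e_3 \wedge e_4) = 2\, e_1 \wedge e_2 \wedge e_3 \wedge e_4 \neq 0.$$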
If the dimension of V is n and { e1, …, en } is a basis for V, then the set
$$\{\, e_{i_1} \wedge e_{i_2} \wedge \cdots \wedge e_{i_k} \mid 1 \le i_1 < i_2 < \cdots < i_k \le n \,\}$$
is a basis for $\Lambda^k(V)$. The reason is the following: given any exterior product of the form
$$v_1 \wedge v_2 \wedge \cdots \wedge v_k,$$
every vector vj can be written as a linear combination of the basis vectors ei; using the bilinearity of the exterior product, this can be expanded to a linear combination of exterior products of those basis vectors. Any exterior product in which the same basis vector appears more than once is zero; any exterior product in which the basis vectors do not appear in the proper order can be reordered, changing the sign whenever two basis vectors change places. In general, the resulting coefficients of the basis k-vectors can be computed as the minors of the matrix that describes the vectors vj in terms of the basis ei.
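A small computational sketch of this last remark (the helper name and the example vectors are illustrative, not from the text): the coefficient of $e_{i_1} \wedge \cdots \wedge e_{i_k}$ is the $k \times k$ minor of the matrix whose columns are the $v_j$, taken from rows $i_1 < \cdots < i_k$.

```python
from itertools import combinations
import numpy as np

def wedge_coefficients(vectors):
    """Return {(i1,...,ik): minor} for the k-vector v1∧...∧vk in R^n (0-based row indices)."""
    M = np.column_stack(vectors)          # n x k matrix whose columns are the v_j
    n, k = M.shape
    return {rows: np.linalg.det(M[list(rows), :])
            for rows in combinations(range(n), k)}

v1 = np.array([1.0, 0.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 3.0])
print(wedge_coefficients([v1, v2]))
# e.g. the coefficient of e1∧e2 (rows 0,1) is det [[1,0],[0,1]] = 1,
#      the coefficient of e3∧e4 (rows 2,3) is det [[2,0],[0,3]] = 6.
```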
By counting the basis elements, the dimension of $\Lambda^k(V)$ is equal to a binomial coefficient:
$$\dim \Lambda^k(V) = \binom{n}{k},$$
where n is the dimension of the vectors, and k is the number of vectors in the product. The binomial coefficient produces the correct result, even for exceptional cases; in particular, $\Lambda^k(V) = \{0\}$ for k > n.
Any element of the exterior algebra can be written as a sum of k-vectors. Hence, as a vector space the exterior algebra is a direct sum
$$\Lambda(V) = \Lambda^0(V) \oplus \Lambda^1(V) \oplus \Lambda^2(V) \oplus \cdots \oplus \Lambda^n(V)$$
(where by convention $\Lambda^0(V) = K$, the field underlying V, and $\Lambda^1(V) = V$), and therefore its dimension is equal to the sum of the binomial coefficients, which is $2^n$.
Rank of a k-vector
If $\alpha \in \Lambda^k(V)$, then it is possible to express α as a linear combination of decomposable k-vectors:
$$\alpha = \alpha^{(1)} + \alpha^{(2)} + \cdots + \alpha^{(s)},$$
where each α(i) is decomposable, say
$$\alpha^{(i)} = \alpha^{(i)}_1 \wedge \alpha^{(i)}_2 \wedge \cdots \wedge \alpha^{(i)}_k, \qquad i = 1, 2, \ldots, s.$$
The rank of the k-vector α is the minimal number of decomposable k-vectors in such an expansion of α. This is similar to the notion of tensor rank.
Rank is particularly important in the study of 2-vectors (Sternberg 1964, §III.6) (Bryant et al. 1991). The rank of a 2-vector α can be identified with half the rank of the matrix of coefficients of α in a basis. Thus if ei is a basis for V, then α can be expressed uniquely as
$$\alpha = \sum_{i<j} a_{ij}\, e_i \wedge e_j,$$
where aij = −aji (the matrix of coefficients is skew-symmetric). The rank of the matrix aij is therefore even, and is twice the rank of the form α.
In characteristic 0, the 2-vector α has rank p if and only if
$$\underbrace{\alpha \wedge \cdots \wedge \alpha}_{p} \neq 0 \quad \text{and} \quad \underbrace{\alpha \wedge \cdots \wedge \alpha}_{p+1} = 0.$$
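A quick computational sketch of the matrix-rank characterization (the coefficient matrix below is an assumed example, not taken from the text):

```python
# The rank of a 2-vector equals half the rank of its skew-symmetric coefficient matrix (a_ij).
import numpy as np

# alpha = e1∧e2 + e3∧e4 in R^4: nonzero coefficients a_12 = a_34 = 1.
A = np.zeros((4, 4))
A[0, 1], A[2, 3] = 1.0, 1.0
A = A - A.T                                 # enforce skew-symmetry: a_ij = -a_ji

rank_of_alpha = np.linalg.matrix_rank(A) // 2
print(rank_of_alpha)                        # 2: alpha is a sum of two decomposable 2-vectors
```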
Graded structure
The exterior product of a k-vector with a p-vector is a (k + p)-vector, once again invoking bilinearity. As a consequence, the direct sum decomposition of the preceding section
$$\Lambda(V) = \Lambda^0(V) \oplus \Lambda^1(V) \oplus \Lambda^2(V) \oplus \cdots \oplus \Lambda^n(V)$$
gives the exterior algebra the additional structure of a graded algebra, that is
$$\Lambda^k(V) \wedge \Lambda^p(V) \subseteq \Lambda^{k+p}(V).$$
Moreover, if K is the base field, we have
$$\Lambda^0(V) = K$$
and
$$\Lambda^1(V) = V.$$
The exterior product is graded anticommutative, meaning that if $\alpha \in \Lambda^k(V)$ and $\beta \in \Lambda^p(V)$, then
$$\alpha \wedge \beta = (-1)^{kp}\, \beta \wedge \alpha.$$
In addition to studying the graded structure on the exterior algebra, Bourbaki (1989) studies additional graded structures on exterior algebras, such as those on the exterior algebra of a graded module (a module that already carries its own gradation).
Universal property
Let V be a vector space over the field K. Informally, multiplication in $\Lambda(V)$ is performed by manipulating symbols and imposing a distributive law, an associative law, and using the identity $v \wedge v = 0$ for v ∈ V. Formally, $\Lambda(V)$ is the "most general" algebra in which these rules hold for the multiplication, in the sense that any unital associative K-algebra containing V with alternating multiplication on V must contain a homomorphic image of $\Lambda(V)$. In other words, the exterior algebra has the following universal property:[10]
Given any unital associative K-algebra A and any K-linear map $j : V \to A$ such that $j(v)\,j(v) = 0$ for every v in V, then there exists precisely one unital algebra homomorphism $f : \Lambda(V) \to A$ such that j(v) = f(i(v)) for all v in V (here i is the natural inclusion of V in $\Lambda(V)$; see above).
Universal property of the exterior algebra
To construct the most general algebra that contains V and whose multiplication is alternating on V, it is natural to start with the most general associative algebra that contains V, the tensor algebra T(V), and then enforce the alternating property by taking a suitable quotient. We thus take the two-sided ideal I in T(V) generated by all elements of the form v ⊗ v for v in V, and define $\Lambda(V)$ as the quotient
$$\Lambda(V) := T(V)/I$$
(and use ∧ as the symbol for multiplication in $\Lambda(V)$). It is then straightforward to show that $\Lambda(V)$ contains V and satisfies the above universal property.
As a consequence of this construction, the operation of assigning to a vector space V its exterior algebra is a functor from the category of vector spaces to the category of algebras.
Rather than defining $\Lambda(V)$ first and then identifying the exterior powers $\Lambda^k(V)$ as certain subspaces, one may alternatively define the spaces $\Lambda^k(V)$ first and then combine them to form the algebra $\Lambda(V)$. This approach is often used in differential geometry and is described in the next section.
Generalizations
Given a commutative ring R and an R-module M, we can define the exterior algebra $\Lambda(M)$ just as above, as a suitable quotient of the tensor algebra T(M). It will satisfy the analogous universal property. Many of the properties of $\Lambda(M)$ also require that M be a projective module. Where finite dimensionality is used, the properties further require that M be finitely generated and projective. Generalizations to the most common situations can be found in Bourbaki (1989).
Exterior algebras of vector bundles are frequently considered in geometry and topology. There are no essential differences between the algebraic properties of the exterior algebra of finite-dimensional vector bundles and those of the exterior algebra of finitely generated projective modules, by the Serre–Swan theorem. More general exterior algebras can be defined for sheaves of modules.
Alternating tensor algebra
Regardless of the characteristic of the field (except characteristic 2),[11] the exterior algebra of a vector space V over K can be canonically identified with the vector subspace of T(V) consisting of antisymmetric tensors. For characteristic 0 (or characteristic larger than the dimension of the vector space), the vector space of $k$-linear antisymmetric tensors is transversal to the ideal $I$, hence a good choice to represent the quotient. But for positive characteristic, the vector space of k-linear antisymmetric tensors may fail to be transversal to the ideal (in fact, it can even be contained in $I$); nevertheless, transversal or not, a product can be defined on this space such that the resulting algebra is isomorphic to the exterior algebra: in the first case the natural choice for the product is just the quotient product (using the available projection); in the second case, this product must be slightly modified as given below (the Arnold setting), but such that the algebra stays isomorphic to the exterior algebra $\Lambda(V)$, i.e. the quotient of T(V) by the ideal I generated by elements of the form x ⊗ x. Of course, for characteristic 0 (or characteristic larger than the dimension of the vector space), one or the other definition of the product can be used, as the two algebras are isomorphic (see V. I. Arnold or Kobayashi–Nomizu).
Let Tr(V) be the space of homogeneous tensors of degree r. This is spanned by decomposable tensors
$$v_1 \otimes v_2 \otimes \cdots \otimes v_r, \qquad v_i \in V.$$
The antisymmetrization (or sometimes the skew-symmetrization) of a decomposable tensor is defined by
$$\mathcal{A}(v_1 \otimes v_2 \otimes \cdots \otimes v_r) = \sum_{\sigma \in \mathfrak{S}_r} \operatorname{sgn}(\sigma)\, v_{\sigma(1)} \otimes v_{\sigma(2)} \otimes \cdots \otimes v_{\sigma(r)}$$
and, when $r! \neq 0$ (for a field of positive characteristic, $r!$ might be 0):
$$\operatorname{Alt}(v_1 \otimes v_2 \otimes \cdots \otimes v_r) = \frac{1}{r!}\,\mathcal{A}(v_1 \otimes v_2 \otimes \cdots \otimes v_r),$$
where the sum is taken over the symmetric group $\mathfrak{S}_r$ of permutations on the symbols {1, ..., r}. This extends by linearity and homogeneity to operations, also denoted by $\mathcal{A}$ and $\operatorname{Alt}$, on the full tensor algebra T(V).
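A short computational sketch of the normalized operator $\operatorname{Alt}$ (conventions as above; the function name and the random test tensor are illustrative, and the example works over the reals, where $r! \neq 0$ automatically):

```python
from itertools import permutations
from math import factorial
import numpy as np

def antisymmetrize(t, normalize=True):
    """Alt (normalize=True) or the unnormalized antisymmetrization of the components of t."""
    r = t.ndim
    result = np.zeros_like(t)
    for perm in permutations(range(r)):
        # sign of the permutation via inversion counting
        sign = 1
        for i in range(r):
            for j in range(i + 1, r):
                if perm[i] > perm[j]:
                    sign = -sign
        result += sign * np.transpose(t, perm)
    return result / factorial(r) if normalize else result

t = np.random.rand(3, 3, 3)
a = antisymmetrize(t)
print(np.allclose(a, -np.transpose(a, (1, 0, 2))))  # True: antisymmetric under swapping the first two slots
```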
Note that
$$\operatorname{Alt} \circ \operatorname{Alt} = \operatorname{Alt},$$
so that, when defined, $\operatorname{Alt}$ is the projection onto the r-homogeneous alternating tensor subspace, compatible with the exterior (quotient) algebra.
On the other hand, the image $\mathcal{A}(T(V))$ is always the graded subspace of alternating tensors (not yet an algebra, as a product has not yet been defined), denoted $A(V)$. This is a vector subspace of T(V), and it inherits the structure of a graded vector space from that on T(V). Moreover, the kernel of $\mathcal{A}^{(r)}$ is precisely $I^{(r)}$, the homogeneous component of the ideal I of degree r; equivalently, the kernel of $\mathcal{A}$ is I. When $\operatorname{Alt}$ is defined, $A(V)$ carries an associative graded product defined by (essentially the same as the wedge product)
$$\omega \wedge \eta = \operatorname{Alt}(\omega \otimes \eta).$$
Assuming K has characteristic 0, as $A(V)$ is a supplement of I in T(V), with the above given product there is a canonical isomorphism
$$A(V) \cong \Lambda(V).$$
When the characteristic of the field is positive, $\mathcal{A}$ will do what $\operatorname{Alt}$ did before, but the product cannot be defined as above. In such a case the isomorphism $A(V) \cong \Lambda(V)$ still holds, in spite of $A(V)$ not being a supplement of the ideal I, but then the product should be modified as given below (Arnold's setting).
Finally, we always get $A(V)$ isomorphic to $\Lambda(V)$, but the product could (or should) be chosen in two ways (or only one). Actually, the product could be chosen in many ways, rescaling it on the homogeneous spaces as $\frac{c_{k+p}}{c_k c_p}$ for an arbitrary sequence $(c_k)$ in the field, as long as the division makes sense (this is such that the redefined product is also associative, i.e. defines an algebra on $A(V)$). Also note that the interior product definition should be changed accordingly, in order to keep its skew-derivation property.
Index notation
Suppose that V has finite dimension n, and that a basis e1, ..., en of V is given. Then any alternating tensor t ∈ Ar(V) ⊂ Tr(V) can be written in index notation as
$$t = t^{i_1 i_2 \cdots i_r}\, e_{i_1} \otimes e_{i_2} \otimes \cdots \otimes e_{i_r},$$
where the components $t^{i_1 \cdots i_r}$ are completely antisymmetric in their indices.
The exterior product of two alternating tensors t and s of ranks r and p is given by
$$t \wedge s = \frac{1}{(r+p)!} \sum_{\sigma \in \mathfrak{S}_{r+p}} \operatorname{sgn}(\sigma)\, t^{i_{\sigma(1)} \cdots i_{\sigma(r)}}\, s^{i_{\sigma(r+1)} \cdots i_{\sigma(r+p)}}\, e_{i_1} \otimes e_{i_2} \otimes \cdots \otimes e_{i_{r+p}}.$$
The components of this tensor are precisely the skew part of the components of the tensor product s ⊗ t, denoted by square brackets on the indices:
$$(t \wedge s)^{i_1 \cdots i_{r+p}} = t^{[i_1 \cdots i_r}\, s^{i_{r+1} \cdots i_{r+p}]}.$$
The interior product may also be described in index notation as follows. Let $t = t^{i_0 i_1 \cdots i_{r-1}}$ be an antisymmetric tensor of rank r. Then, for α ∈ V∗, iαt is an alternating tensor of rank r − 1, given by
$$(i_\alpha t)^{i_1 \cdots i_{r-1}} = r \sum_{j=1}^{n} \alpha_j\, t^{j i_1 \cdots i_{r-1}},$$
where n is the dimension of V.
Duality
Alternating operators
Given two vector spaces V and X and a natural number k, an alternating operator from Vk to X is a multilinear map
$$f : V^k \to X$$
such that whenever v1, ..., vk are linearly dependent vectors in V, then
$$f(v_1, \ldots, v_k) = 0.$$
The map
$$w : V^k \to \Lambda^k(V)$$
which associates to $k$ vectors from $V$ their exterior product, i.e. their corresponding $k$-vector, is also alternating. In fact, this map is the "most general" alternating operator defined on $V^k$; given any other alternating operator $f : V^k \to X$, there exists a unique linear map $\phi : \Lambda^k(V) \to X$ with $f = \phi \circ w$. This universal property characterizes the space $\Lambda^k(V)$ and can serve as its definition.
Geometric interpretation for the exterior product of n 1-forms (ε, η, ω) to obtain an n-form ("mesh" of coordinate surfaces, here planes),[1] for n = 1, 2, 3. The "circulations" show orientation.[12][13]
The above discussion specializes to the case when X = K, the base field. In this case an alternating multilinear function
$$f : V^k \to K$$
is called an alternating multilinear form. The set of all alternating multilinear forms is a vector space, as the sum of two such maps, or the product of such a map with a scalar, is again alternating. By the universal property of the exterior power, the space of alternating forms of degree k on V is naturally isomorphic with the dual vector space $\left(\Lambda^k(V)\right)^*$. If V is finite-dimensional, then the latter is naturally isomorphic to $\Lambda^k\left(V^*\right)$. In particular, if V is n-dimensional, the dimension of the space of alternating maps from Vk to K is the binomial coefficient $\binom{n}{k}$.
Under such an identification, the exterior product takes a concrete form: it produces a new anti-symmetric map from two given ones. Suppose ω : Vk → K and η : Vm → K are two anti-symmetric maps. As in the case of tensor products of multilinear maps, the number of variables of their exterior product is the sum of the numbers of their variables. Depending on the choice of identification of elements of exterior power with multilinear forms, the exterior product is defined as
$$\omega \wedge \eta = \operatorname{Alt}(\omega \otimes \eta)$$
or as
$$\omega \wedge \eta = \frac{(k+m)!}{k!\,m!}\operatorname{Alt}(\omega \otimes \eta),$$
where, if the characteristic of the base field K is 0, the alternation Alt of a multilinear map is defined to be the average of the sign-adjusted values over all the permutations of its variables:
$$\operatorname{Alt}(\omega)(x_1, \ldots, x_k) = \frac{1}{k!} \sum_{\sigma \in \mathfrak{S}_k} \operatorname{sgn}(\sigma)\, \omega(x_{\sigma(1)}, \ldots, x_{\sigma(k)}).$$
When the field K has finite characteristic, an equivalent version of the second expression, without any factorials or any constants, is well-defined:
$$(\omega \wedge \eta)(x_1, \ldots, x_{k+m}) = \sum_{\sigma \in \mathrm{Sh}_{k,m}} \operatorname{sgn}(\sigma)\, \omega(x_{\sigma(1)}, \ldots, x_{\sigma(k)})\, \eta(x_{\sigma(k+1)}, \ldots, x_{\sigma(k+m)}),$$
where here Shk,m ⊂ Sk+m is the subset of (k, m) shuffles: permutations σ of the set {1, 2, ..., k + m} such that σ(1) < σ(2) < ⋯ < σ(k), and σ(k + 1) < σ(k + 2) < ⋯ < σ(k + m). As this might look very specific and fine-tuned, an equivalent raw version is to sum in the above formula over a set of representatives of the left cosets in $\mathfrak{S}_{k+m}/(\mathfrak{S}_k \times \mathfrak{S}_m)$.
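A small sketch of this shuffle formula (the helper names and the example forms are illustrative; forms are represented as plain Python functions of vectors):

```python
from itertools import combinations
import numpy as np

def sign(perm):
    """Sign of a permutation given in one-line notation, by counting inversions."""
    s = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

def wedge(omega, k, eta, m):
    """Exterior product of a k-form omega and an m-form eta via (k, m)-shuffles."""
    def product(*vectors):                      # expects k + m vectors
        total = 0.0
        for left in combinations(range(k + m), k):
            right = tuple(i for i in range(k + m) if i not in left)
            sigma = left + right                # a (k, m)-shuffle in one-line notation
            total += sign(sigma) * omega(*(vectors[i] for i in left)) \
                                 * eta(*(vectors[i] for i in right))
        return total
    return product

# Example: two 1-forms dx, dy on R^2; (dx∧dy)(e1, e2) = 1.
dx = lambda v: v[0]
dy = lambda v: v[1]
form = wedge(dx, 1, dy, 1)
print(form(np.array([1.0, 0.0]), np.array([0.0, 1.0])))   # 1.0
```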
Interior product
Suppose that V is finite-dimensional. If V∗ denotes the dual space to the vector space V, then for each α ∈ V∗, it is possible to define an antiderivation on the algebra $\Lambda(V)$,
$$i_\alpha : \Lambda^k(V) \to \Lambda^{k-1}(V).$$
This derivation is called the interior product with α, or sometimes the insertion operator, or contraction by α.
Suppose that $w \in \Lambda^k(V)$. Then w is a multilinear mapping of V∗ to K, so it is defined by its values on the k-fold Cartesian product V∗ × V∗ × ... × V∗. If u1, u2, ..., uk−1 are k − 1 elements of V∗, then define
$$(i_\alpha w)(u_1, u_2, \ldots, u_{k-1}) = w(\alpha, u_1, u_2, \ldots, u_{k-1}).$$
Additionally, let iαf = 0 whenever f is a pure scalar (i.e., belonging to $\Lambda^0(V)$).
Axiomatic characterization and properties
The interior product satisfies the following properties:
For each k and each α ∈ V∗,
$$i_\alpha : \Lambda^k(V) \to \Lambda^{k-1}(V).$$
(By convention, $\Lambda^{-1}(V) = \{0\}$.)
If v is an element of V ($= \Lambda^1(V)$), then iαv = α(v) is the dual pairing between elements of V and elements of V∗.
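For completeness, the antiderivation property referred to above (a standard identity, stated here as an illustration with the usual degree conventions) reads:
$$i_\alpha(a \wedge b) = (i_\alpha a) \wedge b + (-1)^{\deg a}\, a \wedge (i_\alpha b), \qquad a, b \in \Lambda(V),\ a \text{ homogeneous}.$$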
Suppose that V has finite dimension n. Then the interior product induces a canonical isomorphism of vector spaces
$$\Lambda^k(V^*) \otimes \Lambda^n(V) \to \Lambda^{n-k}(V)$$
by the recursive definition
$$i_{\alpha \wedge \beta} = i_\beta \circ i_\alpha.$$
In the geometrical setting, a non-zero element of the top exterior power $\Lambda^n(V)$ (which is a one-dimensional vector space) is sometimes called a volume form (or orientation form, although this term may sometimes lead to ambiguity). The name orientation form comes from the fact that a choice of preferred top element determines an orientation of the whole exterior algebra, since it is tantamount to fixing an ordered basis of the vector space. Relative to the preferred volume form σ, the isomorphism is given explicitly by
$$\alpha \in \Lambda^k(V^*) \mapsto i_\alpha \sigma \in \Lambda^{n-k}(V).$$
If, in addition to a volume form, the vector space V is equipped with an inner product identifying V with V∗, then the resulting isomorphism is called the Hodge star operator, which maps an element to its Hodge dual:
$$\star : \Lambda^k(V) \to \Lambda^{n-k}(V).$$
The composition of $\star$ with itself maps $\Lambda^k(V) \to \Lambda^k(V)$ and is always a scalar multiple of the identity map. In most applications, the volume form is compatible with the inner product in the sense that it is an exterior product of an orthonormal basis of V. In this case,
$$\star \circ \star : \Lambda^k(V) \to \Lambda^k(V) = (-1)^{k(n-k)+q}\,\operatorname{id},$$
where id is the identity mapping, and the inner product has metric signature (p, q) — p pluses and q minuses.
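As a concrete illustration (standard Euclidean $\mathbf{R}^3$ with the volume form $e_1 \wedge e_2 \wedge e_3$; this example is not spelled out in the text above):
$$\star\, e_1 = e_2 \wedge e_3, \qquad \star\, e_2 = e_3 \wedge e_1, \qquad \star\, e_3 = e_1 \wedge e_2, \qquad \star\,(e_1 \wedge e_2 \wedge e_3) = 1,$$
and here $\star \circ \star = \operatorname{id}$, since the signature is $(3, 0)$, so $q = 0$ and $(-1)^{k(n-k)} = 1$ for every $k$ when $n = 3$.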
Inner product
For V a finite-dimensional space, an inner product (or a pseudo-Euclidean inner product) on V defines an isomorphism of V with V∗, and so also an isomorphism of $\Lambda^k(V)$ with $\left(\Lambda^k(V)\right)^*$. The pairing between these two spaces also takes the form of an inner product. On decomposable k-vectors,
$$\left\langle v_1 \wedge \cdots \wedge v_k,\ w_1 \wedge \cdots \wedge w_k \right\rangle = \det\bigl(\langle v_i, w_j \rangle\bigr),$$
the determinant of the matrix of inner products. In the special case vi = wi, the inner product is the square norm of the k-vector, given by the determinant of the Gramian matrix (⟨vi, vj⟩). This is then extended bilinearly (or sesquilinearly in the complex case) to a non-degenerate inner product on $\Lambda^k(V)$. If ei, i = 1, 2, ..., n, form an orthonormal basis of V, then the vectors of the form
$$e_{i_1} \wedge e_{i_2} \wedge \cdots \wedge e_{i_k}, \qquad i_1 < i_2 < \cdots < i_k,$$
constitute an orthonormal basis for $\Lambda^k(V)$, a statement equivalent to the Cauchy–Binet formula.
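A brief computational sketch of this pairing (the helper name and the vectors are illustrative, not from the text):

```python
import numpy as np

def kvector_inner(vs, ws):
    """<v1∧...∧vk, w1∧...∧wk> = det(<v_i, w_j>), the Gram determinant."""
    G = np.array([[np.dot(v, w) for w in ws] for v in vs])
    return np.linalg.det(G)

e = np.eye(3)
print(kvector_inner([e[0], e[1]], [e[0], e[1]]))   # 1.0: e1∧e2 has unit norm
print(kvector_inner([e[0], e[1]], [e[1], e[2]]))   # 0.0: distinct basis 2-vectors are orthogonal
```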
With respect to the inner product, exterior multiplication and the interior product are mutually adjoint. Specifically, for $x \in \Lambda^{k-1}(V)$, $y \in \Lambda^k(V)$, and $v \in V$,
$$\langle x \wedge v,\ y \rangle = \langle x,\ i_{v^\flat}\, y \rangle,$$
where $v^\flat \in V^*$ is the linear functional defined by $v^\flat(y) = \langle v, y \rangle$ for all y ∈ V. This property completely characterizes the inner product on the exterior algebra.
Indeed, more generally for $x \in \Lambda^{k-l}(V)$, $y \in \Lambda^k(V)$, and $z \in \Lambda^l(V)$, iteration of the above adjoint properties gives
$$\langle x \wedge z,\ y \rangle = \langle x,\ i_{z^\flat}\, y \rangle,$$
where now $z^\flat \in \Lambda^l(V^*) \cong \left(\Lambda^l(V)\right)^*$ is the dual l-vector defined by
$$z^\flat(w) = \langle z, w \rangle$$
for all $w \in \Lambda^l(V)$.
Bialgebra structure
There is a correspondence between the graded dual of the graded algebra and alternating multilinear forms on V. The exterior algebra (as well as the symmetric algebra) inherits a bialgebra structure, and, indeed, a Hopf algebra structure, from the tensor algebra. See the article on tensor algebras for a detailed treatment of the topic.
The exterior product of multilinear forms defined above is dual to a coproduct defined on $\Lambda(V)$, giving the structure of a coalgebra. The coproduct is a linear function $\Delta : \Lambda(V) \to \Lambda(V) \otimes \Lambda(V)$ which is given by
$$\Delta(v) = 1 \otimes v + v \otimes 1$$
on elements v ∈ V. The symbol 1 stands for the unit element of the field K. Recall that $K = \Lambda^0(V) \subset \Lambda(V)$, so that the above really does lie in $\Lambda(V) \otimes \Lambda(V)$. This definition of the coproduct is lifted to the full space $\Lambda(V)$ by (linear) homomorphism.
The correct form of this homomorphism is not what one might naively write, but has to be the one carefully defined in the coalgebra article. In this case, one obtains
$$\Delta(v \wedge w) = 1 \otimes (v \wedge w) + v \otimes w - w \otimes v + (v \wedge w) \otimes 1.$$
Expanding this out in detail, one obtains the following expression on decomposable elements:
$$\Delta(x_1 \wedge \cdots \wedge x_k) = \sum_{p=0}^{k}\ \sum_{\sigma \in \mathrm{Sh}_{p+1,\,k-p}} \operatorname{sgn}(\sigma)\, \bigl(x_{\sigma(0)} \wedge \cdots \wedge x_{\sigma(p)}\bigr) \otimes \bigl(x_{\sigma(p+1)} \wedge \cdots \wedge x_{\sigma(k)}\bigr),$$
where the second summation is taken over all (p+1, k−p)-shuffles. The above is written with a notational trick, to keep track of the field element 1: the trick is to write $x_0 = 1$, and this is shuffled into various locations during the expansion of the sum over shuffles. The shuffle follows directly from the first axiom of a co-algebra: the relative order of the elements is preserved in the riffle shuffle: the riffle shuffle merely splits the ordered sequence into two ordered sequences, one on the left, and one on the right.
Observe that the coproduct preserves the grading of the algebra. Extending to the full space $\Lambda(V)$, one has
$$\Delta : \Lambda^k(V) \to \bigoplus_{p=0}^{k} \Lambda^p(V) \otimes \Lambda^{k-p}(V).$$
The tensor symbol ⊗ used in this section should be understood with some caution: it is not the same tensor symbol as the one being used in the definition of the alternating product. Intuitively, it is perhaps easiest to think of it as just another, but different, tensor product: it is still (bi-)linear, as tensor products should be, but it is the product that is appropriate for the definition of a bialgebra, that is, for creating the object ⊗ . Any lingering doubt can be shaken by pondering the equalities (1 ⊗ v) ∧ (1 ⊗ w) = 1 ⊗ (v ∧ w) and (v ⊗ 1) ∧ (1 ⊗ w) = v ⊗ w, which follow from the definition of the coalgebra, as opposed to naive manipulations involving the tensor and wedge symbols. This distinction is developed in greater detail in the article on tensor algebras. Here, there is much less of a problem, in that the alternating product ∧ clearly corresponds to multiplication in the bialgebra, leaving the symbol ⊗ free for use in the definition of the bialgebra. In practice, this presents no particular problem, as long as one avoids the fatal trap of replacing alternating sums of ⊗ by the wedge symbol, with one exception. One can construct an alternating product from ⊗, with the understanding that it works in a different space. Immediately below, an example is given: the alternating product for the dual space can be given in terms of the coproduct. The construction of the bialgebra here parallels the construction in the tensor algebra article almost exactly, except for the need to correctly track the alternating signs for the exterior algebra.
In terms of the coproduct, the exterior product on the dual space is just the graded dual of the coproduct:
$$(\alpha \wedge \beta)(x_1 \wedge \cdots \wedge x_k) = (\alpha \otimes \beta)\bigl(\Delta(x_1 \wedge \cdots \wedge x_k)\bigr),$$
where the tensor product on the right-hand side is of multilinear linear maps (extended by zero on elements of incompatible homogeneous degree: more precisely, α ∧ β = ε ∘ (α ⊗ β) ∘ Δ, where ε is the counit, as defined presently).
The counit is the homomorphism ε : $\Lambda(V)$ → K that returns the 0-graded component of its argument. The coproduct and counit, along with the exterior product, define the structure of a bialgebra on the exterior algebra.
With an antipode defined on homogeneous elements by
$$S(x) = (-1)^{\binom{\deg x + 1}{2}}\, x,$$
the exterior algebra is furthermore a Hopf algebra.[14]
Functoriality
Suppose that V and W are a pair of vector spaces and f : V → W is a linear map. Then, by the universal property, there exists a unique homomorphism of graded algebras
$$\Lambda(f) : \Lambda(V) \to \Lambda(W)$$
such that
$$\Lambda(f)\big|_{\Lambda^1(V)} = f : V = \Lambda^1(V) \to W = \Lambda^1(W).$$
In particular, $\Lambda(f)$ preserves homogeneous degree. The k-graded components of $\Lambda(f)$ are given on decomposable elements by
$$\Lambda(f)(x_1 \wedge \cdots \wedge x_k) = f(x_1) \wedge \cdots \wedge f(x_k).$$
Let
$$\Lambda^k(f) = \Lambda(f)\big|_{\Lambda^k(V)} : \Lambda^k(V) \to \Lambda^k(W).$$
The components of the transformation $\Lambda^k(f)$ relative to a basis of $\Lambda^k(V)$ and $\Lambda^k(W)$ are the matrix of k × k minors of f. In particular, if V = W and V is of finite dimension n, then $\Lambda^n(f)$ is a mapping of the one-dimensional vector space $\Lambda^n(V)$ to itself, and is therefore given by a scalar: the determinant of f.
In applications to linear algebra, the exterior product provides an abstract algebraic manner for describing the determinant and the minors of a matrix. For instance, it is well known that the determinant of a square matrix is equal to the volume of the parallelotope whose sides are the columns of the matrix (with a sign to track orientation). This suggests that the determinant can be defined in terms of the exterior product of the column vectors. Likewise, the k × k minors of a matrix can be defined by looking at the exterior products of column vectors chosen k at a time. These ideas can be extended not just to matrices but to linear transformations as well: the determinant of a linear transformation is the factor by which it scales the oriented volume of any given reference parallelotope. So the determinant of a linear transformation can be defined in terms of what the transformation does to the top exterior power. The action of a transformation on the lesser exterior powers gives a basis-independent way to talk about the minors of the transformation.
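A computational sketch of this correspondence (the function name and test matrix are illustrative; the entries of $\Lambda^k(f)$ are taken as the $k \times k$ minors in the wedge basis described above):

```python
from itertools import combinations
import numpy as np

def exterior_power(F, k):
    """Matrix of Λ^k(f) in the basis of k-fold wedges of standard basis vectors."""
    n = F.shape[0]
    idx = list(combinations(range(n), k))            # ordered basis of Λ^k
    return np.array([[np.linalg.det(F[np.ix_(rows, cols)]) for cols in idx]
                     for rows in idx])

F = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])
print(exterior_power(F, 3))          # [[7.]] : the 1x1 matrix (det F) on the top power
print(np.linalg.det(F))              # 7.0
```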
Technical details: Definitions
Let[18] $V$ be an $n$-dimensional vector space over a field $K$ with basis $\{e_1, \dots, e_n\}$.
For $A \in \operatorname{End}(V)$, define $\Lambda^k A \in \operatorname{End}(\Lambda^k V)$ on simple tensors by
$$\Lambda^k A\,(v_1 \wedge \cdots \wedge v_k) = A v_1 \wedge \cdots \wedge A v_k$$
and expand the definition linearly to all tensors. More generally, for $0 \le j \le k$, we can define $\Lambda^k A^{[j]} \in \operatorname{End}(\Lambda^k V)$ on simple tensors by
$$\Lambda^k A^{[j]}(v_1 \wedge \cdots \wedge v_k) = \sum_{1 \le i_1 < \cdots < i_j \le k} v_1 \wedge \cdots \wedge A v_{i_1} \wedge \cdots \wedge A v_{i_j} \wedge \cdots \wedge v_k,$$
i.e. choose $j$ components on which A acts, then sum up the results obtained from all the different choices. For $j = k = n$, this recovers the determinant of A. For $k = n$ and $j = 1$, this recovers the trace of A. If $j > k$, define $\Lambda^k A^{[j]} := 0$. Since $\Lambda^n V$ is 1-dimensional with basis $e_1 \wedge \cdots \wedge e_n$, we can identify $\Lambda^n A^{[j]}$ with the unique number satisfying
$$\Lambda^n A^{[j]}(e_1 \wedge \cdots \wedge e_n) = \left(\Lambda^n A^{[j]}\right) e_1 \wedge \cdots \wedge e_n.$$
For $A \in \operatorname{End}(V)$, define the exterior transpose $A^{T_\wedge} \in \operatorname{End}(V)$ to be the unique operator satisfying
$$\left(A^{T_\wedge} v\right) \wedge \omega = v \wedge \left(\Lambda^{n-1} A\right)\omega$$
for any $v \in V$ and $\omega \in \Lambda^{n-1} V$.
For $A \in \operatorname{End}(V)$, define $\det A := \Lambda^n A$, $\operatorname{tr} A := \Lambda^n A^{[1]}$ and $\operatorname{adj} A := A^{T_\wedge}$ (using the identification of operators on the one-dimensional space $\Lambda^n V$ with scalars). These are equivalent to the previous definitions.
Basic properties
All results obtained from other definitions of the determinant, trace and adjoint can be obtained from this definition (since these definitions are equivalent). Here are some basic properties related to these new definitions:
The map $A \mapsto \Lambda^k A^{[1]}$ is $K$-linear.
We have a canonical isomorphism
$$\Lambda^k\left(V^*\right) \cong \left(\Lambda^k V\right)^*.$$
However, there is no canonical isomorphism between $\Lambda^k V$ and $\Lambda^{n-k} V$.
The entries of the transposed matrix of $\Lambda^{n-1} A$ are the $(n-1) \times (n-1)$-minors of $A$.
The scalars $\Lambda^n A^{[k]}$, $k = 0, 1, \ldots, n$, are (up to sign) the coefficients of the terms in the characteristic polynomial of $A$. They also appear in the expressions of $\Lambda^{n-1} A$ and $\operatorname{adj} A$; Leverrier's Algorithm[19] is an economical way of computing $\Lambda^n A^{[k]}$ and $\operatorname{adj} A$.
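A sketch of the Faddeev–LeVerrier recursion (a standard formulation of the algorithm named here; the helper name and the test matrix are illustrative): it returns the characteristic-polynomial coefficients and, as a by-product, the adjugate.

```python
import numpy as np

def faddeev_leverrier(A):
    n = A.shape[0]
    M = np.zeros_like(A)                 # M_0 = 0
    coeffs = [1.0]                       # coefficient of λ^n
    for k in range(1, n + 1):
        M = A @ M + coeffs[-1] * np.eye(n)
        coeffs.append(-np.trace(A @ M) / k)
    # char poly: λ^n + coeffs[1] λ^(n-1) + ... + coeffs[n]
    adj = M * (-1) ** (n - 1)            # adjugate from the last auxiliary matrix
    return coeffs, adj

A = np.array([[2.0, 1.0], [0.0, 3.0]])
coeffs, adj = faddeev_leverrier(A)
print(coeffs)        # [1.0, -5.0, 6.0]  ->  λ² − 5λ + 6 = (λ−2)(λ−3)
print(adj)           # [[3., -1.], [0., 2.]], the adjugate of A
print(adj @ A)       # det(A)·I = 6·I
```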
Physics
In physics, many quantities are naturally represented by alternating operators. For example, if the motion of a charged particle is described by velocity and acceleration vectors in four-dimensional spacetime, then normalization of the velocity vector requires that the electromagnetic force must be an alternating operator on the velocity. Its six degrees of freedom are identified with the electric and magnetic fields.
Linear geometry
The decomposable k-vectors have geometric interpretations: the bivector u ∧ v represents the plane spanned by the vectors, "weighted" with a number, given by the area of the oriented parallelogram with sides u and v. Analogously, the 3-vector u ∧ v ∧ w represents the spanned 3-space weighted by the volume of the oriented parallelepiped with edges u, v, and w.
The exterior algebra has notable applications in differential geometry, where it is used to define differential forms.[20] Differential forms are mathematical objects that evaluate the length of vectors, areas of parallelograms, and volumes of higher-dimensional bodies, so they can be integrated over curves, surfaces and higher dimensional manifolds in a way that generalizes the line integrals and surface integrals from calculus. A differential form at a point of a differentiable manifold is an alternating multilinear form on the tangent space at the point. Equivalently, a differential form of degree k is a linear functional on the k-th exterior power of the tangent space. As a consequence, the exterior product of multilinear forms defines a natural exterior product for differential forms. Differential forms play a major role in diverse areas of differential geometry.
In particular, the exterior derivative gives the exterior algebra of differential forms on a manifold the structure of a differential graded algebra. The exterior derivative commutes with pullback along smooth mappings between manifolds, and it is therefore a naturaldifferential operator. The exterior algebra of differential forms, equipped with the exterior derivative, is a cochain complex whose cohomology is called the de Rham cohomology of the underlying manifold and plays a vital role in the algebraic topology of differentiable manifolds.
The exterior algebra over the complex numbers is the archetypal example of a superalgebra, which plays a fundamental role in physical theories pertaining to fermions and supersymmetry. A single element of the exterior algebra is called a supernumber[21] or Grassmann number. The exterior algebra itself is then just a one-dimensional superspace: it is just the set of all of the points in the exterior algebra. The topology on this space is essentially the weak topology, the open sets being the cylinder sets. An n-dimensional superspace is just the n-fold product of exterior algebras.
Lie algebra homology
Let L be a Lie algebra over a field K; then it is possible to define the structure of a chain complex on the exterior algebra of L. This is a K-linear mapping
$$\partial : \Lambda^{k+1}(L) \to \Lambda^k(L)$$
defined on decomposable elements by
$$\partial(x_1 \wedge \cdots \wedge x_{k+1}) = \frac{1}{k+1} \sum_{j < \ell} (-1)^{j+\ell+1}\, [x_j, x_\ell] \wedge x_1 \wedge \cdots \wedge \hat{x}_j \wedge \cdots \wedge \hat{x}_\ell \wedge \cdots \wedge x_{k+1},$$
where the caret $\hat{x}$ denotes omission of that factor.
The Jacobi identity holds if and only if ∂∂ = 0, and so this is a necessary and sufficient condition for an anticommutative nonassociative algebra L to be a Lie algebra. Moreover, in that case is a chain complex with boundary operator ∂. The homology associated to this complex is the Lie algebra homology.
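For illustration (using the formula above, up to the overall normalization), in low degrees one finds
$$\partial(x \wedge y) \propto [x, y], \qquad \partial\bigl(\partial(x \wedge y \wedge z)\bigr) \propto [[x, y], z] + [[y, z], x] + [[z, x], y],$$
so the vanishing of $\partial\partial$ on $\Lambda^3(L)$ is precisely the Jacobi identity.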
History
The exterior algebra was first introduced by Hermann Grassmann in 1844 under the blanket term of Ausdehnungslehre, or Theory of Extension.[22]
This referred more generally to an algebraic (or axiomatic) theory of extended quantities and was one of the early precursors to the modern notion of a vector space. Saint-Venant also published similar ideas of exterior calculus for which he claimed priority over Grassmann.[23]
The algebra itself was built from a set of rules, or axioms, capturing the formal aspects of Cayley and Sylvester's theory of multivectors. It was thus a calculus, much like the propositional calculus, except focused exclusively on the task of formal reasoning in geometrical terms.[24]
In particular, this new development allowed for an axiomatic characterization of dimension, a property that had previously only been examined from the coordinate point of view.
The import of this new theory of vectors and multivectors was lost to mid 19th century mathematicians,[25]
until being thoroughly vetted by Giuseppe Peano in 1888. Peano's work also remained somewhat obscure until the turn of the century, when the subject was unified by members of the French geometry school (notably Henri Poincaré, Élie Cartan, and Gaston Darboux) who applied Grassmann's ideas to the calculus of differential forms.
A short while later, Alfred North Whitehead, borrowing from the ideas of Peano and Grassmann, introduced his universal algebra. This then paved the way for the 20th century developments of abstract algebra by placing the axiomatic notion of an algebraic system on a firm logical footing.
^Grassmann (1844) introduced these as extended algebras (cf. Clifford 1878). He used the word äußere (literally translated as outer, or exterior) only to indicate the product he defined, which is nowadays conventionally called the exterior product, probably to distinguish it from the outer product as defined in modern linear algebra.
^Strictly speaking, the magnitude depends on some additional structure, namely that the vectors be in a Euclidean space. We do not generally assume that this structure is available, except where it is helpful to develop intuition on the subject.
^The term k-vector is not equivalent to and should not be confused with similar terms such as 4-vector, which in a different context could mean an element of a 4-dimensional vector space. A minority of authors use the term $k$-multivector instead of $k$-vector, which avoids this confusion.
^This part of the statement also holds in greater generality if V and W are modules over a commutative ring: that $\Lambda$ converts epimorphisms to epimorphisms. See Bourbaki (1989, Proposition 3, §III.7.2).
^This statement generalizes only to the case where V and W are projective modules over a commutative ring. Otherwise, it is generally not the case that $\Lambda$ converts monomorphisms to monomorphisms. See Bourbaki (1989, Corollary to Proposition 12, §III.7.9).
^Such a filtration also holds for vector bundles, and projective modules over a commutative ring. This is thus more general than the result quoted above for direct sums, since not every short exact sequence splits in other abelian categories.
^James, A.T. (1983). "On the Wedge Product". In Karlin, Samuel; Amemiya, Takeshi; Goodman, Leo A. (eds.). Studies in Econometrics, Time Series, and Multivariate Statistics. Academic Press. pp. 455–464. ISBN 0-12-398750-4.
^Kannenberg (2000) published a translation of Grassmann's work in English; he translated Ausdehnungslehre as Extension Theory.
^J Itard, Biography in Dictionary of Scientific Biography (New York 1970–1990).
^Authors have in the past referred to this calculus variously as the calculus of extension (Whitehead 1898; Forder 1941), or extensive algebra (Clifford 1878), and recently as extended vector algebra (Browne 2007).
Includes a treatment of alternating tensors and alternating forms, as well as a detailed discussion of Hodge duality from the perspective adopted in this article.
This is the main mathematical reference for the article. It introduces the exterior algebra of a module over a commutative ring (although this article specializes primarily to the case when the ring is a field), including a discussion of the universal property, functoriality, duality, and the bialgebra structure. See §III.7 and §III.11.
This book contains applications of exterior algebras to problems in partial differential equations. Rank and related concepts are developed in the early chapters.
Clifford, W. (1878), "Applications of Grassmann's Extensive Algebra", American Journal of Mathematics, The Johns Hopkins University Press, 1 (4): 350–358, doi:10.2307/2369379, JSTOR 2369379
Includes applications of the exterior algebra to differential forms, specifically focused on integration and Stokes's theorem. The notation $\Lambda^k V$ in this text is used to mean the space of alternating k-forms on V; i.e., for Spivak $\Lambda^k V$ is what this article would call $\Lambda^k V^*$. Spivak discusses this in Addendum 4.
Chapter 10: The Exterior Product and Exterior Algebras
"The Grassmann method in projective geometry" A compilation of English translations of three notes by Cesare Burali-Forti on the application of exterior algebra to projective geometry