In linear algebra, a column vector with ${\displaystyle m}$ elements is an ${\displaystyle m\times 1}$ matrix[1] consisting of a single column of ${\displaystyle m}$ entries, for example,

${\displaystyle {\boldsymbol {x}}={\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix}}.}$

Similarly, a row vector is a ${\displaystyle 1\times n}$ matrix for some ${\displaystyle n}$, consisting of a single row of ${\displaystyle n}$ entries,

${\displaystyle {\boldsymbol {a}}={\begin{bmatrix}a_{1}&a_{2}&\dots &a_{n}\end{bmatrix}}.}$
(Throughout this article, boldface is used for both row and column vectors.)

The transpose (indicated by T) of any row vector is a column vector, and the transpose of any column vector is a row vector:

${\displaystyle {\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}}^{\rm {T}}={\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix}}}$
and
${\displaystyle {\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix}}^{\rm {T}}={\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}}.}$
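In NumPy, an explicit ${\displaystyle m\times 1}$ or ${\displaystyle 1\times m}$ array behaves exactly this way under transposition. A minimal sketch (the values are arbitrary, chosen for illustration):

```python
import numpy as np

# A column vector is an m x 1 matrix; its transpose is a 1 x m row vector.
col = np.array([[1], [2], [3]])  # column vector, shape (3, 1)
row = col.T                      # transpose: row vector, shape (1, 3)

print(col.shape)     # (3, 1)
print(row.shape)     # (1, 3)
print(row.T.shape)   # transposing back recovers the column: (3, 1)
```

Note that a one-dimensional NumPy array of shape `(3,)` is neither a row nor a column vector in the matrix sense; the two-dimensional shapes above preserve the distinction.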

The set of all row vectors with n entries in a given field (such as the real numbers) forms an n-dimensional vector space; similarly, the set of all column vectors with m entries forms an m-dimensional vector space.

The space of row vectors with n entries can be regarded as the dual space of the space of column vectors with n entries, since any linear functional on the space of column vectors can be represented as the left-multiplication of a unique row vector.
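The duality can be made concrete: every linear functional on column vectors is left-multiplication by some row vector. A small sketch with arbitrary illustrative values:

```python
import numpy as np

# A linear functional on column vectors, represented by left-multiplication
# with a row vector f (a 1 x n matrix).
f = np.array([[2.0, -1.0, 0.5]])     # row vector, shape (1, 3)
x = np.array([[1.0], [4.0], [2.0]])  # column vector, shape (3, 1)

value = f @ x  # 1 x 1 matrix whose single entry is f(x)
print(value[0, 0])  # 2*1 + (-1)*4 + 0.5*2 = -1.0
```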

## Notation

To simplify writing column vectors in-line with other text, they are sometimes written as row vectors with the transpose operation applied to them:

${\displaystyle {\boldsymbol {x}}={\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}}^{\rm {T}}}$

or

${\displaystyle {\boldsymbol {x}}={\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}}^{\rm {T}}}$

Some authors also use the convention of writing both column vectors and row vectors as rows, but separating row vector elements with commas and column vector elements with semicolons (see alternative notation 2 in the table below).

| Notation | Row vector | Column vector |
| --- | --- | --- |
| Standard matrix notation (array spaces, no commas, transpose signs) | ${\displaystyle {\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}}}$ | ${\displaystyle {\begin{bmatrix}x_{1}\\x_{2}\\\vdots \\x_{m}\end{bmatrix}}{\text{ or }}{\begin{bmatrix}x_{1}\;x_{2}\;\dots \;x_{m}\end{bmatrix}}^{\rm {T}}}$ |
| Alternative notation 1 (commas, transpose signs) | ${\displaystyle {\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}}}$ | ${\displaystyle {\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}}^{\rm {T}}}$ |
| Alternative notation 2 (commas and semicolons, no transpose signs) | ${\displaystyle {\begin{bmatrix}x_{1},x_{2},\dots ,x_{m}\end{bmatrix}}}$ | ${\displaystyle {\begin{bmatrix}x_{1};x_{2};\dots ;x_{m}\end{bmatrix}}}$ |

## Operations

In matrix multiplication, each entry of the product is obtained by multiplying a row vector of the first matrix by a column vector of the second.
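This row-times-column view can be checked directly in NumPy (matrices chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

C = A @ B  # ordinary matrix product

# Each entry C[i, j] is the product of row i of A with column j of B.
manual = np.array([[A[i] @ B[:, j] for j in range(2)] for i in range(2)])
print(np.array_equal(C, manual))  # True
```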

The dot product of two column vectors a, b, considered as elements of a coordinate space, is equal to the matrix product of the transpose of a with b,

${\displaystyle \mathbf {a} \cdot \mathbf {b} =\mathbf {a} ^{\intercal }\mathbf {b} ={\begin{bmatrix}a_{1}&\cdots &a_{n}\end{bmatrix}}{\begin{bmatrix}b_{1}\\\vdots \\b_{n}\end{bmatrix}}=a_{1}b_{1}+\cdots +a_{n}b_{n}\,.}$

By the symmetry of the dot product, the dot product of two column vectors a, b is also equal to the matrix product of the transpose of b with a,

${\displaystyle \mathbf {b} \cdot \mathbf {a} =\mathbf {b} ^{\intercal }\mathbf {a} ={\begin{bmatrix}b_{1}&\cdots &b_{n}\end{bmatrix}}{\begin{bmatrix}a_{1}\\\vdots \\a_{n}\end{bmatrix}}=a_{1}b_{1}+\cdots +a_{n}b_{n}\,.}$
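Both orderings give the same scalar, as a quick NumPy sketch shows (arbitrary illustrative values):

```python
import numpy as np

a = np.array([[1.0], [2.0], [3.0]])  # column vector, shape (3, 1)
b = np.array([[4.0], [5.0], [6.0]])  # column vector, shape (3, 1)

ab = (a.T @ b)[0, 0]  # a^T b, the dot product as a 1 x 1 matrix product
ba = (b.T @ a)[0, 0]  # b^T a, the same scalar by symmetry

print(ab, ba)  # both equal 1*4 + 2*5 + 3*6 = 32.0
```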

The matrix product of a column and a row vector gives the outer product of two vectors a, b, an example of the more general tensor product. The matrix product of the column vector representation of a and the row vector representation of b gives the components of their dyadic product,

${\displaystyle \mathbf {a} \otimes \mathbf {b} =\mathbf {a} \mathbf {b} ^{\intercal }={\begin{bmatrix}a_{1}\\a_{2}\\a_{3}\end{bmatrix}}{\begin{bmatrix}b_{1}&b_{2}&b_{3}\end{bmatrix}}={\begin{bmatrix}a_{1}b_{1}&a_{1}b_{2}&a_{1}b_{3}\\a_{2}b_{1}&a_{2}b_{2}&a_{2}b_{3}\\a_{3}b_{1}&a_{3}b_{2}&a_{3}b_{3}\end{bmatrix}}\,,}$

which is the transpose of the matrix product of the column vector representation of b and the row vector representation of a,

${\displaystyle \mathbf {b} \otimes \mathbf {a} =\mathbf {b} \mathbf {a} ^{\intercal }={\begin{bmatrix}b_{1}\\b_{2}\\b_{3}\end{bmatrix}}{\begin{bmatrix}a_{1}&a_{2}&a_{3}\end{bmatrix}}={\begin{bmatrix}b_{1}a_{1}&b_{1}a_{2}&b_{1}a_{3}\\b_{2}a_{1}&b_{2}a_{2}&b_{2}a_{3}\\b_{3}a_{1}&b_{3}a_{2}&b_{3}a_{3}\end{bmatrix}}\,.}$
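The transpose relationship between the two outer products is easy to verify numerically (arbitrary illustrative values):

```python
import numpy as np

a = np.array([[1.0], [2.0], [3.0]])  # column vector, shape (3, 1)
b = np.array([[4.0], [5.0], [6.0]])  # column vector, shape (3, 1)

outer_ab = a @ b.T  # 3 x 3 matrix a b^T, the dyadic product a ⊗ b
outer_ba = b @ a.T  # 3 x 3 matrix b a^T, the dyadic product b ⊗ a

# b ⊗ a is the transpose of a ⊗ b
print(np.array_equal(outer_ba, outer_ab.T))  # True
```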

## Matrix transformations

 Main article: Transformation matrix

An n × n matrix M can represent a linear map and act on row and column vectors as the linear map's transformation matrix. For a row vector v, the product vM is another row vector p:

${\displaystyle \mathbf {v} M=\mathbf {p} \,.}$

Another n × n matrix Q can act on p,

${\displaystyle \mathbf {p} Q=\mathbf {t} \,.}$

Then one can write t = pQ = vMQ, so the matrix product transformation MQ maps v directly to t. When working with row vectors, each further transformation of n-space is thus applied on the right of the previous output.

When a column vector is transformed to another column vector under an n × n matrix action, the operation occurs to the left,

${\displaystyle \mathbf {p} ^{\mathrm {T} }=M\mathbf {v} ^{\mathrm {T} }\,,\quad \mathbf {t} ^{\mathrm {T} }=Q\mathbf {p} ^{\mathrm {T} },}$

leading to the algebraic expression ${\displaystyle QM\mathbf {v} ^{\mathrm {T} }}$ for the composed output from the input ${\displaystyle \mathbf {v} ^{\mathrm {T} }}$. With column vectors as input, the matrix transformations therefore accumulate on the left.
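The two composition orders describe the same map, as a short NumPy sketch shows (M and Q are arbitrary illustrative matrices):

```python
import numpy as np

M = np.array([[0.0, 1.0], [1.0, 0.0]])  # swaps the two coordinates
Q = np.array([[2.0, 0.0], [0.0, 3.0]])  # scales the axes

v = np.array([[1.0, 2.0]])  # row vector, shape (1, 2)

# Row-vector convention: transformations compose on the right.
t_row = v @ M @ Q

# Column-vector convention: the same maps compose on the left.
t_col = Q @ M @ v.T

# The two results are transposes of each other.
print(np.array_equal(t_row.T, t_col))  # True
```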

## Notes

1. ^ Artin, Michael (1991). Algebra. Englewood Cliffs, NJ: Prentice-Hall. p. 2. ISBN 0-13-004763-5.
