In mathematics, given a field ${\displaystyle \mathbb {F} }$, nonnegative integers ${\displaystyle m,n}$, and a matrix ${\displaystyle A\in \mathbb {F} ^{m\times n}}$, a rank decomposition or rank factorization of A is a factorization of A of the form A = CF, where ${\displaystyle C\in \mathbb {F} ^{m\times r}}$ and ${\displaystyle F\in \mathbb {F} ^{r\times n}}$, with ${\displaystyle r=\operatorname {rank} A}$ the rank of ${\displaystyle A}$.

## Existence

Every finite-dimensional matrix has a rank decomposition: Let ${\textstyle A}$ be an ${\textstyle m\times n}$ matrix whose column rank is ${\textstyle r}$. Therefore, there are ${\textstyle r}$ linearly independent columns in ${\textstyle A}$; equivalently, the dimension of the column space of ${\textstyle A}$ is ${\textstyle r}$. Let ${\textstyle \mathbf {c} _{1},\mathbf {c} _{2},\ldots ,\mathbf {c} _{r}}$ be any basis for the column space of ${\textstyle A}$ and place them as column vectors to form the ${\textstyle m\times r}$ matrix ${\textstyle C={\begin{bmatrix}\mathbf {c} _{1}&\mathbf {c} _{2}&\cdots &\mathbf {c} _{r}\end{bmatrix}}}$. Therefore, every column vector of ${\textstyle A}$ is a linear combination of the columns of ${\textstyle C}$. To be precise, if ${\textstyle A={\begin{bmatrix}\mathbf {a} _{1}&\mathbf {a} _{2}&\cdots &\mathbf {a} _{n}\end{bmatrix}}}$ is an ${\textstyle m\times n}$ matrix with ${\textstyle \mathbf {a} _{j}}$ as the ${\textstyle j}$-th column, then

${\displaystyle \mathbf {a} _{j}=f_{1j}\mathbf {c} _{1}+f_{2j}\mathbf {c} _{2}+\cdots +f_{rj}\mathbf {c} _{r},}$

where the ${\textstyle f_{ij}}$ are the scalar coefficients of ${\textstyle \mathbf {a} _{j}}$ in terms of the basis ${\textstyle \mathbf {c} _{1},\mathbf {c} _{2},\ldots ,\mathbf {c} _{r}}$. This implies that ${\textstyle A=CF}$, where ${\textstyle f_{ij}}$ is the ${\textstyle (i,j)}$-th element of ${\textstyle F}$.
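The construction above can be sketched numerically. The following is an illustrative sketch (not from the source) using SymPy: `columnspace` supplies a basis ${\textstyle \mathbf {c} _{1},\ldots ,\mathbf {c} _{r}}$, and the coefficient matrix ${\textstyle F}$ is recovered by solving ${\textstyle CF=A}$, which has a unique solution because ${\textstyle C}$ has full column rank.

```python
import sympy as sp

# Rank-1 example matrix (each column is a multiple of the first).
A = sp.Matrix([[1, 2],
               [2, 4],
               [3, 6]])

# A basis c_1, ..., c_r for the column space of A, placed as columns of C.
C = sp.Matrix.hstack(*A.columnspace())

# Coefficients f_ij: since C has full column rank, C^T C is invertible,
# so F = (C^T C)^{-1} C^T A solves C F = A exactly.
F = (C.T * C).inv() * C.T * A

assert C * F == A
assert C.shape == (3, 1) and F.shape == (1, 2)   # r = 1
```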

## Non-uniqueness

If ${\textstyle A=C_{1}F_{1}}$ is a rank factorization, taking ${\textstyle C_{2}=C_{1}R}$ and ${\textstyle F_{2}=R^{-1}F_{1}}$ gives another rank factorization for any invertible matrix ${\textstyle R}$ of compatible dimensions.

Conversely, if ${\textstyle A=C_{1}F_{1}=C_{2}F_{2}}$ are two rank factorizations of ${\textstyle A}$, then there exists an invertible matrix ${\textstyle R}$ such that ${\textstyle C_{1}=C_{2}R}$ and ${\textstyle F_{1}=R^{-1}F_{2}}$.[1]
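This non-uniqueness is easy to observe numerically. The following sketch (illustrative, not from the source) builds one rank factorization with NumPy and mixes it by an arbitrary invertible matrix ${\textstyle R}$; random Gaussian factors have full rank with probability 1.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
C1 = rng.standard_normal((4, 2))      # 4x2 factor with full column rank
F1 = rng.standard_normal((2, 4))      # 2x4 factor with full row rank
A = C1 @ F1                           # a rank-2 matrix

R = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # any invertible 2x2 matrix
C2 = C1 @ R                           # another valid left factor
F2 = np.linalg.inv(R) @ F1            # the matching right factor

assert np.allclose(C2 @ F2, A)        # C2 F2 = C1 R R^{-1} F1 = A
```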

## Construction

### Rank factorization from reduced row echelon forms

In practice, we can construct one specific rank factorization as follows: compute ${\textstyle B}$, the reduced row echelon form of ${\textstyle A}$. Then ${\textstyle C}$ is obtained by removing from ${\textstyle A}$ all non-pivot columns (which can be identified by looking for columns in ${\textstyle B}$ that do not contain a pivot), and ${\textstyle F}$ is obtained by eliminating the all-zero rows of ${\textstyle B}$.

Note: For a full-rank square matrix (i.e. when ${\textstyle n=m=r}$), this procedure will yield the trivial result ${\textstyle C=A}$ and ${\textstyle F=B=I_{n}}$ (the ${\textstyle n\times n}$ identity matrix).

#### Example

Consider the matrix

${\displaystyle A={\begin{bmatrix}1&3&1&4\\2&7&3&9\\1&5&3&1\\1&2&0&8\end{bmatrix}}\sim {\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\\0&0&0&0\end{bmatrix}}=B{\text{.}}}$

${\textstyle B}$ is in reduced row echelon form.

Then ${\textstyle C}$ is obtained by removing the third column of ${\textstyle A}$, the only one which is not a pivot column, and ${\textstyle F}$ by getting rid of the last row of zeroes from ${\textstyle B}$, so

${\displaystyle C={\begin{bmatrix}1&3&4\\2&7&9\\1&5&1\\1&2&8\end{bmatrix}}{\text{,}}\qquad F={\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\end{bmatrix}}{\text{.}}}$

It is straightforward to check that

${\displaystyle A={\begin{bmatrix}1&3&1&4\\2&7&3&9\\1&5&3&1\\1&2&0&8\end{bmatrix}}={\begin{bmatrix}1&3&4\\2&7&9\\1&5&1\\1&2&8\end{bmatrix}}{\begin{bmatrix}1&0&-2&0\\0&1&1&0\\0&0&0&1\end{bmatrix}}=CF{\text{.}}}$
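The construction can also be checked mechanically. The snippet below (an illustrative sketch using SymPy) reproduces ${\textstyle C}$ and ${\textstyle F}$ for the matrix above from its reduced row echelon form:

```python
import sympy as sp

A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

B, pivots = A.rref()            # reduced row echelon form, pivot column indices
C = A[:, list(pivots)]          # keep only the pivot columns of A
F = B[:len(pivots), :]          # drop the all-zero rows of B

assert pivots == (0, 1, 3)      # the third column (0-based index 2) is not a pivot
assert C * F == A
```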

#### Proof

Let ${\textstyle P}$ be an ${\textstyle n\times n}$ permutation matrix such that ${\textstyle AP=(C,D)}$ in block partitioned form, where the columns of ${\textstyle C}$ are the ${\textstyle r}$ pivot columns of ${\textstyle A}$. Every column of ${\textstyle D}$ is a linear combination of the columns of ${\textstyle C}$, so there is a matrix ${\textstyle G}$ such that ${\textstyle D=CG}$, where the columns of ${\textstyle G}$ contain the coefficients of each of those linear combinations. So ${\textstyle AP=(C,CG)=C(I_{r},G)}$, ${\textstyle I_{r}}$ being the ${\textstyle r\times r}$ identity matrix. We will show now that ${\textstyle (I_{r},G)=FP}$.

Transforming ${\textstyle A}$ into its reduced row echelon form ${\textstyle B}$ amounts to left-multiplying by a matrix ${\textstyle E}$ which is a product of elementary matrices, so ${\textstyle EAP=BP=EC(I_{r},G)}$, where ${\textstyle EC={\begin{pmatrix}I_{r}\\0\end{pmatrix}}}$. We can then write ${\textstyle BP={\begin{pmatrix}I_{r}&G\\0&0\end{pmatrix}}}$, which allows us to identify ${\textstyle (I_{r},G)=FP}$, i.e. the ${\textstyle r}$ nonzero rows of the reduced echelon form, with the same permutation applied to the columns as for ${\textstyle A}$. We thus have ${\textstyle AP=CFP}$, and since ${\textstyle P}$ is invertible this implies ${\textstyle A=CF}$, and the proof is complete.

### Singular value decomposition

If ${\displaystyle \mathbb {F} \in \{\mathbb {R} ,\mathbb {C} \},}$ then one can also construct a full-rank factorization of ${\textstyle A}$ via a singular value decomposition

${\displaystyle A=U\Sigma V^{*}={\begin{bmatrix}U_{1}&U_{2}\end{bmatrix}}{\begin{bmatrix}\Sigma _{r}&0\\0&0\end{bmatrix}}{\begin{bmatrix}V_{1}^{*}\\V_{2}^{*}\end{bmatrix}}=U_{1}\left(\Sigma _{r}V_{1}^{*}\right).}$

Since ${\textstyle U_{1}}$ is a full-column-rank matrix and ${\textstyle \Sigma _{r}V_{1}^{*}}$ is a full-row-rank matrix, we can take ${\textstyle C=U_{1}}$ and ${\textstyle F=\Sigma _{r}V_{1}^{*}}$.
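As a sketch (not from the source), this factorization can be computed numerically with NumPy; the rank is estimated by counting singular values above a scale-aware tolerance, illustrated here on the matrix from the earlier example.

```python
import numpy as np

A = np.array([[1, 3, 1, 4],
              [2, 7, 3, 9],
              [1, 5, 3, 1],
              [1, 2, 0, 8]], dtype=float)

U, s, Vh = np.linalg.svd(A)
# Numerical rank: count singular values above a tolerance scaled by the
# largest singular value and the matrix dimensions.
tol = max(A.shape) * np.finfo(float).eps * s[0]
r = int(np.sum(s > tol))

C = U[:, :r]                    # U_1: full column rank
F = s[:r, None] * Vh[:r, :]     # Sigma_r V_1^*: full row rank

assert r == 3
assert np.allclose(C @ F, A)
```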

## Consequences

### rank(A) = rank(Aᵀ)

An immediate consequence of rank factorization is that the rank of ${\textstyle A}$ is equal to the rank of its transpose ${\textstyle A^{\textsf {T}}}$. Since the columns of ${\textstyle A}$ are the rows of ${\textstyle A^{\textsf {T}}}$, the column rank of ${\textstyle A}$ equals its row rank.[2]

Proof: To see why this is true, let us first define rank to mean column rank. Since ${\textstyle A=CF}$, it follows that ${\textstyle A^{\textsf {T}}=F^{\textsf {T}}C^{\textsf {T}}}$. From the definition of matrix multiplication, this means that each column of ${\textstyle A^{\textsf {T}}}$ is a linear combination of the columns of ${\textstyle F^{\textsf {T}}}$. Therefore, the column space of ${\textstyle A^{\textsf {T}}}$ is contained within the column space of ${\textstyle F^{\textsf {T}}}$ and, hence, ${\textstyle \operatorname {rank} \left(A^{\textsf {T}}\right)\leq \operatorname {rank} \left(F^{\textsf {T}}\right)}$.

Now, ${\textstyle F^{\textsf {T}}}$ is ${\textstyle n\times r}$, so there are ${\textstyle r}$ columns in ${\textstyle F^{\textsf {T}}}$ and, hence, ${\textstyle \operatorname {rank} \left(A^{\textsf {T}}\right)\leq r=\operatorname {rank} \left(A\right)}$. This proves that ${\textstyle \operatorname {rank} \left(A^{\textsf {T}}\right)\leq \operatorname {rank} \left(A\right)}$.

Now apply the result to ${\textstyle A^{\textsf {T}}}$ to obtain the reverse inequality: since ${\textstyle \left(A^{\textsf {T}}\right)^{\textsf {T}}=A}$, we can write ${\textstyle \operatorname {rank} \left(A\right)=\operatorname {rank} \left(\left(A^{\textsf {T}}\right)^{\textsf {T}}\right)\leq \operatorname {rank} \left(A^{\textsf {T}}\right)}$. This proves ${\textstyle \operatorname {rank} \left(A\right)\leq \operatorname {rank} \left(A^{\textsf {T}}\right)}$.

We have, therefore, proved ${\textstyle \operatorname {rank} \left(A^{\textsf {T}}\right)\leq \operatorname {rank} \left(A\right)}$ and ${\textstyle \operatorname {rank} \left(A\right)\leq \operatorname {rank} \left(A^{\textsf {T}}\right)}$, so ${\textstyle \operatorname {rank} \left(A\right)=\operatorname {rank} \left(A^{\textsf {T}}\right)}$.
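As a quick numerical sanity check of this equality (illustrative, not from the source), using the matrix from the earlier example:

```python
import numpy as np

A = np.array([[1, 3, 1, 4],
              [2, 7, 3, 9],
              [1, 5, 3, 1],
              [1, 2, 0, 8]], dtype=float)

# Column rank of A equals column rank of A^T (i.e. the row rank of A).
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 3
```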

## Notes

1. ^ Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.
2. ^ Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388

## References

• Banerjee, Sudipto; Roy, Anindya (2014), Linear Algebra and Matrix Analysis for Statistics, Texts in Statistical Science (1st ed.), Chapman and Hall/CRC, ISBN 978-1420095388
• Lay, David C. (2005), Linear Algebra and its Applications (3rd ed.), Addison Wesley, ISBN 978-0-201-70970-4
• Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations, Johns Hopkins Studies in Mathematical Sciences (3rd ed.), The Johns Hopkins University Press, ISBN 978-0-8018-5414-9
• Stewart, Gilbert W. (1998), Matrix Algorithms. I. Basic Decompositions, SIAM, ISBN 978-0-89871-414-2
• Piziak, R.; Odell, P. L. (1 June 1999). "Full Rank Factorization of Matrices". Mathematics Magazine. 72 (3): 193. doi:10.2307/2690882. JSTOR 2690882.