In mathematics, the Hadamard product (also known as the element-wise product, entrywise product, or Schur product) is a binary operation that takes two matrices of the same dimensions and returns a matrix in which each element is the product of the corresponding elements of the operands. This operation can be thought of as a "naive matrix multiplication" and is different from the matrix product. It is attributed to, and named after, either French-Jewish mathematician Jacques Hadamard or German-Jewish mathematician Issai Schur.

The Hadamard product is associative and distributive. Unlike the matrix product, it is also commutative.

## Definition

For two matrices A and B of the same dimension m × n, the Hadamard product $A\circ B$ (or $A\odot B$ ) is a matrix of the same dimension as the operands, with elements given by

$(A\circ B)_{ij}=(A\odot B)_{ij}=(A)_{ij}(B)_{ij}.$

For matrices of different dimensions (m × n and p × q, where m ≠ p or n ≠ q), the Hadamard product is undefined.
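As a quick sketch of the definition, here is an illustration in Python with NumPy, whose `*` operator multiplies arrays entrywise (note that NumPy's broadcasting relaxes the strict same-shape rule for some shape pairs):

```python
import numpy as np

A = np.array([[2, 3, 1],
              [0, 8, -2]])
B = np.array([[3, 1, 4],
              [7, 9, 5]])

# Hadamard product: multiply corresponding entries of equal-shaped matrices.
H = A * B

# For matrices of different dimensions the product is undefined;
# here NumPy raises an error for the (2, 3) vs (3, 2) pair.
try:
    A * B.T
except ValueError:
    pass  # undefined: shapes do not match
```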

For example, the Hadamard product for two arbitrary 2 × 3 matrices is:

${\begin{bmatrix}2&3&1\\0&8&-2\end{bmatrix}}\circ {\begin{bmatrix}3&1&4\\7&9&5\end{bmatrix}}={\begin{bmatrix}2\times 3&3\times 1&1\times 4\\0\times 7&8\times 9&-2\times 5\end{bmatrix}}={\begin{bmatrix}6&3&4\\0&72&-10\end{bmatrix}}$

## Properties

• The Hadamard product is commutative (when working with a commutative ring), associative and distributive over addition. That is, if A, B, and C are matrices of the same size, and k is a scalar:
${\begin{aligned}A\circ B&=B\circ A,\\A\circ (B\circ C)&=(A\circ B)\circ C,\\A\circ (B+C)&=A\circ B+A\circ C,\\\left(kA\right)\circ B&=A\circ \left(kB\right)=k\left(A\circ B\right),\\A\circ 0&=0\circ A=0.\end{aligned}}$
• The identity matrix under Hadamard multiplication of two m × n matrices is the m × n matrix in which all elements are equal to 1. This is different from the identity matrix under regular matrix multiplication, where only the elements of the main diagonal are equal to 1. Furthermore, a matrix has an inverse under Hadamard multiplication if and only if none of its elements is equal to zero.
• For vectors x and y, and corresponding diagonal matrices Dx and Dy with these vectors as their main diagonals, the following identity holds:
$\mathbf {x} ^{*}(A\circ B)\mathbf {y} =\operatorname {tr} \left(D_{\mathbf {x} }^{*}AD_{\mathbf {y} }B^{\mathsf {T}}\right),$ where x* denotes the conjugate transpose of x. In particular, using vectors of ones, this shows that the sum of all elements in the Hadamard product is the trace of ABT, where superscript T denotes the matrix transpose; that is, $\operatorname {tr} \left(AB^{\mathsf {T}}\right)=\mathbf {1} ^{\mathsf {T}}\left(A\circ B\right)\mathbf {1}$. A related result for square A and B is that the column-sums of their Hadamard product are the diagonal elements of BTA, while the row-sums are the diagonal elements of ABT:
$\sum _{i}(A\circ B)_{ij}=\left(B^{\mathsf {T}}A\right)_{jj},\qquad \sum _{j}(A\circ B)_{ij}=\left(AB^{\mathsf {T}}\right)_{ii}.$ Similarly,
$\left(\mathbf {y} \mathbf {x} ^{*}\right)\circ A=D_{\mathbf {y} }AD_{\mathbf {x} }^{*}.$ Furthermore, a Hadamard matrix-vector product can be expressed as:
$(A\circ B)\mathbf {y} =\operatorname {diag} \left(AD_{\mathbf {y} }B^{\mathsf {T}}\right)$ where $\operatorname {diag} (M)$ is the vector formed from the main diagonal of matrix M.
• The Hadamard product is a principal submatrix of the Kronecker product.
• The Hadamard product satisfies the rank inequality
$\operatorname {rank} (A\circ B)\leq \operatorname {rank} (A)\operatorname {rank} (B)$
• If A and B are positive-definite matrices, then the following inequality involving the Hadamard product holds:
$\prod _{i=k}^{n}\lambda _{i}(A\circ B)\geq \prod _{i=k}^{n}\lambda _{i}(AB),\quad k=1,\ldots ,n,$ where λi(A) is the ith largest eigenvalue of A.
• If D and E are diagonal matrices, then
${\begin{aligned}D(A\circ B)E&=(DAE)\circ B=(DA)\circ (BE)\\&=(AE)\circ (DB)=A\circ (DBE).\end{aligned}}$
• The Hadamard product of two vectors $\mathbf {a}$ and $\mathbf {b}$ is the same as matrix multiplication of one vector by the corresponding diagonal matrix of the other vector:
$\mathbf {a} \circ \mathbf {b} =D_{\mathbf {a} }\mathbf {b} =D_{\mathbf {b} }\mathbf {a}$
• The vector-to-diagonal-matrix operator $\operatorname {diag}$ may be expressed using the Hadamard product as:
$\operatorname {diag} (\mathbf {a} )=(\mathbf {a} \mathbf {1} ^{T})\circ I$ where $\mathbf {1}$ is a constant vector with elements $1$ and $I$ is the identity matrix.
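Several of the properties above are easy to check numerically; here is a sketch in Python with NumPy (the random test matrices are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
y = rng.standard_normal(3)
a = rng.standard_normal(3)

# The Hadamard identity is the all-ones matrix.
J = np.ones_like(A)
assert np.allclose(A * J, A)

# A Hadamard inverse exists iff no entry is zero (almost surely true here).
assert np.allclose(A * (1.0 / A), J)

# Sum of all entries of A∘B equals tr(A B^T).
assert np.isclose((A * B).sum(), np.trace(A @ B.T))

# Hadamard matrix-vector product: (A∘B)y = diag(A D_y B^T).
assert np.allclose((A * B) @ y, np.diag(A @ np.diag(y) @ B.T))

# diag(a) = (a 1^T) ∘ I.
assert np.allclose(np.diag(a), np.outer(a, np.ones(3)) * np.eye(3))
```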

## The mixed-product property

$(A\otimes B)\circ (C\otimes D)=(A\circ C)\otimes (B\circ D),$ where $\otimes$ is the Kronecker product, assuming $A$ has the same dimensions as $C$, and $B$ the same dimensions as $D$.

$(A\bullet B)\circ (C\bullet D)=(A\circ C)\bullet (B\circ D),$ where $\bullet$ denotes the face-splitting product.

$(A\bullet B)(C\ast D)=(AC)\circ (BD),$ where $\ast$ is the column-wise Khatri–Rao product.
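The first (Kronecker) mixed-product identity can be verified numerically; a sketch with NumPy's `kron` (the shapes are chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A, C = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))  # same shape
B, D = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))  # same shape

lhs = np.kron(A, B) * np.kron(C, D)  # Hadamard product of Kronecker products
rhs = np.kron(A * C, B * D)          # Kronecker product of Hadamard products
assert np.allclose(lhs, rhs)
```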

## Schur product theorem

 Main article: Schur product theorem

The Hadamard product of two positive-semidefinite matrices is positive-semidefinite. This is known as the Schur product theorem, after Issai Schur. For two positive-semidefinite matrices A and B, it is also known that the determinant of their Hadamard product is greater than or equal to the product of their respective determinants:

$\det({A}\circ {B})\geq \det({A})\det({B}).$

## In programming languages

Hadamard multiplication is built into certain programming languages under various names. In MATLAB, GNU Octave, GAUSS and HP Prime it is known as array multiplication, and in Julia as broadcast multiplication, with the symbol .*. In Fortran, R, APL, J and Wolfram Language (Mathematica), it is done through the simple multiplication operator * or ×, whereas the matrix product is done through the function matmul, the %*% operator, +.×, +/ .* and the . operator, respectively. In Python with the NumPy numerical library, multiplication of array objects as a*b produces the Hadamard product, while a@b produces the matrix product. With the SymPy symbolic library, both a*b and a@b produce the matrix product; the Hadamard product can be obtained with a.multiply_elementwise(b). In C++, the Eigen library provides a cwiseProduct member function for the Matrix class (a.cwiseProduct(b)), while the Armadillo library uses the operator % for compact expressions (a % b; a * b is a matrix product). The R package matrixcalc provides the function hadamard.prod() for the Hadamard product of numeric matrices or vectors.
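For instance, in NumPy the distinction between the two products looks like this (a minimal sketch; elementwise division and powers follow the same entrywise convention):

```python
import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])

hadamard = a * b   # entrywise: [[ 5., 12.], [21., 32.]]
matrix = a @ b     # matrix product: [[19., 22.], [43., 50.]]
division = a / b   # entrywise (Hadamard) division
square = a ** 2    # entrywise (Hadamard) square
```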

## Applications

The Hadamard product appears in lossy compression algorithms such as JPEG. The decoding step involves an entry-for-entry product, in other words the Hadamard product.[citation needed]

In image processing, the Hadamard operator can be used for enhancing, suppressing or masking image regions. One matrix represents the original image, the other acts as weight or masking matrix.
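A minimal sketch of such masking with NumPy (the 4 × 4 "image" and the binary mask here are illustrative placeholders):

```python
import numpy as np

# Hypothetical grayscale image and a binary mask that keeps only
# the top-left 2x2 region, suppressing everything else.
image = np.arange(16, dtype=float).reshape(4, 4)
mask = np.zeros((4, 4))
mask[:2, :2] = 1.0

masked = image * mask  # Hadamard product applies the mask entrywise
```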

It is used in the machine learning literature, for example, to describe the architecture of recurrent neural networks such as GRUs or LSTMs.

It is also used to study the statistical properties of random vectors and matrices. 

## Analogous operations

Other Hadamard operations are also seen in the mathematical literature, namely the Hadamard root and Hadamard power (which are in effect the same thing because of fractional indices), defined for a matrix such that:

For

${\begin{aligned}B&=A^{\circ 2}\\B_{ij}&=A_{ij}^{2}\end{aligned}}$

and for

${\begin{aligned}B&=A^{\circ {\frac {1}{2}}}\\B_{ij}&=A_{ij}^{\frac {1}{2}}\end{aligned}}$

The Hadamard inverse reads:

${\begin{aligned}B&=A^{\circ -1}\\B_{ij}&=A_{ij}^{-1}\end{aligned}}$

A Hadamard division is defined as:

${\begin{aligned}C&=A\oslash B\\C_{ij}&={\frac {A_{ij}}{B_{ij}}}\end{aligned}}$

## The penetrating face product

According to the definition of V. Slyusar, the penetrating face product of the p×g matrix $A$ and the n-dimensional matrix $B$ (n > 1) with p×g blocks ($B=[B_{n}]$) is a matrix of the same size as $B$, of the form:

${A}[\circ ]{B}=\left[{\begin{array}{c | c | c | c }{A}\circ {B}_{1}&{A}\circ {B}_{2}&\cdots &{A}\circ {B}_{n}\end{array}}\right].$

### Example

If

${A}={\begin{bmatrix}1&2&3\\4&5&6\\7&8&9\end{bmatrix}},\quad {B}=\left[{\begin{array}{c | c | c }{B}_{1}&{B}_{2}&{B}_{3}\end{array}}\right]=\left[{\begin{array}{c c c | c c c | c c c }1&4&7&2&8&14&3&12&21\\8&20&5&10&25&40&12&30&6\\2&8&3&2&4&2&7&3&9\end{array}}\right]$

${A}[\circ ]{B}=\left[{\begin{array}{c c c | c c c | c c c }1&8&21&2&16&42&3&24&63\\32&100&30&40&125&240&48&150&36\\14&64&27&14&32&18&49&24&81\end{array}}\right].$

### Main properties

${A}[\circ ]{B}={B}[\circ ]{A};$

${M}\bullet {M}={M}[\circ ]\left({M}\otimes \mathbf {1} ^{\textsf {T}}\right),$ where $\bullet$ denotes the face-splitting product of matrices,

$\mathbf {c} \bullet {M}=\mathbf {c} [\circ ]{M},$ where $\mathbf {c}$ is a vector.
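A sketch of the penetrating face product in Python with NumPy, reproducing the example above (the helper name is illustrative, not a standard API; blocks are stored as a list of equal-shaped arrays):

```python
import numpy as np

def penetrating_face_product(A, blocks):
    """Hadamard-multiply A into each p-by-g block and concatenate the
    results side by side, per Slyusar's definition (sketch)."""
    return np.hstack([A * Bk for Bk in blocks])

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
B1 = np.array([[1, 4, 7], [8, 20, 5], [2, 8, 3]])
B2 = np.array([[2, 8, 14], [10, 25, 40], [2, 4, 2]])
B3 = np.array([[3, 12, 21], [12, 30, 6], [7, 3, 9]])

result = penetrating_face_product(A, [B1, B2, B3])
```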

### Applications

The penetrating face product is used in the tensor-matrix theory of digital antenna arrays. This operation can also be used in artificial neural network models, specifically convolutional layers.

## References

1. ^ a b Horn, Roger A.; Johnson, Charles R. (2012). Matrix analysis. Cambridge University Press.
2. ^ Davis, Chandler (1962). "The norm of the Schur product operation". Numerische Mathematik. 4 (1): 343–44. doi:10.1007/bf01386329. S2CID 121027182.
3. ^ a b c Million, Elizabeth (April 12, 2007). "The Hadamard Product" (PDF). buzzard.ups.edu. Retrieved September 6, 2020.
4. ^ "Hadamard product - Machine Learning Glossary". machinelearning.wtf.
5. ^ "linear algebra - What does a dot in a circle mean?". Mathematics Stack Exchange.
6. ^ "Element-wise (or pointwise) operations notation?". Mathematics Stack Exchange.
7. ^ a b Million, Elizabeth. "The Hadamard Product" (PDF). Retrieved 2 January 2012.
8. ^ a b c Styan, George P. H. (1973), "Hadamard Products and Multivariate Statistical Analysis", Linear Algebra and Its Applications, 6: 217–240, doi:10.1016/0024-3795(73)90023-2, hdl:10338.dmlcz/102190
9. ^ Liu, Shuangzhe; Trenkler, Götz (2008). "Hadamard, Khatri-Rao, Kronecker and other matrix products". International Journal of Information and Systems Sciences. 4 (1): 160–177.
10. ^ Liu, Shuangzhe; Leiva, Víctor; Zhuang, Dan; Ma, Tiefeng; Figueroa-Zúñiga, Jorge I. (2022). "Matrix differential calculus with applications in the multivariate linear model and its diagnostics". Journal of Multivariate Analysis. 188: 104849. doi:10.1016/j.jmva.2021.104849. S2CID 239598156.
11. ^ Hiai, Fumio; Lin, Minghua (February 2017). "On an eigenvalue inequality involving the Hadamard product". Linear Algebra and Its Applications. 515: 313–320. doi:10.1016/j.laa.2016.11.017.
12. ^ "Project" (PDF). buzzard.ups.edu. 2007. Retrieved 2019-12-18.
13. ^ Slyusar, V. I. (1998). "End products in matrices in radar applications" (PDF). Radioelectronics and Communications Systems. 41 (3): 50–53.
14. ^ "Arithmetic Operators + - * / \ ^ ' -". MATLAB documentation. MathWorks. Archived from the original on 24 April 2012. Retrieved 2 January 2012.
15. ^ "Matrix multiplication". An Introduction to R. The R Project for Statistical Computing. 16 May 2013. Retrieved 24 August 2013.
17. ^ Sak, Haşim; Senior, Andrew; Beaufays, Françoise (2014-02-05). "Long Short-Term Memory Based Recurrent Neural Network Architectures for Large Vocabulary Speech Recognition". arXiv:1402.1128 [cs.NE].
18. ^ Neudecker, Heinz; Liu, Shuangzhe; Polasek, Wolfgang (1995). "The Hadamard product and some of its applications in statistics". Statistics. 26 (4): 365–373. doi:10.1080/02331889508802503.
19. ^ Neudecker, Heinz; Liu, Shuangzhe (2001). "Some statistical properties of Hadamard products of random matrices". Statistical Papers. 42 (4): 475–487. doi:10.1007/s003620100074. S2CID 121385730.
20. ^ a b Reams, Robert (1999). "Hadamard inverses, square roots and products of almost semidefinite matrices". Linear Algebra and Its Applications. 288: 35–43. doi:10.1016/S0024-3795(98)10162-3.
21. ^ Wetzstein, Gordon; Lanman, Douglas; Hirsch, Matthew; Raskar, Ramesh. "Supplementary Material: Tensor Displays: Compressive Light Field Synthesis using Multilayer Displays with Directional Backlighting" (PDF). MIT Media Lab.
22. ^ Cyganek, Boguslaw (2013). Object Detection and Recognition in Digital Images: Theory and Practice. John Wiley & Sons. p. 109. ISBN 9781118618363.
23. ^ a b c Slyusar, V. I. (March 13, 1998). "A Family of Face Products of Matrices and its properties" (PDF). Cybernetics and Systems Analysis C/C of Kibernetika I Sistemnyi Analiz. 1999. 35 (3): 379–384. doi:10.1007/BF02733426. S2CID 119661450.
24. ^ Ha, D.; Dai, A. M.; Le, Q. V. (2017). "HyperNetworks". The International Conference on Learning Representations (ICLR) 2017, Toulon. p. 6. arXiv:1609.09106.