From the point of view of linear algebra, the "natural" multiplication operation for matrices is the usual matrix product, and many theorems concern this product---e.g. the result $\det(AB) = \det(A)\det(B)$, or $\text{tr}(AB) = \text{tr}(BA)$, etc. However, many matrices one encounters in practice have structure that lets them be written conveniently as an element-wise (Hadamard) product of two other matrices. This is one of the reasons why the default multiplication of arrays is element-wise in many programming languages (e.g. Python). In situations where element-wise products appear, it would be very nice to have theorems (like the determinant and trace relations above) concerning the linear-algebraic character of the element-wise product. My question is: do any "interesting" such theorems exist?
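To make the distinction concrete, here is a small numpy sketch (my own illustration, not part of the question) contrasting the two products: the identities above hold for `A @ B`, but fail in general for the element-wise `A * B`.

```python
import numpy as np

# Two small matrices to contrast the two notions of "product".
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

hadamard = A * B   # element-wise (Hadamard) product: the default "*" for arrays
matprod = A @ B    # the usual matrix product

# det(AB) = det(A) det(B) holds for the matrix product...
det_ok = np.isclose(np.linalg.det(matprod), np.linalg.det(A) * np.linalg.det(B))

# ...but not, in general, for the Hadamard product.
det_hadamard_matches = np.isclose(
    np.linalg.det(hadamard), np.linalg.det(A) * np.linalg.det(B)
)

# Likewise tr(AB) = tr(BA) for the matrix product.
tr_ok = np.isclose(np.trace(A @ B), np.trace(B @ A))
```

Running this, `det_ok` and `tr_ok` are true while `det_hadamard_matches` is false, which is exactly the gap the question asks about: what *can* be said, linear-algebraically, about $A \circ B$?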
[I don't expect to find any results as slick as the above $\det$ and $\text{tr}$ identities, but perhaps there are analogous inequalities, or maybe some non-trivial statements about diagonalizability, or eigenvalue relations, etc.]