$\begingroup$

Recently my supervisor told me about an efficient way to calculate eigenvalues and eigenvectors of the matrix $A \otimes B$, where $A$ has dimensions $a_{1} \times a_{2}$ and $B$ has dimensions $b_{1} \times b_{2}$. Instead of forming this matrix explicitly (which occupies a large amount of RAM because of its large dimension $(a_{1} b_{1}) \times (a_{2} b_{2})$) and then computing eigenvalues, he proposed an optimized Lanczos method that uses a vector $v_{1}$ with dimension $b_{2}\times a_{1}$, so that at each step the new vector is defined by
$$ w_{i+1} = A v_{i} B^T, $$
which is then used in the Lanczos algorithm.
I'd be grateful if someone could point me to references, or explain here why this method works and say a little about the mathematics behind it.
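For concreteness, here is a small NumPy sketch of the identity behind the proposed step (the sizes and variable names are mine; note that with NumPy's row-major reshape the reshaped vector has shape $a_2\times b_2$, which makes the $A V B^T$ form come out directly):

```python
import numpy as np

rng = np.random.default_rng(0)
a1, a2, b1, b2 = 3, 4, 5, 2     # toy, non-square sizes to expose shape bugs
A = rng.standard_normal((a1, a2))
B = rng.standard_normal((b1, b2))
v = rng.standard_normal(a2 * b2)

# With row-major vectorization, (A kron B) v = vec(A V B^T),
# where V is v reshaped to a2 x b2.
V = v.reshape(a2, b2)
w_small = (A @ V @ B.T).ravel()   # never forms the (a1*b1) x (a2*b2) matrix
w_big = np.kron(A, B) @ v         # explicit Kronecker product, for comparison
assert np.allclose(w_small, w_big)
```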

$\endgroup$
  • $\begingroup$ By the way, the correct identity is $(A\otimes B)\operatorname{vec}(X) = \operatorname{vec}(BXA^T)$, where $T$ stands for a transpose (without conjugation). $\endgroup$ Commented May 26 at 14:12
  • $\begingroup$ You could also ask your supervisor :-) $\endgroup$ Commented May 28 at 17:40
  • $\begingroup$ @WolfgangBangerth it's a task that I have to solve :) $\endgroup$ Commented May 29 at 17:06
  • $\begingroup$ Is it homework? $\endgroup$ Commented May 29 at 17:19

2 Answers

$\begingroup$
  1. There is nothing very complicated behind this idea: since Lanczos only accesses the matrix as a black box, you can use any method of your choice to compute the products $v\mapsto (A\otimes B)v$ needed in the algorithm, and this is a particularly efficient way to do it. All the theory stays the same.

  2. Do you really need to do this? Eigenvalues and a basis of eigenvectors of $A\otimes B$ are obtained easily from those of $A$ and $B$: take $\lambda_i\mu_j$ and $v_i\otimes w_j$, where $\lambda_i,v_i$ are eigenvalues and eigenvectors of $A$ and $\mu_j,w_j$ of $B$.
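Point 2 can be checked numerically; a minimal NumPy sketch (the symmetric test matrices and their sizes are my own choice, matching the symmetry Lanczos assumes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)); A = A + A.T   # symmetric toy factors
B = rng.standard_normal((4, 4)); B = B + B.T

lam, V = np.linalg.eigh(A)   # eigenpairs of the small factors
mu, W = np.linalg.eigh(B)

# The spectrum of A kron B is exactly the set of products lam[i]*mu[j].
prod = np.sort(np.outer(lam, mu).ravel())
direct = np.sort(np.linalg.eigvalsh(np.kron(A, B)))
assert np.allclose(prod, direct)

# And v_i kron w_j is an eigenvector for the eigenvalue lam[i]*mu[j].
x = np.kron(V[:, 0], W[:, 0])
assert np.allclose(np.kron(A, B) @ x, lam[0] * mu[0] * x)
```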

$\endgroup$
  • $\begingroup$ Thanks for your answer! But what about the generalized version of the Kronecker product, $\sum_{i} A_{i} \otimes B_{i}$? It seems to me that the product of eigenvalues doesn't work here. $\endgroup$ Commented May 26 at 11:48
  • $\begingroup$ You are right, the trick doesn't work for a sum of Kronecker products. In that case the Lanczos approach makes sense. $\endgroup$ Commented May 26 at 13:43
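For the sum-of-Kronecker-products case raised in the comments, here is a hedged sketch of the matrix-free approach using SciPy's Lanczos-type solver `eigsh` with a `LinearOperator` (the sizes, the two-term sum, and the symmetry of each factor are my own assumptions):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

rng = np.random.default_rng(1)
n, m = 6, 5

def sym(k):
    # Symmetric factors so that sum_i A_i kron B_i is symmetric.
    M = rng.standard_normal((k, k))
    return M + M.T

A1, A2 = sym(n), sym(n)
B1, B2 = sym(m), sym(m)

def matvec(x):
    # With row-major reshape, (A kron B) x = vec(A X B^T).
    X = x.reshape(n, m)
    return (A1 @ X @ B1.T + A2 @ X @ B2.T).ravel()

op = LinearOperator((n * m, n * m), matvec=matvec, dtype=float)
vals = eigsh(op, k=3, which="LA", return_eigenvectors=False)

# Reference: form the full matrix (feasible only at this toy size).
full = np.kron(A1, B1) + np.kron(A2, B2)
ref = np.sort(np.linalg.eigvalsh(full))[-3:]
assert np.allclose(np.sort(vals), ref)
```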
$\begingroup$

The key idea is to rewrite the Kronecker product as a matrix multiplication. First, if we denote a basis for the space $A$ acts on by $\{|a_{i}\rangle\}$ and one for $B$ by $\{|b_{j}\rangle\}$ (apologies if the bra-ket notation is unconventional here), we can replace $A\otimes B\,|v\rangle$ with an equivalent expression. Here are the necessary steps:

  1. First, writing $|v\rangle$ in the bases above, $$ |v\rangle = \sum_{i,j} v_{ij}\, |a_{i}\rangle\otimes|b_{j}\rangle, $$ we can reshape the vector $|v\rangle$ into a matrix $V$ by simply mapping the second basis ($|b_{j}\rangle$) to its dual space: $$ |v\rangle \to \sum_{i,j} v_{ij}\, |a_{i}\rangle \otimes \langle b_{j}| = V. $$
  2. The product $A\otimes B\,|v\rangle$ can then be brought into matrix-multiplication form: $$ A\otimes B\,|v\rangle = \sum_{i,j} v_{ij}\, (A|a_{i}\rangle) \otimes (B|b_{j}\rangle) \to \sum_{i,j} v_{ij}\, (A|a_{i}\rangle) \otimes (\langle b_{j}|B^T), $$ since the dual vector of $B|b_{j}\rangle$ is $(B|b_{j}\rangle)^T = \langle b_{j}|B^T$. It is now easy to show that the above equals $$ \sum_{i,j,m,n} v_{ij}\, A_{mi} B_{nj}\, |a_{m}\rangle \langle b_{n}|, $$ or in matrix form, $$ A V B^T. $$ Replacing the tensor product by matrix multiplication not only shrinks the matrices involved (no out-of-memory error), it also reduces the arithmetic cost: a dense matrix-vector product with $A\otimes B$ costs $O(a_{1} a_{2} b_{1} b_{2})$ operations, while $A V B^T$ costs only $O(a_{1} a_{2} b_{2} + a_{1} b_{1} b_{2})$.
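The component formula in step 2 can be verified directly with an Einstein-summation call; a small sketch (the sizes are my own choice, non-square to expose shape mistakes):

```python
import numpy as np

rng = np.random.default_rng(3)
a1, a2, b1, b2 = 3, 4, 5, 2
A = rng.standard_normal((a1, a2))
B = rng.standard_normal((b1, b2))
V = rng.standard_normal((a2, b2))    # coefficients v_ij in the |a_i><b_j| basis

# sum_{i,j,m,n} v_ij A_mi B_nj |a_m><b_n|  as an explicit index sum ...
component = np.einsum('ij,mi,nj->mn', V, A, B)
# ... agrees with the matrix form A V B^T.
assert np.allclose(component, A @ V @ B.T)
```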
$\endgroup$
