
In my case $\mathscr{H}$ is a finite-dimensional Hilbert space, and I am looking at $B(\mathscr{H}\otimes\mathscr{H})$, i.e. the matrices over $\mathscr{H}\otimes\mathscr{H}$. What is the condition that an $M\in B(\mathscr{H}\otimes\mathscr{H})$ has to fulfill so that it can be written as $M=A\otimes B$, where $A,B\in B(\mathscr{H})$?

More abstractly speaking: what is the condition that a rank-$2$ tensor has to fulfill to be pure? (Is this even the case here? Or am I asking about rank-$4$ tensors, because the matrices are already rank-$2$ tensors?) Why is it this condition?

I found this question which seems to ask the same thing, but I think mine is not a duplicate, because

  1. I have the more concrete case of matrices over a Hilbert space, which may help to get a simpler condition, and

  2. I am not satisfied with the answer, because it does not become clear to me why the conditions on the $f_{i,j}$ have to be this (and I think there is a typo in the condition), nor why this is the same as the matrix having rank $1$. Also I am not sure what exactly the generalization is, i.e. whether it means that the matrix $f_{ij}$ always has rank $1$ (which is what I think).

  • $M$ being decomposable as $M = A \otimes B$ is equivalent to $M$ having matrix rank 1. Moreover, $M$ being decomposable as $M = A \otimes B + C \otimes D$ is equivalent to it having matrix rank 2, etc. – CrabMan Aug 20 '24 at 14:49
  • @CrabMan when $A=C$ your $M$ should have rank one. And even that is sloppy: at most one sounds better. – Kurt G. Aug 20 '24 at 16:27
  • @Crabman Are you sure you are using the same definition of rank as OP? – Chris Aug 21 '24 at 06:18
  • @Chris I was talking about tensor rank and CrabMan is talking about matrix rank, so we were talking about different things; but I understand it now, because they specified matrix rank. – Sinthoras Aug 21 '24 at 06:22
  • @CrabMan yes, this seems to be the condition, thank you. But can you explain why this corresponds to having matrix rank of at most one? – Sinthoras Aug 21 '24 at 06:25
  • @Sinthoras I unfortunately do not understand what CrabMan means by $M=A\otimes B$ is equivalent to $M$ having matrix rank $1$. This does not seem true to me: if $A=I$ and $B=I$, then $A\otimes B$ is again the identity matrix, which has full rank. – Chris Aug 21 '24 at 06:27
  • @Chris You were right in that it is more complicated. Actually the condition is that the matrix with some indices permuted has to have matrix rank $1$; see the accepted answer. The reason is that we use the isomorphism $\mathcal{H}\otimes\mathcal{H}\otimes\mathcal{H}^*\otimes\mathcal{H}^*\to(\mathcal{H}\otimes\mathcal{H}^*)\otimes(\mathcal{H}\otimes\mathcal{H}^*)$. – Sinthoras Aug 21 '24 at 10:50

2 Answers


For a finite-dimensional $\mathcal{H}$ with $\dim \mathcal{H} = n$ we have an isomorphism $B(\mathcal{H} \otimes \mathcal{H}) \cong \mathcal{H} \otimes \mathcal{H} \otimes \mathcal{H}^* \otimes \mathcal{H}^*$, so for some aspects it is important to view elements of $B(\mathcal{H} \otimes \mathcal{H})$ as $4$-tensors. If $M \in B(\mathcal{H} \otimes \mathcal{H})$, then I will write $M^{ij}_{kl}$ for the coordinates of this tensor, with the top indices corresponding to contravariant ($\mathcal{H}$) factors, and bottom indices corresponding to covariant ($\mathcal{H}^*$) factors.

Your problem does not need this; it is a matrix rank problem in disguise. But to understand how to form the right matrix, it is useful to look at our $4$-tensor.

There are different ways to group factors in the tensor product $\mathcal{H} \otimes \mathcal{H} \otimes \mathcal{H}^* \otimes \mathcal{H}^*$. If we look at $M \in B(\mathcal{H} \otimes \mathcal{H})$ as a linear map and form a matrix of this linear map, we essentially use the grouping $\mathcal{H} \otimes \mathcal{H} \otimes \mathcal{H}^* \otimes \mathcal{H}^* = (\mathcal{H} \otimes \mathcal{H}) \otimes (\mathcal{H}^* \otimes \mathcal{H}^*)$ to get from a $4$-tensor of format $n \times n \times n \times n$ to a matrix ($2$-tensor) of size $n^2 \times n^2$. This is not what we need.

To see if $M \in B(\mathcal{H} \otimes \mathcal{H})$ can be represented as $M = A \otimes B$ with $A, B \in B(\mathcal{H})$, we use the isomorphism $B(\mathcal{H} \otimes \mathcal{H}) \cong B(\mathcal{H}) \otimes B(\mathcal{H})$, which in the tensor language corresponds to the isomorphism $$\mathcal{H} \otimes \mathcal{H} \otimes \mathcal{H}^* \otimes \mathcal{H}^* \cong (\mathcal{H} \otimes \mathcal{H}^*) \otimes (\mathcal{H} \otimes \mathcal{H}^*)$$ where the first and the third factors are grouped into one side, and the second and the fourth into the other.

In coordinates, the element $M \in B(\mathcal{H} \otimes \mathcal{H})$ corresponds to an $n^2 \times n^2$ matrix $\mathbf{M} = (m_{ik, jl})$ with entries $m_{ik, jl} = M^{ij}_{kl}$. If $M = A \otimes B$ with $A, B \in B(\mathcal{H})$, then we have $M^{ij}_{kl} = A^i_k B^j_l$. Equivalently, $m_{ik, jl} = a_{ik} b_{jl}$ (here $a$ and $b$ are $n^2$-dimensional vectors corresponding to the matrices $A$ and $B$), which means that the matrix $\mathbf{M}$ is of rank at most $1$. There are many ways to check this, for example by checking the $2 \times 2$ minors of $\mathbf{M}$, which gives the following: the decomposition $M = A \otimes B$ exists if and only if $$M^{ij}_{kl} M^{pq}_{rs} - M^{iq}_{ks} M^{pj}_{rl} = 0$$ for all values of the indices.
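For readers who want to test this criterion numerically, here is a small NumPy sketch (the helper names `realign` and `is_pure` are my own, not standard): it rearranges the operator matrix so that rows are indexed by $(i,k)$ and columns by $(j,l)$, then checks for rank at most $1$ via singular values.

```python
import numpy as np

def realign(M, n):
    """Turn the n^2 x n^2 operator matrix M (rows (i,j), columns (k,l))
    into the matrix m_{ik,jl} = M^{ij}_{kl} with rows (i,k), columns (j,l)."""
    T = M.reshape(n, n, n, n)          # axes ordered as i, j, k, l
    return T.transpose(0, 2, 1, 3).reshape(n * n, n * n)  # now i, k, j, l

def is_pure(M, n, tol=1e-10):
    """M = A (x) B for some A, B in B(H) iff realign(M, n) has rank <= 1."""
    s = np.linalg.svd(realign(M, n), compute_uv=False)  # sorted descending
    return s[1] <= tol * max(s[0], 1.0)

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
print(is_pure(np.kron(A, B), n))                  # True
print(is_pure(np.kron(A, B) + np.kron(B, A), n))  # False (generically rank 2)
```

The `transpose(0, 2, 1, 3)` step is exactly the regrouping $(\mathcal{H}\otimes\mathcal{H}^*)\otimes(\mathcal{H}\otimes\mathcal{H}^*)$ described above.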

I stress again that the matrix $\mathbf{M}$ is not the matrix of $M$ as a linear operator, but corresponds to a different rearrangement of components $M^{ij}_{kl}$. For example, consider $\mathcal{H} = \mathbb{C}^2$ and $M = I \otimes I$. As a linear map, $M$ is the identity and has a full rank (rank $4$) matrix. But the matrix $\mathbf{M}$ constructed above will have rank $1$. We have $$M^{ij}_{kl} = \begin{cases}1 & \text{if $i = k$ and $j = l$,} \\ 0 & \text{otherwise}.\end{cases}$$ We have $m_{ik, jl} = 1$ if and only if $ik \in \{11, 22\}$ and $jl \in \{11, 22\}$, and $$\mathbf{M} = \begin{bmatrix}1 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 1\end{bmatrix}.$$ Here the order of rows and columns is $11, 12, 21, 22$.
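The $I \otimes I$ example can also be reproduced numerically. This snippet (my own illustration, using the same axis rearrangement as in the answer) shows the operator matrix has rank $4$ while the rearranged matrix $\mathbf{M}$ has rank $1$:

```python
import numpy as np

# H = C^2, M = I (x) I: as a linear map M is the 4x4 identity (rank 4),
# but rearranging the entries as m_{ik,jl} = M^{ij}_{kl} gives rank 1.
n = 2
M = np.kron(np.eye(n), np.eye(n))   # 4x4 identity
Mre = M.reshape(n, n, n, n).transpose(0, 2, 1, 3).reshape(n * n, n * n)
print(np.linalg.matrix_rank(M))     # 4
print(np.linalg.matrix_rank(Mre))   # 1
print(Mre.astype(int))              # matches the matrix displayed above
```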

  • Thank you for the answer! "which means that the matrix M is of rank at most 1" Why is that? Where does this come from? Additionally: can you give me a hint where the if-and-only-if condition comes from, or point to a resource (e.g. a book or an article) where it is derived? – Sinthoras Aug 21 '24 at 06:18
  • That is basic linear algebra. For a matrix $M = (m_{IJ})$ we have $m_{IJ} = a_I b_J$ iff all columns are proportional to each other iff $M$ has rank at most $1$ iff all $2 \times 2$ minors vanish ($m_{IJ} m_{KL} - m_{IL} m_{KJ} = 0$). In our case the indices are two-letter, so we have $M = A \otimes B$ iff $m_{ik,jl} = a_{ik} b_{jl}$ iff $\operatorname{rk}\mathbf{M} \leq 1$ iff $m_{ik,jl}m_{pr,qs} - m_{ik,qs} m_{pr,jl} = 0$ iff $M^{ij}_{kl} M^{pq}_{rs} - M^{iq}_{ks} M^{pj}_{rl} = 0$. – Vladimir Lysikov Aug 21 '24 at 06:35
  • See this question: https://math.stackexchange.com/questions/1545118/a-rank-one-matrix-is-the-product-of-two-vectors – Vladimir Lysikov Aug 21 '24 at 06:37
  • Thanks, that explains that and was helpful! :) In another comment the example $M = I \otimes I$ came up, which is obviously of the type $A \otimes B$, but seems to have full rank, because it is the identity. I feel like this has something to do with the fact that $\mathbf{M}$ is not the matrix of $M$ as a linear operator, but I cannot figure out what exactly the problem is. – Sinthoras Aug 21 '24 at 07:34
  • As I say in the answer, the matrix we form here is not the matrix of the linear map. The matrix of the linear map $I \otimes I$ will of course have full rank, but the matrix $\mathbf{M}$ constructed after rearranging the factors will not. I added an example to the answer. – Vladimir Lysikov Aug 21 '24 at 07:43
  • That clears it up. Thanks a lot! :) – Sinthoras Aug 21 '24 at 10:47

I cannot fit this as a comment for the answer by @VladimirLysikov.

If I write $M^{ij}_{kl}$ as $\delta^i_k \delta^j_l$ (which is your definition), with $i,j$ as row indices, $k,l$ as column indices I get $$ \begin{array}{cccc} &\begin{array}{} {\small k=1} & {\small k=1} \\ {\small l=1} & {\small l=2} \end{array} &\begin{array}{} {\small k=2} & {\small k=2} \\ {\small l=1} & {\small l=2} \end{array}\\ \begin{array}{} {\small i=1, \; j=1} \\ {\small i=1, \; j=2 } \end{array} & \begin{array}{cc} 1 \phantom{xx} & 0 \\ 0 \phantom{xx}& 1 \end{array} & \begin{array}{cc} 0 \phantom{xx} & 0 \\ 0 \phantom{xx}& 0 \end{array} \\ \begin{array}{} {\small i=2, \; j=1} \\ {\small i=2, \; j=2 } \end{array} & \begin{array}{cc} 0 \phantom{xx} & 0 \\ 0 \phantom{xx}& 0 \end{array} & \begin{array}{cc} 1 \phantom{xx} & 0 \\ 0 \phantom{xx}& 1 \end{array} \end{array} $$ which, treated as a matrix, is full rank. To get your matrix I must write $M^{ij}_{kl}=\delta^{ij}\delta_{kl}$ in which case $$ \begin{array}{cccc} &\begin{array}{} {\small k=1} & {\small k=1} \\ {\small l=1} & {\small l=2} \end{array} &\begin{array}{} {\small k=2} & {\small k=2} \\ {\small l=1} & {\small l=2} \end{array}\\ \begin{array}{} {\small i=1, \; j=1} \\ {\small i=1, \; j=2 } \end{array} & \begin{array}{cc} 1 \phantom{xx} & 0 \\ 0 \phantom{xx}& 0 \end{array} & \begin{array}{cc} 0 \phantom{xx} & 1 \\ 0 \phantom{xx}& 0 \end{array} \\ \begin{array}{} {\small i=2, \; j=1} \\ {\small i=2, \; j=2 } \end{array} & \begin{array}{cc} 0 \phantom{xx} & 0 \\ 1 \phantom{xx}& 0 \end{array} & \begin{array}{cc} 0 \phantom{xx} & 0 \\ 0 \phantom{xx}& 1 \end{array} \end{array}. $$ Your definition corresponds to the first matrix.

Ted Black
  • This is the matrix of $M$ as a linear map (identity). The matrix is my answer is obtained after a rearrangement of indices, it should be indexed not by $i,j$ and $k,l$ but by $i,k$ and $j,l$. – Vladimir Lysikov Aug 24 '24 at 10:26
  • Thanks @VladimirLysikov ; see the changes to my comment. I thought, for consistency, contravariant indices are always row indices. – Ted Black Aug 24 '24 at 10:34
  • Again, we take $M^{ij}_{kl} = \delta^{i}_k \delta^j_l$ and group the indices as follows: $i$ with $k$ and $j$ with $l$. Not $i$ with $j$ and $k$ with $l$. We get $\begin{bmatrix} & {\scriptstyle j = 1, l = 1} & {\scriptstyle j = 1, l = 2} & {\scriptstyle j = 2, l = 1} & {\scriptstyle j = 2, l = 2} \\ {\scriptstyle i = 1, k = 1} & 1 & 0 & 0 & 1 \\ {\scriptstyle i = 1, k = 2} & 0 & 0 & 0 & 0 \\ {\scriptstyle i = 2, k = 1} & 0 & 0 & 0 & 0 \\ {\scriptstyle i = 2, k = 2} & 1 & 0 & 0 & 1 \end{bmatrix}$ – Vladimir Lysikov Aug 24 '24 at 10:42
  • There is nothing in the rules that says that we cannot group a covariant and a contravariant index. Covariance starts to get murky after that, but it will be the case with grouped indices anyway. Rows and columns in this case correspond to the same space, although it is naturally self-dual. What is important is that the rank of the matrix we get is invariant under $\operatorname{GL}(\mathcal{H})$. – Vladimir Lysikov Aug 24 '24 at 10:43
  • @VladimirLysikov this is the confusing bit; $i$ is contravariant and $k$ is covariant but you show them both as row indices. If $x^i$ are the components of a vector and $y_k$ are the components of a covector, I have to use a column vector for $x^i$ and a row vector for $y_k$. – Ted Black Aug 24 '24 at 10:45
  • It is not necessary that a matrix always has covariant rows and contravariant columns. For example, in a matrix of a bilinear form both rows and columns are covariant, and it transforms differently from a matrix of a linear map under base change. In this case both rows $ik$ and columns $jl$ have one covariant and one contravariant index, so rows and columns are also the same here; they correspond to a space of type $\mathcal{H}\otimes \mathcal{H}^*$. – Vladimir Lysikov Aug 24 '24 at 10:51
  • @VladimirLysikov I agree that, for illustration purposes, a bilinear form $V \times V \to \mathbb{R}$ can be represented by a matrix. I associate matrices with maps like $V \to V$ which correspond to $V \times V^* \to \mathbb{R}$ bilinear forms. In any case, my comment was made to improve my understanding of the finer points in your answer. – Ted Black Aug 24 '24 at 11:06
  • For me, a matrix is a table of numbers, and linear maps, bilinear forms or more complicated things can be represented by matrices for computation. I hope my comments helped; if they came across as a bit hostile, that was not my intention. – Vladimir Lysikov Aug 24 '24 at 11:11