
Let $H_A$ be a finite-dimensional Hilbert space. I consider linear operators on this space, i.e. the space $\mathcal{L}(H_A)$.

I would like to know (I think I have read it somewhere, but I'm not sure) whether there exists a basis of this space composed of density matrices.

Recall that density matrices are Hermitian, positive semidefinite operators of trace $1$.

I think I can show that any matrix in $\mathcal{L}(H_A)$ can be written as a linear combination of density matrices, as follows.

First, any Hermitian matrix $H$ is a real linear combination of density matrices. Indeed, taking $|\psi_i \rangle$ to be an orthonormal basis in which $H$ is diagonal, we have, with $\lambda_i \in \mathbb{R}$:

$$H=\sum_i \lambda_i |\psi_i \rangle \langle \psi_i |$$

Then, any matrix $A$ can be written as:

$$A=H_1+i H_2$$

where $H_1$ and $H_2$ are Hermitian.

Then $A$ can be written as a linear combination of density matrices, the coefficients being either real or purely imaginary.
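The two-step decomposition above (split into Hermitian parts, then diagonalize each) can be checked numerically. Here is a quick NumPy sketch; it is not from the original post, and all variable names are mine:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Split A into Hermitian parts: A = H1 + i*H2.
H1 = (A + A.conj().T) / 2
H2 = (A - A.conj().T) / 2j

def density_decomposition(H):
    """Diagonalize a Hermitian H as sum_i lambda_i |psi_i><psi_i|,
    where each |psi_i><psi_i| is a pure-state density matrix."""
    lam, V = np.linalg.eigh(H)
    return lam, [np.outer(V[:, i], V[:, i].conj()) for i in range(len(lam))]

l1, rhos1 = density_decomposition(H1)
l2, rhos2 = density_decomposition(H2)

# Reassemble A with real / purely imaginary coefficients on density matrices.
A_rebuilt = sum(c * r for c, r in zip(l1, rhos1)) \
          + 1j * sum(c * r for c, r in zip(l2, rhos2))
print(np.allclose(A, A_rebuilt))  # True
```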

Now, how can one prove that there exists a basis of density matrices in which any operator $A$ can be decomposed? If there is a simple example of such a basis, I would like to see it as well (decomposed in the canonical basis $|i\rangle \langle j|$).

What confuses me a little, and what I have forgotten from basic linear algebra, is the following: I see that any $A$ can be written as a linear combination of density matrices. Does that necessarily imply that there is a basis of density matrices, or not necessarily?

StarBucK

1 Answer


Here's an explicit set of density matrices that form a basis. Take the collection of all matrices of the following forms:

  • $|j\rangle \langle j|$
  • $\frac{1}{2}\left(|j\rangle+|k\rangle\right) \left( \langle j| + \langle k|\right)$
  • $\frac{1}{2}\left(|j\rangle+i|k\rangle\right) \left( \langle j| - i\langle k|\right)$

Here $j, k$ range over some fixed orthonormal basis of $H_A$, with $j < k$ in the second and third families, and $i$ is the imaginary unit. These matrices are clearly Hermitian with unit trace, and they are positive semidefinite because they are all of the form $|v\rangle \langle v|$ for some vector $|v\rangle$. To see that they form a basis for the set of operators on $H_A$, note that there are the right number of them ($n + 2\binom{n}{2} = n^2$, where $n$ is the dimension of $H_A$), so if we can show that they span the space of matrices, we are done.

The diagonal matrices are clearly attained using just the first type of matrices $|j\rangle \langle j|$. Similarly, an off-diagonal matrix $|j\rangle \langle k|$ can be written as
$$\frac{1}{2}\left(|j\rangle+|k\rangle\right) \left( \langle j| + \langle k|\right) + \frac{i}{2}\left(|j\rangle+i|k\rangle\right) \left( \langle j| - i\langle k|\right) - \frac{1+i}{2}|j\rangle \langle j| - \frac{1+i}{2}|k\rangle \langle k|,$$
showing that the above set of matrices spans the space.
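The claim that these $n^2$ density matrices span $\mathcal{L}(H_A)$ can be verified numerically by stacking their vectorizations and checking the rank. A small NumPy sketch (dimension $n=3$ chosen arbitrarily; not part of the original answer):

```python
import numpy as np
from itertools import combinations

n = 3  # dimension of H_A; any n works
e = np.eye(n, dtype=complex)

# First family: |j><j|.
basis = [np.outer(e[j], e[j].conj()) for j in range(n)]
# Second and third families, for j < k.
for j, k in combinations(range(n), 2):
    v = (e[j] + e[k]) / np.sqrt(2)       # (|j>+|k>)/sqrt(2)
    w = (e[j] + 1j * e[k]) / np.sqrt(2)  # (|j>+i|k>)/sqrt(2)
    basis.append(np.outer(v, v.conj()))
    basis.append(np.outer(w, w.conj()))

# Each element is a valid density matrix: Hermitian, trace 1, PSD.
for r in basis:
    assert np.allclose(r, r.conj().T)
    assert np.isclose(np.trace(r), 1)
    assert np.linalg.eigvalsh(r).min() > -1e-12

# n^2 matrices whose vectorized stack has full rank => they span L(H_A).
M = np.stack([r.ravel() for r in basis])
print(len(basis), np.linalg.matrix_rank(M))  # 9 9
```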

More generally, any time you have a subset $S$ of a vector space $V$ whose span contains the entire space, you can choose a basis from said subset $S$. The proof proceeds by induction. Choose an element $b\in S$ of the subset to be the first candidate basis element. Then, given any collection of candidate basis elements $b_1, \dots, b_k$, if the number of them doesn't equal the dimension $n=\dim(V)$ of the vector space, there must be some $x\in S$ which is linearly independent of $b_1,\dots,b_k$, because if not, then the span of $S$ would have strictly lower dimension than $V$! So we can keep adding elements to our candidate basis set $b_1,\dots,b_k$ until we get $n$ linearly independent elements, which are necessarily a basis for $V$.
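The inductive argument above is constructive: scan the spanning set and keep an element exactly when it is linearly independent of the candidates chosen so far. A minimal sketch of that greedy procedure (function name and the toy spanning set are mine, for illustration only):

```python
import numpy as np

def extract_basis(S, dim, tol=1e-10):
    """Greedily pick a basis from a spanning set S of vectors in a
    dim-dimensional space: keep an element iff it raises the rank of
    the candidates chosen so far, mirroring the inductive proof."""
    chosen = []
    for x in S:
        trial = np.stack(chosen + [x])
        if np.linalg.matrix_rank(trial, tol=tol) > len(chosen):
            chosen.append(x)
        if len(chosen) == dim:
            break
    return chosen

# A redundant spanning set of R^2; the procedure keeps 2 independent vectors.
S = [np.array([1.0, 1.0]), np.array([2.0, 2.0]), np.array([1.0, -1.0])]
B = extract_basis(S, 2)
print(len(B))  # 2
```

Applied to the $n^2$ density matrices above (vectorized), this is exactly how one would extract a basis if the set were merely spanning rather than already of the right size.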

Yly