
Suppose you're given the list of $2^{2n}$ complex coefficients defining the state of a $2n$-qubit register. You want to compute the $2^n \times 2^n$ density matrix of the first $n$ qubits.

Is it possible to compute the result in $O(4^n)$ time? That would be optimal, since the output has size $\Theta(4^n)$.

It's easy to hit $O(8^n)$. The naive algorithm, which computes the (unnormalized) density matrix contribution for each possible value of the other $n$ qubits and then adds those contributions together, achieves that:

def density_matrix_of_first_half(amplitude_vector):
  v = amplitude_vector
  M = new complex[2^n, 2^n]
  for r < 2^n:
    offset = r * 2^n
    for i < 2^n:
      for j < 2^n:
        M[i, j] += v[i + offset] * v[j + offset].conjugate()  // only the second factor is conjugated
  return M
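Here is a runnable NumPy sketch of the same naive algorithm (the qubit count `n` is passed explicitly here, and the first $n$ qubits are assumed to be the low bits of the amplitude index, matching the pseudocode's `i + offset` layout):

```python
import numpy as np

def density_matrix_of_first_half(amplitude_vector, n):
    """Naive O(8^n) partial trace: sum the projector contribution for each
    value r of the last n qubits."""
    d = 2 ** n
    v = np.asarray(amplitude_vector, dtype=complex)
    M = np.zeros((d, d), dtype=complex)
    for r in range(d):
        chunk = v[r * d:(r + 1) * d]        # amplitudes where the last n qubits equal r
        M += np.outer(chunk, chunk.conj())  # M[i, j] += v[i + offset] * conj(v[j + offset])
    return M
```

As a sanity check, feeding in the Bell state $(|00\rangle + |11\rangle)/\sqrt{2}$ with $n = 1$ returns the maximally mixed single-qubit state $I/2$.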

But I was hoping it was possible to do better. Are there any papers about this problem?

(The reason I care so much more about $4^n$ vs $8^n$ is because I'm actually computing things with a GPU. My textures can have $2^{2n} = 4^{n}$ pixels, so I get the $4^n$ inner-loop in the application of a single shader. But $8^n$ is far over the max texture size, so the $8^n$ algorithm would require a pipeline of $2^n$ shaders. Or at least might force me to drop the qubit limit another 30%.)

Craig Gidney

1 Answer


It's a matrix multiplication. Not sure how I didn't notice it right away, but the problem is exactly computing $U \cdot U^\dagger$ for a $U$ whose columns are the $2^n$-entry chunks of the amplitude vector.

def density_matrix_of_first_half(amplitude_vector):
  U = new ComplexMatrix(2^n, 2^n)
  for i < 2^n:
    for j < 2^n:
      U[i, j] = amplitude_vector[i + (j << n)]
  P = U * U.adjoint()  // <-- expensive
  return P
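A runnable NumPy version of this reformulation (a sketch; the `order='F'` reshape makes column $j$ equal to chunk $j$, matching `U[i, j] = amplitude_vector[i + (j << n)]`):

```python
import numpy as np

def density_matrix_of_first_half_fast(amplitude_vector, n):
    """Partial trace via matrix multiplication: reshape the amplitude vector
    into a 2^n x 2^n matrix U whose columns are the chunks, then form U U^dag."""
    d = 2 ** n
    # Fortran order: flat index i + j*d maps to entry (i, j), i.e. U[i, j] = v[i + (j << n)].
    U = np.asarray(amplitude_vector, dtype=complex).reshape(d, d, order='F')
    return U @ U.conj().T  # <-- the expensive step, now a single matmul
```

In practice this hands the whole $O$-dominating step to whatever optimized multiplication routine the platform provides.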

Fancy matrix multiplication does asymptotically better than the naive algorithm

Multiplying two $d \times d$ matrices can be done in time $O(d^\omega)$, where $\omega$ is known to be below $2.376$. It's conjectured that there may be algorithms achieving $\omega = 2 + \epsilon$ for every $\epsilon > 0$, but no one has managed to find one yet.

In the context of the question with $2^n \times 2^n$ matrices, the complexity is $O(2^{n \omega})$. However, because the asymptotically faster algorithms tend to have large constant factors hidden by the Big-O notation, in practice it's often better to stick with naive multiplication or Strassen multiplication. Also there are probably GPU-optimized matrix multiplication algorithms that would be more relevant given the note at the end of the question.
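To make the "fancy multiplication" point concrete, here is a sketch of a single level of the standard Strassen recursion (my illustration, not from the post): it replaces the naive 8 block products with 7, and recursing all the way down gives $\omega = \log_2 7 \approx 2.807$.

```python
import numpy as np

def strassen_step(A, B):
    """One level of Strassen's algorithm for 2m x 2m matrices:
    7 block multiplications instead of the naive 8."""
    m = A.shape[0] // 2
    A11, A12, A21, A22 = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    B11, B12, B21, B22 = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    M1 = (A11 + A22) @ (B11 + B22)
    M2 = (A21 + A22) @ B11
    M3 = A11 @ (B12 - B22)
    M4 = A22 @ (B21 - B11)
    M5 = (A11 + A12) @ B22
    M6 = (A21 - A11) @ (B11 + B12)
    M7 = (A12 - A22) @ (B21 + B22)
    return np.block([[M1 + M4 - M5 + M7, M3 + M5],
                     [M2 + M4,           M1 - M2 + M3 + M6]])
```

The extra block additions are why the constant factor is worse than the naive algorithm's until the matrices get fairly large.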

$U \cdot U^\dagger$ is not easier than $U \cdot V$

General matrix multiplication combines two totally independent matrices $U \cdot V$. Given that our density matrix problem only computes $U \cdot U^\dagger$, where the two inputs are totally dependent, one wonders if this "adjoint-squaring" is easier than general multiplication. Turns out it's not: stacking $A$ on top of $B^\dagger$ and adjoint-squaring the result produces $A \cdot B$ as an off-diagonal block, so a faster adjoint-squaring algorithm would immediately give a faster general multiplication algorithm.
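That reduction can be checked numerically (a sketch of the standard argument; with $M = \binom{A}{B^\dagger}$, the product $M M^\dagger$ is the block matrix $\begin{pmatrix} A A^\dagger & A B \\ B^\dagger A^\dagger & B^\dagger B \end{pmatrix}$):

```python
import numpy as np

# Reduce general multiplication A @ B to one adjoint-squaring.
rng = np.random.default_rng(0)
d = 4
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
B = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))

M = np.vstack([A, B.conj().T])  # 2d x d; pad with zero columns to make it square if needed
P = M @ M.conj().T              # [[A A^dag, A B], [B^dag A^dag, B^dag B]]

assert np.allclose(P[:d, d:], A @ B)  # upper-right block is the general product
```

So up to a constant factor, adjoint-squaring is exactly as hard as general matrix multiplication.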

Craig Gidney