
How can I quickly judge whether matrix A is the inverse matrix of B?

This is an exercise from a course I am taking. It appears in the section on randomized algorithms, so I suspect the solution is related to randomized algorithms.

t24akeru

3 Answers


You might be looking for something like Freivalds' algorithm. It is a randomized algorithm that, given three $n \times n$ matrices $A$, $B$ and $C$, checks whether $A \times B = C$ using random vectors. This reduces the time complexity from $O(n^{2.3729})$ (fast matrix multiplication) to $O(n^2)$, with a high probability of a correct answer. In your case, $A$ and $B$ are the matrices you are given, and $C$ is the identity matrix.
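A minimal sketch of Freivalds' algorithm specialized to the inverse check (the function name and trial count are my own; each trial multiplies by a random 0/1 vector, so a trial costs $O(n^2)$ instead of a full matrix product):

```python
import random

def freivalds_inverse_check(A, B, trials=20):
    """Probabilistically check whether A x B = I using Freivalds' algorithm.

    Each trial picks a random 0/1 vector r and checks A(Br) == Ir = r in
    O(n^2) time. If A x B != I, a single trial passes with probability at
    most 1/2, so the error probability is at most 2^-trials. One-sided error:
    a False result is always correct.
    """
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # Compute B r, then A (B r); compare with I r = r.
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        if ABr != r:
            return False
    return True
```

For example, with $A = \begin{pmatrix}2 & 1\\1 & 1\end{pmatrix}$ and $B = \begin{pmatrix}1 & -1\\-1 & 2\end{pmatrix}$ (genuine inverses), the check accepts; replacing $B$ with the identity makes it reject.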

Nathaniel
phan801

tl;dr: You can make a rough probabilistic judgement in $O(1)$ time

Let's assume you are willing to settle for a test which differentiates "good" matrices $A, B$ from "pretty bad" $A, B$, in the following sense:

  • If $A \times B = I$, the test will accept with high probability.
  • If $A \times B$ is far* from $I$, the test will reject with high probability.
  • If $A \times B$ is close to $I$, you don't care.

If you can live with my relaxation**, then here's your test: repeatedly compute a random cell of $A \times B$ and check it against the corresponding cell of $I$. That is, repeatedly:

  1. Uniformly sample a row index $i$.
  2. Uniformly sample a column index $j$.
  3. Compute the inner product of the $i$'th row of $A$ with the $j$'th column of $B$.
  4. Check that the result is $1$ if $i = j$ and $0$ otherwise.

Each repetition takes $\Theta(n)$ time, and the number of repetitions depends only on your distance parameter and the desired probability of correctness. And it's one-sided error, too :-)
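The steps above can be sketched as follows (the helper name and trial count are my own; each trial is one $\Theta(n)$ inner product):

```python
import random

def sample_cell_test(A, B, trials=50):
    """Property-testing sketch: sample random cells of A x B and compare
    them against the identity matrix. Each trial costs Theta(n); one-sided
    error, since any mismatched cell proves A x B != I.
    """
    n = len(A)
    for _ in range(trials):
        i = random.randrange(n)
        j = random.randrange(n)
        # Inner product of row i of A with column j of B.
        cell = sum(A[i][k] * B[k][j] for k in range(n))
        if cell != (1 if i == j else 0):
            return False  # definitely not inverses
    return True  # A x B is probably close to I
```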

You could go even further and estimate the inner product instead of computing it fully: repeatedly sample pairs of corresponding elements in the two vectors, multiply just that pair, and take an average over these individual element products. Scaled by $n$, the expected value of a single element-pair product is exactly the overall inner product (easy exercise). This reduces your time complexity from $\Theta(n)$ to $O(1)$ (times a function of the distance parameter and desired correctness probability), but now the test has two-sided error.
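A sketch of that estimator for a single cell (the function name and sample count are my own; note the factor of $n$ that makes each sampled product an unbiased estimate of the full inner product):

```python
import random

def estimate_cell(A, B, i, j, samples=20000):
    """Estimate cell (i, j) of A x B by sampling element pairs.

    For a uniformly random index k, the expected value of
    n * A[i][k] * B[k][j] is exactly the inner product
    sum_k A[i][k] * B[k][j], so the sample average converges to the
    true cell value. O(1) per sample, but two-sided error.
    """
    n = len(A)
    total = 0.0
    for _ in range(samples):
        k = random.randrange(n)
        total += n * A[i][k] * B[k][j]
    return total / samples
```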


(*) - This should work w.r.t. $L_k$ norms with $k > 0$. If you don't know what these are, see here.
(**) - This relaxation is the object of study of the field of Property Testing.

einpoklum
  1. Multiply row 1 of A by column 1 of B and check whether the result is 1. For randomly chosen matrices, this will almost surely prove that they're not inverses, and you're done in O(n) time.

  2. Find any eigenvalue and its corresponding eigenvector of A. I think this takes O(n^2) time (quicker than finding all eigenvalue-eigenvector pairs). Check whether this eigenvector is also an eigenvector of B, with the reciprocal eigenvalue. If not, then you're done.

  3. Repeat step 2 with an eigenvector orthogonal to the ones found previously. Keep going until it becomes advantageous to switch and do...

  4. In the space orthogonal to all the previously found eigenvectors, do matrix multiplication.
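Step 1 above, the cheap O(n) screen, can be sketched like this (the function name is my own; it only rules matrices out, it never confirms them):

```python
def quick_check(A, B):
    """Sketch of step 1: the inner product of row 0 of A with column 0
    of B must equal 1 if B is the inverse of A. A mismatch disproves
    inverseness in O(n) time; a match proves nothing by itself.
    """
    n = len(A)
    return sum(A[0][k] * B[k][0] for k in range(n)) == 1
```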