
I know that the fastest known algorithm for multiplying two $m \times m$ matrices runs in time $m^{\omega}$, where currently we have $\omega = 2.3728596$ due to Virginia Williams's latest result (see here and here). But I'm not sure how this translates to matrices with other dimensions.

In terms of $\omega$, $l$, $m$, and $n$, what is the fastest known algorithm for multiplying an $l \times m$ and $m \times n$ matrix?


Notes:

  • If $l > m$ and $n > m$, I think it would just be $O(lnm^{\omega - 2})$, by partitioning both factors into $m \times m$ blocks. But I'm not sure about when $l < m$ or other such cases.

  • Yuval Filmus's answer points to Improved Rectangular Matrix Multiplication using Powers of the Coppersmith-Winograd Tensor, Le Gall and Urrutia 2017. But it is difficult for me to extract a precise idea of how these results translate to $l \times m$ by $m \times n$. I think there is some implicit knowledge involved in the easy cases and hard cases of multiplying an $n \times n^\alpha$ and $n^\alpha \times n$ matrix.

  • Coppersmith 1982, Rapid Multiplication of Rectangular Matrices, appears to be the seminal reference, where the quantity I am interested in is called Rank<K,M,N>, but does not provide a derivation of it.

  • The important basic fact that these references show is that multiplying an $n \times n^\alpha$ and $n^\alpha \times n$ matrix takes approximately $\tilde{O}(n^2)$ time (less than $n^{2 + \epsilon}$ for every $\epsilon > 0$) for sufficiently small $\alpha$, currently approximately $0.313$ or less. I expect that going from this to $l \times m$ and $m \times n$ is more or less direct or well-known, but this is not my area, and the complexity for $l \times m$ and $m \times n$ is not stated directly.
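The cost count behind the first note can be sketched numerically. This is an illustrative helper (not from the question), ignoring constants and $o(1)$ terms:

```python
# Sketch of the blocking cost count: for l, n >= m, partition the l x m
# and m x n factors into m x m blocks. There are (l/m)*(n/m) block
# products, each a square m x m multiplication costing about m**omega,
# for a total of l * n * m**(omega - 2).
OMEGA = 2.3728596  # square matrix multiplication exponent from the question

def blocked_cost(l, m, n, omega=OMEGA):
    """Arithmetic-cost estimate for the easy case l, n >= m
    (hypothetical helper; constants and o(1) terms ignored)."""
    assert l >= m and n >= m
    num_block_products = (l / m) * (n / m)
    return num_block_products * m**omega  # = l * n * m**(omega - 2)
```

For example, `blocked_cost(8, 2, 8)` counts $16$ square $2 \times 2$ products, which agrees with $l n m^{\omega-2} = 64 \cdot 2^{\omega-2}$.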

Caleb Stanford

2 Answers


More "down to earth" fast matrix multiplication algorithms are described at this url. If you know better ones for these matrix sizes (all up to 32x23), their definitions is welcome.

Nathaniel
sedoglavic

Le Gall improved on Williams's result, and Alman and Williams improved on Le Gall's. The latter is currently the state of the art for multiplying two square matrices.

The best known algorithms for multiplying an $n \times n^\alpha$ by an $n^\alpha \times n$ matrix ("rectangular matrix multiplication") can be found in Le Gall and Urrutia.

If your dimensions are any different, try to reduce your problem to one of the above. For example, let $t = \min(l,m,n)$. You can think of your problem as multiplying an $l/t \times m/t$ matrix by an $m/t \times n/t$ matrix, where the entries of both are themselves $t \times t$ blocks. This takes $(l/t)(m/t)(n/t)$ square multiplications of $t \times t$ matrices, each costing $t^{\omega+o(1)}$, for a total of $lmn\, t^{\omega-3+o(1)}$ (recall that $\omega$ is an infimum!), which is roughly $lmn/t$ if $\omega = 2$. If you only want to assume currently known bounds on $\omega$, you could get better bounds in some cases using rectangular matrix multiplication.
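A minimal sketch of this block reduction, using NumPy's `@` as a stand-in for a fast square multiplier and assuming (for simplicity) that $t$ divides all three dimensions:

```python
import numpy as np

def rect_matmul_blocked(A, B):
    """Multiply an l x m matrix A by an m x n matrix B by partitioning
    both into t x t blocks with t = min(l, m, n).  Each block product is
    a square t x t multiplication, so there are (l/t)*(m/t)*(n/t) of
    them, each costing t^(omega+o(1)) with a fast square algorithm,
    giving l*m*n*t^(omega-3+o(1)) in total.  Illustrative sketch only:
    np.matmul stands in for the fast multiplier, and t is assumed to
    divide l, m, and n."""
    l, m = A.shape
    m2, n = B.shape
    assert m == m2
    t = min(l, m, n)
    assert l % t == 0 and m % t == 0 and n % t == 0
    C = np.zeros((l, n))
    for i in range(0, l, t):          # block row of A
        for j in range(0, n, t):      # block column of B
            for k in range(0, m, t):  # inner block index
                C[i:i+t, j:j+t] += A[i:i+t, k:k+t] @ B[k:k+t, j:j+t]
    return C
```

The recursion bottoms out in square products, which is exactly why the total is $lmn\, t^{\omega-3+o(1)}$: each of the $(l/t)(m/t)(n/t) = lmn/t^3$ block products costs $t^{\omega+o(1)}$.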

Yuval Filmus