I know that the fastest known algorithm for multiplying two $m \times m$ matrices runs in time $O(m^{\omega})$, where the current best bound is $\omega \le 2.3728596$ due to Virginia Vassilevska Williams's latest result (see here and here). But I'm not sure how this translates to matrices with other dimensions.
In terms of $\omega$, $l$, $m$, and $n$, what is the fastest known algorithm for multiplying an $l \times m$ and $m \times n$ matrix?
Notes:
If $l > m$ and $n > m$, I think it would just be $O(lnm^{\omega - 2})$: partition the $l \times m$ matrix into $l/m$ square blocks and the $m \times n$ matrix into $n/m$ square blocks, so the product consists of $(l/m)(n/m)$ block products of $m \times m$ matrices, each costing $O(m^{\omega})$. But I'm not sure about when $l < m$ or other such cases.
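The blocking reduction behind the $O(lnm^{\omega-2})$ bound can be sketched as follows (a minimal sketch: `square_mult` is a stand-in for a fast $O(m^{\omega})$ square multiplier, here just plain matrix product, and for simplicity I assume $m$ divides both $l$ and $n$):

```python
import numpy as np

def square_mult(X, Y):
    # Stand-in for a fast O(m^omega) square m x m multiplier
    # (e.g. a Strassen-like algorithm); here ordinary matmul.
    return X @ Y

def rect_mult(A, B):
    """Multiply an l x m matrix A by an m x n matrix B using
    (l/m) * (n/m) square m x m products, i.e. O(l*n*m^(omega-2))
    total work. Assumes m divides l and n."""
    l, m = A.shape
    m2, n = B.shape
    assert m == m2 and l % m == 0 and n % m == 0
    C = np.zeros((l, n))
    for i in range(0, l, m):          # l/m row blocks of A
        for j in range(0, n, m):      # n/m column blocks of B
            C[i:i+m, j:j+m] = square_mult(A[i:i+m, :], B[:, j:j+m])
    return C
```

This only covers the case $l, n \ge m$; when the inner dimension dominates, a different decomposition is needed, which is exactly the part I am unsure about.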
Yuval Filmus's answer points to *Improved Rectangular Matrix Multiplication using Powers of the Coppersmith–Winograd Tensor*, Le Gall and Urrutia 2017. But it is difficult for me to extract from it a precise statement of the complexity for $l \times m$ by $m \times n$. I suspect there is some implicit knowledge involved about which cases of multiplying an $n \times n^\alpha$ by an $n^\alpha \times n$ matrix are easy and which are hard.
Coppersmith 1982, *Rapid Multiplication of Rectangular Matrices*, appears to be the seminal reference; there the quantity I am interested in is called Rank$\langle K,M,N \rangle$, but no derivation of it is given.

The important basic fact these references establish is that multiplying an $n \times n^\alpha$ by an $n^\alpha \times n$ matrix takes approximately $\tilde{O}(n^2)$ time (less than $n^{2 + \epsilon}$ for every $\epsilon > 0$) for sufficiently small $\alpha$, currently approximately $\alpha \le 0.313$. I expect that going from this to $l \times m$ by $m \times n$ is more or less direct or well-known, but this is not my area, and the complexity for $l \times m$ by $m \times n$ is not stated directly.