
I've recently been working on an algorithm for bilinear systems of the form $y = (Lw) \odot (Rx)$, where $\odot$ denotes the elementwise product of two vectors of compatible sizes. In the above, we assume $w \in \mathbb{R}^d$ and $L \in \mathbb{R}^{m \times d}$, with $m \geq 2d$.

One setting of interest has $L$ equal to the first $d$ columns of a $2^k \times 2^k$ Hadamard matrix (we assume that $m = 2^k$ for some $k$). The product $Lx$ can then be computed efficiently using the Fast Walsh-Hadamard Transform (FWHT), as we can write

$$ Lx = H_k \tilde{x} = H_k \begin{pmatrix} x \\ \mathbf{0}_{m - d} \end{pmatrix} $$

where $H_k$ is the $2^k \times 2^k$ Hadamard matrix and $\tilde{x}$ is a zero-padding of $x$ so that $\tilde{x} \in \mathbb{R}^{2^k}$.
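As a sanity check, the zero-padding trick can be sketched in NumPy. The `fwht` below is a textbook radix-2 Walsh-Hadamard butterfly in Sylvester (natural) ordering, and the function names are mine:

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard Transform (Sylvester ordering), O(m log m)."""
    a = np.array(a, dtype=float)  # work on a copy
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            u = a[i:i + h].copy()
            w = a[i + h:i + 2 * h].copy()
            a[i:i + h] = u + w
            a[i + h:i + 2 * h] = u - w
        h *= 2
    return a

def L_times(x, m):
    """L x, with L the first len(x) columns of the m x m Hadamard matrix,
    computed by zero-padding x to length m and applying the FWHT."""
    x_pad = np.zeros(m)
    x_pad[:len(x)] = x
    return fwht(x_pad)
```

The result can be checked against the explicit matrix, e.g. `scipy.linalg.hadamard(m)[:, :d] @ x`, or a Kronecker-power construction of $H_k$.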

I was wondering if there is a way to efficiently compute the product $L^\top v$ for the same matrix $L$ (possibly using the FWHT). In the case where $d$ is a power of $2$, I think we could decompose $L^\top$ using the recursive decomposition of a Hadamard matrix, and obtain something like the following (with $m/d$ copies of $H_d$):

$$ L^\top v = \begin{bmatrix} H_{d} & \dots & H_{d} \end{bmatrix} v $$

Is my line of thinking above correct? Moreover, is there an efficient way to attack the general case, where $d$ is not a power of $2$, without the need to explicitly generate and store the matrix $L$?
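For what it's worth, under Sylvester's construction the first $d = 2^j$ columns of $H_k$ are $m/d$ vertically stacked copies of $H_d$ (the first column of the outer Kronecker factor is all ones), so the proposed decomposition would reduce $L^\top v$ to summing $v$ over its length-$d$ blocks and applying one size-$d$ FWHT, for $O(m + d \log d)$ work. A sketch under that assumption (function names are mine):

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard Transform (Sylvester ordering), O(m log m)."""
    a = np.array(a, dtype=float)  # work on a copy
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            u = a[i:i + h].copy()
            w = a[i + h:i + 2 * h].copy()
            a[i:i + h] = u + w
            a[i + h:i + 2 * h] = u - w
        h *= 2
    return a

def Lt_times_blockwise(v, d):
    """L^T v for d a power of two, assuming L^T = [H_d ... H_d]:
    sum v over its m/d length-d blocks, then apply one size-d FWHT."""
    block_sum = v.reshape(-1, d).sum(axis=0)  # O(m)
    return fwht(block_sum)                    # O(d log d)
```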

I would appreciate any pointers to references and/or an implementation, in case this is a common problem in signal processing.

VHarisop

1 Answer


One partial answer for general $d$:

If we follow Sylvester's construction for the Hadamard matrix $H_k$, which implies that $H_k^\top = H_k$, we can compute $L^\top v$ by performing a full Hadamard matrix-vector product, $H_k v$, and keeping the first $d$ entries of the result. This requires $\Theta(m)$ space and $O(m \log m)$ time, and does not require storing $L$ explicitly.
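A minimal sketch of this, with a textbook radix-2 FWHT standing in for whatever transform routine is available (function names are mine):

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard Transform (Sylvester ordering), O(m log m)."""
    a = np.array(a, dtype=float)  # work on a copy
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            u = a[i:i + h].copy()
            w = a[i + h:i + 2 * h].copy()
            a[i:i + h] = u + w
            a[i + h:i + 2 * h] = u - w
        h *= 2
    return a

def Lt_times(v, d):
    """L^T v via one full-length FWHT: since H_k^T = H_k,
    L^T v equals the first d entries of H_k v."""
    return fwht(v)[:d]
```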

VHarisop