
I want to understand the Householder transformation. I get some geometric intuition from one M.SE answer (quoted below).

We start with a square matrix $M$ of dimension $n$. We can think of its $n$ columns as vectors in $\mathbb{R}^n$. We consider the hyperplane determined by the first column (namely, the orthogonal complement of that vector). Next, we reflect each of the columns about this hyperplane. In symbols: $H_1M = [H_1(v_1) \ldots H_1(v_n)]$, where on the RHS we use functional notation for $H_1$. Now, because $v_1$ is normal to the hyperplane, $H_1(v_1)$ looks simple. The rest of the vectors transform like:
$$H_1(v_i) = v_i - 2\,\frac{\langle v_i, v_1\rangle}{\langle v_1, v_1\rangle}\,v_1$$
That is, we subtract twice their projections onto $v_1$ (this gives the formula for Householder reflections). Then we consider the $(n-1)$-dimensional submatrix $M_2$ of $H_1M$ and repeat. The submatrix takes me into the hyperplane, since the first reflection leaves that hyperplane invariant. What we are doing is progressively changing the basis of the underlying space (reflections have $\det = -1 \neq 0$, so they are invertible) so that the vectors have a nice representation. That is what the QR decomposition is: the $Q$ contains the orthonormal vectors, while the $R$ tracks all the changes we have made.

I don't understand the bolded lines:

  • How is "reflect each of the columns about this hyperplane" related to tridiagonalization?
  • What does "we subtract twice their projections onto $v_1$" mean? And how does it give us the formula for Householder reflections?

In short, I want to understand the relation between the geometric intuition and the derivation of the formula.
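
To make my question concrete, here is a small NumPy sketch of how I currently read the construction (the function names are my own). As far as I can tell, for the first column to land on the $e_1$ axis, the reflecting hyperplane has to be taken orthogonal to $u = v_1 + \operatorname{sign}(v_{11})\|v_1\|e_1$ rather than to $v_1$ itself, and "subtract twice the projection" is then applied with this $u$:

```python
import numpy as np

def reflect(u, v):
    # Reflect v about the hyperplane orthogonal to u:
    # subtract twice the projection of v onto u.
    return v - 2.0 * (u @ v) / (u @ u) * u

def householder_qr(M):
    n = M.shape[0]
    R = M.astype(float).copy()
    Q = np.eye(n)
    for k in range(n - 1):
        x = R[k:, k]
        # Normal of the reflecting hyperplane, padded with zeros so the
        # reflection leaves the rows already finished (0..k-1) alone.
        u = np.zeros(n)
        u[k:] = x
        u[k] += np.copysign(np.linalg.norm(x), x[0])  # stable sign choice
        if np.allclose(u, 0.0):
            continue  # column is already zero below the diagonal
        # Reflect every column: H_k R = [H_k(r_1) ... H_k(r_n)].
        R = np.column_stack([reflect(u, R[:, j]) for j in range(n)])
        # Accumulate Q = H_1 H_2 ...; each H_k is symmetric and orthogonal.
        Q = Q @ (np.eye(n) - 2.0 * np.outer(u, u) / (u @ u))
    return Q, R

M = np.random.default_rng(0).standard_normal((4, 4))
Q, R = householder_qr(M)
assert np.allclose(Q @ R, M)                 # M = QR
assert np.allclose(Q.T @ Q, np.eye(4))       # Q is orthogonal
assert np.allclose(np.tril(R, -1), 0.0)      # R is upper triangular
```

Each $H_k$ is symmetric and orthogonal, so $Q = H_1 H_2 \cdots$ is orthogonal while $R$ ends up upper triangular, which I take to be what "the $Q$ contains the orthonormal vectors, while the $R$ tracks all the changes" means.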

falamiw
  • Householder reflectors are primarily used for a QR decomposition. One can also use them to bring a matrix into Hessenberg form as a first step in a diagonalization process. For a symmetric matrix this form is then tridiagonal. In the first step of the Kahan algorithm for the SVD computation one obtains a bidiagonal form. So what did you have in mind? – Lutz Lehmann Sep 23 '21 at 16:24
  • I was interested in symmetric matrices, as that is what our syllabus covers. But other cases are welcome @LutzLehmann – falamiw Sep 23 '21 at 17:03
  • Then you probably want the transformation to Hessenberg form as the start of some variant of Francis's QR algorithm. Then indeed the geometric interpretation of the QR decomposition will be less helpful. Here you want to leave the first component of $v_1$ alone and reflect the other components into the second one. Then applying the same transformation from the right, as is required for a similarity transform, leaves the first column unchanged. As a result the first row and first column are now in tridiagonal form. Repeat for the lower sub-matrix. – Lutz Lehmann Sep 23 '21 at 17:12
  • Have you heard about the expression $I_n-2v_1v_1^t$? ($v_1$ is assumed to be in its unit-norm column vector representation.) – Jean Marie Sep 23 '21 at 17:13
  • Yes, @JeanMarie. Letting $P=I_n-2v_1v_1^t$, then $A^{k+1}=P^{k}A^{k}P^{k}$ approaches a tridiagonal symmetric matrix. And I really want to know why that happens, i.e., how the formula is derived (see the sketch after these comments). – falamiw Sep 23 '21 at 18:11
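
For reference, here is a minimal NumPy sketch of the two-sided iteration described in the comments (function and variable names are illustrative, not from any library): at step $k$ the reflector leaves the first $k+1$ components of the working column alone, and applying the same $P$ from both sides keeps the matrix symmetric while zeroing out row and column $k$ beyond the first off-diagonal.

```python
import numpy as np

def householder_tridiagonalize(A):
    """Reduce a symmetric matrix to tridiagonal form by similarity
    transforms A <- P A P with Householder reflectors P = I - 2 v v^T."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(n - 2):
        # Leave the first component (row k) of the working column alone;
        # reflect the components below it onto the (k+1)-st axis.
        x = A[k + 1:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:
            continue                      # column already in desired form
        v /= norm_v                       # unit-norm, as in the comments
        P = np.eye(n)
        P[k + 1:, k + 1:] -= 2.0 * np.outer(v, v)
        # P is symmetric and orthogonal (P = P^T = P^{-1}), so this is a
        # similarity transform: symmetry and eigenvalues are preserved.
        A = P @ A @ P
    return A

S = np.array([[4., 1., 2., 2.],
              [1., 3., 0., 1.],
              [2., 0., 5., 1.],
              [2., 1., 1., 2.]])
T = householder_tridiagonalize(S)
print(np.round(T, 6))
assert np.allclose(np.triu(T, 2), 0.0)                        # tridiagonal
assert np.allclose(np.linalg.eigvalsh(T), np.linalg.eigvalsh(S))
```

Note that for an $n \times n$ symmetric matrix this terminates after exactly $n-2$ reflections: the tridiagonal form is reached exactly rather than approached, and since each $P$ is orthogonal the eigenvalues are preserved.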
