Given $3 \times 3$ matrices $A$ and $B$, how can I find a scalar $s$ that makes the matrix $A + s B$ rank-$1$? Is there a method using singular value decomposition or eigenvalues?

Thanks!

2 Answers

Try Gaussian elimination and pick $s$ so that the resulting echelon matrix has rank $1$ (if possible).
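A minimal SymPy sketch of this idea (the matrices are made up for illustration, and symbolic pivots can need extra care in general): treat $s$ as a symbol, bring $A + sB$ to fraction-free echelon form, and solve for the values of $s$ that zero out every row below the first.

```python
import sympy as sp

s = sp.symbols('s')
# Illustrative matrices (not from the question); A + 1*B happens to have rank 1.
A = sp.Matrix([[2, 1, 0], [0, 3, 1], [1, 0, 4]])
B = sp.Matrix([[-1, 1, 3], [2, 1, 5], [2, 6, 5]])

C = A + s * B
E = C.echelon_form()  # fraction-free elimination; entries are polynomials in s

# Rank 1 forces every entry below the first row to vanish.
eqs = [e for e in E[1:, :] if e != 0]
candidates = sp.solve(eqs, s)
# Keep only the values for which the rank is exactly 1 (not 0).
print([v for v in candidates if (A + v * B).rank() == 1])  # expect [1]
```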

A.Γ.

Generalized eigenvalue problem

$C = A + sB$ has rank $1$ if and only if its range is one-dimensional, so by rank–nullity its nullspace must be two-dimensional. That is, there must exist two linearly independent vectors $x_1, x_2$ such that $$0 = Cx_k = (A + sB) x_k \Leftrightarrow Ax_k = -sBx_k$$ for $k = 1,2$. This is a generalized eigenvalue problem.

Rethinking a bit (or recalling how generalized eigenvalue problems are solved), we see that for such an $s$ to exist, we must first have $\det (A + sB) = 0$. For $3 \times 3$ matrices this is a polynomial equation of degree at most $3$ in $s$, so it might have several solutions. For a specific $s$ satisfying the determinant equation, we can then compute the nullspace of $A + sB$ and check whether it has dimension two. This is very much like computing ordinary eigenvalues and eigenvectors, except that the matrix $B$ plays the role of the identity matrix.
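As a hedged numerical sketch of this route (illustrative matrices again, not data from the question): `scipy.linalg.eigvals(A, B)` solves $Ax = \lambda Bx$, each finite $\lambda$ gives a candidate $s = -\lambda$, and an SVD then checks whether the rank really drops to $1$.

```python
import numpy as np
from scipy.linalg import eigvals

# Illustrative matrices; A + 1*B has rank 1, so lambda = -1 is a
# (double) generalized eigenvalue of A x = lambda B x.
A = np.array([[2., 1., 0.], [0., 3., 1.], [1., 0., 4.]])
B = np.array([[-1., 1., 3.], [2., 1., 5.], [2., 6., 5.]])

for lam in eigvals(A, B):
    if not np.isfinite(lam):
        continue  # infinite eigenvalues occur when B is singular
    s = -lam
    sv = np.linalg.svd(A + s * B, compute_uv=False)
    # Rank 1 <=> exactly one singular value is numerically nonzero.
    if np.count_nonzero(sv > 1e-9 * sv[0]) == 1:
        print("s =", np.real_if_close(s))  # s = 1.0, listed once per eigenvalue
```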

Minor determinants

This is another possible solution route.

The matrix $C = A + sB$ has rank one if and only if all $2 \times 2$ minors of the matrix vanish (the set of matrices of rank $\leq 1$ forms a variety, namely the zero locus of the $2 \times 2$ minor determinants) and at least one entry of $C$ is non-zero. This gives you nine second-degree polynomial equations in the single unknown $s$ that must be satisfied simultaneously.

E.g. the equation for the minor of $C$ corresponding to rows 1 and 2 and columns 1 and 2 is: $$\begin{align} 0 &= \begin{vmatrix} a_{1,1} + s b_{1,1} & a_{1,2} + s b_{1,2} \\ a_{2,1} + s b_{2,1} & a_{2,2} + s b_{2,2} \end{vmatrix} \\ &= (a_{1,1} + s b_{1,1})(a_{2,2} + s b_{2,2}) - (a_{2,1} + s b_{2,1})(a_{1,2} + s b_{1,2}). \end{align}$$
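A short SymPy sketch of this route, reusing the illustrative matrices from the sketches above (again, not data from the question): generate all nine minors and solve them simultaneously.

```python
import sympy as sp

s = sp.symbols('s')
A = sp.Matrix([[2, 1, 0], [0, 3, 1], [1, 0, 4]])   # illustrative data
B = sp.Matrix([[-1, 1, 3], [2, 1, 5], [2, 6, 5]])
C = A + s * B

# All nine 2x2 minors of C must vanish simultaneously (rank <= 1).
pairs = [(0, 1), (0, 2), (1, 2)]
minors = [C.extract([i1, i2], [j1, j2]).det()
          for i1, i2 in pairs for j1, j2 in pairs]
candidates = sp.solve(minors, s)
# Discard any s that makes C the zero matrix (rank 0 rather than 1).
print([v for v in candidates if (A + v * B).rank() == 1])  # expect [1]
```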

Calle