
For any three fixed $3\times3$ matrices $A,B,C$, is it possible to find four $3\times3$ matrices $M,N,P,Q$ with $\text{rank}(M)=\text{rank}(N)=\text{rank}(P)=\text{rank}(Q)=1$ such that $A,B,C$ can all be written as linear combinations of $M,N,P,Q$? Here linear combinations refer to the following equations $$\begin{aligned}A&=\alpha_1M+\alpha_2N+\alpha_3P+\alpha_4Q,\\B&=\beta_1M+\beta_2N+\beta_3P+\beta_4Q,\\C&=\gamma_1M+\gamma_2N+\gamma_3P+\gamma_4Q,\end{aligned}$$ where $\alpha_1,\dots,\alpha_4,\beta_1,\dots,\beta_4,\gamma_1,\dots,\gamma_4$ are arbitrary scalars.

This problem comes from a generalization of the following true statement. For any two fixed $2\times2$ matrices $A',B'$, it is always possible to find three $2\times2$ matrices $M',N',P'$ with $\text{rank}(M')=\text{rank}(N')=\text{rank}(P')=1$ such that $A',B'$ can both be written as linear combinations of $M',N',P'$. The proof is given below.

Proof. Let $\displaystyle A'=\begin{pmatrix}a&b\\c&d\end{pmatrix}$, $\displaystyle B'=\begin{pmatrix}e&f\\g&h\end{pmatrix}$, and set $\displaystyle M'=\begin{pmatrix}0&-x\\0&-y\end{pmatrix}$, $\displaystyle N'=\begin{pmatrix}a&b+x\\c&d+y\end{pmatrix}$, $\displaystyle P'=\begin{pmatrix}e&f+x\\g&h+y\end{pmatrix}$. Clearly $A'=M'+N'$, $B'=M'+P'$, and $\text{rank}(M')=1$ whenever $(x,y)\neq(0,0)$. It remains to guarantee $\text{rank}(N')=\text{rank}(P')=1$, for which it suffices to make $\det(N')=\det(P')=0$. This gives the system $$\left\{\begin{aligned} cx-ay&=ad-bc \\ gx-ey&=eh-gf \end{aligned}\right.$$ which has a solution whenever $(a,c)$ and $(e,g)$ are linearly independent. (If that solution is $(x,y)=(0,0)$, then $\det A'=\det B'=0$, so $A'$ and $B'$ already have rank at most $1$ and the claim is immediate.) The only remaining case is that $(a,c)$ and $(e,g)$ are linearly dependent. In that case we can instead take $\displaystyle M'=\begin{pmatrix}a&0\\c&0\end{pmatrix}$, $\displaystyle N'=\begin{pmatrix}0&b\\0&d\end{pmatrix}$, $\displaystyle P'=\begin{pmatrix}0&f\\0&h\end{pmatrix}$, so that $A'=M'+N'$ and $\displaystyle B'=\frac{e}{a}M'+P'$ (use $\frac{g}{c}$ instead if $a=0$). This completes the proof.
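As a sanity check, the generic case of this construction can be verified numerically. The following sketch uses hypothetical entries for $A'$ and $B'$ whose first columns are linearly independent; it is an illustration, not part of the proof.

```python
# A minimal numerical check of the 2x2 construction (hypothetical entries).
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0   # entries of A'
e, f, g, h = 5.0, 6.0, 7.0, 1.0   # entries of B'; (a,c), (e,g) independent

# Solve  c*x - a*y = ad - bc  and  g*x - e*y = eh - gf  for (x, y).
x, y = np.linalg.solve(np.array([[c, -a], [g, -e]]),
                       np.array([a * d - b * c, e * h - g * f]))

M = np.array([[0.0, -x], [0.0, -y]])
N = np.array([[a, b + x], [c, d + y]])
P = np.array([[e, f + x], [g, h + y]])

assert np.allclose(M + N, [[a, b], [c, d]])   # A' = M' + N'
assert np.allclose(M + P, [[e, f], [g, h]])   # B' = M' + P'
assert all(np.linalg.matrix_rank(X) == 1 for X in (M, N, P))
```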

However, this method does not seem to work when the dimension of the matrices jumps from $2$ to $3$. I doubt that such $M,N,P,Q$ exist for every triple $A,B,C$ in dimension $3$, but I cannot produce a counterexample. Any help will be appreciated. By the way, if the existence of the four matrices is not guaranteed, please tell me the minimum number of rank-$1$ matrices needed to write $A,B,C$ as linear combinations of them, if that is not too hard to prove.

bob
grj040803

1 Answer


Consider: $$A =\begin{pmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1 \end{pmatrix}$$ $$B = \begin{pmatrix} 0 & 1 & 0\\ 0 & 0 & 1\\ 0 & 0 & 0 \end{pmatrix}$$ $$C = \begin{pmatrix} 0 & 0 & 1\\ 0 & 0 & 0\\ 0 & 0 & 0 \end{pmatrix}$$

We have $A$ of rank $3$, $B$ of rank $2$, and $C$ of rank $1$.
We know that $\det(A + xB + yC) = 1$ for all $x$ and $y$, since $A + xB + yC$ is upper triangular with ones on the diagonal. Thus its rank is always $3$.
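This determinant claim is easy to check numerically; the following sketch uses only the three matrices above and a few random choices of $x,y$:

```python
# Check det(A + x*B + y*C) = 1 for several random x, y.
import numpy as np

A = np.eye(3)
B = np.diag([1.0, 1.0], k=1)            # ones on the first superdiagonal
C = np.zeros((3, 3)); C[0, 2] = 1.0     # single one in the top-right corner

rng = np.random.default_rng(0)
for x, y in rng.standard_normal((5, 2)):
    # A + x*B + y*C is upper triangular with unit diagonal, so det = 1
    assert np.isclose(np.linalg.det(A + x * B + y * C), 1.0)
```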

Assume there exist $M,N,P,Q$ of rank $1$ such that: $$\begin{aligned}A&=\alpha_1M+\alpha_2N+\alpha_3P+\alpha_4Q,\\B&=\beta_1M+\beta_2N+\beta_3P+\beta_4Q,\\C&=\gamma_1M+\gamma_2N+\gamma_3P+\gamma_4Q\end{aligned}$$

for some scalars.
Informally, this can be written as:

$$ \begin{pmatrix} A\\ B\\ C \end{pmatrix} = \begin{pmatrix} \alpha_1 & \alpha_2 & \alpha_3 & \alpha_4\\ \beta_1 & \beta_2 & \beta_3 & \beta_4\\ \gamma_1 & \gamma_2 & \gamma_3 & \gamma_4 \end{pmatrix} \begin{pmatrix} M\\ N\\ P\\ Q \end{pmatrix} $$

Let $T$ be the above $3\times 4$ coefficient matrix.

Let: $$\vec{\alpha} = \begin{pmatrix} \alpha_1 & \alpha_2 & \alpha_3 & \alpha_4 \end{pmatrix}\\ \vec{\beta} = \begin{pmatrix} \beta_1 & \beta_2 & \beta_3 & \beta_4 \end{pmatrix}\\ \vec{\gamma} = \begin{pmatrix} \gamma_1 & \gamma_2 & \gamma_3 & \gamma_4 \end{pmatrix}$$

We are interested in the coefficients of $A + xB + yC$ when written as a linear combination of $M,N,P,Q$. In other words, we are interested in the component values of: $$\vec{\alpha} + x\vec{\beta} + y \vec{\gamma}$$

If, for some chosen $x$ and $y$, two of the four component values of $\vec{\alpha} + x\vec{\beta} + y \vec{\gamma}$ become $0$, then $A + xB + yC$ is a linear combination of only two rank-$1$ matrices, so its rank becomes $\leq 2$, leading to a contradiction.
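The rank bound used in this step (any linear combination of two rank-$1$ matrices has rank at most $2$, since its column space lies in the span of two vectors) can be sanity-checked numerically with random hypothetical data:

```python
# Any linear combination s*R1 + t*R2 of two rank-1 matrices has rank <= 2.
import numpy as np

rng = np.random.default_rng(1)
u1, v1, u2, v2 = rng.standard_normal((4, 3))
R1 = np.outer(u1, v1)                   # rank-1 matrix u1 v1^T
R2 = np.outer(u2, v2)                   # rank-1 matrix u2 v2^T

for s, t in rng.standard_normal((10, 2)):
    assert np.linalg.matrix_rank(s * R1 + t * R2) <= 2
```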

Consider now two arbitrary but distinct columns, $m \neq n$, of the coefficient matrix $T$:

$$T_{mn} = \begin{pmatrix} \alpha_m & \alpha_n \\ \beta_m & \beta_n \\ \gamma_m & \gamma_n \end{pmatrix}$$

If the determinant of $\begin{pmatrix} \beta_m & \beta_n \\ \gamma_m & \gamma_n \end{pmatrix}$ were nonzero, then we could find $x,y$ such that $$\begin{pmatrix} \alpha_m & \alpha_n \end{pmatrix} + x \begin{pmatrix} \beta_m & \beta_n \end{pmatrix} + y \begin{pmatrix} \gamma_m & \gamma_n \end{pmatrix} = \begin{pmatrix} 0 & 0 \end{pmatrix},$$

thus: $$\text{rank}(A + xB + yC) < 3$$

This is impossible by the determinant argument above, therefore $\det\begin{pmatrix} \beta_m & \beta_n \\ \gamma_m & \gamma_n \end{pmatrix} = 0$. And this applies to any two columns of $T$.

We conclude that the rank of $\begin{pmatrix} \beta_1 & \beta_2 & \beta_3 & \beta_4\\ \gamma_1 & \gamma_2 & \gamma_3 & \gamma_4 \end{pmatrix}$ is less than $2$, i.e. the vectors $\vec{\beta}$ and $\vec{\gamma}$ are linearly dependent. But then $B$ and $C$, which are the linear combinations of $\begin{pmatrix} M\\ N\\ P\\ Q \end{pmatrix}$ with these coefficient vectors, would be linearly dependent as well. The matrices $B$ and $C$ above are clearly nonzero and not proportional to one another, so we have reached a contradiction. We conclude that $\begin{pmatrix} M\\ N\\ P\\ Q \end{pmatrix}$ and the coefficient matrix $T$ do not exist.

user3257842