
$M$ is a $3\times 3$ orthogonal matrix. $A$ is the $2\times 2$ matrix in the upper-left corner of $M$, and $B$ is the $1\times 1$ matrix in the lower-right corner of $M$. $M_1$ is the matrix whose first two columns are the same as those of $M$ and whose last column is the same as that of $I$ (the $3\times 3$ identity matrix). $M_2$ is the matrix whose first two columns are identical to those of $I$ and whose third column is identical to that of $M$. We have to prove that $MM_2=M_1$.

I am not able to understand how to approach this problem. Simply writing out the $9$ entries of $M$ and setting up equations becomes too long. Is there an elegant or shorter method to tackle this problem?

Fawkes4494d3
V2002

2 Answers


Regarding block matrix multiplication: if you have an $m\times n$ matrix $A$ and an $n\times p$ matrix $B$ partitioned as $$A=\begin{pmatrix} \alpha_{a\times b} & \beta_{a\times (n-b)}\\ \gamma_{(m-a)\times b} & \delta_{(m-a)\times (n-b)} \end{pmatrix}, \quad B=\begin{pmatrix} \eta_{b\times c} & \zeta_{b\times (p-c)} \\ \theta_{(n-b)\times c} & \mu_{(n-b)\times(p-c)} \end{pmatrix}$$ where $\alpha, \beta,\gamma,\delta,\eta,\zeta,\theta,\mu$ are submatrices with the dimensions shown, then the product $AB$ is calculated exactly as if $\alpha,\beta,\gamma,\delta,\eta,\zeta,\theta,\mu$ were numbers (note that all the matrix products below are compatible, i.e. in each product the number of columns of the first factor equals the number of rows of the second): $$AB=\begin{pmatrix} \alpha\eta+\beta\theta &\alpha\zeta+\beta\mu \\ \gamma\eta+\delta\theta & \gamma\zeta+\delta\mu \end{pmatrix}$$
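As a quick numerical sanity check, here is a sketch (assuming NumPy; the matrices and variable names are hypothetical, chosen to mirror the block labels above) showing that blockwise multiplication of two partitioned $3\times 3$ matrices reproduces the ordinary product:

```python
import numpy as np

# Two arbitrary 3x3 matrices, partitioned into 2+1 blocks.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Blocks of A: alpha (2x2), beta (2x1), gamma (1x2), delta (1x1)
alpha, beta = A[:2, :2], A[:2, 2:]
gamma, delta = A[2:, :2], A[2:, 2:]
# Blocks of B: eta (2x2), zeta (2x1), theta (1x2), mu (1x1)
eta, zeta = B[:2, :2], B[:2, 2:]
theta, mu = B[2:, :2], B[2:, 2:]

# Blockwise product, exactly as in the 2x2 "numbers" formula above.
blockwise = np.block([
    [alpha @ eta + beta @ theta, alpha @ zeta + beta @ mu],
    [gamma @ eta + delta @ theta, gamma @ zeta + delta @ mu],
])
assert np.allclose(blockwise, A @ B)
```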


So we have $M_{3\times 3}=\begin{pmatrix} A_{2\times 2} & a_{2\times 1} \\ b_{1\times 2} & B\end{pmatrix}, M_{1}=\begin{pmatrix} A_{2\times 2} & 0_{2\times 1} \\ b_{1\times 2} & 1\end{pmatrix}$ where the dimensions of the submatrices are written as subscripts; since the $1\times 1$ matrix $B$ is just a number, $B_{1\times 1}=B$,
$a_{2\times 1}= (a_1 \quad a_2)^T, b_{1\times 2}=(b_1\quad b_2)$, and $0_{2\times 1}=(0 \quad 0)^T$ is a zero matrix of the appropriate dimension.
Denoting the $n\times n$ identity matrix as $I_n$ we thus have $$M_2=\begin{pmatrix} 1&0& a_1 \\ 0&1&a_2 \\ 0&0&B\end{pmatrix}=\begin{pmatrix} I_2 &a_{2\times 1}\\ 0_{1\times 2} & B \end{pmatrix}\\ \implies MM_2=\begin{pmatrix} A_{2\times 2} & a_{2\times 1} \\ b_{1\times 2} & B\end{pmatrix}\begin{pmatrix} I_2 &a_{2\times 1}\\ 0_{1\times 2} & B \end{pmatrix}\\ = \begin{pmatrix} A_{2\times 2}I_2 + a_{2\times 1}0_{1\times 2} &A_{2\times 2}a_{2\times 1}+a_{2\times 1}B\\ b_{1\times 2}I_2+B0_{1\times 2} & b_{1\times 2}a_{2\times 1}+B^2 \end{pmatrix}$$ Note that
$(1)$ any $n\times m$ matrix multiplied by an $m\times p$ zero matrix gives an $n\times p$ zero matrix;
$(2)$ any $n\times m$ matrix multiplied by the identity matrix $I_m$ gives itself.
Thus $$A_{2\times 2}I_2 + a_{2\times 1}0_{1\times 2}=A_{2\times 2}\\ b_{1\times 2}I_2+B0_{1\times 2}=b_{1\times 2} \\ \implies MM_2=\begin{pmatrix} A_{2\times 2} & A_{2\times 2}a_{2\times 1}+a_{2\times 1}B\\ b_{1\times 2} & b_{1\times 2}a_{2\times 1}+B^2\end{pmatrix}$$ Note that the first column of $MM_2$ in the partitioned form already matches the first column of the partitioned form of $M_1$ from above. I'll leave the rest of it to you, where you can try to use the fact that $M$ is orthogonal, i.e. $MM^T=M^TM=I_3$. From the partitioned form of $M$ above, $M^T=\begin{pmatrix} A^T_{2\times 2} & b^T_{2\times 1} \\ a^T_{1\times 2} & B\end{pmatrix}$.
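The partitioned computation above can be checked numerically. This sketch (assuming NumPy; the random orthogonal $M$, obtained from a QR factorisation, is a hypothetical choice) extracts the blocks $A$, $a$, $b$, $B$ and compares the blockwise formula with $MM_2$:

```python
import numpy as np

# A random 3x3 orthogonal M from the QR factorisation of a random matrix.
rng = np.random.default_rng(0)
M, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Blocks of M: A (2x2), a (2x1), b (1x2), B (1x1).
A, a = M[:2, :2], M[:2, 2:]
b, B = M[2:, :2], M[2:, 2:]

# M2: first two columns from I, third column from M.
M2 = np.eye(3)
M2[:, 2] = M[:, 2]

# Blockwise form of M@M2 derived above: [[A, A a + a B], [b, b a + B^2]].
top = np.hstack([A, A @ a + a @ B])
bottom = np.hstack([b, b @ a + B @ B])
assert np.allclose(np.vstack([top, bottom]), M @ M2)
```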

N.B. The problem as stated is probably wrong: the last (second) column of the partitioned form of $M_1$ is the last column of $MM^T$ (which is $I_3$), whereas the last (second) column of $MM_2$ is the last column of $M^2$, so a valid counterexample is any non-symmetric orthogonal matrix $M$.
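To illustrate the N.B., here is a minimal sketch (assuming NumPy; the particular matrix is a hypothetical choice) using a non-symmetric orthogonal $M$, namely a $90^\circ$ rotation about the $x$-axis:

```python
import numpy as np

# A non-symmetric orthogonal matrix: 90-degree rotation about the x-axis.
M = np.array([[1., 0., 0.],
              [0., 0., -1.],
              [0., 1., 0.]])
assert np.allclose(M @ M.T, np.eye(3))  # M is orthogonal

M1 = M.copy()
M1[:, 2] = [0., 0., 1.]   # last column taken from I
M2 = np.eye(3)
M2[:, 2] = M[:, 2]        # last column taken from M

# The last column of M@M2 is M applied to M's last column, i.e. the
# last column of M^2, which here differs from e_3.
assert np.allclose((M @ M2)[:, 2], (M @ M)[:, 2])
assert not np.allclose(M @ M2, M1)  # so M M_2 != M_1 in general
```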

Fawkes4494d3

I will construct a counterexample.
Let \begin{equation} M= \left[ { \begin{array}{ccc} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0\\ \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{3}}\\ \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{6}} & \frac{2}{\sqrt{6}}\\ \end{array} } \right] \end{equation} Then $M$ is a (special) orthogonal matrix (its rows form an orthonormal basis; you can also check with a calculator).
Now \begin{equation} M_1= \left[ { \begin{array}{ccc} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0\\ \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & 0\\ \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{6}} & 1\\ \end{array} } \right], M_2= \left[ { \begin{array}{ccc} 1 & 0 & 0\\ 0 & 1 & -\frac{1}{\sqrt{3}}\\ 0 & 0 & \frac{2}{\sqrt{6}}\\ \end{array} } \right],\\ MM_2= \left[ { \begin{array}{ccc} \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & \frac{1}{\sqrt{6}}\\ \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & -\frac{1}{3}-\frac{2}{3\sqrt{2}}\\ \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{6}} & -\frac{1}{3\sqrt{2}}+\frac{2}{3}\\ \end{array} } \right]\neq M_1 \end{equation} So your proposition is not generally true. If you instead want to ask something else, consider asking a new question.
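The counterexample can be verified numerically; a minimal sketch assuming NumPy:

```python
import numpy as np

# The explicit M from the counterexample above.
s2, s3, s6 = np.sqrt(2), np.sqrt(3), np.sqrt(6)
M = np.array([[1/s2, -1/s2, 0],
              [1/s3, 1/s3, -1/s3],
              [1/s6, 1/s6, 2/s6]])
assert np.allclose(M @ M.T, np.eye(3))  # M is orthogonal

M1 = M.copy()
M1[:, 2] = [0., 0., 1.]   # last column taken from I
M2 = np.eye(3)
M2[:, 2] = M[:, 2]        # last column taken from M

assert not np.allclose(M @ M2, M1)  # M M_2 != M_1
```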

Tony Ma