
Suppose I have a matrix, and for some $n\in \mathbb{N}$, I insert a row of zeros between the $n$th and $(n+1)$th rows, and a column of zeros between the $n$th and $(n+1)$th columns. For example, for $n=1$ and the matrix: $$A=\begin{bmatrix} 5&7&9 \\ 7&1&1 \\ 9&1&3 \\ \end{bmatrix}$$ I would obtain: $$B=\begin{bmatrix} 5&0&7&9 \\ 0&0&0&0 \\ 7&0&1&1 \\ 9&0&1&3 \\ \end{bmatrix}$$ In general, what effect does this have on the eigenvalues? If we subtract $\lambda I$ and cofactor expand along the zero row or zero column of a matrix transformed in this way, it is clear that zero must be an eigenvalue. From this, and from playing around with some matrices, I believe the following:

If $A$ is a singular matrix, then the eigenvalues of $B$ are the same as the eigenvalues of $A$. If $A$ is an invertible matrix, then the eigenvalues of $B$ are the eigenvalues of $A$ as well as 0.

Is this claim true, and how would I prove it if so?
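
For reference, the kind of numerical experiment I ran looks like this; a minimal sketch using NumPy, with the example matrix and $n=1$ from above:

```python
import numpy as np

A = np.array([[5., 7., 9.],
              [7., 1., 1.],
              [9., 1., 3.]])

# Insert a zero row and a zero column between the 1st and 2nd rows/columns.
n = 1
B = np.insert(np.insert(A, n, 0.0, axis=0), n, 0.0, axis=1)

print(np.sort(np.linalg.eigvals(A)))  # eigenvalues of A
print(np.sort(np.linalg.eigvals(B)))  # the same values, plus an extra 0
```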

  • Hint: with a permutation you can always suppose the row and column you add are the first – Exodd Jan 15 '18 at 10:50
  • If you desire the eigenvalues of $B$, calculate the characteristic polynomial $\det(B-\lambda I)$ by cofactor expansion on one of the zero rows or columns. The only term that will survive is $\lambda\det(A-\lambda I)$. From what I can see, it's clear that your statement holds in both cases. – superckl Jan 15 '18 at 10:58

3 Answers


You can transform the new matrix, by permuting its rows and columns, into a matrix where the zero row and column are, for example, the first row and column. You then have a block diagonal matrix in which one block is the old matrix and the other block is zero. Since permuting rows and columns in this way is a similarity transformation, the eigenvalues are unchanged.
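
A minimal numerical sketch of this argument (assuming NumPy; $P$ below is the transposition that swaps the first two rows/columns, as in the comment below):

```python
import numpy as np

B = np.array([[5., 0., 7., 9.],
              [0., 0., 0., 0.],
              [7., 0., 1., 1.],
              [9., 0., 1., 3.]])

# Permutation matrix swapping the first two rows/columns.
P = np.eye(4)[[1, 0, 2, 3]]

C = P @ B @ P.T  # similarity: C has the same eigenvalues as B
print(C)  # block diagonal: a 1x1 zero block, then the old matrix A
print(np.sort(np.linalg.eigvals(B)))
print(np.sort(np.linalg.eigvals(C)))
```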

Widawensen
  • For determinant of block diagonal matrix see https://proofwiki.org/wiki/Determinant_of_Block_Diagonal_Matrix – Widawensen Jan 15 '18 at 11:10
  • For the case see also permutations in action:
    $$ \begin{bmatrix} 0&1&0&0 \\ 1&0&0&0 \\ 0&0&1&0 \\ 0&0&0&1 \\ \end{bmatrix}\begin{bmatrix} 5&0&7&9 \\ 0&0&0&0 \\ 7&0&1&1 \\ 9&0&1&3 \\ \end{bmatrix}\begin{bmatrix} 0&1&0&0 \\ 1&0&0&0 \\ 0&0&1&0 \\ 0&0&0&1 \\ \end{bmatrix} =\begin{bmatrix} 0&0&0&0 \\ 0&5&7&9 \\ 0&7&1&1 \\ 0&9&1&3 \\ \end{bmatrix}$$
    – Widawensen Jan 15 '18 at 13:31

So far as I can tell, the comment by superckl is exactly what I need (thanks by the way!). Subtracting $\lambda I$ and cofactor expanding along the zero row or column yields: $$ \pm\lambda\det(A-\lambda I)=\det(B-\lambda I)=0 $$ Clearly, $\lambda=0$ must be a solution. That's as far as I got when I posted the question. But as superckl pointed out, if we suppose now that $\lambda\neq0$ to get the other eigenvalues, we see that: $$\det(A-\lambda I)=0$$ and so since $\lambda$ was an arbitrary nonzero eigenvalue of $B$, the remaining eigenvalues are the same.
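
A symbolic check of this factorization (a sketch using SymPy; `charpoly` computes $\det(\lambda I - M)$, so the sign works out to exactly $\lambda$ here):

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[5, 7, 9],
               [7, 1, 1],
               [9, 1, 3]])
# Insert the zero row/column between the 1st and 2nd rows/columns.
B = A.row_insert(1, sp.zeros(1, 3)).col_insert(1, sp.zeros(4, 1))

pA = A.charpoly(lam).as_expr()
pB = B.charpoly(lam).as_expr()
print(sp.expand(pB - lam * pA))  # 0, i.e. pB == lam * pA
```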


Expanding on user502382's answer, which is correct:

$$A = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix}$$

$$B = \begin{pmatrix} 0 & 0 & 0 & 0 \\ 0 & a & b & c \\ 0 & d & e & f \\ 0 & g & h & i \end{pmatrix}$$

Setting the characteristic polynomial of $A$ to zero: $$\det(A - \lambda I) = \begin{vmatrix} a - \lambda & b & c\\ d & e - \lambda & f\\ g & h & i - \lambda \end{vmatrix} = 0$$

Similarly for $B$: $$\det(B - \lambda I) = \begin{vmatrix} -\lambda & 0 & 0 & 0 \\ 0 & a - \lambda & b & c \\ 0 & d & e - \lambda & f \\ 0 & g & h & i - \lambda \end{vmatrix} = 0$$

Cofactor expansion along the first row gives:

$$\det(B - \lambda I) = -\lambda \begin{vmatrix} a - \lambda & b & c\\ d & e - \lambda & f\\ g & h & i - \lambda \end{vmatrix} = -\lambda \det(A - \lambda I) = 0$$

(The sign in front of $\lambda$ doesn't matter in this equation, since it doesn't change the roots.)

That means all the eigenvalues of $A$ are still eigenvalues of $B$, and if $\lambda = 0$ was not already an eigenvalue of $A$, it will now be an eigenvalue of $B$ with multiplicity $1$ (if it was, its multiplicity increases by one).

So the zero row and column can be added anywhere in this manner, and we will have the same result.
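
As a quick numerical confirmation of this (a NumPy sketch; the insertion index `k` is 0-based):

```python
import numpy as np

A = np.array([[5., 7., 9.],
              [7., 1., 1.],
              [9., 1., 3.]])

for k in range(4):  # try every possible insertion position
    B = np.insert(np.insert(A, k, 0.0, axis=0), k, 0.0, axis=1)
    print(k, np.round(np.sort(np.linalg.eigvals(B)), 6))
# each position yields the eigenvalues of A together with an extra 0
```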

dikshank