Suppose I have a matrix, and for some $n\in \mathbb{N}$, I insert a row of zeros between the $n$th and $(n+1)$th rows, and a column of zeros between the $n$th and $(n+1)$th columns. For example, for $n=1$ and the matrix: $$A=\begin{bmatrix} 5&7&9 \\ 7&1&1 \\ 9&1&3 \\ \end{bmatrix}$$ I would obtain: $$B=\begin{bmatrix} 5&0&7&9 \\ 0&0&0&0 \\ 7&0&1&1 \\ 9&0&1&3 \\ \end{bmatrix}$$ In general, what effect does this have on the eigenvalues? If we subtract $\lambda I$, the inserted row of $B-\lambda I$ has a single nonzero entry, $-\lambda$, on the diagonal; cofactor expansion along that row (or the corresponding column) shows that $\lambda=0$ is a root of the characteristic polynomial, so zero must be an eigenvalue of $B$. From this, and from playing around with some matrices, I believe the following:
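To make that step explicit in my $n=1$ example: the inserted second row of $B-\lambda I$ is $\begin{bmatrix} 0 & -\lambda & 0 & 0 \end{bmatrix}$, and expanding along it leaves exactly the minor obtained by deleting that row and column, which is $A-\lambda I$. So I get $$\det(B-\lambda I) \;=\; -\lambda\,\det\begin{bmatrix} 5-\lambda&7&9 \\ 7&1-\lambda&1 \\ 9&1&3-\lambda \\ \end{bmatrix} \;=\; -\lambda\,\det(A-\lambda I).$$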
If $A$ is a singular matrix, then the eigenvalues of $B$ are the same as the eigenvalues of $A$ (as a set; presumably $0$ picks up an extra multiplicity). If $A$ is an invertible matrix, then the eigenvalues of $B$ are the eigenvalues of $A$ together with $0$.
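This is the kind of experiment I ran while "playing around" (a minimal NumPy sketch; `insert_zero_cross` is just my own name for the construction, not a library function):

```python
import numpy as np

def insert_zero_cross(A, n):
    """Insert a zero row and a zero column between the n-th and (n+1)-th
    rows/columns of A (n is 1-indexed, matching the notation above)."""
    B = np.insert(A, n, 0, axis=0)  # zero row between rows n and n+1
    B = np.insert(B, n, 0, axis=1)  # zero column between columns n and n+1
    return B

A = np.array([[5., 7., 9.],
              [7., 1., 1.],
              [9., 1., 3.]])
B = insert_zero_cross(A, 1)

print(np.sort(np.linalg.eigvals(A)))  # eigenvalues of A
print(np.sort(np.linalg.eigvals(B)))  # appear to be the same values plus 0
```

On this example (and on the other matrices I tried), the spectrum of $B$ always came out as the spectrum of $A$ with one extra $0$.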
Is this claim true, and how would I prove it if so?