
If

$$ A = \begin{bmatrix} B & 0 \\ C & D \end{bmatrix} \in \mathsf{M}_n, $$

where $ B \in \mathsf{M}_k $ and $ D \in \mathsf{M}_{n-k}$, prove that $ p_A = p_B p_D $. *Hint*: proceed by induction on $n$ and expand the determinant across the first row.

I have no idea what to do. All I know is that $p_A(t) = \det(tI_n - A)$, $p_B(t) = \det(tI_k - B)$, and $p_D(t) = \det(tI_{n-k} - D)$.

I also feel like you can prove this without induction by saying that $\det A = BD$,

but I also feel like that is totally incorrect.

What should I do? How do I prove this?

If you have a better title, feel free to change it.

How would induction even play into this?

Davide Giraudo
LKRC
    You surely mean $\det A=\det B\det D$? Yes, that basically does it! – Angina Seng Aug 16 '17 at 04:36
  • No, I do not. How would I get $\det A = \det B \det D$? I was under the assumption that it worked as $\det A = BD - 0 \cdot C$. Where do the $\det B$ and $\det D$ come from? – LKRC Aug 16 '17 at 04:37
  • $\det A=BD$ is nonsense: the left side is a scalar, the right side tries to be the product of two matrices, but their product isn't even defined, because $B$ has $k$ columns but $D$ has only $n-k$. – symplectomorphic Aug 16 '17 at 04:38
  • That's why I felt like I was totally incorrect, but I don't understand how you get $\det A = \det B \det D$. – LKRC Aug 16 '17 at 04:39

2 Answers


This is really not about characteristic polynomials at all, just a fundamental property of determinants (over any commutative ring $R$; here we take $R$ to be the ring of polynomials in $t$ over your field), namely $$ \det\pmatrix{B&0\\C&D}=\det(B)\det(D). $$ You can apply this immediately to the characteristic polynomial, since transforming $A$ into $tI_n-A$ amounts to transforming $B$ into $tI_k-B$ and $D$ into $tI_{n-k}-D$ (also $C$ becomes $-C$).

That property of determinants is the subject of this other question, and in my opinion the best proof is directly from the (Leibniz formula) definition of determinants, as I detailed in my answer to that question. In particular, I would want to avoid using a property like $\det(XY)=\det(X)\det(Y)$, which, although of course true, is actually quite a bit harder to prove directly from the definition.
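As a quick sanity check (not part of the original answer), the block-triangular determinant identity above can be verified numerically with NumPy on random matrices; the block sizes $k = 2$, $n = 5$ are arbitrary choices:

```python
# Hedged sketch: numerically verify det([[B, 0], [C, D]]) = det(B) * det(D)
# for random real matrices. The sizes n and k are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2
B = rng.standard_normal((k, k))
C = rng.standard_normal((n - k, k))
D = rng.standard_normal((n - k, n - k))

# Assemble the block lower-triangular matrix A = [[B, 0], [C, D]].
A = np.block([[B, np.zeros((k, n - k))], [C, D]])

# The determinant of A factors as det(B) * det(D).
assert np.isclose(np.linalg.det(A), np.linalg.det(B) * np.linalg.det(D))
```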


The easiest trick to implement (but not necessarily to think of) is to write $tI - A$ as the product $$ tI - A = \pmatrix{tI - B & 0\\0&I}\pmatrix{I & 0\\-C&I} \pmatrix{I & 0\\0&tI - D}, $$ so it suffices to show that the matrices in this product have determinants $\det(tI - B)$, $1$, and $\det(tI - D)$ (in that order). We could prove those formulas using induction, if you like. In particular, the formulas for $$ \det \pmatrix{I & 0\\0&tI - D}, \qquad \det\pmatrix{I & 0\\-C&I} $$ are very nicely proven using the hint as it's given.
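As a sanity check (not part of the original answer), the factorization can be confirmed symbolically with SymPy on a small concrete example; the specific entries of $B$, $C$, $D$ below are arbitrary choices with $k = 2$, $n = 3$:

```python
# Hedged sketch: check the three-factor decomposition of tI - A on a
# concrete 3x3 example (k = 2, n = 3). The entries are arbitrary choices.
import sympy as sp

t = sp.symbols('t')
B = sp.Matrix([[1, 2], [3, 4]])   # k x k block
C = sp.Matrix([[5, 6]])           # (n-k) x k block
D = sp.Matrix([[7]])              # (n-k) x (n-k) block

hstack, vstack = sp.Matrix.hstack, sp.Matrix.vstack
A = vstack(hstack(B, sp.zeros(2, 1)), hstack(C, D))

I2, I1 = sp.eye(2), sp.eye(1)
F1 = vstack(hstack(t*I2 - B, sp.zeros(2, 1)), hstack(sp.zeros(1, 2), I1))
F2 = vstack(hstack(I2, sp.zeros(2, 1)), hstack(-C, I1))
F3 = vstack(hstack(I2, sp.zeros(2, 1)), hstack(sp.zeros(1, 2), t*I1 - D))

# The product of the three factors really is tI - A ...
assert sp.expand(F1 * F2 * F3 - (t*sp.eye(3) - A)) == sp.zeros(3, 3)
# ... so p_A = det(tI - B) * 1 * det(tI - D) = p_B * p_D.
assert sp.expand((t*sp.eye(3) - A).det() - F1.det() * F3.det()) == 0
```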

Ben Grossmann
  • This is like magic. You have skipped too many steps for my brain to understand; could you please explain what you did again? I beseech you, thanks. – LKRC Aug 16 '17 at 05:20
  • Do you know anything about "block-matrix multiplication"? Can you verify/understand that the first equation holds? – Ben Grossmann Aug 16 '17 at 05:23
  • I think $C$ should be $-C$. – Gribouillis Aug 16 '17 at 05:27
  • All I understand is how the matrices fit into each other, like how $B$ is $k \times k$ and $D$ is $(n-k) \times (n-k)$, but I don't really understand how any operators work on a block matrix. – LKRC Aug 16 '17 at 05:28
  • @Gribouillis good catch. – Ben Grossmann Aug 16 '17 at 05:32
    @LKRC Ah. A key concept (commonly referred to but rarely taught) is that whenever the relevant multiplications make sense (i.e. when the block matrices are conformally partitioned), we can multiply block matrices using the "usual matrix multiplication". In other words, $$ \pmatrix{A&B\\C&D} \pmatrix{W&X\\Y&Z} = \pmatrix{AW + BY & AX + BZ\\ CW + DY & CX + DZ}. $$ Note that the order matters for matrix multiplication. – Ben Grossmann Aug 16 '17 at 05:35
  • @LKRC You have made the correct observation that this fact is magical. – Ben Grossmann Aug 16 '17 at 05:36
  • You don't need to decompose into three factors, since two will do, as in $$\pmatrix{tI-B&0\\-C&tI-D} = \pmatrix{tI-B&0\\0&I} \pmatrix{I&0\\-C&tI-D}. $$ The point is that the latter two matrices have determinants $\det(tI-B)$ and $\det(tI-D)$ respectively, which is proved exactly as in your proof (by repeated Laplace expansion w.r.t. appropriate rows). But even then, I would prefer to avoid the less elementary multiplicativity of determinants in such a simple question. – Marc van Leeuwen Aug 16 '17 at 10:53
  • @Marc I think of the product formula as very essential to why we care about determinants in the first place, so I tend to have the opposite view. – Ben Grossmann Aug 16 '17 at 11:53
  • I agree the product formula is extremely important when considering determinants, and it is also very easy to recall. But while it has several nice proofs, some deep and insightful, I know none that quite match the simplicity of the statement (the Cayley–Hamilton theorem is another example of this kind). It is a matter of taste whether one takes complexity of proof into account at all when using (or not using) a result, but when a proof avoiding it is immediately obvious, as it is here, I personally prefer to do so. – Marc van Leeuwen Aug 17 '17 at 07:04