
I read here that

$$\begin{aligned} \frac{d}{dt}\begin{vmatrix} a_{11}(t) & a_{12}(t) & a_{13}(t) \\ a_{21}(t) & a_{22}(t) & a_{23}(t) \\ a_{31}(t) & a_{32}(t) & a_{33}(t) \end{vmatrix} &= \begin{vmatrix} a'_{11}(t) & a'_{12}(t) & a'_{13}(t) \\ a_{21}(t) & a_{22}(t) & a_{23}(t) \\ a_{31}(t) & a_{32}(t) & a_{33}(t) \end{vmatrix} \\ \\ &+ \begin{vmatrix} a_{11}(t) & a_{12}(t) & a_{13}(t) \\ a'_{21}(t) & a'_{22}(t) & a'_{23}(t) \\ a_{31}(t) & a_{32}(t) & a_{33}(t) \end{vmatrix} \\ \\&+ \begin{vmatrix} a_{11}(t) & a_{12}(t) & a_{13}(t) \\ a_{21}(t) & a_{22}(t) & a_{23}(t) \\ a'_{31}(t) & a'_{32}(t) & a'_{33}(t) \end{vmatrix}. \end{aligned}$$

That is, to find the derivative, we go row by row and differentiate each element in that row. Is this true for higher-order square matrices as well? If so, what is the proof?
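The row-wise rule above can be checked numerically. The sketch below (not from the original post; the entry functions and step size are arbitrary choices) compares a central finite difference of the $3\times 3$ determinant against the sum of the three determinants with one row differentiated at a time:

```python
def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    return (m[0][0] * (m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1] * (m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2] * (m[1][0]*m[2][1] - m[1][1]*m[2][0]))

# Example entries a_ij(t) and their exact derivatives a'_ij(t).
fs = [[lambda t: t**2, lambda t: t,      lambda t: 1.0],
      [lambda t: 2*t,  lambda t: t**3,   lambda t: t],
      [lambda t: 1.0,  lambda t: 3*t,    lambda t: t**2]]
dfs = [[lambda t: 2*t,  lambda t: 1.0,    lambda t: 0.0],
       [lambda t: 2.0,  lambda t: 3*t**2, lambda t: 1.0],
       [lambda t: 0.0,  lambda t: 3.0,    lambda t: 2*t]]

def A(t):
    return [[f(t) for f in row] for row in fs]

def row_wise_derivative(t):
    # Sum of three determinants, differentiating one row at a time.
    total = 0.0
    for k in range(3):
        m = A(t)
        m[k] = [df(t) for df in dfs[k]]
        total += det3(m)
    return total

t, h = 1.5, 1e-6
numeric = (det3(A(t + h)) - det3(A(t - h))) / (2 * h)
print(abs(numeric - row_wise_derivative(t)) < 1e-4)
```

The two quantities agree to within the finite-difference error, consistent with the identity in the question.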

MangoPizza
  • The product rule works for multilinear maps (not merely for actual multiplication). And a determinant is multilinear in its rows. (Also in its columns, so you could do it that way.) The proof in this general form uses the chain rule for partial derivatives. – GEdgar Aug 14 '22 at 12:15
  • The general case is known as Jacobi's formula. – Zhanxiong Aug 14 '22 at 12:43

1 Answer


That's just the usual product rule. In two dimensions, $$ \begin{align} {d\over dt} \left(a(t)d(t)-b(t)c(t)\right) &=a'd-b'c+ad'-bc'\\ &= \begin{vmatrix} a' & b'\\ c & d \\ \end{vmatrix} + \begin{vmatrix} a & b\\ c' & d' \\ \end{vmatrix} \\ \end{align} $$
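For general $n$, one route (not spelled out in the original answer) is to apply the product rule term by term in the Leibniz expansion of the determinant:

$$
\frac{d}{dt}\det A(t)
= \frac{d}{dt}\sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} a_{i\,\sigma(i)}(t)
= \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \sum_{k=1}^{n} a'_{k\,\sigma(k)}(t) \prod_{i \neq k} a_{i\,\sigma(i)}(t).
$$

Swapping the two sums, the inner sum over $\sigma$ for fixed $k$ is exactly the determinant of $A(t)$ with row $k$ replaced by its derivative, which gives the row-wise formula in the question.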

Suzu Hirose
  • Indeed, but my question is: can this row-wise differentiation be extended to higher-order square matrices? – MangoPizza Aug 14 '22 at 17:22
  • @MangoPizza The way to extend it from two or three dimensions to more dimensions seems fairly straightforward to me. What problems do you envisage? – Suzu Hirose Aug 14 '22 at 22:16