It is said that we can solve the differential equation $\ddot x + x = 0$ by writing it as $(d/dt + i) (d/dt - i)x=0$. Why can we do this? Certainly we cannot simply say $\frac{d^2}{dt^2} = \left(\frac{d}{dt}\right)^2$ without justification, since linear operators such as $d/dt$ are not the same as numbers. For example, composition of operators is generally not commutative (though in this special case it is), and perhaps not even associative.
$\\$
Perhaps it is because (1) $\frac{d^2}{dt^2} = \frac{d}{dt} \frac{d}{dt}$ by the definition of second-order differentiation, and (2) operators such as $d/dt$, though not commutative in general, obey the distributive law, similar to matrices?
$\\$
Anyway, despite the confusion about which laws linear operators obey, if I expand the expression $(d/dt + i) (d/dt - i)x$ from right to left, it makes sense:
$(d/dt + i) (d/dt - i)x = (d/dt + i) (dx/dt - ix)= d/dt (dx/dt - ix) + i (dx/dt - ix) \\ = \ddot x - i\dot x + i\dot x - i^2 x = \ddot x + x$
And so we can solve the ODE from left to right:
From $(d/dt + i)f(t) = 0$ we get $f(t) = Ce^{-it}$, and then from the first-order linear equation $(d/dt - i)x(t) = f(t)$ we get $x(t)$.
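As a sanity check, this two-step solve can be carried out numerically (a Python sketch; the initial conditions, step size, and the RK4 integrator are my own choices, not part of the question). With $x(0)=1$, $\dot x(0)=0$, the exact solution is $\cos t$, and $f(0) = \dot x(0) - i\,x(0) = -i$:

```python
import cmath
import math

# Solve x'' + x = 0 with x(0) = 1, x'(0) = 0 (exact solution: cos t)
# via the factorization (d/dt + i)(d/dt - i)x = 0.

# Step 1: let f = (d/dt - i)x.  Then (d/dt + i)f = 0 is first order,
# with solution f(t) = f(0) * exp(-i t), where f(0) = x'(0) - i*x(0) = -i.
f0 = -1j

def f(t):
    return f0 * cmath.exp(-1j * t)

# Step 2: solve (d/dt - i)x = f, i.e. x' = i*x + f(t), numerically
# with classical RK4, starting from x(0) = 1.
def rhs(t, x):
    return 1j * x + f(t)

x, t, dt = 1 + 0j, 0.0, 1e-3
while t < 2 * math.pi:
    k1 = rhs(t, x)
    k2 = rhs(t + dt / 2, x + dt / 2 * k1)
    k3 = rhs(t + dt / 2, x + dt / 2 * k2)
    k4 = rhs(t + dt, x + dt * k3)
    x += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += dt

# x(t) should agree with cos(t) to high accuracy
print(abs(x - math.cos(t)))
```

The two first-order solves reproduce $\cos t$, which supports the claim that the factorization really does solve the second-order equation.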
$\\$
More generally speaking,
- are there rules that the addition and multiplication of (partial) differentiation operators ($d/dt$, $\partial/\partial x$, etc.) and numbers obey?
- it seems that (partial) differentiation operators and matrices, both of which can be regarded as linear operators, share similarities. But they still seem different; e.g., $d/dt$ and $\partial/\partial x$ may not obey the associative law, while matrices do. Is there an article or book discussing the similarities and dissimilarities between the two?
- is there a chapter or essay that discusses in general how to solve a higher-order ODE by factoring the operator?
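On the first bullet: when the coefficients are constants, operators built from $D = d/dt$ do obey the familiar rules, because they form a commutative ring isomorphic to polynomials in $D$: addition is coefficient-wise, and composition is polynomial multiplication. A minimal sketch illustrating this (representing an operator $a_0 + a_1 D + a_2 D^2 + \dots$ by its coefficient list is my own choice):

```python
# Constant-coefficient differential operators behave like polynomials
# in D = d/dt: composing two operators multiplies the corresponding
# polynomials (a convolution of coefficient lists).

def compose(p, q):
    """Compose two constant-coefficient operators (polynomial product)."""
    out = [0j] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

D_plus_i = [1j, 1]    # the operator  i + D
D_minus_i = [-1j, 1]  # the operator -i + D

print(compose(D_plus_i, D_minus_i))   # [(1+0j), 0j, (1+0j)], i.e. 1 + D^2
print(compose(D_minus_i, D_plus_i))   # same result: these operators commute
```

This is exactly why the factorization in the question is legitimate: $(D + i)(D - i) = D^2 + 1$ holds as a polynomial identity. With non-constant coefficients the ring is no longer commutative (e.g. $D \circ t \ne t \circ D$), and factoring becomes much harder.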