I am curious about the transformation $f\mapsto (\log f)'=f'/f$, the logarithmic derivative, which seems to appear in a lot of different ideas, for example:
- Its use with generating functions: a recursion of the form $a_{n+1}=\sum _{k=0}^n\binom{n}{k}a_kb_{n-k}$ gives the functional equation $A'=AB$, where $A,B$ are the exponential generating functions of $(a_n),(b_n)$, implying that $B=(\log A)'.$
- Cumulants are defined as the coefficients of the logarithm of the moment generating function, so the cumulant generating function is $K(t)=\log\mathbb{E}[e^{tX}],$ and $K'(t)=\mu +\sigma ^2t+O(t^2),$ giving, for example, the mean when evaluated at $t=0.$
- The method of singular part transformation, where one uses $u = (\log y)'$ as a change of dependent variable (present, for example, in the work of Painlevé that gives rise to the Painlevé equations).
- The digamma function is defined as $\psi(z)=(\log \Gamma(z))'$ and, for example, evaluated at $z=1$ gives minus the Euler–Mascheroni constant: $\psi(1)=-\gamma .$
- The relation between the Weierstrass zeta and sigma functions in number theory is given by this transformation, i.e., $\zeta (z,\Lambda) =(\log \sigma(z,\Lambda))'.$
- In multiplicative calculus, the derivative is defined as $$f^*(x)=\lim _{h\rightarrow 0}\left (\frac{f(x+h)}{f(x)}\right )^{1/h};$$ it turns out that if $f$ is positive and differentiable at $x$ one has the relation $$\ln (f^*(x))=\left (\ln f\right )'(x).$$ reference
- The Maurer–Cartan form on a matrix Lie group looks like $\omega _g=g^{-1}\,dg,$ as established here.
- In the proof of a condition for the $\mu$-equidistribution of a sequence in the space of conjugacy classes of a compact group $G$, with $\mu$ the Haar measure, via some properties of the $L$-functions $L(s,\rho).$ Discussed, for example, here.
- To pass from Witt components to ghost components in the theory of Witt vectors, i.e., $$-t\left (\log \left (\prod _{n=1}^{\infty}(1-x_nt^n)\right )\right )'=\sum _{n=1}^{\infty}w_n(x)t^n,$$ where $w_n(x)=\sum_{d\mid n}d\,x_d^{n/d}$ are the ghost components. This is done, for example, here.
- In the proof of the Gauss–Lucas theorem, which says that the roots of $p'$ lie in the convex hull of the roots of $p,$ where $p$ is a polynomial. The analysis is done through $p'/p.$ See, e.g., here.
- The argument principle relates the contour integral of the logarithmic derivative of a meromorphic function to the difference between the number of zeros and poles inside the contour.
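The root-to-pole behavior behind the last two items is easy to check symbolically. Here is a small sanity check (a sketch using sympy; the polynomial is my own toy example): the logarithmic derivative of a polynomial decomposes into simple poles at the roots, each with residue equal to the root's multiplicity.

```python
import sympy as sp

z = sp.symbols('z')

# Toy polynomial: a double root at 1 and a simple root at -2.
p = (z - 1)**2 * (z + 2)
logder = sp.together(sp.diff(p, z) / p)  # the logarithmic derivative p'/p

# Partial fractions exhibit the simple poles explicitly.
print(sp.apart(logder))  # a sum of 2/(z - 1) and 1/(z + 2)

# The residue at each root recovers its multiplicity.
print(sp.residue(logder, z, 1))   # 2
print(sp.residue(logder, z, -2))  # 1
```

Integrating $p'/p$ around a contour enclosing all three roots would therefore pick up $2\pi i\cdot(2+1)$, i.e., the argument principle counts zeros with multiplicity.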
I roughly understand that this transformation eliminates simple singularities and turns roots into poles and vice versa, but I would like to know about other parts of math where it is useful, or whether there is a more general idea explaining why this transformation is so powerful.
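For concreteness, the cumulant example above can also be verified symbolically. This is a sketch assuming $X\sim N(\mu,\sigma^2)$, whose moment generating function has the standard closed form $e^{\mu t+\sigma^2t^2/2}$; the variable names are mine.

```python
import sympy as sp

t, mu, sigma = sp.symbols('t mu sigma')

# Cumulant generating function K(t) = log E[e^{tX}] for X ~ N(mu, sigma^2),
# using the standard closed-form MGF exp(mu*t + sigma^2*t^2/2).
K = sp.log(sp.exp(mu * t + sigma**2 * t**2 / 2))
Kprime = sp.simplify(sp.diff(K, t))

print(Kprime.subs(t, 0))  # mu: the first cumulant is the mean
print(sp.diff(K, t, 2))   # sigma**2: the second cumulant is the variance
```

Here the logarithm turns the multiplicative structure of the MGF (independent summands multiply their MGFs) into the additive structure of cumulants, which is the same product-to-sum mechanism at work in the Witt vector and Weierstrass sigma examples.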