
If all I'm interested in is the moments of my random variable $X$, then given its characteristic function $\varphi _{X}(t)$ we have $$ \operatorname {E} \left[X^{n}\right]=i^{-n}\left[{\frac {d^{n}}{dt^{n}}}\varphi _{X}(t)\right]_{t=0}, $$ so it seems as if all I care about is $\varphi _{X}(t)$ in an arbitrarily small neighbourhood of $t=0$. Surely, away from this neighbourhood, I can let $\varphi _{X}(t)$ be whatever I want and I will still get all my moments right. Hence the question: what's the point of the characteristic function away from zero?
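As a concrete sanity check of that formula (my own sketch, not part of the original question): for $X \sim N(0,1)$ the characteristic function is $\varphi_X(t) = e^{-t^2/2}$, and a central finite difference at $t=0$ should recover $\operatorname{E}[X^2] = i^{-2}\varphi''_X(0) = 1$.

```python
# Numerical sketch: for X ~ N(0, 1), phi(t) = exp(-t^2 / 2), and the
# n-th moment is i^{-n} phi^{(n)}(0).  We check n = 2 with a central
# difference approximation of the second derivative at t = 0.
import math

def phi(t):
    """Characteristic function of a standard normal."""
    return math.exp(-t * t / 2)

h = 1e-4
# second derivative of phi at 0 via central differences
phi2 = (phi(h) - 2 * phi(0) + phi(-h)) / h ** 2
moment2 = -phi2          # i^{-2} = -1
print(moment2)           # ~ 1.0, i.e. E[X^2] for a standard normal
```

Note that this only ever evaluates $\varphi_X$ inside $[-h, h]$, which is exactly the point of the question: the moments see an arbitrarily small neighbourhood of $0$.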

To illustrate my point, say I let $$ \varphi _{X}(t) = \begin{cases} f(t)& t < -\epsilon\\ g(t)& -\epsilon \le t \le \epsilon\\ h(t)& t > \epsilon \end{cases} $$ such that $f(t)$ and $g(t)$ are smoothly connected at $t=-\epsilon$ and $g(t)$ and $h(t)$ are smoothly connected at $t=\epsilon$ (which you can apparently do). Then $$ \operatorname {E} \left[X^{n}\right]=i^{-n}\left[{\frac {d^{n}}{dt^{n}}}\varphi _{X}(t)\right]_{t=0} = \left[\begin{cases} i^{-n}{\frac {d^{n}}{dt^{n}}}f(t)& t < -\epsilon\\ i^{-n}{\frac {d^{n}}{dt^{n}}}g(t)& -\epsilon \le t \le \epsilon\\ i^{-n}{\frac {d^{n}}{dt^{n}}}h(t)& t > \epsilon \end{cases}\right]_{t=0} = i^{-n}\left[{\frac {d^{n}}{dt^{n}}}g(t)\right]_{t=0}, $$ and I can change $f(t)$ and $h(t)$, changing $\varphi _{X}(t)$, without changing the moments of $X$.

My intuition tells me this is wrong, since for instance any analytic function is fully determined by its Taylor series at $x=0$, which encodes information about the function arbitrarily far from $x=0$.

What am I missing? Can someone shine some light on this?

1 Answer


If all you're interested in are the moments, then yes, you only need it in a neighbourhood of $0$. But the characteristic function has other useful properties. For example, in general a probability distribution is not determined by its moments (even if all moments are finite), but it is determined by its characteristic function (even if the moments are not all finite). This doesn't contradict your statement "an analytic function is fully characterized by its Taylor series at $x=0$", because the characteristic function is not necessarily analytic.
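To make the moment-indeterminacy claim concrete, here is a numerical sketch of the classical example due to Heyde (my own illustration, not part of the answer): perturbing the log-normal density $f(x)$ to $f(x)\,(1 + \epsilon \sin(2\pi \ln x))$ leaves every moment unchanged, because the perturbation integrates to zero against every $x^n$. After substituting $y = \ln x$, the claim is that $\int e^{ny}\,\phi(y)\sin(2\pi y)\,dy = 0$ for all integers $n$, where $\phi$ is the standard normal density; the crude trapezoid scheme below checks this.

```python
# Sketch of Heyde's example: the log-normal density f(x) and the
# perturbed density f(x) * (1 + eps * sin(2*pi*ln x)) share ALL
# moments, so moments alone cannot determine the distribution.
# We check numerically that the perturbation contributes nothing to
# the n-th moment: integral of x^n f(x) sin(2*pi*ln x) dx = 0.
import math

def integrand(y, n):
    # after substituting y = ln x, the perturbation term becomes
    # exp(n*y) * standard-normal-pdf(y) * sin(2*pi*y)
    pdf = math.exp(-y * y / 2) / math.sqrt(2 * math.pi)
    return math.exp(n * y) * pdf * math.sin(2 * math.pi * y)

def perturbation_moment(n, a=-12.0, b=14.0, steps=100_000):
    """Trapezoid rule over [a, b]; the tails beyond are negligible."""
    h = (b - a) / steps
    s = 0.5 * (integrand(a, n) + integrand(b, n))
    for k in range(1, steps):
        s += integrand(a + k * h, n)
    return s * h

for n in range(5):
    print(n, perturbation_moment(n))   # all ~ 0
```

For comparison, the $n$-th log-normal moment itself is $e^{n^2/2}$ (about $2981$ for $n=4$), so these near-zero perturbation integrals really do mean the two distinct densities have identical moments.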

On the other hand, I'm not sure that your example works: how do you know that when you change $f$ and $h$ you still have the characteristic function of a probability distribution?

Robert Israel
  • This is interesting, so is the following true: if the characteristic function is analytic, then you cannot construct my corner case? Also, regarding my example, I'm not sure either, but it doesn't seem that crazy. – FriendlyLagrangian Apr 25 '24 at 22:29
  • 1
    Yes, if the characteristic function is real-analytic, then it is uniquely determined by its Taylor series about $0$, and you can't change it on $t \ge \epsilon$ or $t \le -\epsilon$ without killing the analyticity. – Robert Israel Apr 26 '24 at 03:08
  • 1
    You are essentially asking the question of uniqueness in the Hamburger moment problem. If the moments satisfy a bound $\left|\mathbb E[X^n]\right| < C D^n n!$, which says that the characteristic function is analytic in a neighbourhood of $0$, then there is only one probability distribution with these moments. – Robert Israel Apr 26 '24 at 14:23
  • that's a great comment, thank you! – FriendlyLagrangian Apr 27 '24 at 12:30