
I am learning about moments in the context of probability.

  • For a given probability density function $f$, the theoretical $k$-th moment is given by: $$\int_{\mathbb R} x^k \cdot f(x)\, \mathrm{d}x$$

  • Given some measurements $x_1, x_2, \ldots, x_n$, regardless of any assumption about their probability distribution, the sample $k$-th moment is given by: $$\frac{1}{n} \sum_{i=1}^n x_i^k$$ (a short computational sketch of this follows below).
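To make the definition concrete, here is a minimal Python sketch (not part of the original post; NumPy and the example measurements are assumed purely for illustration) that computes the sample $k$-th moment directly from the formula above:

```python
import numpy as np

def sample_kth_moment(x, k):
    """Sample k-th moment: (1/n) * sum_i x_i^k."""
    x = np.asarray(x, dtype=float)
    return np.mean(x ** k)

# hypothetical measurements, purely for illustration
measurements = [1.2, -0.4, 0.7, 2.1, -1.3]
print(sample_kth_moment(measurements, 2))  # second sample moment
```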

My Question: I have often heard people (informally) say that as the number of measurements increases, the sample moment converges to the theoretical moment, but I have never seen a mathematical proof of this.

Is it possible to prove that

$$ \frac{1}{n} \sum_{i=1}^n x_i^k \longrightarrow \int_{\mathbb R} x^k \cdot f(x)\, \mathrm{d}x \quad \text{as } n \to \infty\,?$$

Thanks!


1 Answer


As hinted by geetha290krm in the comments, if the $X_i$ are iid, then the $X_i^k$ are iid as well.

The weak law of large numbers (LLN) states:

$$\frac{1}{n} \sum_{i=1}^n Y_i \xrightarrow{p} E[Y_1] \quad \text{as } n\to\infty, \qquad \text{where the } Y_i \text{ are iid with } E[|Y_1|]<\infty.$$

Here, we let $Y_i = X_i^k$; assuming the $k$-th moment exists (i.e., $E[|X_i|^k] < \infty$), applying the LLN above gives the convergence of the sample moment to the theoretical moment.
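To illustrate this convergence numerically (a simulation sketch assuming standard normal data, not part of the original answer), recall that the theoretical 4th moment of $N(0,1)$ is $3$; the sample 4th moment gets closer to it as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 4
theoretical = 3.0  # E[X^4] = 3 for X ~ N(0, 1)

for n in (10**2, 10**4, 10**6):
    x = rng.standard_normal(n)   # iid X_i
    m_k = np.mean(x ** k)        # sample k-th moment = mean of Y_i = X_i^k
    print(f"n = {n:>7}: sample 4th moment = {m_k:.4f} (theoretical = {theoretical})")
```

With the seed above, the printed values should hover around 3 and tighten as $n$ increases.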

  • @ Annika: thank you so much for your answer! I was just wondering - why is it so easy to apply LLN here? – stats_noob Aug 02 '23 at 20:59
  • BTW - do you enjoy probability theory? lately I have been thinking about and working on some probability brain teasers - would you be interested in looking at them? – stats_noob Aug 02 '23 at 21:00
  • @stats_noob -- yeah, probability is a favorite -- happy to look –  Aug 02 '23 at 21:02
  • https://math.stackexchange.com/questions/4739218/if-you-go-fishing-everyday-what-is-the-probability-you-know-x-of-the-pond , https://math.stackexchange.com/questions/4731910/how-to-create-a-mathematical-model-of-an-elevator , https://math.stackexchange.com/questions/4735898/what-is-the-probability-of-knowing-someone-in-the-office – stats_noob Aug 02 '23 at 21:02
  • @ Annika: Did you find these brain teasers interesting? Currently, I am looking into simulation based approaches for these problems .... I am not sure if exact analytical solutions exist... – stats_noob Aug 06 '23 at 16:38
  • @stats_noob -- yes, very -- agree that simulation and numerical stuff good place to start to build intuition (thank goodness for computers :) –  Aug 06 '23 at 19:02
  • :) I would be curious to learn more from you! are you a PhD student in math? Btw here is my latest (embarrassing) question https://math.stackexchange.com/questions/4748560/understanding-the-relationship-between-measure-theory-and-probability-theory haha – stats_noob Aug 06 '23 at 19:03
  • 1
    @stats_noob nope, no PhD -- take a look at MathOverflow -- that stuff is intense! Just a data scientist -- I was trained in Operations Research so it did get math-ier than the typical engineering degree FWIW. Frankly, just constantly learning new math and asking/answering questions on here is how to progress and get better.

    Actually, I specifically credit this site (MSE) for the majority of my learning in math beyond college and the development (to the extend it has) of my mathematical maturity. I also really like math books with answers in the back :)

    –  Aug 06 '23 at 20:37
  • so great to hear! these websites (stackoverflow for computers, math) have been very useful for me as well! – stats_noob Aug 06 '23 at 21:12
  • @stats_noob re: "I was just wondering - why is it so easy to apply LLN here?" -- As noted by geetha290krm, since your $X_i$ are iid, you can create a new sequence by applying a function $f$ to each $X_i$; the resulting $f(X_i)$ are still a sequence of iid variates, so we can apply the LLN just like before. In your case, $f(X_i)=X_i^k$. –  Aug 07 '23 at 14:12