I am learning about moments in the context of probability theory.
For a given probability density function $f$, the theoretical $k$-th moment is given by: $$\int_{\mathbb R} x^k \, f(x) \, \mathrm{d}x$$
Given some measurements $x_1, x_2, \ldots, x_n$ - regardless of any assumption about their underlying distribution - the sample $k$-th moment is given by: $$\frac{1}{n} \sum_{i=1}^n x_i^k$$
My Question: I have often heard people say (informally) that as the number of measurements grows, the sample moment converges to the theoretical moment - but I have never seen a mathematical proof of this.
Is it possible to prove that, assuming the $x_i$ are independent draws with density $f$, in some suitable sense (e.g. in probability or almost surely),
$$ \frac{1}{n} \sum_{i=1}^n x_i^k \xrightarrow{n \to \infty} \int_{\mathbb R} x^k \, f(x) \, \mathrm{d}x \;?$$
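For what it's worth, I can see the convergence empirically. Here is a minimal simulation sketch (not a proof): I draw samples from an Exponential(1) distribution, whose theoretical $k$-th moment is $k!$, and watch the sample $k$-th moment approach it as $n$ grows. The choice of distribution and the sample sizes are just for illustration.

```python
import random
import math

random.seed(0)  # reproducibility of this illustration

k = 3
theoretical = math.factorial(k)  # E[X^k] = k! for X ~ Exp(1), so E[X^3] = 6

for n in (100, 10_000, 1_000_000):
    samples = [random.expovariate(1.0) for _ in range(n)]
    sample_moment = sum(x**k for x in samples) / n  # (1/n) * sum of x_i^k
    print(f"n = {n:>9}: sample moment = {sample_moment:.4f} "
          f"(theoretical = {theoretical})")
```

The printed sample moments get closer to $6$ as $n$ increases, which is exactly the behavior I would like to see proved in general.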
Thanks!