
Considering the integral definition of the expectation operator, I'm supposed to prove that if $$ \lim_{N \to \infty} \hat{\theta}_N = \theta^* $$ then $$ E\Big(\lim_{N \to \infty} \hat{\theta}_N\Big) = \lim_{N \to \infty} E(\hat{\theta}_N) = \theta^*. $$ As a hint, the problem says I should use the $\epsilon$-$N_0$ definition of the limit from calculus, which I wrote down, but I have no idea how it relates to the proof.

Right now, I'm stuck at:

$$ \| E(\hat{\theta}_N) - \theta^* \| < \epsilon \quad \forall N > N_0 $$

I've also tried proving that $$ E(\lim_{N \to \infty} \hat{\theta}_N - \lim_{N \to \infty} E(\hat{\theta}_N)) = 0 $$ to no avail.
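For what it's worth, the only chain of steps I can see toward that bound uses linearity of $E$ and Jensen's inequality for the norm, but it seems to require the convergence to hold for every realization, not merely almost surely:

$$ \| E(\hat{\theta}_N) - \theta^* \| = \| E(\hat{\theta}_N - \theta^*) \| \le E\big(\| \hat{\theta}_N - \theta^* \|\big) < \epsilon \quad \forall N > N_0, $$

and I can't tell whether that assumption is justified here.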

And if I try to rewrite the expectation integral using integration by parts, $$ \int \hat{\theta}_N \cdot p(\hat{\theta}_N) \, d\hat{\theta}_N = \hat{\theta}_N \cdot \int p(\hat{\theta}_N) \, d\hat{\theta}_N - \int \left( \int p(\hat{\theta}_N) \, d\hat{\theta}_N \right) d\hat{\theta}_N, $$ then, since the integral is over the entire domain, the result seems to come out to zero, so either the domain is not supposed to be complete or I'm not allowed to use integration by parts.
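As a sanity check that the expectation integral over the entire domain need not be zero, here is a small numerical sketch (my own illustration; the normal density with mean $2$ is an arbitrary stand-in for $p$):

```python
import numpy as np

# Approximate E(x) = ∫ x p(x) dx on a grid, with p a normal density
# of mean 2 (an arbitrary choice, just for illustration).
mu = 2.0
x = np.linspace(mu - 10.0, mu + 10.0, 200_001)
p = np.exp(-(x - mu) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

dx = x[1] - x[0]
print((p * dx).sum())      # ≈ 1: the density integrates to one
print((x * p * dx).sum())  # ≈ 2: the expectation over the whole domain
```

So $\int p = 1$ as expected, but $\int x\,p(x)\,dx$ is the mean, which is generally nonzero.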

Can you please at least give me a usable hint? I'm studying engineering and don't have much in-depth math knowledge.

I should also mention that $\hat{\theta}_N, \theta^* \in \mathbb{R}^{n_\theta}$, so they're vectors. I don't know anything about the probability distribution. They should normally be bounded, since they're supposed to be parameters of a physical system (though the problem itself says nothing about the system), but that's only a supposition of mine, which may be false.

Thank you!

Edit: $E()$ is the expectation operator defined as: $$E(x) = \int x \cdot p(x) dx$$

Edit 2: The full problem sounds like this:

Starting with the original definition of the expectation operator, show that consistency of the system parameter estimate implies asymptotic nondeviation (asymptotic unbiasedness). Namely, prove that the expectation operator $E()$ commutes with the limit operator $\lim_{N \to \infty}$:

$$\lim_{N \to \infty} E(\hat{\theta}_N) = E(\lim_{N \to \infty} \hat{\theta}_N) = \theta^* $$

Hint: It's recommended to use the definition of limit you've learned in calculus.

Edit 3: I've modified the starting hypothesis.

  • It is difficult to suggest anything without knowing what $\hat \theta_N$ is. – John B Dec 28 '17 at 22:38
  • Besides it being a vector of parameters that can take real values and the aforementioned limit of its expectation, there is no relevant information regarding it. As trivia, $\hat{\theta}_N$ is the approximation of the parameter vector of a system when its equations are expressed in linear regression form for a finite dataset of length N. There is no information regarding the system or the method used to obtain it. – Anthonius Daoud-Moraru Dec 28 '17 at 22:46
  • OK, but then since in general limits don't commute with integrals it is impossible to suggest anything rigorous. In particular, your notation $E$ suggests that $\hat \theta_N$ is a function of some variable (that is then integrated). It makes a huge difference how that function behaves when $N\to\infty$. – John B Dec 28 '17 at 22:51
  • E() is the expectation operator. I've edited my post to clarify. – Anthonius Daoud-Moraru Dec 28 '17 at 22:56
  • Maybe knowing the whole context would help, because one cannot prove the statement with what you have written. There may be things you have left out because you thought they were unimportant... Is that so? – Shashi Dec 28 '17 at 23:02
  • I've added the full problem. – Anthonius Daoud-Moraru Dec 28 '17 at 23:13
  • There are various conditions that let $\lim$ and $\mathbb{E}$ commute. One particularly convenient condition for example is that if $|X_n| \leq Y$ for some $Y \in L_1$ (in particular $Y$ might be some constant), then if $X_n \to X$ pointwise (or almost surely), $\mathbb{E}(X_n) \to \mathbb{E}(X)$. Without this, things can fail spectacularly. – Project Book Dec 29 '17 at 01:59
  • I've tried most things in this article: http://stanford.edu/class/msande321/Handouts/Appendix%20B%20-%20Limits%20and%20Expectations.pdf

    Problem is, I don't have any guarantee of pointwise convergence and I have no upper bound for the parameter vector.

    – Anthonius Daoud-Moraru Dec 29 '17 at 09:49
  • I see you have added consistency; there are several notions of consistency. Which one do you mean? Do you mean that the estimator converges in probability to the true parameter? Then we still cannot prove it. However, if you mean MSE consistent, then we would have been finished a very long time ago lol – Shashi Dec 29 '17 at 11:13
  • I think it converges in probability, since the problem has a second point where I'm supposed to use the first point to prove that an estimate is consistent iff $$ \lim_{N \to \infty} E((\hat{\theta}_N - \theta)\cdot(\hat{\theta}_N - \theta)^T) = 0. $$ There is a statement in my course that $$ \lim_{N \to \infty} \hat{\theta}_N = \theta^* \Leftrightarrow \lim_{N \to \infty} E((\hat{\theta}_N - \theta^*)\cdot(\hat{\theta}_N - \theta^*)^T) = 0, $$ however the proof is left as an exercise to the reader, so according to my professor it should be provable, somehow. – Anthonius Daoud-Moraru Dec 29 '17 at 11:25
  • The second point of the problem is quite simple once I know the limit and the expectation commute, however the first is beyond my skills. – Anthonius Daoud-Moraru Dec 29 '17 at 11:28
  • I just realized I left something out, since you were talking about MSE. According to my prof, "it can be shown" that if $\lim_{N \to \infty} E((\hat{\theta}_N - \theta^*) \cdot (\hat{\theta}_N - \theta^*)^T) = 0$, the MSE also goes to 0. – Anthonius Daoud-Moraru Dec 29 '17 at 11:47
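The failure mode mentioned in the comments (the limit and the expectation not commuting without a dominating bound) can be reproduced with the standard counterexample $X_N = N \cdot \mathbf{1}\{U < 1/N\}$ for $U$ uniform on $(0,1)$. A simulation sketch of my own, not tied to the estimator in the problem:

```python
import numpy as np

# Standard counterexample: X_N = N * 1{U < 1/N} with U ~ Uniform(0, 1).
# For every fixed realization U > 0 we have X_N(U) = 0 once N > 1/U,
# so X_N -> 0 almost surely; yet E(X_N) = N * P(U < 1/N) = 1 for all N.
rng = np.random.default_rng(0)
U = rng.uniform(size=1_000_000)  # one uniform draw per "omega"

for N in (10, 100, 1000):
    X_N = N * (U < 1.0 / N)
    print(N, X_N.mean())  # sample mean stays near 1, not near 0
```

Here $\lim_{N\to\infty} E(X_N) = 1 \ne 0 = E(\lim_{N\to\infty} X_N)$, which is why some extra hypothesis (e.g. a dominating bound, as in the comment above) is needed for the commutation to hold.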
