
I have a square integrable strictly stationary time series $(r_t)$. Suppose that $(r_t)$ satisfies certain conditions such that

$$\sqrt{T}(\bar{r}_T-\mu_r) \rightsquigarrow N(0,\sigma_1^2)$$ $$\sqrt{T}(s_T^2-\gamma_r(0)) \rightsquigarrow N(0,\sigma_2^2)$$

Here $\mu_r := E[r_t]$ and $\gamma_r(0) := E[(r_t-\mu_r)^2]$, $\bar{r}_T := \frac{1}{T}\sum_{t=1}^T r_t$ denotes the sample mean, and $s_T^2 := \frac{1}{T}\sum_{t=1}^T(r_t-\bar{r}_T)^2$ is the sample autocovariance at lag $0$, i.e. the sample variance.

I want to find the limit distribution of $$\sqrt{T}\begin{pmatrix}\bar{r}_T-\mu_r \\ s_T^2-\gamma_r(0)\end{pmatrix}$$

For this purpose I think the Cramér-Wold device is suitable. So let $a_1,a_2 \in \mathbb{R}$, not both zero. Then I need to find the asymptotic distribution of

$$\sqrt{T}\begin{pmatrix}a_1 & a_2\end{pmatrix}\begin{pmatrix}\bar{r}_T-\mu_r \\ s_T^2-\gamma_r(0)\end{pmatrix}$$

I manipulated the expression above a bit but I am kind of stuck. I am also not sure whether this is the way to go. I would appreciate some help.

I realized that for my purposes the limit distribution of $$\sqrt{T}\begin{pmatrix}\bar{r}_T-\mu_r \\ \bar{r^2}_T-E[r_t^2]\end{pmatrix}$$ is also an acceptable answer. Here $\bar{r^2}_T = \frac{1}{T}\sum_{t=1}^T r_t^2$.
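(One reason this second formulation is easier to handle, sketched under the additional assumption $E[r_t^4] < \infty$ so that $(r_t^2)$ is itself square integrable: after the Cramér-Wold reduction, the linear combination is the centered sample mean of a single strictly stationary series,

$$\sqrt{T}\left(a_1(\bar{r}_T-\mu_r) + a_2(\bar{r^2}_T-E[r_t^2])\right) = \sqrt{T}\left(\frac{1}{T}\sum_{t=1}^T u_t - E[u_1]\right), \qquad u_t := a_1 r_t + a_2 r_t^2,$$

so any scalar CLT for stationary sequences that applies to $(u_t)$ delivers the joint limit.)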

Calculon
  • Do you want the joint distribution of $\sqrt{T}\begin{pmatrix}\bar{r}_T-\mu_r \\ s_T^2-\gamma_r(0)\end{pmatrix}$ or the limit in distribution of this random vector? – Davide Giraudo Sep 14 '16 at 20:48
  • @DavideGiraudo Sorry for the confusion. I meant asymptotic joint distribution. So the limit distribution of that random vector. – Calculon Sep 14 '16 at 20:57
  • Can you explain how you showed that the limit distribution of $$\sqrt{T}\begin{pmatrix}\bar{r}_T-\mu_r \\ \bar{r^2}_T-E[r_t^2]\end{pmatrix}$$ is sufficient to get the asymptotic joint distribution of the sample mean and sample variance? It seems like maybe some combination of the delta method and Slutsky's theorem would be required, but I still can't figure it out. – Chill2Macht Feb 13 '18 at 01:43

1 Answer


The problem I will try to solve is the asymptotic joint distribution of

$$\sqrt{T}\begin{pmatrix}\bar{r}_T-\mu_r \\ \bar{r^2}_T-E[r_t^2]\end{pmatrix}$$

I will write this as

$$ \sqrt{n}\begin{pmatrix}\bar{X}_n-\mu \\ \bar{Y}_n-(\sigma^2 + \mu^2)\end{pmatrix}$$

where $X_1, \dots, X_n, \dots$ are i.i.d., $Y_n := X_n^2$ for each $n$, and we write $\mathbb{E}X_1 =: \mu$ and $Var(X_1) =: \sigma^2$, so that $\mathbb{E}[Y_1] = \mathbb{E}[X_1^2] = Var(X_1) + (\mathbb{E}X_1)^2 = \sigma^2 + \mu^2$. This also amounts to writing the indexing variable as $n$ instead of $T$ and the random variables as $X$ instead of $r$. (Note that I am assuming the i.i.d. case, which is stronger than the strict stationarity assumed in the question.)

I write it this way solely because I am more comfortable with this notation and want to avoid unnecessary mistakes.

Anyway, assuming $\mathbb{E}[X_1^4] < \infty$ (so that $Y_1$ has finite variance), it then follows by the multivariate central limit theorem that

$$ \sqrt{n}\begin{pmatrix}\bar{X}_n-\mu \\ \bar{Y}_n-(\sigma^2 + \mu^2)\end{pmatrix} \overset{D}{\to} \mathscr{N}(0, \Sigma)$$

where $\Sigma$ is the matrix:

$$\begin{pmatrix} Var(X_1) & Cov(X_1, Y_1) \\ Cov(X_1, Y_1) & Var(Y_1) \end{pmatrix} = \begin{pmatrix} Var(X_1) & Cov(X_1, X_1^2) \\ Cov(X_1, X_1^2) & Var(X_1^2) \end{pmatrix}$$ $$ = \begin{pmatrix} \sigma^2 & \mathbb{E}(X_1^3) - \mathbb{E}(X_1)\mathbb{E}(X_1^2) \\ \mathbb{E}(X_1^3) - \mathbb{E}(X_1)\mathbb{E}(X_1^2) & \mathbb{E}(X_1^4) - (\mathbb{E}(X_1^2))^2 \end{pmatrix} $$ $$= \begin{pmatrix} \sigma^2 & \mathbb{E}(X_1^3) - \mu(\sigma^2 + \mu^2) \\ \mathbb{E}(X_1^3) - \mu(\sigma^2 + \mu^2) & \mathbb{E}(X_1^4) - (\sigma^2 + \mu^2)^2 \end{pmatrix} \,.$$

To get from here to the joint asymptotic distribution of the sample mean and variance, first use the identity that the sample variance with the $\frac{1}{n}$ normalization, $S_n^2 := \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X}_n)^2$, is equal to $\bar{Y}_n - (\bar{X}_n)^2$, since $\frac{1}{n}\sum_{i=1}^n (X_i - \bar{X}_n)^2 = \frac{1}{n}\sum_{i=1}^n X_i^2 - (\bar{X}_n)^2$. Therefore if we define:

$$ g: \begin{pmatrix} z_1 \\ z_2 \end{pmatrix} \mapsto \begin{pmatrix} z_1 \\ z_2 - z_1^2 \end{pmatrix} $$ this should give us the final result we want if we apply the Delta Method.
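Spelling out the delta-method step: the Jacobian of $g$, evaluated at the mean vector $(\mu, \sigma^2+\mu^2)^\top$, is

$$J = \begin{pmatrix} 1 & 0 \\ -2\mu & 1 \end{pmatrix},$$

and the asymptotic covariance matrix of the transformed vector is $J\Sigma J^\top$. Multiplying out, its off-diagonal entry is $-2\mu\sigma^2 + Cov(X_1, X_1^2) = \mathbb{E}(X_1^3) - 3\mu\sigma^2 - \mu^3 = \mathbb{E}(X_1-\mu)^3$, and its bottom-right entry is $4\mu^2\sigma^2 - 4\mu\,Cov(X_1, X_1^2) + Var(X_1^2) = \mathbb{E}(X_1-\mu)^4 - \sigma^4$.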

Please double check my work, but I think this gives that the asymptotic covariance matrix is:

$$\begin{pmatrix}\sigma^2 & \mathbb{E}(X_1 - \mu)^3 \\ \mathbb{E}(X_1 - \mu)^3 & \mathbb{E}(X_1 - \mu)^4 - \sigma^4 \end{pmatrix}$$
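As a sanity check of this matrix, here is a small Monte Carlo sketch for the i.i.d. case (the Exponential(1) distribution, sample size, and replication count are my own arbitrary choices, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(1): mu = 1, sigma^2 = 1, E(X-mu)^3 = 2, E(X-mu)^4 = 9.
mu, sigma2, cm3, cm4 = 1.0, 1.0, 2.0, 9.0

# Asymptotic covariance matrix from the formula above.
Sigma_theory = np.array([[sigma2, cm3],
                         [cm3, cm4 - sigma2**2]])

n, reps = 500, 20_000
x = rng.exponential(1.0, size=(reps, n))
xbar = x.mean(axis=1)
s2 = ((x - xbar[:, None]) ** 2).mean(axis=1)   # 1/n sample variance

# reps draws of sqrt(n) * (sample mean - mu, sample variance - sigma^2)
z = np.sqrt(n) * np.column_stack([xbar - mu, s2 - sigma2])
Sigma_mc = np.cov(z, rowvar=False)

print(Sigma_theory)        # [[1. 2.], [2. 8.]]
print(Sigma_mc.round(2))   # should be close to the matrix above
```

The empirical covariance of the scaled vector should be close to $\begin{pmatrix}1 & 2\\ 2 & 8\end{pmatrix}$, matching the formula with $\sigma^2 = 1$, $\mathbb{E}(X_1-\mu)^3 = 2$, and $\mathbb{E}(X_1-\mu)^4 - \sigma^4 = 8$.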


In the case that the $X_i$ are normal, one can show that $S_n^2$ and $\bar{X}_n$ are independent, and likewise that $\mathbb{E}[(X_1 - \mu)^4] = 3\sigma^4$, so everything simplifies considerably. It seems that, as a result, people are often tempted to ignore the non-normal case.

Pang
Chill2Macht