
I was given the model $r_{t} = \phi_{0} + \phi_{2}r_{t-2} + \epsilon_{t}$ with $\epsilon_t \sim N(0,\sigma^2)$, and I have to derive the likelihood of $(r_{3}, r_{4}, \ldots, r_{T})$ conditional on $(r_{1}, r_{2})$ and find the $\phi_{0}$ and $\phi_2$ that maximize the likelihood function, given that $\sigma^2$ is known.

I think I managed to get the conditional log-likelihood function as follows: $$\begin{aligned} \ln L(r_{3}, r_{4}, \ldots, r_{T}|r_{1},r_{2};\sigma^2) &= -\displaystyle\frac{T-2}{2}\ln(2\pi)-\displaystyle\frac{T-2}{2}\ln(\sigma^2)\\ \\ &-\displaystyle\sum_{t=3}^{T} \displaystyle\frac{(r_{t}-\phi_{0}-\phi_{2}r_{t-2})^2}{2\sigma^2}\\ \\ &=-\displaystyle\frac{T-2}{2}\ln(2\pi)-\displaystyle\frac{T-2}{2}\ln(\sigma^2)-\displaystyle\sum_{t=3}^{T} \displaystyle\frac{\epsilon_{t}^2}{2\sigma^2} \end{aligned}$$

However, I am not sure how to derive the MLE from this. Could someone let me know how to proceed?
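For a concrete sanity check, here is a minimal Python sketch (the data are simulated, and the parameter values $\phi_0 = 0.5$, $\phi_2 = 0.4$, $\sigma = 1$ are made up for the demo) that evaluates the conditional log-likelihood above and maximizes it by a coarse grid search:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate r_t = phi0 + phi2 * r_{t-2} + eps_t (illustrative parameter values)
phi0_true, phi2_true, sigma = 0.5, 0.4, 1.0
T = 500
r = np.zeros(T)
for t in range(2, T):
    r[t] = phi0_true + phi2_true * r[t - 2] + sigma * rng.standard_normal()

def cond_loglik(phi0, phi2, r, sigma2):
    """Log-likelihood of r[2:] conditional on the first two observations."""
    resid = r[2:] - phi0 - phi2 * r[:-2]  # pairs r_t with r_{t-2}, t = 3..T
    n = len(resid)                        # n = T - 2
    return (-0.5 * n * np.log(2 * np.pi) - 0.5 * n * np.log(sigma2)
            - np.sum(resid**2) / (2 * sigma2))

# Coarse grid search over (phi0, phi2); the maximizer should land near the truth
grid = np.linspace(-1, 1, 81)
best = max((cond_loglik(a, b, r, sigma**2), a, b) for a in grid for b in grid)
print(best[1], best[2])  # approximate MLEs of phi0 and phi2
```

A grid search is of course only for illustration; the closed-form maximizers are derived in the answer below.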

Tin Dao
  • First put a hat on the $\epsilon_t$ in your likelihood and the $\sigma^2$ in your likelihood. Then, take the derivative of the likelihood with respect to $\hat{\sigma}^2$ and then set it equal to zero. That will allow you to write $\hat{\sigma}^2$ as a function of the $\hat{\epsilon}_t$. $\hat{\sigma}^2$ is the MLE of $\sigma^2$. – mark leeds Dec 26 '23 at 07:22
  • Yeah, I already know that part; I just could not find the right equations for the derivatives – Tin Dao Dec 26 '23 at 16:14
  • I'm sorry. I read it wrong. It is assumed that $\sigma^2$ is known and you want to find the estimates of the mean and the AR coefficient. That should make the likelihood pretty straightforward. I'll do it on paper and then latex it but I don't have time at this moment. Hopefully I'll have some time tonight. In the meantime, you could google for it. something like "conditional AR(1) likelihood with known variance". – mark leeds Dec 26 '23 at 18:14
  • Hi: Just looking at your likelihood, the first two terms don't involve the observations so it's only the last term that we need to consider. So, we want to minimize

    $ \sum_{t=3}^{T} (r_t - \phi_2 r_{t-2} - \phi_{0})^2 $ This is the same as minimizing the sum of squares in a regression where $\beta_{0} = \phi_{0}$ and $\beta_1 = \phi_2$. So, you can just take the derivative with respect to each coefficient and set it equal to zero. It's exactly the same as an ordinary least squares regression.

    – mark leeds Dec 26 '23 at 18:48
  • thank you very much – Tin Dao Dec 27 '23 at 09:20
  • You are welcome. One more note: the sum runs over every $t$ from 3 to $T$. The lag-2 relation splits the observations into two interleaved chains (odd and even $t$), but both chains share the same $\phi_0$ and $\phi_2$, so every term contributes. – mark leeds Dec 27 '23 at 21:43
  • $$\displaystyle\frac{ \partial \mathcal{L} }{ \partial \phi_{0} }=\displaystyle\frac{1}{2\sigma^2} (\displaystyle\sum_{t=3}^{T}r_{t}^2-\phi_{2}\displaystyle\sum_{t=3}^{T}r_{t}r_{t-2})=0$$

    $$\displaystyle\frac{ \partial \mathcal{L} }{ \partial \phi_{2} }=\displaystyle\frac{1}{2\sigma^2} (\displaystyle\sum_{t=3}^{T}r_{t}r_{t-2}-\phi_{2}\displaystyle\sum_{t=3}^{T}r_{t-2}^2)=0$$ I got these two equations, but I don't know if they're correct or how to solve them

    – Tin Dao Dec 28 '23 at 07:31
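As an aside on the comment thread above: once the $\phi_0$ terms are included, the two first-order conditions form a $2\times 2$ linear system, $(T-2)\phi_0 + (\sum r_{t-2})\phi_2 = \sum r_t$ and $(\sum r_{t-2})\phi_0 + (\sum r_{t-2}^2)\phi_2 = \sum r_t r_{t-2}$. A minimal sketch of solving it numerically (simulated data; the parameter values are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)
phi0_true, phi2_true, sigma = 0.5, 0.4, 1.0  # illustrative values
T = 2000
r = np.zeros(T)
for t in range(2, T):
    r[t] = phi0_true + phi2_true * r[t - 2] + sigma * rng.standard_normal()

y, x = r[2:], r[:-2]  # r_t and its second lag, t = 3..T
n = len(y)            # n = T - 2

# Normal equations (first-order conditions with the phi0 terms included):
#   n * phi0      + sum(x) * phi2    = sum(y)
#   sum(x) * phi0 + sum(x**2) * phi2 = sum(x * y)
A = np.array([[n, x.sum()], [x.sum(), (x**2).sum()]])
b = np.array([y.sum(), (x * y).sum()])
phi0_hat, phi2_hat = np.linalg.solve(A, b)
print(phi0_hat, phi2_hat)
```

The solution coincides with an OLS regression of $r_t$ on $r_{t-2}$, which is the point of the comment above.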

1 Answer


Okay, I'll take the derivatives and show you, but you should practice these things so you can get better at this sort of derivation.

We want to minimize $L = \sum_{t=3}^{t=T} (r_t - \phi_2 r_{t-2} - \phi_{0})^2$ (the first two terms of the log-likelihood don't involve $\phi_0$ or $\phi_2$, so maximizing the likelihood is equivalent to minimizing this sum of squares).

First take $\frac{\partial L}{\partial \phi_{0}}$ and set it to zero.

This gives $ -2 \sum_{t=3}^{t=T}(r_t - \phi_2 r_{t-2} - \phi_{0}) = 0$

$ (T-2) \phi_{0} = \sum_{t=3}^{t=T}(r_t - \phi_2 r_{t-2}) \longrightarrow $

$\phi_{0} = \frac{\sum_{t=3}^{t=T}(r_t - \phi_2 r_{t-2})}{T-2}$

So, that's the estimate for $\phi_0$, which we call $\hat{\phi}_{0}$. (Notice that it's a function of $\phi_2$, which we will solve for next.)

So, now we take $\frac{\partial L}{\partial \phi_{2}}$ and set it to zero.

This gives $-2 \sum_{t=3}^{t=T} r_{t-2}(r_t - \phi_2 r_{t-2} - \phi_{0}) = 0$. Substituting $\hat{\phi}_{0} = \bar{r} - \phi_2 \bar{r}_{-2}$, where $\bar{r} = \frac{1}{T-2}\sum_{t=3}^{t=T} r_t$ and $\bar{r}_{-2} = \frac{1}{T-2}\sum_{t=3}^{t=T} r_{t-2}$, and solving for $\phi_2$ gives $$\hat{\phi}_2 = \frac{\sum_{t=3}^{t=T} (r_t - \bar{r})(r_{t-2} - \bar{r}_{-2})}{\sum_{t=3}^{t=T} (r_{t-2} - \bar{r}_{-2})^2}$$

So, now that we have the MLE of $\phi_2$, we can plug it into the formula for $\hat{\phi}_{0}$ above to obtain the MLE of $\phi_0$ as well.

I hope this is clear. If not, then let me know. Also, note that the summations run over every $t$ from $3$ to $T$: the lag-2 relation splits the data into two interleaved subsequences (odd and even $t$), but both share the same $\phi_0$ and $\phi_2$, so all $T-2$ residuals enter the sum and dividing by $T-2$ in the formula for $\hat{\phi}_{0}$ is correct.
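The closed-form estimators above are just the OLS intercept and slope from regressing $r_t$ on $r_{t-2}$. A minimal check in Python (simulated data; the parameter values are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(2)
phi0_true, phi2_true, sigma = 0.5, 0.4, 1.0  # illustrative values
T = 2000
r = np.zeros(T)
for t in range(2, T):
    r[t] = phi0_true + phi2_true * r[t - 2] + sigma * rng.standard_normal()

y, x = r[2:], r[:-2]  # regress r_t on r_{t-2}, using every t from 3 to T

# Closed-form MLEs from the two solved first-order conditions
phi2_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
phi0_hat = y.mean() - phi2_hat * x.mean()
print(phi0_hat, phi2_hat)  # should land near the true values
```

With a moderately long simulated series, both estimates land close to the values used to generate the data.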

mark leeds