I was given a model $r_{t} = \phi_{0} + \phi_{2}r_{t-2} + \epsilon_{t}$ with $\epsilon_t \sim N(0,\sigma^2)$, and I have to derive the likelihood of $(r_{3}, r_{4}, \dots, r_{T})$ conditional on $(r_{1}, r_{2})$, then find the $\phi_{0}$ and $\phi_{2}$ that maximize that likelihood, given that $\sigma^2$ is known.
I think I managed to get the conditional log-likelihood as follows: $$\begin{aligned} \ln L(r_{3}, r_{4}, \dots, r_{T}\mid r_{1},r_{2};\phi_{0},\phi_{2},\sigma^2) &= -\frac{T-2}{2}\ln(2\pi)-\frac{T-2}{2}\ln(\sigma^2)-\sum_{t=3}^{T} \frac{(r_{t}-\phi_{0}-\phi_{2}r_{t-2})^2}{2\sigma^2}\\ &=-\frac{T-2}{2}\ln(2\pi)-\frac{T-2}{2}\ln(\sigma^2)-\sum_{t=3}^{T} \frac{\epsilon_{t}^2}{2\sigma^2} \end{aligned}$$
However, I am not sure how to derive the MLE from this. Could someone let me know how to proceed?
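As a numerical sanity check of the expression above (a sketch with illustrative parameter values, not taken from the question), the conditional log-likelihood can be compared against a direct sum of Gaussian log-densities, since each $r_t \mid r_{t-2}$ is $N(\phi_0 + \phi_2 r_{t-2}, \sigma^2)$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative parameter choices (not from the question)
phi0, phi2, sigma2, T = 0.5, 0.3, 1.0, 200
r = np.zeros(T)
for t in range(2, T):  # r[0], r[1] play the role of r_1, r_2
    r[t] = phi0 + phi2 * r[t - 2] + rng.normal(scale=np.sqrt(sigma2))

def cond_loglik(phi0, phi2, sigma2, r):
    """ln L(r_3, ..., r_T | r_1, r_2) for r_t = phi0 + phi2 r_{t-2} + eps_t."""
    resid = r[2:] - phi0 - phi2 * r[:-2]
    n = resid.size  # T - 2 terms in the sum
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * n * np.log(sigma2)
            - np.sum(resid ** 2) / (2 * sigma2))

# Each conditional density is Gaussian, so the two computations must agree
direct = norm.logpdf(r[2:], loc=phi0 + phi2 * r[:-2],
                     scale=np.sqrt(sigma2)).sum()
print(np.isclose(cond_loglik(phi0, phi2, sigma2, r), direct))  # True
```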
Minimizing $\sum_{t=3}^{T} (r_t - \phi_{0} - \phi_2 r_{t-2})^2$ is the same as minimizing the sum of squared residuals in a regression with $\beta_{0} = \phi_{0}$ and $\beta_1 = \phi_2$. So you can just take the derivative with respect to each coefficient and set it equal to zero; it's exactly an ordinary least squares regression of $r_t$ on $r_{t-2}$.
– mark leeds Dec 26 '23 at 18:48

Setting the derivatives of the log-likelihood to zero, I get $$\frac{\partial \ln L}{\partial \phi_{0}}=\frac{1}{\sigma^2}\sum_{t=3}^{T}\left(r_{t}-\phi_{0}-\phi_{2}r_{t-2}\right)=0,\qquad \frac{\partial \ln L}{\partial \phi_{2}}=\frac{1}{\sigma^2}\sum_{t=3}^{T}\left(r_{t}-\phi_{0}-\phi_{2}r_{t-2}\right)r_{t-2}=0$$ I got these two equations, but I don't know if they're correct, or how to solve them.
– Tin Dao Dec 28 '23 at 07:31
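The two first-order conditions are the usual OLS normal equations, so they have a closed-form solution: the slope and intercept of a regression of $r_t$ on $r_{t-2}$. A sketch with made-up true parameter values (not from the question) that solves them both ways:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate from the model with illustrative true values phi0 = 0.5, phi2 = 0.3
phi0, phi2, T = 0.5, 0.3, 5000
r = np.zeros(T)
for t in range(2, T):
    r[t] = phi0 + phi2 * r[t - 2] + rng.normal()

y, x = r[2:], r[:-2]  # r_t and r_{t-2} for t = 3, ..., T
n = y.size            # T - 2 observations

# Closed-form solution of the two normal equations:
#   phi2_hat = [n * sum(x*y) - sum(x)*sum(y)] / [n * sum(x^2) - sum(x)^2]
#   phi0_hat = [sum(y) - phi2_hat * sum(x)] / n
phi2_hat = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / \
           (n * np.sum(x ** 2) - np.sum(x) ** 2)
phi0_hat = (np.sum(y) - phi2_hat * np.sum(x)) / n

# Same numbers from a plain least-squares fit of y on [1, x]
beta = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0]
print(np.allclose([phi0_hat, phi2_hat], beta))  # True
```

With a long enough sample the estimates land close to the true $(\phi_0, \phi_2)$, which is a useful check that the derivation is right.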