I am studying the book Nonparametric and Semiparametric Models written by Wolfgang Hardle and have difficulty with the following exercise:
$\textbf{Exercise 3.13}$ Show that $\hat{f_h}^{(n)}(x) \xrightarrow {a.s.}f(x)$. Assume that $f$ possesses a second derivative and that $\Vert K \Vert_2 <\infty$.
Here $\hat{f_h}^{(n)}(x)=\dfrac{1}{n}\sum\limits_{i=1}^n K_h(x-X_i)$ is the kernel density estimator, $\Vert K \Vert_2=\left(\int_{\mathbb{R}}K^2(s)\,ds\right)^{\frac{1}{2}}$, and we take $h=O(n^{-\frac{1}{5}})$, the optimal bandwidth rate for this exercise.
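For concreteness, here is a minimal numerical sketch of the estimator $\hat{f_h}^{(n)}$ (my own illustration, not from the book), assuming a standard normal sample and a Gaussian kernel; the exercise itself only requires $\Vert K \Vert_2 < \infty$.

```python
import numpy as np

def kde(x, X, h):
    """Kernel density estimate f_h^(n)(x) = (1/n) * sum_i K_h(x - X_i).

    A Gaussian kernel is used purely for illustration; the exercise only
    assumes ||K||_2 < infinity.
    """
    u = (x - X) / h                                   # (x - X_i) / h
    K = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)    # K((x - X_i) / h)
    return K.mean() / h                               # average of K_h(x - X_i)

# Toy usage: estimate the N(0,1) density at x = 0 with h = n^{-1/5}
rng = np.random.default_rng(0)
n = 10_000
X = rng.standard_normal(n)
print(kde(0.0, X, n ** (-1 / 5)))   # should be close to 1/sqrt(2*pi) ~ 0.3989
```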
Here is my attempt. I want to use the Borel–Cantelli lemma to prove the almost sure convergence. By Chebyshev's inequality we have
$$ \mathbb{P}\left(\vert \hat{f_h}^{(n)}(x)-f(x)\vert>\epsilon\right)\le \dfrac{\mathbb{E}\left(\hat{f_h}^{(n)}(x)-f(x)\right)^2}{\epsilon ^2}=\dfrac{MSE\left(\hat{f_h}^{(n)}(x)\right)}{\epsilon^2}, $$
and it is known that
$$ MSE\left(\hat{f_h}^{(n)}(x)\right)=\dfrac{f(x)}{nh}\Vert K \Vert_2^2+\dfrac{1}{4}\left(f''(x)\right)^2h^4\left(\mu_2(K)\right)^2+o(h^4)+o\left(\dfrac{1}{nh}\right), $$
where $\mu_2(K)=\int_{\mathbb{R}}s^2K(s)\,ds$. Substituting $h=h_{opt}=O(n^{-\frac{1}{5}})$ gives
$$MSE\left(\hat{f_h}^{(n)}(x)\right)=O(n^{-\frac{4}{5}}).$$
However, $\sum\limits_{n=1}^{\infty} n^{-\frac{4}{5}}=\infty$, so the Borel–Cantelli lemma does not apply directly along the full sequence, and this is where I am stuck. Can anyone help me with this?
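For what it is worth, a quick Monte Carlo sanity check (again with a Gaussian kernel and $N(0,1)$ data, both my own choices, not from the book) is consistent with the $O(n^{-\frac{4}{5}})$ rate above, so my confusion is really about the Borel–Cantelli step rather than the MSE bound:

```python
import numpy as np

rng = np.random.default_rng(1)

def kde(x, X, h):
    # Gaussian-kernel estimator (1/n) * sum_i K_h(x - X_i), as above
    u = (x - X) / h
    return (np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)).mean() / h

x0 = 0.0
f_x0 = 1 / np.sqrt(2 * np.pi)           # true N(0,1) density at x0

for n in (1_000, 4_000, 16_000, 64_000):
    h = n ** (-1 / 5)                    # h = n^{-1/5}
    # Monte Carlo approximation of MSE = E(f_h^(n)(x0) - f(x0))^2 over 200 replications
    sq_err = [(kde(x0, rng.standard_normal(n), h) - f_x0) ** 2 for _ in range(200)]
    # n^{4/5} * MSE should stay roughly constant if MSE = O(n^{-4/5})
    print(n, np.mean(sq_err), n ** (4 / 5) * np.mean(sq_err))
```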