Question:
Let $Z_n$, $n\geq 1$, be a sequence of random variables and $c$ a constant such that, for each $\epsilon > 0$, $P\{|Z_n-c|>\epsilon\}\rightarrow 0$ as $n \rightarrow \infty$. Show that, for any bounded continuous function $g$, $$E[g(Z_n)]\rightarrow g(c)\ \text{as}\ n\rightarrow\infty.$$
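(A quick numerical sanity check of the claim; the specific distribution of $Z_n$ and the choice of $g$ below are illustrative assumptions, not part of the problem.)

```python
import numpy as np

rng = np.random.default_rng(0)

c = 2.0        # the constant that Z_n converges to (illustrative choice)
g = np.tanh    # a bounded continuous function (illustrative choice)

for n in (1, 10, 100, 10_000):
    # Z_n = c + noise whose spread shrinks like 1/sqrt(n), so
    # P(|Z_n - c| > eps) -> 0 for every eps > 0 (convergence in probability).
    z_n = c + rng.normal(scale=1 / np.sqrt(n), size=200_000)
    # Monte Carlo estimate of E[g(Z_n)]; it should approach g(c).
    print(f"n={n:>6}:  E[g(Z_n)] ~ {g(z_n).mean():.5f}   (g(c) = {g(c):.5f})")
```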
Answer attempt:
I set $Z^n=Z_1+Z_2+\cdots+Z_n$, where the $Z_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, to be my sequence of random variables.
Chebyshev's inequality states $P\{|X-\mu|\geq k\}\leq \frac{\sigma^2}{k^2}$. Based on this inequality I apply it to $X=Z^n$, which has mean $n\mu$ and variance $n\sigma^2$, taking $k = n\epsilon_0$, where $\epsilon_0$ is an arbitrary positive constant: $$P\left(|Z^n-n\mu|\geq n\epsilon_0\right)=P\left(\left|\frac{Z^n}{n}-\mu\right|\geq\epsilon_0\right)\leq \frac{n\sigma^2}{n^2\epsilon_0^2}=\frac{\sigma^2}{n\epsilon_0^2}.$$ This goes to $0$ as $n\rightarrow\infty$, so $\frac{Z^n}{n}$ satisfies the hypothesis with $c=\mu$.
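(Spelling out the variance step used above, under the i.i.d. assumption from my setup:) $$\operatorname{Var}\left(\frac{Z^n}{n}\right)=\frac{1}{n^2}\operatorname{Var}\left(\sum_{i=1}^{n}Z_i\right)=\frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(Z_i)=\frac{n\sigma^2}{n^2}=\frac{\sigma^2}{n},$$ so Chebyshev applied to $\frac{Z^n}{n}$ with $k=\epsilon_0$ gives the same bound directly.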
So then, to prove $E\left[g\left(\frac{Z^n}{n}\right)\right] \rightarrow g(\mu)$ as $n\rightarrow\infty$, I am not sure what to do. I think, intuitively, the distribution of $\frac{Z^n}{n}$ concentrates at $\mu$, so $E\left[g\left(\frac{Z^n}{n}\right)\right]$ should approach $g(\mu)$, which would prove the limit.
But how do you prove, say in the discrete case, that $\sum_z g(z)\,p_{Z^n/n}(z)\rightarrow g(\mu)$? Anyway, yeah, I am lost.
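(For what it's worth, here is a sketch of the standard argument; note it needs no sums, densities, or i.i.d. structure, only the hypothesis that $P\{|Z_n-c|>\epsilon\}\rightarrow 0$ for every $\epsilon>0$. Let $M=\sup_x|g(x)|<\infty$, since $g$ is bounded, and fix $\epsilon>0$. By continuity of $g$ at $c$, pick $\delta>0$ such that $|g(z)-g(c)|<\epsilon$ whenever $|z-c|\leq\delta$. Then, splitting the expectation over the event $\{|Z_n-c|\leq\delta\}$ and its complement,) $$\begin{aligned} |E[g(Z_n)]-g(c)| &\leq E\bigl[|g(Z_n)-g(c)|\,\mathbf{1}_{\{|Z_n-c|\leq\delta\}}\bigr]+E\bigl[|g(Z_n)-g(c)|\,\mathbf{1}_{\{|Z_n-c|>\delta\}}\bigr]\\ &\leq \epsilon+2M\,P\{|Z_n-c|>\delta\}. \end{aligned}$$ The last term tends to $0$ by hypothesis, so $\limsup_{n}|E[g(Z_n)]-g(c)|\leq\epsilon$; since $\epsilon>0$ was arbitrary, $E[g(Z_n)]\rightarrow g(c)$.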