In this book (pp. 37, 43–44) the notion of stochastic stability of a stochastic process is defined by the condition $$ \mathbb{E} \left[\sum_{k=0}^{\infty} \|x_k\|^2\right] < \infty.\tag{1} $$ It is shown that for Markovian switching systems of the form $x_{k+1} = f_{\theta_k}(x_k)$, driven by a finite Markov chain $(\theta_k)_k$, stochastic stability is equivalent to mean-square stability.
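As a quick numerical illustration (my own construction, not from the book), one can Monte Carlo estimate $\mathbb{E}\left[\sum_k \|x_k\|^2\right]$ for a two-mode switched *linear* system $x_{k+1} = A_{\theta_k} x_k$; the mode matrices, transition matrix, horizon, and sample size below are arbitrary choices:

```python
import numpy as np

# Two-mode Markovian switching system x_{k+1} = A_{theta_k} x_k.
# Both modes are contractive here, so the cost sum converges pathwise.
A = [np.array([[0.5, 0.1], [0.0, 0.4]]),   # mode 0
     np.array([[0.3, 0.0], [0.2, 0.6]])]   # mode 1
P = np.array([[0.9, 0.1],                  # Markov chain transition matrix
              [0.5, 0.5]])

def sample_cost(rng, K=200):
    """Truncated cost sum_{k=0}^{K-1} ||x_k||^2 along one sample path."""
    x = np.array([1.0, 1.0])
    theta = 0
    total = 0.0
    for _ in range(K):
        total += float(x @ x)
        x = A[theta] @ x
        theta = rng.choice(2, p=P[theta])
    return total

rng = np.random.default_rng(0)
costs = [sample_cost(rng) for _ in range(2000)]
print(np.mean(costs))  # Monte Carlo estimate of E[sum_k ||x_k||^2]
```

Since both modes are contractions, the estimate is finite regardless of the switching law, consistent with condition (1).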
Let us replace the expectation operator $\mathbb{E}$ above by a coherent risk measure $r$, which leads to the condition $$ r \left[\sum_{k=0}^{\infty} \|x_k\|^2\right] < \infty.\tag{2}\label{2} $$
We know that $r \left[\sum_{k=0}^{\infty} \|x_k\|^2\right] \leq \sum_{k=0}^{\infty}r(\|x_k\|^2)$ by subadditivity (assuming enough continuity of $r$, e.g. continuity from below, for finite subadditivity to extend to the countable sum). Therefore, if $r(\|x_k\|^2)$ decays fast enough that $\sum_{k=0}^{\infty}r(\|x_k\|^2)$ is finite, then \eqref{2} holds. I am, however, interested in the converse: under what additional conditions does \eqref{2} imply that $r(\|x_k\|^2)\to 0$?
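As a sanity check on the subadditivity bound, here is a small numerical sketch (again my own construction): the empirical CVaR — a standard coherent risk measure — is evaluated on the per-stage costs $\|x_k\|^2$ of a toy scalar switched system, and the risk of the truncated total is compared against the sum of per-stage risks. The gains, switching law, level $\alpha$, and horizon are arbitrary:

```python
import numpy as np

def cvar(samples, alpha=0.1):
    """Empirical CVaR_alpha: mean of the worst alpha-fraction of samples."""
    k = max(1, int(np.ceil(alpha * len(samples))))
    return float(np.mean(np.sort(samples)[-k:]))

rng = np.random.default_rng(1)
a = np.array([0.5, 0.8])          # scalar mode gains, both contractive
n_paths, K = 5000, 50
stage_costs = np.empty((n_paths, K))
for i in range(n_paths):
    x = 1.0
    for k in range(K):
        stage_costs[i, k] = x * x
        x *= a[rng.integers(2)]   # i.i.d. switching, for simplicity
total = stage_costs.sum(axis=1)

lhs = cvar(total)                                     # r[sum_k ||x_k||^2]
rhs = sum(cvar(stage_costs[:, k]) for k in range(K))  # sum_k r(||x_k||^2)
print(lhs, rhs)                                       # lhs <= rhs
```

The empirical top-fraction average is itself subadditive (the worst paths for the total are at most as bad as the worst paths stage by stage), so the inequality holds exactly in the samples, not just in expectation over resampling.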
A first step would be to see whether this holds for the essential supremum: if $\operatorname{ess\,sup} \left[\sum_{k=0}^{\infty} \|x_k\|^2\right] < \infty$, is it true that $\operatorname{ess\,sup}(\|x_k\|^2)\to 0$?