
For a random dynamical system the Lyapunov exponent is defined as:

$$\lambda = \limsup_{n\to\infty} \frac{1}{n}\log\|A_n \cdots A_1\|,$$

where the $A_i$ are i.i.d. random matrices. The Furstenberg–Kesten theorem states that this limit exists almost surely (provided $\mathbb{E}\log^+\|A_1\|<\infty$).

My question is: at what rate does this convergence occur? More specifically, is there an upper bound on the rate of convergence that holds for any choice of the $A_i$?

Alp Uzman

1 Answer


In case this is what you're asking: you should never expect the rate of convergence to be uniform in the random sample $A_1, A_2, \ldots$. A simple example is the following: sample the matrix

$$ \left( \begin{array}{c c} 2 & 0 \\ 0 & 1/2 \end{array} \right) $$ with probability $1/2$, and sample a rigid rotation $R_\theta$ with probability $1/2$, where $\theta$ is some fixed (deterministic) irrational angle. A theorem of Furstenberg says that the Lyapunov exponent $\lambda$ is strictly positive, while with positive probability you could very well sample "too many" of the rigid rotations $R_\theta$, preventing the accumulation of the exponent $\lambda$.
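If it helps, here is a minimal numerical sketch of this example. The concrete angle $\theta = 1$ radian and the use of a single tracked vector as a proxy for the growth of $\|A_n \cdots A_1\|$ are assumptions made for the sketch (for a.e. initial vector the growth rate agrees with $\lambda$):

```python
import numpy as np

# The two matrices from the example; theta = 1 radian is an arbitrary
# choice of (irrational) angle made for this sketch.
D = np.array([[2.0, 0.0], [0.0, 0.5]])
theta = 1.0
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def finite_time_exponent(n, rng):
    """One sample of (1/n) log ||A_n ... A_1 v|| for a fixed unit vector v.

    Renormalizing the vector at every step avoids floating-point
    overflow while accumulating the log of the growth factor.
    """
    v = np.array([1.0, 0.0])
    log_growth = 0.0
    for _ in range(n):
        A = D if rng.random() < 0.5 else R
        v = A @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
    return log_growth / n

rng = np.random.default_rng(0)
samples = [finite_time_exponent(2000, rng) for _ in range(20)]
# The sample mean should be visibly positive (Furstenberg's theorem),
# but the individual finite-time estimates fluctuate sample to sample.
print(np.mean(samples), np.std(samples))
```

The sample-to-sample fluctuation in the output is exactly the non-uniformity described above: a run that happens to draw many rotations early sits well below the mean.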

It is more interesting to study Large Deviations Principles for the convergence of Lyapunov exponents. For instance: what are the asymptotics of

$$ \mathbb P \left\{ \left| \frac{1}{n} \log \| A_n \cdots A_1 \| - \lambda \right| > \epsilon \right\} $$ for fixed $\epsilon$ as $n \to \infty$? For IID matrix products, this has been worked out in a paper of Arnold and Kliemann from (I think) 1987. It turns out to decay exponentially, with an exponent depending on the Legendre transform of the moment Lyapunov function (sorry to be so cryptic; see the paper for more details).
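The decay of this probability can be probed with a crude Monte Carlo experiment in the toy example above (this is only an illustration, not the Arnold–Kliemann machinery; the angle $\theta = 1$, the run lengths, and the trial counts are all arbitrary choices for the sketch):

```python
import numpy as np

# Same toy model as before: hyperbolic matrix or rigid rotation, each
# sampled with probability 1/2; theta = 1 radian is an arbitrary choice.
D = np.array([[2.0, 0.0], [0.0, 0.5]])
theta = 1.0
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def finite_time_exponent(n, rng):
    """One sample of (1/n) log ||A_n ... A_1 v|| for a fixed unit vector v."""
    v = np.array([1.0, 0.0])
    log_growth = 0.0
    for _ in range(n):
        A = D if rng.random() < 0.5 else R
        v = A @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
    return log_growth / n

def deviation_probability(n, lam, eps, trials, rng):
    """Monte Carlo estimate of P(|(1/n) log||A_n...A_1|| - lam| > eps)."""
    hits = sum(abs(finite_time_exponent(n, rng) - lam) > eps
               for _ in range(trials))
    return hits / trials

rng = np.random.default_rng(1)
# Estimate lambda from one long run, then watch the deviation
# probability shrink as n grows, as the LDP predicts.
lam_hat = finite_time_exponent(200_000, rng)
for n in (50, 200, 800):
    print(n, deviation_probability(n, lam_hat, 0.1, 400, rng))
```

Plotting $\log$ of these estimates against $n$ would give a rough look at the large-deviations rate, though resolving the rate function itself needs far more samples than a sketch like this.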

A Blumenthal
  • Another way to formulate these kinds of "rate" questions is to consider a CLT-type scaling. For IID matrix products this is also classical: Le Page has results of this kind in a slightly more general setting. Again, the asymptotic variance you get is related to the moment Lyapunov function. – A Blumenthal Mar 01 '19 at 22:34