
Let $C>0$ and $(X_n)$ be a sequence of positive random variables. Assume that $$ |X_n - C| = o_p(r_n^{-1}) \iff r_n|X_n-C|=o_p(1) $$ for some fixed sequence $(r_n)$ with $r_n \to \infty$.

What can we say about the rate of convergence of the log-transform: $$ |\log(X_n)- \log(C)| = o_p(?). $$ I guess it depends on $C$ and $(r_n)$, but I can't seem to derive anything useful.

John

1 Answer


It has the same rate.

I hope the following hint helps.

  • Recall that $r_n|X_n-C|=o_p(1) \iff \left( \forall \varepsilon>0,\ \mathbb P\!\left(|X_n-C|>\frac{\varepsilon}{r_n}\right) \rightarrow 0 \text{ as } n\rightarrow \infty \right)$.

  • For the case $X_n>C+\frac{\varepsilon}{r_n}$, we apply a Taylor expansion and get $$r_n[\log(X_n)-\log(C)]> \frac{\varepsilon}{C} + O(\varepsilon^2)$$ (this step is spelled out just after the list).

  • Similarly for the case $X_n<C-\frac{\varepsilon}{r_n}$.
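(To spell out the first bullet: since $\log$ is increasing, $X_n>C+\frac{\varepsilon}{r_n}$ gives $$ r_n\bigl[\log(X_n)-\log(C)\bigr] > r_n\log\!\left(1+\frac{\varepsilon}{C r_n}\right) = \frac{\varepsilon}{C} - \frac{\varepsilon^2}{2C^2 r_n} + \cdots, $$ which is where the factor $1/C$ enters; since $C$ is a constant, this factor is absorbed into the $o_p$.)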

Edit. As pointed out by @John, the implication above goes in the wrong direction; what we actually need to prove is the reverse implication. However, the same idea applies, i.e., we should use the Taylor expansion of $e^x$ instead of that of $\log x$.

In the following, I propose a more detailed solution without using Taylor expansion (to avoid big-O notation).

Claim: For every $\varepsilon>0$ there exists $\delta>0$ such that $$r_n|\log(X_n)-\log(C)|>\varepsilon \Longrightarrow r_n|X_n-C|> \delta$$ for all $n$ large enough.

Proof of Claim: Let $Y_n=\log X_n$ and $B=\log C$. We want to show that for every $\varepsilon>0$ there exists $\delta>0$ such that $$r_n|Y_n-B|>\varepsilon \Longrightarrow r_n|e^{Y_n}-e^B|> \delta$$ for all large $n$.

Now assume that $r_n|Y_n-B|>\varepsilon$; we consider two cases.

First, recall the simple inequality $e^{x}\geq 1+x$ for all $x\in \mathbb R$. If $Y_n > B +\frac{\varepsilon}{r_n}$, then $e^{Y_n} > e^{B+\varepsilon/r_n} \geq e^B\left(1 +\frac{\varepsilon}{r_n}\right)$, and thus $r_n(e^{Y_n} - e^B)>e^B \varepsilon$.

Second, it is easy to check that $e^{-x} \leq 1-x+\frac{x^2}{2}$ for $x\geq 0$ (indeed, $g(x):=1-x+\frac{x^2}{2}-e^{-x}$ satisfies $g(0)=0$ and $g'(x)=x-1+e^{-x}\geq 0$ by the first inequality applied to $-x$). If $Y_n < B -\frac{\varepsilon}{r_n}$, then $e^{Y_n} < e^{B-\varepsilon/r_n} \leq e^B\left(1-\frac{\varepsilon}{r_n}+\frac{\varepsilon^2}{2r_n^2}\right)$, and thus $r_n(e^{Y_n} - e^B)<-e^B \varepsilon+e^B\frac{\varepsilon^2}{2r_n}$.

Third, since $r_n\to\infty$, for any fixed $M\in(0,1)$ we have $\frac{\varepsilon}{2r_n}<M$ for all sufficiently large $n$. Fix such an $M$ (say $M=\tfrac12$) and set $\delta:=e^B(1-M)\varepsilon$.

Then, in both cases, for all sufficiently large $n$ we have $r_n|e^{Y_n} - e^B|>e^B \varepsilon-e^B\frac{\varepsilon^2}{2r_n}> e^B(1-M)\varepsilon=\delta$, which proves the Claim. Consequently, for every $\varepsilon>0$, $$\mathbb P\bigl(r_n|\log(X_n)-\log(C)|>\varepsilon\bigr)\leq \mathbb P\bigl(r_n|X_n-C|>\delta\bigr)\rightarrow 0,$$ i.e. $|\log(X_n)-\log(C)|=o_p(r_n^{-1})$: the same rate.
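If it helps, the claim can also be checked numerically. Below is a small simulation sketch (not part of the proof); the concrete choices $C=2$, $r_n=\sqrt{n}$ and $X_n=C+Z/n$ with $Z\sim\mathcal N(0,1)$ are only illustrative assumptions, chosen so that $r_n|X_n-C|=|Z|/\sqrt{n}=o_p(1)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative example (my own choices, not from the question):
#   C = 2,  r_n = sqrt(n),  X_n = C + Z/n  with  Z ~ N(0, 1),
# so r_n * |X_n - C| = |Z| / sqrt(n) -> 0 in probability,
# i.e. |X_n - C| = o_p(1 / r_n).
C = 2.0
eps = 0.05          # the epsilon in the definition of o_p
reps = 200_000      # Monte Carlo replications per n

for n in (100, 1_000, 10_000, 100_000):
    r_n = np.sqrt(n)
    Z = rng.standard_normal(reps)
    X_n = C + Z / n                          # positive with overwhelming probability here
    lhs = r_n * np.abs(np.log(X_n) - np.log(C))
    freq = np.mean(lhs > eps)                # estimate of P(r_n |log X_n - log C| > eps)
    print(f"n = {n:>7d}   P(r_n |log(X_n) - log(C)| > {eps}) ~ {freq:.4f}")
```

The estimated probabilities shrink towards $0$ as $n$ grows, consistent with $|\log(X_n)-\log(C)|=o_p(r_n^{-1})$.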

Leonard Neon
  • Are you treating $C$ as a constant? OP seems to want the dependence of the convergence on $C$, in which case should the rate not be scaled by $1/C$ (derivative of log)? – E-A May 28 '21 at 01:06
  • @E-A The $o_P$ notation absorbs any constant factor, so it's not necessary to do the scaling if $C$ is a constant. However, the definition of $C$, i.e. “Let $C>0$ and…,” is indeed ambiguous. But even if $C$ is a random variable, it cannot appear in the rate of convergence, as it is random after all. – Ѕᴀᴀᴅ May 28 '21 at 03:36
  • @Saad You can certainly have rates depend on parameters ($o_p$ need not absorb every single parameter you use; there are many rates like $O(n/\epsilon^2)$). But yeah, I just assumed they had uniform convergence in $C$ earlier, and now they want the rate of convergence depending on $C$. Either way, I am not personally invested in this, so John, if you read this, do note that I think the rate of convergence you want would be $1/C$ times the previous one, and the proof should be exactly what is written here. – E-A May 28 '21 at 17:52
  • Thank you. Yes, $C$ is a constant. I guess that the conclusion follows by noting that for sufficiently small $\epsilon>0$ there exists a $\delta>0$ such that $ P(r_n|\log(X_n)-\log(C)|>\epsilon ) \leq P(r_n|X_n-C|> \delta) \to 0,$ right? – John May 29 '21 at 06:58
  • Actually, I just realized my comment above uses the reverse of the implication we actually show. Can you add some details about how to draw the final conclusion? – John May 29 '21 at 07:04
  • @John Yes, you are totally right, thanks! I have added an Edit part to my answer. – Leonard Neon May 29 '21 at 10:25