
What is a clever and robust argument to show that for any $x>0$ and $a>1$, $n^x \in o(a^n)$? Equivalently: for all $c>0$, there is an $n_0 >0$ such that $n \geq n_0 \Rightarrow n^x < ca^n$.

Here's my argument: since $\log_n(ca^n)$ is an increasing function for any $a,c >0$, there is always an $n_0$ such that $\log_{n_0}(ca^{n_0}) > x$. As a result, for all $n \geq n_0$, $ca^n = n^{\log_{n}(ca^{n})} \geq n^{\log_{n_0}(ca^{n_0})} > n^x $.

I'm not really happy with this because it doesn't give a precise and nice definition of $n_0$. Is my proof correct, and is there a better/more clever way?

  • Why do you need "for all $c > 0$"? Are you trying to prove that $n^x$ is $o(a^n)$ or $O(a^n)$? (Both happen to be true, but the definition of $O$ only requires there to exist such a constant $c$.) – ShreevatsaR Sep 14 '17 at 20:18
  • Sorry, I want to prove $o$, I'll edit it. Ty. – Very Noob CS Student Sep 14 '17 at 20:38
  • The way your question is worded, you are asking to show that $n^x$ is $O(a^n)$. If you want little $o$, then you are asking to prove that $\lim_{n\to\infty} n^x/a^n = 0$. – Alex Ortiz Sep 14 '17 at 20:41

2 Answers


You need your $\log_n(ca^n)$ not only to be increasing, but to actually go to infinity. Apart from that, your proof looks fine.

Here's how I would prove it. Tweaking the notation a little*, we want to prove that $n^b = o(a^n)$ (for fixed constants $b > 0$ and $a > 1$), or in other words that for any given constant $c > 0$, we have $n^b < c a^n$ for sufficiently large $n$.

Just solve for $n$. Note that $n^b = e^{b \ln n}$, and $a^n = e^{n \ln a}$. So we want $e^{b \ln n} < e^{\ln c + n \ln a}$, or $b \ln n < \ln c + n \ln a$, or equivalently $n \ln a - b \ln n > -\ln c$. This is something you can clearly see is true for sufficiently large $n$: the LHS goes to infinity, while the RHS is a fixed constant. This you can now prove.
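(A quick numerical illustration, not part of the proof: with some arbitrary sample constants, watch the LHS $n \ln a - b \ln n$ overtake the fixed RHS $-\ln c$ as $n$ grows.)

```python
import math

# Arbitrary sample constants: a > 1, b > 0, c > 0.
a, b, c = 1.1, 3.0, 0.01

# The inequality n^b < c*a^n is equivalent, after taking logs,
# to n*ln(a) - b*ln(n) > -ln(c).  The LHS tends to infinity,
# so the inequality holds once n is large enough.
for n in [10, 100, 1000]:
    lhs = n * math.log(a) - b * math.log(n)
    print(n, lhs > -math.log(c))  # False for small n, True eventually
```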


If you wanted a “precise and nice definition for $n_0$” (but really you shouldn't want it: note that the whole point of the $O$ notation is to suppress such information as nonessential), you could probably do the following: pick $n_0$ to be large enough such that:

  • $b \ln n_0 < 0.5n_0 \ln a$ (this just means $n_0 / \ln n_0 > 2b/\ln a$)

    and such that

  • $0.5n_0 \ln a > -\ln c$ (this just means $n_0 > -2\ln c/\ln a$)

(That is, pick $n_0$ to be the maximum of the points where the two conditions hold.) Then for $n \ge n_0$, we have

$$ n \ln a - b \ln n > n \ln a - 0.5n \ln a = 0.5n \ln a > -\ln c$$ and therefore $c a^n > n^b$ as required.

(The way I arrived at this trick from “$n \ln a - b \ln n > -\ln c$” is to see that the LHS was basically $O(n)$, so the pesky thing being subtracted could be bounded by say $0.1n$ or $0.5n$… and carrying the factors along.)
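(If you want to see this two-condition recipe in action, here is a sanity check with arbitrary sample constants: find the smallest integer satisfying both conditions by brute force, then confirm $n^b < ca^n$ from there on. This is an illustration, not part of the proof.)

```python
import math

a, b, c = 1.1, 3.0, 0.01  # arbitrary sample constants: a > 1, b > 0, c > 0
ln_a, ln_c = math.log(a), math.log(c)

# Smallest integer n0 >= 2 satisfying both conditions:
#   (1) b*ln(n0) < 0.5*n0*ln(a)   (i.e. n0/ln(n0) > 2b/ln(a))
#   (2) 0.5*n0*ln(a) > -ln(c)     (i.e. n0 > -2*ln(c)/ln(a))
n0 = 2
while not (b * math.log(n0) < 0.5 * n0 * ln_a and 0.5 * n0 * ln_a > -ln_c):
    n0 += 1

# From n0 on: n*ln(a) - b*ln(n) > 0.5*n*ln(a) > -ln(c), i.e. n^b < c*a^n.
ok = all(n**b < c * a**n for n in range(n0, n0 + 1000))
print(n0, ok)
```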


If you want something even more explicit: to prove that $n^b < ca^n$ for sufficiently large $n$, assume without loss of generality that $c \le 1$ (otherwise if $c > 1$ we could just prove the stronger inequality with $c = 1$), pick $$n_0 = e + \left(\frac{b + \ln(1/c)}{\ln a}\right)^2$$ so that, for all $n \ge n_0$, we have:

  • $\dfrac{n}{\ln n} > \sqrt{n}$ (true for all $n > 1$), and
  • $\sqrt{n} > \dfrac{b + \ln(1/c)}{\ln a}$

which together give

$$ \frac{n}{\ln n} > \frac{b + \ln(1/c)}{\ln a}$$

or

$$ n \ln a > b \ln n + \ln(1/c)\ln n > b \ln n + \ln(1/c) $$

and therefore

$$a^n > n^b (1/c)$$

which is the same as $n^b < c a^n$ that we wanted to prove.
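(And a sanity check of this fully explicit $n_0$, again with arbitrary sample constants and not as part of the proof: plug the closed-form threshold in and verify $n^b < ca^n$ from there on.)

```python
import math

def explicit_n0(a, b, c):
    """Closed-form threshold n0 = e + ((b + ln(1/c)) / ln(a))**2,
    assuming c <= 1 (for c > 1, proving it with c = 1 is stronger)."""
    c = min(c, 1.0)
    return math.e + ((b + math.log(1 / c)) / math.log(a)) ** 2

# Arbitrary sample constants; check n^b < c*a^n for n >= n0.
a, b, c = 1.5, 2.0, 0.1
n0 = math.ceil(explicit_n0(a, b, c))
ok = all(n**b < c * a**n for n in range(n0, n0 + 500))
print(n0, ok)
```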


But if you don't want a “precise and nice definition for $n_0$”, then the nicer way to prove this (and to think about this) would be to write something like:

$$n^b = e^{O(\ln n)} = e^{o(n)} = o(a^n)$$

after proving that each of these (one-way) equalities is generally valid. That is the whole power of the notation (or the idea of asymptotic analysis): to avoid having to think about specific constants where things happen, while still being completely formal and rigorous.


[* Footnote on notation: I used $b$ instead of $x$ to indicate that it is a constant. And I don't like the subset notation for $O$, so I will not indulge in that misguided pedantry. You can read the treatments by de Bruijn and by Knuth to see that we lose much of the value of $O$ notation by defining $O(x)$ as a set; for example, statements like $e^{O(1)} = O(1)$ become harder to write. See also the “four reasons” for preferring “$=$” over “$\subseteq$”, on pp. 446–447 of Concrete Mathematics: actually just work through the whole Asymptotics chapter and see if you still prefer thinking of $O(\ldots)$ as sets.]

ShreevatsaR
  • For the footnote, you may want to weigh in at https://math.stackexchange.com/questions/86076/what-are-the-rules-for-equals-signs-with-big-o-and-little-o – hmakholm left over Monica Sep 14 '17 at 21:34
  • @HenningMakholm Nice question and answers :-) There's a clear (IMO) line indicated by de Bruijn: we simply refuse to assign any formal meaning to $O(f(n))$ in isolation (such as declaring that it denotes a set of functions) (we do retain an informal/intuitive meaning), and we only assign meanings to full statements (including an $=$ sign) in which an $O$ occurs on either side. I think it's closer to your answer than to Srivatsan's. (I also think it will be clearer to start with the $L$ or $A$ examples.) Let me know if you'd like me to post an answer at that question along those lines. :-) – ShreevatsaR Sep 14 '17 at 21:49
  • I accepted your answer. I didn't think of splitting the inequality into one half > first part and one half > second part, it was way too clever for me to think of. Though, this is still not entirely what I'm looking for. $n_0/\ln n_0 > 2b/\ln a$ still does not give $n_0$ in explicit form. – Very Noob CS Student Sep 14 '17 at 23:01
  • @VeryNoobCSStudent Added an explanation for how I arrived at the trick… and I've just added an even more explicit expression for $n_0$ for you. :-) (Even though I think such explicit expressions miss the point and obfuscate what is going on.) – ShreevatsaR Sep 15 '17 at 00:07
  • I like the new method much more; instead of switching bases, we take the logarithms, make a $\ln n$ appear (which is too clever for me), then find $n_0$ by going through a simpler, smaller function (which is also too clever for me). At least I'm able to understand this, sigh. I will never understand how people think of clever tricks like those. – Very Noob CS Student Sep 15 '17 at 01:09
  • @VeryNoobCSStudent I can tell you how I thought of it: (1) Whenever there is exponentiation (something is raised to another power), take logarithms. This immediately turns $n^b < ca^n$ into $b\ln n < \ln c + n\ln a$ and there is no longer any exponentiation to think about. (2) Actually think about the asymptotics, how “big” the different terms are. The term $b\ln n$ is $O(\ln n)$. The term $\ln c$ is a constant $O(1)$ and you can almost ignore it. The term $n\ln a$ is $\sim n$. So all you're trying to prove is that something that is $O(\ln n)$ is less than something that is $\sim n$. – ShreevatsaR Sep 15 '17 at 03:46
  • @VeryNoobCSStudent More concretely, I looked at $b \ln n < \ln c + n \ln a$, tried to get all the $n$s in one place (like middle-school algebra), and ended up with $$\frac{b}{\ln a} < \frac{\ln c}{\ln a \ln n} + \frac{n}{\ln n}$$ -- next I realized that the $\ln n$ in the denominator of the $c$ term can be ignored (fortunately) as larger $n$ only makes the inequality “easier” (by making that term smaller), so we're left with trying to prove that $\frac{n}{\ln n} > \mathrm{something}$. If we want this explicitly, we can replace $\frac n{\ln n}$ with something easier, and I thought of $\sqrt n$. – ShreevatsaR Sep 15 '17 at 03:50
  • @ShreevatsaR: I had hoped you might post an answer with a formalism backed by more authority than my own synthesis-from-practice. – hmakholm left over Monica Sep 17 '17 at 14:20

Hint:

Let $k=\lceil x\rceil$. To prove $n^x=o(a^n)$, it is enough to prove $n^k=o(a^n)$.

Denote $u_n=\dfrac{n^k}{a^n}$. A standard argument to show $(u_n)\to 0$ consists in proving $\;\dfrac{u_{n+1}}{u_n}$ has a limit $\ell <1$.

Indeed, if this is the case, take any $r$ such that $\ell<r<1$. There exists an integer $N>0$ such that $\dfrac{u_{n+1}}{u_n}<r$ for all $n\ge N$. A trivial induction then gives $$0<u_n<u_N\, r^{n-N}=\frac{u_N}{r^N}r^n,$$i.e. if $n$ is large enough, $(u_n)$ is bounded from above by a geometric sequence with ratio $r<1$, which therefore converges to $0$. Hence, by the squeezing principle, $(u_n)$ converges to $0$.

Last step: let's check $\dfrac{u_{n+1}}{u_n}$ converges to a limit $<1$: $$\frac{u_{n+1}}{u_n}=\frac{(n+1)^k}{a^{n+1}}\frac{a^n}{n^k}=\Bigl(1+\frac1n\Bigr)^k\frac1a\to 1\cdot\frac 1a<1\quad\text{by hypothesis.}$$
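(A quick numerical illustration of this ratio argument, with arbitrary sample values for $a$ and $k$: the ratios $u_{n+1}/u_n$ are already close to $1/a$ for moderate $n$, and $u_n$ itself becomes tiny.)

```python
import math

a, k = 2.0, 5  # arbitrary sample values: a > 1, k = ceil(x)

def u(n):
    # u_n = n^k / a^n, the sequence from the hint
    return n**k / a**n

# The ratios u_{n+1}/u_n approach 1/a = 0.5 ...
ratios = [u(n + 1) / u(n) for n in range(100, 105)]
print(all(0.5 < r < 0.53 for r in ratios))

# ... so u_n is squeezed down by a geometric sequence and goes to 0.
print(u(200) < 1e-40)
```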

Bernard