The following is a problem from chapter 5, "Limits", of Spivak's *Calculus*:
- (a) For $c>1$, show that $c^{1/n}=\sqrt[n]{c}$ approaches $1$ as $n$ becomes very large. Hint: Show that for any $\epsilon>0$ we cannot have $c^{1/n}>1+\epsilon$ for large $n$.
- (b) More generally, if $c>0$, then $c^{1/n}$ approaches $1$ as $n$ becomes very large.
Note that the concept of a sequence has not been introduced at this point in the book, only the basic definition of the limit of a function, so I am looking for solutions that use only the latter concept.
My question concerns part $(b)$.
Here is the solution to part $(a)$.
Assume $c>1$ and let $f(n)=c^{1/n}$.
Bernoulli's Inequality says that
$$\forall \epsilon>0\ \forall n\in\mathbb{N},\ 1<1+n\epsilon\leq (1+\epsilon)^n$$
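(For completeness, one standard way to see this inequality is induction on $n$: for $n=1$ the right-hand inequality holds with equality, and if $(1+\epsilon)^n\geq 1+n\epsilon$, then
$$(1+\epsilon)^{n+1}=(1+\epsilon)^n(1+\epsilon)\geq (1+n\epsilon)(1+\epsilon)=1+(n+1)\epsilon+n\epsilon^2\geq 1+(n+1)\epsilon,$$
since $\epsilon>0$.)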
Therefore, for each fixed $\epsilon>0$, the comparison $1+n\epsilon\leq(1+\epsilon)^n$ gives
$$\lim\limits_{n\to \infty}(1+n\epsilon)=\infty\implies \lim\limits_{n\to\infty}(1+\epsilon)^n=\infty$$
This means that
$$\forall \epsilon>0\ \forall M>0\ \exists N>0\ \forall n, n>N \implies (1+\epsilon)^n>M$$
If we specify $M=c$ then
$$\forall \epsilon>0\ \exists N>0\ \forall n,\ n>N \implies (1+\epsilon)^n>c \implies 0<c^{1/n}-1<\epsilon$$
(The last implication holds because taking $n$-th roots in $(1+\epsilon)^n>c$ gives $c^{1/n}<1+\epsilon$, while $c>1$ gives $c^{1/n}>1$.)
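As a purely numerical sanity check (not part of the proof), the following sketch illustrates the threshold suggested by Bernoulli's Inequality: with $N=\frac{c-1}{\epsilon}$, the first $n>N$ already satisfies $0<c^{1/n}-1<\epsilon$ for the sample values tried; the particular choices of $c$ and $\epsilon$ below are arbitrary.

```python
# Numerical sanity check for part (a), not a proof: Bernoulli's Inequality
# suggests the threshold N = (c - 1) / eps, and for the first n > N we expect
# 0 < c**(1/n) - 1 < eps.  The values of c and eps below are arbitrary samples.
import math

for c in [1.5, 2.0, 10.0, 1000.0]:      # sample values with c > 1
    for eps in [0.1, 0.01, 0.001]:      # sample values of eps > 0
        N = (c - 1) / eps               # threshold coming from 1 + n*eps > c
        n = math.floor(N) + 1           # the first integer n with n > N
        diff = c ** (1 / n) - 1
        assert 0 < diff < eps, (c, eps, n, diff)

print("0 < c**(1/n) - 1 < eps held in every sampled case")
```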
Now for part $(b)$.
Let $c>0$.
Fix $\epsilon>0$ and assume, for contradiction, that $c^{1/n}>1+\epsilon$ for all $n\in\mathbb{N}$.
Then $c>(1+\epsilon)^n\geq 1+n\epsilon$ for every $n\in\mathbb{N}$, by Bernoulli's Inequality.
But then $n<\frac{c-1}{\epsilon}$ for all $n \in \mathbb{N}$, whereas we can choose a natural number $n_1>\left \lfloor \frac{c-1}{\epsilon} \right \rfloor +1$, which satisfies $n_1>\frac{c-1}{\epsilon}$.
Therefore, we have a contradiction, and can infer that
$$\forall \epsilon>0\ \exists M\ \forall n,\ n>M \implies 0<c^{1/n}<1+\epsilon,$$
and hence
$$-1<c^{1/n}-1<\epsilon.\tag{1}$$
Note that if we had assumed $c>1$ in this proof, then $(1)$ would read $0<c^{1/n}-1<\epsilon$, and the proof could also have been used for part $(a)$. We assumed, however, only that $c>0$.
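(Again just a numerical illustration, not an argument: for $0<c<1$ the values $c^{1/n}$ sit below $1$ and increase toward it, so the estimate still missing after $(1)$ is the lower one, $1-\epsilon<c^{1/n}$. The sample values below are arbitrary.)

```python
# Numerical illustration for part (b) with 0 < c < 1 (arbitrary sample values):
# here c**(1/n) lies below 1 and increases toward it as n grows, which is why
# the lower estimate in (1) is the part that still has to be improved.
for c in [0.5, 0.1, 0.001]:
    values = [c ** (1 / n) for n in (1, 10, 100, 1000)]
    print(f"c = {c}: ", ", ".join(f"{v:.6f}" for v in values))
```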
How do we complete this proof? Is it possible with this proof by contradiction? Is there another way?