I am trying to prove the following limit using the $\epsilon$-$\delta$ definition:
$$\lim_{x\to1} \log(x)=0$$
That is - I must prove that for all $\epsilon>0$, there exists $\delta >0$ such that:
$$ 0<|x-1|< \delta \implies |\log x - 0|< \epsilon$$
So - my trouble is understanding what is "fair" to use here. I know the following:
- $x<e^x$ for all $x$, hence $\log(x)<x$ for $x>0$
And it would be nice if we could find that
- $\log(x)\leq x-1$
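I am not sure this is fair either, but the first bullet suggests one route to the second: if we may use the slightly stronger inequality $1+t\le e^t$ (for all real $t$), then substituting $t=x-1$ and taking logarithms (assuming $\log$ is increasing) gives exactly the bound I want:
$$1+t\le e^t \;\text{ for all } t \;\implies\; x\le e^{x-1} \;\implies\; \log(x)\le x-1 \;\text{ for } x>0.$$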
From this I think we get:
$$|\log(x)|\leq|x-1| \text{ for }x\in[1,\infty)\qquad |\log(x)|\geq|x-1| \text{ for }x\in(0,1]\tag{$\star$}$$
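To spell out the sign analysis behind $(\star)$ (assuming $\log$ is increasing and $\log(1)=0$):
$$x\ge1:\quad 0\le\log(x)\le x-1 \implies |\log(x)|\le|x-1|,$$
$$0<x\le1:\quad \log(x)\le x-1\le0 \implies |\log(x)|=-\log(x)\ge-(x-1)=|x-1|.$$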
The trouble for me is that I know $\log(x)\leq x-1$ is true because of two things:
Because $\log(1)=1-1=0$, both functions start at the same point, and since $\log'(x)=\frac{1}{x}$ while $(x-1)'=1$, $\log(x)$ increases more slowly than $x-1$ when $x>1$ and decreases faster than $x-1$ when $x<1$. But is it fair to use derivatives? In the book I am reading, we haven't even reached derivatives yet.
I also looked at a plot of both functions, which doesn't seem fair either.
Also, is it fair to assume we know that $\log(1)=0$? To me, knowing this seems to defeat the purpose of the problem somehow. As for the inequalities I obtained in $(\star)$: the first would be helpful, but I don't see how the second would help me.
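Concretely, the first inequality alone would settle the right side of $1$: given $\epsilon>0$, taking $\delta=\epsilon$ gives
$$x\ge1,\;0<|x-1|<\delta \implies |\log(x)|\le|x-1|<\epsilon,$$
but for $x<1$ the bound in $(\star)$ points the wrong way, so it gives no upper bound on $|\log(x)|$ there.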
So how can we proceed to prove this in a "fair" way?