
I am trying to prove the following limit with the definition:

$$\lim_{x\to1} \log(x)=0$$

That is, I must prove that for all $\epsilon>0$, there exists $\delta >0$ such that:

$$ 0<|x-1|< \delta \implies |\log x - 0|< \epsilon$$

So my trouble is understanding what is "fair" to use here. I know the following:

  • $x<e^x \implies \log(x)<x$

And it would be nice if we could find that

  • $\log(x)\leq x-1$

From this I think we get:

$$|\log(x)|\leq|x-1| \text{ for }x\in[1,\infty)\qquad |\log(x)|\geq|x-1| \text{ for }x\in(0,1]\tag{$\star$}$$

The trouble for me is that I know $\log(x)\leq x-1$ is true because of two things:

  • Because $\log(1)=1-1=0$, so $\log(x)$ and $x-1$ start at the same point, and since $\log'(x)=\frac{1}{x}$ and $(x-1)'=1$, $\log(x)$ increases more slowly than $x-1$ when $x>1$ and decreases faster than $x-1$ when $x<1$. But is it fair to use derivatives? In the book I am reading, we haven't even reached derivatives yet.

  • I looked at a plot of both functions, which also doesn't seem fair.

Also, is it fair to assume we know that $\log(1)=0$? To me, knowing this seems to defeat the purpose of the problem somehow. As for the inequalities I obtained in $(\star)$: the first would be helpful, but I don't see how the second would help me.

So how can we proceed to prove this in a "fair" way?

Red Banana
  • +1 to your posting, partly for reasonable work shown, and partly for asking outstanding questions: e.g. "But is it fair to use derivatives?", "is it fair to assume we know that log(1)=0?". Derivatives were my first thought, but if you haven't been taught about them, then you have to ask what the intent of the problem composer is. What definition of natural logarithms have you been given? What previous worked examples, theorems, or problems leading up to this one do you think might be pertinent? Have you been taught, in your class, that $\log(1) = 0$? – user2661923 Aug 16 '22 at 04:37
  • @user2661923 I'm not really in a class. It's a Russian book by Efimov called "Problems in Higher Mathematics". I decided to work through the whole book to get better at basic mathematics, but it's not really clear what prior knowledge the book assumes of its readers; perhaps all the mathematics taught in a Russian high school? – Red Banana Aug 16 '22 at 04:45
  • My doubt is: how did the book define $\log(x)$ and $e^x$ without any calculus concepts? – Clemens Bartholdy Aug 16 '22 at 04:46
  • @Beautifullyirrational Yeah, there is also that problem, which I hadn't realized. Perhaps it's "fair" to use the definitions of $\log(x)$ and $e^x$ we see in calculus? – Red Banana Aug 16 '22 at 04:48
  • Substitute $y=x-1$, i.e. $x=1+y$; then the new limit may become easier to visualize. – Prem Aug 16 '22 at 04:55
  • Using any specific definition of $\log x$, one should first establish the well-known standard inequality $\log x\leq x-1$ for all $x>0$. This allows us to deduce $\frac{x-1}{x}\leq \log x\leq x-1$ for $x>0$. Now use $\epsilon, \delta$ (see the sketch just below this list). – Paramanand Singh Aug 16 '22 at 10:51
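
For concreteness, here is one way the last comment's suggestion can be finished; this is a sketch, assuming the two-sided inequality $\frac{x-1}{x}\leq \log x\leq x-1$ for $x>0$ has already been established. If $0<|x-1|<\delta\leq\frac{1}{2}$, then $x>\frac{1}{2}$, so

$$|\log x|\leq\max\left(|x-1|,\frac{|x-1|}{x}\right)\leq 2|x-1|<2\delta.$$

Hence $\delta=\min\left(\frac{1}{2},\frac{\epsilon}{2}\right)$ works.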

2 Answers


I guess the fairness depends on your definition of $\log(x)$.

Version 1 (historical). We may define $\log(x)$ as $\int_{1}^{x}\frac{dt}{t}$ (essentially the work in an isothermal process) and $e^x$ as the inverse function of $\log(x)$. In this way, $\log(x)$ is differentiable, increasing, concave and fulfills $\log(xy)=\log(x)+\log(y)$, so the inverse function $\exp(x)$ is differentiable, increasing, convex and fulfills $\exp(x+y)=\exp(x)\exp(y)$.

Version 2 (current). We may define $\exp(x)$ as $\sum_{n\geq 0}\frac{x^n}{n!}$ or as the unique solution of the Cauchy problem $f'(x)=f(x)$ with $f(0)=1$. In this way, $e^x$ is differentiable, increasing, convex and fulfills $\exp(x+y)=\exp(x)\exp(y)$. Naming $\log(x)$ the inverse function of $\exp(x)$, we have that $\log(x)$ is differentiable, increasing, concave and fulfills $\log(xy)=\log(x)+\log(y)$.

Tomato, tomato: in the first version, for any $\varepsilon\in(-1,1)$ we have $$ \log(1+\varepsilon)=\int_{0}^{\varepsilon}\frac{dt}{1+t}\Longrightarrow \left|\log(1+\varepsilon)\right|\leq \frac{|\varepsilon|}{1-|\varepsilon|}, $$ since $\frac{1}{1+t}\leq\frac{1}{1-|\varepsilon|}$ over the whole interval of integration. In the second version we have $$ \exp\left(\lim_{z\to 1}\log(z)\right)=\lim_{z\to 1}z=1 $$ and in both cases $\lim_{z\to 1}\log(z)=0$.
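
As a quick numeric sanity check of the bound $\left|\log(1+\varepsilon)\right|\leq \frac{|\varepsilon|}{1-|\varepsilon|}$, here is a short Python sketch (purely illustrative, not part of the proof):

```python
import math

# Check |log(1+eps)| <= |eps| / (1 - |eps|) for a few eps in (-1, 1).
for eps in (-0.9, -0.5, -0.1, 0.1, 0.5, 0.9):
    lhs = abs(math.log(1 + eps))        # |log(1+eps)|
    rhs = abs(eps) / (1 - abs(eps))     # claimed upper bound
    assert lhs <= rhs
    print(f"eps = {eps:+.1f}: {lhs:.6f} <= {rhs:.6f}")
```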

Jack D'Aurizio

One way to define $\ln x$ is as $$\int_1^x\frac{dt}{t}.$$

Then if we take the limit we get $$\lim_{x\to1}\ln x=\lim_{x\to1}\int_1^x\frac{dt}{t}=\int_1^1\frac{dt}{t}=0,$$ if we know $\ln x$ is continuous at $1$.

But in fact, by the fundamental theorem of calculus, it's differentiable there.
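
If one wants an explicit $\epsilon$-$\delta$ argument from this definition instead of appealing to continuity, a sketch: for $0<|x-1|<\delta\leq\frac{1}{2}$ we have $x>\frac{1}{2}$, so $\frac{1}{t}\leq 2$ on the interval between $1$ and $x$, and therefore

$$|\ln x|=\left|\int_1^x\frac{dt}{t}\right|\leq 2|x-1|<2\delta.$$

Taking $\delta=\min\left(\frac{1}{2},\frac{\epsilon}{2}\right)$ gives $|\ln x|<\epsilon$.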