16

I don't understand why some functions that contain a singularity in the domain of integration are integrable but others are not.

For example, consider $f(x) = -\log(x)$ and $g(x) = \frac{1}{x}$ on the interval $[0, 1]$. These functions look very similar when they are plotted but only $f(x)$ can be integrated.

  1. What is the precise mathematical reason that some functions with singularities are integrable while others are not?
  2. Are $\log$ functions the only functions with singularities that can be integrated, or are there other types of functions with singularities that can be integrated?
ManUtdBloke
  • How about $1/x^p$, $0<p<1$? –  Jan 02 '17 at 11:46
  • The integral on $[0,1]$ is the same as the integral on $(0,1)$. The reason is not the singularity; the reason is the divergence or convergence of the integral. –  Jan 02 '17 at 11:57

4 Answers

8

Think about it this way - what's the inverse?

$$y = \frac{1}{x}; x = \frac{1}{y}$$ $$y = -\log x; x = e^{-y}$$

Looking at it this way, it's clear that as $y$ shoots off to infinity, $x$ approaches zero much faster in one case than in the other.
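
To make "much faster" concrete, here is a small numerical sketch (an added illustration, not part of the original answer; assumes only the standard library): for a given height $y$, it compares the $x$ at which each curve reaches that height, $x = 1/y$ versus $x = e^{-y}$.

```python
import math

# For a given height y, where does each curve reach it?
#   y = 1/x      is reached at x = 1/y
#   y = -log(x)  is reached at x = e^(-y)
for y in (10, 50, 100):
    x_reciprocal = 1 / y
    x_neglog = math.exp(-y)
    print(f"y = {y:3d}:  1/x at x = {x_reciprocal:.0e},  -log(x) at x = {x_neglog:.0e}")
```

The sliver of area trapped between $-\log x$ and the $y$-axis is far thinner at every height, which is what lets its total area stay finite.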

user361424
  • Ok, so one approaches zero much faster, but how fast precisely does a function have to approach zero to make it integrable? What is the minimum 'speed' a function needs to have for it to be integrable? – ManUtdBloke Jan 06 '17 at 22:08
  • Well, there isn't a minimum exactly... it has to approach quickly enough that the inverse converges at infinity. So, for instance, $\frac{1}{x^{1-\epsilon}}$ would converge at zero, since $\frac{1}{x^{1+\epsilon}}$ converges at infinity. – user361424 Jan 07 '17 at 04:09
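
A quick symbolic check of the comment above, with the concrete choice $\epsilon = \tfrac12$ (an added sketch, not part of the original thread; assumes SymPy is available):

```python
from sympy import symbols, integrate, Rational, oo

x = symbols('x', positive=True)
eps = Rational(1, 2)

# 1/x^(1-eps) near the singularity at 0, and 1/x^(1+eps) out at infinity
near_zero   = integrate(x**-(1 - eps), (x, 0, 1))   # integral of 1/sqrt(x) on (0, 1)
at_infinity = integrate(x**-(1 + eps), (x, 1, oo))  # integral of x^(-3/2) on (1, oo)
print(near_zero, at_infinity)  # both come out finite (each equals 2)
```
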
8

It's just a question of whether or not the area under the curve is finite. It doesn't matter that there is an asymptote.

You might consider the area under the curves $y=e^{-x}$ and $y=1/x$ for $x>0$. These are really the same two curves you mention, just along the other axis.

It's akin to the idea that an infinite series may or may not converge; just because there are infinitely many terms in a series doesn't mean the series must diverge.
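
As a concrete companion to the series analogy (an added sketch, not part of the original answer), compare partial sums of the harmonic series with those of $\sum 1/n^2$: both have infinitely many positive terms, but only one of the partial-sum sequences stays bounded.

```python
# Partial sums: the harmonic series keeps creeping upward,
# while sum(1/n^2) settles toward pi^2/6 ~ 1.6449.
harmonic, p_series = 0.0, 0.0
for n in range(1, 1_000_001):
    harmonic += 1 / n
    p_series += 1 / n ** 2
    if n in (10, 1_000, 1_000_000):
        print(f"n = {n:>9,}:  sum 1/k = {harmonic:8.4f},  sum 1/k^2 = {p_series:.6f}")
```
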

MPW
  • I don't see how these are the "same two curves". One of them, $y=e^{-x}$, has a precise value at $x=0$ and so it can be integrated on the interval $[0, \infty)$, whereas $y=1/x$ gets arbitrarily large near zero and can't be integrated on that interval, so the area under this curve is undefined. I don't understand what point you are making in your post? – ManUtdBloke Jan 06 '17 at 22:00
  • @eurocoder : You misunderstand--I mean the two curves I mention are essentially the same two curves you ask about, just along the other axis. I added this to my answer to clarify. – MPW Jan 07 '17 at 04:56
  • They're not essentially the same for the purposes of integration: $\log x$ is not integrable on $(0,\infty)$ but $e^{-x}$ is. Perhaps you mean something else, but it's not clear from your wording what notion of "same" you are using and why it's relevant to the question at hand. – Adam Hughes Jan 08 '17 at 09:26
  • @AdamHughes : Of course, I'm talking about finding the area between the alternate curves and the other axis, as I already said. They are exactly the same thing. – MPW Jan 25 '17 at 14:16
  • @MPW maybe as curves in different coordinate systems, but integration uses a consistent orientation, so it doesn't amount to much mathematically to say they're the same in an integration context if they're not even being considered on the same ground for integration's necessary context. – Adam Hughes Jan 25 '17 at 14:20
  • @AdamHughes : Good grief. Are you just being argumentative? I don't think I'm really saying anything very profound, just trying to write it in terms of a function OP may realize is integrable. It should be obvious to you that $\int_0^1 (-\log x)\, dx$ and $\int_0^{\infty}e^{-y}\, dy$ are the same thing. You can think of the area as being either between the curve and the $x$-axis, or between the SAME curve and the $y$-axis. It's the same curve, the same area. After all, $y=-\log x$ is equivalent to $x = e^{-y}$. (See the quick check after this comment thread.) – MPW Jan 25 '17 at 14:27
  • @MPW I don't think you are understanding the criticism I'm putting forward: One of those functions has a singularity in its domain, the other doesn't. The fact that they happen to be inverse functions and both have infinite integrals isn't really relevant because the question doesn't ask about that. Once you do the inversion the singularity disappears because it's no longer in the domain, so you leave the context of the question. – Adam Hughes Jan 25 '17 at 15:16
  • @AdamHughes : That's just not the case. The domain in one case is $(0,1)$, and is $(0,\infty)$ in the other (the inclusion/exclusion of $1$ doesn't affect the value of the integral). Both are improper integrals: the first, because the function has a vertical asymptote on the boundary of the domain, and the second because the domain is infinite. What is really being computed in each case is $\lim_{r\to 0}\int_r^1$ and $\lim_{r\to\infty}\int_0^r$ (with appropriate integrands etc). – MPW Jan 25 '17 at 15:53
  • @MPW the op specifically notes a closed interval and says "singularity in the domain of integration." The question is not--to my reading--about general improper integrals, it's about ones with singularities. Even if you soften to an open interval, one improper integral is because of infinity, the other is because of a singularity on the boundary. – Adam Hughes Jan 25 '17 at 16:58
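
The equality MPW invokes can be verified symbolically (an added sketch, not part of the original exchange; assumes SymPy is available):

```python
from sympy import symbols, integrate, log, exp, oo

x, y = symbols('x y', positive=True)
area_vs_x = integrate(-log(x), (x, 0, 1))   # area between y = -log(x) and the x-axis
area_vs_y = integrate(exp(-y), (y, 0, oo))  # same region measured against the y-axis
print(area_vs_x, area_vs_y)  # both are 1
```
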
6

The key is how fast the function diverges.

Regarding your two examples, $-\log x$ goes off to infinity quite slowly as $x\to 0^+$, so it is integrable, whereas $x\mapsto \frac 1x$ blows up too fast to be integrable.

  • You have $$\int_a^1 -\log(x)\,\mathrm dx=1-a+a\log(a)\xrightarrow[a\to 0^+]{} 1<\infty.$$ So this function is integrable.

  • You have $$\int_a^1 \frac 1x\,\mathrm dx=-\log(a)\xrightarrow[a\to 0^+]{} +\infty.$$ So this function is not integrable.

Regarding your second question, $\log$ functions are by no means the only ones. To convince yourself, take for instance $x\mapsto \frac 1{\sqrt{x}}$ on $(0,1)$.
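
Both limits above, and the $1/\sqrt x$ example, can be reproduced symbolically (an added sketch, not part of the original answer; assumes SymPy is available):

```python
from sympy import symbols, integrate, limit, log, sqrt

x, a = symbols('x a', positive=True)

for f in (-log(x), 1 / x, 1 / sqrt(x)):
    F = integrate(f, (x, a, 1))          # integral from a to 1, as an expression in a
    print(f, '->', limit(F, a, 0, '+'))  # limit as a -> 0+: 1, oo, 2 respectively
```
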

E. Joseph
1

Simple: $\quad -\log x = o\Bigl(\dfrac1{\sqrt x}\Bigr)$ as $x\to 0^+$, and the integral of $\dfrac 1{\sqrt x}$ on $[0,1]$ is convergent.
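
A quick numerical look at that little-$o$ claim (an added sketch, not from the original answer; standard library only): the ratio $\dfrac{-\log x}{1/\sqrt x} = -\sqrt x\,\log x$ should shrink to $0$ as $x\to 0^+$.

```python
import math

# ratio of -log(x) to 1/sqrt(x), i.e. -sqrt(x)*log(x), as x -> 0+
for x in (1e-2, 1e-6, 1e-12, 1e-20):
    print(f"x = {x:.0e}:  -sqrt(x)*log(x) = {-math.sqrt(x) * math.log(x):.2e}")
```
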

Bernard
  • Where is the $o(1/\sqrt{x})$ coming from, and how does convergence of the integral of that function on the interval imply convergence of the integral of $-\log{x}$? – ManUtdBloke Jan 06 '17 at 22:05
  • The $o(1/\sqrt{x})$ corresponds to the high school limit $\sqrt x\log x\xrightarrow[x\to 0^+]{}0$. The convergence of the integral is a well-known theorem in asymptotic analysis. – Bernard Jan 06 '17 at 22:31
  • But the problem I have is why some functions with singularities are integrable and others aren't. So you are saying $\frac{1}{\sqrt{x}}$ is integrable, but we know $\frac{1}{x}$ is not. Why is this? It seems the exponent on $x$ in the denominator determines whether a function will be integrable. So if that is the case, 'where' precisely (for what value of the exponent) does a function fail to be integrable? – ManUtdBloke Jan 10 '17 at 08:17
  • Since the antiderivative of $\dfrac1{x^r}$ is $-\dfrac1{(r-1)x^{r-1}}$, the answer is $r-1<0$ (so that no power of $x$ is left in the denominator), i.e. $r<1$. (A quick check is sketched after these comments.) – Bernard Jan 10 '17 at 09:43
  • @Bernard Could you please name or point me to said theorem in asymptotic analysis? It's not that obvious to find, e.g. in the books of Temme or Wong, if you're just starting out. – Christoph90 Feb 21 '19 at 14:27
  • @Christoph90: It can be proved using high-school results: you can easily show (by comparing the derivatives and using the Mean value theorem) that $\log x<2\sqrt x$ for all $x>1$. The assertion follows. – Bernard Feb 21 '19 at 14:33
  • You misread, my question concerned the other part of your above comment: Why does the convergence of the integral over a function B generally imply convergence of the integral of function A if A = o(B) (or how do you show that)? – Christoph90 Feb 21 '19 at 14:36
  • Actually, the ‘theorem’ (a great word for a simple observation) is valid for non-negative functions $f$ and $g$ such that $f=O(g)$: if the integral of $g$ converges, the integral of $f$ does too (obvious since the integral preserves inequalities). As $f=o(g)\implies f=O(g)$, the result follows. – Bernard Feb 21 '19 at 14:58
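
Bernard's cutoff at $r=1$ can be checked symbolically as well (an added sketch, not part of the original comments; assumes SymPy is available):

```python
from sympy import symbols, integrate, Rational, oo

x = symbols('x', positive=True)

# integral of x^(-r) on (0, 1): finite exactly when r < 1
for r in (Rational(1, 2), Rational(9, 10), 1, Rational(3, 2)):
    print(f"r = {r}:  integral = {integrate(x**-r, (x, 0, 1))}")
```
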