4

Previously I posted this question asking for a review of the proof, but I realized that proof is wrong because I can't assume the continuity of $f$. So here is another attempt at the proof.

If $f\in R$ on $[a,b]$ and $g$ is a monotonic function on $[a,b],$ then there exists $\epsilon \in [a,b]$ such that $$\int_a^b fg=g(a)\int_a^{\epsilon}f+g(b)\int_{\epsilon}^b f. \tag{$*$}$$
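Not a proof, just to fix ideas: below is a quick numerical sanity check of $(*)$, a minimal Python sketch in which $f(x)=\sin x$, $g(x)=e^x$ and the interval $[0,2]$ are arbitrary choices (not part of the problem), and the intermediate point is located by a crude grid scan.

```python
import math

a, b = 0.0, 2.0
f = math.sin    # sample integrand (arbitrary choice)
g = math.exp    # sample monotonic weight (arbitrary choice)

def integral(func, lo, hi, n=2000):
    """Midpoint Riemann sum of func over [lo, hi]."""
    if hi <= lo:
        return 0.0
    step = (hi - lo) / n
    return sum(func(lo + (k + 0.5) * step) for k in range(n)) * step

lhs = integral(lambda x: f(x) * g(x), a, b)   # left-hand side of (*)

# Scan candidate points for one where g(a)*int_a^e f + g(b)*int_e^b f matches int_a^b fg.
candidates = [a + (b - a) * k / 400 for k in range(401)]
mismatch, eps = min(
    (abs(g(a) * integral(f, a, c) + g(b) * integral(f, c, b) - lhs), c)
    for c in candidates
)
print("int fg ~", lhs, "  epsilon ~", eps, "  mismatch ~", mismatch)
```

The mismatch is only as small as the grid and quadrature allow, but it illustrates that such an intermediate point does show up in practice.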

Proof: (attempt)

Let $F(x)=g(a)\int_a^{x}f+g(b)\int_{x}^bf.$ Thus $F$ is continuous.

Also, $F(a)=g(b)\int_a^bf$ and $F(b)=g(a)\int_a^bf$.

Now, if $\int_a^bfg$ is between $F(a)$ and $F(b)$, then applying the Intermediate Value Theorem to $F$ gives something like $(*)$.

But how can I find a 'bound' for $\int_a^bfg$ showing that $\int_a^bfg$ is between $F(a)$ and $F(b)$?

Is this idea for the proof correct?

Note: I can't use measure theory for the proof because I haven't seen anything about it (I haven't taken a course in measure theory).

user441848
  • *measure theory – user441848 Jun 22 '17 at 00:49
  • When you say apply the Intermediate Value Theorem, you need to be more specific. What are you applying it on? And what exactly is that something a function of? – Paul Jun 22 '17 at 00:52
  • You don't assume $f\geq 0$? – Smurf Jun 22 '17 at 01:26
  • No, should I assume it? @Smurf – user441848 Jun 22 '17 at 01:29
  • @Paul Maybe it's not the IVT, I don't know; I thought I had seen the proof using it, but now I'm confused. – user441848 Jun 22 '17 at 01:31
  • 1
    Assuming that would make things easier, but it isn't given. There is a problem with your approach: the function $F$ need not be monotonic, so even though it attains that value somewhere, $\int_a^b fg$ need not lie between the two endpoint values $F(a)$ and $F(b)$. Picture an odd $f$ on an interval symmetric about $0$ (which makes $\int f=0$), so that $F(a)=F(b)=0$, then pick any $g$ that makes $\int fg\neq 0$. – Smurf Jun 22 '17 at 01:45

1 Answer

6

You are trying to prove the second mean value theorem for integrals under very weak assumptions. If you have stronger conditions, like continuity or differentiability, there are easier proofs. In fact, if you know that $f$ is non-negative, then you can conclude immediately: $g(x)$ is always between $g(a)$ and $g(b)$ and $\int_a^b f \geqslant 0$, which implies that $\int_a^b fg$ is between $g(a)\int_a^b f$ and $g(b)\int_a^b f$, i.e., between your $F(b)$ and $F(a)$.

I can provide a very general proof given your hypotheses. Suppose that $g$ is non-decreasing (a similar argument applies if $g$ is non-increasing). Then $h(x) = g(x) - g(a)$ is non-decreasing and non-negative.

We have the following lemma:

Suppose $f$ is Riemann integrable and $h$ is non-decreasing and non-negative. Let $F(x) = \int_x^b f$. If $A \leqslant F(x) \leqslant B$ for all $x \in [a,b],$ then $h(b)A \leqslant \int_a^b f h \leqslant h(b)B$.

Since $F$ is continuous on the compact interval $[a,b]$, the bounds $A = \inf_{x \in [a,b]} F(x)$ and $B = \sup_{x \in [a,b]} F(x)$ are finite and attained. The lemma gives $h(b)A \leqslant \int_a^b fh \leqslant h(b)B$, so by the IVT there exists $\xi \in [a,b]$ such that

$$\int_a^b f h = h(b)F(\xi) = h(b) \int_\xi^b f.$$

Thus,

$$\int_a^b fg - g(a)\int_a^b f= \int_a^bfh = h(b) \int_\xi^bf = g(b)\int_\xi^bf - g(a) \int_\xi^bf.$$

Adding $g(a)\int_a^bf$ to both sides, we get

$$\int_a^b fg = g(b)\int_\xi^bf - g(a)\int_\xi^b f + g(a)\int_a^bf = g(b)\int_\xi^bf + g(a)\int_a^\xi f, $$

which is exactly $(*)$.
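For concreteness, here is a small numerical sketch of the construction above (illustration only, not part of the proof). The choices $f(x) = x e^{-x}$, $g = \arctan$ and the interval $[-1,2]$ are mine, and the IVT point $\xi$ is located by a crude grid search rather than an exact argument.

```python
import math

a, b = -1.0, 2.0
f = lambda x: x * math.exp(-x)   # sample f that changes sign on [-1, 2] (arbitrary choice)
g = math.atan                    # sample non-decreasing g (arbitrary choice)
h = lambda x: g(x) - g(a)        # as above: non-decreasing and non-negative

def integral(func, lo, hi, m=2000):
    """Midpoint Riemann sum of func over [lo, hi]."""
    step = (hi - lo) / m
    return sum(func(lo + (k + 0.5) * step) for k in range(m)) * step

F = lambda x: integral(f, x, b)                  # F(x) = int_x^b f
I_fh = integral(lambda x: f(x) * h(x), a, b)     # int_a^b f h

# IVT step: pick the grid point xi where h(b) * F(xi) is closest to int_a^b f h.
grid = [a + (b - a) * k / 400 for k in range(401)]
xi = min(grid, key=lambda x: abs(h(b) * F(x) - I_fh))

# Check that xi (approximately) satisfies the original identity (*).
lhs = integral(lambda x: f(x) * g(x), a, b)
rhs = g(a) * integral(f, a, xi) + g(b) * integral(f, xi, b)
print("xi ~", xi, "  int fg ~", lhs, "  g(a)int_a^xi f + g(b)int_xi^b f ~", rhs)
```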

It remains to prove the lemma. This can be done using an argument based on Riemann sums.

Taking any partition $P = (x_0,x_1, \ldots, x_n)$ of $[a,b]$ (so $x_0 = a$ and $x_n = b$), consider the Riemann sums

$$S_P = \sum_{k=1}^n f(x_k) h(x_k)(x_k -x_{k-1}), \\ S_{P,j} = \sum_{k = j}^n f(x_k)(x_k - x_{k-1}),$$

which converge to $\int_a^b fh $ and $\int_{x_{j-1}}^b f$, respectively, as the partition is refined.

Since $f(x_k)(x_k - x_{k-1}) = S_{P,k} - S_{P,k+1}$ (with the convention $S_{P,n+1} = 0$), we have

$$S_P = \sum_{k=1}^n h(x_k)(S_{P,k} - S_{P,k+1}) \\ = h(x_1)S_{P,1} + (h(x_2) - h(x_1))S_{P,2} + \ldots + (h(x_n) - h(x_{n-1}))S_{P,n}. $$
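This rearrangement is just summation by parts (Abel summation). As an illustration only, here is a small sketch that checks the identity on a random partition; the sample $f$ and $h$ are arbitrary choices of mine.

```python
import math
import random

random.seed(0)
a, b = 0.0, 1.0
f = lambda x: math.sin(5 * x) - 0.3   # arbitrary sample integrand
h = lambda x: x + x ** 3              # non-decreasing and non-negative on [0, 1]

# Random partition a = x_0 < x_1 < ... < x_n = b.
x = [a] + sorted(random.random() for _ in range(8)) + [b]
n = len(x) - 1

# Direct sum  S_P = sum_k f(x_k) h(x_k) (x_k - x_{k-1}).
S_P = sum(f(x[k]) * h(x[k]) * (x[k] - x[k - 1]) for k in range(1, n + 1))

# Tail sums  S_{P,j} = sum_{k=j}^n f(x_k)(x_k - x_{k-1}),  with S_{P,n+1} = 0.
S = [None] + [sum(f(x[k]) * (x[k] - x[k - 1]) for k in range(j, n + 1))
              for j in range(1, n + 1)] + [0.0]

# Rearranged (Abel-summed) form: h(x_1) S_{P,1} + sum_{k>=2} (h(x_k) - h(x_{k-1})) S_{P,k}.
abel = h(x[1]) * S[1] + sum((h(x[k]) - h(x[k - 1])) * S[k] for k in range(2, n + 1))

print(S_P, abel)   # the two agree up to floating-point rounding
```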

Let $\hat{A}$ and $\hat{B}$ be the lower and upper bounds (the minimum and maximum) of the finite set $\{S_{P,k}\}$. Since $h$ is non-decreasing, $h(x_k) - h(x_{k-1}) \geqslant 0$ and

$$\tag{1}\hat{A} h(b) = \hat{A} h(x_n) \leqslant S_P \leqslant \hat{B} h(x_n) = \hat{B} h(b).$$

As the partition is refined, the sum $S_P$ converges to $\int_a^b fh $ and it can be shown that $\hat{A} \to A$ and $\hat{B} \to B.$

For any $\epsilon > 0$, we can find a sufficiently fine partition $P$ such that

$$\tag{2}S_P - \epsilon < \int_a^b fh < S_P + \epsilon,$$

and for all $j$,

$$\tag{3}\int_{x_{j-1}}^b f - \epsilon < S_{P,j} < \int_{x_{j-1}}^b f + \epsilon .$$

Now (1) and (2) imply

$$\tag{4} \hat{A}h(b) - \epsilon < \int_a^b fh < \hat{B}h(b) + \epsilon.$$

Since, by (3), for every $j$,

$$A - \epsilon = \inf_{x \in [a,b]}\int_x^b f - \epsilon \leqslant \int_{x_{j-1}}^b f - \epsilon < S_{P,j} < \int_{x_{j-1}}^b f + \epsilon \leqslant \sup_{x \in [a,b]}\int_x^b f + \epsilon = B + \epsilon,$$

we have $A - \epsilon \leqslant \hat{A}$ and $\hat{B} \leqslant B + \epsilon,$ which along with (4) implies that

$$(A - \epsilon) h(b) - \epsilon < \int_a^bfh < (B + \epsilon) h(b) + \epsilon,$$

and

$$\tag{5}Ah(b) - [1 + h(b)]\epsilon < \int_a^b fh < B h(b) + [1 + h(b)] \epsilon.$$

Therefore, since $\epsilon > 0$ can be arbitrarily small,

$$h(b)A \leqslant \int_a^b f h \leqslant h(b)B.$$
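As a numerical sanity check of the lemma (illustration only, not part of the argument): with arbitrary sample choices $f(x) = \cos 2x$ and $h(x) = x^2$ on $[0,3]$, and with $A$, $B$ and the integrals approximated by Riemann sums, the inequality indeed holds.

```python
import math

a, b = 0.0, 3.0
f = lambda x: math.cos(2 * x)   # sample Riemann-integrable f (arbitrary choice)
h = lambda x: x ** 2            # non-decreasing and non-negative on [0, 3] (arbitrary choice)

def integral(func, lo, hi, m=2000):
    """Midpoint Riemann sum of func over [lo, hi]."""
    step = (hi - lo) / m
    return sum(func(lo + (k + 0.5) * step) for k in range(m)) * step

F = lambda x: integral(f, x, b)                       # F(x) = int_x^b f
Fs = [F(a + (b - a) * k / 300) for k in range(301)]
A, B = min(Fs), max(Fs)                               # approximate inf and sup of F

I = integral(lambda x: f(x) * h(x), a, b)             # int_a^b f h
print(h(b) * A <= I <= h(b) * B, (h(b) * A, I, h(b) * B))
```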

RRL
  • $h(x)$ is non-decreasing because $x>a$ right? – user441848 Jun 22 '17 at 03:17
  • You're welcome. Yes, it's non-decreasing because $g$ is, and non-negative because $g(x) \geqslant g(a)$ for $x > a$ under my assumption that $g$ is non-decreasing. But, as I said, your proof is fine when $f(x) \geqslant 0$. The more general case requires more work, as I showed. – RRL Jun 22 '17 at 03:20
  • Oh, OK. And why did you say that $F$ is continuous, and that therefore $A,B$ exist? – user441848 Jun 22 '17 at 03:25
  • If $f$ is Riemann integrable it is bounded: $|f(x)| \leqslant M$. Thus $F(x) = \int_x^b f(t) \, dt$ is continuous since $|F(y) - F(x)| \leqslant \int_x^y|f(t)| \, dt \leqslant M|y-x|$. – RRL Jun 22 '17 at 03:38
  • I see the continuity of $F(x)$ now, but I don't see the existence of $A,B$, the finite bounds. Can you explain? – user441848 Jun 22 '17 at 03:44
  • A continuous function on a compact set like $[a,b]$ is bounded. – RRL Jun 22 '17 at 03:46
  • I see, and returning to the $h$ function, why did you assume that it's non-negative? Is it assumed w.l.o.g.? – user441848 Jun 22 '17 at 03:52
  • 1
    I forced it to be non-negative by defining it as $h(x) = g(x)- g(a)$ where $g(x) \geqslant g(a)$. That way I could use the lemma. – RRL Jun 22 '17 at 04:01
  • I can add a rough proof of the lemma. – RRL Jun 22 '17 at 04:05
  • Yes please, all kinds of proofs are welcome here :) – user441848 Jun 22 '17 at 04:05
  • Also, where did you use the IVT? Which version are you using? – user441848 Jun 22 '17 at 04:08
  • The basic IVT. We have that $F$ is continuous, $h(b)A = \inf h(b)F(x) $ and $h(b)B = \sup h(b)F(x) $ . Also $h(b)A < \int_a^b fh < h(b) B$. Hence there is a point $\xi$ such that $\int_a^bfh = h(b)F(\xi) = h(b)\int_\xi^b f$. – RRL Jun 22 '17 at 04:31
  • I see, thanks – user441848 Jun 22 '17 at 04:42
  • Very nice proof. I wasn't aware that the theorem was valid under such weak conditions. +1 – Paramanand Singh Jun 22 '17 at 04:47
  • @ParamanandSingh: Thanks. The standard proofs that revert to Riemann-Stieltjes integrals require that $g$ be continuous I believe. – RRL Jun 22 '17 at 05:27
  • @RRL why $\hat {A}\to A$ and why $\hat {B}\to B?$ – user441848 Jul 01 '17 at 03:17
  • 1
    @AnneliseToft: The argument for that is straightforward using convergence of Riemann sums, given that $\hat{A}h(b) \leqslant S_P \leqslant \hat{B} h(b)$ and $\inf_{x \in [a,b]} \int_x^b f = A$, $\sup_{x \in [a,b]} \int_x^b f = B$. I'll add it when I get a chance. – RRL Jul 01 '17 at 05:08
  • Thanks @RRL I'll be waiting for it but pls don't take too long – user441848 Jul 01 '17 at 16:21
  • great! @RRL :') – user441848 Jul 01 '17 at 20:23
  • @RRL I think there is a little mistake here: $f(x_k)(x_k - x_{k-1}) = S_{P,k} - S_{P,k-1}$, because that equality isn't true; $S_{P,k} - S_{P,k-1}=f(x_{k-1})(x_{k-1}-x_{k-2})$. Or did I make a mistake in my calculations? – user441848 Jul 06 '17 at 20:30
  • 1
    You are correct. It should be $f(x_k)(x_k - x_{k-1}) = S_{P,k} - S_{P,k+1}$. I edited to correct this. Everything else carries through. – RRL Jul 06 '17 at 20:43
  • Why can you put this $h(x_n)$ as a 'bound' for $S_P$? I know that, since $f$ is Riemann integrable, the sums can be bounded by $\hat{A}$ and $\hat{B}$, so we just need bounds for $h$, which is non-decreasing. But why did you choose $h(x_n)$? This is the expression: $ \hat{A} h(x_n) \leqslant S_P \leqslant \hat{B} h(x_n) $ – user441848 Jul 06 '17 at 21:02
  • So $\hat{A} \leqslant S_{P,j} \leqslant \hat{B}$ for $j = 1, \ldots, n$. The $\hat{A}$ and $\hat{B}$ are just the minimum and maximum values of these sums. Since $S_P = h(x_1)S_{P,1} + (h(x_2) - h(x_1))S_{P,2} + \ldots + (h(x_n) - h(x_{n-1}))S_{P,n}$ we have $ \hat{A} h(x_n) = h(x_1)\hat{A} + (h(x_2) - h(x_1))\hat{A} + \ldots + (h(x_n) - h(x_{n-1}))\hat{A} \leqslant S_P$. Similarly we show $S_P \leqslant \hat{B}h(x_n)$. – RRL Jul 06 '17 at 21:39
  • Oh, I didn't know you were taking $j$ as $j=1,\ldots,n$; I thought it was $j+1,j+2,\ldots,n$. – user441848 Jul 06 '17 at 22:00
  • Why did you do the 'trick' of adding and subtracting $h(b)\epsilon$ in this part: $Ah(b) - [1 + h(b)]\epsilon < \int_a^b fh < B h(b) + [1 + h(b)] \epsilon$? At some point we have $h(b)A-\epsilon < \int_a^b f h < h(b)B+\epsilon$. What if we just say 'as $\epsilon$ can be arbitrarily small, the result holds'? – user441848 Jul 07 '17 at 00:59
  • Also, the final result gives a '$<$' sign instead of '$\leqslant$'; however, you used '$\leqslant$'. Why? – user441848 Jul 07 '17 at 01:03
  • I added some steps to clarify how (5) is derived. For your final question if $x - \epsilon < y$ for all $\epsilon > 0$ then $x \leqslant y$. This detail is not important, however, since we just want to show that $\int_a^b f h$ is between $Ah(b)$ and $Bh(b)$. – RRL Jul 07 '17 at 02:52
  • I finally understand the complete proof , thanks again @RRL – user441848 Jul 07 '17 at 03:08
  • @AnneliseToft: You're welcome. – RRL Jul 07 '17 at 03:11