
I am studying Measure Theory using the book by Elstrodt and there are a few statements in the text in the section on contents on the semi-ring $\mathfrak{J} := \{\ ]a,b]\ :\ a \leq b,\ a,b \in \mathbb{R}\ \}$ for which I'm having trouble finding a proof.

The book states:

Every increasing, continuous function defines a finite pre-measure $\mu_F: \mathfrak{J} \to \mathbb{R}$. Such a pre-measure can be seen as a continuous mass distribution on $\mathbb{R}$. However, there are also completely different pre-measures:

Let $A \subset \mathbb{R}$ be a countable set and $p: A \to \mathbb{R}$ a strictly positive function such that

$$\sum \limits_{y \in A \cap [-n,n]} p(y)$$

converges for all $n \in \mathbb{N}$.

Then

$$\mu(]a,b]) := \sum \limits_{y \in A \cap ]a,b]} p(y) \quad (a \leq b)$$

is a finite pre-measure on $\mathfrak{J}$, and so

$$G(x) := \begin{cases} \sum \limits_{y \in A \cap ]0,x]} p(y), & \text{for } x \geq 0\\ -\sum \limits_{y \in A \cap ]x,0]} p(y), & \text{for } x < 0 \end{cases}$$

is right-continuous. We call such a function a jump function.

The function $G$ is discontinuous exactly at $x \in A$ (see my previous question).
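To make the construction concrete, here is a toy example of my own (not from the book): take $A = \{0\}$ and $p(0) = 1$. The definition then gives

$$G(x) = \begin{cases} 0, & \text{for } x \geq 0\\ -1, & \text{for } x < 0 \end{cases}$$

so $G = \mathbb{1}_{[0,\infty)} - 1$ is right-continuous and jumps by exactly $p(0) = 1$ at the single point of $A$.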

Now here comes the part I am struggling with:

Now consider any increasing, right-continuous function $F: \mathbb{R} \to \mathbb{R}$ and let $A$ be the set of discontinuities of $F$. Then $A$ is countable since $A = \bigcup \limits_{n=1}^{\infty} A_n$ where $A_n = \{x \in [-n,n]: \lim \limits_{h \downarrow 0} (F(x+h) - F(x-h)) \geq \frac{1}{n}\}$ are finite due to the monotonicity of $F$. For $y \in A$, let $p(y) = \lim \limits_{h \downarrow 0} (F(y+h) - F(y-h))$. Now if $y_1,\dots,y_k \in A\ \cap\ ]-n,n[$ are distinct, then $\sum \limits_{j=1}^{k} p(y_j) \leq F(n) - F(-n)$. Hence, $\sum \limits_{y \in A \cap [-n,n]} p(y)$ converges for all $n \in \mathbb{N}$. Now let $G$ be the corresponding jump function; then $H = F - G$ is continuous on $\mathbb{R}$ and increasing. This means $F$ can be expressed as $F = G + H$, i.e. it is the sum of an increasing jump function and a continuous, increasing function. The functions $G$ and $H$ are unique up to additive constants.
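Continuing my toy example from above: for $F(x) = x + \mathbb{1}_{[0,\infty)}(x)$ the only discontinuity is at $0$ with $p(0) = \lim \limits_{h \downarrow 0} (F(h) - F(-h)) = 1$, the corresponding jump function is exactly the $G$ computed above, and $H(x) = F(x) - G(x) = x + 1$ is indeed continuous and increasing.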

Most of the quoted paragraph is clear to me (see here for a proof that $A_n$ is finite and for the convergence of the series of $p(y)$). But how can I prove that

  1. $H$ is continuous,

  2. $H$ is increasing and

  3. that the decomposition of $F$ into $G$ and $H$ is unique up to additive constants?

Proof attempt:

1) Continuity of $H$:

Obviously, $H$ is continuous at $x \notin A$ as the difference of two continuous functions, and intuitively it should also be continuous at $x \in A$ since $G$ is constructed to remove the discontinuities of $F$ (note that $p(y) = \lim \limits_{h \downarrow 0} (F(y) - F(y-h))$ since $F$ is right-continuous).

To show that $H = F - G$ is left-continuous at $y \in A$ I have to show that

$$\lim \limits_{n \to \infty} (F(y) - F(x_n) - (G(y) - G(x_n))) = 0$$

for all sequences $x_n \to y$ with $x_n \leq y$. Terms with $x_n = y$ contribute $0$, and for $x_n < y$ we have $\lim \limits_{n \to \infty} (F(y) - F(x_n)) = p(y)$ by right-continuity and monotonicity of $F$. So it remains to show that $\lim \limits_{n \to \infty} (G(y) - G(x_n)) = p(y)$ for all sequences $x_n \to y$ with $x_n < y$ for all $n \in \mathbb{N}$.

From my previous question I know that $\lim \limits_{n \to \infty} G(y) - G(x_n) \geq p(y)$ for all sequences $x_n \to y$ with $x_n < y$ for all $n \in \mathbb{N}$, but how can I prove that this must be an equality?

2) $H$ is increasing:

To prove that $H$ is increasing I need to show that

$$F(y) - F(x) - (G(y) - G(x)) \geq 0$$

if $y > x$ (the case $y=x$ is trivial). In other words I need to show that $F$ grows faster than $G$.

Sorry for the long post and thanks a lot!

  • @TheAlertGerbil Well, I understand that it's a bit long. But usually people ask for more context if I omit too much. Yes, it's in the post that the set of discontinuities of $F$ is countable. – DerivativesGuy Nov 16 '24 at 11:23
  • What part in my comment do you have difficulty with? – TheAlertGerbil Nov 16 '24 at 11:30
  • @TheAlertGerbil Well, this is not a proof. As I said in my post, intuitively it is clear that the function should be continuous. Same goes for the monotonicity. – DerivativesGuy Nov 16 '24 at 12:13

1 Answer


1. Continuity

Proving continuity is much easier than you think. From your previous questions, I can see that you already know $G$ is right-continuous, but you seem confused about why we must have $\lim_{h\downarrow 0}G(x)-G(x-h)=p(x)$. These two facts can actually be proven in exactly the same way. Firstly, for $x<y$, I assume you'll accept the following equality. $$G(y)-G(x) = \sum_{t\in A\cap]x,y]}p(t)$$

This form is hard to use, since the domain of summation depends on $x$ and $y$. To fix this, we'll use an indicator function. Let $I(x,t,y)=1$ whenever $t\in ]x,y]$, and $I(x,t,y)=0$ otherwise, then we have the following equalities. $$\begin{align} G(y)-G(x) &= \sum_{t\in A\cap]x,y]} p(t)\cdot I(x,t,y) \\ &= \sum_{t\in A} p(t)\cdot I(x,t,y) \end{align}$$

The first equality holds since the summand is unchanged on that domain, and the second equality holds since extending the domain just adds a bunch of zeros. This new form, however, is extremely convenient, since it lets us resolve one-sided limits. For convenience, we extend the domain of $p$ so that $p(t)=0$ whenever $t\notin A$; then the following holds. $$\begin{align} \lim_{x\uparrow y}G(y)-G(x) &= \lim_{x\uparrow y} \sum_{t\in A} p(t)\cdot I(x,t,y) \\ &= \sum_{t\in A} \lim_{x\uparrow y} p(t)\cdot I(x,t,y) \\ &= p(y) \end{align}$$

Swapping the limit with the summation follows directly from the Dominated Convergence theorem: once $x>y-1$, the summand is bounded by the absolutely summable function $t\mapsto p(t)\cdot I(y-1,t,y)$, and moreover the summand converges pointwise. Namely, the summand converges to $0$ when $t\neq y$, and converges to $p(y)$ when $t=y$, hence the sum of limits is just $p(y)$. Incidentally, this discrete version of Dominated Convergence is very easy to prove. The above technique can also be modified to give a simpler proof that $G$ is right-continuous. Since $G$ is right-continuous as just proven, and $F$ is right-continuous by premise, $H=F-G$ is also right-continuous. To prove $H$ is left-continuous, we just resolve the following limit.

$$\begin{align} \lim_{h\downarrow 0} H(x+h)-H(x-h) &= \left(\lim_{h\downarrow 0}F(x+h)-F(x-h)\right) - \left(\lim_{h\downarrow 0}G(x+h)-G(x-h)\right) \\ &= p(x)-p(x) = 0 \\ \lim_{h\downarrow 0} H(x-h) &= \lim_{h\downarrow 0}H(x+h) = H(x) \end{align}$$

So $H$ is also left-continuous, therefore $H$ is continuous.
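If a numerical sanity check helps, here is a small Python sketch (my own illustration, using an assumed toy choice of $A$ and $p$, truncated to finitely many atoms) of the limit we just resolved: the mass $G(y)-G(x)$ collapses onto the single atom at $y$ as $x\uparrow y$.

```python
# Toy data (assumed for illustration): y = 1, A = {1 - 1/k : k >= 1} ∪ {1},
# with masses p(1 - 1/k) = 2**(-k) and p(1) = 0.5; A is truncated at k = K.
K = 200
atoms = [(1.0 - 1.0 / k, 2.0 ** (-k)) for k in range(1, K + 1)] + [(1.0, 0.5)]

def G_diff(x, y=1.0):
    """G(y) - G(x): the sum of p(t) over atoms t with x < t <= y."""
    return sum(p for t, p in atoms if x < t <= y)

for x in [0.9, 0.99, 0.999, 0.9999]:
    print(f"x = {x}: G(1) - G(x) = {G_diff(x):.12f}")
# The printed values decrease toward p(1) = 0.5: each atom 1 - 1/k
# eventually drops out of ]x, 1], exactly as the pointwise limit in the
# dominated convergence argument predicts.
```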


2. Monotonicity

Proving $H$ is monotonic actually follows from the same argument which was used to show $p$ is summable. Given values $x<a_1<a_2<\cdots<a_N\leq y$, we easily prove $\sum_k p(a_k) \leq F(y)-F(x)$. Since the difference $G(y)-G(x)$ is the supremum of all those finite sums, we get $G(y)-G(x)\leq F(y)-F(x)$, leading to $H(x)\leq H(y)$.

To carry out this argument more formally, take any finite set $S\subseteq ]x,y]$, find a strictly increasing enumeration $a_1<a_2<\cdots<a_N$ of the set $S=\{a_1,\cdots,a_N\}$, and then observe the following. $$\begin{align} \sum_{t\in S} p(t) &= \sum_{n=1}^N p(a_n) \\ &= \sum_{n=1}^N \lim_{h\downarrow 0} F(a_n)-F(a_n-h) \\ &= \lim_{h\downarrow 0} \left(F(a_N)+\sum_{n=1}^{N-1} F(a_n)\right)-\left(F(a_1-h)+\sum_{n=2}^N F(a_n-h)\right) \\ &\leq \lim_{h\downarrow 0} \left(F(y)+\sum_{n=1}^{N-1} F(a_n)\right)-\left(F(x)+\sum_{n=2}^N F(a_{n-1})\right) \\ &= F(y)-F(x) \end{align}$$

The above inequality holds for all sufficiently small $h>0$, since we have $x<a_1<a_2<\cdots<a_N\leq y$ and since $F$ is monotonic. The above now leads to the following, proving that $H$ is monotonically increasing. $$G(y)-G(x) = \sup_{\substack{S\subseteq ]x,y] \\ \#S<\infty}} \sum_{t\in S}p(t) \leq F(y)-F(x)$$ $$H(y)-H(x) = (F(y)-F(x))-(G(y)-G(x)) \geq 0$$
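For what it's worth, the finite-sum bound can also be seen numerically. Below is a quick Python sketch (my own toy example, not part of the proof): for a concrete increasing, right-continuous $F$, the total jump mass inside $]x,y]$ stays below $F(y)-F(x)$.

```python
# Toy example (assumed for illustration): F(t) = t + (sum of jumps p(a) at a <= t),
# which is increasing and right-continuous with jump set A = {0.25, 0.5, 0.75}.
jumps = {0.25: 0.5, 0.5: 0.25, 0.75: 1.0}  # a -> p(a)

def F(t):
    return t + sum(p for a, p in jumps.items() if a <= t)

x, y = 0.0, 1.0
S = sorted(a for a in jumps if x < a <= y)  # the finite set a_1 < ... < a_N
jump_sum = sum(jumps[a] for a in S)
print(jump_sum, "<=", F(y) - F(x))          # 1.75 <= 2.75; the slack y - x = 1
                                            # is the growth of the continuous part
```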


3. Uniqueness

Finally, we show that the decomposition $F=G+H$ is unique. Let $G'$ be any increasing jump function such that the difference $H':=F-G'$ is continuous. Since $G'$ is an increasing jump function, there's a countable set $A'$ and a strictly positive function $p':A'\to\mathbb{R}$ such that the following holds for all $x<y$. $$G'(y)-G'(x)=\sum_{t\in A'\cap]x,y]} p'(t)$$

For convenience we let $U=A\cup A'$, which is countable, and extend $p'$ so that $p'(t)=0$ whenever $t\notin A'$. We now easily get the following equalities. $$\begin{align} G'(y)-G'(x) &= \sum_{t\in U\cap]x,y]} p'(t) \\ G(y)-G(x) &= \sum_{t\in U\cap]x,y]} p(t) \end{align}$$

To prove uniqueness, we simply prove $p'=p$. This is trivial, since the continuity of $H$ and $H'$ leads immediately to the following. $$\begin{align} p(x) &= \lim_{h\downarrow 0}(G(x+h)-G(x-h)) \\ &= \lim_{h\downarrow 0}(F(x+h)-F(x-h)) \\ &= \lim_{h\downarrow 0}(G'(x+h)-G'(x-h)) \\ &= p'(x) \end{align}$$

We see that $p=p'$, and consequently $G'(y)-G'(x) = G(y)-G(x)$ for all $x<y$, from which we infer that $G-G'$ is constant. It follows that $H+G=F=H'+G'$, and consequently $H'-H=G-G'$ is the exact same constant. This proves the decomposition $F=G+H$ is unique, up to an additive constant.
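As one last sanity check, here is a short Python sketch (again my own illustration, with an assumed toy $F$) of the heart of this argument: the jump sizes are recoverable from $F$ alone as two-sided limits, which is why no other $p'$ can occur.

```python
# Toy data (assumed): F(t) = t + accumulated jumps, as in the earlier sketch.
jumps = {0.25: 0.5, 0.5: 0.25, 0.75: 1.0}

def F(t):
    return t + sum(p for a, p in jumps.items() if a <= t)

# Approximate p(a) = lim_{h -> 0+} (F(a + h) - F(a - h)) with a small fixed h.
h = 1e-9
for a in sorted(jumps):
    recovered = F(a + h) - F(a - h)
    print(f"a = {a}: p(a) = {jumps[a]}, recovered = {recovered:.6f}")
# Each recovered value matches p(a) up to O(h), so any decomposition
# F = G' + H' with H' continuous must carry the same jump data.
```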

Note: Technically, the provided definition of jump functions seems to require that $G(0)=0$, in which case we actually get $G=G'$ and $H=H'$. I suspect this is unintended, a minor error induced by the quoted text being overly vague about the exact definition of "jump function".

Jade Vanadium
  • Thank you very much for the detailed answer. I think I understand all the points, but still need to understand the proof of the discrete version of the dominated convergence theorem you've linked. Would be great if you could somehow add that to your answer. In the answer by robjohn I don't understand the last inequality in equation (2), i.e. why the last term is less than or equal to $\epsilon$. – DerivativesGuy Nov 19 '24 at 18:49
  • @DerivativesGuy In their proof, the first sum has only finitely many terms, and the summands all limit to zero, so the first sum limits to zero; limits always permute with finite sums. Hence, the first sum is eventually always less than $\epsilon/2$. The second sum is bounded above by $\epsilon/2$ due to their selection of $n_\epsilon$, hence the limit supremum is at most $\epsilon$. – Jade Vanadium Nov 20 '24 at 15:57
  • Yes, but the terms of the last sum are $|f_k(n) - f(n)|$; I think we can use the triangle inequality there. And I think the convergence of $\sum |f(n)|$ follows from the fact that $\sum f_k(n) \leq \sum g(n) < \frac{\epsilon}{2}$ for all $k$, i.e. the inequality remains true in the limit. With this we can write $\sum |f_k(n) - f(n)| \leq \sum |f_k(n)| + \sum |f(n)| \leq \sum 2g(n) < \epsilon$. Does that make sense? – DerivativesGuy Nov 20 '24 at 16:46
  • @DerivativesGuy Yes, that works. The main idea is just that the bulk of the sum can be approximated with finitely many terms, and the error can be forced to be arbitrarily small due to domination by $g$. The limit permutes with the finite sum, and since the error can be made arbitrarily small, we get equality. – Jade Vanadium Nov 20 '24 at 23:10
  • Thanks a lot for your help! – DerivativesGuy Nov 21 '24 at 06:27