
I know the famous Chebyshev inequality in measure theory, which states that for a measure space $X$, a measurable function $f$, and a positive constant $t$, $$\mu(\{x \in X : |f(x)| > t\}) \leq \frac{1}{t}\int_{X} |f|\,d\mu.$$ Wikipedia also mentions, without citation, a more general inequality: if $g$ is a nonnegative, nondecreasing function with $g(t) > 0$, then \begin{equation*}\mu(\{x \in X : |f(x)| > t\}) \leq \frac{1}{g(t)}\int_{X} g(|f|)\,d\mu. \end{equation*} I searched the internet for a reference for this inequality, but I couldn't find one. If anyone can provide a reference, it would be very helpful.
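
For concreteness, here is a quick numerical sanity check of the general inequality on a finite measure space (my own sketch; the choice $g(s) = s^2$ and the random data are arbitrary, not part of Wikipedia's statement):

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite measure space: n atoms x_1, ..., x_n with weights mu({x_i}).
n = 1000
mu = rng.uniform(0.0, 1.0, size=n)   # an arbitrary finite measure
f = rng.normal(0.0, 2.0, size=n)     # an arbitrary measurable function

g = lambda s: s ** 2                 # nonnegative and nondecreasing on [0, inf)

for t in [0.5, 1.0, 2.0, 4.0]:
    lhs = mu[np.abs(f) > t].sum()             # mu({x : |f(x)| > t})
    rhs = (g(np.abs(f)) * mu).sum() / g(t)    # (1/g(t)) * int_X g(|f|) dmu
    assert lhs <= rhs
    print(f"t = {t}: {lhs:.4f} <= {rhs:.4f}")
```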

    This $\mu(\{x \in X : |f(x)| > t\}) \leq \frac{1}{t}\int_{X} |f|\,d\mu$ is not Chebyshev's inequality; it is Markov's inequality. –  Jul 14 '21 at 11:30
    I have heard both names used for this inequality, depending on whether you're in probability or measure theory. – taylorsVersion Jul 14 '21 at 11:36
    I mentioned the required argument in this somewhat related answer; it in fact follows from the standard Chebyshev. Since the proof is so straightforward, I'm not sure if there is a published book source. Perhaps also consider the related "layer-cake decomposition", which is mentioned in Lieb-Loss's Analysis. – Calvin Khor Jul 14 '21 at 13:21

1 Answer


This might help; see the second page ("General Tail Estimate").

The proof goes very much like the proof of the "regular" Chebyshev (Markov) inequality.

Since $g$ is nondecreasing, $|f| \geq t$ implies $g(|f|) \geq g(t)$, which gives the first inequality below (an equality when $g$ is strictly increasing); the second is Chebyshev's (Markov's) inequality applied to $g(|f|)$: $$ \mu(\{|f| \geq t\}) \leq \mu(\{g(|f|) \geq g(t)\}) \leq \frac{1}{g(t)} \int_X g(|f|)\, d\mu $$
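
To connect this to familiar special cases (my own addition, not from the linked notes): $g(s) = s$ recovers Markov's inequality; on a probability space, applying the bound with $g(s) = s^2$ to $|f - m|$, where $m = \int_X f\, d\mu$, yields the classical Chebyshev inequality $\mu(\{|f - m| \geq t\}) \leq \frac{1}{t^2}\int_X |f - m|^2\, d\mu$; and $g(s) = e^{\lambda s}$ with $\lambda > 0$ gives the exponential (Chernoff-type) tail bound $$\mu(\{|f| \geq t\}) \leq e^{-\lambda t}\int_X e^{\lambda |f|}\, d\mu.$$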