
Let $f$, $g$ be functions in $L^1(\mathbb R)$ (with Lebesgue measure). Define $f_t(x)=\frac{f(x/t)}{t}$. Prove that $f_t*g$ converges to $ag$ in $L^1$ as $t\to0^+$, where $a=\int_{\mathbb R}f(x)\,dx$.

My approach in brief: since $f,g\in L^1$, the Fubini–Tonelli theorem shows that $$\int_{\mathbb R}(f_t*g)(x)\,dx=\left(\int_{\mathbb R}f(x)\,dx\right)\int_{\mathbb R}g(x)\,dx,$$ hence $$\int_{\mathbb R}\left(f_t*g(x)-ag(x)\right)dx=0.$$ Therefore, $$\int_{\mathbb R}|f_t*g(x)-ag(x)|\,dx=0.$$ I feel something is wrong with my approach. Please correct me and give me some hints for solving this simple question. Thanks.

1 Answer


We have

\begin{equation} q(t) := \int_{\mathbb{R}}\left|f_t*g(x)-ag(x)\right|dx \leqslant \int_{\mathbb{R}}\left|f_t(y)\right|\left(\int_{\mathbb{R}}\left|g(x-y)-g(x)\right|dx\right)dy =: \int_{\mathbb{R}}\left|f_t(y)\right|\varphi(y)\,dy \end{equation}
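The first inequality is a routine expansion; here is a sketch, using the convention $(f_t*g)(x)=\int_{\mathbb R}f_t(y)\,g(x-y)\,dy$ and the fact that $\int_{\mathbb R}f_t(y)\,dy=a$ for every $t>0$ (substitute $s=y/t$):

\begin{equation}
f_t*g(x)-ag(x)=\int_{\mathbb{R}}f_t(y)\,\bigl(g(x-y)-g(x)\bigr)\,dy ,
\end{equation}

and taking absolute values, integrating in $x$, and applying Tonelli's theorem to swap the order of integration gives

\begin{equation}
q(t)\leqslant\int_{\mathbb{R}}\int_{\mathbb{R}}\left|f_t(y)\right|\left|g(x-y)-g(x)\right|dx\,dy=\int_{\mathbb{R}}\left|f_t(y)\right|\varphi(y)\,dy .
\end{equation}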

If $g$ is the indicator function of an interval $[\alpha,\beta]$, then $\varphi(y)=2\min\left(|y|,\beta-\alpha\right)$, hence, after the change of variables $y=ts$,

\begin{equation}q(t)\leqslant 2\int_{\mathbb{R}}\left|f(s)\right|\min\left(t|s|,\beta-\alpha\right)ds\end{equation}

which converges to $0$ as $t\rightarrow 0^+$ by the monotone convergence theorem, since the integrand decreases pointwise to $0$ and is dominated by $(\beta-\alpha)|f|\in L^1$.
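As a numerical illustration of the indicator case (not part of the proof), take $f=g=\mathbf 1_{[0,1]}$, so $a=1$ and $f_t$ is the uniform density on $[0,t]$; for this choice one can compute $q(t)=t$ exactly, and a grid estimate reproduces it:

```python
import numpy as np

# Numerical illustration (not part of the proof): f = g = indicator of [0, 1],
# so a = 1 and f_t is the uniform density on [0, t].  We estimate
# q(t) = integral of |(f_t * g)(x) - g(x)| dx by Riemann sums on a grid.

def q(t, n=20001, L=3.0):
    x = np.linspace(-L, L, n)     # symmetric grid, so mode="same" stays aligned
    dx = x[1] - x[0]
    f_t = np.where((x >= 0) & (x <= t), 1.0 / t, 0.0)
    g = np.where((x >= 0) & (x <= 1), 1.0, 0.0)
    conv = np.convolve(f_t, g, mode="same") * dx   # approximates (f_t * g)(x)
    return float(np.sum(np.abs(conv - g)) * dx)

for t in (0.5, 0.1, 0.02):
    print(t, q(t))   # q(t) is close to t, so q(t) -> 0 as t -> 0+
```

The grid values carry a small discretization error, but they track $q(t)=t$ and shrink with $t$ as the answer predicts.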

The general result follows by density of the space of step functions in $ {L}^{1} \left(\mathbb{R}\right)$.
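The density step can be made quantitative; here is a sketch, writing $\varphi_g$ for the $\varphi$ attached to $g$. For any $h\in L^1$, the triangle inequality gives $\varphi_g(y)\leqslant\varphi_h(y)+2\,\|g-h\|_1$, and since $\int_{\mathbb R}|f_t(y)|\,dy=\|f\|_1$,

\begin{equation}
q(t)\leqslant\int_{\mathbb{R}}\left|f_t(y)\right|\varphi_h(y)\,dy+2\,\|f\|_1\,\|g-h\|_1 .
\end{equation}

Given $\varepsilon>0$, choose a step function $h$ with $\|g-h\|_1\leqslant\varepsilon$. The first term tends to $0$ by the indicator case and linearity (a step function is a finite linear combination of indicators of intervals), so $\limsup_{t\to0^+}q(t)\leqslant 2\,\|f\|_1\,\varepsilon$; as $\varepsilon$ is arbitrary, $q(t)\to0$.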

Gribouillis
  • The last inequality is a clever step. If I did the last step, I would split the intervals. Actually there is a post more general than my question: https://math.stackexchange.com/questions/3593945/convergence-of-approximations-of-the-identity-in-lp-mathbb-rd – Fellow InstituteOfMathophile Nov 05 '21 at 17:30