
I am working through the proof that the convolution of a square-integrable function with a compactly supported smooth function is itself smooth:

"Let $f\in L^2(\mathbb R)$ and $g\in C_c^\infty(\mathbb R)$. Show that $f*g\in C^\infty(\mathbb R)$ and that $(f*g)^{(k)}=f*g^{(k)}$ for $k\in\mathbb N$."

To this end I have been making use of the following questions and the particularly linked answers:

Convolution of locally integrable and compactly supported infinitely differentiable function

Derivative of convolution

Differentiating under integral for convolution

I am confident that I understand the idea of the proof; however, there are some points, common to each of the attempts, which I am not entirely certain about and would appreciate having better explained.

  1. How, exactly, is the Lebesgue Dominated Convergence Theorem being applied? In taking the difference quotient (in working with the derivative) we obtain something like, $$\lim_{h\to0}\int_\mathbb{R}f(z)\frac{g(x+h-z)-g(x-z)}h\,\text{d}z,$$ for which we want to find some integrable function $\psi$ so that for all $z\in\mathbb R$, $$\left|\frac{g(x+h-z)-g(x-z)}h\right|<\psi(z).$$ That is to say, we want to find a function $\psi$ which dominates the above difference quotient. I see that the hypotheses of the LDCT are satisfied (since $g\in C_c^\infty$, the integrand is Borel measurable), but how do we reconcile the fact that our "sequence" (the difference quotient) isn't indexed by the natural numbers? How do we apply the LDCT when $h$ ranges over $(0,1)$, which is uncountable?

  2. In order to find the dominating $\psi$, as mentioned above, we make use of the Mean Value Theorem (rather than just assuming that such a dominating function exists, we should exhibit one explicitly). In the first of the linked questions, this is done (via the Fundamental Theorem of Calculus) as follows, \begin{eqnarray*} \left|g(x + h - z) - g(x - z)\right| & = & \left| \int_0^1\frac{\rm d}{{\rm d}s} g(x - z + sh)\; {\rm d} s \right| \\ &\leq & |h|\max_{x\in \mathbb R} |g'(x)|. \end{eqnarray*} How is it that $g'(x-z+sh)$ for $s\in(0,1)$ is bounded by $g'(x)$ as a function of $x$ alone? I am thinking that one defines $g'_s(x):=g'(x - z + sh)$ for $s\in(0,1)$ and then argues that for all $s\in(0,1)$, $|g'_s(x)|<|g'(x)|$ for all $x\in\mathbb R$, so that the family of functions $(g'_s)_{s\in(0,1)}$ is uniformly bounded. But how does one transition from dealing with $g'(x-z+sh)$ to $g'(x)$? And how does this affect our considerations of the support we are on?

2 Answers


Nice questions!

For 1. There is a version of the dominated convergence theorem that works for general limits (not only sequences), but you can also derive it as follows:

Pick any sequence $h_n\rightarrow 0$. Consider $\varphi_h(z):=\frac{g(x+h-z)-g(x-z)}{h}$. The sequence $f\varphi_{h_n}$ satisfies the hypotheses of the dominated convergence theorem, and so $\lim_{n\rightarrow\infty} \int f\varphi_{h_n}\, d\mu$ exists. It is not hard to see that the limit is the same along every such sequence (it is always $\int f(z)\,g'(x-z)\,dz$). Therefore, by Heine's sequential characterization of limits, $\lim_{h\rightarrow 0} \int f\varphi_h\, d\mu$ exists and equals that common value.
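To make this sequential reduction concrete, here is a small numerical sketch (my own illustration, not from the linked answers; the indicator $f=\mathbf 1_{(-1,1)}\in L^2$ and the standard bump function are my choices): the difference quotients of $f*g$ at a fixed point approach $(f*g')(x)$ along the sequence $h_n=2^{-n}$.

```python
import numpy as np

# Check numerically that the difference quotients of (f*g)(x)
# converge to (f*g')(x) along a sequence h_n -> 0, for a
# discontinuous f in L^2 and a smooth compactly supported bump g.

z = np.linspace(-5.0, 5.0, 100001)   # integration grid
dz = z[1] - z[0]

f = np.where(np.abs(z) < 1.0, 1.0, 0.0)   # indicator of (-1,1)

def g(t):
    """Standard bump: exp(-1/(1-t^2)) on (-1,1), zero elsewhere."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    m = np.abs(t) < 1.0
    out[m] = np.exp(-1.0 / (1.0 - t[m] ** 2))
    return out

def gprime(t):
    """Derivative of the bump: g(t) * (-2t / (1-t^2)^2)."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    m = np.abs(t) < 1.0
    tm = t[m]
    out[m] = np.exp(-1.0 / (1.0 - tm ** 2)) * (-2.0 * tm / (1.0 - tm ** 2) ** 2)
    return out

x = 0.3

def conv(h):
    """(f*g)(x+h), approximated by a Riemann sum on the grid."""
    return np.sum(f * g(x + h - z)) * dz

target = np.sum(f * gprime(x - z)) * dz   # (f*g')(x)

for n in range(1, 15):
    h = 2.0 ** (-n)
    dq = (conv(h) - conv(0.0)) / h        # difference quotient at h_n
    print(n, dq, target)
```

Any other sequence $h_n\to0$ produces the same limit, which is exactly what Heine's characterization needs.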

For 2. The confusion is only notational: the same letter is being used to denote two different things. $g'$ is a fixed function; let $M=\max_{y\in\mathbb{R}} |g'(y)|$ (this maximum exists and is finite because $g'$ is continuous and compactly supported).

Then if you pick $t=x-z+sh$, this is just some real number, and so $|g'(t)|\leq M$.
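For completeness, here is how this bound yields an integrable dominating function (a step the linked answers leave implicit; $R$ and $K$ below are my notation). Suppose $\operatorname{supp} g\subseteq[-R,R]$. For $0<|h|<1$ the difference quotient vanishes unless $z\in K:=[x-R-1,\,x+R+1]$, so

$$\left| f(z)\,\frac{g(x+h-z)-g(x-z)}{h} \right| \le M\,|f(z)|\,\mathbf{1}_{K}(z)=:\psi(z), \qquad M:=\max_{y\in\mathbb{R}}|g'(y)|,$$

and $\psi$ is integrable, since by Cauchy–Schwarz $\int_{\mathbb R}\psi \le M\,\|f\|_{L^2}\,|K|^{1/2}<\infty$.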

Yanko
  1. The DCT works not only with sequences, but also in cases like this one, in which $h\to0$ through real values. You can convince yourself by considering arbitrary sequences $h_n\to0$.

  2. It is not using $|g'_s(x)|\le|g'(x)|$, but $|g'_s(x)|\le\max_{y\in\Bbb R}|g'(y)|$: the bound is the constant $\max_{y\in\Bbb R}|g'(y)|$, not the value of $g'$ at the particular point $x$.