I am working through the proof that the convolution of a square integrable function with a compactly supported smooth function is itself smooth:
"Let $f\in L^2(\mathbb R)$ and $g\in C_c^\infty(\mathbb R)$. Show that $f*g\in C^\infty(\mathbb R)$ and that $(f*g)^{(k)}=f*g^{(k)}$ for $k\in\mathbb N$."
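Before getting to my questions, here is a quick numerical sanity check of the identity $(f*g)'=f*g'$ that I ran to convince myself the statement is plausible. The grid, the step-function $f$, and the standard bump $g$ are my own illustrative choices, not taken from the linked answers:

```python
import numpy as np

# Grid for discretizing the convolution integrals (illustrative choice)
x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]

# f: square integrable but not even continuous (an indicator function)
f = np.where(np.abs(x) < 1, 1.0, 0.0)

# g: the standard smooth compactly supported bump,
#    g(x) = exp(-1/(1-x^2)) for |x| < 1, and 0 otherwise
g = np.zeros_like(x)
mask = np.abs(x) < 1
g[mask] = np.exp(-1.0 / (1.0 - x[mask] ** 2))

# g' computed by hand: g'(x) = g(x) * (-2x / (1-x^2)^2) on |x| < 1
gprime = np.zeros_like(x)
gprime[mask] = g[mask] * (-2.0 * x[mask] / (1.0 - x[mask] ** 2) ** 2)

# Riemann-sum approximations of f*g and f*g'
conv = np.convolve(f, g, mode="same") * dx
conv_deriv = np.convolve(f, gprime, mode="same") * dx

# The numerical derivative of f*g should agree with f*g'
numeric = np.gradient(conv, dx)
print(np.max(np.abs(numeric - conv_deriv)))  # should be small (O(dx))
```

Even though $f$ is discontinuous, the discrete derivative of $f*g$ matches $f*g'$ up to discretization error, which is consistent with the smoothing claim in the problem.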
To this end I have been making use of the following questions and, in particular, their linked answers:
Convolution of locally integrable and compactly supported infinitely differentiable function
Differentiating under integral for convolution
I am confident that I understand the idea of the proof; however, there are some points common to each of the attempts about which I am not entirely certain and which I would appreciate having explained better.
How, exactly, is the Lebesgue Dominated Convergence Theorem being applied? In taking the difference quotient (in working with the derivative) we obtain something like $$\lim_{h\to0}\int_\mathbb{R}f(z)\frac{g(x+h-z)-g(x-z)}h\,\text{d}z,$$ for which we want to find some integrable function $\psi$ so that for all $z\in\mathbb R$, $$\left|\frac{g(x+h-z)-g(x-z)}h\right|<\psi(z).$$ That is to say, we want to find a function $\psi$ which dominates the above difference quotient. I see that the other hypotheses of the LDCT are satisfied (since $g\in C_c^\infty$, the integrand is Borel measurable), but how do we rectify the fact that our "sequence" (the difference quotients) is not indexed by the natural numbers? How do we apply the LDCT when $h$ ranges over $0<h<1$, an uncountable index set?
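To make the question concrete, my tentative understanding (which I would like confirmed or corrected) is that one reduces the continuous limit to sequences: for an arbitrary sequence $(h_n)$ with $h_n\to0$ and $0<h_n<1$, define $$F_n(z):=f(z)\,\frac{g(x+h_n-z)-g(x-z)}{h_n},$$ so that $F_n(z)\to f(z)g'(x-z)$ pointwise. If $|F_n|\le\psi$ for an integrable $\psi$, the LDCT applies to this (countable) sequence, and since every such sequence produces the same limit, the continuous limit as $h\to0$ exists and equals it. Is this the correct way to read the argument?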
In order to find the dominating $\psi$ mentioned above, we make use of the Mean Value Theorem (or, as below, the Fundamental Theorem of Calculus); rather than just assume that such a dominating function exists, we should exhibit one explicitly. In the first of the linked questions, this is done as follows, \begin{eqnarray*} \left|g(x + h - z) - g(x - z)\right| & = & \left| \int_0^1\frac{\rm d}{{\rm d}s} g(x - z + sh)\; {\rm d} s \right| \\ & = & \left| h \int_0^1 g'(x - z + sh)\; {\rm d} s \right| \\ &\leq & |h|\max_{x\in \mathbb R} |g'(x)|. \end{eqnarray*} How is it that $g'(x-z+sh)$ for $s\in(0,1)$ is bounded by $g'(x)$ as a function of $x$ alone? I am thinking that one defines $g'_s(x):=g'(x - z + sh)$ for $s\in(0,1)$ and then argues that for all $s\in(0,1)$, $|g'_s(x)|<|g'(x)|$ for all $x\in\mathbb R$, so that the family of functions $(g'_s)_{s\in(0,1)}$ is uniformly bounded. But how does one transition from dealing with $g'(x-z+sh)$ to $g'(x)$? And how does this affect our considerations of the support we are on?
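My tentative reading of this step, which I would also like checked, is that $\max_{x\in\mathbb R}|g'(x)|$ is not a function of $x$ at all but the constant $\|g'\|_\infty$, which is finite because $g'$ is continuous and compactly supported: $$|g'(x-z+sh)|\le\sup_{y\in\mathbb R}|g'(y)|=\|g'\|_\infty<\infty \quad\text{for all }z,\ s\in(0,1),\ |h|<1.$$ For the support question, I am guessing that for fixed $x$ and $|h|<1$ the difference quotient vanishes unless $z$ lies in some fixed compact set $K$ (depending on $x$ and $\operatorname{supp}g$), so that one may take $$\psi(z):=\|g'\|_\infty\,|f(z)|\,\mathbf 1_K(z),$$ which is integrable by Cauchy–Schwarz since $f\in L^2(\mathbb R)$. Is this the intended argument?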