Gaussian convolution with variance $v$ is defined as $$ {\cal G}_v[f](x):=\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi v}}\,f(y)\, e^{-\frac{(y-x)^2}{2v}}\,dy. $$ Given a function $g$, does there exist a bounded continuous function $f$ satisfying $${\cal G}_v[f]=g?$$ Of course, this problem is ill-posed in general. To have a chance of finding a solution, I impose on the target function $g$ the condition that it be rapidly decreasing in the sense of Schwartz.
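To illustrate what I mean, here is a simple case where an explicit solution exists (just a sanity check, with a parameter $w$ that I introduce only for this example): if $g$ is itself a centered Gaussian density with variance $w>v$, $$ g(x)=\frac{1}{\sqrt{2\pi w}}e^{-\frac{x^2}{2w}}, $$ then $$ f(x)=\frac{1}{\sqrt{2\pi (w-v)}}e^{-\frac{x^2}{2(w-v)}} $$ is bounded, continuous, and satisfies ${\cal G}_v[f]=g$, because convolving centered Gaussian densities adds their variances. With the Fourier convention $\widehat{f}(\xi)=\int f(x)e^{-ix\xi}\,dx$, the equation reads $\widehat{g}(\xi)=e^{-\frac{v\xi^2}{2}}\widehat{f}(\xi)$, so formally $\widehat{f}(\xi)=e^{\frac{v\xi^2}{2}}\widehat{g}(\xi)$; the question is when this formal inverse actually defines a bounded continuous function $f$.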
I think this function class is a reasonable setting in which to look for a solution.
The following is the background of this inverse problem:
This problem is useful when independent Gaussian noises $Y_1, \ldots, Y_n$ with mean $0$ and variance $v$ appear. Assume that we can observe only the noisy data $X_1+Y_1, \ldots, X_n+Y_n$, and that we need to recover the value $\frac{1}{n} (g(X_1)+\cdots+g(X_n))$, not $X_1, \ldots, X_n$ themselves. In this case we can use $\frac{1}{n} (f(X_1+Y_1)+\cdots + f(X_n+Y_n) )$ as an estimator, since ${\cal G}_v[f]=g$ means that $f(X_i+Y_i)$ has conditional expectation $g(X_i)$ given $X_i$. The error \begin{equation} \frac{1}{n} (f(X_1+Y_1)+\cdots + f(X_n+Y_n) )- \frac{1}{n} (g(X_1)+\cdots+g(X_n)) \tag{1} \end{equation} is then controlled by the sup norm $\|f\|_{\infty}$, for the following reasons. When $X_1, \ldots, X_n$ are fixed, the variable $f(X_i+Y_i)-g(X_i)$ is independent of $f(X_{i'}+Y_{i'})-g(X_{i'})$ for $i\neq i'$. The variable $f(X_i+Y_i)-g(X_i)$ has mean $0$, and its variance is bounded by $\|f\|_{\infty}^2$. Hence the variance of (1) is bounded by $\frac{\|f\|_{\infty}^2}{n}$.
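As a numerical sanity check of this variance bound (not part of the question itself), here is a minimal Monte Carlo sketch in Python. It assumes the explicit Gaussian pair $f,g$ from the example above, with parameters $v=1$, $w=3$, an arbitrary uniform choice of the fixed values $X_i$, and variable names of my own; it is only an illustration, not a solution of the inverse problem.

```python
# Monte Carlo sanity check (my own sketch, not part of the question):
# with v = 1 and w = 3, take f = N(0, w - v) density and g = N(0, w) density,
# so that G_v[f] = g.  Then (1/n) * sum f(X_i + Y_i) should be an unbiased
# estimator of (1/n) * sum g(X_i), with variance at most ||f||_inf^2 / n.
import numpy as np

rng = np.random.default_rng(0)
v, w, n = 1.0, 3.0, 200            # noise variance, target variance, sample size

def gauss_density(x, var):
    return np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

f = lambda x: gauss_density(x, w - v)   # bounded continuous pre-image
g = lambda x: gauss_density(x, w)       # rapidly decreasing target

X = rng.uniform(-5.0, 5.0, size=n)      # arbitrary fixed "signal" values
target = g(X).mean()                    # the quantity we want to recover

# Repeat the noisy experiment many times to estimate bias and variance.
n_rep = 20000
Y = rng.normal(0.0, np.sqrt(v), size=(n_rep, n))
estimates = f(X + Y).mean(axis=1)       # one estimate per repetition

sup_f = f(0.0)                          # ||f||_inf, attained at 0 for a centered Gaussian
print("target              :", target)
print("mean of estimates   :", estimates.mean())   # should be close to target
print("empirical variance  :", estimates.var())
print("bound ||f||_inf^2/n :", sup_f**2 / n)        # should dominate the empirical variance
```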
To study this problem, I read the following paper, which gives a solution when the error is measured in the $L^2$ norm. However, I want to use the supremum norm.
S. Saitoh, Approximate real inversion formulas of the Gaussian convolution, Applicable Analysis, 83:7, 727–733, DOI: 10.1080/00036810410001657198.