Let $\operatorname{ReLU}:\mathbb{R}^1 \to \mathbb{R}^1$ be defined by $\operatorname{ReLU}(t) = \max(0,t)$.
Consider functions $f:\mathbb{R}^p \to \mathbb{R}$ of the form
$$ f(x) = b_0 + \sum_{i=1}^n \alpha_i\operatorname{ReLU}(w_i^\top x + b_i) $$
where $w_i \in \mathbb{R}^p$ and $b_i \in \mathbb{R}$.
Does there exist a non-zero such $f$ with compact support and nonzero integral?
If so, this would be useful for a constructive version of an "unbounded width" Universal Approximation Theorem for neural networks: such an $f$ could be used to build an approximate identity for convolution.
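To spell out the mollifier step (the notation $f_\varepsilon$, $g$ is mine, and I assume such an $f$ exists, normalized so that $\int f = 1$):
$$f_\varepsilon(x) := \varepsilon^{-p} f(x/\varepsilon), \qquad g * f_\varepsilon \to g \text{ uniformly on compact sets as } \varepsilon \to 0$$
for continuous $g$. Note that $f(x/\varepsilon) = b_0 + \sum_{i=1}^n \alpha_i \operatorname{ReLU}\big((w_i/\varepsilon)^\top x + b_i\big)$, so each $f_\varepsilon$ is again of the stated one-hidden-layer form, just with rescaled weights and outputs.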
When $p=1$ this is possible by explicit construction:
$$x \mapsto \operatorname{ReLU}(x+1)+\operatorname{ReLU}(x-1) -2 \operatorname{ReLU}(x),$$
a tent function supported on $[-1,1]$ with integral $1$.
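A quick numerical sanity check of this construction (a sketch using NumPy; the names `relu`, `f`, and the grid are my own choices):

```python
import numpy as np

def relu(t):
    """ReLU(t) = max(0, t), applied elementwise."""
    return np.maximum(0.0, t)

def f(x):
    """The p = 1 example: a tent function built from three ReLUs."""
    return relu(x + 1) + relu(x - 1) - 2 * relu(x)

# Grid containing the breakpoints -1, 0, 1.
xs = np.linspace(-5.0, 5.0, 100001)
dx = xs[1] - xs[0]
ys = f(xs)

# Compact support: f vanishes (up to rounding) outside [-1, 1] ...
assert np.max(np.abs(ys[np.abs(xs) > 1])) < 1e-12

# ... and has nonzero integral: the trapezoid rule is exact for
# piecewise-linear functions whose kinks lie on the grid.
integral = dx * (ys.sum() - 0.5 * (ys[0] + ys[-1]))
print(f"integral = {integral:.6f}")  # integral = 1.000000
```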
I suspect the answer is negative when $p \geq 2$ (probably for trivial reasons), but I have not been able to find a proof.