
Let $\operatorname{ReLU}:\mathbb{R}^1 \to \mathbb{R}^1$ be defined by $\operatorname{ReLU}(t) = \max(0,t)$.

Consider functions $f:\mathbb{R}^p \to \mathbb{R}$ of the form

$$ f(x) = b_0 + \sum_{i=1}^n \alpha_i\operatorname{ReLU}(w_i^\top x + b_i) $$

where $w_i \in \mathbb{R}^p$ and $b_i \in \mathbb{R}$.
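For concreteness, such an $f$ is a one-hidden-layer ReLU network. A minimal sketch in Python (the function and parameter names here are mine, purely for illustration):

```python
def relu(t):
    """ReLU(t) = max(0, t)."""
    return max(0.0, t)

def f(x, weights, biases, alphas, b0):
    """Evaluate f(x) = b0 + sum_i alpha_i * ReLU(<w_i, x> + b_i).

    x: point in R^p (length-p list); weights: n vectors w_i (each length p);
    biases: n scalars b_i; alphas: n scalars alpha_i; b0: scalar.
    """
    total = b0
    for w, b, a in zip(weights, biases, alphas):
        total += a * relu(sum(wi * xi for wi, xi in zip(w, x)) + b)
    return total
```

With $p = 1$, weights $(1,1,1)$, biases $(1,-1,0)$, and coefficients $(1,1,-2)$, this reproduces the compactly supported one-dimensional example given below.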

Is it possible to find a nonzero such $f$ with compact support and nonzero integral?

If so, this would be useful for a constructive version of an "unbounded width" Universal Approximation Theorem for neural networks: we could use such an $f$ to build an approximate identity for convolution.

It is possible when $p = 1$, as shown by the explicit construction

$$x \mapsto \operatorname{ReLU}(x+1)+\operatorname{ReLU}(x-1) -2 \operatorname{ReLU}(x)$$
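This one-dimensional example can be checked numerically. The following sketch verifies that the function is the triangular bump supported on $[-1,1]$ and approximates its integral, whose exact value is $1$:

```python
def relu(t):
    return max(0.0, t)

def tent(x):
    # ReLU(x+1) + ReLU(x-1) - 2*ReLU(x): triangular bump supported on [-1, 1]
    return relu(x + 1.0) + relu(x - 1.0) - 2.0 * relu(x)

# Vanishes outside [-1, 1], peaks at height 1 at the origin.
assert tent(-2.0) == 0.0 and tent(3.0) == 0.0 and tent(0.0) == 1.0

# Midpoint-rule approximation of the integral over [-2, 2]; exact value is 1.
n = 100_000
h = 4.0 / n
integral = h * sum(tent(-2.0 + (k + 0.5) * h) for k in range(n))
assert abs(integral - 1.0) < 1e-6
```

Since the kinks at $-1, 0, 1$ land exactly on grid points, the midpoint rule is exact on each (linear) cell up to floating-point rounding.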

I suspect the answer is negative when $p \geq 2$ (probably for trivial reasons), but I wasn't able to find a proof.

Steven Gubkin

1 Answer


Your suspicion is correct (i.e., this is impossible in higher dimensions), but the proof is not quite straightforward.

See Theorem 6.1 and its proof in this paper: https://arxiv.org/abs/2308.03812

PhoemueX
    Excellent! As a followup question: I imagine that it is possible given a certain number of layers dependent on $p$. Are you aware of a construction? If so, are there any constructions with a minimum number of layers? This would give a constructive universal approximation theorem with bounded width. – Steven Gubkin Nov 26 '24 at 19:14
  • Hmm, I think $\operatorname{ReLU}\left(1 - \displaystyle\sum_{w \in \{-1,1\}^p}\operatorname{ReLU}(w^\top x)\right)$ does the trick, which gives us an arbitrary-width universal approximation theorem for depth 2. – Steven Gubkin Nov 26 '24 at 19:50
  • I draw your attention to the fact that cardinal spline functions are typical examples of what you are looking for. See my recent answer here, where $(...)_+$ is the equivalent of your ReLU function. – Jean Marie Nov 26 '24 at 22:11
  • The example you give at the end is a particular case of a cardinal spline with positive integral. – Jean Marie Nov 26 '24 at 22:17
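The depth-2 formula suggested in the comments can be sanity-checked numerically. For sign vectors $w$ and $-w$ we have $\operatorname{ReLU}(w^\top x) + \operatorname{ReLU}(-w^\top x) = |w^\top x|$, so for $p = 2$ the inner sum equals $|x_1+x_2| + |x_1-x_2| = 2\max(|x_1|,|x_2|)$. A sketch (names are mine):

```python
from itertools import product

def relu(t):
    return max(0.0, t)

def bump(x):
    """ReLU(1 - sum over w in {-1,1}^p of ReLU(w . x)): a two-ReLU-layer bump."""
    inner = sum(relu(sum(wi * xi for wi, xi in zip(w, x)))
                for w in product((-1.0, 1.0), repeat=len(x)))
    return relu(1.0 - inner)

# For p = 2 the inner sum is 2*max(|x1|, |x2|), so bump is positive on the
# open square max(|x1|, |x2|) < 1/2 and vanishes outside it.
assert bump((0.0, 0.0)) == 1.0
assert bump((0.25, 0.25)) == 0.5
assert bump((0.6, 0.0)) == 0.0 and bump((0.5, 0.5)) == 0.0
```

So the function is nonnegative, compactly supported, and has positive integral, using $2^p$ inner units and one outer ReLU.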