I'm looking for a 'differentiable version' of $\max\{0,x\}$. I need it to be everywhere non-negative but there is a reasonable amount of leeway in terms of how good the approximation is. Essentially I need a function $f:\mathbb{R}\to \mathbb{R}^+$ that satisfies the following properties:
- $f(0)=0$,
- $f(x) \geq 0$ $\forall x \in \mathbb{R}$,
- $f(x) \approx x$ for $x > \delta_1$,
- $f(x) \approx 0$ for $x < \delta_2$,
with the $\delta$'s reasonably small.
Ideally, $f$ is just some combination of standard functions; I intend to work with this function computationally, and it needs to be 'finite-differenced'.
Basically I want something a bit like:
$$f(x) = \begin{cases} 0 & \text{for } x < 0 \\ \dfrac{2\delta x^2}{\delta^2 + x^2} & \text{for } 0 \leq x \leq \delta \\ x & \text{for } x > \delta \end{cases}$$
but not quite as contrived and still in terms of standard functions (i.e. functions which can be computed reasonably quickly in general mathematical software).
Edit: The second case has been updated.
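For what it's worth, the piecewise candidate above can be implemented and checked numerically; here is a minimal sketch (NumPy and the choice $\delta = 0.1$ are mine, not part of the question):

```python
import numpy as np

def f(x, delta=0.1):
    # Piecewise candidate from the question:
    #   0                             for x < 0
    #   2*delta*x^2 / (delta^2+x^2)   for 0 <= x <= delta
    #   x                             for x > delta
    x = np.asarray(x, dtype=float)
    mid = 2 * delta * x**2 / (delta**2 + x**2)
    return np.where(x < 0, 0.0, np.where(x <= delta, mid, x))

delta = 0.1
assert f(0.0, delta) == 0.0                              # f(0) = 0
assert np.all(f(np.linspace(-1, 1, 2001), delta) >= 0)   # non-negative everywhere
assert np.isclose(f(delta, delta), delta)                # pieces agree at x = delta

# The middle piece has derivative 4*delta^3*x / (delta^2+x^2)^2, which
# equals 0 at x = 0 and 1 at x = delta, so the function is C^1 and
# central finite differences behave smoothly across the joins.
h = 1e-6
slope = (f(delta + h, delta) - f(delta - h, delta)) / (2 * h)
assert np.isclose(slope, 1.0, atol=1e-4)
```

So, despite feeling contrived, the piecewise form is already $C^1$, which is what matters for finite differencing; the question is whether a single closed-form expression can do the same job.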