
I'm looking for a 'differentiable version' of $\max\{0,x\}$. I need it to be everywhere non-negative but there is a reasonable amount of leeway in terms of how good the approximation is. Essentially I need a function $f:\mathbb{R}\to \mathbb{R}^+$ that satisfies the following properties:

  1. $f(0)=0$,
  2. $f(x) \geq 0$ $\forall x \in \mathbb{R}$,
  3. $f(x) \sim x$ for $x > \delta_1$,
  4. $f(x) \sim 0$ for $x<\delta_2$,

with the $\delta$'s reasonably small.

Ideally, $f$ is just some combination of standard functions; I intend to work with this function computationally, and it needs to be 'finite-differenced'.

Basically I want something a bit like:

$$f(x) = \begin{cases} 0 & \text{for } x < 0 \\ \dfrac{2\delta x^2}{\delta^2 + x^2} & \text{for } 0 \leq x \leq \delta \\ x & \text{for } x > \delta \end{cases}$$

but not quite as contrived and still in terms of standard functions (i.e. functions which can be computed reasonably quickly in general mathematical software).
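For concreteness, the piecewise candidate above can be sketched directly (a sketch only; the function and parameter names are my own choice, not from any library). A quick derivative check shows the middle branch has slope $0$ at $x=0$ and slope $1$ at $x=\delta$, so the pieces actually join $C^1$-smoothly:

```python
import numpy as np

def f_piecewise(x, delta=0.1):
    # Piecewise candidate from the question:
    #   0 for x < 0, 2*delta*x^2/(delta^2 + x^2) on [0, delta], x for x > delta.
    # Both value and first derivative match at the breakpoints 0 and delta.
    x = np.asarray(x, dtype=float)
    mid = 2 * delta * x**2 / (delta**2 + x**2)
    return np.where(x < 0, 0.0, np.where(x <= delta, mid, x))
```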

Edit: The second case has been updated.

Will
  • I would probably change your choice to $\frac{x^{2}}{2\delta}+ \frac{\delta}{2}$ in the middle (so that it's continuous at $\delta$), but other than that I don't think you're going to find a much better approximation that can be computed quickly. You could in principle make this function smooth (by using bump functions), but this would make the computation longer. – preferred_anon Feb 18 '17 at 19:40
  • Oops, right you are. – Will Feb 18 '17 at 19:43
  • This makes it discontinuous at $0$. I'd seen this done before to approximate the absolute value function, and it only came to me while I was writing the question so I wrote it without thinking. – Will Feb 18 '17 at 20:40
  • They used it there as a limiting case with $\delta/2$ subtracted from the $x>\delta$ term. – Will Feb 18 '17 at 20:55
  • You could consider using Moreau-Yosida regularization. Equivalently, you could consider a "one-sided" version of the Huber penalty function, which is a smoothed out version of the absolute value function. – littleO Feb 19 '17 at 01:43
  • Thanks, @littleO. Looks like the Huber penalty is similar to what I'd (mis-)remembered seeing as an approximation to the absolute value. If this is a standard approach I'll maybe go with that over what I have above -- there's no need to keep the slope as 1 for my application. Also, I'll be able to say I've Huberised something which sounds cool. – Will Feb 19 '17 at 02:01
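The one-sided Huber-style smoothing discussed in the comments can be sketched as follows (a sketch under my own naming, not a standard API): quadratic on $[0,\delta]$, then linear with the $\delta/2$ offset that makes the join $C^1$ at $x=\delta$:

```python
import numpy as np

def huber_plus(x, delta=0.1):
    # One-sided Huber-style smoothing of max(0, x):
    #   0 for x <= 0, x^2/(2*delta) on (0, delta], x - delta/2 for x > delta.
    # Value (delta/2) and slope (1) both match at x = delta, so it is C^1.
    x = np.asarray(x, dtype=float)
    return np.where(x <= 0, 0.0,
                    np.where(x <= delta, x**2 / (2 * delta), x - delta / 2))
```

Note that for large $x$ this sits $\delta/2$ below $\max\{0,x\}$, which matches the comment about subtracting $\delta/2$ from the $x>\delta$ term.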

2 Answers


Two options I can see:

  1. (inspired by https://en.wikipedia.org/wiki/Unit_hyperbola): $$f(x) = \frac12(x + \sqrt{\epsilon^2 + x^2})$$
  2. (inspired by https://en.wikipedia.org/wiki/Black%E2%80%93Scholes_model#Black.E2.80.93Scholes_formula): $$f(x) = (x+x_\text{min})\Phi(d_+) - x_\text{min}\Phi(d_-),$$ where $\,\,d_{\pm} := \dfrac{\log(1 + x/x_\text{min}) \pm \frac12\epsilon}{\sqrt\epsilon},\,\,$ for $\,\,x>x_\text{min}$.

Option 1 is both more straightforward and works on all of $\mathbb R$, whereas option 2 seems to converge faster as $\epsilon \downarrow 0$.
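Option 1 is simple enough to sketch in a few lines (function and parameter names are my own). One caveat worth noting: $f(0) = \epsilon/2$ rather than exactly $0$, so the question's property 1 only holds in the limit $\epsilon \downarrow 0$:

```python
import numpy as np

def smooth_relu(x, eps=1e-3):
    # Option 1: (x + sqrt(eps^2 + x^2)) / 2, smooth on all of R.
    # For |x| >> eps this is close to max(0, x); at x = 0 it equals eps/2.
    return 0.5 * (x + np.sqrt(eps**2 + x**2))
```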

311411

Eric

What about $$ f(x) = \begin{cases} 0 & x\leq 0 \\ x e^{-1/x} & x > 0 \end{cases} $$
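This candidate is $C^\infty$ at $0$ (all derivatives of $x e^{-1/x}$ vanish as $x \downarrow 0$), but as written the $1/x$ must be guarded against in code. A minimal sketch, with the guard (naming is my own):

```python
import numpy as np

def f_exp(x):
    # 0 for x <= 0, x * exp(-1/x) for x > 0.
    # The masked branch uses a safe denominator to avoid evaluating 1/0.
    x = np.asarray(x, dtype=float)
    safe = np.where(x > 0, x, 1.0)
    return np.where(x > 0, x * np.exp(-1.0 / safe), 0.0)
```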

  • Thanks, this is the sort of thing I'm after! The $1/x$ might cause some problems for my particular application, but I'll give it a go. – Will Feb 19 '17 at 01:38