0

Motivated by the answer to "Under which conditions does small uniform norm imply 'similarity' of subgradients", I have the following question.

Let $\mathcal{S}$ be a set of proper convex functions from $X$ to $\mathbb{R}$, where $X$ is an open and convex subset of $\mathbb{R}^{n}$. I was wondering under which conditions on $\mathcal{S}$ we have \begin{gather} \forall \epsilon>0\ \exists \delta>0 \text{ such that for } f,h \in \mathcal{S},\ ||f-h||_{\infty} < \delta \\ \Rightarrow \underset{ x \in X}{\sup}\, \underset{ v \in \partial f(x),\, w \in \partial h(x)}{\inf}||v-w||_2 <\epsilon. \end{gather}
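For intuition: when both $f$ and $h$ are differentiable, the subdifferentials are singletons and the quantity above reduces to $\underset{x \in X}{\sup}||\nabla f(x)-\nabla h(x)||_2$. Below is a minimal numerical sketch of how this quantity can be evaluated on a grid in one dimension; the functions $f(x)=x^2$ and $h(x)=x^2+\delta x$ on $X=(-1,1)$ are just illustrative choices, not the functions I actually work with.

```python
import numpy as np

# Illustration on X = (-1, 1) with two smooth convex functions (hypothetical
# choices): f(x) = x^2 and h(x) = x^2 + delta*x.  Since both are
# differentiable, partial f(x) = {f'(x)} and partial h(x) = {h'(x)}, so the
# sup-inf quantity reduces to sup_x |f'(x) - h'(x)|.
delta = 1e-3
xs = np.linspace(-1, 1, 10001)[1:-1]        # grid in the open interval

f  = xs**2
h  = xs**2 + delta * xs
df = 2 * xs                                  # f'(x)
dh = 2 * xs + delta                          # h'(x)

print("||f - h||_inf   (on grid):", np.max(np.abs(f - h)))      # ~ delta
print("sup_x |f' - h'| (on grid):", np.max(np.abs(df - dh)))    # = delta
```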

In particular, I'm more interested in an answer to my 'additional question' below, but an answer to either of my two questions would be fine.

Motivation: Intuitively, the fact that $||f-h||_{\infty}$ is small means that the shapes of the graphs of the functions are similar, and hence their supporting hyperplanes might also be similar. Of course this is just a picture that I have in mind for the 1-dimensional case.

Where the problem comes from: I have a function $F$ that maps convex functions to elements of their subgradients at a given point. I would like to show that if the mapped functions are similar, i.e. $||f-h||_{\infty}$ is sufficiently small, then $||F(h)-F(f)||_{2}$ is small.

Some references: A similar question was asked in "Hausdorff Distance between Subdifferential sets". The problem is that the theory there defines more general types of convergence and analyzes the convergence of the subdifferentials in that framework. I have almost no background in functional analysis, so I would really just like to know under which conditions my type of 'convergence', as defined above, holds.

Regarding the conditions on the set $\mathcal{S}$: The set I'm working with contains positive concave functions (nothing more is assumed), and $X$ is the open unit simplex. Hence, it would be great if we had the implication for negative convex functions on the open unit simplex. If the implication does not hold, can we strengthen the conditions on the set $\mathcal{S}$ so that it does?

Additional question: What if we consider two sets, i.e. the set $\mathcal{S}$ of uniformly bounded and continuously differentiable functions whose gradients are uniformly bounded, and the set $\mathcal{M}$ of piecewise affine functions? It is well known that for every $f$ in the first set there exists a function in the second set such that the supremum norm of the difference is as small as we want. Would we then have the implication for the subgradients of the two functions?
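To make this concrete in one dimension, here is a small numerical sketch (the function $f(x)=e^x$ on $X=(0,1)$ and the equispaced nodes are only illustrative assumptions): take $h$ to be the piecewise affine interpolant of $f$; inside each segment the subdifferential of $h$ is the singleton containing that segment's slope, which by the mean value theorem equals $f'$ at some interior point of the segment.

```python
import numpy as np

# 1-D illustration: f(x) = exp(x) is smooth and convex on X = (0, 1), and h is
# its piecewise affine interpolant on n+1 equally spaced nodes.  For x inside a
# segment, the subdifferential of h is {slope of that segment}; by the mean
# value theorem that slope equals f' somewhere in the segment, so the sup-inf
# quantity is controlled by the modulus of continuity of f'.
n = 50
nodes  = np.linspace(0.0, 1.0, n + 1)
slopes = np.diff(np.exp(nodes)) / np.diff(nodes)   # slope of each affine piece

xs  = (np.arange(10 * n) + 0.5) / (10 * n)         # fine grid avoiding the nodes
h   = np.interp(xs, nodes, np.exp(nodes))          # piecewise affine h
seg = np.searchsorted(nodes, xs) - 1               # segment containing each x

print("||f - h||_inf            :", np.max(np.abs(np.exp(xs) - h)))
print("sup_x inf_w ||f'(x) - w||:", np.max(np.abs(np.exp(xs) - slopes[seg])))
```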

sigmatau
  • 2,722
  • In your final paragraph, do you mean "positive convex functions" instead of "positive concave functions"? – gerw Nov 22 '18 at 07:56
  • No I mean positive concave functions. Substituting $f$ with $-f$ it would be enough to show the implication for negative convex functions. This is what I meant. – sigmatau Nov 22 '18 at 07:59

2 Answers

1

If $\mathcal S$ contains uniformly Lipschitz functions (as holds for the sets $\mathcal S$ and $\mathcal M$ you mentioned in the additional part of your question), then:

If $h$ monotonically approaches $f$, then $\partial h$ approaches $\partial f$ (in terms of the following distance): $$d(\partial h , \partial f) := \underset{ x \in X}{\sup}\, \underset{ v \in \partial f(x),\, w \in \partial h(x)}{\inf}||v-w||_2 $$

In other words, $$h_n \uparrow f \quad \quad \Rightarrow \quad \quad \partial h_n \to \partial f $$

Proof: Assume $h_n \uparrow f$. Since the $h_n$ are uniformly Lipschitz, there exists an $M > 0$ such that $$| h_n(x) - h_n(y)| \leq M \| x -y\| \quad \text{for all } x, y \in X,\ n \in \mathbb{N}.$$ This implies that for all $x \in X$, $n \in \mathbb{N}$, and $v \in \partial h_n (x)$ we have $\| v \| \leq M.$

Now fix $x \in X$, pick $y \in X$, and let $\epsilon > 0$ be arbitrary. Since $h_n \uparrow f $, for large enough $n$ we have $$ h_n(y) \leq f(y), \quad f(x) -\epsilon \leq h_n (x). $$

Pick $v_n \in \partial h_n(x)$; then $$ \langle v_n , y- x\rangle \leq h_n(y) - h_n(x) \leq f(y) - f(x) + \epsilon . $$

Since $\{ v_n \}$ is a bounded sequence, without loss of generality we may assume $v_n \to v$. Letting $ n \to \infty$ above, we get

$$ \langle v , y- x\rangle \leq f(y) - f(x) + \epsilon $$

Now, because $\epsilon > 0$ was arbitrary, letting $\epsilon \to 0$ above we arrive at $$ \langle v , y- x\rangle \leq f(y) - f(x), $$ and since $y \in X$ was arbitrary, this means $v \in \partial f(x)$. In other words, at each fixed $x$, every limit point of the chosen subgradients $v_n \in \partial h_n(x)$ lies in $\partial f(x)$.
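As a small numerical illustration of this at a fixed point (the function $f(x)=x^2$ on $(-1,1)$ and the tangent-line minorants are just an example, not part of the proof): let $h_n$ be the maximum of the tangent lines of $f$ at the nested grids $\{k/2^n\}$; then $h_n \leq f$, $h_n \uparrow f$, and the slope of the active tangent line at a fixed $x$ approaches $f'(x)$.

```python
import numpy as np

# Sketch (hypothetical example): f(x) = x^2 on (-1, 1), and h_n is the maximum
# of the tangent lines of f at the nested grids {k / 2^n : |k| <= 2^n}.  Then
# h_n <= f and h_n increases to f; at a fixed x the (generically unique)
# subgradient of h_n is the slope 2*t of the active tangent line, which
# approaches f'(x) = 2x as the grid is refined.
def tangent_minorant_subgradient(x, n):
    t = np.arange(-2**n, 2**n + 1) / 2**n        # tangent points
    values = t**2 + 2 * t * (x - t)              # tangent lines evaluated at x
    return 2 * t[np.argmax(values)]              # slope of the active line

x = 0.3
for n in (1, 2, 4, 6, 8):
    g = tangent_minorant_subgradient(x, n)
    print(f"n = {n}: subgradient of h_n at x = {g:+.4f}   (f'(x) = {2*x:+.4f})")
```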

Red shoes
  • 7,184
  • Yes! This is what I am looking for. It would be great if you could add a proof. I was not able to write one down for now, but intuitively it makes sense. The picture I have in mind is two tangents to the continuous function: if we let the distance between the tangent points go to zero, then the slopes of the two intersecting tangents should get nearer and nearer to the slope of the continuous function in between those tangent points. But this is just a picture I have in mind... – sigmatau Nov 22 '18 at 17:50
  • 1
    @sigmatau let me know if the proof is not clear for you. – Red shoes Nov 24 '18 at 01:41
  • yes, it is clear. – sigmatau Nov 24 '18 at 08:06
  • I noticed that in your proof the choice of $v_n \in \partial h_n(x)$ is arbitrary. In particular, this implies (for the case of comparing functions from the sets $S$ and $M$) that if $f$ is continuously differentiable, we have for $n$ large enough $\underset{x \in X}{\sup} \underset{ v_n \in \partial h_n(x)}{\sup} || v_n - \nabla f(x)||<\epsilon$, right? Hence here we have a nicer result, where we can use the $\sup$ instead of the $\inf$. – sigmatau Nov 25 '18 at 18:16
  • 1
    @sigmatau Yes, that is right. Also, monotone convergence is not really needed: you only need $h_n \leq f$ or $f \leq h_n$ for all $n$. – Red shoes Nov 25 '18 at 22:40
0

Here is a small example with $X = (-1,1)$, showing that you need some assumptions on $\mathcal S$.

We take $f(x) = |x|$ and $h_\delta(x) = |x - \delta/2|$. Then, $\|f - h_\delta\| = \delta/2 < \delta$, but $$\partial f(\delta/4) = \{1\},\qquad \partial h_\delta(\delta/4) = \{-1\}.$$ Thus, for $\varepsilon < 2$ there is no $\delta > 0$ such that your conclusion holds for the set $$\mathcal S = \{f, h_\delta \mid \delta > 0\}.$$
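A quick numerical check of these values (a small Python sketch; $\delta = 0.1$ and the grid are arbitrary choices):

```python
import numpy as np

# Check of the example: f(x) = |x| and h_delta(x) = |x - delta/2| on
# X = (-1, 1).  The uniform distance is delta/2, yet at x = delta/4 the
# subdifferentials are {+1} and {-1}, so the sup-inf quantity stays >= 2.
delta = 0.1
xs = np.linspace(-1, 1, 100001)[1:-1]

f = np.abs(xs)
h = np.abs(xs - delta / 2)
print("||f - h||_inf:", np.max(np.abs(f - h)))     # ~ delta/2 = 0.05

x0 = delta / 4
print("f'(x0) =", np.sign(x0))                      # +1, since x0 > 0
print("h'(x0) =", np.sign(x0 - delta / 2))          # -1, since x0 < delta/2
```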

gerw
  • 33,373
  • I think you mean $\partial h_{\delta}(\delta / 4)$, right? This is a nice example that shows where the 'weak spot' is, but it is not a complete answer. I also edited the last section of my question. Thank you. – sigmatau Nov 22 '18 at 08:24
  • Of course, it should be $\delta/4$. – gerw Nov 22 '18 at 08:58