
I've been shown that in any Hilbert space, for instance $L^2$, every nonempty closed convex set contains an element of minimal norm.

Now I've noticed that other values of $p$ satisfy this property in $L^p$, even though those spaces aren't Hilbert spaces. Is it true for every $p \in [1, +\infty]$? I don't think it's true for $L^\infty$, but I can't find a way to prove it or any information about it.

  • It is true only for $1<p<\infty$. In fact, in any strictly convex reflexive space there exists a unique element of minimal norm in any closed convex subset. – Evangelopoulos Foivos Oct 23 '24 at 09:16

1 Answer


The property holds exactly in the range $1<p<\infty$. In fact:

  1. Existence of an element of minimal norm for every nonempty closed convex $C \subset X$ holds only in reflexive spaces (see Theorem A), so only in the reflexive range $1<p<\infty$.

  2. Existence and uniqueness of the element of minimal norm holds only in reflexive and strictly convex spaces (see Theorem B), so again only in the range $1<p<\infty$ (see the remark after this list for why $L^p$ has both properties exactly in this range).
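
Why $L^p$ has both properties exactly when $1<p<\infty$ (standard facts, recalled here for context): for $2 \le p < \infty$, Clarkson's first inequality
$$\left\|\frac{f+g}{2}\right\|_p^p + \left\|\frac{f-g}{2}\right\|_p^p \le \frac{\|f\|_p^p + \|g\|_p^p}{2}$$
shows that $L^p$ is uniformly convex (a companion inequality handles $1<p<2$); uniform convexity gives strict convexity directly and reflexivity via the Milman–Pettis theorem. By contrast, $L^1$ and $L^\infty$ over $[0,1]$ are neither strictly convex nor reflexive: e.g. $f = 2\chi_{[0,1/2]}$ and $g = 2\chi_{[1/2,1]}$ are distinct points of the unit sphere of $L^1$ whose whole connecting segment lies on the sphere.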

Notice that the property "Each closed convex $C \subset X$ admits an element of minimal norm" is in fact equivalent to "Each closed convex $C \subset X$ is proximinal", where proximinal means that
$$(\forall x \in X) \, (\exists y_0 \in C) \text{ such that } d(x,C)=\|x-y_0\| \tag{1}$$ where $d(x,C) := \inf_{y \in C} \|x-y\|$. (One direction is trivial by taking $x=0$. For the other direction just consider $C-x$).
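
To spell out the second direction: given $x \in X$ and a nonempty closed convex $C$, the translate $C - x := \{y - x : y \in C\}$ is again closed and convex, so by assumption it contains an element $z_0$ of minimal norm, i.e. $\|z_0\| = \inf_{z \in C-x} \|z\| = \inf_{y \in C} \|y - x\| = d(x,C)$. Then $y_0 := z_0 + x \in C$ satisfies $\|x - y_0\| = \|z_0\| = d(x,C)$, which is exactly (1).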

Theorem A: In a reflexive space $X$, every nonempty closed convex subset is proximinal. In fact, this characterizes reflexive spaces, in the sense that if every nonempty closed convex subset $C \subset X$ is proximinal then $X$ is reflexive.

Here is a proof. Let $C$ be a nonempty, closed, and convex subset of $X$ and let $x \in X$. Let $(y_n)$ be a minimizing sequence in $C$, i.e. $\|x-y_n\| \to d(x,C)$. The sequence $(y_n)$ is bounded, since $\|y_n\| \le \|x-y_n\| + \|x\| \le d(x,C) + 1 + \|x\|$ for all large $n$, so by reflexivity of $X$ we may assume, passing to a subsequence, that $y_n \xrightarrow{w} y_0$ for some $y_0 \in X$. Notice that $y_0 \in C$ since $C$ is weakly closed by Mazur's theorem. By the weak lower semicontinuity of the norm one has that $$ d(x,C) \leq \|x-y_0\| \leq \liminf_{n \to \infty} \|x-y_n\| = d(x,C) .$$ This shows that $C$ is proximinal.
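
For completeness, here is the standard one-line justification of the weak lower semicontinuity used above: if $z_n \xrightarrow{w} z$, then for every $f \in B_{X^*}$ we have $|f(z)| = \lim_n |f(z_n)| \le \liminf_n \|z_n\|$, and taking the supremum over $f \in B_{X^*}$ (Hahn–Banach) gives
$$\|z\| = \sup_{f \in B_{X^*}} |f(z)| \le \liminf_{n \to \infty} \|z_n\|.$$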

To see that this characterizes reflexive spaces, suppose that $X$ is not reflexive. Then by James' theorem there exists a functional $f \in X^*$ with $\|f\| = 1$ that is not norm attaining, that is, there is no $x \in B_X$ such that $|f(x)| = 1$. Note that $\ker f$ is closed and convex and so, by the hypothesis, $\ker f$ is proximinal. Since $f \neq 0$, there exists $x \in X \setminus \ker f$, and by proximinality there exists $v \in \ker f$ such that $\|x-v\| = d(x,\ker f)$. By the standard distance formula for hyperplanes, $$ \|x-v\| = d(x,\ker f) = \frac{|f(x)|}{\|f\|} = |f(x)| .$$ Since $f(x-v) = f(x)$, this implies that $\big| f \big( \frac{x-v}{\|x-v\|} \big) \big| = 1$, which is impossible since $f$ is not norm attaining.
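
To connect this with the question, the following standard examples (not part of the proof above) exhibit the failure concretely at the endpoints. In $L^1[0,1]$, take
$$C = \left\{ f \in L^1[0,1] : \int_0^1 t\,f(t)\,dt = 1 \right\}.$$
For every $f \in C$ one has $1 \le \int_0^1 t\,|f(t)|\,dt \le \|f\|_1$, while $f_n := \big(1-\tfrac{1}{2n}\big)^{-1} n\,\chi_{[1-1/n,\,1]}$ belongs to $C$ with $\|f_n\|_1 = \big(1-\tfrac{1}{2n}\big)^{-1} \to 1$, so $d(0,C) = 1$. But if $\|f\|_1 = 1$ and $\int_0^1 t f = 1$, then $\int_0^1 (1-t)\,|f(t)|\,dt = 0$, forcing $f = 0$ a.e., a contradiction; so $C$ has no element of minimal norm. Similarly, in $\ell^\infty$ the set $\{x \in c_0 : \sum_{n \ge 1} 2^{-n} x_n = 1\}$ is closed and convex in $\ell^\infty$, its norm infimum equals $1$, and the infimum is not attained: attainment would force $x_n = 1$ for all $n$, contradicting $x \in c_0$.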


If one also wants uniqueness of the best approximation point $y_0 \in C$ then $X$ needs to be strictly convex. In fact,

Theorem B: In a reflexive and strictly convex space $X$, every nonempty closed convex subset $C \subset X$ is Chebyshev, meaning that (1) holds and the point $y_0 \in C$ is unique. In fact, this characterizes strictly convex and reflexive spaces, in the sense that if every nonempty closed convex $C \subset X$ is Chebyshev then $X$ is strictly convex and reflexive.

Here is a proof: Let $C \subset X$ be closed and convex and let $x \in X$. Existence of $y_0 \in C$ is already established by the previous theorem. If $x \in C$ then $y_0 = x$ is clearly the unique nearest point, so assume $x \notin C$, i.e. $d(x,C) > 0$. Suppose, for the sake of contradiction, that there exist two distinct points $y_1, y_2 \in C$ such that $d(x,C) = \|x-y_1\| = \|x-y_2\|$. Note that $\frac 12 (y_1+y_2) \in C$ and that $$ \left\| \frac{x-y_1}{2} \right\| = \left\| \frac{x-y_2}{2} \right\| = \tfrac 12 d(x,C) > 0. $$ I claim that $$ \tag{2} \left\| \frac{x-y_1}{2} + \frac{x-y_2}{2} \right\| < \left\| \frac{x-y_1}{2} \right\| + \left\| \frac{x-y_2}{2} \right\|. $$ If not, then we must have equality in (2). But then, by strict convexity, the nonzero vectors satisfy $\frac 12 (x-y_1) = \frac{1}{2} a (x-y_2)$ for some $a>0$. Since $\|x-y_1\| = \|x-y_2\|$ we see that $a=1$ and so $y_1 = y_2$. This is impossible. Hence, \begin{align*} d(x,C) \leq \left\|x - \frac{y_1+y_2}{2} \right\| &= \left\| \frac{x-y_1}{2} + \frac{x-y_2}{2} \right\| < \left\| \frac{x-y_1}{2} \right\| + \left\| \frac{x-y_2}{2} \right\| = d(x,C) \end{align*} which is absurd.
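
For comparison with the Hilbert space case the question starts from: there, uniqueness also follows directly from the parallelogram law. With $y_1, y_2$ as above and $\frac 12 (y_1+y_2) \in C$,
$$\|y_1 - y_2\|^2 = 2\|x-y_1\|^2 + 2\|x-y_2\|^2 - 4\left\|x - \frac{y_1+y_2}{2}\right\|^2 \le 4\,d(x,C)^2 - 4\,d(x,C)^2 = 0,$$
so $y_1 = y_2$.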

To see why the converse holds, note first that if every closed convex $C \subset X$ is Chebyshev then in particular every such $C$ is proximinal, so $X$ is reflexive by Theorem A. For strict convexity, suppose that $X$ is not strictly convex. Then there exist two distinct points $x_1, x_2 \in S_X$ such that the segment $[x_1;x_2] := \{tx_1+(1-t) x_2 \colon 0 \le t \le 1\}$ is contained in $S_X$. This segment is convex and closed (it is the continuous image of the compact interval $[0,1]$), and every one of its elements is an element of minimal norm, since for every $0 \le t \le 1$
$$ \|tx_1 + (1-t) x_2\| = 1 = \inf \{ \|u\| \colon u \in [x_1;x_2]\}. $$
Hence $[x_1;x_2]$ is not Chebyshev, a contradiction.
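
For a concrete instance of this failure (again a standard example, not part of the proof above): in $L^1[0,1]$, the set $C = \{f \in L^1[0,1] : f \ge 0 \text{ a.e.},\ \int_0^1 f = 1\}$ is closed and convex, and every $f \in C$ satisfies $\|f\|_1 = 1 = d(0,C)$, so every element of $C$ has minimal norm; for instance $f_t = 2t\,\chi_{[0,1/2]} + 2(1-t)\,\chi_{[1/2,1]}$, $0 \le t \le 1$, gives a whole segment of minimizers. Likewise, in $L^\infty[0,1]$ the set $C = \{f : f = 1 \text{ a.e. on } [0,\tfrac12]\}$ is closed and convex with $d(0,C) = 1$, and every $f$ with $f = 1$ on $[0,\tfrac12]$ and $\|f\|_\infty \le 1$ is a minimizer.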