
I'm looking for a solution that minimizes $$ f(x) = \|x\|_1 $$ subject to $\|x\|_2 = 1$ and $x_i \in [0,1]$ ($i=1,\dots,K$). The thing is, if I try the Lagrange multiplier method, I get $x_i = 1/\sqrt{K}$ for every $i$. But intuitively the solution must be one of the standard basis elements $(0,\dots,1,\dots,0)$.

How can I derive that? Is the issue that I cannot use the Lagrange multiplier method because the $1$-norm is not differentiable at zero?

  • This isn't a convex optimization problem: the constraint set isn't convex. You have to deal with the points of non-differentiability, and with the fact that, for a differentiable problem, the Lagrange multiplier conditions are necessary (but not sufficient) for a minimizer. – Batman Sep 26 '17 at 05:14
  • I've added a bit on the case where $x_i = \frac{1}{\sqrt{K}}$. – Batman Sep 26 '17 at 05:23
  • If $x_i = \frac{1}{\sqrt{K}}$ for all $i$, then the objective value is $\sqrt{K}$. You can easily do better than that. Note that, because of the second constraint, the objective function can be replaced by $\sum_i x_i$. – Royi Sep 26 '17 at 07:37

1 Answer


You can show that the 2-norm is a lower bound on the 1-norm. To see this, note that $|x_1|^2 + |x_2|^2 + \ldots + |x_K|^2 \leq (|x_1| + |x_2|+ \ldots + |x_K|)^2$ and the left hand side is $\lVert x\rVert_2^2$ and the right hand side is $\lVert x \rVert_1^2$. So, $\lVert x \rVert_2 \leq \lVert x \rVert_1$.
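
In case the middle step is not obvious: expanding the square on the right-hand side produces the left-hand side plus only nonnegative cross terms,
$$\Big(\sum_{i=1}^K |x_i|\Big)^2 \;=\; \sum_{i=1}^K |x_i|^2 \;+\; \sum_{i \neq j} |x_i|\,|x_j| \;\geq\; \sum_{i=1}^K |x_i|^2 .$$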

Now, you know your constraint is $\lVert x \rVert_2=1$, so $f(x) \geq 1$. A standard basis vector is feasible and achieves the above inequality with equality, so the minimum value of $f(x)$ is $1$.
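
Not part of the original answer, but if you want to see the bound in action, here is a minimal numerical sanity check (assuming numpy; the vectors are just illustrative):

```python
import numpy as np

K = 5
rng = np.random.default_rng(0)

# The lower bound ||x||_2 <= ||x||_1 holds for arbitrary vectors.
for _ in range(1000):
    x = rng.uniform(-1, 1, size=K)
    assert np.linalg.norm(x, 2) <= np.linalg.norm(x, 1) + 1e-12

# A standard basis vector is feasible (unit 2-norm, entries in [0, 1])
# and attains the lower bound: f(e_1) = ||e_1||_1 = 1.
e1 = np.zeros(K)
e1[0] = 1.0
print(np.linalg.norm(e1, 2), np.linalg.norm(e1, 1))  # 1.0 1.0
```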


As an aside, you can also prove the inequality $\lVert x\rVert_1 \leq \sqrt{K}\, \lVert x\rVert_2$. Let $v$ be the vector obtained by taking the absolute value of $x$ componentwise and $\mathbf{1}$ the all-ones vector. Then $\lVert x \rVert_1 = \mathbf{1} \cdot v \leq \lVert \mathbf{1} \rVert_2 \lVert v \rVert_2 = \sqrt{K}\, \lVert x \rVert_2$ by the Cauchy–Schwarz inequality. With the constraint $\lVert x \rVert_2=1$, this gives $\lVert x \rVert_1 \leq \sqrt{K}$. The vector you found, with $x_i = \frac{1}{\sqrt{K}}$ for every $i$, meets this upper bound with equality.
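
To make the contrast concrete (again just an illustrative check with numpy, not part of the original argument): on the constraint set, the uniform vector from the Lagrangian computation attains the largest possible 1-norm, while a standard basis vector attains the smallest.

```python
import numpy as np

K = 5

uniform = np.full(K, 1 / np.sqrt(K))   # the stationary point from the Lagrangian
e1 = np.eye(K)[0]                      # a standard basis vector

for x in (uniform, e1):
    assert abs(np.linalg.norm(x, 2) - 1) < 1e-12   # both are feasible
    print(np.linalg.norm(x, 1))        # sqrt(K) ~ 2.236 for uniform, 1.0 for e1
```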

This goes to the comment left above: stationary points of the Lagrangian are necessary (but not sufficient) for extrema of differentiable problems.


A further aside: on a finite-dimensional vector space, all norms are equivalent; that is, you can upper and lower bound norm A by constant multiples of norm B, for arbitrary norms A and B. This is a standard exercise for undergraduate math majors.

Batman
  • Thank you for your answer. If you don't mind, what if we change the 1-norm to the 3/2-norm? In that case it is at least differentiable, but I think Lagrange still does not work for minimization, for exactly the reason you gave (necessary but not sufficient).

    The thing is, my main objective is to find the actual minimizer by gradient descent or something similar. If the real-analysis trick above is the only way to reach the solution, I cannot really optimize the function with a computational algorithm. So that's the problem for me...

    – le4m Sep 26 '17 at 05:38
  • You still don't need to appeal to numerical methods. You should be able to show that $\lVert x \rVert_p$ is a decreasing function of $p$ (see here). So, for any $1 \leq p \leq 2$, $\lVert x \rVert_2 \leq \lVert x \rVert_p$, and the minimum value is still $1$. – Batman Sep 26 '17 at 05:51
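
The comments above ask whether the minimizer could also be found numerically, e.g. for the $3/2$-norm. Below is a minimal sketch of one possible approach, not taken from the answer: projected gradient descent on the objective $\sum_i x_i^p$, projecting onto $\{x \ge 0,\ \lVert x \rVert_2 = 1\}$ (note that $\lVert x \rVert_2 = 1$ together with $x_i \ge 0$ already forces $x_i \le 1$, so this is the whole feasible set). The function names, step size, and iteration count are illustrative assumptions, numpy is assumed, and since the feasible set is nonconvex this is only a heuristic: it can stall at the uniform stationary point, though from a generic start it tends to drift toward a basis vector.

```python
import numpy as np

def project(x):
    """Euclidean projection onto {x >= 0, ||x||_2 = 1}: clip negatives, renormalize."""
    y = np.maximum(x, 0.0)
    n = np.linalg.norm(y)
    if n == 0.0:                       # degenerate case: fall back to e_1
        y = np.zeros_like(x)
        y[0] = 1.0
        return y
    return y / n

def projected_gradient_descent(K=5, p=1.5, step=0.05, iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = project(rng.uniform(size=K))   # random feasible starting point
    for _ in range(iters):
        # gradient of f(x) = sum_i x_i**p on the nonnegative orthant
        # (the small floor only guards against 0**negative if p < 1)
        grad = p * np.power(np.maximum(x, 1e-12), p - 1)
        x = project(x - step * grad)
    return x

x = projected_gradient_descent()
print(np.round(x, 3), np.linalg.norm(x, 1.5))  # typically close to a basis vector, value near 1
```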