
Projection onto the $\ell_2$ norm ball is well known. Let the norm ball be $C = \left\{x \in \mathbb{R}^n: \left\| x - c \right\|_2^2 \leq d^2 \right\}$, where $c \in \mathbb{R}^n$ and $d > 0$. Then the projection onto this set is \begin{align} \Pi_{C} \left ( y \right) = c + \frac{d}{\max \left\{\left\| y - c \right\|_2, d \right\}} \left( y - c \right) \ . \end{align}
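For reference, here is a minimal NumPy sketch of this closed-form projection (the sketch and function name are my own, for illustration only):

```python
import numpy as np

def project_l2_ball(y, c, d):
    """Euclidean projection of y onto {x : ||x - c||_2 <= d}."""
    r = np.linalg.norm(y - c)
    # If y is already inside the ball (r <= d), this returns y unchanged.
    return c + (d / max(r, d)) * (y - c)
```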

How does one obtain the projection onto the weighted $\ell_2$ norm ball, i.e., onto the set $C = \left\{x \in \mathbb{R}^n: \left\| x - c \right\|_W^2 \leq d^2 \right\}$, where $\left\| x - c \right\|_W^2 := (x-c)^T W (x-c)$ and $W$ is a positive definite matrix?


Note: There is an attempt at a weighted $\ell_2$ norm projection in a linked question, but it does not use the same problem definition (or maybe I just can't figure out yet how to adapt that answer).


My partial attempt:

Following Brian's comments, I have made an attempt, but I am stuck and unable to compute the Lagrange multiplier.

The weighted $\ell_2$ projection problem can be expressed as \begin{align} \text{minimize}_{x \in \mathbb{R}^n} \quad & \left\|y-x\right\|_2^2\\ \text{subject to }\quad & \left\|x-c\right\|_W^2 \leq d^2 \end{align}

Forming the Lagrangian: \begin{align*} L\left(x, \lambda\right) &= \left\|y-x\right\|_2^2 + \lambda \left( \left[ (x-c)^T W (x-c) \right]- d^2 \right) . \end{align*}

Then, by the stationarity condition of the KKT conditions (dropping a common factor of $2$), \begin{align*} \frac{\partial L}{\partial x} = 0 \;\Rightarrow\; & -(y - x) + \lambda W (x-c) = 0 \\ \Leftrightarrow\; & x = \left( I + \lambda W \right)^{-1} \left(\lambda W c + y \right). \end{align*}

Now I am stuck and do not know how to obtain $\lambda$ in closed form. I thought about using the complementary slackness condition of the KKT conditions, but it becomes too involved for me. Say $\lambda > 0$; then \begin{align} (x-c)^T W (x-c) = d^2 \ . \end{align} If I plug $x = \left( I + \lambda W \right)^{-1} \left(\lambda W c + y \right)$ into this equation, I do not know how to solve for $\lambda$. If it cannot be found in closed form, how can it be obtained iteratively? And if $W$ is a diagonal matrix, can $\lambda$ then be obtained in closed form?
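One simplification worth noting (added here for clarity, not taken from the linked question): since $\left( I + \lambda W \right) c = c + \lambda W c$, the stationary point satisfies $x(\lambda) - c = \left( I + \lambda W \right)^{-1} (y - c)$, so the active constraint reduces to a single scalar equation in $\lambda$,
\begin{align}
\varphi(\lambda) := (y-c)^T \left( I + \lambda W \right)^{-1} W \left( I + \lambda W \right)^{-1} (y-c) = d^2 \ .
\end{align}
For positive definite $W$ and $y \neq c$, $\varphi$ is strictly decreasing on $\lambda \geq 0$ (in the eigenbasis of $W$ it equals $\sum_i w_i u_i^2 / (1 + \lambda w_i)^2$ with $u$ the rotated, centered data), so when $y$ lies outside the ball the unique root can be found by bisection or Newton's method even if no closed form is apparent.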


Further attempt: If $W$ is a diagonal matrix, then, following the linked question, the $i$th coordinate of $x$ can be written as \begin{align} x_i = \frac{\lambda W_i c_i + y_i}{1 + \lambda W_i} , \end{align} where $W_i$ denotes the $i$th diagonal entry of $W$.

Now, plugging this into the complementary slackness condition gives \begin{align} (x-c)^T W (x-c) &= \|W^{1/2} (x-c)\|^2 = d^2 \\ &\Leftrightarrow \sqrt{W_i} (x_i - c_i) = \pm d \\ &\Leftrightarrow \lambda = \frac{\pm 1}{d \sqrt{W_i}} \left(y_i - c_i \right) - \frac{1}{W_i} \ . \end{align} Is this correct when $W$ is a diagonal matrix?

Question: Would $\lambda$ be different for different coordinates of the vector? Hmm, should it not be a constant for all coordinates?

user550103
  • It's relatively easy to set this up as a constrained optimization problem and apply the Lagrange multiplier method. If your $W$ matrix is diagonal, this simplifies to the case discussed in the question you linked to. Is your $W$ diagonal? If not, then the problem is still relatively easy to solve by iterative schemes. – Brian Borchers Jan 20 '19 at 03:26
  • Thank you, Brian. I have made an attempt based on your suggestion (please see above), but I don't know how to obtain $\lambda$ in closed form even if $W$ is a diagonal matrix. Can you help me solve for $\lambda$? Any reference? – user550103 Jan 20 '19 at 08:05
  • Have you thought about what the matrix is doing to the space? Specifically to the ball? – Royi Jan 24 '19 at 14:19
  • @Royi my impression is that the matrix $W$ gives different weights to the respective coordinates, so the norm ball may look non-spherical depending on the chosen weights. Please correct me if my impression is wrong. – user550103 Jan 24 '19 at 14:38
  • PSD matrices have a special shape in space. I think it might assist you with the solution. – Royi Jan 24 '19 at 14:51
  • Ok, thank you. So, do you have any solution to my question, or an exact reference that has solved such a problem? Thanks. – user550103 Jan 24 '19 at 15:11
  • The Lagrange multiplier $ \lambda $ is a scalar, since the inequality constraint is scalar. – Royi Feb 03 '19 at 04:49
  • @BrianBorchers, do you have a closed-form solution for the diagonal case? It seems to me one needs an iterative method to find $ \lambda $ in the diagonal case as well. As my answer shows, if you have a closed-form solution for the diagonal case, you have a closed-form solution for the PSD case. – Royi Feb 03 '19 at 05:54

1 Answer


The optimization problem is given by:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( x - c \right)}^{T} W \left( x - c \right) \leq {d}^{2} \end{align*}$$

Case I - Diagonal Matrix

In this case the matrix $ W $ is diagonal with $ {w}_{ii} \geq 0 $.

Let's assume we know how to solve this.

Remark: I could find a simple iterative method to find $ \lambda $ for this case, but not a closed-form solution. I would guess a closed form is doable, since one can change coordinates to turn the ellipsoid into a ball and then map back.
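For concreteness, here is a minimal NumPy sketch of one such iterative method (a bisection on $ \lambda $; the function name and tolerance are my own, and this is not necessarily the method used in the MATLAB code linked below):

```python
import numpy as np

def project_diag_weighted_ball(y, w, c, d, tol=1e-12):
    """Projection of y onto {x : sum_i w_i (x_i - c_i)^2 <= d^2} for a
    diagonal weight matrix with entries w_i > 0.

    Uses x_i(lam) = (lam*w_i*c_i + y_i) / (1 + lam*w_i), i.e.
    x_i - c_i = (y_i - c_i) / (1 + lam*w_i), and bisects on lam, since the
    constraint value is strictly decreasing in lam >= 0.
    """
    u = y - c

    def g(lam):  # constraint residual at the stationary point x(lam)
        return np.sum(w * (u / (1.0 + lam * w))**2) - d**2

    if g(0.0) <= 0.0:          # y is already inside the ellipsoid
        return y.copy()
    lo, hi = 0.0, 1.0
    while g(hi) > 0.0:         # grow the upper bracket until it overshoots
        hi *= 2.0
    while hi - lo > tol * (1.0 + hi):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) > 0.0 else (lo, mid)
    lam = 0.5 * (lo + hi)
    return c + u / (1.0 + lam * w)
```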

Case II - Positive Definite Matrix

In this case the matrix $ W $ is a symmetric positive semidefinite (PSD) matrix, $ W \succeq 0 $.

Since $ W $ is a PSD matrix, it can be written via its eigendecomposition as:

$$ W = {P}^{T} D P $$

where $ P $ is a unitary (orthogonal) matrix and $ D $ is a diagonal matrix with $ {d}_{ii} \geq 0 $.

Then one could rewrite the problem as:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( x - c \right)}^{T} {P}^{T} D P \left( x - c \right) \leq {d}^{2} \end{align*}$$

Defining $ v = P x $, $ z = P y $, and $ e = P c $, one can rewrite the problem as:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq {d}^{2} \end{align*}$$

Now, since $ P $ is a unitary matrix, it preserves the $ {L}_{2} $ norm and is invertible, so $ \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} = \frac{1}{2} {\left\| P \left( x - y \right) \right\|}_{2}^{2} = \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} $, and the whole problem becomes:

$$\begin{align*} \arg \min_{v} \quad & \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} \\ \text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq {d}^{2} \end{align*}$$

where $ D $ is a diagonal matrix, so this is exactly Case I, which we assumed we know how to solve. The solution in the original coordinates is then recovered as $ x = {P}^{T} v $.
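To make the reduction concrete, here is a minimal NumPy sketch (my own function names, not the MATLAB code linked below) that rotates into the eigenbasis of $ W $, calls the Case I solver sketched above, and rotates back:

```python
import numpy as np

def project_pd_weighted_ball(y, W, c, d):
    """Projection onto {x : (x - c)^T W (x - c) <= d^2} for symmetric PD W,
    by reduction to the diagonal case via the eigendecomposition W = P^T D P."""
    evals, V = np.linalg.eigh(W)   # W = V diag(evals) V^T, so P = V^T is unitary
    P = V.T
    z, e = P @ y, P @ c            # rotated data: z = P y, e = P c
    v = project_diag_weighted_ball(z, evals, e, d)   # Case I solver from above
    return P.T @ v                 # map back: x = P^T v
```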

Code

I created MATLAB code to solve the general problem.
The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.

Royi
  • This looks good, Royi. I agree that we need to know if a closed form exists for the diagonal case. +1 – user550103 Feb 03 '19 at 12:53
  • As I wrote, it is easy to solve with a few iterations of binary search. The question is whether that is what you want. I hope @BrianBorchers has a closed-form solution for that. – Royi Feb 03 '19 at 16:04
  • Related - https://math.stackexchange.com/questions/3581340. – Royi Apr 26 '20 at 06:21
  • @Royi Would you explain how the iterative technique you wrote obtains the right value of $\lambda$? I'm thinking that you iterate over $\lambda$ until the constraint is closely satisfied. Am I right? Would you write out the details? – NicNic8 Feb 17 '25 at 19:12
  • @NicNic8, You mean solving the diagonal case? Open a question, I'd be happy to answer with code. – Royi Feb 18 '25 at 05:50