
I am trying to minimize a convex function using the steepest descent method. The function is defined over the domain $D = \{(x, y) \in \mathbb{R}^2 : 2x^2 + y^2 < 10\}$.

The gradient descent iterations:

$$\mathbf{x}_{n+1}=\mathbf{x}_n-\gamma \nabla F(\mathbf{x}_n),\ n \ge 0,$$ where $\gamma$ is my constant step size (I tried a diminishing step size and get the same problem).

I easily get an iterate $\mathbf{x}_{n+1} \notin D$, after which the iteration diverges. How should I deal with the iterates going outside the domain? A minimal sketch of the failure is below (my actual $F$ is not shown here; the sketch uses a hypothetical barrier-style objective $F(x,y) = -\log(10 - 2x^2 - y^2) + x$, which is convex and finite only on $D$):
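```python
import numpy as np

# Hypothetical stand-in for F (mine is different but behaves the same):
# F(x, y) = -log(10 - 2x^2 - y^2) + x, convex and finite only on D.
def grad_F(z):
    x, y = z
    s = 10.0 - 2.0 * x**2 - y**2      # positive inside D
    return np.array([4.0 * x / s + 1.0, 2.0 * y / s])

z = np.array([2.0, 0.0])              # start inside D, near the boundary
gamma = 1.0                           # constant step size
for n in range(20):
    z = z - gamma * grad_F(z)
    if 2.0 * z[0]**2 + z[1]**2 >= 10.0:
        # from here on, log of a negative number: NaNs, divergence
        print(f"iterate left D at step {n}: z = {z}")
        break
```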

amaatouq

2 Answers


What about using the gradient projection method: after each gradient step, project the current iterate back onto your constraint set. A minimal sketch in Python (the objective is a made-up quadratic, since the question doesn't give $F$; the projection onto the closed ellipse is computed exactly via its KKT conditions and 1-D root-finding):
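```python
import numpy as np
from scipy.optimize import brentq

# Constraint set: closure of the OP's domain, {(x, y) : 2x^2 + y^2 <= 10}.
A = np.array([2.0, 1.0])  # diagonal of the quadratic form
C = 10.0

def project_onto_ellipse(p):
    """Euclidean projection of p onto {z : sum(A * z**2) <= C}."""
    if np.sum(A * p**2) <= C:
        return p  # already feasible
    # KKT conditions give z_i = p_i / (1 + 2*lam*A_i); find lam > 0
    # putting the projected point exactly on the boundary. The scalar
    # equation is monotone in lam, so a bracketing root-finder works.
    def g(lam):
        z = p / (1.0 + 2.0 * lam * A)
        return np.sum(A * z**2) - C
    hi = 1.0
    while g(hi) > 0:   # expand bracket until the sign changes
        hi *= 2.0
    lam = brentq(g, 0.0, hi)
    return p / (1.0 + 2.0 * lam * A)

def projected_gradient(grad_F, x0, gamma=0.1, n_iter=500):
    """Projected gradient descent: gradient step, then project back."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = project_onto_ellipse(x - gamma * grad_F(x))
    return x

# Toy objective F(x, y) = (x - 3)^2 + (y - 3)^2, whose unconstrained
# minimizer (3, 3) lies outside the ellipse.
grad_F = lambda z: 2.0 * (z - np.array([3.0, 3.0]))
print(projected_gradient(grad_F, x0=[0.0, 0.0]))
```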

Xia

You are minimizing the sum of a smooth function $F$ and a non-smooth function $i_D$ (the indicator function of the domain $D$), for which you can compute the proximal operator; it in fact equals the projection operator onto $D$ ($D$ is an ellipsoid, so this is an easy business).
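Spelled out: with $i_D(\mathbf{z}) = 0$ for $\mathbf{z} \in D$ and $+\infty$ otherwise,

$$\operatorname{prox}_{i_D}(\mathbf{v}) = \arg\min_{\mathbf{z}}\left(i_D(\mathbf{z}) + \tfrac{1}{2}\|\mathbf{z}-\mathbf{v}\|^2\right) = \arg\min_{\mathbf{z} \in D}\tfrac{1}{2}\|\mathbf{z}-\mathbf{v}\|^2 = P_D(\mathbf{v}).$$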

Proximal gradient methods like ISTA, FISTA, etc., which here reduce to projected gradient methods, are your friends. For illustration, a minimal FISTA sketch (my own naming; pass a projection onto $D$ as `prox`, and take $\gamma \le 1/L$ where $L$ is the Lipschitz constant of $\nabla F$):
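```python
import numpy as np

def fista(grad_F, prox, x0, gamma, n_iter=500):
    """FISTA: accelerated forward-backward splitting. With prox equal to
    the projection onto D, this is accelerated projected gradient."""
    x = np.asarray(x0, dtype=float)
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        x_next = prox(y - gamma * grad_F(y))              # gradient + prox step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t**2)) / 2.0  # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # extrapolation
        x, t = x_next, t_next
    return x
```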

You may want to check out my detailed answer to a similar question.

dohmatob
  • I was thinking this too, but the constraints have strict inequality, which I think will be a problem for projected gradient methods. The objective function isn't finite on the boundary of the ellipsoid. – littleO Nov 26 '15 at 10:42
  • Now that the OP has offered the specific function, it is clear that no projection is necessary. – Michael Grant Nov 27 '15 at 16:12