
I am interested in a simple property of the gradient flow $$x'(t) = -\nabla f(x(t)).$$ Under what conditions on $f$ does the gradient flow converge to a stationary point?

In particular, I am interested in the simple case of $f : \mathbb{R}^n \to \mathbb{R}$, where we know that $f$ attains a global minimum somewhere. No convexity is assumed; we allow arbitrarily many local minima and/or multiple global minima, saddle points, etc. It is fine if the conditions on the function include an assumption on the nature of these critical points (e.g. that $f$ has no degenerate saddle points, or something like that). I know that gradient flows are quite general, and PDEs can in many cases be cast as gradient flows on functionals, but I'm not interested in this level of abstraction at the moment — I'm thinking more about nonlinear optimization.
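To make the setting concrete, here is a small numerical sketch (purely illustrative, with a made-up test function): forward-Euler integration of the gradient flow for a nonconvex $f$ with two minima and a saddle, where the trajectory settles at a critical point and $\|\nabla f\| \to 0$ along the way.

```python
import numpy as np

def grad_f(x):
    # Gradient of the (illustrative) nonconvex function
    # f(x) = (x1^2 - 1)^2 + x2^2,
    # which has minima at (±1, 0) and a saddle at the origin.
    return np.array([4.0 * x[0] * (x[0]**2 - 1.0), 2.0 * x[1]])

# Forward-Euler discretization of x'(t) = -grad f(x(t)),
# i.e. plain gradient descent with a small fixed step.
x = np.array([0.3, 1.0])   # initial condition (off the saddle)
h = 1e-3                   # step size
for _ in range(200_000):   # integrate up to t = 200
    x = x - h * grad_f(x)

print(x)                            # settles near the minimizer (1, 0)
print(np.linalg.norm(grad_f(x)))    # gradient norm is essentially 0
```

Of course, a numerical experiment like this only shows convergence for one initial condition; the question is what assumptions on $f$ guarantee it for every (bounded) trajectory.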

I have done some looking around on the internet, but most discussions get into Morse theory, with which I have no experience, so I've had some difficulty understanding the jargon. In addition to an answer or a simple characterization of the relevant functions (if one exists), I would much appreciate a reference where I could read more about this (and, in particular, cite at some point) — it's okay if the reference is somewhat technical, or is on Morse theory. It's very possible that the tools of Morse theory are necessary to answer this type of question.
