I would like to "grok" Slater's condition and other constraint qualification conditions in optimization.
Slater's condition is only one of many different constraint qualifications in the optimization literature. Which one is the most fundamental? Which one tells me "what's really going on"? What is the basic idea at the heart of this?
Also, constraint qualifications appear in both convex and non-convex optimization. Is there a unifying viewpoint that shows it is the same simple, basic idea in all cases?
I'd be interested in any insights or viewpoints that lead to a deeper understanding of constraint qualifications in optimization.
Edit: Here is one possible viewpoint. Buried on p. 223 (chapter 23) of Rockafellar's Convex Analysis, we find the following fundamental and vital fact.
Let $f_1,\ldots,f_m$ be proper convex functions on $\mathbb R^n$, and let $f = f_1 + \cdots + f_m$. If the convex sets $\text{ri}(\text{dom } f_i)$, $i = 1,\ldots,m$, have a point in common, then $$ \partial f(x) = \partial f_1(x) + \cdots + \partial f_m(x). $$ This condition for equality can be weakened slightly if certain of the functions, say $f_1,\ldots,f_k$, are polyhedral: then it is enough if the sets $\text{dom } f_i$, $i = 1,\ldots,k$, and $\text{ri}(\text{dom } f_i)$, $i = k+1,\ldots,m$, have a point in common.
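To see why the overlap hypothesis cannot be dropped, here is a standard one-dimensional example (my own choice of functions, purely for illustration). Take $f_1(x) = -\sqrt{x}$ for $x \ge 0$ (and $+\infty$ otherwise) and $f_2 = I_{(-\infty,0]}$, the indicator of the nonpositive reals. Then $\text{dom } f_1 = [0,\infty)$ and $\text{dom } f_2 = (-\infty,0]$, so $$ \text{ri}(\text{dom } f_1) = (0,\infty), \qquad \text{ri}(\text{dom } f_2) = (-\infty,0) $$ are disjoint: the overlap condition fails. And indeed the sum rule fails at $x = 0$: the sum $f = f_1 + f_2$ is the indicator of $\{0\}$, so $\partial f(0) = \mathbb R$, yet $\partial f_1(0) = \emptyset$ (no finite slope supports $-\sqrt{x}$ at the origin), hence $\partial f_1(0) + \partial f_2(0) = \emptyset \neq \mathbb R$.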
This subdifferential sum rule can be used to derive optimality conditions for various convex optimization problems, including the KKT conditions for convex problems. For example, the optimization problem
\begin{align} \text{minimize} & \quad f(x) \\ \text{subject to } & \quad x \in C \end{align} where $f$ is a closed convex function and $C$ is a closed convex set, is equivalent to the problem $$ \text{minimize} \quad f(x) + I_C(x) $$ where $I_C$ is the indicator function of $C$. The optimality condition for this problem is $$ 0 \in \partial (f + I_C)(x) = \partial f(x) + \partial I_C(x), $$ where $\partial I_C(x)$ is the normal cone $N_C(x)$, but for the equality to be valid the "overlapping relative interiors" condition must be satisfied. So we need the relative interior of $C$ to have a point in common with the relative interior of $\text{dom } f$. This is a "constraint qualification" for the problem of minimizing $f$ subject to the constraint that $x \in C$.
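To make the optimality condition $0 \in \partial f(x^\star) + N_C(x^\star)$ concrete, here is a small numeric sketch. The function $f(x) = (x-2)^2$, the set $C = [-1,1]$, and the step size are my own illustrative choices; the qualification is trivially satisfied here since $\text{dom } f = \mathbb R$.

```python
# Minimize f(x) = (x - 2)^2 over C = [-1, 1] by projected gradient descent,
# then check the optimality condition 0 in grad f(x*) + N_C(x*).
# (Illustrative example; f, C, and the step size are my own choices.)

def grad_f(x):
    return 2.0 * (x - 2.0)

def project(x):
    # Euclidean projection onto C = [-1, 1]
    return max(-1.0, min(1.0, x))

x = 0.0
for _ in range(100):
    x = project(x - 0.2 * grad_f(x))

# At the right endpoint x* = 1 the normal cone is N_C(1) = [0, +inf),
# so 0 in {f'(1)} + N_C(1) means the multiplier lam = -f'(1) must be >= 0.
lam = -grad_f(x)
print(x, lam)  # x* = 1.0, lam = 2.0
```

The iterates hit the boundary of $C$ and stay there, and the negative gradient at $x^\star = 1$ points outward with multiplier $2 \ge 0$, exactly as the subdifferential condition predicts.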
So is this "overlapping relative interiors" condition appearing in the subdifferential sum rule the ultimate, most fundamental constraint qualification?
Can Slater's condition be viewed as a special case of this "overlapping relative interior" condition?
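Here is a partial sanity check in the simplest case (a sketch, assuming $f$ is finite everywhere). Suppose $C = \{x : g_1(x) \le 0, \ldots, g_m(x) \le 0\}$ with each $g_i$ finite convex, hence continuous. Slater's condition provides a point $\bar x$ with $g_i(\bar x) < 0$ for all $i$; by continuity, $$ \bar x \in \text{int } C \subseteq \text{ri } C. $$ If moreover $\text{dom } f = \mathbb R^n$, then $\text{ri}(\text{dom } f) = \mathbb R^n$, so $\bar x$ witnesses the overlapping-relative-interiors condition. So at least in this restricted setting, Slater's condition does imply the condition from the sum rule, though I don't know whether the general case works out as cleanly.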
The "overlapping relative interior" condition apparently has nothing to do with non-convex optimization problems. Is there a unifying viewpoint that applies to both convex and non-convex problems?