
Let $f:[0,1]^N\rightarrow \mathbb{R}$ be a convex function.

I am trying to understand whether it is possible to find an optimal solution to this problem.

What is known about the problem:

  • Evaluating $f$ at any point can be done in polynomial time.
  • Let $x^*$ be the optimal solution; then $x^*_i\in \{0,1\}$ for $i = 1,\dots,N$.
  • It may be assumed that $1\leq \vec{1}\cdot x^*\leq N-1$

With this in mind, can the problem be solved in polynomial time? If so, why exactly? Is there a way to solve this in polynomial time without convex optimization techniques?

  • Why do you think this could be solved in polynomial time? Or if not, why are you asking? – Joseph Camacho Jan 10 '22 at 22:32
  • @JosephCamacho I came across this problem without knowing anything about convex optimization, so I am wondering if this can be done. – RandomAsker889 Jan 10 '22 at 22:34
  • I assume the problem is to minimize the function $f$? If so, then yes, the problem can be solved in polynomial time, as can any convex minimization problem with convex constraints, subject to some regularity conditions. (The guarantee that the solution occurs on the boundary isn't necessary.) See https://en.wikipedia.org/wiki/Convex_optimization. An effective algorithm would probably be projected gradient descent. – Max Jan 11 '22 at 01:52
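The projected gradient descent approach Max mentions can be sketched as follows. Since the original $f$ is unspecified, the objective here is a made-up convex quadratic (squared distance to a hypothetical vertex `target`), chosen only so the minimizer is a known corner of $[0,1]^N$; the function names and step size are illustrative, not part of the question.

```python
import numpy as np

# Hypothetical convex objective on [0, 1]^N whose minimizer is the
# vertex (1, 0, 1); chosen purely for illustration.
target = np.array([1.0, 0.0, 1.0])

def f(x):
    return np.sum((x - target) ** 2)

def grad_f(x):
    return 2.0 * (x - target)

def projected_gradient_descent(x0, step=0.1, iters=200):
    """Minimize f over the box [0, 1]^N: take a gradient step,
    then project back onto the box by coordinate-wise clipping."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad_f(x)
        x = np.clip(x, 0.0, 1.0)  # projection onto [0, 1]^N
    return x

x_opt = projected_gradient_descent(np.full(3, 0.5))
```

The projection onto a box is especially cheap (just clipping each coordinate), which is why this method is a natural fit for $[0,1]^N$; the vertex-optimality guarantee from the question is not needed for convergence.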

1 Answer


No, this can't always be solved in polynomial time. Consider a function that is $0$ everywhere except at the corners of the hypercube that are in your allowed set of optimal solutions. At each such corner, the function takes some arbitrary value. The optimal solution will then be at one of the corners, but you need to evaluate every one of them to find which one is optimal.
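The construction in this answer can be sketched in code. The corner values below are random placeholders standing in for the "arbitrary values" the answer describes; the point is that evaluating $f$ anywhere off the corners returns $0$ and reveals nothing, so all $2^N$ corners must be inspected.

```python
import itertools
import random

N = 4
random.seed(0)

# Arbitrary value at every corner of [0, 1]^N (the answer's construction).
corner_vals = {c: random.random()
               for c in itertools.product((0, 1), repeat=N)}

def f(x):
    """0 everywhere except at the corners, where it is arbitrary."""
    return corner_vals.get(tuple(x), 0.0)

# No query off the corners helps, so finding the optimum amounts to
# exhaustively checking all 2**N corner values.
best = max(corner_vals, key=corner_vals.get)
```

(As the comments below point out, such a function is not convex, which is where the disagreement about this answer lies.)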

  • This answer is incorrect, as the question states that $f$ is convex, whereas the function you have described isn't even continuous (let alone convex). Subject to regularity conditions, convex minimization over a convex set is polynomial time. – Max Jan 11 '22 at 01:53
  • I'm not sure that a convex function has to be continuous. Also, what regularity conditions are you talking about? The original asker didn't mention any. – Joseph Camacho Jan 11 '22 at 03:12
  • Regularity conditions include linear independence of constraint function gradients. Convex functions must be continuous: https://math.stackexchange.com/questions/258511/proof-of-every-convex-function-is-continuous – Max Jan 11 '22 at 10:19
  • The convex function in that answer is defined over an open set. It doesn't have edges or corners like $[0, 1]^N$ has. Note that my function is continuous on the interior of $[0, 1]^N$, just not at the corners. – Joseph Camacho Jan 11 '22 at 14:02
  • A convex function must be continuous everywhere on its domain, whether or not the domain is open or closed. This is easy to see: apply Jensen's inequality to the point of discontinuity and some arbitrary point nearby to produce a contradiction. Minimizing a function that is discontinuous on the boundary of the $N$-hypercube is NP-hard: just assign it arbitrary values at each of the corners; now you have to inspect all $2^N$ of the corners to find the optimum, which is exponential. – Max Jan 12 '22 at 03:13