
How does the computational complexity of the Subset Sum problem depend on the parameter $\alpha(S)$ of the input $(S, t)$, defined as follows?

Considering $S$ under the divisibility partial order, i.e. $s_1 \prec s_2 \iff s_1 \mid s_2$, define

$\qquad\qquad \displaystyle \alpha(S) = \max \{\,|V| \;:\; V\subseteq S,\ V \text{ an antichain}\,\}$

to be the maximum size of any subset of $S$ in which no element divides another.

Subset Sum instances with $\alpha(S)=1$ are solvable in polynomial time. For example, if $S$ contains only powers of two, then $S$ is a chain under divisibility, so $\alpha(S) = 1$, and an instance with set $S$ and any target $t$ is solvable in polynomial time.
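To illustrate the chain case, here is a minimal sketch (function name my own): in a divisibility chain of distinct elements, each element is at least twice the previous one, so the set is superincreasing and the largest-first greedy rule decides Subset Sum exactly.

```python
def chain_subset_sum(S, t):
    """Decide Subset Sum assuming S is a chain under divisibility
    (alpha(S) = 1). Distinct elements of such a chain form a
    superincreasing sequence, so greedy largest-first is exact."""
    remaining = t
    for s in sorted(S, reverse=True):
        if s <= remaining:      # safe to take: all smaller elements sum to less than s
            remaining -= s
    return remaining == 0
```

For instance, with $S=\{1,2,4,8\}$ the greedy finds $11 = 8+2+1$ but correctly rejects $t=16$, which exceeds the total sum $15$.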

In fact, when $\alpha(S)=1$, even the (harder) Knapsack problem is known to be solvable in polynomial time.$\dagger$

Are all Subset Sum instances with, say, $\alpha(S) = O(1)$ solvable in polynomial time?


$\dagger$ Solving sequential knapsack problems by M. Hartmann and T. Olmstead (1993)


Neal Young
Chao Xu

1 Answer


The maximum antichain $\alpha(S)$ can be computed in polynomial time using linear programming, and this actually holds for any partial order $(S,\le)$, not just divisibility. By the way, one can prove by induction that for any finite partially ordered set $(S,\le)$ there exist a finite set $S'\subseteq\mathbb{N}$ and a bijection $f:S\rightarrow S'$ such that for all $s_1,s_2\in S$, $s_1\le s_2 \Leftrightarrow f(s_1) \mid f(s_2)$.
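One concrete way to realize such an embedding (this construction and its names are my own illustration, not necessarily the inductive proof alluded to above): assign a distinct prime to every element and map each element to the product of the primes of its down-set. Then $s_1\le s_2$ iff $f(s_1)\mid f(s_2)$, since $f(s_1)\mid f(s_2)$ forces the prime of $s_1$ to appear in $f(s_2)$, i.e. $s_1$ to lie in the down-set of $s_2$.

```python
def divisibility_embedding(elements, le):
    """Map each poset element to the product of distinct primes assigned
    to its down-set, so that le(x, y) iff f(x) divides f(y)."""
    def gen_primes():
        # trial division; fine for small posets
        n = 2
        while True:
            if all(n % p for p in range(2, int(n ** 0.5) + 1)):
                yield n
            n += 1
    prime_of = dict(zip(elements, gen_primes()))
    f = {}
    for x in elements:
        prod = 1
        for y in elements:
            if le(y, x):            # y is in the down-set of x
                prod *= prime_of[y]
        f[x] = prod
    return f
```

For the poset $a\le c$, $b\le c$ (with $a,b$ incomparable) this yields, e.g., $f(a)=2$, $f(b)=3$, $f(c)=2\cdot 3\cdot 5=30$, and divisibility among the images matches the order exactly.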

Let $\mathcal{C}$ be the set of chains in $S$. Recall that $C$ is a chain iff for all $v,v'\in C$, $v\le v'$ or $v'\le v$.

Now create a boolean variable $x_v$ for each $v\in S$, and a boolean variable $y_C$ for each chain $C\in\mathcal{C}$ (used in the dual below). We can write the following integer program $(P)$ for the maximum antichain: $$ \begin{aligned} \text{max} \quad & \sum_{v\in S} x_v \\ \text{s.t.} \quad & \sum_{v\in C} x_v \le 1 \qquad \forall C\in\mathcal{C}\\ & x_v \in \{0,1\} \qquad\;\; \forall v\in S \end{aligned} $$

and its dual $(D)$:

$$ \begin{aligned} \text{min} \quad & \sum_{C\in \mathcal{C}} y_C \\ \text{s.t.} \quad & \sum_{C \,:\, v\in C} y_C \ge 1 \qquad \forall v\in S\\ & y_C \in \{0,1\} \qquad\;\; \forall C\in\mathcal{C} \end{aligned} $$

The dual $(D)$ is exactly the problem of covering an ordered set with a minimum number of chains. Dilworth's theorem states that

There exists an antichain $A$, and a partition of the order into a family $P$ of chains, such that the number of chains in the partition equals the cardinality of $A$.

which means that the optimal values of these two integer programs match: $Opt(P)=Opt(D)$.
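As a sanity check of $Opt(P)=Opt(D)$, on a tiny poset both quantities can be computed by brute force (the helper names below are my own):

```python
from itertools import combinations

def is_chain(block, le):
    return all(le(u, v) or le(v, u) for u, v in combinations(block, 2))

def is_antichain(block, le):
    return all(not le(u, v) and not le(v, u) for u, v in combinations(block, 2))

def set_partitions(items):
    """Enumerate all partitions of a list into nonempty blocks."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in set_partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [part[i] + [first]] + part[i + 1:]
        yield part + [[first]]

def max_antichain_size(elements, le):
    # Opt(P): largest pairwise-incomparable subset
    return max(len(s) for r in range(1, len(elements) + 1)
               for s in combinations(elements, r) if is_antichain(s, le))

def min_chain_cover_size(elements, le):
    # Opt(D): fewest chains partitioning the poset
    return min(len(p) for p in set_partitions(list(elements))
               if all(is_chain(b, le) for b in p))
```

On $S=\{2,3,4,9\}$ under divisibility, the maximum antichain (e.g. $\{2,3\}$) and the minimum chain cover ($\{2,4\},\{3,9\}$) both have size $2$, as Dilworth's theorem guarantees.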

Let $(P^*)$ (resp. $(D^*)$) be the relaxation of $(P)$ (resp. $(D)$), i.e. the same linear program where each constraint $x_v\in\{0,1\}$ (resp. $y_C\in\{0,1\}$) is replaced by $x_v\in [0,1]$ (resp. $y_C\in [0,1]$), and let $Opt(P^*)$ and $Opt(D^*)$ be their optimal values. Since $\{0,1\}\subseteq [0,1]$ we have $$ Opt(P)\le Opt(P^*) \quad\text{ and }\quad Opt(D^*)\le Opt(D), $$ and the weak duality theorem gives $Opt(P^*)\le Opt(D^*)$. Putting everything together with $Opt(P)=Opt(D)$, the whole chain of inequalities collapses: $$ Opt(P)= Opt(P^*)=Opt(D^*)=Opt(D). $$

Then, using the ellipsoid method, we can compute $Opt(P^*)$ ($=Opt(P)$) in polynomial time. There is an exponential number of constraints, but there exists a polynomial-time separation oracle. Indeed, given a candidate solution $x$, a constraint of $(P^*)$ is violated iff some chain $C$ has total weight $\sum_{v\in C} x_v > 1$; the maximum-weight chain can be found in polynomial time by dynamic programming over the partial order (a longest-path computation in the comparability DAG). If its weight is at most $1$, then $x$ is feasible; otherwise that chain gives a violated constraint.
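A sketch of that separation oracle (names my own): process the elements in a topological order of the partial order and record, for each element, the heaviest chain ending there.

```python
def max_weight_chain(elements, le, x):
    """Weight of the heaviest chain under weights x[v] >= 0: a separation
    oracle for the relaxed LP, since x is feasible iff this value <= 1.
    Ordering by down-set size is a valid topological order of the poset."""
    order = sorted(elements, key=lambda v: sum(1 for u in elements if le(u, v)))
    best = {}
    for v in order:
        # heaviest chain ending at some u strictly below v, if any
        below = [best[u] for u in best if u != v and le(u, v)]
        best[v] = x[v] + (max(below) if below else 0)
    return max(best.values())
```

For example, under divisibility on $\{2,3,4,8\}$ with $x_2=x_4=x_8=0.5$, the chain $2\mid 4\mid 8$ has weight $1.5 > 1$, so this fractional point is cut off.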

Mathieu Mari