
There are of course $n \choose k$ monotone disjunctions, which bounds the VC dimension at $\log_2 {n \choose k}$. I'm wondering if this is bounded by $k \log_2 n$? (Possibly it follows from combinatorial identities.)

More generally, I'm looking at a claim in *An algorithmic theory of learning: Robust concepts and random projection*: "Further, it is NP-hard to learn a disjunction of $k$ variables as a disjunction of fewer than $k \log n$ variables." The author states this without proof; I'm assuming it's well known or obvious, but I'm trying to figure out why, or where it is proved.

djechlin

1 Answer


Yes. This bound follows by a little bit of straightforward manipulation:

$${n \choose k} = {n \times (n-1) \times \dots \times (n-k+1) \over k!} \le n \times n \times \dots \times n = n^k,$$ since each of the $k$ factors in the numerator is at most $n$ and $k! \ge 1$.

Now taking the logarithm of both sides, we see that

$$ \lg {n \choose k} \le \lg(n^k) = k \lg n.$$
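As a quick sanity check, the inequality can be verified numerically for small values of $n$ and $k$ (the ranges below are just illustrative choices, not anything from the question):

```python
# Check that log2(C(n, k)) <= k * log2(n) for a grid of small (n, k) pairs.
from math import comb, log2

for n in range(2, 50):
    for k in range(1, n + 1):
        assert log2(comb(n, k)) <= k * log2(n), (n, k)

print("bound holds for all tested (n, k)")
```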

The bound can also be derived from bounds listed on Wikipedia: it follows from the fact that ${n \choose k} \le n^k/k! \le n^k$. Check out Wikipedia in the future -- it's often helpful!

D.W.