
Support Vector Machines turn linear classification tasks into a linear optimization problem.

$$ \text{minimize } J(\theta,\theta_0) = \frac{1}{n} \sum_{i=1}^{n} \text{HingeLoss}_i(\theta,\theta_0) + \frac{\lambda}{2} \|\theta\|^2 $$
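For concreteness, here is a minimal sketch of evaluating this objective, assuming the common form of the hinge loss, $\max(0,\, 1 - y_i(\theta \cdot x_i + \theta_0))$, and using hypothetical toy data:

```python
import numpy as np

def svm_objective(theta, theta_0, X, y, lam):
    """Regularized hinge-loss objective J(theta, theta_0)."""
    margins = y * (X @ theta + theta_0)        # y_i (theta . x_i + theta_0)
    hinge = np.maximum(0.0, 1.0 - margins)     # per-example hinge loss
    return hinge.mean() + 0.5 * lam * (theta @ theta)

# toy data (hypothetical): one point per class, both classified
# with margin >= 1, so only the regularizer contributes
X = np.array([[2.0, 0.0], [-2.0, 0.0]])
y = np.array([1.0, -1.0])
print(svm_objective(np.array([1.0, 0.0]), 0.0, X, y, lam=0.1))  # → 0.05
```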

My question is: what linear-programming method runs in the background to minimize the objective function $J$? Is it the simplex method?

HelloWorld

1 Answer


The SVM problem (and other related problems) can be described as the minimization/maximization of a quadratic function, so it is a quadratic program rather than a linear program, and the simplex method does not apply.

This can be solved easily with the gradient descent algorithm; however, I recommend the SMO algorithm, since it solves the dual of the SVM problem directly and can also be used for kernelized SVMs.
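As a sketch of the gradient-descent route, the following minimizes the primal hinge-loss objective by (sub)gradient descent on hypothetical toy data (this is an illustration, not the SMO algorithm; the hinge loss is nondifferentiable at margin 1, so a subgradient is used there):

```python
import numpy as np

def svm_subgradient_descent(X, y, lam=0.1, lr=0.01, epochs=200):
    """Minimize (1/n) sum_i hinge_i + (lam/2)||theta||^2 by subgradient descent."""
    n, d = X.shape
    theta = np.zeros(d)
    theta_0 = 0.0
    for _ in range(epochs):
        margins = y * (X @ theta + theta_0)
        active = margins < 1.0                   # examples with nonzero hinge subgradient
        # subgradient of the objective w.r.t. theta and theta_0
        g_theta = -(y[active, None] * X[active]).sum(axis=0) / n + lam * theta
        g_theta0 = -y[active].sum() / n
        theta -= lr * g_theta
        theta_0 -= lr * g_theta0
    return theta, theta_0

# toy linearly separable data (hypothetical)
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
theta, theta_0 = svm_subgradient_descent(X, y)
preds = np.sign(X @ theta + theta_0)
print(preds)  # all four points classified correctly
```

SMO instead works on the dual: it repeatedly picks a pair of Lagrange multipliers and optimizes the quadratic objective over just that pair analytically, which is why it needs no generic gradient-descent machinery and extends naturally to kernels.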

nir shahar