Ingo Steinwart (2002)

On the optimal parameter choice for $\nu$-support vector machines

University of Jena.

We determine the asymptotically optimal choice of the parameter $\nu$ for classifiers of $\nu$-support vector machine ($\nu$-SVM) type, which were introduced by Sch\"olkopf et al. It turns out that $\nu$ should be a close upper estimate of twice the optimal Bayes risk, provided that the classifier uses a so-called universal kernel, such as the Gaussian RBF kernel. Moreover, several experiments show that this result can be used to implement modified cross-validation procedures which both train significantly faster and learn significantly better than standard cross-validation techniques.
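The parameter rule in the abstract can be made concrete on a toy problem. The sketch below (a hypothetical discrete distribution, not an example from the paper) computes the optimal Bayes risk $R^*$ directly from known joint class probabilities and then sets $\nu$ to a close upper estimate of $2R^*$, as the result suggests; the small slack constant is an illustrative assumption.

```python
# Sketch, assuming a hypothetical toy distribution with known joint
# probabilities P(x, y) over three feature values and two classes.
joint = {
    # x: (P(x, y=+1), P(x, y=-1))
    0: (0.40, 0.05),
    1: (0.10, 0.10),
    2: (0.05, 0.30),
}

# Bayes risk R*: at each x the optimal classifier predicts the more
# likely class, so it errs with the smaller of the two masses.
bayes_risk = sum(min(p_pos, p_neg) for p_pos, p_neg in joint.values())

# Per the paper's result, nu should be a close upper estimate of 2 * R*;
# the 0.01 slack is an arbitrary illustrative choice.
nu = 2 * bayes_risk + 0.01

print(round(bayes_risk, 3))
print(round(nu, 3))
```

In practice $R^*$ is unknown, so one would instead scan a small grid of $\nu$ values near a rough estimate of $2R^*$, which is what makes the modified cross-validation procedures in the abstract cheaper than a full grid search.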

