
Y. Lin, G. Wahba, H. Zhang, and Y. Lee (2000)

Statistical Properties and Adaptive Tuning of Support Vector Machines

University of Wisconsin-Madison, Department of Statistics Technical Report 1022.

In this review paper we consider the statistical aspects of support vector machines (SVMs) in the classification context and describe an approach to adaptively tuning the smoothing parameter(s) in the SVM. The relation between the Bayes rule of classification and the SVM is discussed, shedding light on why SVMs work well. This relation also reveals that the misclassification rate of the SVM is closely related to the generalized comparative Kullback-Leibler distance ($GCKL$) proposed in Wahba (1999). The adaptive tuning is based on the generalized approximate cross validation ($GACV$), an easily computable proxy for the $GCKL$. The results are extended to the unbalanced case, where the fraction of members of each class in the training set differs from that in the general population and the costs of the two kinds of misclassification errors are different. The main results in this paper have previously appeared in several places; here we take the opportunity to organize them in one place and note how they fit together and reinforce one another.
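A minimal sketch of the kind of adaptive tuning the abstract describes, under stated assumptions: the paper selects the smoothing parameter by minimizing $GACV$, a computable proxy for $GCKL$; $GACV$ itself is not implemented here, so K-fold cross-validated misclassification rate is used as a stand-in criterion. The synthetic data, the lambda grid, and the RBF kernel choice are illustrative assumptions, not taken from the paper.

# Stand-in for GACV-based tuning: pick the SVM smoothing parameter lambda
# that minimizes an estimated misclassification rate (here, 5-fold CV).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
n = len(y)

# In the regularization formulation of the SVM, the cost parameter C plays
# the role of 1/(n*lambda), so a grid over lambda maps to a grid over C.
lambdas = np.logspace(-6, 0, 13)

best_lam, best_err = None, np.inf
for lam in lambdas:
    clf = SVC(C=1.0 / (n * lam), kernel="rbf", gamma="scale")
    # Cross-validated misclassification rate, a surrogate for the GACV
    # criterion described in the paper.
    err = 1.0 - cross_val_score(clf, X, y, cv=5).mean()
    if err < best_err:
        best_lam, best_err = lam, err

print(f"selected lambda = {best_lam:.2e}, CV misclassification = {best_err:.3f}")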
