G. Wahba, Y. Lin, and H. Zhang (2000)
GACV for Support Vector Machines
In: Advances in Large Margin Classifiers, ed. by A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, pp. 297-311, Cambridge, MA, MIT Press.
This paper views the SVM as a regularization method in a reproducing kernel Hilbert space. The generalized comparative Kullback-Leibler distance (GCKL) is reviewed, and the authors show that the GCKL for the SVM is an upper bound on its expected misclassification rate. They then derive the GACV as an estimate of the GCKL, as a function of certain tunable parameters. Preliminary simulations suggest that the GACV is useful for model selection, since the minimizer of the GACV is a reasonable estimate of the minimizer of the GCKL.
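The bound mentioned above rests on a simple pointwise fact: the SVM hinge loss (1 - yf(x))_+ dominates the 0-1 misclassification indicator [yf(x) <= 0], so its expectation (the GCKL for the SVM) dominates the expected misclassification rate. A minimal numerical sketch, using synthetic data and a toy decision function (neither is from the paper):

```python
import numpy as np

# Hedged illustration: the hinge loss (1 - y*f(x))_+ pointwise dominates
# the 0-1 loss [y*f(x) <= 0]. The data and the decision function f are
# synthetic stand-ins, not the paper's construction.

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=n)
y = np.where(x + 0.5 * rng.normal(size=n) > 0, 1.0, -1.0)  # labels in {-1, +1}
f = 2.0 * x  # toy decision function f(x)

hinge = np.maximum(0.0, 1.0 - y * f)   # SVM hinge loss per example
zero_one = (y * f <= 0).astype(float)  # misclassification indicator

# If y*f <= 0 then hinge >= 1 >= zero_one; if y*f > 0 then zero_one = 0.
assert np.all(hinge >= zero_one)
print(hinge.mean(), zero_one.mean())   # empirical hinge loss >= error rate
```

Taking expectations of the pointwise inequality gives the GCKL-vs-error-rate bound; the GACV then serves as a computable surrogate for the (unobservable) GCKL when tuning the regularization parameters.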