
T. Evgeniou, M. Pontil, and A. Elisseeff (2001)

Leave one out error, stability, and generalization of voting combinations of classifiers

INSEAD Working Paper.

We study the generalization error of voting combinations of learning machines; bagging is a special case. We analyze in detail combinations of kernel machines, such as support vector machines, and present theoretical bounds on their generalization error using leave-one-out error estimates. We also derive novel bounds on the stability of combinations of arbitrary classifiers. These bounds can be used to show formally that, for example, bagging increases the stability of unstable learning machines. As a special case we study the stability and generalization of bagging kernel machines and report experiments validating the theoretical findings.
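As a minimal illustration of the setup the abstract describes, the sketch below implements bagging by majority vote over weak base learners, together with a leave-one-out error estimate of the resulting combination. It is a toy sketch, not the paper's method: it uses 1-D decision stumps (an intentionally unstable learner) rather than kernel machines, and the dataset, function names, and ensemble size are all illustrative assumptions.

```python
import random
from collections import Counter

def train_stump(data):
    # data: list of (x, y) with scalar x and label y in {-1, +1}.
    # Pick the threshold/sign minimizing training error: a weak,
    # unstable base learner (small data changes can move the threshold).
    best = None
    xs = sorted(p[0] for p in data)
    thresholds = [xs[0] - 1] + [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    for t in thresholds:
        for sign in (+1, -1):
            err = sum(1 for x, y in data
                      if sign * (1 if x > t else -1) != y)
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign * (1 if x > t else -1)

def bagged_vote(data, n_machines=25, rng=None):
    # Train each machine on a bootstrap resample; predict by majority
    # vote (n_machines odd, so no ties). This is the "voting
    # combination" whose stability the paper analyzes.
    rng = rng or random.Random(0)
    machines = []
    for _ in range(n_machines):
        boot = [rng.choice(data) for _ in data]
        machines.append(train_stump(boot))
    def predict(x):
        votes = Counter(m(x) for m in machines)
        return votes.most_common(1)[0][0]
    return predict

def leave_one_out_error(data, learner):
    # Retrain on all-but-one point and average the errors on the
    # held-out points: the leave-one-out estimate used in the bounds.
    errors = 0
    for i in range(len(data)):
        held_x, held_y = data[i]
        rest = data[:i] + data[i + 1:]
        f = learner(rest)
        errors += f(held_x) != held_y
    return errors / len(data)
```

On a toy sample such as `[(-3.0, -1), (-2.0, -1), (-1.0, -1), (1.0, 1), (2.0, 1), (3.0, 1)]`, `leave_one_out_error(data, bagged_vote)` retrains the whole voted ensemble once per held-out point; the intuition the paper makes rigorous is that the vote smooths out the instability of the individual stumps.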

