Empirical Inference

Stability and Generalization
We define notions of stability for learning algorithms and show how to use these notions to derive generalization error bounds based on the empirical error and the leave-one-out error. The methods we use can be applied in the regression framework as well as in classification, when the classifier is obtained by thresholding a real-valued function. We study the stability properties of large classes of learning algorithms, such as regularization-based algorithms. In particular, we focus on Hilbert space regularization and Kullback-Leibler regularization. We demonstrate how to apply the results to SVMs for regression and classification.
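As context for the abstract, the following is a minimal LaTeX sketch of the uniform-stability notion and the flavor of bound the paper derives. The notation is our paraphrase (A_S for the function learned from the m-point training set S, S^{\setminus i} for S with the i-th example removed, \ell for a loss bounded by M); the exact constants should be checked against the published theorems.

% Uniform stability, paraphrased from Bousquet & Elisseeff (2002).
% An algorithm A has uniform stability \beta if removing any single
% training point changes the loss at any test point z by at most \beta:
\[
  \forall S,\; \forall i \in \{1,\dots,m\}:\quad
  \sup_{z}\,\bigl|\,\ell(A_S, z) - \ell(A_{S^{\setminus i}}, z)\,\bigr| \;\le\; \beta .
\]
% For such an algorithm, with probability at least 1 - \delta over the
% draw of the training set S, the true risk R is bounded in terms of
% the empirical risk R_emp:
\[
  R(A_S) \;\le\; R_{\mathrm{emp}}(A_S) \;+\; 2\beta
          \;+\; \bigl(4m\beta + M\bigr)\sqrt{\frac{\ln(1/\delta)}{2m}} .
\]

The bound is non-trivial only when \beta decays faster than 1/\sqrt{m}; for Hilbert-space (RKHS) regularization with regularization parameter \lambda, the paper shows that \beta is on the order of 1/(\lambda m), which is what makes the right-hand side converge.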
Author(s): Bousquet, O. and Elisseeff, A.
Journal: Journal of Machine Learning Research
Volume: 2
Pages: 499-526
Year: 2002
Bibtex Type: Article
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik
BibTeX
@article{1439,
  title = {Stability and Generalization},
  author = {Bousquet, O. and Elisseeff, A.},
  journal = {Journal of Machine Learning Research},
  volume = {2},
  pages = {499-526},
  year = {2002},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  slug = {1439},
  abstract = {We define notions of stability for learning algorithms and show how to use these notions to derive generalization error bounds based on the empirical error and the leave-one-out error. The methods we use can be applied in the regression framework as well as in the classification one when the classifier is obtained by thresholding a real-valued function. We study the stability properties of large classes of learning algorithms such as regularization based algorithms. In particular we focus on Hilbert space regularization and Kullback-Leibler regularization. We demonstrate how to apply the results to SVM for regression and classification.}
}