Empirical Inference
Local Rademacher Complexities
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.
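As a rough illustration (not taken from the paper), the empirical Rademacher average of a finite function class can be estimated by Monte Carlo: draw random signs, take the supremum of the signed empirical means over the class, and average over draws. The function names and toy data below are hypothetical:

```python
import random

def empirical_rademacher(function_values, n_rounds=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher average
    R_n(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ],
    where each element of function_values is the tuple
    (f(x_1), ..., f(x_n)) for one f in a finite class F,
    and the sigma_i are i.i.d. uniform +/-1 signs."""
    rng = random.Random(seed)
    n = len(next(iter(function_values)))
    total = 0.0
    for _ in range(n_rounds):
        sigma = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        # Supremum over the (finite) class of the signed empirical mean.
        total += max(sum(s * v for s, v in zip(sigma, vals)) / n
                     for vals in function_values)
    return total / n_rounds

# Two toy "functions" evaluated on n = 8 data points (hypothetical example).
F = [(1.0,) * 8,          # constant function f(x) = 1
     (1.0, -1.0) * 4]     # alternating-sign function
print(empirical_rademacher(F))
```

The *local* averages studied in the paper would additionally restrict the supremum to the subset of functions whose empirical error is small, which is what yields the faster rates.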
| Author(s): | Bartlett, P. and Bousquet, O. and Mendelson, S. |
| Journal: | The Annals of Statistics |
| Volume: | 33 |
| Number (issue): | 4 |
| Pages: | 1497-1537 |
| Year: | 2005 |
| Month: | August |
| BibTeX Type: | Article (article) |
| Language: | en |
| Organization: | Max-Planck-Gesellschaft |
| School: | Biologische Kybernetik |
BibTeX
@article{BartlettBousquetMendelson2005,
title = {Local Rademacher Complexities},
journal = {The Annals of Statistics},
abstract = {We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of complexity. The estimates we establish give optimal rates and are based on a local and empirical version of Rademacher averages, in the sense that the Rademacher averages are computed from the data, on a subset of functions with small empirical error. We present some applications to classification and prediction with convex function classes, and with kernel classes in particular.},
volume = {33},
number = {4},
pages = {1497--1537},
month = aug,
year = {2005},
author = {Bartlett, P. and Bousquet, O. and Mendelson, S.},
}
