In this paper we build on the Multiple Kernel Learning (MKL) framework, and in particular on [1], which generalized it to infinitely many kernels. We rewrite the problem in the standard MKL formulation, which leads to a Semi-Infinite Program, and devise a new algorithm, Infinite Kernel Learning (IKL), to solve it. The IKL algorithm is applicable to both the finite and the infinite case, and we find it to be faster and more stable than SimpleMKL [2]. Furthermore, we present the first large-scale comparison of SVMs to MKL on a variety of benchmark datasets, also including IKL. The results show two things: a) for many datasets there is no benefit in using MKL/IKL instead of the SVM classifier, so the flexibility of using more than one kernel appears to be of no use; b) on some datasets IKL yields massive increases in accuracy over SVM/MKL, owing to the much larger kernel set it can draw on. In those cases parameter selection through cross-validation or MKL is not applicable.
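The combined kernel at the heart of MKL is a convex combination of base kernels; IKL extends the candidate set to a continuously parameterized family. A minimal sketch of the finite combination, assuming Gaussian base kernels (the widths and weights below are illustrative choices, not taken from the paper):

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) base kernel matrix between the rows of X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def combined_kernel(X, Y, gammas, betas):
    """MKL-style combination k(x, y) = sum_m beta_m k_m(x, y),
    with beta_m >= 0 and sum_m beta_m = 1 (convex combination)."""
    betas = np.asarray(betas, dtype=float)
    assert np.all(betas >= 0) and np.isclose(betas.sum(), 1.0)
    return sum(b * rbf_kernel(X, Y, g) for b, g in zip(betas, gammas))

# Toy data; in MKL the weights betas are learned jointly with the SVM,
# while IKL additionally searches over a continuum of gammas.
X = np.random.RandomState(0).randn(5, 2)
K = combined_kernel(X, X, gammas=[0.1, 1.0, 10.0], betas=[0.5, 0.3, 0.2])
```

Since each base kernel is positive semi-definite and the weights are non-negative, the combined matrix remains a valid kernel.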
Author(s): Gehler, P. V. and Nowozin, S.
Published in: Proceedings of the NIPS 2008 Workshop on "Kernel Learning: Automatic Selection of Optimal Kernels"
Pages: 1-4
Year: 2008
Month: December
Bibtex Type: Conference Paper (inproceedings)
Event Name: NIPS 2008 Workshop on "Kernel Learning: Automatic Selection of Optimal Kernels" (LK ASOK'08)
Event Place: Whistler, BC, Canada
Institution: Max Planck Institute for Biological Cybernetics, Tübingen, Germany
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik
BibTeX
@inproceedings{5657,
  title        = {Infinite Kernel Learning},
  booktitle    = {Proceedings of the NIPS 2008 Workshop on "Kernel Learning: Automatic Selection of Optimal Kernels"},
  abstract     = {In this paper we build on the Multiple Kernel Learning (MKL) framework, and in particular on [1], which generalized it to infinitely many kernels. We rewrite the problem in the standard MKL formulation, which leads to a Semi-Infinite Program, and devise a new algorithm, Infinite Kernel Learning (IKL), to solve it. The IKL algorithm is applicable to both the finite and the infinite case, and we find it to be faster and more stable than SimpleMKL [2]. Furthermore, we present the first large-scale comparison of SVMs to MKL on a variety of benchmark datasets, also including IKL. The results show two things: a) for many datasets there is no benefit in using MKL/IKL instead of the SVM classifier, so the flexibility of using more than one kernel appears to be of no use; b) on some datasets IKL yields massive increases in accuracy over SVM/MKL, owing to the much larger kernel set it can draw on. In those cases parameter selection through cross-validation or MKL is not applicable.},
  pages        = {1--4},
  organization = {Max-Planck-Gesellschaft},
  institution  = {Max Planck Institute for Biological Cybernetics, Tübingen, Germany},
  school       = {Biologische Kybernetik},
  month        = dec,
  year         = {2008},
  slug         = {5657},
  author       = {Gehler, P. V. and Nowozin, S.},
  month_numeric = {12}
}