
Maximal Margin Classification for Metric Spaces


Conference Paper


In this article we construct a maximal margin classification algorithm for arbitrary metric spaces. First, we show that the Support Vector Machine (SVM) is a maximal margin algorithm for the class of metric spaces in which the negative squared distance is conditionally positive definite (CPD). This means that the metric space can be embedded isometrically into a Hilbert space, where one performs linear maximal margin separation. We show that the solution depends only on the metric, not on the kernel. Following the framework we develop for the SVM, we construct an algorithm for maximal margin classification in arbitrary metric spaces. The main difference from the SVM is that we no longer embed isometrically into a Hilbert space, but into a Banach space. We further estimate the capacity of the function class involved in this algorithm via Rademacher averages, and we recover an algorithm of Graepel et al. [6].
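The CPD condition mentioned in the abstract can be tested numerically: by Schoenberg's classical result, a metric given as a distance matrix D embeds isometrically into a Hilbert space exactly when the double-centred matrix K = -1/2 · J D² J is positive semidefinite. The sketch below (my own illustration, not code from the paper; the function name `is_cpd_metric` is made up for this example) checks this condition with NumPy:

```python
import numpy as np

def is_cpd_metric(D, tol=1e-9):
    """Check Schoenberg's criterion: the metric with distance matrix D
    embeds isometrically into a Hilbert space iff -D**2 is conditionally
    positive definite, i.e. iff K = -1/2 * J D^2 J is PSD."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centring projector
    K = -0.5 * J @ (D ** 2) @ J           # classical-MDS Gram matrix
    return bool(np.linalg.eigvalsh(K).min() >= -tol)

# Euclidean distances always satisfy the condition ...
X = np.random.default_rng(0).normal(size=(5, 3))
D_euclid = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print(is_cpd_metric(D_euclid))   # True

# ... but a general metric need not: the shortest-path metric of the
# 4-cycle graph has a negative eigenvalue after double-centring.
D_cycle = np.array([[0, 1, 2, 1],
                    [1, 0, 1, 2],
                    [2, 1, 0, 1],
                    [1, 2, 1, 0]], dtype=float)
print(is_cpd_metric(D_cycle))    # False
```

The second example illustrates why the paper moves from Hilbert to Banach space embeddings: metrics failing the CPD test cannot be handled by the standard SVM construction.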

Author(s): Hein, M. and Bousquet, O.
Journal: Learning Theory and Kernel Machines
Pages: 72-86
Year: 2004
Editors: Sch{\"o}lkopf, B. and Warmuth, M. K.
Publisher: Springer

Department(s): Empirical Inference
Bibtex Type: Conference Paper (inproceedings)

DOI: 10.1007/b12006
Event Name: 16th Annual Conference on Computational Learning Theory (COLT/Kernel 2003)
Event Place: Washington, DC, USA

Address: Heidelberg, Germany
Institution: MPI for Biological Cybernetics
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik



@inproceedings{hein2004maximal,
  title = {Maximal Margin Classification for Metric Spaces},
  author = {Hein, M. and Bousquet, O.},
  booktitle = {Learning Theory and Kernel Machines},
  pages = {72--86},
  editor = {Sch{\"o}lkopf, B. and Warmuth, M. K.},
  publisher = {Springer},
  organization = {Max-Planck-Gesellschaft},
  institution = {MPI for Biological Cybernetics},
  school = {Biologische Kybernetik},
  address = {Heidelberg, Germany},
  doi = {10.1007/b12006},
  year = {2004}
}