Empirical Inference Conference Paper 2008

A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery


In recent work, Song et al. (2007) proposed performing clustering by maximizing a Hilbert-Schmidt independence criterion with respect to a predefined cluster structure Y, solving for the partition matrix Π. We extend this approach here to the case where the cluster structure Y is not fixed but is itself a quantity to be optimized, and we use an independence criterion that has been shown to be more sensitive at small sample sizes (the Hilbert-Schmidt Normalized Information Criterion, or HSNIC; Fukumizu et al., 2008). We demonstrate the use of this framework in two scenarios. In the first, we adopt a cluster structure selection approach in which the HSNIC is used to select a structure from several candidates. In the second, we discover structure by directly optimizing Y.
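
The structure-selection scenario in the abstract lends itself to a compact illustration: build a kernel on the data, build a label kernel for each candidate cluster structure Y, score each pair with a normalized dependence measure, and keep the highest-scoring structure. The Python sketch below is a minimal reconstruction under these assumptions, not the authors' implementation; the Gaussian kernel, the regularized trace estimator, the value of eps, and the names rbf_kernel, hsnic, and partition_kernel are all illustrative.

import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF Gram matrix on the rows of X (kernel choice is an assumption).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def centered(K):
    # Center a Gram matrix: H K H with H = I - (1/n) 1 1^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def hsnic(K, L, eps=1e-3):
    # Normalized dependence score between two Gram matrices, in the spirit of
    # the normalized criterion of Fukumizu et al. (2008); the exact estimator
    # form and the regularizer eps are assumptions, not the paper's code.
    n = K.shape[0]
    Kc, Lc = centered(K), centered(L)
    Rk = Kc @ np.linalg.inv(Kc + n * eps * np.eye(n))
    Rl = Lc @ np.linalg.inv(Lc + n * eps * np.eye(n))
    return np.trace(Rk @ Rl)

def partition_kernel(labels):
    # Label kernel for a candidate structure Y: L[i, j] = 1 iff i and j share a cluster.
    labels = np.asarray(labels)
    return (labels[:, None] == labels[None, :]).astype(float)

# Toy data: three well-separated Gaussian blobs of 20 points each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, size=(20, 2)) for c in ([0, 0], [5, 5], [0, 5])])
K = rbf_kernel(X, sigma=2.0)

# Structure selection: score each candidate cluster structure and keep the best.
candidates = {
    "2 clusters": np.repeat([0, 1], [40, 20]),
    "3 clusters": np.repeat([0, 1, 2], 20),
}
for name, y in candidates.items():
    print(name, hsnic(K, partition_kernel(y)))

On this toy data, the three-cluster candidate should typically receive the higher score, mirroring the HSNIC-based candidate selection described above.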

Author(s): Blaschko, MB. and Gretton, A.
Book Title: MLG 2008
Journal: Proceedings of the 6th International Workshop on Mining and Learning with Graphs (MLG 2008)
Pages: 1-3
Year: 2008
Month: July
Bibtex Type: Conference Paper (inproceedings)
Event Name: 6th International Workshop on Mining and Learning with Graphs
Event Place: Helsinki, Finland
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTeX

@inproceedings{5179,
  title = {A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery},
  journal = {Proceedings of the 6th International Workshop on Mining and Learning with Graphs (MLG 2008)},
  booktitle = {MLG 2008},
  abstract = {In recent work, Song et al. (2007) proposed
  to perform clustering by maximizing a Hilbert-Schmidt
  independence criterion with respect to a predefined
  cluster structure Y, by solving for the partition
  matrix {\Pi}. We extend this approach here to the
  case where the cluster structure Y is not fixed, but is
  a quantity to be optimized; and we use an independence
  criterion which has been shown to be more sensitive
  at small sample sizes (the Hilbert-Schmidt Normalized
  Information Criterion, or HSNIC, Fukumizu
  et al., 2008). We demonstrate the use of this framework
  in two scenarios. In the first, we adopt a cluster
  structure selection approach in which the HSNIC is
  used to select a structure from several candidates. In
  the second, we consider the case where we discover
  structure by directly optimizing Y.},
  pages = {1-3},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = jul,
  year = {2008},
  slug = {5179},
  author = {Blaschko, MB. and Gretton, A.},
  month_numeric = {7}
}