The Kernel Mutual Information

2003

Conference Paper

We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information between a discretised approximation of the continuous random variables. We show that Bach and Jordan's kernel generalised variance (KGV) is also an upper bound on the same kernel density estimate, but is looser. Finally, we suggest that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation.

Author(s): Gretton, A. and Herbrich, R. and Smola, A.
Journal: IEEE ICASSP Vol. 4
Pages: 880-883
Year: 2003
Month: April

Department(s): Empirical Inference
Bibtex Type: Conference Paper (inproceedings)

Event Name: IEEE ICASSP
Event Place: Hong Kong

Institution: MPI for Biological Cybernetics
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTeX

@inproceedings{2133,
  title = {The Kernel Mutual Information},
  author = {Gretton, A. and Herbrich, R. and Smola, A.},
  booktitle = {IEEE ICASSP Vol. 4},
  pages = {880--883},
  address = {Hong Kong},
  organization = {Max-Planck-Gesellschaft},
  institution = {MPI for Biological Cybernetics},
  school = {Biologische Kybernetik},
  month = apr,
  year = {2003}
}