
Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory

2010

Article



A recent method for causal discovery is in many cases able to infer whether X causes Y or Y causes X for just two observed variables X and Y. It is based on the observation that there exist (non-Gaussian) joint distributions P(X,Y) for which Y may be written as a function of X up to an additive noise term that is independent of X, while no such model exists from Y to X. Whenever this is the case, one prefers the causal model X → Y. Here we justify this method by showing that the causal hypothesis Y → X is unlikely because it requires a specific tuning between P(Y) and P(X|Y) to generate a distribution that admits an additive noise model from X to Y. To quantify the amount of tuning needed, we derive lower bounds on the algorithmic information shared by P(Y) and P(X|Y). In this way, our justification is consistent with recent approaches to using algorithmic information theory for causal reasoning. We extend this principle to the case where P(X,Y) almost admits an additive noise model. Our results suggest that the above conclusion is more reliable if the complexity of P(Y) is high.
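
The abstract describes the inference rule only at a high level. As a rough illustration of the additive-noise-model procedure it refers to, the following is a minimal sketch, not the authors' implementation: it uses Gaussian-process regression and an HSIC dependence score, which are common choices in the related additive-noise-model literature, and all function names and parameter settings are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def _rbf_gram(z):
    # Gaussian-kernel Gram matrix with a median-heuristic bandwidth.
    d2 = (z[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (np.median(d2[d2 > 0]) + 1e-12))

def hsic(a, b):
    # Biased HSIC estimate; larger values indicate stronger dependence.
    n = len(a)
    h = np.eye(n) - np.ones((n, n)) / n
    return np.trace(h @ _rbf_gram(a) @ h @ _rbf_gram(b)) / n ** 2

def anm_residual_dependence(cause, effect):
    # Regress effect on cause nonparametrically and measure how strongly
    # the residual (the candidate noise term) still depends on the cause.
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(cause.reshape(-1, 1), effect)
    residual = effect - gp.predict(cause.reshape(-1, 1))
    return hsic(cause, residual)

def infer_direction(x, y):
    # Prefer the direction in which an additive noise model fits better,
    # i.e. in which the residuals are closer to independent of the cause.
    if anm_residual_dependence(x, y) < anm_residual_dependence(y, x):
        return "X -> Y"
    return "Y -> X"

# Example: y = x^3 plus uniform (non-Gaussian) noise admits an additive
# noise model from X to Y but not from Y to X.
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 300)
y = x ** 3 + rng.uniform(-1.0, 1.0, 300)
print(infer_direction(x, y))  # expected: X -> Y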

Author(s): Janzing, D. and Steudel, B.
Journal: Open Systems and Information Dynamics
Volume: 17
Number (issue): 2
Pages: 189-212
Year: 2010
Month: June

Department(s): Empirical Inference
Bibtex Type: Article (article)

Digital: 0
DOI: 10.1142/S1230161210000126
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik


BibTeX

@article{6536,
  title = {Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory},
  author = {Janzing, D. and Steudel, B.},
  journal = {Open Systems and Information Dynamics},
  volume = {17},
  number = {2},
  pages = {189--212},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  month = jun,
  year = {2010},
  doi = {10.1142/S1230161210000126},
  month_numeric = {6}
}