Marx, Alexander and Vreeken, Jilles (2019) Telling Cause from Effect by Local and Global Regression.
Text: slope-marx,vreeken-kais-own.pdf (PDF, 1MB)
Abstract
We consider the problem of inferring the causal direction between two univariate numeric random variables X and Y from observational data. This case is especially challenging, as the graph X causes Y is Markov equivalent to the graph Y causes X, and hence it is impossible to determine the correct direction using conditional independence tests. To tackle this problem, we follow an information-theoretic approach based on the algorithmic Markov condition. This postulate states that, in terms of Kolmogorov complexity, the factorization given by the true causal model is the most succinct description of the joint distribution. This means that we can infer that X is a likely cause of Y when we need fewer bits to first transmit the data of X, and then the data of Y as a function of X, than for the inverse direction. That is, in this paper we perform causal inference by compression. To put this notion into practice, we employ the Minimum Description Length principle, and propose a score that determines how many bits we need to transmit the data using a class of regression functions that can model both local and global functional relations. To determine whether an inference, i.e., the difference in compressed sizes, is significant, we propose two analytical significance tests based on the no-hypercompression inequality. Last but not least, we introduce the linear-time SLOPE and SLOPER algorithms, which, as we show through thorough empirical evaluation, outperform the state of the art by a wide margin.
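To illustrate the idea of causal inference by compression sketched in the abstract, the following Python snippet compares the number of bits needed to describe the data as X plus Y-given-X against Y plus X-given-Y, and bounds the significance of the gap via the no-hypercompression inequality (a gain of k bits has probability at most 2^{-k}). It is a minimal sketch under simplifying assumptions: a Gaussian residual code, a small polynomial function class, and a fixed per-parameter cost stand in for the actual SLOPE score and regression class; all function names here are hypothetical, not the authors' implementation.

```python
import numpy as np

def gaussian_code_length(residuals):
    """Bits to encode residuals under a zero-mean Gaussian with fitted variance (sketch)."""
    n = len(residuals)
    var = np.var(residuals) + 1e-12                     # avoid log(0)
    return 0.5 * n * np.log2(2 * np.pi * np.e * var)    # n times the Gaussian entropy in bits

def conditional_code_length(x, y, max_degree=3):
    """Bits for Y given X: best polynomial fit plus residual code (illustrative stand-in)."""
    best = np.inf
    for d in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, d)
        residuals = y - np.polyval(coeffs, x)
        cost = gaussian_code_length(residuals) + 32.0 * (d + 1)  # assumed fixed cost per parameter
        best = min(best, cost)
    return best

def infer_direction(x, y):
    """Compare L(X) + L(Y|X) against L(Y) + L(X|Y); smaller total wins."""
    l_x_to_y = gaussian_code_length(x - x.mean()) + conditional_code_length(x, y)
    l_y_to_x = gaussian_code_length(y - y.mean()) + conditional_code_length(y, x)
    delta = abs(l_x_to_y - l_y_to_x)
    p_bound = 2.0 ** (-delta)   # no-hypercompression inequality: P(gain of k bits) <= 2^{-k}
    direction = "X -> Y" if l_x_to_y < l_y_to_x else "Y -> X"
    return direction, delta, p_bound

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    y = x ** 3 + 0.3 * rng.normal(size=1000)   # Y generated from X
    print(infer_direction(x, y))
```

On data generated as above, the X-to-Y encoding is cheaper, so the sketch infers "X -> Y" with a large bit gap and a correspondingly tiny significance bound; the paper's actual score and tests differ in the regression class and encoding details.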
| Item Type | Article |
|---|---|
| Divisions | Jilles Vreeken (Exploratory Data Analysis) |
| Depositing User | Jilles Vreeken |
| Date Deposited | 07 Jun 2019 06:58 |
| Last Modified | 10 May 2021 11:13 |
| Primary Research Area | NRA5: Empirical & Behavioral Security |
| URI | https://publications.cispa.saarland/id/eprint/2916 |