Marx, Yang and van Leeuwen (2021) Estimating Conditional Mutual Information for Discrete-Continuous Mixtures using Multi-Dimensional Adaptive Histograms.
Text: cmixed-marx,yang,vanleeuwen.pdf (605 kB)
Abstract
Estimating conditional mutual information (CMI) is an essential yet challenging step in many machine learning and data mining tasks. Estimating CMI from data that contains both discrete and continuous variables, or even discrete-continuous mixture variables, is a particularly hard problem. In this paper, we show that CMI for such mixture variables, defined based on the Radon-Nikodym derivative, can be written as a sum of entropies, just like CMI for purely discrete or continuous data. Further, we show that CMI can be consistently estimated for discrete-continuous mixture variables by learning an adaptive histogram model. In practice, we estimate such a model by iteratively discretizing the continuous data points in the mixture variables. To evaluate the performance of our estimator, we benchmark it against state-of-the-art CMI estimators and additionally evaluate it in a causal discovery setting.
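For intuition, the sum-of-entropies claim in the abstract corresponds to the standard decomposition I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z). The sketch below is a minimal plug-in estimate along those lines, assuming a fixed equal-width binning of the continuous columns; it is not the paper's adaptive histogram estimator, which instead learns the discretization iteratively. The function names (`plugin_cmi`, `_discretize`) and the bin count are illustrative assumptions.

```python
from collections import Counter
import math

import numpy as np


def _entropy(cells):
    """Empirical Shannon entropy (in nats) of a sequence of discrete tuples."""
    n = len(cells)
    return -sum((c / n) * math.log(c / n) for c in Counter(cells).values())


def _discretize(column, bins=10):
    """Map a (possibly continuous) column to integer bin indices 0..bins-1."""
    column = np.asarray(column, dtype=float)
    edges = np.linspace(column.min(), column.max(), bins + 1)
    return np.digitize(column, edges[1:-1])


def plugin_cmi(x, y, z, bins=10):
    """Naive plug-in estimate of I(X;Y|Z) via the entropy decomposition
    H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z), after fixed equal-width binning.
    (Illustrative only; the paper's estimator adapts the histogram to the data.)"""
    xd, yd, zd = (_discretize(v, bins) for v in (x, y, z))
    h_xz = _entropy(list(zip(xd, zd)))
    h_yz = _entropy(list(zip(yd, zd)))
    h_xyz = _entropy(list(zip(xd, yd, zd)))
    h_z = _entropy(list(zd))
    return h_xz + h_yz - h_xyz - h_z


if __name__ == "__main__":
    # Toy check: X and Y are conditionally independent given Z,
    # so the true CMI is 0 and the estimate should be small.
    rng = np.random.default_rng(0)
    z = rng.normal(size=2000)
    x = z + 0.1 * rng.normal(size=2000)
    y = z + 0.1 * rng.normal(size=2000)
    print(plugin_cmi(x, y, z))
```

The fixed equal-width binning is exactly the part the paper replaces with a multi-dimensional adaptive histogram learned from the data.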
Item Type: | Conference or Workshop Item (Paper)
---|---
Divisions: | Jilles Vreeken (Exploratory Data Analysis)
Conference: | SDM (SIAM International Conference on Data Mining)
Depositing User: | Jilles Vreeken
Date Deposited: | 28 Mar 2022 11:57
Last Modified: | 28 Mar 2022 11:57
Primary Research Area: | NRA5: Empirical & Behavioral Security
URI: | https://publications.cispa.saarland/id/eprint/3594