(2022) Efficiently Factorizing Boolean Matrices using Proximal Gradient Descent.
Abstract
Addressing the interpretability problem of NMF on Boolean data, Boolean Matrix Factorization (BMF) uses Boolean algebra to decompose the input into low-rank Boolean factor matrices. These matrices are highly interpretable and very useful in practice, but they come at the high computational cost of solving an NP-hard combinatorial optimization problem. To reduce the computational burden, we propose to relax BMF continuously using a novel elastic-binary regularizer, from which we derive a proximal gradient algorithm. Through an extensive set of experiments, we demonstrate that our method works well in practice: On synthetic data, we show that it converges quickly, recovers the ground truth precisely, and estimates the simulated rank exactly. On real-world data, we improve upon the state of the art in recall, loss, and runtime, and a case study from the medical domain confirms that our results are easily interpretable and semantically meaningful.
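The approach described in the abstract (relaxing BMF continuously and optimizing with proximal gradient descent) can be illustrated with a minimal sketch. The elastic-binary regularizer itself is not specified on this page, so the sketch substitutes the simplest possible prox, a projection onto [0, 1] (i.e., projected gradient descent), and rounds the relaxed factors to Boolean at the end; all dimensions, step sizes, and variable names are illustrative, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Boolean data with a planted rank-3 structure.
k = 3
U_true = rng.random((60, k)) < 0.4
V_true = rng.random((k, 40)) < 0.4
# Boolean matrix product (OR of ANDs), realized via 0/1 matmul + threshold.
X = ((U_true.astype(float) @ V_true.astype(float)) > 0).astype(float)

# Continuous relaxation: minimize 0.5 * ||X - U V||_F^2 with U, V in [0, 1].
m, n = X.shape
U = rng.random((m, k))
V = rng.random((k, n))

for _ in range(300):
    # Proximal gradient step on U: gradient step with a Lipschitz step size,
    # then prox = projection onto the box [0, 1].
    R = U @ V - X
    L_u = np.linalg.norm(V, 2) ** 2 + 1e-9   # spectral norm squared bounds the gradient Lipschitz constant
    U = np.clip(U - (R @ V.T) / L_u, 0.0, 1.0)
    # Same proximal step for V, using the updated U.
    R = U @ V - X
    L_v = np.linalg.norm(U, 2) ** 2 + 1e-9
    V = np.clip(V - (U.T @ R) / L_v, 0.0, 1.0)

# Round the relaxed factors back to Boolean and measure reconstruction error.
U_b, V_b = U > 0.5, V > 0.5
X_hat = ((U_b.astype(float) @ V_b.astype(float)) > 0).astype(float)
err = np.abs(X - X_hat).mean()
print(f"Boolean reconstruction error: {err:.3f}")
```

The per-block Lipschitz step sizes make each update a guaranteed-descent proximal gradient step on the relaxed objective; the paper's elastic-binary regularizer would replace the plain box projection with a prox that actively pushes entries toward {0, 1}.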
Item Type: | Conference or Workshop Item (Paper) |
---|---|
Conference: | Conference on Neural Information Processing Systems (NeurIPS) |
Depositing User: | Sebastian Dalleiger |
Date Deposited: | 25 Nov 2022 14:06 |
Last Modified: | 25 Nov 2022 14:06 |
Primary Research Area: | NRA1: Trustworthy Information Processing |
URI: | https://publications.cispa.saarland/id/eprint/3885 |