Teaching Temporal Logics to Neural Networks

Hahn, Christopher and Schmitt, Frederik and Kreber, Jens U. and Rabe, Markus Norman and Finkbeiner, Bernd
(2021) Teaching Temporal Logics to Neural Networks.
In: The Ninth International Conference on Learning Representations.
Conference: ICLR (International Conference on Learning Representations)

Teaching Temporal Logics to Neural Networks.pdf

Official URL: https://openreview.net/forum?id=dOcQK-f4byz


We study two fundamental questions in neuro-symbolic computing: can deep learning tackle challenging problems in logics end-to-end, and can neural networks learn the semantics of logics? In this work we focus on linear-time temporal logic (LTL), as it is widely used in verification. We train a Transformer to directly predict a solution, i.e., a trace, for a given LTL formula. The training data is generated with classical solvers, which, however, provide only one of many possible solutions to each formula. We demonstrate that it is sufficient to train on those particular solutions, and that Transformers can predict solutions even for formulas from benchmarks in the literature on which the classical solver timed out. Transformers also generalize to the semantics of the logics: while they often deviate from the solutions found by the classical solvers, they still predict correct solutions to most formulas.
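Since every satisfiable LTL formula has an ultimately periodic ("lasso") model, a predicted trace can be written as a finite prefix followed by a cycle that repeats forever, and its correctness can be checked directly against the semantics of the formula rather than against the solver's particular output. The sketch below is not the authors' code; the tuple encoding of formulas and the helper function `holds` are assumptions made for illustration only. It shows what such a semantic check involves, which is the notion of correctness under which a prediction that deviates from the training target can still count as a correct solution.

```python
# Minimal, self-contained sketch (assumed encoding, not the paper's implementation):
# LTL formulas as nested tuples, traces as symbolic lassos (prefix, cycle),
# and a semantic check that a trace satisfies a formula.

def holds(phi, prefix, cycle, i=0):
    """Does the infinite word prefix . cycle^omega satisfy phi at position i?"""
    n, m = len(prefix), len(cycle)
    i = i if i < n else n + (i - n) % m                    # normalize into prefix + one cycle
    at = lambda j: (prefix + cycle)[j if j < n else n + (j - n) % m]
    reach = range(i, n + m) if i < n else range(n, n + m)  # distinct suffix positions from i

    op = phi[0]
    if op == "ap": return phi[1] in at(i)                          # atomic proposition
    if op == "!":  return not holds(phi[1], prefix, cycle, i)
    if op == "&":  return all(holds(p, prefix, cycle, i) for p in phi[1:])
    if op == "|":  return any(holds(p, prefix, cycle, i) for p in phi[1:])
    if op == "X":  return holds(phi[1], prefix, cycle, i + 1)      # next
    if op == "G":  return all(holds(phi[1], prefix, cycle, j) for j in reach)
    if op == "F":  return any(holds(phi[1], prefix, cycle, j) for j in reach)
    if op == "U":                                                  # until: a witness, if any, lies within two extra cycles
        for j in range(i, n + 2 * m):
            if holds(phi[2], prefix, cycle, j):
                return True
            if not holds(phi[1], prefix, cycle, j):
                return False
        return False
    raise ValueError(f"unknown operator: {op}")


# Example: the formula (a U b) & G F a and the lasso trace {a}{a}({a,b})^omega.
phi = ("&", ("U", ("ap", "a"), ("ap", "b")), ("G", ("F", ("ap", "a"))))
assert holds(phi, [{"a"}, {"a"}], [{"a", "b"}])   # one valid solution a model could emit
assert not holds(phi, [], [set()])                # the empty-labelled lasso is not a solution
```

Any trace passing such a check is a correct solution, so a model's output need not coincide with the one solution the classical solver produced for training.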

