NATURALPROOFS: Mathematical Theorem Proving in Natural Language

Sean Welleck, Jiacheng Liu, Ronan Le Bras, Hannaneh Hajishirzi, Yejin Choi, Kyunghyun Cho

Research output: Contribution to journal › Conference article › peer-review

Abstract

Understanding and creating mathematics using natural mathematical language – the mixture of symbolic and natural language used by humans – is a challenging and important problem for driving progress in machine learning. As a step in this direction, we develop NATURALPROOFS, a multi-domain corpus of mathematical statements and their proofs, written in natural mathematical language. NATURALPROOFS unifies broad coverage, deep coverage, and low-resource mathematical sources, allowing for evaluating both in-distribution and zero-shot generalization. Using NATURALPROOFS, we benchmark strong neural methods on mathematical reference retrieval and generation tasks which test a system’s ability to determine key results that appear in a proof. Large-scale sequence models show promise compared to classical information retrieval methods, yet their performance and out-of-domain generalization leave substantial room for improvement. NATURALPROOFS opens many avenues for research on challenging mathematical tasks.
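To make the reference retrieval task concrete, below is a minimal sketch of a classical information retrieval baseline of the kind the abstract compares against: given a theorem statement, rank candidate references by textual similarity and check whether the references used in its proof appear near the top. The data format, field names, and example statements are simplified assumptions for illustration, not the released NATURALPROOFS schema.

```python
# Sketch of the reference retrieval task: rank references for a theorem by
# TF-IDF cosine similarity, then measure whether the gold references are ranked first.
# Corpus contents and the gold-reference set below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical reference pool: (title, statement) pairs.
references = [
    ("Pythagorean Theorem", "In a right triangle, a^2 + b^2 = c^2."),
    ("Triangle Inequality", "For any triangle, |a - b| < c < a + b."),
    ("Law of Cosines", "c^2 = a^2 + b^2 - 2ab cos(C)."),
]

# A theorem statement whose proof cites some of the references above.
theorem = "The distance between two points satisfies d(x, z) <= d(x, y) + d(y, z)."
gold_refs = {"Triangle Inequality"}  # references actually used in the proof (hypothetical)

# Classical IR baseline: TF-IDF vectors + cosine similarity.
vectorizer = TfidfVectorizer()
ref_matrix = vectorizer.fit_transform(stmt for _, stmt in references)
query_vec = vectorizer.transform([theorem])
scores = cosine_similarity(query_vec, ref_matrix)[0]

# Rank references by score and report recall@1 for this single example.
ranked = sorted(zip((title for title, _ in references), scores),
                key=lambda pair: pair[1], reverse=True)
print("Ranking:", ranked)
print("Recall@1:", int(ranked[0][0] in gold_refs))
```

A neural retriever would replace the TF-IDF scoring step with learned theorem and reference encoders, which is the comparison the abstract describes.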

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
