PhysNLU: A Language Resource for Evaluating Natural Language Understanding and Explanation Coherence in Physics

Jordan Meadows, Zili Zhou, André Freitas


Abstract
In order for language models to aid physics research, they must first encode representations of mathematical and natural language discourse which lead to coherent explanations, with correct ordering and relevance of statements. We present a collection of datasets developed to evaluate the performance of language models in this regard, which measure capabilities with respect to sentence ordering, position, section prediction, and discourse coherence. Analysis of the data reveals the classes of arguments and sub-disciplines which are most common in physics discourse, as well as the sentence-level frequency of equations and expressions. We present baselines that demonstrate how contemporary language models are challenged by coherence related tasks in physics, even when trained on mathematical natural language objectives.
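The abstract mentions sentence ordering as one of the evaluated capabilities. A standard metric for that task is Kendall's tau over the predicted versus gold sentence order; the sketch below is illustrative only and does not reflect the actual PhysNLU data schema or the paper's evaluation code.

```python
# Hypothetical sketch of a sentence-ordering evaluation of the kind
# described in the abstract: score a model's predicted sentence order
# against the gold order with Kendall's tau (a common metric for this
# task). The data format here is a toy illustration, not PhysNLU's.

def kendall_tau(gold, pred):
    """Kendall's tau between two orderings of the same items (no ties)."""
    n = len(gold)
    pos = {item: i for i, item in enumerate(gold)}
    ranks = [pos[item] for item in pred]
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            if ranks[i] < ranks[j]:
                concordant += 1
            else:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Toy example: a four-sentence explanation with one adjacent swap.
gold = ["s0", "s1", "s2", "s3"]
pred = ["s0", "s2", "s1", "s3"]
print(kendall_tau(gold, pred))  # fewer inversions -> tau closer to 1
```

A perfectly ordered prediction scores 1.0, a fully reversed one scores -1.0, so the metric directly reflects how many pairwise orderings the model got right.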
Anthology ID:
2022.lrec-1.492
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Note:
Pages:
4611–4619
URL:
https://aclanthology.org/2022.lrec-1.492
Cite (ACL):
Jordan Meadows, Zili Zhou, and André Freitas. 2022. PhysNLU: A Language Resource for Evaluating Natural Language Understanding and Explanation Coherence in Physics. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 4611–4619, Marseille, France. European Language Resources Association.
Cite (Informal):
PhysNLU: A Language Resource for Evaluating Natural Language Understanding and Explanation Coherence in Physics (Meadows et al., LREC 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.lrec-1.492.pdf
Code
jmeadows17/physnlu
Data
PhysNLU
WikiText-103
WikiText-2