SETI: Systematicity Evaluation of Textual Inference

Xiyan Fu, Anette Frank
Abstract
We propose SETI (Systematicity Evaluation of Textual Inference), a novel and comprehensive benchmark for evaluating the systematicity capabilities of pre-trained language models (PLMs) in the domain of textual inference. Specifically, SETI offers three different NLI tasks and corresponding datasets that evaluate distinct types of systematicity in reasoning processes. To solve these tasks, models must perform compositional inference based on known primitive constituents. We conduct experiments with SETI on six widely used PLMs. Results show that PLMs perform well on unseen compositional inferences when they have encountered the knowledge of how to combine primitives, but that they are considerably limited when this knowledge is unknown to the model (a drop of 40-100 percentage points). Furthermore, we find that PLMs improve dramatically once exposed to crucial compositional knowledge in only a few shots. These findings position SETI as the first benchmark for measuring the future progress of PLMs in achieving systematicity generalization in textual inference.
Anthology ID:
2023.findings-acl.252
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4101–4114
URL:
https://aclanthology.org/2023.findings-acl.252
DOI:
10.18653/v1/2023.findings-acl.252
Cite (ACL):
Xiyan Fu and Anette Frank. 2023. SETI: Systematicity Evaluation of Textual Inference. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4101–4114, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
SETI: Systematicity Evaluation of Textual Inference (Fu & Frank, Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.findings-acl.252.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-2/2023.findings-acl.252.mp4