When Truth Matters - Addressing Pragmatic Categories in Natural Language Inference (NLI) by Large Language Models (LLMs)

Reto Gubelmann, Aikaterini-Lida Kalouli, Christina Niklaus, Siegfried Handschuh


Abstract
In this paper, we focus on the ability of large language models (LLMs) to accommodate different pragmatic sentence types, such as questions, commands, and sentence fragments, in natural language inference (NLI). On the commonly used notion of logical inference, nothing can be inferred from a question, an order, or an incomprehensible sentence fragment. We find that MNLI, arguably the most important NLI dataset, and hence models fine-tuned on it, are insensitive to this fact. Using a symbolic semantic parser, we develop and make publicly available fine-tuning datasets designed specifically to address this issue, with promising results. We also make a first exploration of ChatGPT’s concept of entailment.
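
To make the issue concrete, the following is a minimal sketch of how one might probe an MNLI-fine-tuned model with a question as premise. It assumes the HuggingFace transformers library and the public roberta-large-mnli checkpoint, which is an illustrative stand-in rather than the authors' exact setup. Since a question has no truth value, the classically defensible label here would be neutral.

# Minimal sketch: probe an MNLI-fine-tuned model with a question premise.
# Assumes the HuggingFace `transformers` library and the public
# `roberta-large-mnli` checkpoint (illustrative; not the authors' setup).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-large-mnli")
model = AutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")
model.eval()

# A question has no truth value, so on the classical notion of inference
# nothing follows from it; "neutral" would be the defensible prediction.
premise = "Did the cat sit on the mat?"
hypothesis = "The cat sat on the mat."

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Label order for roberta-large-mnli: 0=contradiction, 1=neutral, 2=entailment.
pred = model.config.id2label[int(logits.argmax(dim=-1))]
print(pred)

If such a model returns ENTAILMENT for a question premise, it illustrates the insensitivity to pragmatic sentence types that the paper reports for MNLI-fine-tuned models.
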
Anthology ID:
2023.starsem-1.4
Volume:
Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Alexis Palmer, Jose Camacho-Collados
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
24–39
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.starsem-1.4/
DOI:
10.18653/v1/2023.starsem-1.4
Cite (ACL):
Reto Gubelmann, Aikaterini-Lida Kalouli, Christina Niklaus, and Siegfried Handschuh. 2023. When Truth Matters - Addressing Pragmatic Categories in Natural Language Inference (NLI) by Large Language Models (LLMs). In Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023), pages 24–39, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
When Truth Matters - Addressing Pragmatic Categories in Natural Language Inference (NLI) by Large Language Models (LLMs) (Gubelmann et al., *SEM 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.starsem-1.4.pdf
Dataset:
2023.starsem-1.4.dataset.zip