Exploring Factual Entailment with NLI: A News Media Study

Guy Mor-Lan, Effi Levi


Abstract
We explore the relationship between factuality and Natural Language Inference (NLI) by introducing FactRel – a novel annotation scheme that models factual rather than textual entailment – and using it to annotate a dataset of naturally occurring sentences from news articles. Our analysis shows that 84% of factually supporting pairs and 63% of factually undermining pairs do not amount to NLI entailment or contradiction, respectively, suggesting that factual relationships are more apt for analyzing media discourse. We experiment with models for pairwise classification on the new dataset and find that, in some cases, generating synthetic data with GPT-4 on the basis of the annotated dataset can improve performance. Surprisingly, few-shot learning with GPT-4 yields strong results, on par with medium-sized LMs (DeBERTa) trained on the labelled dataset. We hypothesize that these results indicate the fundamental dependence of this task on both world knowledge and advanced reasoning abilities.
Anthology ID:
2024.starsem-1.15
Volume:
Proceedings of the 13th Joint Conference on Lexical and Computational Semantics (*SEM 2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Danushka Bollegala, Vered Shwartz
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
190–199
URL:
https://aclanthology.org/2024.starsem-1.15
Cite (ACL):
Guy Mor-Lan and Effi Levi. 2024. Exploring Factual Entailment with NLI: A News Media Study. In Proceedings of the 13th Joint Conference on Lexical and Computational Semantics (*SEM 2024), pages 190–199, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Exploring Factual Entailment with NLI: A News Media Study (Mor-Lan & Levi, *SEM 2024)
PDF:
https://preview.aclanthology.org/jeptaln-2024-ingestion/2024.starsem-1.15.pdf