Exploring Transitivity in Neural NLI Models through Veridicality

Hitomi Yanaka, Koji Mineshima, Kentaro Inui


Abstract
Despite the recent success of deep neural networks in natural language processing, the extent to which they can demonstrate human-like generalization capacities for natural language understanding remains unclear. We explore this issue in the domain of natural language inference (NLI), focusing on the transitivity of inference relations, a fundamental property for systematically drawing inferences. A model capturing transitivity can compose basic inference patterns and draw new inferences. We introduce an analysis method using synthetic and naturalistic NLI datasets involving clause-embedding verbs to evaluate whether models can perform transitivity inferences composed of veridical inferences and arbitrary inference types. We find that current NLI models do not perform consistently well on transitivity inference tasks, suggesting that they lack the generalization capacity for drawing composite inferences from provided training examples. The data and code for our analysis are publicly available at https://github.com/verypluming/transitivity.
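The transitivity property the abstract describes can be illustrated with a toy sketch (not the paper's code): a veridical clause-embedding verb such as "know" licenses an entailment from "X knows that S" to "S", and composing that step with a second inference relation yields a new, composite inference. The composition rules below are a deliberate simplification for illustration only.

```python
# Illustrative sketch of transitivity composition of inference relations,
# simplified for this example; it is NOT the evaluation code from the paper.

def compose(rel1: str, rel2: str) -> str:
    """Compose two inference relations P -> H1 and H1 -> H2.

    Simplified rules: entailment composed with entailment is entailment;
    entailment composed with contradiction is contradiction; every other
    combination is conservatively treated as neutral here.
    """
    if rel1 == "entailment" and rel2 == "entailment":
        return "entailment"
    if rel1 == "entailment" and rel2 == "contradiction":
        return "contradiction"
    return "neutral"

# "know" is veridical: "Ann knows that it rained" entails "it rained".
veridical_step = "entailment"
# A second, arbitrary inference step: "it rained" entails "it rained or snowed".
second_step = "entailment"

# By transitivity, the composed inference should also be an entailment;
# the paper tests whether NLI models draw such composite inferences.
print(compose(veridical_step, second_step))
```

A model that captures transitivity should label the composed premise–hypothesis pair consistently with this composition, which is what the paper's synthetic and naturalistic probes evaluate.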
Anthology ID:
2021.eacl-main.78
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
920–934
URL:
https://aclanthology.org/2021.eacl-main.78
DOI:
10.18653/v1/2021.eacl-main.78
Bibkey:
Cite (ACL):
Hitomi Yanaka, Koji Mineshima, and Kentaro Inui. 2021. Exploring Transitivity in Neural NLI Models through Veridicality. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 920–934, Online. Association for Computational Linguistics.
Cite (Informal):
Exploring Transitivity in Neural NLI Models through Veridicality (Yanaka et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.78.pdf
Code
 verypluming/transitivity
Data
MultiNLI
SICK