Abstract
Lexical inference in context (LIiC) is the task of recognizing textual entailment between two very similar sentences, i.e., sentences that only differ in one expression. It can therefore be seen as a variant of the natural language inference task that is focused on lexical semantics. We formulate and evaluate the first approaches based on pretrained language models (LMs) for this task: (i) a few-shot NLI classifier, (ii) a relation induction approach based on handcrafted patterns expressing the semantics of lexical inference, and (iii) a variant of (ii) with patterns that were automatically extracted from a corpus. All our approaches outperform the previous state of the art, showing the potential of pretrained LMs for LIiC. In an extensive analysis, we investigate factors of success and failure of our three approaches.
- Anthology ID:
- 2021.eacl-main.108
- Volume:
- Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
- Month:
- April
- Year:
- 2021
- Address:
- Online
- Editors:
- Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
- Venue:
- EACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1267–1280
- URL:
- https://preview.aclanthology.org/add_missing_videos/2021.eacl-main.108/
- DOI:
- 10.18653/v1/2021.eacl-main.108
- Cite (ACL):
- Martin Schmitt and Hinrich Schütze. 2021. Language Models for Lexical Inference in Context. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1267–1280, Online. Association for Computational Linguistics.
- Cite (Informal):
- Language Models for Lexical Inference in Context (Schmitt & Schütze, EACL 2021)
- PDF:
- https://preview.aclanthology.org/add_missing_videos/2021.eacl-main.108.pdf
- Code
- mnschmit/lm-lexical-inference
- Data
- SherLIiC