Reading Comprehension as Natural Language Inference: A Semantic Analysis
Anshuman Mishra, Dhruvesh Patel, Aparna Vijayakumar, Xiang Li, Pavan Kapanipathi, Kartik Talamadupula
Abstract
In the recent past, Natural Language Inference (NLI) has gained significant attention, particularly given its promise for downstream NLP tasks. However, its true impact on these tasks remains limited and has not been well studied. In this paper, we therefore explore the utility of NLI for one of the most prominent downstream tasks, viz. Question Answering (QA). We transform one of the largest available Machine Reading Comprehension (MRC) datasets, RACE, into an NLI form, and compare the performance of a state-of-the-art model (RoBERTa) on both forms. We propose new characterizations of questions and evaluate the performance of QA and NLI models on these categories. We highlight clear categories of questions for which the model performs better when the data is presented in a coherent entailment form, and in a structured question-answer concatenation form, respectively.
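The abstract does not spell out the exact MRC-to-NLI conversion, but a minimal sketch of one common scheme for RACE-style examples is shown below: the passage becomes the premise, and each question-option pair becomes a hypothesis, with the correct option labeled as entailment. The field names (`passage`, `question`, `options`, `answer`) and the `mrc_to_nli` helper are hypothetical, chosen for illustration; the paper's actual transformation may differ.

```python
# Sketch of an MRC -> NLI conversion for a RACE-style example.
# Assumes a dict with hypothetical keys: passage, question, options, answer.

def mrc_to_nli(example):
    """Turn a (passage, question, options) MRC example into NLI pairs.

    Each answer option yields one (premise, hypothesis, label) triple:
    the passage is the premise, and the question combined with the
    option forms the hypothesis. The correct option is labeled
    'entailment'; all others 'not_entailment'.
    """
    premise = example["passage"]
    pairs = []
    for i, option in enumerate(example["options"]):
        if "_" in example["question"]:
            # Cloze-style question: fill the blank with the option.
            hypothesis = example["question"].replace("_", option)
        else:
            # Regular question: concatenate question and option.
            hypothesis = f'{example["question"]} {option}'
        label = "entailment" if i == example["answer"] else "not_entailment"
        pairs.append({"premise": premise,
                      "hypothesis": hypothesis,
                      "label": label})
    return pairs


if __name__ == "__main__":
    example = {
        "passage": "The store opens at 9 a.m. every day except Sunday.",
        "question": "The store opens at _ on Monday.",
        "options": ["8 a.m.", "9 a.m.", "10 a.m.", "noon"],
        "answer": 1,  # index of the correct option
    }
    for pair in mrc_to_nli(example):
        print(pair["label"], "|", pair["hypothesis"])
```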
- Anthology ID:
- 2020.starsem-1.2
- Volume:
- Proceedings of the Ninth Joint Conference on Lexical and Computational Semantics
- Month:
- December
- Year:
- 2020
- Address:
- Barcelona, Spain (Online)
- Editors:
- Iryna Gurevych, Marianna Apidianaki, Manaal Faruqui
- Venue:
- *SEM
- SIG:
- SIGLEX
- Publisher:
- Association for Computational Linguistics
- Pages:
- 12–19
- URL:
- https://aclanthology.org/2020.starsem-1.2
- Cite (ACL):
- Anshuman Mishra, Dhruvesh Patel, Aparna Vijayakumar, Xiang Li, Pavan Kapanipathi, and Kartik Talamadupula. 2020. Reading Comprehension as Natural Language Inference: A Semantic Analysis. In Proceedings of the Ninth Joint Conference on Lexical and Computational Semantics, pages 12–19, Barcelona, Spain (Online). Association for Computational Linguistics.
- Cite (Informal):
- Reading Comprehension as Natural Language Inference: A Semantic Analysis (Mishra et al., *SEM 2020)
- PDF:
- https://preview.aclanthology.org/ingest-acl-2023-videos/2020.starsem-1.2.pdf
- Data
- MultiNLI, SNLI