Multi-choice Relational Reasoning for Machine Reading Comprehension

Wuya Chen, Xiaojun Quan, Chunyu Kit, Zhengcheng Min, Jiahai Wang
Abstract
This paper presents our study of cloze-style reading comprehension that imitates human reading comprehension, which typically involves tactical comparison and reasoning over candidates while choosing the best answer. We propose a multi-choice relational reasoning (McR2) model that aims to enable relational reasoning over candidates based on fused representations of document, query and candidates. For these fused representations, we develop an efficient encoding architecture that integrates bidirectional attention flow, self-attention and document-gated query reading. Comparison and inference over candidates are then carried out by a novel relational reasoning network. We conduct extensive experiments on four datasets derived from two public corpora, Children’s Book Test and Who-did-What, to verify the validity and advantages of our model. The results show that it significantly outperforms all baseline models on the four benchmark datasets. The effectiveness of its key components is also validated by an ablation study.
Anthology ID:
2020.coling-main.567
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6448–6458
URL:
https://aclanthology.org/2020.coling-main.567
DOI:
10.18653/v1/2020.coling-main.567
Cite (ACL):
Wuya Chen, Xiaojun Quan, Chunyu Kit, Zhengcheng Min, and Jiahai Wang. 2020. Multi-choice Relational Reasoning for Machine Reading Comprehension. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6448–6458, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Multi-choice Relational Reasoning for Machine Reading Comprehension (Chen et al., COLING 2020)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2020.coling-main.567.pdf
Data
CBT