Recurrent Chunking Mechanisms for Long-Text Machine Reading Comprehension

Hongyu Gong, Yelong Shen, Dian Yu, Jianshu Chen, Dong Yu


Abstract
In this paper, we study machine reading comprehension (MRC) on long texts, where a model takes as input a lengthy document and a query and extracts a text span from the document as the answer. State-of-the-art models (e.g., BERT) tend to use a stack of transformer layers pre-trained on large unlabeled corpora to encode the joint contextual information of the query and the document. However, these transformer models can only take a fixed-length (e.g., 512 tokens) input. To deal with longer inputs, previous approaches usually chunk them into equally-spaced segments and predict answers from each segment independently, without considering information from other segments. As a result, they may form segments that fail to cover the complete answer or that retain insufficient context around the correct answer. Moreover, they are less capable of answering questions that require cross-segment information. We propose to let a model learn to chunk in a more flexible way via reinforcement learning: the model decides the next segment it wants to process in either direction. We also apply recurrent mechanisms to let information flow across segments. Experiments on three MRC tasks – CoQA, QuAC, and TriviaQA – demonstrate the effectiveness of our proposed recurrent chunking mechanisms: we obtain segments that are more likely to contain complete answers and, at the same time, provide sufficient context around the ground-truth answers for better predictions.
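The abstract combines two ideas that are easiest to see in code: a policy that chooses the stride to the next segment (trained with reinforcement learning), and a recurrent state that carries information across segments. The following is a minimal, hypothetical PyTorch sketch of that reading loop. The stride values, module names, and the `encode_fn` callable (standing in for a BERT encoder with a span-prediction head) are illustrative assumptions, not the authors' implementation; see the linked HongyuGong/RCM-Question-Answering repository for the actual code.

```python
import torch
import torch.nn as nn

# Illustrative set of moves (in tokens) the policy can take to reach
# the next chunk; negative values move the window backward.
STRIDE_ACTIONS = [-32, 64, 128, 256]

class ChunkPolicy(nn.Module):
    """Scores stride actions from the current segment representation,
    maintaining a recurrent state across segments."""
    def __init__(self, hidden_size=768, num_actions=len(STRIDE_ACTIONS)):
        super().__init__()
        self.recur = nn.GRUCell(hidden_size, hidden_size)  # cross-segment memory
        self.scorer = nn.Linear(hidden_size, num_actions)

    def forward(self, seg_repr, state):
        # seg_repr: (batch, hidden) pooled encoding of the current chunk
        state = self.recur(seg_repr, state)   # fold this chunk into the state
        action_logits = self.scorer(state)    # one score per stride action
        return action_logits, state

def read_document(encode_fn, doc_tokens, query_tokens, policy,
                  max_len=512, max_steps=8):
    """Slide a fixed-size window over the document, letting the policy
    choose each stride; collect per-chunk answer predictions."""
    hidden = policy.scorer.in_features
    state = torch.zeros(1, hidden)
    start, predictions = 0, []
    for _ in range(max_steps):
        chunk = doc_tokens[start:start + max_len - len(query_tokens)]
        # encode_fn is a hypothetical stand-in for a transformer encoder
        # plus span head; it returns a pooled (1, hidden) representation
        # and a span prediction for this chunk.
        seg_repr, span_pred = encode_fn(query_tokens, chunk)
        predictions.append((start, span_pred))
        logits, state = policy(seg_repr, state)
        # During training the stride would be sampled and optimized with
        # policy gradients; here we take the greedy action for simplicity.
        stride = STRIDE_ACTIONS[logits.argmax(dim=-1).item()]
        start = max(0, start + stride)
        if start >= len(doc_tokens):
            break
    return predictions
```

Per the abstract, the policy is trained with reinforcement learning rather than the greedy argmax shown above, so the model can learn to land windows that fully contain the answer with enough surrounding context, and the GRU-style recurrence is what lets evidence from earlier segments inform later predictions.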
Anthology ID:
2020.acl-main.603
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6751–6761
URL:
https://aclanthology.org/2020.acl-main.603
DOI:
10.18653/v1/2020.acl-main.603
Cite (ACL):
Hongyu Gong, Yelong Shen, Dian Yu, Jianshu Chen, and Dong Yu. 2020. Recurrent Chunking Mechanisms for Long-Text Machine Reading Comprehension. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6751–6761, Online. Association for Computational Linguistics.
Cite (Informal):
Recurrent Chunking Mechanisms for Long-Text Machine Reading Comprehension (Gong et al., ACL 2020)
PDF:
https://preview.aclanthology.org/fix-dup-bibkey/2020.acl-main.603.pdf
Video:
http://slideslive.com/38928937
Code:
HongyuGong/RCM-Question-Answering
Data:
CoQA, QuAC, TriviaQA