A Multi-Stage Memory Augmented Neural Network for Machine Reading Comprehension

Seunghak Yu, Sathish Reddy Indurthi, Seohyun Back, Haejun Lee

Abstract
Reading Comprehension (RC) of text is one of the fundamental tasks in natural language processing. In recent years, several end-to-end neural network models have been proposed to solve RC tasks. However, most of these models struggle to reason over long documents. In this work, we propose a novel Memory Augmented Machine Comprehension Network (MAMCN) to address long-range dependencies present in machine reading comprehension. We perform extensive experiments to evaluate the proposed method on the well-known benchmark datasets SQuAD, QUASAR-T, and TriviaQA. We achieve state-of-the-art performance on both the document-level (QUASAR-T, TriviaQA) and paragraph-level (SQuAD) datasets compared to all previously published approaches.
Anthology ID:
W18-2603
Volume:
Proceedings of the Workshop on Machine Reading for Question Answering
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Eunsol Choi, Minjoon Seo, Danqi Chen, Robin Jia, Jonathan Berant
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
21–30
URL:
https://aclanthology.org/W18-2603
DOI:
10.18653/v1/W18-2603
Cite (ACL):
Seunghak Yu, Sathish Reddy Indurthi, Seohyun Back, and Haejun Lee. 2018. A Multi-Stage Memory Augmented Neural Network for Machine Reading Comprehension. In Proceedings of the Workshop on Machine Reading for Question Answering, pages 21–30, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
A Multi-Stage Memory Augmented Neural Network for Machine Reading Comprehension (Yu et al., ACL 2018)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/W18-2603.pdf
Data
QUASAR-T, SQuAD