Entity Tracking Improves Cloze-style Reading Comprehension

Luong Hoang, Sam Wiseman, Alexander Rush


Abstract
Recent work has improved on modeling for reading comprehension tasks with simple approaches such as the Attention Sum-Reader; however, automatic systems still significantly trail human performance. Analysis suggests that many of the remaining hard instances are related to the inability to track entity references throughout documents. This work focuses on these hard entity tracking cases with two extensions: (1) additional entity features, and (2) training with a multi-task tracking objective. We show that these simple modifications improve performance both independently and in combination, and we outperform the previous state of the art on the LAMBADA dataset by 8 points, particularly on difficult entity examples. We also effectively match the performance of more complicated models on the named entity portion of the CBT dataset.
Anthology ID:
D18-1130
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1049–1055
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/D18-1130/
DOI:
10.18653/v1/D18-1130
Cite (ACL):
Luong Hoang, Sam Wiseman, and Alexander Rush. 2018. Entity Tracking Improves Cloze-style Reading Comprehension. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1049–1055, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Entity Tracking Improves Cloze-style Reading Comprehension (Hoang et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/D18-1130.pdf
Attachment:
D18-1130.Attachment.pdf
Code
harvardnlp/readcomp
Data
CBT
LAMBADA