Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering

Wei Wang, Ming Yan, Chen Wu


Abstract
This paper describes a novel hierarchical attention network for reading-comprehension-style question answering, which aims to answer questions about a given narrative paragraph. In the proposed method, attention and fusion are conducted horizontally and vertically across layers, at different levels of granularity, between the question and the paragraph. Specifically, the model first encodes the question and paragraph with fine-grained language embeddings to better capture their respective representations at the semantic level. It then applies a multi-granularity fusion approach to fully fuse information from both global and attended representations. Finally, it introduces a hierarchical attention network that focuses on the answer span progressively through multi-level soft alignment. Extensive experiments on the large-scale SQuAD and TriviaQA datasets validate the effectiveness of the proposed method. At the time of writing, our model achieves state-of-the-art results on both the SQuAD and TriviaQA Wiki leaderboards, as well as on two adversarial SQuAD datasets.
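As a concrete illustration of the attention-and-fusion idea sketched in the abstract, below is a minimal PyTorch sketch of one such layer. This is not the authors' released code; the module and parameter names (AttentionFusion, proj, fuse, gate) are assumptions made here for illustration. It soft-aligns each paragraph token against the question and then fuses the global (original) and attended representations with a gated combination.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentionFusion(nn.Module):
        """Illustrative sketch (not the paper's exact model): soft-align a
        paragraph against a question, then fuse the original and attended
        representations with a gated combination."""

        def __init__(self, hidden_size: int):
            super().__init__()
            # Projections for the similarity scores and the fusion step;
            # these layer names are assumptions, not from the paper.
            self.proj = nn.Linear(hidden_size, hidden_size, bias=False)
            self.fuse = nn.Linear(4 * hidden_size, hidden_size)
            self.gate = nn.Linear(4 * hidden_size, hidden_size)

        def forward(self, paragraph: torch.Tensor, question: torch.Tensor):
            # paragraph: (batch, n, d); question: (batch, m, d)
            scores = torch.bmm(self.proj(paragraph),
                               question.transpose(1, 2))   # (batch, n, m)
            attn = F.softmax(scores, dim=-1)
            # Question summary aligned to each paragraph token.
            aligned = torch.bmm(attn, question)             # (batch, n, d)
            # Fuse the global (paragraph) and attended (aligned) views.
            combined = torch.cat(
                [paragraph, aligned, paragraph * aligned, paragraph - aligned],
                dim=-1)
            fused = torch.tanh(self.fuse(combined))
            gate = torch.sigmoid(self.gate(combined))
            # Gated residual fusion: the gate decides how much of the fused
            # view to keep versus the original paragraph representation.
            return gate * fused + (1 - gate) * paragraph

    # Usage: fuse a 300-token paragraph with a 20-token question.
    layer = AttentionFusion(hidden_size=128)
    p = torch.randn(2, 300, 128)
    q = torch.randn(2, 20, 128)
    out = layer(p, q)  # shape: (2, 300, 128)

Stacking several such layers, each attending at a different granularity, gives the hierarchical, progressively focused alignment the abstract describes.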
Anthology ID:
P18-1158
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1705–1714
URL:
https://aclanthology.org/P18-1158
DOI:
10.18653/v1/P18-1158
Cite (ACL):
Wei Wang, Ming Yan, and Chen Wu. 2018. Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1705–1714, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering (Wang et al., ACL 2018)
PDF:
https://preview.aclanthology.org/ingestion-script-update/P18-1158.pdf
Poster:
 P18-1158.Poster.pdf
Code
 alibaba/AliceMind
Data
CBT, SQuAD, TriviaQA