Attention-over-Attention Neural Networks for Reading Comprehension
Yiming Cui, Zhipeng Chen, Si Wei, Shijin Wang, Ting Liu, Guoping Hu
Abstract
Cloze-style reading comprehension is a representative problem of mining the relationship between a document and a query. In this paper, we present a simple but novel model called the attention-over-attention reader for better solving the cloze-style reading comprehension task. The proposed model places another attention mechanism over the document-level attention and induces "attended attention" for final answer predictions. One advantage of our model is that it is simpler than related works while giving excellent performance. In addition to the primary model, we also propose an N-best re-ranking strategy to double-check the validity of the candidates and further improve the performance. Experimental results show that the proposed methods significantly outperform various state-of-the-art systems by a large margin on public datasets, such as CNN and Children's Book Test.
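As a concrete illustration of the mechanism the abstract describes, below is a minimal NumPy sketch of attention-over-attention: a pairwise matching matrix between document and query representations, a column-wise softmax giving document-level attention per query word, a row-wise softmax averaged into a query-level weight vector, and their product yielding the "attended attention" over document positions. The function names and the `pointer_sum` helper (following the sum-attention scoring of Kadlec et al., 2016, which this family of readers builds on) are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(h_doc, h_query):
    """Attended document-level attention (AoA).

    h_doc:   (doc_len, hidden)   contextual embeddings of document words
    h_query: (query_len, hidden) contextual embeddings of query words
    Returns: (doc_len,) attention distribution over document positions.
    """
    # Pairwise matching score M[i, j] = h_doc[i] . h_query[j]
    M = h_doc @ h_query.T                # (doc_len, query_len)
    # Column-wise softmax: document-level attention for each query word
    alpha = softmax(M, axis=0)           # each column sums to 1
    # Row-wise softmax: query-level attention for each document word
    beta = softmax(M, axis=1)            # each row sums to 1
    # Average the query-level attentions over all document positions
    beta_avg = beta.mean(axis=0)         # (query_len,)
    # "Attention over attention": weight alpha's columns by beta_avg
    return alpha @ beta_avg              # (doc_len,)

def pointer_sum(s, doc_words, candidate):
    """Score a candidate answer by summing the attended attention at
    every document position where the candidate word occurs."""
    return sum(si for si, w in zip(s, doc_words) if w == candidate)
```

In this sketch the final answer is the candidate maximizing `pointer_sum`, so the query-level weights `beta_avg` decide how much each per-query-word document attention contributes, rather than using a heuristic such as attending only from the placeholder token.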
- Anthology ID: P17-1055
- Volume: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: July
- Year: 2017
- Address: Vancouver, Canada
- Editors: Regina Barzilay, Min-Yen Kan
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 593–602
- URL: https://aclanthology.org/P17-1055
- DOI: 10.18653/v1/P17-1055
- Cite (ACL): Yiming Cui, Zhipeng Chen, Si Wei, Shijin Wang, Ting Liu, and Guoping Hu. 2017. Attention-over-Attention Neural Networks for Reading Comprehension. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 593–602, Vancouver, Canada. Association for Computational Linguistics.
- Cite (Informal): Attention-over-Attention Neural Networks for Reading Comprehension (Cui et al., ACL 2017)
- PDF: https://aclanthology.org/P17-1055.pdf
- Code: additional community code
- Data: CBT, CNN/Daily Mail, Children's Book Test