Hanxiao Liu
2017
RACE: Large-scale ReAding Comprehension Dataset From Examinations
Guokun Lai | Qizhe Xie | Hanxiao Liu | Yiming Yang | Eduard Hovy
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
We present RACE, a new dataset for benchmark evaluation of methods for the reading comprehension task. Collected from English exams for Chinese middle and high school students aged 12 to 18, RACE consists of nearly 28,000 passages and nearly 100,000 questions generated by human experts (English instructors), and covers a variety of topics carefully designed to evaluate the students' ability in understanding and reasoning. In particular, the proportion of questions that require reasoning is much larger in RACE than in other benchmark datasets for reading comprehension, and there is a significant gap between the performance of state-of-the-art models (43%) and ceiling human performance (95%). We hope this new dataset can serve as a valuable resource for research and evaluation in machine comprehension. The dataset is freely available at http://www.cs.cmu.edu/~glai1/data/race/ and the code is available at https://github.com/qizhex/RACE_AR_baselines.
Gated-Attention Readers for Text Comprehension
Bhuwan Dhingra | Hanxiao Liu | Zhilin Yang | William Cohen | Ruslan Salakhutdinov
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
In this paper we study the problem of answering cloze-style questions over documents. Our model, the Gated-Attention (GA) Reader, integrates a multi-hop architecture with a novel attention mechanism based on multiplicative interactions between the query embedding and the intermediate states of a recurrent neural network document reader. This enables the reader to build query-specific representations of tokens in the document for accurate answer selection. The GA Reader obtains state-of-the-art results on three benchmarks for this task: the CNN & Daily Mail news stories and the Who Did What dataset. The effectiveness of the multiplicative interaction is demonstrated by an ablation study, and by comparing it to alternative compositional operators for implementing the gated attention.
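As a rough illustration of the multiplicative (gated) interaction described in the abstract, the sketch below implements a single attention hop in NumPy: each document token state is element-wise gated by an attention-weighted summary of the query states. The function name, dimensions, and random inputs are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_attention_hop(D, Q):
    """One hypothetical gated-attention hop.

    D: document token states, shape (T, d)
    Q: query token states, shape (L, d)
    Returns query-gated document states, shape (T, d).
    """
    scores = D @ Q.T                 # (T, L) token-to-query compatibility
    alpha = softmax(scores, axis=1)  # attention over query positions per token
    q_tilde = alpha @ Q              # (T, d) query summary for each document token
    return D * q_tilde               # multiplicative (element-wise) gating

# Toy usage with random states (dimensions are arbitrary for illustration).
rng = np.random.default_rng(0)
D = rng.standard_normal((5, 8))      # 5 document tokens, hidden size 8
Q = rng.standard_normal((3, 8))      # 3 query tokens, hidden size 8
X = gated_attention_hop(D, Q)
print(X.shape)                       # (5, 8)
```

In the multi-hop setting described above, such a gated output would be fed into the next recurrent layer of the document reader and the interaction repeated, but that wiring is omitted here for brevity.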