Abstract
This shared task is a typical question answering task that tests how accurately participants' systems can answer exam questions. For each question, there are four candidate answers, of which only one is correct. Existing methods for such tasks usually employ a recurrent neural network (RNN) or long short-term memory (LSTM). However, both RNN and LSTM are biased models in which the words at the tail of a sentence dominate the words at the head. In this paper, we propose an attention-based LSTM (AT-LSTM) model for this task. By adding an attention mechanism to the standard LSTM, the model can more easily capture long-range contextual information.
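The paper itself is not reproduced on this page, so the following is only a minimal PyTorch sketch of the idea the abstract describes: an LSTM encodes a question concatenated with one candidate answer, an attention layer weights every hidden state (so early words can contribute as much as late ones), and the attended summary is mapped to a plausibility score; the candidate with the highest score is chosen. The class name, layer sizes, and input convention are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative AT-LSTM scorer for multi-choice QA.
# All names and hyper-parameters below are assumptions for the sketch,
# not the configuration used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ATLSTMScorer(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Attention: score each hidden state, then normalize with softmax.
        self.att_proj = nn.Linear(hidden_dim, hidden_dim)
        self.att_vec = nn.Linear(hidden_dim, 1, bias=False)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -- question + one candidate answer
        h, _ = self.lstm(self.embedding(token_ids))           # (batch, seq_len, hidden)
        # Attention weights over all time steps, countering the LSTM's
        # bias toward words near the end of the sequence.
        scores = self.att_vec(torch.tanh(self.att_proj(h)))   # (batch, seq_len, 1)
        alpha = F.softmax(scores, dim=1)
        context = (alpha * h).sum(dim=1)                      # (batch, hidden)
        return self.out(context).squeeze(-1)                  # plausibility score

# Usage: score the four candidates for one question, pick the best.
model = ATLSTMScorer(vocab_size=10000)
candidates = torch.randint(0, 10000, (4, 30))  # 4 question+answer sequences
best = model(candidates).argmax().item()
```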
- Anthology ID: I17-4035
- Volume: Proceedings of the IJCNLP 2017, Shared Tasks
- Month: December
- Year: 2017
- Address: Taipei, Taiwan
- Editors: Chao-Hong Liu, Preslav Nakov, Nianwen Xue
- Venue: IJCNLP
- Publisher: Asian Federation of Natural Language Processing
- Pages: 208–212
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/I17-4035/
- Cite (ACL): Hang Yuan, You Zhang, Jin Wang, and Xuejie Zhang. 2017. YNU-HPCC at IJCNLP-2017 Task 5: Multi-choice Question Answering in Exams Using an Attention-based LSTM Model. In Proceedings of the IJCNLP 2017, Shared Tasks, pages 208–212, Taipei, Taiwan. Asian Federation of Natural Language Processing.
- Cite (Informal): YNU-HPCC at IJCNLP-2017 Task 5: Multi-choice Question Answering in Exams Using an Attention-based LSTM Model (Yuan et al., IJCNLP 2017)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/I17-4035.pdf