2021
Learning to Generate Questions by Learning to Recover Answer-containing Sentences
Seohyun Back | Akhil Kedia | Sai Chetan Chinthakindi | Haejun Lee | Jaegul Choo
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
2018
Cut to the Chase: A Context Zoom-in Network for Reading Comprehension
Sathish Reddy Indurthi | Seunghak Yu | Seohyun Back | Heriberto Cuayáhuitl
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
In recent years, many deep neural networks have been proposed to solve Reading Comprehension (RC) tasks. Most of these models struggle to reason over long documents and do not trivially generalize to cases where the answer is not present as a span in the given document. We present a novel neural architecture that extracts relevant regions based on a given question-document pair and generates a well-formed answer. To show the effectiveness of our architecture, we conducted several experiments on the recently proposed and challenging RC dataset NarrativeQA. The proposed architecture outperforms the previous state of the art by a 12.62% relative improvement in ROUGE-L.
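
The two-stage idea summarized in this abstract (select relevant regions of a long document, then generate an answer rather than extract a span) can be sketched roughly as follows. This is a hypothetical PyTorch illustration of the general "zoom in, then decode" pattern, not the paper's actual model; every module name, dimension, and design choice here is an assumption.

# Illustrative sketch only: score document chunks against the question,
# keep the top-scoring chunks, and decode an answer conditioned on them.
import torch
import torch.nn as nn

class ContextZoomSketch(nn.Module):
    def __init__(self, vocab_size=10000, dim=128, top_k=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.q_enc = nn.GRU(dim, dim, batch_first=True)
        self.c_enc = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)
        self.top_k = top_k

    def forward(self, question, chunks, answer_in):
        # question: (B, Lq); chunks: (B, N, Lc); answer_in: (B, La)
        B, N, Lc = chunks.shape
        _, q = self.q_enc(self.embed(question))                  # (1, B, D)
        _, c = self.c_enc(self.embed(chunks.reshape(B * N, Lc)))
        c = c.view(B, N, -1)                                     # (B, N, D)
        # question-aware relevance score per chunk
        scores = torch.bmm(c, q.permute(1, 2, 0)).squeeze(-1)   # (B, N)
        idx = scores.topk(self.top_k, dim=1).indices            # zoom in on top-k
        zoomed = c.gather(1, idx.unsqueeze(-1).expand(-1, -1, c.size(-1)))
        # condition the answer decoder on the selected chunk states
        h0 = zoomed.mean(dim=1, keepdim=True).permute(1, 0, 2).contiguous()
        dec, _ = self.decoder(self.embed(answer_in), h0)
        return self.out(dec)                                     # (B, La, vocab)

Because generation is conditioned only on the selected regions, the decoder is free to produce a well-formed answer even when no exact answer span exists in the document.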
MemoReader: Large-Scale Reading Comprehension through Neural Memory Controller
Seohyun Back | Seunghak Yu | Sathish Reddy Indurthi | Jihie Kim | Jaegul Choo
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Machine reading comprehension helps machines learn to utilize most of the human knowledge written in the form of text. Existing approaches have made significant progress, approaching human-level performance, but they remain limited to understanding at most a few paragraphs and fail to properly comprehend lengthy documents. In this paper, we propose a novel deep neural network architecture to handle long-range dependencies in RC tasks. In detail, our method has two novel aspects: (1) an advanced memory-augmented architecture and (2) an expanded gated recurrent unit with dense connections that mitigates potential information distortion occurring in the memory. Our proposed architecture is widely applicable to other models. We have performed extensive experiments on well-known benchmark datasets such as TriviaQA, QUASAR-T, and SQuAD. The experimental results demonstrate that the proposed method outperforms existing methods, especially on lengthy documents.
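
A minimal sketch of a dense-connected recurrent stack in the spirit of the "expanded gated recurrent unit with dense connections" named above: each layer receives the concatenation of all earlier layers' outputs, so low-level features reach deeper layers without being distorted by every intermediate transformation. This is an illustrative reconstruction under that assumption, not the paper's exact formulation.

# Illustrative sketch only: DenseNet-style feature reuse across stacked GRUs.
import torch
import torch.nn as nn

class DenseGRUStack(nn.Module):
    def __init__(self, input_dim=128, hidden_dim=128, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        dim = input_dim
        for _ in range(num_layers):
            self.layers.append(nn.GRU(dim, hidden_dim, batch_first=True))
            dim += hidden_dim  # the next layer sees all previous features

    def forward(self, x):
        # x: (batch, seq_len, input_dim)
        features = [x]
        for gru in self.layers:
            out, _ = gru(torch.cat(features, dim=-1))
            features.append(out)
        return torch.cat(features, dim=-1)  # densely connected representation

# Usage: y = DenseGRUStack()(torch.randn(2, 50, 128))  # y: (2, 50, 512)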
A Multi-Stage Memory Augmented Neural Network for Machine Reading Comprehension
Seunghak Yu | Sathish Reddy Indurthi | Seohyun Back | Haejun Lee
Proceedings of the Workshop on Machine Reading for Question Answering
Reading Comprehension (RC) of text is one of the fundamental tasks in natural language processing. In recent years, several end-to-end neural network models have been proposed to solve RC tasks. However, most of these models struggle to reason over long documents. In this work, we propose a novel Memory Augmented Machine Comprehension Network (MAMCN) to address the long-range dependencies present in machine reading comprehension. We perform extensive experiments to evaluate the proposed method on well-known benchmark datasets such as SQuAD, QUASAR-T, and TriviaQA. We achieve state-of-the-art performance on both the document-level (QUASAR-T, TriviaQA) and paragraph-level (SQuAD) datasets compared to all previously published approaches.
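
To make the memory-augmentation idea behind this line of work concrete, here is a hypothetical sketch of a controller RNN that reads from and softly writes to a fixed-size external memory via attention at every step, so evidence from far-apart positions in a long document can be retrieved later. This is an assumption about the general mechanism, not MAMCN's published equations; all names and sizes are illustrative.

# Illustrative sketch only: attention-based read/write over an external memory.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryAugmentedReader(nn.Module):
    def __init__(self, dim=128, mem_slots=32):
        super().__init__()
        self.cell = nn.GRUCell(2 * dim, dim)   # input token + last read vector
        self.memory = nn.Parameter(torch.randn(mem_slots, dim) * 0.1)
        self.write = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, seq_len, dim) token representations of a long document
        B, L, D = x.shape
        h = x.new_zeros(B, D)
        read = x.new_zeros(B, D)
        mem = self.memory.unsqueeze(0).expand(B, -1, -1).contiguous()
        states = []
        for t in range(L):
            h = self.cell(torch.cat([x[:, t], read], dim=-1), h)
            # content-based addressing over memory slots
            attn = F.softmax(torch.bmm(mem, h.unsqueeze(-1)).squeeze(-1), dim=-1)
            read = torch.bmm(attn.unsqueeze(1), mem).squeeze(1)
            # soft write: blend the current state into the attended slots
            mem = mem + attn.unsqueeze(-1) * self.write(h).unsqueeze(1)
            states.append(h)
        return torch.stack(states, dim=1)  # (batch, seq_len, dim)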