Speed Reading: Learning to Read ForBackward via Shuttle

Tsu-Jui Fu, Wei-Yun Ma


Abstract
We present LSTM-Shuttle, which applies human speed-reading techniques to natural language processing tasks for accurate and efficient comprehension. In contrast to previous work, LSTM-Shuttle not only shuttles forward through the text but can also move backward. Shuttling forward yields high efficiency, while moving backward gives the model a chance to recover lost information, enabling better prediction. We evaluate LSTM-Shuttle on sentiment analysis, news classification, and cloze tasks using the IMDB, Rotten Tomatoes, AG, and Children's Book Test datasets, and show that it predicts both more accurately and more quickly. To demonstrate how LSTM-Shuttle actually behaves, we also analyze the shuttling operation and present a case study.
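The abstract describes the mechanism only at a high level. As a rough illustration, the sketch below shows one way a shuttle-style reader could be wired up in PyTorch (the language of the linked repository): an LSTM consumes a token, a small policy head picks a signed "shuttle" step, and a positive step skips ahead while a negative step jumps back. This is not the authors' implementation; the class name ShuttleReader, the hyperparameters, and the policy head are assumptions made for illustration (see tsujuifu/pytorch_lstm-shuttle for the real code).

```python
# Minimal sketch of a forward/backward "shuttle" reader, assuming a
# signed-step policy head. Names and sizes are illustrative, not the paper's.
import torch
import torch.nn as nn

class ShuttleReader(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, shuttle_range=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.cell = nn.LSTMCell(embed_dim, hidden_dim)
        # Policy over steps in [-shuttle_range, ..., +shuttle_range]:
        # positive = shuttle forward (efficiency),
        # negative = go back (recover lost information).
        self.policy = nn.Linear(hidden_dim, 2 * shuttle_range + 1)
        self.shuttle_range = shuttle_range

    def forward(self, tokens, max_reads=20):
        h = torch.zeros(1, self.cell.hidden_size)
        c = torch.zeros(1, self.cell.hidden_size)
        pos, reads = 0, 0
        while 0 <= pos < tokens.size(0) and reads < max_reads:
            # Read the token at the current position.
            h, c = self.cell(self.embed(tokens[pos]).unsqueeze(0), (h, c))
            # Sample a shuttle step; at test time one could take the argmax.
            probs = torch.softmax(self.policy(h), dim=-1)
            step = torch.multinomial(probs, 1).item() - self.shuttle_range
            # Always advance at least one token unless jumping backward.
            pos += step if step < 0 else max(step, 1)
            reads += 1
        return h  # final state feeds a task-specific classifier

reader = ShuttleReader(vocab_size=10000)
tokens = torch.randint(0, 10000, (50,))
summary = reader(tokens)  # (1, 256) text summary for a downstream task
```

Because the shuttle decisions are discrete and non-differentiable, a model like this would typically be trained with a policy-gradient method such as REINFORCE alongside the standard task loss.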
Anthology ID:
D18-1474
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4439–4448
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/D18-1474/
DOI:
10.18653/v1/D18-1474
Cite (ACL):
Tsu-Jui Fu and Wei-Yun Ma. 2018. Speed Reading: Learning to Read ForBackward via Shuttle. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4439–4448, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Speed Reading: Learning to Read ForBackward via Shuttle (Fu & Ma, EMNLP 2018)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/D18-1474.pdf
Code:
tsujuifu/pytorch_lstm-shuttle
Data:
CBT (Children's Book Test), IMDb Movie Reviews