Continual Learning Long Short Term Memory

Xin Guo, Yu Tian, Qinghan Xue, Panos Lampropoulos, Steven Eliuk, Kenneth Barner, Xiaolong Wang


Abstract
Catastrophic forgetting in neural networks refers to the degradation of a deep learning model's performance on previously learned tasks while it learns new tasks. To address this problem, we propose a novel Continual Learning Long Short Term Memory (CL-LSTM) cell for Recurrent Neural Networks (RNNs) in this paper. CL-LSTM considers not only the state of each individual task's output gates but also the correlation of the states between tasks, so that the model can incrementally learn new tasks without catastrophically forgetting previous ones. Experimental results demonstrate significant improvements of CL-LSTM over state-of-the-art approaches on spoken language understanding (SLU) tasks.
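
As an illustration of the high-level idea in the abstract, below is a minimal PyTorch sketch of an LSTM cell that keeps one output gate per task while sharing the remaining gates and the cell state. The class name, the gate wiring, and the omitted cross-task correlation term are assumptions made for this sketch; it does not reproduce the actual CL-LSTM equations from the paper.

```python
import torch
import torch.nn as nn


class MultiTaskOutputGateLSTMCell(nn.Module):
    """Illustrative LSTM cell with one output gate per task.

    This is a sketch of the general idea only, not the paper's CL-LSTM
    formulation: input/forget/candidate transforms are shared across tasks,
    while each task keeps its own output-gate transform.
    """

    def __init__(self, input_size: int, hidden_size: int, num_tasks: int):
        super().__init__()
        self.hidden_size = hidden_size
        # Shared input, forget, and candidate transforms (standard LSTM).
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        # One output-gate transform per task (hypothetical design choice).
        self.output_gates = nn.ModuleList(
            [nn.Linear(input_size + hidden_size, hidden_size) for _ in range(num_tasks)]
        )

    def forward(self, x, h, c, task_id: int):
        z = torch.cat([x, h], dim=-1)
        i, f, g = self.gates(z).chunk(3, dim=-1)
        i, f, g = torch.sigmoid(i), torch.sigmoid(f), torch.tanh(g)
        c_new = f * c + i * g
        # Task-specific output gate; the CL-LSTM additionally models the
        # correlation between tasks' gate states, which is omitted here.
        o = torch.sigmoid(self.output_gates[task_id](z))
        h_new = o * torch.tanh(c_new)
        return h_new, c_new
```

In an SLU setting such as ATIS slot filling, a cell like this would be unrolled over the input token sequence, selecting the output gate that corresponds to the task currently being learned.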
Anthology ID:
2020.findings-emnlp.164
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1817–1822
URL:
https://aclanthology.org/2020.findings-emnlp.164
DOI:
10.18653/v1/2020.findings-emnlp.164
Cite (ACL):
Xin Guo, Yu Tian, Qinghan Xue, Panos Lampropoulos, Steven Eliuk, Kenneth Barner, and Xiaolong Wang. 2020. Continual Learning Long Short Term Memory. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 1817–1822, Online. Association for Computational Linguistics.
Cite (Informal):
Continual Learning Long Short Term Memory (Guo et al., Findings 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2020.findings-emnlp.164.pdf
Optional supplementary material:
2020.findings-emnlp.164.OptionalSupplementaryMaterial.pdf
Video:
https://slideslive.com/38940170
Data
ATIS