Class-Incremental Learning based on Label Generation

Yijia Shao, Yiduo Guo, Dongyan Zhao, Bing Liu


Abstract
Despite the great success of pre-trained language models, using them for continual learning remains challenging, especially in the class-incremental learning (CIL) setting, due to catastrophic forgetting (CF). This paper reports our finding that if we formulate CIL as a continual label generation problem, CF is drastically reduced and the generalizable representations of pre-trained models are better retained. We thus propose a new CIL method (VAG) that also leverages the sparsity of the vocabulary to focus the generation and creates pseudo-replay samples using label semantics. Experimental results show that VAG outperforms baselines by a large margin.
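To make the label-generation formulation concrete, here is a minimal sketch (not the authors' released code) of casting CIL as text generation with an encoder-decoder model, assuming the Hugging Face transformers library. The helper `train_step`, the variable `label_token_ids`, and the masking scheme are illustrative stand-ins for the paper's vocabulary-sparsity mechanism, which may differ in detail.

```python
# A minimal sketch of CIL as label generation, assuming a BART-style
# encoder-decoder from Hugging Face `transformers`. `train_step` and
# `label_token_ids` are illustrative, not the paper's actual code.
import torch
import torch.nn.functional as F
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

def train_step(text, label_text, label_token_ids):
    """One step: the model learns to *generate* the label string for
    `text`, instead of predicting a class index with a fixed head."""
    inputs = tokenizer(text, return_tensors="pt")
    labels = tokenizer(label_text, return_tensors="pt").input_ids

    # BART shifts `labels` internally to build decoder inputs, so the
    # output logits align with `labels` position by position.
    logits = model(**inputs, labels=labels).logits  # (1, tgt_len, vocab)

    # Illustrative vocabulary masking: restrict prediction to tokens that
    # occur in the labels seen so far (special tokens included in
    # `label_token_ids`), exploiting the sparsity of the label vocabulary
    # relative to the model's full vocabulary.
    mask = torch.full((logits.size(-1),), float("-inf"))
    mask[label_token_ids] = 0.0
    masked_logits = logits + mask

    loss = F.cross_entropy(masked_logits.view(-1, masked_logits.size(-1)),
                           labels.view(-1))
    return loss

# At test time, the generated string is decoded and matched to the known
# class labels, e.g. by exact match or embedding similarity.
```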
Anthology ID:
2023.acl-short.109
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1263–1276
URL:
https://aclanthology.org/2023.acl-short.109
DOI:
10.18653/v1/2023.acl-short.109
Cite (ACL):
Yijia Shao, Yiduo Guo, Dongyan Zhao, and Bing Liu. 2023. Class-Incremental Learning based on Label Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1263–1276, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Class-Incremental Learning based on Label Generation (Shao et al., ACL 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.acl-short.109.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2023.acl-short.109.mp4