Improving Continual Relation Extraction by Distinguishing Analogous Semantics

Wenzheng Zhao, Yuanning Cui, Wei Hu


Abstract
Continual relation extraction (RE) aims to continually learn newly emerging relations while avoiding forgetting those already learned. Existing works store a small number of typical samples and replay them to re-train the model, thereby alleviating forgetting. However, repeatedly replaying these samples can cause overfitting. We conduct an empirical study on existing works and observe that their performance is severely affected by analogous relations. To address this issue, we propose a novel continual extraction model for analogous relations. Specifically, we design memory-insensitive relation prototypes and memory augmentation to overcome overfitting. We also introduce integrated training and focal knowledge distillation to improve performance on analogous relations. Experimental results show the superiority of our model and demonstrate its effectiveness in distinguishing analogous relations and overcoming overfitting.
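The abstract names focal knowledge distillation as one component but does not reproduce the loss. The snippet below is a minimal, hypothetical sketch of how a focal-weighted distillation term can emphasize hard-to-distinguish (analogous) relations in a PyTorch setting; the `gamma` and `temperature` parameters and the use of the student's gold-label confidence as the focal weight are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch only: a focal-weighted knowledge-distillation loss.
import torch
import torch.nn.functional as F

def focal_distillation_loss(student_logits, teacher_logits, labels,
                            temperature=2.0, gamma=2.0):
    """KL distillation from a frozen previous-task teacher, where each sample
    is down-weighted when the student already classifies it confidently,
    so training focuses on hard (e.g., analogous) relations."""
    # Soft targets from the teacher (model trained on previous tasks).
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # Per-sample KL divergence between teacher and student distributions.
    kl = F.kl_div(student_log_probs, teacher_probs, reduction="none").sum(-1)
    # Focal weight: small when the student is already confident on the gold label.
    p_t = F.softmax(student_logits, dim=-1).gather(1, labels.unsqueeze(1)).squeeze(1)
    focal_weight = (1.0 - p_t) ** gamma
    return (focal_weight * kl).mean() * (temperature ** 2)
```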
Anthology ID:
2023.acl-long.65
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1162–1175
URL:
https://aclanthology.org/2023.acl-long.65
DOI:
10.18653/v1/2023.acl-long.65
Cite (ACL):
Wenzheng Zhao, Yuanning Cui, and Wei Hu. 2023. Improving Continual Relation Extraction by Distinguishing Analogous Semantics. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1162–1175, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Improving Continual Relation Extraction by Distinguishing Analogous Semantics (Zhao et al., ACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.65.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.65.mp4