Xue Bai
2024
Improving Continual Few-shot Relation Extraction through Relational Knowledge Distillation and Prototype Augmentation
Zhiheng Zhang | Daojian Zeng | Xue Bai
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
In this paper, we focus on the challenging yet practical problem of Continual Few-shot Relation Extraction (CFRE), which requires extracting relations as new data arrives continually, with only a few labeled examples per relation. The main challenges in CFRE are overfitting caused by few-shot learning and catastrophic forgetting caused by continual learning. To address these problems, we propose a novel framework called RK2DA, which seamlessly integrates prototype-based data augmentation and relational knowledge distillation. Specifically, RK2DA generates pseudo data by introducing Gaussian noise into the prototype embeddings and employs a novel two-phase multi-teacher relational knowledge distillation method to transfer diverse knowledge from different embedding spaces. Experimental results on the FewRel and TACRED datasets demonstrate that our method outperforms state-of-the-art baselines.
2009
Normalized Accessor Variety Combined with Conditional Random Fields in Chinese Word Segmentation
Saike He | Taozheng Zhang | Xue Bai | Xiaojie Wang | Yuan Dong
Proceedings of the Student Research Workshop
Multi-Task Learning in Conditional Random Fields for Chunking in Shallow Semantic Parsing
Saike He | Xiaojie Wang | Yuan Dong | Taozheng Zhang | Xue Bai
Proceedings of the 23rd Pacific Asia Conference on Language, Information and Computation, Volume 1
Co-authors
- Saike He 2
- Taozheng Zhang 2
- Xiaojie Wang 2
- Yuan Dong 2
- Zhiheng Zhang 1