Structure and Label Constrained Data Augmentation for Cross-domain Few-shot NER

Jingyi Zhang, Ying Zhang, Yufeng Chen, Jinan Xu


Abstract
Cross-domain few-shot named entity recognition (NER) is a challenging task that aims to recognize entities in target domains with limited labeled data by leveraging relevant knowledge from source domains. However, domain gaps limit the effect of knowledge transfer and harm the performance of NER models. In this paper, we analyze these domain gaps from two new perspectives, i.e., entity annotations and entity structures, and model them with word-to-tag and word-to-word relations, respectively. Moreover, we propose a novel method called Structure and Label Constrained Data Augmentation (SLC-DA) for cross-domain few-shot NER, which introduces a label-constrained pre-training task and a structure-constrained optimization objective into the data augmentation process to generate domain-specific augmented data, helping NER models transition smoothly from source to target domains. We evaluate our approach on several standard datasets and achieve state-of-the-art or competitive results, demonstrating the effectiveness of our method in cross-domain few-shot NER.
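The abstract refers to word-to-tag and word-to-word relations as the two views used to model domain gaps. The following is a minimal, illustrative sketch (not the paper's code, and all names are hypothetical) of one way such relations could be extracted from a BIO-labelled sentence: word-to-tag pairs capture entity annotations, while word-to-word pairs link tokens that belong to the same entity span, capturing entity structure.

```python
# Illustrative sketch only; not taken from the paper. It assumes BIO tags
# and simply enumerates the two relation types mentioned in the abstract.
from itertools import combinations


def build_relations(tokens, tags):
    """Return (word_to_tag, word_to_word) relations for a BIO-labelled sentence."""
    # Word-to-tag relations: each token paired with its label (entity annotations).
    word_to_tag = list(zip(tokens, tags))

    # Word-to-word relations: token pairs inside the same entity span (entity structures).
    word_to_word = []
    span = []
    for tok, tag in list(zip(tokens, tags)) + [(None, "O")]:  # sentinel flushes the last span
        if tag.startswith("B-"):
            if len(span) > 1:
                word_to_word.extend(combinations(span, 2))
            span = [tok]
        elif tag.startswith("I-") and span:
            span.append(tok)
        else:
            if len(span) > 1:
                word_to_word.extend(combinations(span, 2))
            span = []
    return word_to_tag, word_to_word


tokens = ["Barack", "Obama", "visited", "Paris"]
tags = ["B-PER", "I-PER", "O", "B-LOC"]
print(build_relations(tokens, tags))
# word_to_tag: [('Barack', 'B-PER'), ('Obama', 'I-PER'), ('visited', 'O'), ('Paris', 'B-LOC')]
# word_to_word: [('Barack', 'Obama')]
```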
Anthology ID:
2023.findings-emnlp.37
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
518–530
URL:
https://aclanthology.org/2023.findings-emnlp.37
DOI:
10.18653/v1/2023.findings-emnlp.37
Cite (ACL):
Jingyi Zhang, Ying Zhang, Yufeng Chen, and Jinan Xu. 2023. Structure and Label Constrained Data Augmentation for Cross-domain Few-shot NER. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 518–530, Singapore. Association for Computational Linguistics.
Cite (Informal):
Structure and Label Constrained Data Augmentation for Cross-domain Few-shot NER (Zhang et al., Findings 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.37.pdf