2022
PCBERT: Parent and Child BERT for Chinese Few-shot NER
Peichao Lai | Feiyang Ye | Lin Zhang | Zhiwei Chen | Yanggeng Fu | Yingjie Wu | Yilei Wang
Proceedings of the 29th International Conference on Computational Linguistics
Achieving good performance on few-shot or zero-shot datasets has been a long-standing challenge for NER. Conventional semantic transfer approaches for NER degrade model performance when the semantic distributions differ substantially, especially in Chinese few-shot NER. Recently, prompt-tuning has been widely explored for low-resource tasks, but there is still no effective prompt-tuning approach for Chinese few-shot NER. In this work, we propose a prompt-based Parent and Child BERT (PCBERT) for Chinese few-shot NER: we first train an annotating model on high-resource datasets and then use it to discover additional implicit labels on low-resource datasets. We further design a label extension strategy to transfer labels from high-resource datasets. We evaluate our model on Weibo and three other sampled Chinese NER datasets, and the experimental results demonstrate our approach’s effectiveness in few-shot learning.
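The label extension idea described above can be illustrated with a toy sketch. This is not the paper’s implementation; the tag sets (`PARENT_LABELS`, `CHILD_LABELS`) and the helper `extend_labels` are hypothetical names chosen for illustration, assuming the strategy keeps the low-resource ("child") labels and adds the high-resource ("parent") labels that the child set lacks as extra candidates:

```python
# Toy illustration only (not PCBERT's code): extend a low-resource
# tag set with labels inherited from a high-resource tag set.

PARENT_LABELS = {"PER", "LOC", "ORG", "GPE", "TIME"}  # assumed parent tags
CHILD_LABELS = {"PER", "LOC", "ORG"}                  # assumed child tags

def extend_labels(parent, child):
    """Return the child's own labels plus the parent-only labels
    added as extended candidate labels (hedged sketch)."""
    extended = parent - child          # labels only the parent knows
    return sorted(child), sorted(extended)

base, extra = extend_labels(PARENT_LABELS, CHILD_LABELS)
print(base)   # ['LOC', 'ORG', 'PER']
print(extra)  # ['GPE', 'TIME']
```

Under this reading, the child model can annotate low-resource text with both its native labels and the inherited candidates, which is one plausible way "implicit labels" could be surfaced.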