LightNER: A Lightweight Tuning Paradigm for Low-resource NER via Pluggable Prompting
Xiang Chen | Lei Li | Shumin Deng | Chuanqi Tan | Changliang Xu | Fei Huang | Luo Si | Huajun Chen | Ningyu Zhang
Proceedings of the 29th International Conference on Computational Linguistics (COLING 2022)
Most NER methods rely on extensive labeled data for model training and struggle in low-resource scenarios with limited training data. Existing dominant approaches usually suffer from the challenge that the target domain has a different label set than a resource-rich source domain, which can be characterized as class transfer and domain transfer. In this paper, we propose a lightweight tuning paradigm for low-resource NER via pluggable prompting (LightNER). Specifically, we construct a unified learnable verbalizer of entity categories to generate the entity span sequence and entity categories without any label-specific classifiers, thus addressing the class transfer issue. We further propose a pluggable guidance module that incorporates learnable parameters into the self-attention layer as guidance, which can re-modulate the attention and adapt the pre-trained weights. Note that we tune only the inserted module while keeping all parameters of the pre-trained language model fixed, making our approach lightweight and flexible for low-resource scenarios and better able to transfer knowledge across domains. Experimental results show that LightNER obtains comparable performance in the standard supervised setting and outperforms strong baselines in low-resource settings.
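To make the "pluggable guidance" idea concrete, below is a minimal PyTorch sketch (not the authors' released code) of the general mechanism the abstract describes: learnable key/value vectors are concatenated into a frozen self-attention layer, so attention is re-modulated while the pre-trained weights stay fixed and only the inserted parameters are trained. All names here (`GuidedSelfAttention`, `n_prompts`, the 0.02 init scale) are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidedSelfAttention(nn.Module):
    """One self-attention layer with a pluggable guidance module:
    the projections stand in for frozen pre-trained weights, and the
    only trainable parameters are learnable key/value prompts."""

    def __init__(self, d_model: int = 768, n_heads: int = 12, n_prompts: int = 10):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Pre-trained projections: frozen, standing in for the PLM's weights.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.o_proj = nn.Linear(d_model, d_model)
        for proj in (self.q_proj, self.k_proj, self.v_proj, self.o_proj):
            proj.weight.requires_grad_(False)
            proj.bias.requires_grad_(False)
        # Pluggable guidance: learnable per-head key/value prompt vectors.
        self.prompt_k = nn.Parameter(torch.randn(n_heads, n_prompts, self.d_head) * 0.02)
        self.prompt_v = nn.Parameter(torch.randn(n_heads, n_prompts, self.d_head) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, _ = x.shape

        def split(t):  # (B, T, d_model) -> (B, n_heads, T, d_head)
            return t.view(B, T, self.n_heads, self.d_head).transpose(1, 2)

        q, k, v = split(self.q_proj(x)), split(self.k_proj(x)), split(self.v_proj(x))
        # Prepend guidance prompts to keys/values: attending over them
        # re-modulates the frozen layer's attention distribution.
        pk = self.prompt_k.unsqueeze(0).expand(B, -1, -1, -1)
        pv = self.prompt_v.unsqueeze(0).expand(B, -1, -1, -1)
        k, v = torch.cat([pk, k], dim=2), torch.cat([pv, v], dim=2)
        attn = F.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return self.o_proj(out)

# Only the inserted prompt parameters receive gradients:
layer = GuidedSelfAttention()
print([n for n, p in layer.named_parameters() if p.requires_grad])
# -> ['prompt_k', 'prompt_v']
```

Because the trainable state is confined to the prompt tensors, the module can be plugged into (or removed from) a pre-trained model without touching its weights, which is what makes this style of tuning lightweight and transferable across domains.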