Learning from Language Description: Low-shot Named Entity Recognition via Decomposed Framework

Yaqing Wang, Haoda Chu, Chao Zhang, Jing Gao

Abstract
In this work, we study the problem of named entity recognition (NER) in a low-resource scenario, focusing on the few-shot and zero-shot settings. Building on large-scale pre-trained language models, we propose a novel NER framework, SpanNER, which learns from natural language supervision and can identify never-seen entity classes without using in-domain labeled data. We perform extensive experiments on five benchmark datasets and evaluate the proposed method in the few-shot learning, domain transfer, and zero-shot learning settings. The experimental results show that the proposed method yields average improvements of 10%, 23%, and 26% over the best baselines in the few-shot learning, domain transfer, and zero-shot learning settings, respectively.
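The decomposed idea sketched in the abstract, detecting candidate spans first and then matching each span against natural-language descriptions of the entity classes, can be illustrated with a minimal sketch of the second step. Everything below (the bert-base-uncased encoder, mean pooling, cosine-similarity matching, and the example class descriptions) is an illustrative assumption, not the paper's exact architecture; span detection is assumed to happen in a separate, earlier step.

import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical choices: any pre-trained LM and pooling scheme could stand in here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def embed(text: str) -> torch.Tensor:
    """Mean-pool the final hidden states into a single vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0)

# Natural-language class descriptions serve as the supervision, so a
# never-seen class needs only a description, not in-domain labeled data.
class_descriptions = {
    "person": "the name of an individual human being",
    "location": "a geographic place, such as a city or country",
    "organization": "a company, institution, or other organized group",
}
class_vecs = {c: embed(d) for c, d in class_descriptions.items()}

def classify_span(span_text: str) -> str:
    """Assign a detected span to the most similar class description."""
    span_vec = embed(span_text)
    scores = {c: torch.cosine_similarity(span_vec, v, dim=0).item()
              for c, v in class_vecs.items()}
    return max(scores, key=scores.get)

print(classify_span("Punta Cana"))  # expected: "location"

Because class descriptions, rather than fixed label indices, anchor the classification, covering a new domain's entity classes only requires writing new descriptions, which is what makes the zero-shot and domain-transfer settings described in the abstract possible.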
Anthology ID:
2021.findings-emnlp.139
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1618–1630
URL:
https://aclanthology.org/2021.findings-emnlp.139
DOI:
10.18653/v1/2021.findings-emnlp.139
Cite (ACL):
Yaqing Wang, Haoda Chu, Chao Zhang, and Jing Gao. 2021. Learning from Language Description: Low-shot Named Entity Recognition via Decomposed Framework. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1618–1630, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Learning from Language Description: Low-shot Named Entity Recognition via Decomposed Framework (Wang et al., Findings 2021)
PDF:
https://preview.aclanthology.org/naacl24-info/2021.findings-emnlp.139.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2021.findings-emnlp.139.mp4
Data:
CoNLL 2003