2INER: Instructive and In-Context Learning on Few-Shot Named Entity Recognition
Jiasheng Zhang, Xikai Liu, Xinyi Lai, Yan Gao, Shusen Wang, Yao Hu, Yiqing Lin
Abstract
Prompt-based learning has emerged as a powerful technique in natural language processing (NLP) due to its ability to leverage pre-training knowledge for downstream few-shot tasks. In this paper, we propose 2INER, a novel text-to-text framework for Few-Shot Named Entity Recognition (NER) tasks. Our approach employs instruction finetuning based on InstructionNER to enable the model to effectively comprehend and process task-specific instructions, including both main and auxiliary tasks. We also introduce a new auxiliary task, Type Extracting, to enhance the model's understanding of entity types in the overall semantic context of a sentence. To facilitate in-context learning, we concatenate examples to the input, enabling the model to learn from additional contextual information. Experimental results on four datasets demonstrate that our approach outperforms existing Few-Shot NER methods and remains competitive with state-of-the-art standard NER algorithms.
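To make the input construction concrete, here is a minimal Python sketch of how a text-to-text NER input with concatenated in-context examples might be assembled. The exact prompt templates are defined in the paper; the function name, field labels, and demonstration data below are illustrative assumptions, not the authors' verbatim format.

```python
# Minimal sketch: building an instruction-style, text-to-text NER input
# with in-context demonstrations prepended to the target sentence.
# All format strings here are assumed for illustration, not the paper's
# actual templates.

def build_ner_prompt(sentence, entity_types, demonstrations):
    """Concatenate demonstration examples, the target sentence, and an
    instruction listing the allowed entity types into one input string."""
    parts = []
    for demo_sentence, demo_answer in demonstrations:
        parts.append(f"Sentence: {demo_sentence}")
        parts.append(f"Answer: {demo_answer}")
    parts.append(f"Sentence: {sentence}")
    parts.append(
        "Instruction: extract the entities and their types from the "
        f"sentence above; each type must be one of {entity_types}."
    )
    return "\n".join(parts)

# Hypothetical usage with one demonstration example.
demos = [
    ("Barack Obama visited Berlin.",
     "Barack Obama is a person, Berlin is a location."),
]
prompt = build_ner_prompt(
    sentence="Apple opened a new office in Singapore.",
    entity_types=["person", "organization", "location"],
    demonstrations=demos,
)
print(prompt)
```

Under the same assumptions, the Type Extracting auxiliary task would swap in an instruction asking only for the set of entity types present in the sentence, reusing the same concatenated-example layout.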
- Anthology ID: 2023.findings-emnlp.259
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 3940–3951
- URL: https://aclanthology.org/2023.findings-emnlp.259
- DOI: 10.18653/v1/2023.findings-emnlp.259
- Cite (ACL): Jiasheng Zhang, Xikai Liu, Xinyi Lai, Yan Gao, Shusen Wang, Yao Hu, and Yiqing Lin. 2023. 2INER: Instructive and In-Context Learning on Few-Shot Named Entity Recognition. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 3940–3951, Singapore. Association for Computational Linguistics.
- Cite (Informal): 2INER: Instructive and In-Context Learning on Few-Shot Named Entity Recognition (Zhang et al., Findings 2023)
- PDF: https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.259.pdf