NAG-NER: a Unified Non-Autoregressive Generation Framework for Various NER Tasks

Xinpeng Zhang, Ming Tan, Jingfan Zhang, Wei Zhu


Abstract
Recently, recognizing flat, nested, and discontinuous entities within a single unified generative framework has received increasing attention in both research and industry. However, current generative NER methods force entities to be generated in a predefined order, which leads to error propagation and inefficient decoding. In this work, we propose a unified non-autoregressive generation (NAG) framework for general NER tasks, referred to as NAG-NER. First, we propose to generate entities as a set instead of a sequence, avoiding error propagation. Second, we propose incorporating NAG into NER for efficient decoding by treating each entity as a target sequence. Third, to enhance the generation performance of the NAG decoder, we employ the NAG encoder to detect potential entity mentions. Extensive experiments show that our NAG-NER model outperforms state-of-the-art generative NER models on three benchmark NER datasets of different types and on two of our proprietary NER tasks. (Code will be publicly available to the research community upon acceptance.)
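
To make the set-style, parallel decoding idea in the abstract concrete, here is a minimal PyTorch sketch of non-autoregressive entity decoding. It is not the authors' released code: it only illustrates the general pattern of decoding a fixed number of learned entity queries in one parallel pass, each emitting a (start, end, type) prediction. All module names, layer counts, and sizes below are illustrative assumptions.

# Illustrative sketch only -- NOT the NAG-NER implementation. It mimics the
# abstract's idea of generating entities as a set, in parallel, instead of
# as one left-to-right sequence. All hyperparameters are assumptions.
import torch
import torch.nn as nn

class NonAutoregressiveEntityDecoder(nn.Module):
    def __init__(self, hidden=768, num_queries=16, num_types=5):
        super().__init__()
        # One learned query vector per candidate entity; all queries
        # are decoded together, so there is no generation order.
        self.queries = nn.Parameter(torch.randn(num_queries, hidden))
        layer = nn.TransformerDecoderLayer(d_model=hidden, nhead=8,
                                           batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=2)
        self.start_head = nn.Linear(hidden, hidden)        # scores span starts
        self.end_head = nn.Linear(hidden, hidden)          # scores span ends
        self.type_head = nn.Linear(hidden, num_types + 1)  # +1 = "no entity"

    def forward(self, enc):
        # enc: (batch, seq_len, hidden) hidden states from the NAG encoder.
        b = enc.size(0)
        q = self.queries.unsqueeze(0).expand(b, -1, -1)
        # No causal mask: every query is decoded in a single parallel step,
        # attending to the encoder states (and to the other queries).
        h = self.decoder(q, enc)
        # Each query predicts pointer distributions over input tokens plus
        # an entity type, i.e. one candidate (start, end, type) triple.
        start_logits = self.start_head(h) @ enc.transpose(1, 2)
        end_logits = self.end_head(h) @ enc.transpose(1, 2)
        type_logits = self.type_head(h)
        return start_logits, end_logits, type_logits

In set-prediction setups of this kind, training typically pairs the fixed pool of predictions with the gold entities via bipartite matching, so no ordering over entities is ever imposed; whether NAG-NER uses exactly this loss is not stated in the abstract.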
Anthology ID:
2023.acl-industry.65
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track)
Month:
July
Year:
2023
Address:
Toronto, Canada
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
676–686
URL:
https://aclanthology.org/2023.acl-industry.65
Cite (ACL):
Xinpeng Zhang, Ming Tan, Jingfan Zhang, and Wei Zhu. 2023. NAG-NER: a Unified Non-Autoregressive Generation Framework for Various NER Tasks. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track), pages 676–686, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
NAG-NER: a Unified Non-Autoregressive Generation Framework for Various NER Tasks (Zhang et al., ACL 2023)
PDF:
https://preview.aclanthology.org/starsem-semeval-split/2023.acl-industry.65.pdf