Adaptive Threshold Selective Self-Attention for Chinese NER

Biao Hu, Zhen Huang, Minghao Hu, Ziwen Zhang, Yong Dou


Abstract
Recently, the Transformer has achieved great success in Chinese named entity recognition (NER) owing to its good parallelism and its ability to model long-range dependencies through self-attention over the context. However, the fully connected nature of self-attention may scatter the attention distribution and allow irrelevant character information to be integrated, causing entity boundaries to be misidentified. In this paper, we propose a data-driven Adaptive Threshold Selective Self-Attention (ATSSA) mechanism that dynamically selects the most relevant characters to enhance the Transformer architecture for Chinese NER. In ATSSA, an attention score threshold is automatically generated for each query; characters whose attention scores exceed the threshold are selected by the query and the rest are discarded, which prevents irrelevant attention from being integrated. Experiments on four benchmark Chinese NER datasets show that the proposed ATSSA brings an average F1 improvement of 1.68 points over the baseline model and achieves state-of-the-art performance.
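The selection rule described in the abstract can be sketched in code. The snippet below is a minimal PyTorch illustration, not the authors' implementation: it assumes the per-query threshold is produced by a learned linear projection of the query vector (the paper's actual threshold generator may differ), masks out scores at or below that threshold before the softmax, and always keeps at least the highest-scoring character so no query is left with an empty selection.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveThresholdSelfAttention(nn.Module):
    """Illustrative selective self-attention layer.

    The threshold generator (a linear map from each query vector to a
    scalar) is an assumption made for this sketch; only the selection
    rule -- keep characters whose score exceeds the query's threshold,
    drop the rest -- follows the description in the abstract.
    """

    def __init__(self, d_model: int, n_heads: int = 8):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        self.threshold = nn.Linear(self.d_head, 1)  # hypothetical per-query threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model), one vector per character
        B, T, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, T, self.n_heads, self.d_head).transpose(1, 2)

        # Standard scaled dot-product scores: (batch, heads, T, T).
        scores = torch.matmul(q, k.transpose(-2, -1)) / self.d_head ** 0.5

        # Data-driven threshold for every query position: (batch, heads, T, 1).
        tau = self.threshold(q)

        # Select characters whose score exceeds the query's threshold;
        # always keep the best-scoring character so the selection is never empty.
        best = scores.max(dim=-1, keepdim=True).values
        keep = (scores > tau) | (scores >= best)

        # Discarded characters receive zero weight after the softmax.
        attn = F.softmax(scores.masked_fill(~keep, float("-inf")), dim=-1)

        ctx = torch.matmul(attn, v)                   # (batch, heads, T, d_head)
        ctx = ctx.transpose(1, 2).reshape(B, T, -1)   # (batch, T, d_model)
        return self.out(ctx)

# Example: encode a batch of two 5-character sentences with model dimension 64.
layer = AdaptiveThresholdSelfAttention(d_model=64, n_heads=4)
out = layer(torch.randn(2, 5, 64))
print(out.shape)  # torch.Size([2, 5, 64])

Masking before the softmax means discarded characters get exactly zero attention weight, so the remaining probability mass is redistributed over the selected characters only.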
Anthology ID:
2022.coling-1.157
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1823–1833
URL:
https://aclanthology.org/2022.coling-1.157
Cite (ACL):
Biao Hu, Zhen Huang, Minghao Hu, Ziwen Zhang, and Yong Dou. 2022. Adaptive Threshold Selective Self-Attention for Chinese NER. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1823–1833, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Adaptive Threshold Selective Self-Attention for Chinese NER (Hu et al., COLING 2022)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2022.coling-1.157.pdf
Data
Resume NER, Weibo NER