DyLex: Incorporating Dynamic Lexicons into BERT for Sequence Labeling

Baojun Wang, Zhao Zhang, Kun Xu, Guang-Yuan Hao, Yuyang Zhang, Lifeng Shang, Linlin Li, Xiao Chen, Xin Jiang, Qun Liu


Abstract
Incorporating lexical knowledge into deep learning models has proven highly effective for sequence labeling tasks. However, previous works commonly have difficulty dealing with large-scale dynamic lexicons, which often introduce excessive matching noise and require frequent updates. In this paper, we propose DyLex, a plug-in lexicon incorporation approach for BERT-based sequence labeling tasks. Instead of leveraging embeddings of the words in the lexicon, as in conventional methods, we adopt word-agnostic tag embeddings to avoid re-training the representation whenever the lexicon is updated. Moreover, we employ an effective supervised lexical knowledge denoising method to smooth out matching noise. Finally, we introduce a column-wise attention based knowledge fusion mechanism to guarantee the pluggability of the proposed framework. Experiments on ten datasets across three tasks show that the proposed framework achieves new state-of-the-art results, even with very large-scale lexicons.
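
The abstract only names the framework's components, so the following is a minimal, hypothetical PyTorch sketch of how word-agnostic tag embeddings might be fused into BERT hidden states with column-wise attention. Every name here (ColumnWiseFusion, the K-matches layout, the residual fusion step) is an illustrative assumption, not the authors' implementation; the actual code is in the linked huawei-noah/noah-research repository.

import torch
import torch.nn as nn

class ColumnWiseFusion(nn.Module):
    """Hypothetical sketch: fuse lexicon-match tag embeddings into BERT
    token states, attending column-wise over the K candidate matches
    aligned to each token position."""

    def __init__(self, hidden_size: int, num_tags: int):
        super().__init__()
        # Word-agnostic tag embeddings: one vector per lexicon *tag* rather
        # than per lexicon word, so the lexicon can grow or change without
        # re-training the representation.
        self.tag_emb = nn.Embedding(num_tags, hidden_size)
        self.score = nn.Linear(hidden_size * 2, 1)

    def forward(self, hidden: torch.Tensor, tag_ids: torch.Tensor) -> torch.Tensor:
        # hidden:  (batch, seq_len, hidden)  BERT token representations
        # tag_ids: (batch, K, seq_len)       tag sequences from K lexicon matches
        tags = self.tag_emb(tag_ids)                 # (batch, K, seq_len, hidden)
        query = hidden.unsqueeze(1).expand_as(tags)  # broadcast tokens over matches
        # Column-wise attention: for each token position (a "column"),
        # softmax over the K matched rows to weight their tag embeddings.
        scores = self.score(torch.cat([query, tags], dim=-1))  # (batch, K, seq_len, 1)
        weights = torch.softmax(scores, dim=1)
        fused = (weights * tags).sum(dim=1)          # (batch, seq_len, hidden)
        return hidden + fused                        # residual, plug-in fusion

Under this reading, the fusion depends only on tag embeddings, never on lexicon word embeddings, so swapping in an updated lexicon changes only the matched tag_ids at inference time while the learned parameters stay fixed; that is what would make the module pluggable.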
Anthology ID:
2021.emnlp-main.211
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2679–2693
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.emnlp-main.211/
DOI:
10.18653/v1/2021.emnlp-main.211
Cite (ACL):
Baojun Wang, Zhao Zhang, Kun Xu, Guang-Yuan Hao, Yuyang Zhang, Lifeng Shang, Linlin Li, Xiao Chen, Xin Jiang, and Qun Liu. 2021. DyLex: Incorporating Dynamic Lexicons into BERT for Sequence Labeling. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2679–2693, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
DyLex: Incorporating Dynamic Lexicons into BERT for Sequence Labeling (Wang et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2021.emnlp-main.211.pdf
Code:
huawei-noah/noah-research
Data:
CoNLL 2003