Jiangxia Cao
2022
Enhancing Chinese Pre-trained Language Model via Heterogeneous Linguistics Graph
Yanzeng Li | Jiangxia Cao | Xin Cong | Zhenyu Zhang | Bowen Yu | Hongsong Zhu | Tingwen Liu
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Chinese pre-trained language models usually exploit contextual character information to learn representations, while ignoring linguistic knowledge such as word and sentence information. Hence, we propose a task-free enhancement module termed Heterogeneous Linguistics Graph (HLG) that enhances Chinese pre-trained language models by integrating linguistic knowledge. Specifically, we construct a hierarchical heterogeneous graph to model the characteristic linguistic structure of the Chinese language, and apply a graph-based method to summarize and concretize information across the different granularities of the Chinese linguistic hierarchy. Experimental results demonstrate that our model improves the performance of vanilla BERT, BERT-wwm and ERNIE 1.0 on 6 natural language processing tasks over 10 benchmark datasets. Further, detailed experimental analyses show that this modeling approach yields larger improvements than the previous strong baseline MWA. Meanwhile, our model introduces far fewer parameters (about half of MWA), and its training/inference speed is about 7x faster than MWA.
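To make the "summarize and concretize" idea in the abstract concrete, the sketch below illustrates one plausible reading: character embeddings are pooled into word nodes induced by several word segmenters, then broadcast back to characters and fused. The segmentation spans, mean-pooling, and additive fusion here are illustrative assumptions, not the authors' exact HLG formulation.

```python
import numpy as np

def incidence_matrix(num_chars, word_spans):
    """Character-to-word incidence matrix for one segmenter.

    word_spans: list of (start, end) character spans (end exclusive),
    one span per word proposed by that segmenter.
    """
    m = np.zeros((num_chars, len(word_spans)))
    for j, (s, e) in enumerate(word_spans):
        m[s:e, j] = 1.0
    return m

def summarize(char_emb, c2w):
    """Character -> word: mean-pool each word's character embeddings."""
    counts = c2w.sum(axis=0, keepdims=True).T        # characters per word
    return (c2w.T @ char_emb) / np.maximum(counts, 1.0)

def concretize(word_emb, c2w):
    """Word -> character: broadcast each word embedding back to its characters."""
    return c2w @ word_emb

# Toy example: a 6-character sentence with two hypothetical segmenters
# that disagree on word boundaries (heterogeneous word-level views).
num_chars, dim = 6, 8
char_emb = np.random.randn(num_chars, dim)            # e.g. BERT character outputs

seg_a = [(0, 2), (2, 4), (4, 6)]                      # segmenter A: three 2-char words
seg_b = [(0, 3), (3, 6)]                              # segmenter B: two 3-char words

fused = np.zeros_like(char_emb)
for spans in (seg_a, seg_b):
    c2w = incidence_matrix(num_chars, spans)
    word_emb = summarize(char_emb, c2w)               # character -> word
    fused += concretize(word_emb, c2w)                # word -> character

# Average the word-level views and mix them back into the character stream.
enhanced = char_emb + fused / 2
print(enhanced.shape)                                  # (6, 8)
```

In this toy form the character-word aggregation is just sparse matrix multiplication, which is consistent with the abstract's claim that a graph-based enhancement can stay lightweight in parameters and fast at training/inference.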