Enhancing Pre-trained Chinese Character Representation with Word-aligned Attention

Yanzeng Li, Bowen Yu, Xue Mengge, Tingwen Liu


Abstract
Most Chinese pre-trained models take the character as the basic unit and learn representations from each character's external contexts, ignoring the semantics expressed by the word, which is the smallest meaningful unit in Chinese. Hence, we propose a novel word-aligned attention to exploit explicit word information, which is complementary to various character-based Chinese pre-trained language models. Specifically, we devise a pooling mechanism to align character-level attention to the word level, and we alleviate the potential propagation of segmentation errors through multi-source information fusion. As a result, word and character information are explicitly integrated during fine-tuning. Experimental results on five Chinese NLP benchmark tasks demonstrate that our method achieves significant improvements over BERT, ERNIE and BERT-wwm.
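The core idea can be illustrated with a short, hypothetical sketch (this is not the authors' released code in lsvih/MWA): character-level attention weights are pooled within each word span produced by an external segmenter and then broadcast back to the characters of that word, so word boundaries shape the attention while the model keeps character granularity. Max-pooling stands in here for the paper's pooling mechanism, and the attention shape and span format are assumptions.

```python
# Hypothetical sketch of word-aligned attention pooling (not the authors' implementation).
# Assumes: `attn` is a (seq_len, seq_len) character-level attention matrix from a
# pre-trained encoder, and `word_spans` are [start, end) character spans from a segmenter.
import torch

def word_aligned_attention(attn: torch.Tensor, word_spans):
    """Pool attention within each word span (max-pooling as a stand-in) and
    broadcast the pooled value back to the word's characters, then renormalize."""
    aligned = attn.clone()
    for start, end in word_spans:
        # attention each query character pays to this word, pooled over its characters
        pooled, _ = attn[:, start:end].max(dim=1, keepdim=True)
        aligned[:, start:end] = pooled  # broadcast back to every character of the word
    # renormalize so each row remains a probability distribution
    return aligned / aligned.sum(dim=1, keepdim=True)

# toy usage: 4 characters segmented into two 2-character words
attn = torch.softmax(torch.randn(4, 4), dim=-1)
spans = [(0, 2), (2, 4)]
print(word_aligned_attention(attn, spans))
```

In the full method, the same alignment is computed with several segmenters and the resulting attentions are fused, which is how the paper mitigates error propagation from any single segmentation.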
Anthology ID:
2020.acl-main.315
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3442–3448
URL:
https://aclanthology.org/2020.acl-main.315
DOI:
10.18653/v1/2020.acl-main.315
Cite (ACL):
Yanzeng Li, Bowen Yu, Xue Mengge, and Tingwen Liu. 2020. Enhancing Pre-trained Chinese Character Representation with Word-aligned Attention. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 3442–3448, Online. Association for Computational Linguistics.
Cite (Informal):
Enhancing Pre-trained Chinese Character Representation with Word-aligned Attention (Li et al., ACL 2020)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2020.acl-main.315.pdf
Video:
http://slideslive.com/38928721
Code
lsvih/MWA
Data
DRCD