Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models

Yuxuan Lai, Yijia Liu, Yansong Feng, Songfang Huang, Dongyan Zhao


Abstract
Chinese pre-trained language models usually process text as a sequence of characters while ignoring coarser granularities, e.g., words. In this work, we propose a novel pre-training paradigm for Chinese, Lattice-BERT, which explicitly incorporates word representations along with characters and thus can model a sentence in a multi-granularity manner. Specifically, we construct a lattice graph from the characters and words in a sentence and feed all these text units into transformers. We design a lattice position attention mechanism to exploit the lattice structures in self-attention layers. We further propose a masked segment prediction task to push the model to learn from the rich but redundant information inherent in lattices, while avoiding learning unexpected tricks. Experiments on 11 Chinese natural language understanding tasks show that our model brings an average increase of 1.5% under the 12-layer setting, achieving new state-of-the-art results among base-size models on the CLUE benchmarks. Further analysis shows that Lattice-BERT can harness the lattice structures, and that the improvement comes from the exploration of redundant information and multi-granularity representations. Our code will be available at https://github.com/alibaba/pretrained-language-models/LatticeBERT.
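To make the lattice construction in the abstract concrete, here is a minimal, illustrative sketch (not the authors' implementation) of how a character-word lattice can be enumerated for a Chinese sentence: every character is a node, and every word from a toy lexicon that matches a character span is an additional, overlapping node. The names `build_lattice`, `lexicon`, and `max_word_len` are hypothetical; in Lattice-BERT the resulting span boundaries would feed the lattice position attention.

```python
# Illustrative sketch of character-word lattice construction (assumed, simplified;
# not the authors' code). Each node is (text_unit, start, end) over character positions.

def build_lattice(sentence, lexicon, max_word_len=4):
    """Enumerate lattice nodes: all characters plus all lexicon words found in the sentence."""
    nodes = []
    # Finest granularity: every character is a node.
    for i, ch in enumerate(sentence):
        nodes.append((ch, i, i + 1))
    # Coarser granularity: every lexicon word matching a span is an additional node.
    for i in range(len(sentence)):
        for j in range(i + 2, min(i + max_word_len, len(sentence)) + 1):
            span = sentence[i:j]
            if span in lexicon:
                nodes.append((span, i, j))
    return nodes

if __name__ == "__main__":
    lexicon = {"研究", "研究生", "生命", "起源"}  # toy word vocabulary
    for node in build_lattice("研究生命起源", lexicon):
        print(node)
    # Characters plus overlapping words such as 研究 / 研究生 / 生命 / 起源 form the lattice,
    # giving the redundant multi-granularity units the pre-training tasks exploit.
```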
Anthology ID:
2021.naacl-main.137
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1716–1731
URL:
https://aclanthology.org/2021.naacl-main.137
DOI:
10.18653/v1/2021.naacl-main.137
Cite (ACL):
Yuxuan Lai, Yijia Liu, Yansong Feng, Songfang Huang, and Dongyan Zhao. 2021. Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1716–1731, Online. Association for Computational Linguistics.
Cite (Informal):
Lattice-BERT: Leveraging Multi-Granularity Representations in Chinese Pre-trained Language Models (Lai et al., NAACL 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.naacl-main.137.pdf
Video:
 https://preview.aclanthology.org/emnlp-22-attachments/2021.naacl-main.137.mp4
Code:
alibaba/AliceMind (+ additional community code)
Data:
CLUE, CMNLI