TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning

Yixuan Su, Fangyu Liu, Zaiqiao Meng, Tian Lan, Lei Shu, Ehsan Shareghi, Nigel Collier


Abstract
Masked language models (MLMs) such as BERT have revolutionized the field of Natural Language Understanding in the past few years. However, existing pre-trained MLMs often output an anisotropic distribution of token representations that occupies a narrow subset of the entire representation space. Such token representations are not ideal, especially for tasks that demand discriminative semantic meanings of distinct tokens. In this work, we propose TaCL (Token-aware Contrastive Learning), a novel continual pre-training approach that encourages BERT to learn an isotropic and discriminative distribution of token representations. TaCL is fully unsupervised and requires no additional data. We extensively test our approach on a wide range of English and Chinese benchmarks. The results show that TaCL brings consistent and notable improvements over the original BERT model. Furthermore, we conduct a detailed analysis to reveal the merits and inner workings of our approach.
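The abstract describes the token-level contrastive objective only at a high level. The snippet below is a minimal, hypothetical PyTorch sketch of a token-level InfoNCE-style loss of the kind the abstract refers to; the function name, the teacher-student pairing, and the temperature value are illustrative assumptions rather than the authors' exact objective (see the official code at yxuansu/tacl for that).

# Illustrative sketch only -- not the official TaCL objective. Each token
# representation from one view of a sentence (e.g., a trainable "student"
# BERT) is pulled toward the same token's representation from another view
# (e.g., a frozen "teacher" BERT) and pushed away from the other tokens in
# the same sequence, which encourages a more isotropic, discriminative
# token space.
import torch
import torch.nn.functional as F

def token_contrastive_loss(student_hidden, teacher_hidden, temperature=0.07):
    # student_hidden, teacher_hidden: [batch, seq_len, hidden]
    s = F.normalize(student_hidden, dim=-1)
    t = F.normalize(teacher_hidden, dim=-1)
    # Similarity of every student token to every teacher token in the sentence.
    logits = torch.bmm(s, t.transpose(1, 2)) / temperature  # [B, L, L]
    # Positive pair: the same position in both views; all other positions
    # serve as in-sequence negatives.
    batch_size, seq_len, _ = logits.shape
    labels = torch.arange(seq_len, device=logits.device).expand(batch_size, seq_len)
    return F.cross_entropy(logits.reshape(-1, seq_len), labels.reshape(-1))

# Example usage with random tensors standing in for BERT hidden states:
student = torch.randn(2, 16, 768)
teacher = torch.randn(2, 16, 768)
loss = token_contrastive_loss(student, teacher)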
Anthology ID:
2022.findings-naacl.191
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2497–2507
URL:
https://aclanthology.org/2022.findings-naacl.191
DOI:
10.18653/v1/2022.findings-naacl.191
Cite (ACL):
Yixuan Su, Fangyu Liu, Zaiqiao Meng, Tian Lan, Lei Shu, Ehsan Shareghi, and Nigel Collier. 2022. TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2497–2507, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
TaCL: Improving BERT Pre-training with Token-aware Contrastive Learning (Su et al., Findings 2022)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2022.findings-naacl.191.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2022.findings-naacl.191.mp4
Code:
yxuansu/tacl (+ additional community code)
Data:
GLUE, QNLI