Self-Guided Contrastive Learning for BERT Sentence Representations

Taeuk Kim, Kang Min Yoo, Sang-goo Lee


Abstract
Although BERT and its variants have reshaped the NLP landscape, it remains unclear how best to derive sentence embeddings from such pre-trained Transformers. In this work, we propose a contrastive learning method that utilizes self-guidance for improving the quality of BERT sentence representations. Our method fine-tunes BERT in a self-supervised fashion, does not rely on data augmentation, and enables the usual [CLS] token embeddings to function as sentence vectors. Moreover, we redesign the contrastive learning objective (NT-Xent) and apply it to sentence representation learning. We demonstrate with extensive experiments that our approach is more effective than competitive baselines on diverse sentence-related tasks. We also show it is efficient at inference and robust to domain shifts.
Anthology ID:
2021.acl-long.197
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
2528–2540
URL:
https://aclanthology.org/2021.acl-long.197
DOI:
10.18653/v1/2021.acl-long.197
Cite (ACL):
Taeuk Kim, Kang Min Yoo, and Sang-goo Lee. 2021. Self-Guided Contrastive Learning for BERT Sentence Representations. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 2528–2540, Online. Association for Computational Linguistics.
Cite (Informal):
Self-Guided Contrastive Learning for BERT Sentence Representations (Kim et al., ACL 2021)
PDF:
https://preview.aclanthology.org/update-css-js/2021.acl-long.197.pdf
Data
GLUE | SentEval