Clustering-Aware Negative Sampling for Unsupervised Sentence Representation

Jinghao Deng, Fanqi Wan, Tao Yang, Xiaojun Quan, Rui Wang


Abstract
Contrastive learning has been widely studied in sentence representation learning. However, earlier works mainly focus on the construction of positive examples, while in-batch samples are often simply treated as negative examples. This approach overlooks the importance of selecting appropriate negative examples, potentially leading to a scarcity of hard negatives and the inclusion of false negatives. To address these issues, we propose ClusterNS (Clustering-aware Negative Sampling), a novel method that incorporates cluster information into contrastive learning for unsupervised sentence representation learning. We apply a modified K-means clustering algorithm to supply hard negatives and recognize in-batch false negatives during training, aiming to solve the two issues in one unified framework. Experiments on semantic textual similarity (STS) tasks demonstrate that our proposed ClusterNS compares favorably with baselines in unsupervised sentence representation learning. Our code has been made publicly available at github.com/djz233/ClusterNS.
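To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of clustering-aware negative sampling as described in the abstract. It is not the authors' implementation (see github.com/djz233/ClusterNS for that); the function name `cluster_ns_loss`, the choice of the second-nearest centroid as the hard negative, and the same-cluster rule for masking false negatives are illustrative assumptions.

```python
# Hypothetical sketch of clustering-aware negative sampling (NOT the
# authors' code). Centroids are assumed to come from a K-means procedure
# maintained elsewhere during training.
import torch
import torch.nn.functional as F

def cluster_ns_loss(z1, z2, centroids, temperature=0.05):
    """Contrastive loss with cluster-aware negatives.

    z1, z2:     (B, d) embeddings of two views of the same sentences.
    centroids:  (K, d) K-means centroids over recent sentence embeddings.
    """
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    centroids = F.normalize(centroids, dim=-1)

    # Assign each anchor to its nearest centroid.
    sim_to_centroids = z1 @ centroids.t()            # (B, K)
    own_cluster = sim_to_centroids.argmax(dim=-1)    # (B,)

    # Hard negative (assumption): the second-nearest centroid, i.e. a
    # representation that is close to the anchor but in a different cluster.
    sim_masked = sim_to_centroids.scatter(
        1, own_cluster.unsqueeze(1), float("-inf"))
    hard_neg = centroids[sim_masked.argmax(dim=-1)]  # (B, d)

    # In-batch logits; positives lie on the diagonal.
    logits = z1 @ z2.t() / temperature               # (B, B)

    # False-negative masking (assumption): in-batch samples that fall into
    # the anchor's own cluster are likely semantically similar, so they are
    # excluded from the negatives rather than pushed away.
    neighbor_cluster = (z2 @ centroids.t()).argmax(dim=-1)
    false_neg = own_cluster.unsqueeze(1) == neighbor_cluster.unsqueeze(0)
    false_neg.fill_diagonal_(False)                  # keep the true positive
    logits = logits.masked_fill(false_neg, float("-inf"))

    # Append the hard-negative similarity as one extra logit column.
    hard_logit = (z1 * hard_neg).sum(-1, keepdim=True) / temperature
    logits = torch.cat([logits, hard_logit], dim=1)  # (B, B+1)

    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)
```

This sketch simply takes the centroids as given; in the paper's unified framework the clustering is a modified K-means updated during training, and the details of hard-negative construction and false-negative handling may differ from the heuristics assumed here.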
Anthology ID:
2023.findings-acl.555
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8713–8729
URL:
https://aclanthology.org/2023.findings-acl.555
DOI:
10.18653/v1/2023.findings-acl.555
Cite (ACL):
Jinghao Deng, Fanqi Wan, Tao Yang, Xiaojun Quan, and Rui Wang. 2023. Clustering-Aware Negative Sampling for Unsupervised Sentence Representation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8713–8729, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Clustering-Aware Negative Sampling for Unsupervised Sentence Representation (Deng et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2023.findings-acl.555.pdf