Virtual Augmentation Supported Contrastive Learning of Sentence Representations

Dejiao Zhang, Wei Xiao, Henghui Zhu, Xiaofei Ma, Andrew Arnold


Abstract
Despite profound successes, contrastive representation learning relies on carefully designed data augmentations that exploit domain-specific knowledge. This challenge is magnified in natural language processing, where no general rules for data augmentation exist due to the discrete nature of natural language. We tackle this challenge by presenting Virtual augmentation Supported Contrastive Learning of sentence representations (VaSCL). Starting from the interpretation that data augmentation essentially constructs a neighborhood around each training instance, we in turn utilize the neighborhood to generate effective data augmentations. Leveraging the large training batch sizes of contrastive learning, we approximate the neighborhood of an instance via its K-nearest in-batch neighbors in the representation space. We then define an instance discrimination task with respect to this neighborhood and generate the virtual augmentation in an adversarial training manner. We assess the performance of VaSCL on a wide range of downstream tasks and set a new state of the art for unsupervised sentence representation learning.
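The following PyTorch sketch illustrates the mechanism the abstract describes: approximate each instance's neighborhood with its K-nearest in-batch neighbors, define an instance discrimination loss over that neighborhood, and obtain the virtual augmentation as a single adversarial gradient step. The function name, the hyperparameters (k, tau, eps), and the one-step perturbation are illustrative assumptions rather than the authors' implementation; see the linked amazon-research/sentence-representations repository for the official code.

```python
import torch
import torch.nn.functional as F


def vascl_style_loss(z, k=3, tau=0.05, eps=1.0):
    """Neighborhood-based virtual augmentation (illustrative sketch).

    z   : (B, d) batch of sentence embeddings from the encoder.
    k   : number of in-batch nearest neighbors approximating the neighborhood.
    tau : temperature of the instance-discrimination loss.
    eps : L2 radius of the adversarial (virtual) perturbation.
    """
    B, _ = z.shape
    zn = F.normalize(z, dim=1)

    # 1) Approximate each instance's neighborhood by its k-nearest
    #    in-batch neighbors in the representation space.
    with torch.no_grad():
        sim = zn @ zn.t()
        sim.fill_diagonal_(float("-inf"))      # exclude self-similarity
        knn_idx = sim.topk(k, dim=1).indices   # (B, k)

    def neighborhood_loss(delta):
        # Instance discrimination w.r.t. the neighborhood: the perturbed
        # view must remain closest to its own instance, not its neighbors.
        za = F.normalize(z + delta, dim=1)
        pos = (za * zn).sum(-1, keepdim=True)              # (B, 1)
        neg = torch.einsum("bd,bkd->bk", za, zn[knn_idx])  # (B, k)
        logits = torch.cat([pos, neg], dim=1) / tau
        labels = torch.zeros(B, dtype=torch.long, device=z.device)
        return F.cross_entropy(logits, labels)

    # 2) One-step adversarial search: perturb each embedding in the
    #    direction that most confuses the neighborhood discrimination.
    delta = 1e-3 * torch.randn_like(z)
    delta.requires_grad_(True)
    grad, = torch.autograd.grad(neighborhood_loss(delta), delta,
                                retain_graph=True)
    delta_adv = eps * F.normalize(grad, dim=1)  # detached worst-case direction

    # 3) Train the encoder to stay robust to the virtual augmentation.
    return neighborhood_loss(delta_adv)
```

In the full method this neighborhood term would be combined with a standard unsupervised contrastive objective over positive pairs; the sketch above isolates only the virtual-augmentation component.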
Anthology ID:
2022.findings-acl.70
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
864–876
URL:
https://aclanthology.org/2022.findings-acl.70
DOI:
10.18653/v1/2022.findings-acl.70
Cite (ACL):
Dejiao Zhang, Wei Xiao, Henghui Zhu, Xiaofei Ma, and Andrew Arnold. 2022. Virtual Augmentation Supported Contrastive Learning of Sentence Representations. In Findings of the Association for Computational Linguistics: ACL 2022, pages 864–876, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Virtual Augmentation Supported Contrastive Learning of Sentence Representations (Zhang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.findings-acl.70.pdf
Code:
amazon-research/sentence-representations
Data:
GLUE