SCD: Self-Contrastive Decorrelation of Sentence Embeddings

Tassilo Klein, Moin Nabi


Abstract
In this paper, we propose Self-Contrastive Decorrelation (SCD), a self-supervised approach. Given an input sentence, it optimizes a joint self-contrastive and decorrelation objective. Representation learning is facilitated by leveraging the contrast that arises from instantiating standard dropout at different rates. The proposed method is conceptually simple yet empirically powerful. It achieves results comparable to state-of-the-art methods on multiple benchmarks without using contrastive pairs. This study opens up avenues for efficient self-supervised learning methods that are more robust than current contrastive methods.
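The abstract describes a training signal built from two embeddings of the same sentence, obtained by running the encoder with two different dropout rates, fed to a joint objective that contrasts the two views while decorrelating their feature dimensions. The PyTorch sketch below illustrates one plausible reading of such an objective; the function name, the Barlow-Twins-style decorrelation term, and the weighting hyperparameters are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a joint self-contrastive + decorrelation objective.
# Assumes z1, z2 are embeddings of the SAME sentences produced by an
# encoder run with two different dropout rates. Names and weights are
# illustrative, not the authors' exact formulation.
import torch
import torch.nn.functional as F

def scd_style_loss(z1, z2, lam=0.005, alpha=1.0):
    """z1, z2: (batch, dim) embeddings of the same sentences under
    two different dropout rates."""
    # Self-contrastive term: contrast the two views of each sentence
    # (mean cosine similarity; minimizing it pushes the views apart).
    contrast = F.cosine_similarity(z1, z2, dim=-1).mean()

    # Decorrelation term (Barlow-Twins-style): the cross-correlation
    # matrix of the standardized views should be close to identity.
    n, d = z1.shape
    z1n = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)
    z2n = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = (z1n.T @ z2n) / n  # (dim, dim) cross-correlation matrix
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    decorrelation = on_diag + lam * off_diag

    return alpha * contrast + decorrelation
```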
Anthology ID:
2022.acl-short.44
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
394–400
URL:
https://aclanthology.org/2022.acl-short.44
DOI:
10.18653/v1/2022.acl-short.44
Cite (ACL):
Tassilo Klein and Moin Nabi. 2022. SCD: Self-Contrastive Decorrelation of Sentence Embeddings. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 394–400, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
SCD: Self-Contrastive Decorrelation of Sentence Embeddings (Klein & Nabi, ACL 2022)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2022.acl-short.44.pdf
Code
 SAP-samples/acl2022-self-contrastive-decorrelation
Data
MPQA Opinion Corpus, MRPC, SICK, SST, SentEval