One Sentence, Two Embeddings: Contrastive Learning of Explicit and Implicit Semantic Representations
Kohei Oda, Po-Min Chuang, Kiyoaki Shirai, Natthawut Kertkeidkachorn
Abstract
Sentence embedding methods have made remarkable progress, yet they still struggle to capture the implicit semantics within sentences. This can be attributed to an inherent limitation of conventional sentence embedding methods: they assign only a single vector per sentence. To overcome this limitation, we propose DualCSE, a sentence embedding method that assigns two embeddings to each sentence: one representing the explicit semantics and the other representing the implicit semantics. These embeddings coexist in a shared space, enabling the selection of the desired semantics for specific purposes such as information retrieval and text classification. Experimental results demonstrate that DualCSE can effectively encode both explicit and implicit meanings and improve performance on downstream tasks.
- Anthology ID:
- 2026.findings-eacl.74
- Volume:
- Findings of the Association for Computational Linguistics: EACL 2026
- Month:
- March
- Year:
- 2026
- Address:
- Rabat, Morocco
- Editors:
- Vera Demberg, Kentaro Inui, Lluís Marquez
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1444–1452
- URL:
- https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.74/
- Cite (ACL):
- Kohei Oda, Po-Min Chuang, Kiyoaki Shirai, and Natthawut Kertkeidkachorn. 2026. One Sentence, Two Embeddings: Contrastive Learning of Explicit and Implicit Semantic Representations. In Findings of the Association for Computational Linguistics: EACL 2026, pages 1444–1452, Rabat, Morocco. Association for Computational Linguistics.
- Cite (Informal):
- One Sentence, Two Embeddings: Contrastive Learning of Explicit and Implicit Semantic Representations (Oda et al., Findings 2026)
- PDF:
- https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.74.pdf