Comparison and Combination of Sentence Embeddings Derived from Different Supervision Signals

Hayato Tsukagoshi, Ryohei Sasano, Koichi Takeda


Abstract
There have been many successful applications of sentence embedding methods. However, it is not well understood which properties the resulting sentence embeddings capture depending on the supervision signal. In this paper, we focus on two types of sentence embedding methods with similar architectures and tasks: one fine-tunes pre-trained language models on a natural language inference task, and the other fine-tunes pre-trained language models on a word prediction task from the word's definition sentence, and we investigate their properties. Specifically, we compare their performance on semantic textual similarity (STS) tasks, using STS datasets partitioned from two perspectives: 1) the sentence source and 2) the superficial similarity of the sentence pairs, and we also compare their performance on downstream and probing tasks. Furthermore, we combine the two methods and demonstrate that the combination yields substantially better performance than either method alone on unsupervised STS tasks and downstream tasks.
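The abstract does not specify how the two methods are combined; a minimal sketch, assuming the combination is vector concatenation of the two encoders' outputs and that STS pairs are scored by cosine similarity, might look like the following. The embeddings and encoder names here are hypothetical stand-ins, not the paper's actual models.

```python
import math

def combine(emb_a, emb_b):
    """Concatenate two embeddings of the same sentence into one vector.
    Hypothetically, emb_a could come from an NLI-tuned encoder and
    emb_b from a definition-tuned encoder."""
    return list(emb_a) + list(emb_b)

def cosine_similarity(u, v):
    """Standard cosine similarity, commonly used to score STS pairs."""
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(x * x for x in v))
    return dot / (norm_u * norm_v)

# Toy embeddings for a sentence pair, one vector per encoder (made up).
nli_1, def_1 = [0.2, 0.8], [0.5, 0.1]
nli_2, def_2 = [0.1, 0.9], [0.4, 0.2]

score = cosine_similarity(combine(nli_1, def_1), combine(nli_2, def_2))
```

Concatenation is a common way to merge embeddings trained under different supervision signals, since it preserves both representations and lets the similarity metric draw on each.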
Anthology ID:
2022.starsem-1.12
Volume:
Proceedings of the 11th Joint Conference on Lexical and Computational Semantics
Month:
July
Year:
2022
Address:
Seattle, Washington
Venue:
*SEM
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
139–150
URL:
https://aclanthology.org/2022.starsem-1.12
DOI:
10.18653/v1/2022.starsem-1.12
Cite (ACL):
Hayato Tsukagoshi, Ryohei Sasano, and Koichi Takeda. 2022. Comparison and Combination of Sentence Embeddings Derived from Different Supervision Signals. In Proceedings of the 11th Joint Conference on Lexical and Computational Semantics, pages 139–150, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Comparison and Combination of Sentence Embeddings Derived from Different Supervision Signals (Tsukagoshi et al., *SEM 2022)
PDF:
https://preview.aclanthology.org/remove-xml-comments/2022.starsem-1.12.pdf
Data
SentEval