Ditto: A Simple and Efficient Approach to Improve Sentence Embeddings

Qian Chen, Wen Wang, Qinglin Zhang, Siqi Zheng, Chong Deng, Hai Yu, Jiaqing Liu, Yukun Ma, Chong Zhang

Abstract
Prior studies diagnose the anisotropy problem in sentence representations from pre-trained language models, e.g., BERT, without fine-tuning. Our analysis reveals that sentence embeddings from BERT suffer from a bias towards uninformative words, limiting their performance on semantic textual similarity (STS) tasks. To address this bias, we propose a simple and efficient unsupervised approach, Diagonal Attention Pooling (Ditto), which weights words with model-based importance estimations and computes the weighted average of word representations from pre-trained models as sentence embeddings. Ditto can be easily applied to any pre-trained language model as a postprocessing operation. Compared to prior sentence embedding approaches, Ditto adds no parameters and requires no learning. Empirical evaluations demonstrate that Ditto alleviates the anisotropy problem and improves various pre-trained models on STS benchmarks.
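The abstract describes Ditto as weighting each word by a model-based importance estimate (the diagonal of a self-attention map) and averaging word representations with those weights. Below is a minimal sketch of that idea, assuming bert-base-uncased, a particular layer/head choice, and last-layer hidden states; the paper's actual layer/head selection and hidden-state combination may differ.

```python
# Hedged sketch of diagonal attention pooling (Ditto) for sentence embeddings.
# Assumptions not taken from the page above: model name, choice of layer=0 and
# head=0 for the diagonal attention weights, and use of last-layer hidden states.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def ditto_embedding(sentence: str, layer: int = 0, head: int = 0) -> torch.Tensor:
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # outputs.attentions: one (batch, num_heads, seq_len, seq_len) tensor per layer
    attn = outputs.attentions[layer][0, head]        # (seq_len, seq_len)
    weights = attn.diagonal()                        # each token's attention to itself
    weights = weights / weights.sum()                # normalize into importance weights
    hidden = outputs.last_hidden_state[0]            # (seq_len, hidden_size)
    # weighted average of token representations as the sentence embedding
    return (weights.unsqueeze(-1) * hidden).sum(dim=0)

emb = ditto_embedding("A simple sentence for testing.")
print(emb.shape)  # torch.Size([768]) for bert-base-uncased
```

Because the pooling is a postprocessing step over attentions and hidden states that the model already produces, it can be applied to any pre-trained Transformer encoder without extra parameters or training, consistent with the abstract's claim.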
Anthology ID:
2023.emnlp-main.359
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5868–5875
URL:
https://aclanthology.org/2023.emnlp-main.359
DOI:
10.18653/v1/2023.emnlp-main.359
Cite (ACL):
Qian Chen, Wen Wang, Qinglin Zhang, Siqi Zheng, Chong Deng, Hai Yu, Jiaqing Liu, Yukun Ma, and Chong Zhang. 2023. Ditto: A Simple and Efficient Approach to Improve Sentence Embeddings. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5868–5875, Singapore. Association for Computational Linguistics.
Cite (Informal):
Ditto: A Simple and Efficient Approach to Improve Sentence Embeddings (Chen et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/landing_page/2023.emnlp-main.359.pdf
Video:
https://preview.aclanthology.org/landing_page/2023.emnlp-main.359.mp4