BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings

Xianming Li, Jing Li


Abstract
Sentence embeddings are crucial for measuring semantic similarity. Most recent studies employ large language models (LLMs) to learn sentence embeddings; however, existing LLMs mainly adopt an autoregressive architecture without explicit backward dependency modeling. We therefore examine the effects of backward dependencies in LLMs for semantic similarity measurement. Concretely, we propose a novel model, the backward dependency enhanced large language model (BeLLM), which learns sentence embeddings by transforming specific attention layers from uni- to bi-directional. We experiment extensively across various semantic textual similarity (STS) tasks and downstream applications. BeLLM achieves state-of-the-art performance in varying scenarios, showing that autoregressive LLMs benefit from backward dependencies for sentence embeddings.
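
The architectural idea named in the abstract, switching selected attention layers from causal (uni-directional) to full (bi-directional) masking, can be illustrated with a minimal, self-contained PyTorch sketch. The single-head self_attention function, the causal flag, and the toy tensors below are illustrative assumptions, not the authors' implementation; the paper applies this kind of change to specific layers of a pretrained autoregressive LLM.

import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v, causal=True):
    # Single-head scaled dot-product self-attention.
    # causal=True  -> standard autoregressive masking (each token sees only the past).
    # causal=False -> no mask, so tokens also attend to later positions (backward dependencies).
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    if causal:
        # Mask out positions above the diagonal (future tokens).
        mask = torch.triu(torch.ones(scores.shape[-2:], dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Toy usage: 4 tokens, hidden size 8 (hypothetical sizes, for illustration only).
torch.manual_seed(0)
x = torch.randn(4, 8)
w_q, w_k, w_v = (torch.randn(8, 8) for _ in range(3))
uni = self_attention(x, w_q, w_k, w_v, causal=True)   # uni-directional (standard LLM layer)
bi = self_attention(x, w_q, w_k, w_v, causal=False)   # bi-directional (backward-enhanced layer)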
Anthology ID:
2024.naacl-long.45
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
792–804
URL:
https://aclanthology.org/2024.naacl-long.45
Cite (ACL):
Xianming Li and Jing Li. 2024. BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 792–804, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
BeLLM: Backward Dependency Enhanced Large Language Model for Sentence Embeddings (Li & Li, NAACL 2024)
PDF:
https://preview.aclanthology.org/naacl24-info/2024.naacl-long.45.pdf
Copyright:
2024.naacl-long.45.copyright.pdf