GigaEmbeddings — Efficient Russian Language Embedding Model

Egor Kolodin, Anastasia Ianina


Abstract
We introduce GigaEmbeddings, a novel framework for training high-performance Russian-focused text embeddings through hierarchical instruction tuning of a decoder-only LLM designed specifically for the Russian language (GigaChat-3B). Our three-stage pipeline, comprising large-scale contrastive pre-training on web-scale corpora, fine-tuning with hard negatives, and multitask generalization across retrieval, classification, and clustering tasks, addresses key limitations of existing methods by unifying diverse objectives and leveraging synthetic data generation. Architectural innovations include bidirectional attention for contextual modeling, latent attention pooling for robust sequence aggregation, and strategic pruning of 25% of transformer layers to enhance efficiency without compromising performance. Evaluated on the ruMTEB benchmark spanning 23 tasks, GigaEmbeddings achieves state-of-the-art results (69.1 avg. score), outperforming strong baselines that have substantially more parameters.
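The latent attention pooling mentioned in the abstract can be illustrated with a minimal sketch: a small set of learned "latent" query vectors attends over the token hidden states, and the attended results are averaged into a single embedding. This is a dependency-free toy in plain Python; the function name, the scaled dot-product scoring, and the final averaging over latents are illustrative assumptions, not the paper's exact implementation (which operates on the model's learned parameters inside the transformer).

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def latent_attention_pool(hidden, latents):
    """Pool a variable-length sequence into one fixed-size vector.

    hidden  -- token hidden states, a list of [d]-vectors (length = seq_len)
    latents -- learned latent queries, a list of [d]-vectors (length = num_latents)

    Each latent query attends over all tokens (scaled dot-product attention),
    and the per-latent outputs are averaged into the final pooled embedding.
    """
    d = len(hidden[0])
    pooled = [0.0] * d
    for q in latents:
        # attention scores of this latent query against every token
        scores = [sum(qi * hi for qi, hi in zip(q, h)) / math.sqrt(d)
                  for h in hidden]
        weights = softmax(scores)
        # accumulate the attention-weighted token states, averaged over latents
        for w, h in zip(weights, hidden):
            for i in range(d):
                pooled[i] += w * h[i] / len(latents)
    return pooled
```

Because the output dimensionality depends only on the hidden size, the same pooling works for any input length, which is what makes it a robust alternative to last-token or mean pooling for embedding models.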
Anthology ID:
2025.bsnlp-1.3
Volume:
Proceedings of the 10th Workshop on Slavic Natural Language Processing (Slavic NLP 2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Jakub Piskorski, Pavel Přibáň, Preslav Nakov, Roman Yangarber, Michal Marcinczuk
Venues:
BSNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
17–24
URL:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bsnlp-1.3/
Cite (ACL):
Egor Kolodin and Anastasia Ianina. 2025. GigaEmbeddings — Efficient Russian Language Embedding Model. In Proceedings of the 10th Workshop on Slavic Natural Language Processing (Slavic NLP 2025), pages 17–24, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
GigaEmbeddings — Efficient Russian Language Embedding Model (Kolodin & Ianina, BSNLP 2025)
PDF:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bsnlp-1.3.pdf