Learning Multilingual Sentence Representations with Cross-lingual Consistency Regularization

Pengzhi Gao, Liwen Zhang, Zhongjun He, Hua Wu, Haifeng Wang


Abstract
Multilingual sentence representations are the foundation for similarity-based bitext mining, which is crucial for scaling multilingual neural machine translation (NMT) systems to more languages. In this paper, we introduce MuSR: a one-for-all Multilingual Sentence Representation model that supports 223 languages. Leveraging billions of English-centric parallel sentence pairs, we train a multilingual Transformer encoder, coupled with an auxiliary Transformer decoder, by adopting a multilingual NMT framework with CrossConST, a cross-lingual consistency regularization technique proposed in Gao et al. (2023). Experimental results on multilingual similarity search and bitext mining tasks show the effectiveness of our approach. Specifically, MuSR achieves superior performance over LASER3 (Heffernan et al., 2022), which consists of 148 independent multilingual sentence encoders.
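The training objective builds on CrossConST (Gao et al., 2023). As a rough sketch only (not the authors' implementation; all function and variable names below are illustrative), the regularizer can be viewed as a KL term that pulls the decoder's output distribution obtained from the target-side input toward the one obtained from the source-side input, added to the usual translation cross-entropy:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the vocabulary axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, target_ids):
    # mean negative log-likelihood of the gold target tokens
    return -np.mean(np.log(probs[np.arange(len(target_ids)), target_ids]))

def kl_divergence(p, q):
    # KL(p || q): summed over the vocabulary, averaged over positions
    return np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1))

def crossconst_loss(logits_src, logits_tgt, target_ids, alpha=1.0):
    """Illustrative CrossConST-style objective: translation cross-entropy
    plus a consistency term between the decoder distributions produced
    when the encoder reads the source sentence (logits_src) versus the
    target sentence itself (logits_tgt). alpha weights the regularizer."""
    p_src = softmax(logits_src)  # p(y | x): decoder output given the source
    p_tgt = softmax(logits_tgt)  # p(y | y): decoder output given the target
    return cross_entropy(p_src, target_ids) + alpha * kl_divergence(p_src, p_tgt)
```

When the two input views yield identical distributions, the KL term vanishes and the objective reduces to standard NMT cross-entropy; otherwise the extra term encourages language-agnostic encoder representations.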
Anthology ID:
2023.emnlp-industry.25
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
December
Year:
2023
Address:
Singapore
Editors:
Mingxuan Wang, Imed Zitouni
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
243–262
URL:
https://aclanthology.org/2023.emnlp-industry.25
DOI:
10.18653/v1/2023.emnlp-industry.25
Cite (ACL):
Pengzhi Gao, Liwen Zhang, Zhongjun He, Hua Wu, and Haifeng Wang. 2023. Learning Multilingual Sentence Representations with Cross-lingual Consistency Regularization. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 243–262, Singapore. Association for Computational Linguistics.
Cite (Informal):
Learning Multilingual Sentence Representations with Cross-lingual Consistency Regularization (Gao et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.emnlp-industry.25.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.emnlp-industry.25.mp4