OpenMSD: Towards Multilingual Scientific Documents Similarity Measurement

Yang Gao, Ji Ma, Ivan Korotkov, Keith Hall, Dana Alon, Donald Metzler

Abstract
We develop and evaluate multilingual scientific document similarity measurement (SDSM) models in this work. Such models can be used to find related papers in different languages, helping multilingual researchers find and explore papers more efficiently. We propose the first multilingual scientific documents dataset, Open-access Multilingual Scientific Documents (OpenMSD), which has 74M papers in 103 languages and 778M citation pairs. With OpenMSD, we develop multilingual SDSM models by adjusting and extending the state-of-the-art methods designed for English SDSM tasks. We find that: (i) some highly successful methods in English SDSM yield significantly worse performance in multilingual SDSM; (ii) our best model, which enriches the non-English papers with English summaries, outperforms strong baselines by 7% (in mean average precision) on multilingual SDSM tasks, without compromising the performance on English SDSM tasks.
Anthology ID:
2024.lrec-main.1092
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Note:
Pages:
12467–12480
URL:
https://aclanthology.org/2024.lrec-main.1092
Cite (ACL):
Yang Gao, Ji Ma, Ivan Korotkov, Keith Hall, Dana Alon, and Donald Metzler. 2024. OpenMSD: Towards Multilingual Scientific Documents Similarity Measurement. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 12467–12480, Torino, Italia. ELRA and ICCL.
Cite (Informal):
OpenMSD: Towards Multilingual Scientific Documents Similarity Measurement (Gao et al., LREC-COLING 2024)
PDF:
https://preview.aclanthology.org/landing_page/2024.lrec-main.1092.pdf