Wikivecs: A Fully Reproducible Vectorization of Multilingual Wikipedia

Brandon Duderstadt


Abstract
Dense vector representations have become foundational to modern natural language processing (NLP), powering diverse workflows from semantic search and retrieval-augmented generation to content comparison across languages. Although Wikipedia is one of the most comprehensive and widely used datasets in modern NLP research, it lacks a fully reproducible and permissively licensed dense vectorization. In this paper, we present Wikivecs, a fully reproducible, permissively licensed dataset containing dense vector embeddings for every article in Multilingual Wikipedia. Our pipeline leverages a fully reproducible and permissively licensed multilingual text encoder to embed Wikipedia articles into a unified vector space, making it easy to compare and analyze content across languages. Alongside these vectors, we release a two-dimensional data map derived from the vectors, enabling visualization and exploration of Multilingual Wikipedia's content landscape. We demonstrate the utility of our dataset by identifying several content gaps between English and Russian Wikipedia.
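The abstract describes the pipeline only at a high level. As a rough illustration of its general shape, the sketch below embeds article text from two languages with a multilingual sentence encoder and projects the vectors to 2D for a data map. The specific model name, the sentence-transformers and umap-learn libraries, and the UMAP projection are assumptions for illustration; the paper's actual encoder and map construction may differ.

```python
# Hypothetical sketch of a multilingual Wikipedia vectorization pipeline.
# Model choice and UMAP projection are illustrative assumptions, not
# details taken from the paper.
from sentence_transformers import SentenceTransformer
import umap  # provided by the umap-learn package

# Toy stand-ins for article text in English and Russian.
articles = [
    "Vienna is the capital of Austria.",   # en
    "Вена — столица Австрии.",             # ru
]

# Any permissively licensed multilingual encoder would fit here;
# this model name is illustrative only.
model = SentenceTransformer(
    "sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2"
)

# Embed both articles into a shared vector space; normalized vectors
# make cross-lingual cosine comparison straightforward.
vectors = model.encode(articles, normalize_embeddings=True)

# Cross-lingual similarity: aligned articles should score high.
print("en/ru cosine similarity:", float(vectors[0] @ vectors[1]))

# Reduce to a two-dimensional "data map" for visualization.
coords = umap.UMAP(n_components=2, metric="cosine").fit_transform(vectors)
print(coords.shape)  # (2, 2): one (x, y) point per article
```

In a pipeline like this, low similarity between an article and its nearest cross-lingual neighbors is one plausible signal of a content gap of the kind the abstract reports between English and Russian Wikipedia.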
Anthology ID:
2025.wikinlp-1.1
Volume:
Proceedings of the 2nd Workshop on Advancing Natural Language Processing for Wikipedia (WikiNLP 2025)
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Akhil Arora, Isaac Johnson, Lucie-Aimée Kaffee, Tzu-Sheng Kuo, Tiziano Piccardi, Indira Sen
Venues:
WikiNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
1–9
URL:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.wikinlp-1.1/
Cite (ACL):
Brandon Duderstadt. 2025. Wikivecs: A Fully Reproducible Vectorization of Multilingual Wikipedia. In Proceedings of the 2nd Workshop on Advancing Natural Language Processing for Wikipedia (WikiNLP 2025), pages 1–9, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Wikivecs: A Fully Reproducible Vectorization of Multilingual Wikipedia (Duderstadt, WikiNLP 2025)
PDF:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.wikinlp-1.1.pdf