Advancing Arabic Diacritization: Improved Datasets, Benchmarking, and State-of-the-Art Models

Abubakr Mohamed, Hamdy Mubarak


Abstract
Arabic diacritics, analogous to short vowels in English, provide phonetic and grammatical information but are typically omitted in written Arabic, leading to ambiguity. Diacritization (also known as diacritic restoration or vowelization) is essential for natural language processing. This paper advances Arabic diacritization through the following contributions: first, we propose a methodology to analyze and refine a large diacritized corpus to improve training quality. Second, we introduce WikiNews-2024, a multi-reference evaluation methodology with an updated version of the standard benchmark “WikiNews-2014”. In addition, we explore various model architectures and propose a BiLSTM-based model that achieves state-of-the-art results with 3.12% and 2.70% WER on WikiNews-2014 and WikiNews-2024, respectively. Moreover, we develop a model that preserves user-specified diacritics while maintaining accuracy. Lastly, we demonstrate that augmenting training data enhances performance in low-resource settings.
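The abstract reports results in WER (word error rate). For diacritization, WER is commonly computed as the fraction of words whose predicted diacritized form differs from the reference. The sketch below illustrates this common definition only; it is not the paper's evaluation code, and the exact scoring (e.g. multi-reference matching in WikiNews-2024) may differ. The helper names are hypothetical.

```python
# Illustrative sketch (assumption: the usual word-level WER definition for
# diacritization — a word counts as an error if any diacritic differs).

# Arabic combining diacritic marks: fathatan..sukun (U+064B–U+0652).
ARABIC_DIACRITICS = set("\u064B\u064C\u064D\u064E\u064F\u0650\u0651\u0652")

def strip_diacritics(word: str) -> str:
    """Remove diacritic marks, leaving only the bare letters."""
    return "".join(ch for ch in word if ch not in ARABIC_DIACRITICS)

def diacritization_wer(predicted: list[str], reference: list[str]) -> float:
    """Fraction of words whose diacritized form mismatches the reference.

    Assumes the two sequences are aligned word-for-word (same bare letters),
    which holds for diacritization since only marks are added.
    """
    assert len(predicted) == len(reference)
    errors = sum(p != r for p, r in zip(predicted, reference))
    return errors / len(reference) if reference else 0.0

# Toy example: two words, with a wrong case ending on the second.
ref = ["\u0643\u064E\u062A\u064E\u0628\u064E", "\u0627\u0644\u0648\u064E\u0644\u064E\u062F\u064F"]
pred = ["\u0643\u064E\u062A\u064E\u0628\u064E", "\u0627\u0644\u0648\u064E\u0644\u064E\u062F\u064E"]
print(f"WER = {diacritization_wer(pred, ref):.2%}")  # → WER = 50.00%
```

Under this definition, a single wrong mark anywhere in a word (including the grammatical case ending) makes the whole word an error, which is why diacritization WER is a strict metric.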
Anthology ID:
2025.emnlp-main.846
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16718–16730
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.846/
Cite (ACL):
Abubakr Mohamed and Hamdy Mubarak. 2025. Advancing Arabic Diacritization: Improved Datasets, Benchmarking, and State-of-the-Art Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 16718–16730, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Advancing Arabic Diacritization: Improved Datasets, Benchmarking, and State-of-the-Art Models (Mohamed & Mubarak, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.846.pdf
Checklist:
 2025.emnlp-main.846.checklist.pdf