Neural Arabic Text Diacritization: State of the Art Results and a Novel Approach for Machine Translation

Ali Fadel, Ibraheem Tuffaha, Bara’ Al-Jawarneh, Mahmoud Al-Ayyoub


Abstract
In this work, we present several deep learning models for the automatic diacritization of Arabic text. Our models are built using two main approaches, viz. Feed-Forward Neural Network (FFNN) and Recurrent Neural Network (RNN), with several enhancements such as 100-hot encoding, embeddings, Conditional Random Field (CRF) and Block-Normalized Gradient (BNG). The models are tested on the only freely available benchmark dataset, and the results show that our models are either better than or on par with other models, which, unlike ours, require language-dependent post-processing steps. Moreover, we show that diacritics in Arabic can be used to enhance models for NLP tasks such as Machine Translation (MT) by proposing the Translation over Diacritization (ToD) approach.
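The Translation over Diacritization (ToD) idea described in the abstract feeds diacritic information into an MT model alongside the undiacritized text. A minimal, hypothetical sketch of one ingredient of such a pipeline — recombining a character sequence with per-character diacritic predictions into diacritized text — is shown below. The function name, label set, and label-to-mark mapping are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: pair each Arabic base character with a predicted
# diacritic label to produce diacritized text. The label ids and the
# subset of diacritics below are illustrative, not taken from the paper.

# A small subset of Arabic diacritics (harakat), keyed by label id.
DIACRITICS = {
    0: "",        # no diacritic
    1: "\u064E",  # fatha
    2: "\u064F",  # damma
    3: "\u0650",  # kasra
    4: "\u0652",  # sukun
}

def apply_diacritics(chars, labels):
    """Interleave base characters with their predicted diacritic marks."""
    if len(chars) != len(labels):
        raise ValueError("one diacritic label per character is required")
    return "".join(c + DIACRITICS[l] for c, l in zip(chars, labels))

# Example: the word كتب (k-t-b) with three fatha labels becomes كَتَبَ (kataba).
print(apply_diacritics("كتب", [1, 1, 1]))
```

In a ToD setup, the diacritic labels would come from a trained diacritization model rather than being supplied by hand, and the combined (or separately embedded) streams would then be consumed by the translation model.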
Anthology ID:
D19-5229
Volume:
Proceedings of the 6th Workshop on Asian Translation
Month:
November
Year:
2019
Address:
Hong Kong, China
Venues:
EMNLP | WAT | WS
Publisher:
Association for Computational Linguistics
Pages:
215–225
URL:
https://aclanthology.org/D19-5229
DOI:
10.18653/v1/D19-5229
Cite (ACL):
Ali Fadel, Ibraheem Tuffaha, Bara’ Al-Jawarneh, and Mahmoud Al-Ayyoub. 2019. Neural Arabic Text Diacritization: State of the Art Results and a Novel Approach for Machine Translation. In Proceedings of the 6th Workshop on Asian Translation, pages 215–225, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Neural Arabic Text Diacritization: State of the Art Results and a Novel Approach for Machine Translation (Fadel et al., EMNLP 2019)
PDF:
https://preview.aclanthology.org/update-css-js/D19-5229.pdf
Attachment:
D19-5229.Attachment.zip
Code
AliOsm/shakkelha (+ additional community code)
Data
Arabic Text Diacritization