Abstract
We trained a model to automatically transliterate Judeo-Arabic texts into Arabic script, enabling Arabic readers to access those writings. We employ a recurrent neural network (RNN) combined with the connectionist temporal classification (CTC) loss to handle unequal input/output lengths. This necessitates adjustments to the training data to avoid input sequences that are shorter than their corresponding outputs. We also utilize a pretraining stage with a different loss function to improve network convergence. Since only a single source of parallel text was available for training, we take advantage of the possibility of generating data synthetically. We train a model that can memorize words in the output language and that also uses context to resolve transliteration ambiguities. We improve on the baseline's 9.5% character error rate, achieving 2% error with our best configuration. To measure the contribution of context to learning, we also tested word-shuffled data, for which the error rises to 2.5%.
- Anthology ID:
- 2020.wanlp-1.8
- Volume:
- Proceedings of the Fifth Arabic Natural Language Processing Workshop
- Month:
- December
- Year:
- 2020
- Address:
- Barcelona, Spain (Online)
- Editors:
- Imed Zitouni, Muhammad Abdul-Mageed, Houda Bouamor, Fethi Bougares, Mahmoud El-Haj, Nadi Tomeh, Wajdi Zaghouani
- Venue:
- WANLP
- SIG:
- SIGARAB
- Publisher:
- Association for Computational Linguistics
- Pages:
- 85–96
- URL:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2020.wanlp-1.8/
- Cite (ACL):
- Ori Terner, Kfir Bar, and Nachum Dershowitz. 2020. Transliteration of Judeo-Arabic Texts into Arabic Script Using Recurrent Neural Networks. In Proceedings of the Fifth Arabic Natural Language Processing Workshop, pages 85–96, Barcelona, Spain (Online). Association for Computational Linguistics.
- Cite (Informal):
- Transliteration of Judeo-Arabic Texts into Arabic Script Using Recurrent Neural Networks (Terner et al., WANLP 2020)
- PDF:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2020.wanlp-1.8.pdf
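The abstract notes that CTC requires input sequences at least as long as their corresponding outputs, motivating the authors' training-data adjustments. As an illustrative sketch (not code from the paper), a minimal pure-Python CTC forward pass shows why: when the input has fewer timesteps than the target has labels, no valid alignment exists and the total probability is zero.

```python
# Minimal CTC forward algorithm over plain probabilities (small inputs only).
# Illustrative only; real systems use log-space implementations such as
# torch.nn.CTCLoss.

BLANK = 0  # index of the CTC blank symbol

def ctc_forward(probs, target):
    """probs: per-timestep distributions over the alphabet (index 0 = blank).
    target: label sequence without blanks.
    Returns the total probability of all CTC alignments of target."""
    # Extend the target with blanks: [_, y1, _, y2, ..., _]
    ext = [BLANK]
    for y in target:
        ext += [y, BLANK]
    S, T = len(ext), len(probs)
    alpha = [[0.0] * S for _ in range(T)]
    alpha[0][0] = probs[0][BLANK]
    if S > 1:
        alpha[0][1] = probs[0][ext[1]]
    for t in range(1, T):
        for s in range(S):
            a = alpha[t - 1][s]
            if s >= 1:
                a += alpha[t - 1][s - 1]
            # Skipping the in-between blank is allowed only when the
            # neighbouring labels differ.
            if s >= 2 and ext[s] != BLANK and ext[s] != ext[s - 2]:
                a += alpha[t - 1][s - 2]
            alpha[t][s] = a * probs[t][ext[s]]
    # Valid paths end on the last label or the trailing blank.
    return alpha[T - 1][S - 1] + (alpha[T - 1][S - 2] if S > 1 else 0.0)

# Uniform per-timestep distributions over alphabet {blank, 1, 2}.
u = [1 / 3, 1 / 3, 1 / 3]
print(ctc_forward([u, u], [1, 2]) > 0)    # T=2 >= U=2: an alignment exists
print(ctc_forward([u], [1, 2]) == 0.0)    # T=1 <  U=2: no valid alignment
```

This is the constraint the authors work around when preparing their parallel Judeo-Arabic/Arabic training pairs: any pair whose input is shorter than its output cannot receive probability mass under CTC.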