Transfer or Translate? Argument Mining in Arabic with No Native Annotations

Sara Nabhani, Khalid Al Khatib
Abstract
Argument mining for Arabic remains underexplored, largely due to the scarcity of annotated corpora. To address this gap, we examine the effectiveness of cross-lingual transfer from English. Using the English Persuasive Essays (PE) corpus, annotated with argumentative components (Major Claim, Claim, and Premise), we explore several transfer strategies: training encoder-based multilingual and monolingual models on English data, machine-translated Arabic data, and their combination. We further assess the impact of annotation noise introduced during translation by manually correcting portions of the projected training data. In addition, we investigate the potential of prompting large language models (LLMs) for the task. Experiments on a manually corrected Arabic test set show that monolingual models trained on translated data achieve the strongest performance, with further improvements from small-scale manual correction of training examples.
Anthology ID:
2025.arabicnlp-main.33
Volume:
Proceedings of The Third Arabic Natural Language Processing Conference
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Kareem Darwish, Ahmed Ali, Ibrahim Abu Farha, Samia Touileb, Imed Zitouni, Ahmed Abdelali, Sharefah Al-Ghamdi, Sakhar Alkhereyf, Wajdi Zaghouani, Salam Khalifa, Badr AlKhamissi, Rawan Almatham, Injy Hamed, Zaid Alyafeai, Areeb Alowisheq, Go Inoue, Khalil Mrini, Waad Alshammari
Venue:
ArabicNLP
Publisher:
Association for Computational Linguistics
Pages:
407–416
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.arabicnlp-main.33/
Cite (ACL):
Sara Nabhani and Khalid Al Khatib. 2025. Transfer or Translate? Argument Mining in Arabic with No Native Annotations. In Proceedings of The Third Arabic Natural Language Processing Conference, pages 407–416, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Transfer or Translate? Argument Mining in Arabic with No Native Annotations (Nabhani & Khatib, ArabicNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.arabicnlp-main.33.pdf