@inproceedings{qachfar-verma-2023-redaspersuasion-araieval,
    title = "{R}e{DASP}ersuasion at {A}r{AIE}val Shared Task: Multilingual and Monolingual Models For {A}rabic Persuasion Detection",
    author = "Qachfar, Fatima Zahra  and
      Verma, Rakesh",
    editor = "Sawaf, Hassan  and
      El-Beltagy, Samhaa  and
      Zaghouani, Wajdi  and
      Magdy, Walid  and
      Abdelali, Ahmed  and
      Tomeh, Nadi  and
      Abu Farha, Ibrahim  and
      Habash, Nizar  and
      Khalifa, Salam  and
      Keleg, Amr  and
      Haddad, Hatem  and
      Zitouni, Imed  and
      Mrini, Khalil  and
      Almatham, Rawan",
    booktitle = "Proceedings of ArabicNLP 2023",
    month = dec,
    year = "2023",
    address = "Singapore (Hybrid)",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.arabicnlp-1.54/",
    doi = "10.18653/v1/2023.arabicnlp-1.54",
    pages = "549--557",
    abstract = "To enhance persuasion detection, we investigate the use of multilingual systems on Arabic data by conducting a total of 22 experiments using baselines, multilingual, and monolingual language transformers. Our aim is to provide a comprehensive evaluation of the various systems employed throughout this task, with the ultimate goal of comparing their performance and identifying the most effective approach. Our empirical analysis shows that the \textit{ReDASPersuasion} system performs best when combined with the multilingual ``XLM-RoBERTa'' and monolingual pre-trained transformers on Arabic dialects like ``CAMeLBERT-DA SA'', depending on the NLP classification task."
}