@inproceedings{attieh-hassan-2022-arabic,
    title = "{A}rabic Dialect Identification and Sentiment Classification using Transformer-based Models",
    author = "Attieh, Joseph  and
      Hassan, Fadi",
    editor = "Bouamor, Houda  and
      Al-Khalifa, Hend  and
      Darwish, Kareem  and
      Rambow, Owen  and
      Bougares, Fethi  and
      Abdelali, Ahmed  and
      Tomeh, Nadi  and
      Khalifa, Salam  and
      Zaghouani, Wajdi",
    booktitle = "Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP)",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, United Arab Emirates (Hybrid)",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2022.wanlp-1.54/",
    doi = "10.18653/v1/2022.wanlp-1.54",
    pages = "485--490",
    abstract = "In this paper, we present two deep learning approaches that are based on AraBERT, submitted to the Nuanced Arabic Dialect Identification (NADI) shared task of the Seventh Workshop for Arabic Natural Language Processing (WANLP 2022). NADI consists of two main sub-tasks, namely country-level dialect and sentiment identification for dialectal Arabic. We present one system per sub-task. The first system is a multi-task learning model that consists of a shared AraBERT encoder with three task-specific classification layers. This model is trained to jointly learn the country-level dialect of the tweet as well as the region-level and area-level dialects. The second system is a distilled model of an ensemble of models trained using K-fold cross-validation. Each model in the ensemble consists of an AraBERT model and a classifier, fine-tuned on (K-1) folds of the training set. Our team Pythoneers achieved rank 6 on the first test set of the first sub-task, rank 9 on the second test set of the first sub-task, and rank 4 on the test set of the second sub-task."
}