CLTL at ArAIEval Shared Task: Multimodal Propagandistic Memes Classification Using Transformer Models

Yeshan Wang, Ilia Markov


Abstract
We present the CLTL system designed for the ArAIEval Shared Task 2024 on multimodal propagandistic memes classification in Arabic. The challenge was divided into three subtasks: identifying propagandistic content from the textual modality of memes (subtask 2A), from the visual modality of memes (subtask 2B), and in a multimodal scenario where both modalities are combined (subtask 2C). We explored various unimodal transformer models for Arabic language processing (subtask 2A), visual models for image processing (subtask 2B), and concatenated text and image embeddings using a Multilayer Perceptron (MLP) fusion module for multimodal propagandistic memes classification (subtask 2C). Our system achieved 77.96% for subtask 2A, 71.04% for subtask 2B, and 79.80% for subtask 2C, ranking 2nd, 1st, and 3rd on the respective leaderboards.
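The fusion strategy described in the abstract — concatenating a text embedding and an image embedding and classifying the result with an MLP — can be sketched roughly as below. This is a minimal illustration, not the authors' exact implementation: the embedding dimensions, hidden size, dropout, and binary output are assumptions, and the choice of Arabic text encoder and vision encoder is left abstract.

```python
# Minimal sketch of late fusion via an MLP over concatenated embeddings.
# Dimensions and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class MLPFusionClassifier(nn.Module):
    def __init__(self, text_dim=768, image_dim=768, hidden_dim=512, num_classes=2):
        super().__init__()
        # MLP fusion module applied to the concatenated [text; image] vector
        self.fusion = nn.Sequential(
            nn.Linear(text_dim + image_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, text_emb, image_emb):
        # text_emb:  (batch, text_dim), e.g. a pooled transformer representation of the meme text
        # image_emb: (batch, image_dim), e.g. a pooled representation from a vision model
        fused = torch.cat([text_emb, image_emb], dim=-1)
        return self.fusion(fused)

# Usage with dummy embeddings (shapes are illustrative)
model = MLPFusionClassifier()
text_emb = torch.randn(4, 768)
image_emb = torch.randn(4, 768)
logits = model(text_emb, image_emb)  # (4, 2): propagandistic vs. not propagandistic
```

In this kind of setup, the text and image encoders are typically run (frozen or fine-tuned) to produce the two embeddings, and only the small fusion head maps the concatenated vector to class logits.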
Anthology ID:
2024.arabicnlp-1.51
Volume:
Proceedings of The Second Arabic Natural Language Processing Conference
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Nizar Habash, Houda Bouamor, Ramy Eskander, Nadi Tomeh, Ibrahim Abu Farha, Ahmed Abdelali, Samia Touileb, Injy Hamed, Yaser Onaizan, Bashar Alhafni, Wissam Antoun, Salam Khalifa, Hatem Haddad, Imed Zitouni, Badr AlKhamissi, Rawan Almatham, Khalil Mrini
Venues:
ArabicNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
501–506
URL:
https://aclanthology.org/2024.arabicnlp-1.51
DOI:
10.18653/v1/2024.arabicnlp-1.51
Cite (ACL):
Yeshan Wang and Ilia Markov. 2024. CLTL at ArAIEval Shared Task: Multimodal Propagandistic Memes Classification Using Transformer Models. In Proceedings of The Second Arabic Natural Language Processing Conference, pages 501–506, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
CLTL at ArAIEval Shared Task: Multimodal Propagandistic Memes Classification Using Transformer Models (Wang & Markov, ArabicNLP-WS 2024)
PDF:
https://preview.aclanthology.org/landing_page/2024.arabicnlp-1.51.pdf