CLaC at SemEval-2024 Task 4: Decoding Persuasion in Memes – An Ensemble of Language Models with Paraphrase Augmentation

Kota Shamanth Ramanath Nayak, Leila Kosseim


Abstract
This paper describes our approach to SemEval-2024 Task 4 subtask 1, which focuses on the hierarchical multi-label detection of persuasion techniques in meme texts. Our approach was based on fine-tuning individual language models (BERT, XLM-RoBERTa, and mBERT) and combining their predictions through a mean-based ensemble model. Additional strategies included dataset augmentation with the TC dataset and paraphrase generation, as well as the fine-tuning of per-class classification thresholds. During testing, our system outperformed the baseline in all languages except Arabic, where no significant improvement was reached. Analysis of the results seems to indicate that our dataset augmentation strategy and per-class threshold fine-tuning may have introduced noise and exacerbated the dataset imbalance.
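The mean-based ensemble with per-class thresholds described in the abstract can be illustrated with a minimal sketch; the probability arrays, class count, and threshold values below are purely hypothetical and are not taken from the paper, where the probabilities would instead come from the fine-tuned BERT, XLM-RoBERTa, and mBERT models.

import numpy as np

# Hypothetical per-model class probabilities for two meme texts over
# three persuasion-technique classes (rows: examples, columns: classes).
probs_bert  = np.array([[0.82, 0.10, 0.40], [0.15, 0.70, 0.55]])
probs_xlmr  = np.array([[0.75, 0.20, 0.35], [0.20, 0.65, 0.60]])
probs_mbert = np.array([[0.90, 0.05, 0.50], [0.10, 0.80, 0.45]])

# Mean-based ensemble: average the class probabilities across models.
ensemble_probs = np.mean([probs_bert, probs_xlmr, probs_mbert], axis=0)

# Per-class decision thresholds (illustrative values; in a setup like the
# paper's they would be tuned on development data rather than fixed at 0.5).
class_thresholds = np.array([0.50, 0.40, 0.45])

# Multi-label prediction: a class is assigned when its averaged
# probability meets or exceeds that class's threshold.
predictions = (ensemble_probs >= class_thresholds).astype(int)
print(predictions)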
Anthology ID:
2024.semeval-1.27
Volume:
Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Atul Kr. Ojha, A. Seza Doğruöz, Harish Tayyar Madabushi, Giovanni Da San Martino, Sara Rosenthal, Aiala Rosá
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
175–180
URL:
https://aclanthology.org/2024.semeval-1.27
Cite (ACL):
Kota Shamanth Ramanath Nayak and Leila Kosseim. 2024. CLaC at SemEval-2024 Task 4: Decoding Persuasion in Memes – An Ensemble of Language Models with Paraphrase Augmentation. In Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024), pages 175–180, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
CLaC at SemEval-2024 Task 4: Decoding Persuasion in Memes – An Ensemble of Language Models with Paraphrase Augmentation (Nayak & Kosseim, SemEval 2024)
PDF:
https://preview.aclanthology.org/ingestion-checklist/2024.semeval-1.27.pdf
Supplementary material:
2024.semeval-1.27.SupplementaryMaterial.zip
2024.semeval-1.27.SupplementaryMaterial.txt