SubmissionNumber#=%=#281
FinalPaperTitle#=%=#NLPNCHU at SemEval-2024 Task 4: A Comparison of MDHC Strategy and In-domain Pre-training for Multilingual Detection of Persuasion Techniques in Memes
ShortPaperTitle#=%=#
NumberOfPages#=%=#8
CopyrightSigned#=%=#Shih-Wei Guo
JobTitle#==#
Organization#==#Department of Computer Science and Engineering, National Chung Hsing University, 145 Xingda Rd., South Dist., Taichung City 402202, Taiwan (R.O.C.)
Abstract#==#This study presents a systematic method for identifying 22 persuasion techniques used in multilingual memes. We explored various fine-tuning techniques and classification strategies, including data augmentation, problem transformation, and hierarchical multi-label classification. Identifying persuasion techniques in memes is a multimodal task. We fine-tuned the XLM-RoBERTA-large-twitter language model, focusing on domain-specific language modeling, and integrated it with the CLIP visual model's embeddings to consider image and text features simultaneously. In our experiments, we evaluated the effectiveness of our approach using the official English validation data. Our system achieved competitive rankings in Subtask 1 and Subtask 2b across four languages: English, Bulgarian, North Macedonian, and Arabic. Notably, we achieved a 2nd place ranking for Arabic in Subtask 1.
Author{1}{Firstname}#=%=#Shih-Wei
Author{1}{Lastname}#=%=#Guo
Author{1}{Username}#=%=#swguo
Author{1}{Email}#=%=#cometlcc@gmail.com
Author{1}{Affiliation}#=%=#Department of Computer Science and Engineering, NCHU
Author{2}{Firstname}#=%=#Yao-Chung
Author{2}{Lastname}#=%=#Fan
Author{2}{Username}#=%=#yfan
Author{2}{Email}#=%=#yfan@nchu.edu.tw
Author{2}{Affiliation}#=%=#National Chung Hsing University
==========