DS at SemEval-2023 Task 10: Explaining Online Sexism using Transformer based Approach

Madisetty Padmavathi


Abstract
In this paper, I describe the approach used for SemEval-2023 Task 10, Explainable Detection of Online Sexism (EDOS) (Kirk et al., 2023). I use different transformer models, including BERT and RoBERTa, fine-tuned on the EDOS dataset to classify text into different categories of sexism. I participated in all three subtasks: subtask A classifies a given text as sexist or not; subtask B identifies the specific category of sexism, namely (1) threats, (2) derogation, (3) animosity, or (4) prejudiced discussions; and subtask C predicts a fine-grained vector representation of sexism that captures the severity, target, and type of sexism present in the text. Transformer models allow the system to learn from the training data and make predictions on unseen text, and fine-tuning them on the EDOS dataset improves performance on the specific task of detecting online sexism. I obtained the following macro F1 scores: subtask A: 77.16, subtask B: 46.11, and subtask C: 30.2.
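To illustrate the kind of pipeline the abstract describes, the sketch below fine-tunes a RoBERTa sequence classifier for subtask A (sexist vs. not sexist) with the Hugging Face Transformers library. This is not the author's released code; the model checkpoint, file names, column names, and hyperparameters are assumptions, and subtasks B and C would follow the same pattern with a different number of labels.

# Minimal sketch, assuming a CSV export of the EDOS data with
# "text" and integer "label" columns (hypothetical file names).
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

MODEL_NAME = "roberta-base"  # the paper also reports BERT variants

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME,
                                                           num_labels=2)

data = load_dataset("csv", data_files={"train": "edos_train.csv",
                                       "dev": "edos_dev.csv"})

def tokenize(batch):
    # Truncate long posts; padding is handled dynamically by the Trainer.
    return tokenizer(batch["text"], truncation=True, max_length=128)

data = data.map(tokenize, batched=True)

args = TrainingArguments(output_dir="edos_subtask_a",
                         num_train_epochs=3,
                         per_device_train_batch_size=16,
                         evaluation_strategy="epoch")

trainer = Trainer(model=model, args=args,
                  train_dataset=data["train"],
                  eval_dataset=data["dev"],
                  tokenizer=tokenizer)
trainer.train()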
Anthology ID:
2023.semeval-1.152
Volume:
Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Atul Kr. Ojha, A. Seza Doğruöz, Giovanni Da San Martino, Harish Tayyar Madabushi, Ritesh Kumar, Elisa Sartori
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1102–1106
URL:
https://aclanthology.org/2023.semeval-1.152
DOI:
10.18653/v1/2023.semeval-1.152
Cite (ACL):
Madisetty Padmavathi. 2023. DS at SemEval-2023 Task 10: Explaining Online Sexism using Transformer based Approach. In Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023), pages 1102–1106, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
DS at SemEval-2023 Task 10: Explaining Online Sexism using Transformer based Approach (Padmavathi, SemEval 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.semeval-1.152.pdf