SKAM at SemEval-2023 Task 10: Linguistic Feature Integration and Continuous Pretraining for Online Sexism Detection and Classification

Murali Manohar Kondragunta, Amber Chen, Karlo Slot, Sanne Weering, Tommaso Caselli


Abstract
Sexism is prevalent online. In this paper, we explore the effect of explicit linguistic features and continuous pretraining on the performance of pretrained language models for sexism detection. Adding linguistic features did not improve model performance, whereas continuous pretraining slightly boosted performance on Task B, raising the mean macro-F1 score from 0.6156 to 0.6246. The best mean macro-F1 score on Task A (0.8331) was achieved by a fine-tuned HateBERT model with regular pretraining. Overall, continuous pretraining proved beneficial only for more nuanced downstream tasks such as Task B.
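As a rough illustration of the "continuous pretraining then fine-tuning" setup described in the abstract, the sketch below shows domain-adaptive masked-language-model training of HateBERT on unlabeled in-domain text with Hugging Face Transformers. This is not the authors' released code; the checkpoint id, the corpus file name, and all hyperparameters are assumptions chosen only for illustration.

```python
# Minimal sketch (assumed setup, not the authors' code): continue masked-LM
# pretraining of HateBERT on unlabeled in-domain text, then fine-tune the
# adapted checkpoint for the classification tasks.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

model_name = "GroNLP/hateBERT"  # assumed HateBERT checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
mlm_model = AutoModelForMaskedLM.from_pretrained(model_name)

# Hypothetical file of unlabeled in-domain posts, one per line.
raw = load_dataset("text", data_files={"train": "unlabeled_sexism_corpus.txt"})
tokenized = raw.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"],
)

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
args = TrainingArguments(output_dir="hatebert-continued",
                         num_train_epochs=3,
                         per_device_train_batch_size=32)
Trainer(model=mlm_model, args=args,
        train_dataset=tokenized["train"],
        data_collator=collator).train()

# The adapted checkpoint saved in "hatebert-continued" would then be loaded with
# AutoModelForSequenceClassification and fine-tuned on the labeled Task A / Task B data.
```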
Anthology ID:
2023.semeval-1.250
Volume:
Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Atul Kr. Ojha, A. Seza Doğruöz, Giovanni Da San Martino, Harish Tayyar Madabushi, Ritesh Kumar, Elisa Sartori
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
1805–1817
URL:
https://aclanthology.org/2023.semeval-1.250
DOI:
10.18653/v1/2023.semeval-1.250
Bibkey:
Cite (ACL):
Murali Manohar Kondragunta, Amber Chen, Karlo Slot, Sanne Weering, and Tommaso Caselli. 2023. SKAM at SemEval-2023 Task 10: Linguistic Feature Integration and Continuous Pretraining for Online Sexism Detection and Classification. In Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023), pages 1805–1817, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
SKAM at SemEval-2023 Task 10: Linguistic Feature Integration and Continuous Pretraining for Online Sexism Detection and Classification (Kondragunta et al., SemEval 2023)
PDF:
https://aclanthology.org/2023.semeval-1.250.pdf