NLP_goats at SemEval-2025 Task 11: Multi-Label Emotion Classification Using Fine-Tuned Roberta-Large Tranformer

Vijay Karthick Vaidyanathan, Srihari V K, Mugilkrishna D U, Saritha Madhavan


Abstract
This paper presents a system for multi-label emotion classification and emotion-intensity prediction in text, developed for SemEval-2025 Task 11. The method fine-tunes a RoBERTa-Large transformer model: a multi-label classification head identifies the set of emotions expressed, and regression models estimate emotion intensity. The system ranked 31st and 17th in the corresponding tracks. The findings show strong overall performance, while recognition of ambiguous or low-frequency emotions can still be improved using state-of-the-art contextual embeddings and threshold-optimization techniques.
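The threshold optimization mentioned in the abstract can be illustrated with a minimal sketch: in multi-label classification, each label gets a sigmoid probability, and a per-label decision threshold is tuned on development data to maximize F1. The paper does not specify its exact procedure, so the grid search and function below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def tune_thresholds(probs, labels, grid=np.linspace(0.1, 0.9, 17)):
    """Illustrative per-label threshold search (assumed, not the paper's code).

    probs:  (n_samples, n_labels) predicted sigmoid probabilities
    labels: (n_samples, n_labels) binary gold labels
    Returns one decision threshold per label, chosen to maximize label-wise F1.
    """
    n_labels = probs.shape[1]
    thresholds = np.full(n_labels, 0.5)
    for j in range(n_labels):
        best_f1 = -1.0
        for t in grid:
            pred = (probs[:, j] >= t).astype(int)
            tp = np.sum(pred * labels[:, j])
            fp = np.sum(pred * (1 - labels[:, j]))
            fn = np.sum((1 - pred) * labels[:, j])
            denom = 2 * tp + fp + fn
            f1 = 2 * tp / denom if denom else 0.0
            if f1 > best_f1:
                best_f1, thresholds[j] = f1, t
    return thresholds
```

Tuning thresholds per label, rather than using a fixed 0.5 cutoff, is a common way to recover recall on low-frequency emotions, which the abstract flags as the main remaining weakness.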
Anthology ID:
2025.semeval-1.135
Volume:
Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Sara Rosenthal, Aiala Rosá, Debanjan Ghosh, Marcos Zampieri
Venues:
SemEval | WS
Publisher:
Association for Computational Linguistics
Pages:
1023–1027
URL:
https://preview.aclanthology.org/corrections-2025-08/2025.semeval-1.135/
Cite (ACL):
Vijay Karthick Vaidyanathan, Srihari V K, Mugilkrishna D U, and Saritha Madhavan. 2025. NLP_goats at SemEval-2025 Task 11: Multi-Label Emotion Classification Using Fine-Tuned Roberta-Large Tranformer. In Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025), pages 1023–1027, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
NLP_goats at SemEval-2025 Task 11: Multi-Label Emotion Classification Using Fine-Tuned Roberta-Large Tranformer (Vaidyanathan et al., SemEval 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-08/2025.semeval-1.135.pdf