Dianchi at SemEval-2025 Task 11: Multilabel Emotion Recognition via Orthogonal Knowledge Distillation

Zhenlan Wang, Jiaxuan Liu


Abstract
This paper presents KDBERT-MLDistill, a novel framework for multi-label emotion recognition developed for SemEval-2025 Task 11. Addressing the challenges of fine-grained emotion misdetection and overfitting on small datasets, the method combines BERT-based text encoding with orthogonal knowledge distillation. Key innovations include: (1) Orthogonal regularization on classifier weights to minimize redundant feature correlations, coupled with dynamic pseudo-labeling for periodic data augmentation; (2) A hierarchical distillation mechanism in which dual teacher-student models iteratively exchange parameters to balance knowledge retention and exploration.
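
The abstract's first innovation, orthogonal regularization on the classifier weights, can be illustrated with a short sketch. This is a minimal, hedged reading and not the authors' released implementation: it assumes a linear multi-label head over BERT's pooled [CLS] embedding and penalizes the Frobenius norm of W W^T - I so that per-label weight rows stay near-orthogonal. All names here (orthogonal_penalty, MultiLabelHead, lambda_orth, the label count) are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

def orthogonal_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Penalize correlations between per-label weight rows.

    weight: (num_labels, hidden_dim) classifier matrix W.
    Returns ||W W^T - I||_F^2, which is zero when the rows are orthonormal,
    i.e. when the per-label feature directions carry no redundant overlap.
    """
    gram = weight @ weight.t()                      # (num_labels, num_labels)
    identity = torch.eye(gram.size(0), device=weight.device)
    return ((gram - identity) ** 2).sum()

class MultiLabelHead(nn.Module):
    """Linear head over a [CLS] embedding producing one logit per emotion."""
    def __init__(self, hidden_dim: int = 768, num_labels: int = 5):
        super().__init__()
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, cls_embedding: torch.Tensor) -> torch.Tensor:
        return self.classifier(cls_embedding)       # raw multi-label logits

# Example training step (loss shape only; BERT encoding and data loading omitted):
head = MultiLabelHead()
bce = nn.BCEWithLogitsLoss()                        # standard multi-label loss
lambda_orth = 0.01                                  # assumed regularization weight

cls_embedding = torch.randn(8, 768)                 # stand-in for BERT output
targets = torch.randint(0, 2, (8, 5)).float()       # multi-hot emotion labels

logits = head(cls_embedding)
loss = bce(logits, targets) + lambda_orth * orthogonal_penalty(head.classifier.weight)
loss.backward()
```

Under these assumptions, the orthogonality term is simply added to the binary cross-entropy objective, discouraging different emotion labels from relying on the same feature directions; the dynamic pseudo-labeling and the dual teacher-student parameter exchange described in the abstract would sit on top of this loop and are not sketched here.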
Anthology ID:
2025.semeval-1.146
Volume:
Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Sara Rosenthal, Aiala Rosá, Debanjan Ghosh, Marcos Zampieri
Venues:
SemEval | WS
Publisher:
Association for Computational Linguistics
Pages:
1108–1112
URL:
https://preview.aclanthology.org/corrections-2025-08/2025.semeval-1.146/
Cite (ACL):
Zhenlan Wang and Jiaxuan Liu. 2025. Dianchi at SemEval-2025 Task 11: Multilabel Emotion Recognition via Orthogonal Knowledge Distillation. In Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025), pages 1108–1112, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Dianchi at SemEval-2025 Task 11: Multilabel Emotion Recognition via Orthogonal Knowledge Distillation (Wang & Liu, SemEval 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-08/2025.semeval-1.146.pdf