INFOTEC-NLP at SemEval-2025 Task 11: A Case Study on Transformer-Based Models and Bag of Words

Emmanuel Santos-Rodriguez, Mario Graff


Abstract
Leveraging transformer-based models as feature extractors, we introduce a hybrid architecture that integrates a bidirectional LSTM network with a multi-head attention mechanism to address the challenges of multilingual emotion detection in text. While pre-trained transformers provide robust contextual embeddings, they often struggle with capturing long-range dependencies and handling class imbalances, particularly in low-resource languages. To mitigate these issues, our approach combines sequential modeling and attention mechanisms, allowing the model to refine representations by emphasizing key emotional cues in text.
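The hybrid head described in the abstract can be sketched as follows. This is a minimal illustrative PyTorch model, not the authors' reported configuration: the embedding size, hidden size, number of attention heads, number of emotion labels, and mean-pooling choice are all assumptions for the sketch. Contextual embeddings from a frozen pre-trained transformer feed a bidirectional LSTM, whose outputs are refined by multi-head self-attention before multi-label classification.

```python
import torch
import torch.nn as nn

class HybridEmotionClassifier(nn.Module):
    """Illustrative sketch: frozen transformer embeddings -> BiLSTM ->
    multi-head self-attention -> multi-label emotion logits.
    All dimensions below are assumed, not taken from the paper."""

    def __init__(self, embed_dim=768, hidden_dim=256, num_heads=4, num_emotions=6):
        super().__init__()
        # Bidirectional LSTM captures sequential, long-range structure
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Multi-head self-attention re-weights steps toward emotional cues
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_emotions)

    def forward(self, token_embeddings):
        # token_embeddings: (batch, seq_len, embed_dim), produced by a
        # pre-trained transformer used purely as a feature extractor
        seq, _ = self.bilstm(token_embeddings)   # (batch, seq_len, 2*hidden)
        attended, _ = self.attn(seq, seq, seq)   # self-attention over timesteps
        pooled = attended.mean(dim=1)            # simple mean pooling (assumed)
        return self.classifier(pooled)           # one logit per emotion label

model = HybridEmotionClassifier()
logits = model(torch.randn(2, 16, 768))  # 2 texts, 16 tokens each
print(logits.shape)  # torch.Size([2, 6])
```

For multi-label emotion detection, such logits would typically be trained with `nn.BCEWithLogitsLoss`, whose per-class `pos_weight` argument is one standard way to counter the class imbalance the abstract mentions.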
Anthology ID:
2025.semeval-1.50
Volume:
Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Sara Rosenthal, Aiala Rosá, Debanjan Ghosh, Marcos Zampieri
Venues:
SemEval | WS
Publisher:
Association for Computational Linguistics
Pages:
350–356
URL:
https://preview.aclanthology.org/transition-to-people-yaml/2025.semeval-1.50/
Cite (ACL):
Emmanuel Santos-Rodriguez and Mario Graff. 2025. INFOTEC-NLP at SemEval-2025 Task 11: A Case Study on Transformer-Based Models and Bag of Words. In Proceedings of the 19th International Workshop on Semantic Evaluation (SemEval-2025), pages 350–356, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
INFOTEC-NLP at SemEval-2025 Task 11: A Case Study on Transformer-Based Models and Bag of Words (Santos-Rodriguez & Graff, SemEval 2025)
PDF:
https://preview.aclanthology.org/transition-to-people-yaml/2025.semeval-1.50.pdf