Abstract
Emotion is an essential attribute of human beings, and perceiving and understanding emotions in a human-like manner is central to developing emotional intelligence. This paper describes the LingJing team's contribution to the Workshop on Computational Approaches to Subjectivity, Sentiment & Social Media Analysis (WASSA) 2022 shared task on Emotion Classification. Participants are required to predict seven emotions from empathic responses to news or stories in which harm was caused to individuals, groups, or others. We apply continual masked language model (MLM) pre-training to enhance the DeBERTa pre-trained language model, and design several training strategies to further improve downstream performance, including data augmentation with supervised transfer, child-tuning training, and late fusion. Extensive experiments on the emotion classification dataset show that the proposed method outperforms other state-of-the-art methods, demonstrating its effectiveness. Moreover, our submission ranked first on all metrics in the evaluation phase of the Emotion Classification task.
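The sketch below illustrates the continual MLM pre-training step mentioned in the abstract: DeBERTa is further trained with the masked-language-model objective on task-domain text before being fine-tuned for emotion classification. This is a minimal illustration using the Hugging Face Transformers and Datasets libraries, not the authors' released code; the checkpoint name, the local data file `wassa_essays.txt`, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of continual MLM pre-training for DeBERTa (assumed setup,
# not the authors' implementation).
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

model_name = "microsoft/deberta-v3-large"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Unlabeled task-domain text (hypothetical local file), one example per line.
raw = load_dataset("text", data_files={"train": "wassa_essays.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="deberta-continual-mlm",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()

# The adapted checkpoint is saved and later loaded for classification fine-tuning.
model.save_pretrained("deberta-continual-mlm")
```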
- Anthology ID:
- 2022.wassa-1.22
- Volume:
- Proceedings of the 12th Workshop on Computational Approaches to Subjectivity, Sentiment & Social Media Analysis
- Month:
- May
- Year:
- 2022
- Address:
- Dublin, Ireland
- Editors:
- Jeremy Barnes, Orphée De Clercq, Valentin Barriere, Shabnam Tafreshi, Sawsan Alqahtani, João Sedoc, Roman Klinger, Alexandra Balahur
- Venue:
- WASSA
- Publisher:
- Association for Computational Linguistics
- Pages:
- 233–238
- URL:
- https://aclanthology.org/2022.wassa-1.22
- DOI:
- 10.18653/v1/2022.wassa-1.22
- Cite (ACL):
- Bin Li, Yixuan Weng, Qiya Song, Bin Sun, and Shutao Li. 2022. Continuing Pre-trained Model with Multiple Training Strategies for Emotional Classification. In Proceedings of the 12th Workshop on Computational Approaches to Subjectivity, Sentiment & Social Media Analysis, pages 233–238, Dublin, Ireland. Association for Computational Linguistics.
- Cite (Informal):
- Continuing Pre-trained Model with Multiple Training Strategies for Emotional Classification (Li et al., WASSA 2022)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/2022.wassa-1.22.pdf