Abstract
This paper describes our method for the WASSA-2018 Implicit Emotion Shared Task. The goal of the task is to classify the emotion of a word excluded from each tweet into one of six classes: sad, joy, disgust, surprise, anger and fear. For this, we examine a BiLSTM architecture with an attention mechanism (BiLSTM-Attention) and an LSTM architecture with an attention mechanism (LSTM-Attention), and experiment with different dropout rates for both models. We then combine these models in a soft-voting ensemble to produce the final prediction, which improves performance significantly over the baseline model. The proposed method placed 7th out of 30 teams and outperforms the baseline by 12.5% in terms of macro F1.
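The abstract only names the two building blocks, so below is a minimal sketch of them: a (Bi)LSTM encoder whose hidden states are pooled by an attention layer, and a soft-voting ensemble that averages the class probabilities of the individual models. This is not the authors' released code; the framework (PyTorch), layer sizes, dropout rates, and helper names (`BiLSTMAttention`, `soft_vote`) are illustrative assumptions.

```python
# Sketch only: BiLSTM/LSTM encoder with attention pooling and soft-voting
# ensemble. Hyperparameters are assumptions, not values from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiLSTMAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128,
                 num_classes=6, dropout=0.5, bidirectional=True):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=bidirectional)
        out_dim = hidden_dim * (2 if bidirectional else 1)
        self.attn = nn.Linear(out_dim, 1)          # one score per time step
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(out_dim, num_classes)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embedding(token_ids))            # (B, T, out_dim)
        weights = F.softmax(self.attn(h).squeeze(-1), dim=1)   # attention weights
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)  # weighted sum
        return self.classifier(self.dropout(context))          # class logits


def soft_vote(models, token_ids):
    """Average per-class probabilities across trained models (soft voting)."""
    with torch.no_grad():
        probs = torch.stack([F.softmax(m(token_ids), dim=-1) for m in models])
    return probs.mean(dim=0).argmax(dim=-1)


if __name__ == "__main__":
    # Ensemble a BiLSTM-Attention and an LSTM-Attention variant with
    # different dropout rates, as described in the abstract.
    models = [
        BiLSTMAttention(vocab_size=20000, bidirectional=True, dropout=0.5),
        BiLSTMAttention(vocab_size=20000, bidirectional=False, dropout=0.3),
    ]
    for m in models:
        m.eval()                               # disable dropout at inference
    batch = torch.randint(1, 20000, (4, 30))   # 4 dummy tweets, 30 tokens each
    print(soft_vote(models, batch))            # predicted emotion class ids
```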
- Anthology ID: W18-6226
- Volume: Proceedings of the 9th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis
- Month: October
- Year: 2018
- Address: Brussels, Belgium
- Editors: Alexandra Balahur, Saif M. Mohammad, Veronique Hoste, Roman Klinger
- Venue: WASSA
- Publisher: Association for Computational Linguistics
- Pages: 189–194
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/W18-6226/
- DOI: 10.18653/v1/W18-6226
- Cite (ACL): Qimin Zhou and Hao Wu. 2018. NLP at IEST 2018: BiLSTM-Attention and LSTM-Attention via Soft Voting in Emotion Classification. In Proceedings of the 9th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, pages 189–194, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal): NLP at IEST 2018: BiLSTM-Attention and LSTM-Attention via Soft Voting in Emotion Classification (Zhou & Wu, WASSA 2018)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/W18-6226.pdf