Abstract
Most word representation learning methods are based on the distributional hypothesis in linguistics, according to which words that occur in the same contexts tend to have similar meanings. As a consequence, emotionally dissimilar words that occur in similar contexts, such as “happy” and “sad”, would be deemed more similar in meaning than emotionally similar words, such as “happy” and “joy”. This leads to undesirable outcomes in predictive tasks related to affect (emotional state), such as emotion classification and emotion similarity. To address this limitation, we propose a novel method of obtaining emotion-enriched word representations, which projects emotionally similar words into neighboring regions of the embedding space and emotionally dissimilar ones far apart. The proposed approach leverages distant supervision to automatically obtain a large training dataset of text documents, and employs two recurrent neural network architectures to learn the emotion-enriched representations. Through extensive evaluation on two tasks, emotion classification and emotion similarity, we demonstrate that the proposed representations outperform several competitive general-purpose and affective word representations.
- Anthology ID: C18-1081
- Volume: Proceedings of the 27th International Conference on Computational Linguistics
- Month: August
- Year: 2018
- Address: Santa Fe, New Mexico, USA
- Venue: COLING
- Publisher: Association for Computational Linguistics
- Pages: 950–961
- URL: https://aclanthology.org/C18-1081
- Cite (ACL): Ameeta Agrawal, Aijun An, and Manos Papagelis. 2018. Learning Emotion-enriched Word Representations. In Proceedings of the 27th International Conference on Computational Linguistics, pages 950–961, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
- Cite (Informal): Learning Emotion-enriched Word Representations (Agrawal et al., COLING 2018)
- PDF: https://preview.aclanthology.org/ingestion-script-update/C18-1081.pdf
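The property described in the abstract — emotionally similar words mapped to neighboring vectors, dissimilar ones far apart — can be illustrated with a minimal sketch. The 3-dimensional vectors below are hypothetical toy values (not taken from the paper), chosen only to show how cosine similarity would separate “happy”/“joy” from “happy”/“sad” in an emotion-enriched space:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy embeddings, for illustration only: in an
# emotion-enriched space, "happy" and "joy" should be neighbors,
# while "happy" and "sad" should lie far apart.
emb = {
    "happy": [0.9, 0.1, 0.2],
    "joy":   [0.8, 0.2, 0.1],
    "sad":   [-0.7, 0.1, 0.3],
}

sim_happy_joy = cosine(emb["happy"], emb["joy"])
sim_happy_sad = cosine(emb["happy"], emb["sad"])
print(sim_happy_joy > sim_happy_sad)  # the desired ordering
```

With general-purpose distributional embeddings, by contrast, “happy” and “sad” often end up close together because they occur in near-identical contexts; the paper's evaluation on emotion classification and emotion similarity measures exactly this effect.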