Abstract
Word representations are based on the distributional hypothesis, according to which words that occur in similar contexts tend to have similar meanings and therefore appear close together in vector space. As a consequence, emotionally dissimilar words such as "joy" and "sadness" can have high cosine similarity, because they occur in similar contexts. Existing pre-trained embedding models thus fail to capture the emotional interpretation of words. To create our VAD-Emotion embeddings, we modify pre-trained word embeddings with emotion information. This lexicon-based approach uses Valence, Arousal, and Dominance (VAD) values together with Plutchik's emotions to incorporate emotion information into pre-trained word embeddings via post-training processing. This brings emotionally similar words closer together and pushes emotionally dissimilar words apart in the proposed vector space. We demonstrate the performance of the proposed embeddings on a downstream NLP task: Emotion Recognition.
- Anthology ID:
- 2021.icon-main.64
- Volume:
- Proceedings of the 18th International Conference on Natural Language Processing (ICON)
- Month:
- December
- Year:
- 2021
- Address:
- National Institute of Technology Silchar, Silchar, India
- Editors:
- Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
- Venue:
- ICON
- Publisher:
- NLP Association of India (NLPAI)
- Pages:
- 529–536
- URL:
- https://aclanthology.org/2021.icon-main.64
- Cite (ACL):
- Manasi Kulkarni and Pushpak Bhattacharyya. 2021. Retrofitting of Pre-trained Emotion Words with VAD-dimensions and the Plutchik Emotions. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 529–536, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
- Cite (Informal):
- Retrofitting of Pre-trained Emotion Words with VAD-dimensions and the Plutchik Emotions (Kulkarni & Bhattacharyya, ICON 2021)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-2/2021.icon-main.64.pdf
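The abstract describes a lexicon-based post-training step that pulls emotionally similar words together. A minimal sketch of this general idea, in the style of Faruqui et al.'s retrofitting but using VAD distance to define neighbours, is shown below. This is not the paper's exact method; the embeddings and the NRC-VAD-style lexicon values here are hypothetical toy data for illustration only.

```python
import numpy as np

# Toy pre-trained embeddings (hypothetical values for illustration).
# Distributionally, "joy" and "sadness" sit close together.
embeddings = {
    "joy":     np.array([0.9, 0.1, 0.3]),
    "sadness": np.array([0.8, 0.2, 0.3]),
    "delight": np.array([0.1, 0.9, 0.5]),
}

# Hypothetical VAD lexicon: (valence, arousal, dominance) in [0, 1].
vad = {
    "joy":     np.array([0.98, 0.82, 0.69]),
    "sadness": np.array([0.05, 0.22, 0.30]),
    "delight": np.array([0.95, 0.75, 0.66]),
}

def emotional_neighbours(word, threshold=0.5):
    """Words whose VAD vectors lie within `threshold` Euclidean distance."""
    return [w for w in vad
            if w != word and np.linalg.norm(vad[w] - vad[word]) < threshold]

def retrofit(embeddings, iterations=10, alpha=1.0, beta=1.0):
    """Retrofitting-style update: each vector is pulled toward the average
    of its emotionally similar neighbours while staying close to its
    original (distributional) vector."""
    new = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for w in embeddings:
            nbrs = emotional_neighbours(w)
            if not nbrs:
                continue  # no emotional evidence: keep the vector as-is
            # Closed-form minimiser of the retrofitting objective for word w.
            new[w] = (alpha * embeddings[w]
                      + beta * sum(new[n] for n in nbrs)) / (alpha + beta * len(nbrs))
    return new

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

retro = retrofit(embeddings)
# After retrofitting, "joy" moves toward "delight" (emotionally similar)
# and away from "sadness" (emotionally dissimilar).
```

On this toy data, cosine("joy", "delight") rises and cosine("joy", "sadness") falls after retrofitting, which is the qualitative behaviour the abstract claims for the proposed vector space.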