Abstract
Attention mechanisms have been leveraged for sentiment classification tasks because not all words have the same importance. However, most existing attention models do not take full advantage of sentiment lexicons, which provide rich sentiment information and play a critical role in sentiment analysis. In this work, we propose a novel lexicon-based supervised attention model (LBSA), which allows a recurrent neural network to focus on sentiment content, thus generating sentiment-informative representations. Compared with general attention models, our model has better interpretability and less noise. Experimental results on three large-scale sentiment classification datasets show that the proposed method outperforms previous methods.
- Anthology ID:
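The abstract does not spell out the supervision mechanism, but a common way to realize lexicon-based supervised attention is to add an auxiliary loss that pulls the model's attention distribution toward words matched by a sentiment lexicon. The sketch below (an assumption, not the paper's exact formulation; the function names and the uniform lexicon-target distribution are hypothetical) illustrates that idea with NumPy:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def lexicon_attention_loss(scores, lexicon_mask, eps=1e-8):
    """Supervised-attention sketch (hypothetical formulation).

    scores       : unnormalized attention scores, one per token.
    lexicon_mask : 1.0 for tokens found in the sentiment lexicon, else 0.0.

    Returns the attention weights and a cross-entropy penalty between the
    attention distribution and a uniform target over lexicon-matched tokens.
    """
    alpha = softmax(scores)
    if lexicon_mask.sum() == 0:
        # No lexicon words in this sentence: no supervision signal.
        return alpha, 0.0
    target = lexicon_mask / lexicon_mask.sum()
    loss = -np.sum(target * np.log(alpha + eps))
    return alpha, loss
```

During training this penalty would be added to the classification loss, so attention that ignores lexicon words is discouraged while the classifier objective still drives the final prediction.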
- C18-1074
- Volume:
- Proceedings of the 27th International Conference on Computational Linguistics
- Month:
- August
- Year:
- 2018
- Address:
- Santa Fe, New Mexico, USA
- Editors:
- Emily M. Bender, Leon Derczynski, Pierre Isabelle
- Venue:
- COLING
- Publisher:
- Association for Computational Linguistics
- Pages:
- 868–877
- URL:
- https://aclanthology.org/C18-1074
- Cite (ACL):
- Yicheng Zou, Tao Gui, Qi Zhang, and Xuanjing Huang. 2018. A Lexicon-Based Supervised Attention Model for Neural Sentiment Analysis. In Proceedings of the 27th International Conference on Computational Linguistics, pages 868–877, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
- Cite (Informal):
- A Lexicon-Based Supervised Attention Model for Neural Sentiment Analysis (Zou et al., COLING 2018)
- PDF:
- https://preview.aclanthology.org/ml4al-ingestion/C18-1074.pdf
- Data
- SST