Attentive Gated Lexicon Reader with Contrastive Contextual Co-Attention for Sentiment Classification

Yi Tay, Anh Tuan Luu, Siu Cheung Hui, Jian Su


Abstract
This paper proposes a new neural architecture that exploits readily available sentiment lexicon resources. The key idea is that incorporating a word-level prior can aid the representation learning process, eventually improving model performance. To this end, our model employs two distinct components: (1) a lexicon-driven contextual attention mechanism that imbues lexicon words with long-range contextual information, and (2) a contrastive co-attention mechanism that models contrasting polarities between all positive and negative words in a sentence. Via extensive experiments, we show that our approach outperforms many neural baselines on sentiment classification tasks across multiple benchmark datasets.
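To make the two components concrete, below is a minimal PyTorch sketch of the architecture the abstract describes. The module names (LexiconContextAttention, ContrastiveCoAttention), tensor shapes, scoring functions, and pooling choices are illustrative assumptions, not the paper's actual equations; see the PDF for the authors' formulation.

```python
# A minimal, illustrative sketch of the two components the abstract describes.
# All module names, shapes, and pooling choices are assumptions for
# illustration only; they do not reproduce the paper's exact equations.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LexiconContextAttention(nn.Module):
    """Lets each lexicon word attend over all contextual hidden states,
    imbuing it with long-range contextual information (component 1)."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor, lex_mask: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq, dim) contextual states, e.g. from a BiLSTM encoder.
        # lex_mask: (batch, seq) boolean, True where the token is a lexicon word.
        scores = self.query(h) @ self.key(h).transpose(1, 2)   # (batch, seq, seq)
        attn = F.softmax(scores / h.size(-1) ** 0.5, dim=-1)
        ctx = attn @ h                                         # (batch, seq, dim)
        # Keep context-enriched representations only at lexicon positions.
        return ctx * lex_mask.unsqueeze(-1).float()


class ContrastiveCoAttention(nn.Module):
    """Models contrasting polarities by co-attending positive words
    against negative words (component 2)."""

    def __init__(self, dim: int):
        super().__init__()
        self.bilinear = nn.Linear(dim, dim, bias=False)

    def forward(self, pos: torch.Tensor, neg: torch.Tensor) -> torch.Tensor:
        # pos: (batch, n_pos, dim) positive-word reps; neg: (batch, n_neg, dim).
        affinity = self.bilinear(pos) @ neg.transpose(1, 2)    # (batch, n_pos, n_neg)
        # Each positive word summarizes the negative words it contrasts with,
        # and vice versa; mean-pool both views into one feature vector.
        pos2neg = F.softmax(affinity, dim=-1) @ neg            # (batch, n_pos, dim)
        neg2pos = F.softmax(affinity.transpose(1, 2), dim=-1) @ pos
        return torch.cat([pos2neg.mean(1), neg2pos.mean(1)], dim=-1)


if __name__ == "__main__":
    h = torch.randn(2, 10, 64)                  # toy contextual states
    lex_mask = torch.zeros(2, 10, dtype=torch.bool)
    lex_mask[:, [1, 4, 7]] = True               # pretend these are lexicon words
    enriched = LexiconContextAttention(64)(h, lex_mask)
    feats = ContrastiveCoAttention(64)(enriched[:, :5], enriched[:, 5:])
    print(feats.shape)                          # torch.Size([2, 128])
```

In this sketch the contrastive features would be concatenated with a pooled sentence representation and fed to a classifier; that wiring, like the rest of the sketch, is an assumption about how such components are typically combined.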
Anthology ID:
D18-1381
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3443–3453
URL:
https://aclanthology.org/D18-1381
DOI:
10.18653/v1/D18-1381
Cite (ACL):
Yi Tay, Anh Tuan Luu, Siu Cheung Hui, and Jian Su. 2018. Attentive Gated Lexicon Reader with Contrastive Contextual Co-Attention for Sentiment Classification. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3443–3453, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Attentive Gated Lexicon Reader with Contrastive Contextual Co-Attention for Sentiment Classification (Tay et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1381.pdf