A Hybrid CNN-RNN Alignment Model for Phrase-Aware Sentence Classification

Shiou Tian Hsu, Changsung Moon, Paul Jones, Nagiza Samatova


Abstract
The success of sentence classification often depends on understanding both the syntactic and semantic properties of word phrases. Recent progress on this task has come from exploiting the grammatical structure of sentences, but this structure is often noisy and difficult to parse. In this paper, we propose a structure-independent ‘Gated Representation Alignment’ (GRA) model that blends a phrase-focused Convolutional Neural Network (CNN) approach with a sequence-oriented Recurrent Neural Network (RNN). Our novel alignment mechanism allows the RNN to selectively include phrase information in a word-by-word sentence representation, without any awareness of syntactic structure. An empirical evaluation of GRA shows higher prediction accuracy (by up to 4.6%) on fine-grained sentiment ratings compared to other structure-independent baselines, and results comparable to several structure-dependent methods. Finally, we analyze the effect of our alignment mechanism and find that it is critical to the effectiveness of the CNN-RNN hybrid.
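The abstract describes the alignment mechanism only at a high level. As a rough illustration, below is a minimal PyTorch sketch of a gated CNN-RNN alignment layer; the layer sizes, the sigmoid gate, and all names (e.g. GatedAlignmentSketch) are assumptions for illustration only, not the authors' actual GRA formulation, for which see the paper. The five output classes match fine-grained sentiment labels on SST, the dataset listed below.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedAlignmentSketch(nn.Module):
    """Illustrative gated CNN-RNN blend; names and sizes are assumptions."""
    def __init__(self, vocab_size, emb_dim=300, hid_dim=150,
                 n_filters=150, kernel_size=3, n_classes=5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # CNN branch: each filter activation summarizes a local phrase
        # centered on a word position (padding keeps sequence length).
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        # RNN branch: word-by-word sentence representation.
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.proj = nn.Linear(n_filters, hid_dim)
        # Gate: decides per word how much phrase information to mix in.
        self.gate = nn.Linear(hid_dim + n_filters, hid_dim)
        self.out = nn.Linear(hid_dim, n_classes)

    def forward(self, tokens):                      # tokens: (batch, seq)
        e = self.emb(tokens)                        # (batch, seq, emb)
        # Phrase vectors aligned to each word position.
        p = F.relu(self.conv(e.transpose(1, 2))).transpose(1, 2)
        h, _ = self.rnn(e)                          # (batch, seq, hid)
        # Sigmoid gate blends each word state with its aligned phrase vector.
        g = torch.sigmoid(self.gate(torch.cat([h, p], dim=-1)))
        blended = g * h + (1 - g) * self.proj(p)
        return self.out(blended.mean(dim=1))        # sentence-level logits

In this sketch, the CNN filter output at position t serves as the phrase vector aligned to word t, and the gate g decides, per word, how much of that phrase information enters the sentence representation, mirroring the selective, structure-free inclusion the abstract describes.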
Anthology ID:
E17-2071
Volume:
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
Month:
April
Year:
2017
Address:
Valencia, Spain
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
443–449
URL:
https://aclanthology.org/E17-2071
Cite (ACL):
Shiou Tian Hsu, Changsung Moon, Paul Jones, and Nagiza Samatova. 2017. A Hybrid CNN-RNN Alignment Model for Phrase-Aware Sentence Classification. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers, pages 443–449, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
A Hybrid CNN-RNN Alignment Model for Phrase-Aware Sentence Classification (Hsu et al., EACL 2017)
PDF:
https://preview.aclanthology.org/ingestion-script-update/E17-2071.pdf
Data
SST