Convolutional Neural Networks with Recurrent Neural Filters

Yi Yang


Abstract
We introduce a class of convolutional neural networks (CNNs) that use recurrent neural networks (RNNs) as convolution filters. A convolution filter is typically implemented as a linear affine transformation followed by a non-linear function, which fails to account for language compositionality and thus limits the use of the high-order filters often warranted for natural language processing tasks. In this work, we model convolution filters with RNNs, which naturally capture compositionality and long-term dependencies in language. We show that simple CNN architectures equipped with recurrent neural filters (RNFs) achieve results on par with the best published ones on the Stanford Sentiment Treebank and two answer sentence selection datasets.
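A minimal PyTorch sketch of the idea described in the abstract, not the paper's official Keras release at bloomberg/cnn-rnf: an LSTM plays the role of the convolution filter, consuming each k-gram window and emitting its final hidden state as the window's feature, followed by max-over-time pooling. The class name RecurrentNeuralFilter and all dimensions here are illustrative assumptions.

```python
import torch
import torch.nn as nn


class RecurrentNeuralFilter(nn.Module):
    """Hypothetical sketch: a convolution 'filter' realized as an RNN.

    An LSTM is run over every k-gram window of the input, and its final
    hidden state replaces the usual linear-map-plus-nonlinearity response.
    """

    def __init__(self, embed_dim: int, hidden_dim: int, window: int = 5):
        super().__init__()
        self.window = window
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        # unfold enumerates all k-gram windows, like a stride-1 convolution
        win = x.unfold(1, self.window, 1)          # (batch, n_win, embed_dim, window)
        win = win.permute(0, 1, 3, 2).contiguous() # (batch, n_win, window, embed_dim)
        b, n, k, d = win.shape
        # run the LSTM over every window; keep only the final hidden state
        _, (h_n, _) = self.lstm(win.view(b * n, k, d))
        feats = h_n[-1].view(b, n, -1)             # one feature vector per window
        return feats.max(dim=1).values             # max-over-time pooling


# Usage: a batch of 8 sentences, 20 tokens each, 300-d embeddings
x = torch.randn(8, 20, 300)
rnf = RecurrentNeuralFilter(embed_dim=300, hidden_dim=128)
print(rnf(x).shape)  # torch.Size([8, 128])
```

The difference from a standard CNN filter is that the tokens in each window are composed sequentially by the RNN rather than weighted independently by a single affine map, which is what lets the filter capture the compositional, higher-order interactions the abstract refers to.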
Anthology ID: D18-1109
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun'ichi Tsujii
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 912–917
URL: https://aclanthology.org/D18-1109
DOI: 10.18653/v1/D18-1109
Cite (ACL): Yi Yang. 2018. Convolutional Neural Networks with Recurrent Neural Filters. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 912–917, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal): Convolutional Neural Networks with Recurrent Neural Filters (Yang, EMNLP 2018)
PDF: https://preview.aclanthology.org/naacl24-info/D18-1109.pdf
Video: https://preview.aclanthology.org/naacl24-info/D18-1109.mp4
Code: bloomberg/cnn-rnf + additional community code
Data: SST, SST-2, SST-5, WikiQA