Abstract
In this paper, we present an adaptive convolution for text classification to give flexibility to convolutional neural networks (CNNs). Unlike traditional convolutions, which utilize the same set of filters regardless of the input, the adaptive convolution employs adaptively generated convolutional filters conditioned on inputs. We achieve this by attaching filter-generating networks, which are carefully designed to generate input-specific filters, to convolution blocks in existing CNNs. We show the efficacy of our approach in existing CNNs based on performance evaluation. Our evaluation indicates that all of our baselines achieve performance improvements with adaptive convolution, by up to 2.6 percentage points, on seven benchmark text classification datasets.
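A minimal PyTorch-style sketch of the idea described in the abstract: a filter-generating network maps a pooled representation of the input text to per-example convolution filters, which are then applied to that same input. Names, dimensions, the mean-pooling summary, and the grouped-convolution trick are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of input-conditioned ("adaptive") 1D convolution for text.
# Assumptions (not from the paper): mean-pooled context vector, a two-layer
# filter-generating MLP, and grouped conv1d to apply per-example filters.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveConv1d(nn.Module):
    """Generates convolution filters from the input, then applies them."""

    def __init__(self, embed_dim, num_filters, kernel_size, hidden_dim=64):
        super().__init__()
        self.num_filters = num_filters
        self.kernel_size = kernel_size
        # Filter-generating network: pooled input -> flattened filter weights.
        self.filter_gen = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_filters * embed_dim * kernel_size),
        )

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) word embeddings of each document
        batch, seq_len, embed_dim = x.shape
        context = x.mean(dim=1)                       # (batch, embed_dim) summary
        filters = self.filter_gen(context).view(      # per-example filters
            batch * self.num_filters, embed_dim, self.kernel_size
        )
        # Grouped-convolution trick: treat the batch as groups so every
        # example is convolved with its own generated filters.
        x = x.transpose(1, 2).reshape(1, batch * embed_dim, seq_len)
        out = F.conv1d(x, filters, groups=batch, padding=self.kernel_size // 2)
        out = out.view(batch, self.num_filters, -1)
        # Max-over-time pooling, as is standard in CNN text classifiers.
        return F.max_pool1d(out, out.size(-1)).squeeze(-1)  # (batch, num_filters)


if __name__ == "__main__":
    layer = AdaptiveConv1d(embed_dim=50, num_filters=100, kernel_size=3)
    dummy = torch.randn(8, 20, 50)   # 8 documents, 20 tokens, 50-dim embeddings
    print(layer(dummy).shape)        # torch.Size([8, 100])
```

In this sketch the generated filters play the role of the usual static convolution kernels; a classifier head (e.g., a linear layer over the pooled features) would follow in a full model.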
- Anthology ID:
- N19-1256
- Volume:
- Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
- Month:
- June
- Year:
- 2019
- Address:
- Minneapolis, Minnesota
- Editors:
- Jill Burstein, Christy Doran, Thamar Solorio
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2475–2485
- URL:
- https://aclanthology.org/N19-1256
- DOI:
- 10.18653/v1/N19-1256
- Cite (ACL):
- Byung-Ju Choi, Jun-Hyung Park, and SangKeun Lee. 2019. Adaptive Convolution for Text Classification. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 2475–2485, Minneapolis, Minnesota. Association for Computational Linguistics.
- Cite (Informal):
- Adaptive Convolution for Text Classification (Choi et al., NAACL 2019)
- PDF:
- https://preview.aclanthology.org/dois-2013-emnlp/N19-1256.pdf