Abstract
In NLP, convolutional neural networks (CNNs) have benefited less than recurrent neural networks (RNNs) from attention mechanisms. We hypothesize that this is because attention in CNNs has mainly been implemented as attentive pooling (i.e., it is applied to pooling) rather than as attentive convolution (i.e., it is integrated into convolution). Convolution is the differentiator of CNNs in that it can powerfully model the higher-level representation of a word by taking into account its local fixed-size context in the input text t_x. In this work, we propose an attentive convolution network, ATTCONV. It extends the context scope of the convolution operation, deriving higher-level features for a word not only from local context, but also from information extracted from nonlocal context by the attention mechanism commonly used in RNNs. This nonlocal context can come (i) from parts of the input text t_x that are distant or (ii) from extra (i.e., external) contexts t_y. Experiments on sentence modeling with zero-context (sentiment analysis), single-context (textual entailment), and multiple-context (claim verification) settings demonstrate the effectiveness of ATTCONV in sentence representation learning with the incorporation of context. In particular, attentive convolution outperforms attentive pooling and is a strong competitor to popular attentive RNNs.
- Anthology ID:
- Q18-1047
- Volume:
- Transactions of the Association for Computational Linguistics, Volume 6
- Year:
- 2018
- Address:
- Cambridge, MA
- Editors:
- Lillian Lee, Mark Johnson, Kristina Toutanova, Brian Roark
- Venue:
- TACL
- Publisher:
- MIT Press
- Pages:
- 687–702
- URL:
- https://aclanthology.org/Q18-1047
- DOI:
- 10.1162/tacl_a_00249
- Cite (ACL):
- Wenpeng Yin and Hinrich Schütze. 2018. Attentive Convolution: Equipping CNNs with RNN-style Attention Mechanisms. Transactions of the Association for Computational Linguistics, 6:687–702.
- Cite (Informal):
- Attentive Convolution: Equipping CNNs with RNN-style Attention Mechanisms (Yin & Schütze, TACL 2018)
- PDF:
- https://preview.aclanthology.org/ingest-acl-2023-videos/Q18-1047.pdf
- Data
- FEVER, SNLI
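The core idea described in the abstract — deriving a word's feature from its local convolution window plus an attention-weighted summary of a nonlocal or external context — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all function names, weight shapes, and the simple dot-product attention are assumptions for clarity.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_convolution(tx, ty, W_local, W_attn, b, k=3):
    """Illustrative attentive convolution (not the paper's exact model).

    tx: (n, d) word vectors of the input text t_x
    ty: (m, d) word vectors of a context t_y (may be t_x itself)
    W_local: (h, k*d) weights for the local convolution window
    W_attn:  (h, d)   weights for the attended context vector
    b: (h,) bias; k: odd window size
    """
    n, d = tx.shape
    pad = k // 2
    padded = np.vstack([np.zeros((pad, d)), tx, np.zeros((pad, d))])
    out = []
    for i in range(n):
        window = padded[i:i + k].reshape(-1)    # local fixed-size context
        scores = ty @ tx[i]                     # dot-product attention scores
        attended = softmax(scores) @ ty         # (d,) nonlocal context summary
        # Higher-level feature combines local window and attended context.
        h = np.tanh(W_local @ window + W_attn @ attended + b)
        out.append(h)
    return np.stack(out)
```

With `ty = tx`, the attended vector gathers distant parts of the input itself (case (i) in the abstract); passing a separate `ty`, e.g. a hypothesis sentence in textual entailment, corresponds to an external context (case (ii)). Attentive pooling, by contrast, would apply the attention weights only after convolution, when pooling the feature maps.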