Abstract
Recurrent and convolutional neural networks comprise two distinct families of models that have proven to be useful for encoding natural language utterances. In this paper we present SoPa, a new model that aims to bridge these two approaches. SoPa combines neural representation learning with weighted finite-state automata (WFSAs) to learn a soft version of traditional surface patterns. We show that SoPa is an extension of a one-layer CNN, and that such CNNs are equivalent to a restricted version of SoPa, and accordingly, to a restricted form of WFSA. Empirically, on three text classification tasks, SoPa is comparable to or better than both a BiLSTM (RNN) baseline and a CNN baseline, and is particularly useful in small-data settings.

- Anthology ID: P18-1028
- Volume: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: July
- Year: 2018
- Address: Melbourne, Australia
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 295–305
- URL: https://aclanthology.org/P18-1028
- DOI: 10.18653/v1/P18-1028
- Cite (ACL): Roy Schwartz, Sam Thomson, and Noah A. Smith. 2018. Bridging CNNs, RNNs, and Weighted Finite-State Machines. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 295–305, Melbourne, Australia. Association for Computational Linguistics.
- Cite (Informal): Bridging CNNs, RNNs, and Weighted Finite-State Machines (Schwartz et al., ACL 2018)
- PDF: https://preview.aclanthology.org/ingestion-script-update/P18-1028.pdf
- Data: SST
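To make the abstract's idea concrete: a surface pattern can be scored by a linear-chain weighted finite-state automaton, where each token either advances the pattern one state or is absorbed by a self-loop. The sketch below is a minimal illustration of that scoring idea, not the paper's implementation: the transition scores and self-loop cost here are hypothetical hand-set values, whereas SoPa computes them from learned word-vector parameters.

```python
# Hedged sketch: scoring a phrase with a linear-chain weighted pattern
# under the max-sum (Viterbi) semiring. All scores below are illustrative.

NEG_INF = float("-inf")

def score_phrase(phrase, transition_scores, num_states):
    """Viterbi-style forward pass over states 0..num_states.

    Each token either advances the pattern one state (using its
    transition score) or is absorbed by a self-loop at a fixed cost,
    which is what makes the pattern 'soft'.
    """
    # viterbi[i] = best score of any path that reaches state i so far
    viterbi = [NEG_INF] * (num_states + 1)
    viterbi[0] = 0.0  # start state
    for token in phrase:
        new = [NEG_INF] * (num_states + 1)
        for i in range(num_states + 1):
            if viterbi[i] == NEG_INF:
                continue
            # self-loop: stay in state i, paying a small penalty
            new[i] = max(new[i], viterbi[i] - 1.0)
            # main transition: consume the token to advance one state
            if i < num_states:
                s = transition_scores.get((i, token), NEG_INF)
                new[i + 1] = max(new[i + 1], viterbi[i] + s)
        viterbi = new
    return viterbi[num_states]  # score of reaching the accept state

# Hypothetical 2-state pattern that roughly matches "very good"
scores = {(0, "very"): 2.0, (1, "good"): 3.0}
print(score_phrase(["very", "good"], scores, num_states=2))  # prints 5.0
```

Because of the self-loop, the same pattern also assigns a (slightly penalized) score to "very really good", which is the flexibility that distinguishes soft patterns from hard surface patterns.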