Abstract
Tree-structured neural network architectures for sentence encoding draw inspiration from the approach to semantic composition generally seen in formal linguistics, and have shown empirical improvements over comparable sequence models by doing so. Moreover, adding multiplicative interaction terms to the composition functions in these models can yield significant further improvements. However, existing compositional approaches that adopt such a powerful composition function scale poorly, with parameter counts exploding as model dimension or vocabulary size grows. We introduce the Lifted Matrix-Space model, which uses a single global transformation to map vector word embeddings to matrices that can then be composed via an operation based on matrix-matrix multiplication. Its composition function effectively transmits a larger number of activations across layers with relatively few model parameters. We evaluate our model on the Stanford NLI corpus, the Multi-Genre NLI corpus, and the Stanford Sentiment Treebank and find that it consistently outperforms TreeLSTM (Tai et al., 2015), the best previously known composition function for tree-structured models.
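To make the composition step concrete, below is a minimal PyTorch sketch of the idea the abstract describes: one global map lifts word vectors to matrices, and a parse node's children are composed by matrix-matrix multiplication. The class name, the dimensions (d_emb = 300, d = 20), and the tanh nonlinearity are illustrative assumptions, not the authors' exact hyperparameters; the published implementation (linked under Code below) includes details not shown here.

```python
import torch
import torch.nn as nn

class LiftedMatrixSpaceSketch(nn.Module):
    """Minimal sketch of lifted matrix-space composition, under the
    assumptions stated above; not the authors' exact model."""

    def __init__(self, d_emb: int = 300, d: int = 20):
        super().__init__()
        # One global lifting transformation shared across the vocabulary,
        # so parameters grow with d_emb * d^2, not with vocabulary size.
        self.lift = nn.Linear(d_emb, d * d)
        self.d = d

    def lift_words(self, vecs: torch.Tensor) -> torch.Tensor:
        """Map (batch, d_emb) word embeddings to (batch, d, d) matrices."""
        return self.lift(vecs).view(-1, self.d, self.d)

    def compose(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
        """Compose two child matrices into a parent matrix via batched
        matrix multiplication, squashed by a nonlinearity (assumed tanh)."""
        return torch.tanh(torch.bmm(left, right))

# Usage: compose two words at a binary parse node.
model = LiftedMatrixSpaceSketch()
very, good = torch.randn(1, 300), torch.randn(1, 300)
parent = model.compose(model.lift_words(very), model.lift_words(good))
print(parent.shape)  # torch.Size([1, 20, 20])
```

Because the lifting map is shared globally, the parent representation carries d x d activations between layers while the parameter count stays fixed, which is the scaling advantage the abstract claims over per-word matrix models.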
- Anthology ID: K18-1049
- Volume: Proceedings of the 22nd Conference on Computational Natural Language Learning
- Month: October
- Year: 2018
- Address: Brussels, Belgium
- Editors: Anna Korhonen, Ivan Titov
- Venue: CoNLL
- SIG: SIGNLL
- Publisher: Association for Computational Linguistics
- Pages: 508–518
- URL: https://aclanthology.org/K18-1049
- DOI: 10.18653/v1/K18-1049
- Cite (ACL): WooJin Chung, Sheng-Fu Wang, and Samuel Bowman. 2018. The Lifted Matrix-Space Model for Semantic Composition. In Proceedings of the 22nd Conference on Computational Natural Language Learning, pages 508–518, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal): The Lifted Matrix-Space Model for Semantic Composition (Chung et al., CoNLL 2018)
- PDF: https://preview.aclanthology.org/landing_page/K18-1049.pdf
- Code: NYU-MLL/spinn + additional community code
- Data: MultiNLI, SNLI, SST