Abstract
Natural language inference has been shown to be an effective supervised task for learning generic sentence embeddings. In order to better understand the components that lead to effective representations, we propose a lightweight version of InferSent, called InferLite, that does not use any recurrent layers and operates on a collection of pre-trained word embeddings. We show that a simple instance of our model that makes no use of context, word ordering, or position can still obtain competitive performance on the majority of downstream prediction tasks, with most performance gaps being filled by adding local contextual information through temporal convolutions. Our models can be trained in under 1 hour on a single GPU and allow for fast inference of new representations. Finally, we describe a semantic hashing layer that allows our model to learn generic binary codes for sentences.
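The abstract names three concrete components: an encoder that pools a collection of pre-trained word embeddings with no recurrence, word ordering, or position; a variant that adds local context through temporal convolutions; and a semantic hashing layer that produces binary codes. The PyTorch sketch below shows one plausible reading of each piece. All module names (`ContextFreeEncoder`, `ConvEncoder`, `SemanticHashing`), the max-pooling choice, and every hyperparameter are assumptions for illustration; the paper's exact architecture, including how the collection of embeddings is fused, is not reproduced here.

```python
# Illustrative sketch only -- not the paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextFreeEncoder(nn.Module):
    """Pools pre-trained embeddings with no recurrence, word order, or
    position information (max pooling here, one order-invariant choice)."""
    def __init__(self, embeddings: torch.Tensor, dim: int = 512):
        super().__init__()
        # Frozen pre-trained embedding table (e.g. GloVe vectors).
        self.embed = nn.Embedding.from_pretrained(embeddings, freeze=True)
        self.proj = nn.Linear(embeddings.size(1), dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        h = F.relu(self.proj(self.embed(token_ids)))  # (batch, seq, dim)
        return h.max(dim=1).values                    # order-invariant pooling

class ConvEncoder(nn.Module):
    """Adds local contextual information via a temporal (1-D) convolution
    before pooling, as the abstract describes."""
    def __init__(self, embeddings: torch.Tensor, dim: int = 512, width: int = 3):
        super().__init__()
        self.embed = nn.Embedding.from_pretrained(embeddings, freeze=True)
        self.conv = nn.Conv1d(embeddings.size(1), dim,
                              kernel_size=width, padding=width // 2)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        h = self.embed(token_ids).transpose(1, 2)     # (batch, emb_dim, seq)
        h = F.relu(self.conv(h))                      # local n-gram features
        return h.max(dim=2).values

class SemanticHashing(nn.Module):
    """Maps a sentence vector to a binary code via a straight-through
    sign estimator; the paper's hashing layer may differ."""
    def __init__(self, dim: int = 512, bits: int = 256):
        super().__init__()
        self.proj = nn.Linear(dim, bits)

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        logits = torch.tanh(self.proj(v))
        codes = (logits > 0).float()
        # Forward pass emits hard {0, 1} codes; backward pass flows
        # through the smooth logits (straight-through trick).
        return codes + logits - logits.detach()
```

A quick usage sketch, with random tensors standing in for real pre-trained embeddings and token ids:

```python
vocab, emb_dim = 10000, 300
pretrained = torch.randn(vocab, emb_dim)                 # stand-in for GloVe
encoder = ConvEncoder(pretrained)
hasher = SemanticHashing(dim=512, bits=256)
codes = hasher(encoder(torch.randint(vocab, (8, 20))))   # (8, 256) binary codes
```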
- Anthology ID: D18-1524
- Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
- Month: October-November
- Year: 2018
- Address: Brussels, Belgium
- Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
- Venue: EMNLP
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 4868–4874
- URL: https://aclanthology.org/D18-1524
- DOI: 10.18653/v1/D18-1524
- Cite (ACL): Jamie Kiros and William Chan. 2018. InferLite: Simple Universal Sentence Representations from Natural Language Inference Data. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4868–4874, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal): InferLite: Simple Universal Sentence Representations from Natural Language Inference Data (Kiros & Chan, EMNLP 2018)
- PDF: https://aclanthology.org/D18-1524.pdf
- Data: GLUE, MultiNLI, SNLI