Abstract
We present a simple sequential sentence encoder for multi-domain natural language inference. Our encoder is based on stacked bidirectional LSTM-RNNs with shortcut connections and fine-tuning of word embeddings. The overall supervised model uses this encoder to encode two input sentences into two vectors, and then applies a classifier over the combination of these vectors to label the relationship between the two sentences as entailment, contradiction, or neutral. Our Shortcut-Stacked sentence encoders achieve strong improvements over existing encoders on matched and mismatched multi-domain natural language inference (top single-model result in the EMNLP RepEval 2017 Shared Task (Nangia et al., 2017)). Moreover, they achieve the new state-of-the-art encoding result on the original SNLI dataset (Bowman et al., 2015).
- Anthology ID: W17-5308
- Volume: Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP
- Month: September
- Year: 2017
- Address: Copenhagen, Denmark
- Editors: Samuel Bowman, Yoav Goldberg, Felix Hill, Angeliki Lazaridou, Omer Levy, Roi Reichart, Anders Søgaard
- Venue: RepEval
- Publisher: Association for Computational Linguistics
- Pages: 41–45
- URL: https://aclanthology.org/W17-5308
- DOI: 10.18653/v1/W17-5308
- Cite (ACL): Yixin Nie and Mohit Bansal. 2017. Shortcut-Stacked Sentence Encoders for Multi-Domain Inference. In Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP, pages 41–45, Copenhagen, Denmark. Association for Computational Linguistics.
- Cite (Informal): Shortcut-Stacked Sentence Encoders for Multi-Domain Inference (Nie & Bansal, RepEval 2017)
- PDF: https://preview.aclanthology.org/nschneid-patch-3/W17-5308.pdf
- Code: easonnie/multiNLI_encoder + additional community code
- Data: MultiNLI, SNLI
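As a rough illustration of the architecture the abstract describes, the sketch below shows the shortcut-connection idea (each stacked layer receives the word embeddings concatenated with the outputs of all lower layers) and a feature combination commonly used for NLI classifiers. This is a hypothetical, dependency-free sketch: `fake_bilstm` is a stand-in for a real bidirectional LSTM, and the function names, pooling choice, and combination are assumptions for illustration, not the authors' implementation (see the linked code repository for that).

```python
def fake_bilstm(seq, out_dim):
    """Stand-in for a bidirectional LSTM layer: maps each timestep's
    input vector to a fixed-size vector (here by truncating/padding)."""
    return [(x + [0.0] * out_dim)[:out_dim] for x in seq]

def shortcut_stacked_encode(embeddings, layer_dims):
    """Encode a sentence (a list of word-embedding vectors) with stacked
    layers whose inputs include shortcut connections to all lower layers."""
    layer_outputs = []
    for dim in layer_dims:
        # Shortcut connections: each layer's input at timestep t is the
        # word embedding concatenated with every previous layer's output.
        inputs = [
            emb + [h for prev in layer_outputs for h in prev[t]]
            for t, emb in enumerate(embeddings)
        ]
        layer_outputs.append(fake_bilstm(inputs, dim))
    # Max pooling over timesteps of the final layer yields the sentence vector.
    final = layer_outputs[-1]
    return [max(step[i] for step in final) for i in range(layer_dims[-1])]

def combine(u, v):
    """A standard NLI feature combination: [u; v; |u - v|; u * v],
    fed to a classifier over {entailment, contradiction, neutral}."""
    return (u + v
            + [abs(a - b) for a, b in zip(u, v)]
            + [a * b for a, b in zip(u, v)])

# Toy usage: two "sentences" as lists of 2-d word embeddings,
# encoded with two stacked layers of widths 2 and 3.
u = shortcut_stacked_encode([[1.0, 2.0], [3.0, 0.5]], [2, 3])
v = shortcut_stacked_encode([[0.5, 1.0]], [2, 3])
features = combine(u, v)  # 4 x 3 = 12 features for the classifier
```

Note how the shortcut connections make each layer's input width grow with depth, so higher layers can read both the raw embeddings and every intermediate representation.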