Compare, Compress and Propagate: Enhancing Neural Architectures with Alignment Factorization for Natural Language Inference

Yi Tay, Anh Tuan Luu, Siu Cheung Hui


Abstract
This paper presents a new deep learning architecture for Natural Language Inference (NLI). Firstly, we introduce a new architecture where alignment pairs are compared, compressed and then propagated to upper layers for enhanced representation learning. Secondly, we adopt factorization layers for efficient and expressive compression of alignment vectors into scalar features, which are then used to augment the base word representations. Our approach is designed to be conceptually simple, compact and yet powerful. We conduct experiments on three popular benchmarks, SNLI, MultiNLI and SciTail, achieving competitive performance on all. A lightweight parameterization of our model also enjoys a threefold reduction in parameter size compared to existing state-of-the-art models, e.g., ESIM and DIIN, while maintaining competitive performance. Additionally, visual analysis shows that our propagated features are highly interpretable.
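The factorization layers mentioned in the abstract are in the style of factorization machines, which map a d-dimensional vector to a single scalar using a bias, linear weights, and low-rank pairwise interaction factors. The sketch below illustrates this compression step in isolation; the function name and parameter shapes are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fm_compress(x, w0, w, V):
    """Compress a vector x of shape (d,) into a scalar score.

    w0: scalar bias
    w:  (d,) linear weights
    V:  (d, k) low-rank factors for pairwise interactions

    Implements w0 + <w, x> + sum_{i<j} <V_i, V_j> x_i x_j,
    using the O(d*k) reformulation instead of the O(d^2) double sum.
    (Illustrative sketch; names and shapes are assumptions.)
    """
    linear = w0 + w @ x
    s = V.T @ x                      # (k,): sum_i V[i, f] * x[i]
    s2 = (V ** 2).T @ (x ** 2)       # (k,): sum_i V[i, f]^2 * x[i]^2
    pairwise = 0.5 * np.sum(s ** 2 - s2)
    return linear + pairwise
```

In the paper's setting, such a layer would be applied to each alignment (comparison) vector, and the resulting scalars augment the base word representations before propagation to upper layers.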
Anthology ID:
D18-1185
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1565–1575
URL:
https://aclanthology.org/D18-1185
DOI:
10.18653/v1/D18-1185
Cite (ACL):
Yi Tay, Anh Tuan Luu, and Siu Cheung Hui. 2018. Compare, Compress and Propagate: Enhancing Neural Architectures with Alignment Factorization for Natural Language Inference. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1565–1575, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Compare, Compress and Propagate: Enhancing Neural Architectures with Alignment Factorization for Natural Language Inference (Tay et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/D18-1185.pdf
Data
MultiNLI, SNLI, SciTail