Asynchronous Deep Interaction Network for Natural Language Inference

Di Liang, Fubao Zhang, Qi Zhang, Xuanjing Huang


Abstract
Natural language inference aims to predict whether a hypothesis sentence can be inferred from a premise sentence. Existing methods typically frame the reasoning problem as a semantic matching task: the two sentences are encoded and interact symmetrically and in parallel. However, the two sentences play clearly different roles in the reasoning process, and NLI sentence pairs form an asymmetric corpus. In this paper, we propose an asynchronous deep interaction network (ADIN) to address the task. ADIN is a neural network stacked from multiple inference sub-layers, each consisting of two local inference modules arranged asymmetrically. Unlike previous methods, this model deconstructs the reasoning process and performs asynchronous, multi-step reasoning. Experimental results show that ADIN achieves competitive performance and outperforms strong baselines on three popular benchmarks: SNLI, MultiNLI, and SciTail.
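As a rough illustration of the asymmetric, multi-step design described in the abstract, below is a minimal PyTorch-style sketch of one inference sub-layer with two direction-specific local inference modules. All module names, dimensions, and the cross-attention formulation here are assumptions made for illustration, not the authors' released implementation.

```python
# Hypothetical sketch (not the authors' code): one ADIN-style inference
# sub-layer with two asymmetric local inference modules. The premise and
# hypothesis are updated one after another (asynchronously) rather than
# in parallel, and sub-layers can be stacked for multi-step reasoning.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalInference(nn.Module):
    """Cross-attention local inference: updates `x` by reading from `y`."""

    def __init__(self, hidden: int):
        super().__init__()
        self.fuse = nn.Sequential(nn.Linear(4 * hidden, hidden), nn.ReLU())

    def forward(self, x, y):                      # x: (B, Lx, H), y: (B, Ly, H)
        scores = torch.bmm(x, y.transpose(1, 2))  # (B, Lx, Ly) alignment scores
        attn = F.softmax(scores, dim=-1)
        aligned = torch.bmm(attn, y)              # y-content aligned to x tokens
        # ESIM-style comparison features, fused back to the hidden size.
        feats = torch.cat([x, aligned, x - aligned, x * aligned], dim=-1)
        return self.fuse(feats)


class ADINSubLayer(nn.Module):
    """Asymmetric sub-layer: the hypothesis reads the premise first, then
    the premise reads the *updated* hypothesis (asynchronous order)."""

    def __init__(self, hidden: int):
        super().__init__()
        self.h_from_p = LocalInference(hidden)
        self.p_from_h = LocalInference(hidden)

    def forward(self, premise, hypothesis):
        hypothesis = self.h_from_p(hypothesis, premise)
        premise = self.p_from_h(premise, hypothesis)
        return premise, hypothesis


# Stacking sub-layers gives multi-step reasoning over the sentence pair.
if __name__ == "__main__":
    B, Lp, Lh, H = 2, 12, 9, 64
    layers = nn.ModuleList([ADINSubLayer(H) for _ in range(3)])
    p, h = torch.randn(B, Lp, H), torch.randn(B, Lh, H)
    for layer in layers:
        p, h = layer(p, h)
    print(p.shape, h.shape)  # torch.Size([2, 12, 64]) torch.Size([2, 9, 64])
```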
Anthology ID:
D19-1271
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2692–2700
URL:
https://aclanthology.org/D19-1271
DOI:
10.18653/v1/D19-1271
Cite (ACL):
Di Liang, Fubao Zhang, Qi Zhang, and Xuanjing Huang. 2019. Asynchronous Deep Interaction Network for Natural Language Inference. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 2692–2700, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Asynchronous Deep Interaction Network for Natural Language Inference (Liang et al., EMNLP 2019)
PDF:
https://preview.aclanthology.org/update-css-js/D19-1271.pdf