Modelling Sentence Pairs with Tree-structured Attentive Encoder

Yao Zhou, Cong Liu, Yan Pan


Abstract
We describe an attentive encoder that combines tree-structured recursive neural networks and sequential recurrent neural networks for modelling sentence pairs. Whereas existing attentive models apply attention only over sequential structures, we propose a way to incorporate attention into the tree topology. Specifically, given a pair of sentences, our attentive encoder uses the representation of one sentence, generated by an RNN, to guide the structural encoding of the other sentence over its dependency parse tree. We evaluate the proposed attentive encoder on three tasks: semantic similarity, paraphrase identification, and true-false question selection. Experimental results show that our encoder outperforms all baselines and achieves state-of-the-art results on two tasks.
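The mechanism sketched in the abstract can be illustrated with a toy example: one sentence is encoded by a sequential RNN, and each node of the other sentence's dependency parse tree attends over those RNN states while composing its children. The NumPy sketch below is only an illustration of this general idea under assumed dimensions and a simplified (non-gated, child-sum) composition; it is not the authors' exact model, and all function names, weights, and shapes are hypothetical.

```python
# Illustrative sketch only: RNN states of sentence A guide the tree-structured
# encoding of sentence B. Gates, scoring functions, and dimensions are assumed.
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (assumed)

def rnn_encode(embeddings, W, U):
    """Simple tanh RNN over one sentence; returns all hidden states."""
    h = np.zeros(d)
    states = []
    for x in embeddings:
        h = np.tanh(W @ x + U @ h)
        states.append(h)
    return np.stack(states)                      # (len, d)

def attend(query, keys):
    """Dot-product attention: context vector over the other sentence's states."""
    scores = keys @ query                        # (len,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ keys                        # (d,)

def tree_encode(node, embeddings, other_states, Wc, Wa):
    """Recursively encode a dependency-tree node; the composition at each node
    is guided by attention over the other sentence's RNN states."""
    child_sum = np.zeros(d)
    for child in node["children"]:
        child_sum += tree_encode(child, embeddings, other_states, Wc, Wa)
    local = embeddings[node["index"]] + child_sum
    context = attend(local, other_states)        # guidance from the other sentence
    return np.tanh(Wc @ local + Wa @ context)

# Toy run: sentence A (4 tokens) guides the tree encoding of sentence B (3 tokens).
emb_a = rng.normal(size=(4, d))
emb_b = rng.normal(size=(3, d))
W, U = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wc, Wa = rng.normal(size=(d, d)), rng.normal(size=(d, d))
states_a = rnn_encode(emb_a, W, U)
tree_b = {"index": 1, "children": [{"index": 0, "children": []},
                                   {"index": 2, "children": []}]}
root_vec = tree_encode(tree_b, emb_b, states_a, Wc, Wa)
print(root_vec.shape)  # (8,): attentive tree representation of sentence B's root
```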
Anthology ID:
C16-1274
Volume:
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Month:
December
Year:
2016
Address:
Osaka, Japan
Editors:
Yuji Matsumoto, Rashmi Prasad
Venue:
COLING
Publisher:
The COLING 2016 Organizing Committee
Pages:
2912–2922
URL:
https://aclanthology.org/C16-1274
Cite (ACL):
Yao Zhou, Cong Liu, and Yan Pan. 2016. Modelling Sentence Pairs with Tree-structured Attentive Encoder. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 2912–2922, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal):
Modelling Sentence Pairs with Tree-structured Attentive Encoder (Zhou et al., COLING 2016)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/C16-1274.pdf
Code
 yoosan/sentpair