Multiplicative Tree-Structured Long Short-Term Memory Networks for Semantic Representations

Nam Khanh Tran, Weiwei Cheng


Abstract
Tree-structured LSTMs have shown advantages in learning semantic representations by exploiting syntactic information. Most existing methods model tree structures bottom-up, combining constituent nodes with a single shared compositional function and often making use of input word information only. This inability to capture the richness of compositionality limits the models' expressive power. In this paper, we propose multiplicative tree-structured LSTMs to tackle this problem. Our model makes use of not only word information but also relation information between words. It is more expressive, as a different combination function can be used for each child node. In addition to syntactic trees, we also investigate the use of Abstract Meaning Representation (AMR) in tree-structured models, in order to incorporate both syntactic and semantic information from the sentence. Experimental results on common NLP tasks show that the proposed models lead to better sentence representations and that AMR brings benefits in complex tasks.
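For a concrete picture of the idea, below is a minimal sketch (in PyTorch) of how a relation-conditioned, multiplicative child transformation could fit into a child-sum Tree-LSTM cell. The factorisation U(r)h = U_a((U_m r) * (U_b h)), the module name, the gating layout, and all dimensions are illustrative assumptions, not the paper's exact equations.

import torch
import torch.nn as nn

class MultiplicativeTreeLSTMCell(nn.Module):
    # Hypothetical sketch: a child-sum Tree-LSTM cell whose child-state
    # transformations are modulated by relation embeddings via the
    # multiplicative factorisation U(r) h = U_a ((U_m r) * (U_b h)).
    # Names and the exact gating layout are assumptions, not the paper's.
    def __init__(self, x_dim, h_dim, r_dim, f_dim):
        super().__init__()
        self.h_dim = h_dim
        self.Wx = nn.Linear(x_dim, 4 * h_dim)              # word input -> i, f, o, u gates
        self.Ub = nn.Linear(h_dim, f_dim, bias=False)      # project child hidden state
        self.Um = nn.Linear(r_dim, f_dim, bias=False)      # project child relation
        self.Ua = nn.Linear(f_dim, 4 * h_dim, bias=False)  # map back to gate space

    def forward(self, x, child_h, child_c, child_r):
        # x: (x_dim,) word embedding; child_h, child_c: (k, h_dim);
        # child_r: (k, r_dim) relation embeddings, e.g. dependency labels.
        # Each child gets its own effective transform, selected by its relation.
        per_child = self.Ua(self.Ub(child_h) * self.Um(child_r))   # (k, 4*h_dim)
        i, _, o, u = (self.Wx(x) + per_child.sum(dim=0)).chunk(4)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        # One forget gate per child, driven by that child's modulated transform.
        f = torch.sigmoid(self.Wx(x).chunk(4)[1]
                          + per_child[:, self.h_dim:2 * self.h_dim])
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c

# Example: compose a node with two children.
cell = MultiplicativeTreeLSTMCell(x_dim=300, h_dim=150, r_dim=50, f_dim=100)
h, c = cell(torch.randn(300), torch.randn(2, 150),
            torch.randn(2, 150), torch.randn(2, 50))

Because the modulation vector U_m r differs per relation, each child is effectively composed by a different function while sharing one set of parameters, which is the added expressiveness the abstract refers to.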
Anthology ID:
S18-2032
Volume:
Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Malvina Nissim, Jonathan Berant, Alessandro Lenci
Venue:
*SEM
SIGs:
SIGSEM | SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
276–286
URL:
https://aclanthology.org/S18-2032
DOI:
10.18653/v1/S18-2032
Bibkey:
tran-cheng-2018-multiplicative
Cite (ACL):
Nam Khanh Tran and Weiwei Cheng. 2018. Multiplicative Tree-Structured Long Short-Term Memory Networks for Semantic Representations. In Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics, pages 276–286, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Multiplicative Tree-Structured Long Short-Term Memory Networks for Semantic Representations (Tran & Cheng, *SEM 2018)
PDF:
https://aclanthology.org/S18-2032.pdf
Data
SNLI