Head-Lexicalized Bidirectional Tree LSTMs

Zhiyang Teng, Yue Zhang


Abstract
Sequential LSTMs have been extended to model tree structures, giving competitive results for a number of tasks. Existing methods model constituent trees by bottom-up combinations of constituent nodes, making direct use of input word information only for leaf nodes. This is different from sequential LSTMs, which contain references to input words for each node. In this paper, we propose a method for automatic head-lexicalization for tree-structure LSTMs, propagating head words from leaf nodes to every constituent node. In addition, enabled by head lexicalization, we build a tree LSTM in the top-down direction, which corresponds to bidirectional sequential LSTMs in structure. Experiments show that both extensions give better representations of tree structures. Our final model gives the best results on the Stanford Sentiment Treebank and highly competitive results on the TREC question type classification task.
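As a concrete illustration of the idea, the sketch below shows a binary tree-LSTM cell that propagates a soft head word from the leaves up to each constituent node, so that every node, like every step of a sequential LSTM, receives a lexical input. This is a minimal sketch, assuming numpy and a standard gate parametrization; the class name HeadLexTreeLSTM, the scalar head-selection gate, and the weight shapes are illustrative choices, not the paper's exact equations.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class HeadLexTreeLSTM:
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        def W():
            # one weight matrix per gate over [x; h_left; h_right]
            return rng.normal(0.0, 0.1, (dim, 3 * dim))
        self.Wi, self.Wfl, self.Wfr, self.Wo, self.Wu = W(), W(), W(), W(), W()
        # scalar gate deciding how much of the head comes from each child
        self.wz = rng.normal(0.0, 0.1, 2 * dim)

    def leaf(self, x):
        # at a leaf, the input word embedding is its own head word
        c = np.tanh(x)                      # toy cell initialization
        return np.tanh(c), c, x             # (hidden, cell, head)

    def compose(self, left, right):
        (hl, cl, xl), (hr, cr, xr) = left, right
        # head lexicalization: softly pick the constituent's head word
        # from the children's heads and feed it in as the node input
        z = sigmoid(self.wz @ np.concatenate([hl, hr]))
        x = z * xl + (1.0 - z) * xr
        v = np.concatenate([x, hl, hr])
        i = sigmoid(self.Wi @ v)            # input gate
        fl = sigmoid(self.Wfl @ v)          # forget gate, left child
        fr = sigmoid(self.Wfr @ v)          # forget gate, right child
        o = sigmoid(self.Wo @ v)            # output gate
        u = np.tanh(self.Wu @ v)            # candidate cell value
        c = i * u + fl * cl + fr * cr
        return o * np.tanh(c), c, x

# usage: compose a two-word constituent; the root now carries a lexical
# head embedding in addition to its hidden state and memory cell
cell = HeadLexTreeLSTM(dim=4)
emb = np.random.default_rng(1).normal(size=(2, 4))
h, c, head = cell.compose(cell.leaf(emb[0]), cell.leaf(emb[1]))

In a purely bottom-up tree LSTM the internal nodes have no word input; here the interpolated head embedding plays that role at every node, which is also what makes a top-down (root-to-leaf) pass, and hence a bidirectional tree LSTM, well defined.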
Anthology ID: Q17-1012
Volume: Transactions of the Association for Computational Linguistics, Volume 5
Year: 2017
Address: Cambridge, MA
Editors: Lillian Lee, Mark Johnson, Kristina Toutanova
Venue: TACL
Publisher: MIT Press
Pages: 163–177
URL: https://aclanthology.org/Q17-1012
DOI: 10.1162/tacl_a_00053
Cite (ACL): Zhiyang Teng and Yue Zhang. 2017. Head-Lexicalized Bidirectional Tree LSTMs. Transactions of the Association for Computational Linguistics, 5:163–177.
Cite (Informal): Head-Lexicalized Bidirectional Tree LSTMs (Teng & Zhang, TACL 2017)
PDF: https://preview.aclanthology.org/ingest-bitext-workshop/Q17-1012.pdf
Video: https://preview.aclanthology.org/ingest-bitext-workshop/Q17-1012.mp4