Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing

Jean Maillard, Stephen Clark


Abstract
Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, with competitive downstream performance and non-trivial induced trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.
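To make the shift-reduce composition idea in the abstract concrete, here is a minimal Python sketch. This is not the paper's model: compose() is a hypothetical stand-in for a learned composition function (e.g. a Tree-LSTM cell), and the hand-written SHIFT/REDUCE action sequence stands in for the differentiable parsing policy the model would induce from the downstream task.

    import numpy as np

    def compose(left, right):
        # Hypothetical composition; the real model would use a trainable
        # parameterized cell here rather than a fixed function.
        return np.tanh(left + right)

    def shift_reduce_encode(embeddings, actions):
        """Compose word vectors into one sentence vector by following a
        SHIFT/REDUCE action sequence over a stack and a buffer."""
        stack, buffer = [], list(embeddings)
        for action in actions:
            if action == "SHIFT":
                stack.append(buffer.pop(0))   # move next word onto the stack
            else:  # REDUCE: merge the top two stack items into one tree node
                right, left = stack.pop(), stack.pop()
                stack.append(compose(left, right))
        assert len(stack) == 1, "a valid action sequence leaves a single root"
        return stack[0]

    # Toy usage: three words composed as the tree ((w1 w2) w3).
    words = [np.random.randn(4) for _ in range(3)]
    actions = ["SHIFT", "SHIFT", "REDUCE", "SHIFT", "REDUCE"]
    sentence_vec = shift_reduce_encode(words, actions)
    print(sentence_vec.shape)  # (4,)

Different action sequences yield different binary trees over the same words; the latent tree learning setting chooses among them using only the downstream objective, with no human-annotated parses.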
Anthology ID: W18-2903
Volume: Proceedings of the Workshop on the Relevance of Linguistic Structure in Neural Architectures for NLP
Month: July
Year: 2018
Address: Melbourne, Australia
Venues: ACL | WS
Publisher: Association for Computational Linguistics
Pages: 13–18
URL: https://aclanthology.org/W18-2903
DOI: 10.18653/v1/W18-2903
Cite (ACL): Jean Maillard and Stephen Clark. 2018. Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing. In Proceedings of the Workshop on the Relevance of Linguistic Structure in Neural Architectures for NLP, pages 13–18, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal): Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing (Maillard & Clark, 2018)
PDF: https://aclanthology.org/W18-2903.pdf
Data: MultiNLI | SNLI