Abstract
We present neural syntactic generative models with exact marginalization that support both dependency parsing and language modeling. Exact marginalization is made tractable through dynamic programming over shift-reduce parsing and minimal RNN-based feature sets. Our algorithms complement previous approaches by supporting batched training and enabling online computation of next-word probabilities. For supervised dependency parsing, our model achieves a state-of-the-art result among generative approaches. We also report empirical results on unsupervised syntactic models and their role in language modeling. We find that our formulation of latent dependencies with exact marginalization does not lead to better intrinsic language modeling performance than vanilla RNNs, and that parsing accuracy is not correlated with language modeling perplexity in stack-based models.
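The central device in the abstract is summing over all latent derivations rather than decoding a single best one. Below is a minimal sketch of that idea, not the paper's implementation: an inside-style dynamic program in log space over latent binary bracketings, which correspond to the derivation structures of an arc-standard shift-reduce parser. The scoring functions `shift_logp` and `reduce_logp` are hypothetical stand-ins; the paper derives its scores from minimal RNN-based features and runs the dynamic program over shift-reduce parser items.

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x) for x in xs))."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def marginal_log_likelihood(n, shift_logp, reduce_logp):
    """Log p(sentence), summed over all latent binary derivations.

    shift_logp(i)        -- log prob of generating word i at a SHIFT (stand-in)
    reduce_logp(i, k, j) -- log prob of the REDUCE merging spans
                            [i, k) and [k, j)                       (stand-in)
    inside[i][j] holds the log marginal probability that words i..j-1
    form one completed subtree, summed over its internal derivations.
    """
    inside = [[float("-inf")] * (n + 1) for _ in range(n + 1)]
    for i in range(n):                    # width-1 spans: a single SHIFT
        inside[i][i + 1] = shift_logp(i)
    for width in range(2, n + 1):         # build wider spans bottom-up
        for i in range(n - width + 1):
            j = i + width
            inside[i][j] = logsumexp(
                [inside[i][k] + inside[k][j] + reduce_logp(i, k, j)
                 for k in range(i + 1, j)])
    return inside[0][n]                   # all derivations of the full sentence

# Toy usage: uniform scores over a 3-word sentence.
print(marginal_log_likelihood(
    3,
    shift_logp=lambda i: math.log(0.1),
    reduce_logp=lambda i, k, j: math.log(0.5)))
```

The same chart also yields the online next-word probabilities the abstract highlights: normalizing over the partial derivations that are consistent with a prefix gives p(w_{t+1} | w_1..w_t) without enumerating derivations explicitly.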
- Anthology ID: N18-1086
- Volume: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
- Month: June
- Year: 2018
- Address: New Orleans, Louisiana
- Editors: Marilyn Walker, Heng Ji, Amanda Stent
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 942–952
- URL: https://aclanthology.org/N18-1086
- DOI: 10.18653/v1/N18-1086
- Cite (ACL): Jan Buys and Phil Blunsom. 2018. Neural Syntactic Generative Models with Exact Marginalization. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 942–952, New Orleans, Louisiana. Association for Computational Linguistics.
- Cite (Informal): Neural Syntactic Generative Models with Exact Marginalization (Buys & Blunsom, NAACL 2018)
- PDF: https://preview.aclanthology.org/fix-dup-bibkey/N18-1086.pdf