Abstract
A number of differences have emerged between modern and classic approaches to constituency parsing in recent years, with structural components like grammars and feature-rich lexicons becoming less central while recurrent neural network representations rise in popularity. The goal of this work is to analyze the extent to which information provided directly by the model structure in classical systems is still being captured by neural methods. To this end, we propose a high-performance neural model (92.08 F1 on PTB) that is representative of recent work and perform a series of investigative experiments. We find that our model implicitly learns to encode much of the same information that was explicitly provided by grammars and lexicons in the past, indicating that this scaffolding can largely be subsumed by powerful general-purpose neural machinery.

- Anthology ID: N18-1091
- Volume: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
- Month: June
- Year: 2018
- Address: New Orleans, Louisiana
- Editors: Marilyn Walker, Heng Ji, Amanda Stent
- Venue: NAACL
- Publisher: Association for Computational Linguistics
- Pages: 999–1010
- URL: https://aclanthology.org/N18-1091
- DOI: 10.18653/v1/N18-1091
- Cite (ACL): David Gaddy, Mitchell Stern, and Dan Klein. 2018. What’s Going On in Neural Constituency Parsers? An Analysis. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 999–1010, New Orleans, Louisiana. Association for Computational Linguistics.
- Cite (Informal): What’s Going On in Neural Constituency Parsers? An Analysis (Gaddy et al., NAACL 2018)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/N18-1091.pdf
- Code: dgaddy/parser-analysis
- Data: Penn Treebank