Efficient Structured Inference for Transition-Based Parsing with Neural Networks and Error States

Ashish Vaswani, Kenji Sagae


Abstract
Transition-based approaches based on local classification are attractive for dependency parsing due to their simplicity and speed, despite producing results slightly below the state of the art. In this paper, we propose a new approach for approximate structured inference for transition-based parsing that produces scores suitable for global scoring using local models. This is accomplished with the introduction of error states in local training, which add information about incorrect derivation paths typically left out completely in locally-trained models. Using neural networks for our local classifiers, our approach achieves 93.61% accuracy for transition-based dependency parsing in English.
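The abstract combines two ideas: local classifiers augmented with an explicit error class for states reached by incorrect transitions, and global scoring of derivations built from those local scores. The sketch below is a minimal, hypothetical illustration of these two ideas, not the authors' nndep implementation: `make_training_examples`, `beam_search`, and the callbacks they take are assumed names, and the real system uses neural network classifiers over parser states rather than the placeholder dictionaries shown here.

```python
import heapq
from typing import Callable, Dict, List, Tuple

# Hypothetical sketch (not the paper's nndep implementation) of:
# (1) local training examples that include an explicit ERROR class for
#     states reached via incorrect transitions, and
# (2) global scoring of derivations by summing local log-probabilities
#     in a beam search.

ACTIONS = ["SHIFT", "LEFT-ARC", "RIGHT-ARC"]   # simplified arc-standard actions
ERROR = "ERROR"                                # extra class for error states

State = Dict[str, object]                      # placeholder parser-state type


def make_training_examples(gold_states: List[State],
                           gold_actions: List[str]) -> List[Tuple[State, str]]:
    """For each state on the gold derivation, emit the gold (state, action)
    pair plus, for every incorrect action, the resulting off-path state
    labeled with the ERROR class."""
    examples: List[Tuple[State, str]] = []
    for state, gold in zip(gold_states, gold_actions):
        examples.append((state, gold))
        for action in ACTIONS:
            if action != gold:
                off_path_state = {"prev": state, "via": action}
                examples.append((off_path_state, ERROR))
    return examples


def beam_search(initial_state: State,
                score_actions: Callable[[State], Dict[str, float]],
                advance: Callable[[State, str], State],
                is_final: Callable[[State], bool],
                beam_size: int = 8) -> Tuple[float, List[str]]:
    """Approximate structured inference: keep the beam_size derivations with
    the highest sum of local action log-probabilities."""
    beam: List[Tuple[float, State, List[str]]] = [(0.0, initial_state, [])]
    while not all(is_final(state) for _, state, _ in beam):
        candidates: List[Tuple[float, State, List[str]]] = []
        for logp, state, history in beam:
            if is_final(state):
                candidates.append((logp, state, history))
                continue
            for action, local_lp in score_actions(state).items():
                candidates.append((logp + local_lp,
                                   advance(state, action),
                                   history + [action]))
        beam = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
    best = max(beam, key=lambda c: c[0])
    return best[0], best[2]
```

In this reading, the ERROR class gives the local classifier a way to direct probability mass away from all real actions in states that should never be reached, so that summed local scores stay comparable across competing beam items rather than being trained only on the gold path.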
Anthology ID:
Q16-1014
Volume:
Transactions of the Association for Computational Linguistics, Volume 4
Year:
2016
Address:
Cambridge, MA
Editors:
Lillian Lee, Mark Johnson, Kristina Toutanova
Venue:
TACL
Publisher:
MIT Press
Pages:
183–196
URL:
https://aclanthology.org/Q16-1014
DOI:
10.1162/tacl_a_00092
Cite (ACL):
Ashish Vaswani and Kenji Sagae. 2016. Efficient Structured Inference for Transition-Based Parsing with Neural Networks and Error States. Transactions of the Association for Computational Linguistics, 4:183–196.
Cite (Informal):
Efficient Structured Inference for Transition-Based Parsing with Neural Networks and Error States (Vaswani & Sagae, TACL 2016)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/Q16-1014.pdf
Code
 sagae/nndep
Data
Penn Treebank