Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set

Tianze Shi, Liang Huang, Lillian Lee


Abstract
We first present a minimal feature set for transition-based dependency parsing, continuing a recent trend started by Kiperwasser and Goldberg (2016a) and Cross and Huang (2016a) of using bi-directional LSTM features. We plug our minimal feature set into the dynamic-programming framework of Huang and Sagae (2010) and Kuhlmann et al. (2011) to produce the first implementation of worst-case O(n³) exact decoders for arc-hybrid and arc-eager transition systems. With our minimal features, we also present O(n³) global training methods. Finally, using ensembles including our new parsers, we achieve the best unlabeled attachment score reported (to our knowledge) on the Chinese Treebank and the “second-best-in-class” result on the English Penn Treebank.
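
As an illustrative sketch only (not the authors' released implementation; see tzshi/dp-parser-emnlp17 below), the following Python shows the arc-hybrid transition system the abstract refers to, driven by a minimal positional feature set, here assumed to be the stack top and the buffer front, each of which the paper represents with a precomputed bi-LSTM vector. The names ArcHybridState, minimal_features, greedy_parse, and score_fn are hypothetical, root handling is simplified, and the greedy loop merely stands in for the paper's O(n³) exact dynamic-programming decoder.

# Hypothetical sketch of the arc-hybrid transition system with a minimal
# positional feature set (stack top s0 and buffer front b0); in the paper
# these positions are represented by bi-LSTM vectors and scored by an MLP.
# Not the authors' code; root handling is simplified (no explicit ROOT token).

SHIFT, LEFT_ARC, RIGHT_ARC = 0, 1, 2

class ArcHybridState:
    def __init__(self, n_words):
        self.stack = []                     # indices of words on the stack
        self.buffer = list(range(n_words))  # words not yet shifted
        self.heads = [None] * n_words       # heads[i] = head of word i

    def legal(self, action):
        if action == SHIFT:      # move the buffer front onto the stack
            return len(self.buffer) > 0
        if action == LEFT_ARC:   # stack top becomes a dependent of the buffer front
            return len(self.stack) >= 1 and len(self.buffer) > 0
        if action == RIGHT_ARC:  # stack top becomes a dependent of the item below it
            return len(self.stack) >= 2
        return False

    def apply(self, action):
        if action == SHIFT:
            self.stack.append(self.buffer.pop(0))
        elif action == LEFT_ARC:
            dep = self.stack.pop()
            self.heads[dep] = self.buffer[0]
        elif action == RIGHT_ARC:
            dep = self.stack.pop()
            self.heads[dep] = self.stack[-1]

    def minimal_features(self):
        # Minimal positional features: stack top and buffer front
        # (None when a position is empty).
        s0 = self.stack[-1] if self.stack else None
        b0 = self.buffer[0] if self.buffer else None
        return s0, b0

def greedy_parse(n_words, score_fn):
    # Greedy decoding with an arbitrary scorer score_fn((s0, b0), action) -> float.
    # The paper instead performs exact O(n^3) dynamic-programming search over
    # the same transition system; this loop is only for illustration.
    state = ArcHybridState(n_words)
    while state.buffer or len(state.stack) > 1:
        feats = state.minimal_features()
        action = max((a for a in (SHIFT, LEFT_ARC, RIGHT_ARC) if state.legal(a)),
                     key=lambda a: score_fn(feats, a))
        state.apply(action)
    return state.heads  # exactly one word keeps head None: the (simplified) root

For example, greedy_parse(5, lambda feats, a: 0.0) runs the transition system on a five-word sentence with a trivial scorer and returns a head for every word except the one left on the stack, which plays the role of the root in this simplified sketch.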
Anthology ID:
D17-1002
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
12–23
URL:
https://aclanthology.org/D17-1002
DOI:
10.18653/v1/D17-1002
Cite (ACL):
Tianze Shi, Liang Huang, and Lillian Lee. 2017. Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 12–23, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Fast(er) Exact Decoding and Global Training for Transition-Based Dependency Parsing via a Minimal Feature Set (Shi et al., EMNLP 2017)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/D17-1002.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/D17-1002.mp4
Code:
tzshi/dp-parser-emnlp17
Data:
Penn Treebank