Dependency Parsing as Head Selection

Xingxing Zhang, Jianpeng Cheng, Mirella Lapata


Abstract
Conventional graph-based dependency parsers guarantee a tree structure both during training and inference. Instead, we formalize dependency parsing as the problem of independently selecting the head of each word in a sentence. Our model, which we call DeNSe (as a shorthand for Dependency Neural Selection), produces a distribution over possible heads for each word using features obtained from a bidirectional recurrent neural network. Without enforcing structural constraints during training, DeNSe generates trees at inference time for the overwhelming majority of sentences, while non-tree outputs can be adjusted with a maximum spanning tree algorithm. We evaluate DeNSe on four languages (English, Chinese, Czech, and German) with varying degrees of non-projectivity. Despite the simplicity of the approach, our parsers are on par with the state of the art.
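The sketch below illustrates the head-selection idea described in the abstract: for every word, score all candidate heads (including an artificial ROOT) from per-token features and normalize the scores with a softmax, so each word gets its own independent head distribution. This is a minimal illustration, not the authors' implementation; the bilinear-free scoring function `v . tanh(U h_i + W h_j)` and the function names are assumptions, and the bidirectional RNN that would produce the token features is omitted.

```python
import numpy as np

def head_distributions(h, U, W, v):
    """For each word j, compute a softmax distribution over candidate heads i.

    h : (n+1, d) array of per-token features; index 0 is an artificial ROOT.
        (In the paper these features come from a bidirectional RNN; here they
        are taken as given.)
    U, W : (k, d) projection matrices; v : (k,) scoring vector.
    Scoring uses score(i, j) = v . tanh(U h_i + W h_j), an assumed
    parameterisation for illustration only.
    """
    m = h.shape[0]
    scores = np.empty((m, m))                      # scores[i, j]: word i as head of word j
    for j in range(m):
        for i in range(m):
            scores[i, j] = v @ np.tanh(U @ h[i] + W @ h[j])
    scores[np.arange(m), np.arange(m)] = -np.inf   # a word cannot be its own head
    # Column-wise softmax: an independent head distribution for every dependent word.
    exp = np.exp(scores - scores.max(axis=0, keepdims=True))
    return exp / exp.sum(axis=0, keepdims=True)

def greedy_heads(probs):
    """Pick the most probable head for each word independently (columns 1..n).

    The result is usually a tree; when it is not, the abstract notes that the
    output can be repaired with a maximum spanning tree algorithm.
    """
    return probs[:, 1:].argmax(axis=0)
```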
Anthology ID:
E17-1063
Volume:
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Mirella Lapata, Phil Blunsom, Alexander Koller
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
665–676
URL:
https://aclanthology.org/E17-1063
Cite (ACL):
Xingxing Zhang, Jianpeng Cheng, and Mirella Lapata. 2017. Dependency Parsing as Head Selection. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers, pages 665–676, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
Dependency Parsing as Head Selection (Zhang et al., EACL 2017)
PDF:
https://preview.aclanthology.org/fix-dup-bibkey/E17-1063.pdf
Code:
XingxingZhang/dense_parser