Combining Generative and Discriminative Approaches to Unsupervised Dependency Parsing via Dual Decomposition

Yong Jiang, Wenjuan Han, Kewei Tu


Abstract
Unsupervised dependency parsing aims to learn a dependency parser from unannotated sentences. Existing work focuses on either learning generative models using the expectation-maximization algorithm and its variants, or learning discriminative models using the discriminative clustering algorithm. In this paper, we propose a new learning strategy that learns a generative model and a discriminative model jointly based on the dual decomposition method. Our method is simple and general, yet effectively captures the advantages of both models and improves their learning results. We tested our method on the UD treebank and achieved state-of-the-art performance on thirty languages.
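
To illustrate the dual decomposition idea mentioned in the abstract (a minimal sketch, not the paper's exact formulation), the following Python function shows how two parsers can be pushed toward agreement on a sentence's dependency arcs via subgradient updates on per-arc dual variables. The names parse_generative and parse_discriminative are hypothetical stand-ins for the two models' decoders; each is assumed to take the dual adjustments and return the best parse as a set of (head, dependent) arcs.

def dual_decompose(n, parse_generative, parse_discriminative,
                   step_size=1.0, max_iters=50):
    """Subgradient dual decomposition over a sentence of length n:
    penalize arcs the two parses disagree on until they converge
    (or the iteration budget runs out)."""
    # One dual variable u[(h, d)] per candidate arc; head 0 is the root.
    u = {(h, d): 0.0 for h in range(n + 1) for d in range(1, n + 1)}

    y_gen = None
    for t in range(max_iters):
        # Each sub-model decodes with its own score plus/minus the dual terms:
        #   y_gen  maximizes  score_gen(y)  + sum of u[arc] over arcs in y
        #   y_disc maximizes  score_disc(y) - sum of u[arc] over arcs in y
        y_gen = parse_generative(u)
        y_disc = parse_discriminative(u)

        if y_gen == y_disc:          # the two parses agree: return the consensus
            return y_gen

        # Subgradient step on the dual: push the models toward agreement arc by arc.
        rate = step_size / (t + 1)
        for arc in u:
            u[arc] -= rate * (int(arc in y_gen) - int(arc in y_disc))

    return y_gen  # no full agreement reached; fall back to the generative parse

In an actual learning setup one would alternate such agreement-based decoding with parameter updates for the two models; the sketch above only covers the decoding-time agreement step.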
Anthology ID:
D17-1177
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1689–1694
URL:
https://aclanthology.org/D17-1177
DOI:
10.18653/v1/D17-1177
Cite (ACL):
Yong Jiang, Wenjuan Han, and Kewei Tu. 2017. Combining Generative and Discriminative Approaches to Unsupervised Dependency Parsing via Dual Decomposition. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1689–1694, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Combining Generative and Discriminative Approaches to Unsupervised Dependency Parsing via Dual Decomposition (Jiang et al., EMNLP 2017)
PDF:
https://preview.aclanthology.org/landing_page/D17-1177.pdf