Online Adaptor Grammars with Hybrid Inference

Ke Zhai, Jordan Boyd-Graber, Shay B. Cohen


Abstract
Adaptor grammars are a flexible, powerful formalism for defining nonparametric, unsupervised models of grammar productions. This flexibility comes at the cost of expensive inference. We address the difficulty of inference through an online algorithm which uses a hybrid of Markov chain Monte Carlo and variational inference. We show that this inference strategy improves scalability without sacrificing performance on unsupervised word segmentation and topic modeling tasks.
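The abstract names the ingredients of the inference strategy but not their arrangement. Purely as a hedged illustration (not the paper's algorithm or code), the sketch below shows the general shape of online hybrid inference on a toy unigram word-segmentation model: a Metropolis sampler handles the per-utterance latent structure, and a stochastic update with a decaying step size folds the sampled minibatch statistics into global pseudocounts. The model, constants, and function names here are all assumptions for illustration only.

```python
# Hypothetical sketch of online hybrid MCMC/stochastic-update inference on a toy
# unigram word-segmentation model. Not the authors' implementation.
import math
import random
from collections import defaultdict

ALPHA = 0.5              # symmetric smoothing over word types (assumed value)
TAU, KAPPA = 64.0, 0.7   # step-size schedule rho_t = (t + TAU) ** (-KAPPA) (assumed)


def seg_words(utterance, boundaries):
    """Split an utterance at the positions where boundaries[i] is True."""
    words, start = [], 0
    for i, is_boundary in enumerate(boundaries):
        if is_boundary:
            words.append(utterance[start:i + 1])
            start = i + 1
    words.append(utterance[start:])
    return words


def log_score(words, pseudocounts, total):
    """Log-probability of a segmentation under smoothed unigram word probabilities."""
    return sum(math.log((pseudocounts[w] + ALPHA) / (total + ALPHA * 1e4)) for w in words)


def mcmc_segment(utterance, pseudocounts, total, sweeps=10):
    """Local MCMC step: Metropolis sampling over word-boundary indicators."""
    bounds = [random.random() < 0.3 for _ in range(len(utterance) - 1)]
    current = log_score(seg_words(utterance, bounds), pseudocounts, total)
    for _ in range(sweeps):
        for i in range(len(bounds)):
            bounds[i] = not bounds[i]                       # propose flipping one boundary
            proposed = log_score(seg_words(utterance, bounds), pseudocounts, total)
            if proposed >= current or random.random() < math.exp(proposed - current):
                current = proposed                          # accept
            else:
                bounds[i] = not bounds[i]                   # reject: undo the flip
    return seg_words(utterance, bounds)


def online_hybrid_inference(corpus, batch_size=4, epochs=3):
    """Global step: stochastic updates of word pseudocounts from sampled minibatches."""
    pseudocounts, total, t = defaultdict(float), 0.0, 0
    n = len(corpus)
    for _ in range(epochs):
        random.shuffle(corpus)
        for start in range(0, n, batch_size):
            batch = corpus[start:start + batch_size]
            t += 1
            rho = (t + TAU) ** (-KAPPA)                     # decaying step size
            # Monte Carlo estimate of expected word counts from sampled segmentations.
            batch_counts = defaultdict(float)
            for utterance in batch:
                for word in mcmc_segment(utterance, pseudocounts, total):
                    batch_counts[word] += 1.0
            # Interpolate rescaled minibatch counts into the global statistics,
            # in the style of a stochastic natural-gradient update.
            scale = n / len(batch)
            for word in set(pseudocounts) | set(batch_counts):
                pseudocounts[word] = ((1 - rho) * pseudocounts[word]
                                      + rho * scale * batch_counts[word])
            total = sum(pseudocounts.values())
    return pseudocounts


if __name__ == "__main__":
    toy = ["thedog", "thecat", "adog", "acat", "thedogran", "thecatran"]
    counts = online_hybrid_inference(list(toy))
    for word, count in sorted(counts.items(), key=lambda kv: -kv[1])[:8]:
        print(word, round(count, 2))
```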
Anthology ID: Q14-1036
Volume: Transactions of the Association for Computational Linguistics, Volume 2
Year: 2014
Address: Cambridge, MA
Editors: Dekang Lin, Michael Collins, Lillian Lee
Venue: TACL
Publisher: MIT Press
Pages: 465–476
URL: https://aclanthology.org/Q14-1036
DOI: 10.1162/tacl_a_00196
Cite (ACL): Ke Zhai, Jordan Boyd-Graber, and Shay B. Cohen. 2014. Online Adaptor Grammars with Hybrid Inference. Transactions of the Association for Computational Linguistics, 2:465–476.
Cite (Informal): Online Adaptor Grammars with Hybrid Inference (Zhai et al., TACL 2014)
PDF: https://preview.aclanthology.org/dois-2013-emnlp/Q14-1036.pdf