cfg model 0.0031056
tsg model 0.002972485
grammar rules 0.00284198
generative model 0.002840837
refinement model 0.002751558
backoff model 0.002744254
hierarchical model 0.002726723
probabilistic model 0.002699046
bayesian model 0.002697061
product model 0.002601747
pyp model 0.002589331
model sig 0.002567077
erative model 0.002563558
unified model 0.002562144
bination model 0.002560189
tsg tree 0.002509905
large tree 0.002446331
parse tree 0.002377373
model 0.00229493
tree con 0.002294205
backoff tree 0.002281674
elementary tree 0.002280541
tree substitution 0.002254156
derivation tree 0.002251777
probabilistic tree 0.002236466
tree fragments 0.002180739
tree frag 0.002120341
cfg rules 0.00211776
dependency grammar 0.002105192
tary tree 0.002103615
tree substitu 0.00209796
grammar induction 0.002061261
cfg rule 0.00201764
tsg rules 0.001984645
substitution grammar 0.001956696
large rules 0.001921071
tsg rule 0.001884525
adaptor grammar 0.001812984
gle grammar 0.00181267
vised grammar 0.001802486
pcfg rules 0.001793438
syntactic parsing 0.001651296
training data 0.001574812
training set 0.001535948
grammar 0.00153489
tsg parsing 0.00150242
rule sizes 0.001482693
other models 0.001469008
data set 0.001465094
large training 0.001436814
tsg parser 0.001428445
nonterminal symbol 0.001424044
tsg models 0.001399558
dependency parsing 0.001395167
same set 0.001391235
training corpus 0.001361079
parse trees 0.001358101
large set 0.001327096
syntax parsing 0.001322845
syntax trees 0.001311058
rules 0.00130709
probability distribution 0.001301945
generative parser 0.001296797
symbol refinement 0.001264545
elementary trees 0.001261269
training method 0.001260142
same number 0.001253818
additional training 0.001252134
pcfg parser 0.001237238
small training 0.001233145
treebank data 0.001232472
chine translation 0.001232311
parsing task 0.001230428
ing data 0.001227523
full training 0.001227075
same parse 0.001223143
other parsers 0.001212205
tsg induction 0.001203926
root symbol 0.001184819
wsj parsing 0.001172698
backoff models 0.001171327
equivalent cfg 0.001167942
rate parsing 0.001166257
natural language 0.001165772
deterministic symbol 0.001163344
training sets 0.001158591
language modeling 0.001155567
syntactic pars 0.001155389
efficient training 0.001144522
baseline parser 0.001142275
backoff distribution 0.001138802
standard data 0.001137769
coarse symbol 0.001128733
latent symbol 0.001128479
lexical features 0.001123612
nonterminal nodes 0.001117448
symbol sub 0.00110676
symbol subcategory 0.001105636
nonterminal symbols 0.001104755
reranking parser 0.001103067
