term                      score
dependency model          0.00523806
language model            0.0040641
model probability         0.003789647
model training            0.003519856
trigram model             0.003502282
parsing model             0.003436356
model size                0.003433231
dependency language       0.00342998
baseline model            0.003419449
bigram model              0.003300391
initial model             0.003280682
Markov model              0.003265692
model parameters          0.003260022
gram model                0.003241145
language model            0.003235526
model the                 0.003223239
dependency structure      0.003220973
model weight              0.003219728
combined model            0.00321445
word probability          0.003214087
model estimation          0.003207056
elementary model          0.003201477
extended model            0.003201477
model actions             0.003201477
dependency probability    0.003155527
headword dependency       0.002982899
word dependencies         0.002954098
model                     0.00293609
word trigram              0.002926722
linguistic dependency     0.002912112
word sequence             0.002900069
syntactic dependency      0.002893928
function word             0.002853205
word level                0.002832393
function word             0.002829626
dependency relations      0.002808457
dependency parsing        0.002802236
new word                  0.002796614
dependency relation       0.002795424
distance word             0.002790286
Chinese word              0.002779754
new dependency            0.002738054
next word                 0.00273444
word bigram               0.002724831
same dependency           0.002720297
dependency set            0.002701294
word category             0.002698708
word segmentation         0.002688872
word string               0.0026872
related word              0.002667496
initial dependency        0.002646562
relevant word             0.002638711
content word              0.002636942
conventional word         0.002635121
word prediction           0.00263271
word ban                  0.00263077
statistical dependency    0.002625559
dependency parser         0.002623077
meaningful dependency     0.002617395
dependency dij            0.002614121
probable dependency       0.002604167
incremental dependency    0.002594182
dependency relationship   0.002577628
dependency structure      0.002576309
able dependency           0.002576294
pervised dependency       0.002573826
dependency cycle          0.002568439
dependency bigrams        0.00256726
dependency                0.00230197
language models           0.00221132
function words            0.001854245
language modeling         0.001808055
previous words            0.001795763
multiple words            0.00178566
trigram language          0.001694202
related words             0.001668536
preceding words           0.001662636
baseline language         0.001611369
training corpus           0.001596286
language input            0.001590918
headword probability      0.001534486
linguistic structure      0.001529145
syntactic structure       0.001510961
Japanese sentence         0.001465983
Asian language            0.001408252
newspaper corpus          0.001407367
text corpus               0.001405442
conventional language     0.001402601
English sentence          0.001391793
training data             0.001391018
test data                 0.001383163
same sentence             0.001361962
words                     0.00136157
parsing probability       0.001353823
lexical dependencies      0.001289292
particular sentence       0.00126004
headword trigram          0.001247121
conditional probability   0.001245947
bigram probability        0.001217858
English sentence          0.001210648
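The entries above read as word pairs ranked by a probability-like score. One plausible way such scores arise is as relative frequencies of adjacent word pairs (bigrams) in a tokenized corpus; the sketch below illustrates that reading under this assumption. The function name and the toy token stream are illustrative, not taken from the source.

```python
from collections import Counter

def bigram_scores(tokens):
    """Relative frequency of each adjacent word pair in a token stream.

    Returns a dict mapping "w1 w2" -> count(w1, w2) / total number of
    adjacent pairs, so the scores sum to 1 over all observed bigrams.
    """
    pairs = list(zip(tokens, tokens[1:]))
    total = len(pairs)
    counts = Counter(pairs)
    return {" ".join(pair): count / total for pair, count in counts.items()}

# Toy example: rank bigrams by score, highest first, like the table above.
tokens = "the model of the model".split()
scores = bigram_scores(tokens)
for term, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{term:<26}{score:.6f}")
```

With real data the scores would be computed over a full corpus; the printed two-column layout mirrors the term/score rows in the table.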
