bigram                    score
dependency translation    0.00413472
dependency language       0.00358872
language dependency       0.00358872
model model               0.00355052
dependency parsing        0.003533294
dependency tree           0.003439217
dependency structure      0.003431044
dependency phrase         0.003348093
phrase dependency         0.003348093
dependency features       0.003339816
dependency rules          0.003276544
target dependency         0.003242661
dependency structures     0.003235258
english dependency        0.003147397
phrasal dependency        0.003120195
work dependency           0.003100867
dependency models         0.003096125
additional dependency     0.003016178
dependency category       0.002987705
dependency decoder        0.002987339
dependency trees          0.002983222
dependency parse          0.002950353
complete dependency       0.002912901
dependency approaches     0.00289369
future dependency         0.002874665
dependency fea            0.002844183
language model            0.00282345
dependency lan            0.002821536
dependency arcs           0.002788204
divide dependency         0.002788204
reordering model          0.00278482
parsing model             0.002768024
dependency                0.00254053
translation system        0.002447289
translation rule          0.00238504
translation rules         0.002330204
translation figure        0.002262755
linear model              0.002213414
english translation       0.002201057
model probabilities       0.002135239
entropy model             0.002085293
uniform model             0.002064433
word alignment            0.002034964
root word                 0.001970248
translation quality       0.001965509
translation approaches    0.00194735
word penalty              0.001942692
source words              0.001931344
machine translation       0.001931209
local word                0.001896245
translation datasets      0.001853461
target words              0.001850651
translation scenario      0.001842393
phrase reordering         0.001817123
word selection            0.001785814
model                     0.00177526
english words             0.001755387
other words               0.001744992
language models           0.001603785
syntactic phrase          0.001598953
translation               0.00159419
source phrase             0.001590387
same feature              0.001559487
parsing algorithm         0.001552189
training data             0.001513348
target phrase             0.001509694
phrase target             0.001509694
chinese words             0.001491729
same training             0.001476484
training algorithm        0.001453443
system bleu               0.001432843
boundary words            0.001409709
consecutive words         0.001397807
dency language            0.00138268
local parsing             0.001355719
hierarchical structure    0.001338658
decoding algorithm        0.001323449
language mod              0.001302323
structure penalty         0.001299916
type feature              0.001294258
pendency parsing          0.001290726
mst parsing               0.001285486
same target               0.001284597
chart parsing             0.001279598
feature weights           0.001278839
parsing ambiguity         0.001270417
lexicalized reordering    0.001264802
erarchical reordering     0.001261645
following features        0.001259681
decoding time             0.001252082
feature templates         0.001248979
floating structure        0.001242078
parsing literature        0.00124163
parsing aggra             0.001240799
fixed structure           0.001235198
phrase pair               0.001233727
training example          0.001233635
dency tree                0.001233177
source phrases            0.001231823
feature conjunction       0.001226696
