translation model 0.00416549
target word 0.00394737
source word 0.003546895
decoding translation 0.003394072
translation decoding 0.003394072
language model 0.00328417
word order 0.00322065
translation decoder 0.003143243
english translation 0.003024199
phrase translation 0.002998195
translation evaluation 0.002990551
svo word 0.002915931
target language 0.00291332
word type 0.00289627
machine translation 0.002887554
word sequence 0.002884912
sov word 0.002854384
translation performance 0.00285284
word sequences 0.002840216
translation systems 0.002825804
candidate word 0.002821091
nal word 0.002803349
rent word 0.002800567
prevalent word 0.002800037
translation hypotheses 0.002771221
translation techniques 0.002746665
japanese translation 0.002741769
output translation 0.002726775
automatic translation 0.002677703
translation paradigm 0.002649877
translation engines 0.002647629
chine translation 0.002647628
source language 0.002512845
translation 0.00239933
model probability 0.002392022
target words 0.002365083
language models 0.00232408
target sentence 0.002276431
ibm model 0.002265124
model context 0.002259415
model weights 0.002237965
forward model 0.002211004
distortion model 0.002194434
same language 0.002157122
linear model 0.002131853
reverse model 0.002116825
guage model 0.002097936
length model 0.002095355
same target 0.002034422
target phrase 0.001994175
target languages 0.001976494
first target 0.001940009
language pair 0.001933339
target phrases 0.001905737
human language 0.001900546
language pairs 0.001893936
source sentence 0.001875956
language production 0.001783014
unigram language 0.001771113
ent language 0.001770363
abbreviation language 0.001768314
model 0.00176616
target sequence 0.001728162
partial target 0.001721575
target lan 0.001714403
different decoding 0.001635049
decoding table 0.001594214
source languages 0.001576019
language 0.00151801
source phrases 0.001505262
same data 0.001490929
forward decoding 0.001439586
decoding systems 0.001421216
smt decoding 0.001416064
decoding direction 0.001406987
target 0.00139531
decoding process 0.001391602
additional data 0.00138599
reverse decoding 0.001345407
few words 0.001332453
decoding experiments 0.001331939
decoding strategies 0.001317943
decoding strategy 0.001313611
same order 0.001307702
source side 0.001301093
bidirectional decoding 0.001286456
tional decoding 0.001281041
other decoder 0.001276842
decoding constraints 0.00127439
size bleu 0.001265316
component words 0.001260389
size figure 0.00125977
verse decoding 0.001254024
subsequent decoding 0.001249793
order languages 0.001249774
bleu score 0.001249131
decoding processes 0.001249058
decoding instances 0.001247655
decoding strat 0.001245912
directional decoding 0.001242729
