translation model 0.00568627
language model 0.00403387
translation probability 0.003756675
model representation 0.003445767
model errors 0.003432871
translation results 0.0034129
statistical model 0.003331375
model wfst 0.003319929
wfst model 0.003319929
machine translation 0.003307617
null model 0.003297392
model null 0.003297392
translation model 0.003274637
ibm model 0.003274044
model size 0.003264723
translation wfst 0.003263879
translation accuracy 0.003247255
fertility model 0.003217501
composition model 0.003205337
correct translation 0.003177538
translation process 0.003174094
translation experiment 0.003162565
translation experiments 0.00315007
expansion model 0.003143441
cascade model 0.00313664
statistical model 0.003131131
model qrg 0.003125936
translation probability 0.003086515
translation system 0.003085659
machine translation 0.003075677
correct translation 0.003074235
model 0.00287116
translation 0.00281511
english word 0.00280318
word alignment 0.002766468
possible word 0.002708599
word null 0.002552122
word alignments 0.002524738
japanese word 0.002508332
special word 0.002460309
english word 0.00240506
word trigram 0.002403739
source language 0.001972689
other words 0.001958724
language sentence 0.001918313
target language 0.001897455
continuous words 0.001638496
statistical language 0.001622925
null words 0.001619182
language modeling 0.001505993
trigram language 0.001440559
english sentence 0.001432893
bilingual corpus 0.001394878
english sentences 0.001312155
fertility probability 0.001287906
conditional probability 0.001245924
search method 0.001241965
wfst models 0.001234528
distortion probability 0.001225064
input sentence 0.001194932
words 0.00119295
ibm models 0.001188643
language 0.00116271
bilingual corpora 0.001155527
japanese sentence 0.001138045
sentence pairs 0.001120976
decoding time 0.00110353
search methods 0.001102933
training set 0.001102168
merging states 0.001075664
wfst states 0.001067373
expansion models 0.00105804
forward search 0.001052919
decoding algorithm 0.001049598
standard decoding 0.001043227
statistical models 0.001043197
search process 0.001035445
other submodels 0.00102896
same hypothesis 0.001028856
same decoder 0.001027408
merging method 0.001022564
output weights 0.00101186
beam search 0.000999008
output symbols 0.000986559
output transitions 0.00098445
optimization method 0.000984351
possible alignments 0.000981557
english vocabulary 0.000976153
search efficiency 0.000976126
decoding methods 0.000957047
statistical machine 0.000952722
search algorithms 0.000951543
possible paths 0.000945386
probability 0.000941565
output symbols 0.000941311
forward search 0.000935288
common search 0.000933849
english attempts 0.000933764
redundant states 0.000922979
bleu score 0.00092015
