translation model 0.00452518
translation table 0.003762065
new translation 0.003587191
translation domain 0.003503452
machine translation 0.003495182
translation lexicon 0.003475099
word translations 0.003424011
translation pairs 0.003423975
translation probabilities 0.00339277
several translation 0.003367671
full translation 0.003352498
large word 0.003337936
translation quality 0.003294002
translation systems 0.00328083
english word 0.003275516
specific translation 0.003243356
source word 0.003231727
estimate translation 0.003216962
candidate translation 0.003178528
inverse translation 0.003170482
translation systems 0.00316788
translation probability 0.003165042
translation option 0.003155919
french word 0.002980832
significant word 0.002960659
cipher word 0.002934857
word substitution 0.0029325
translation 0.0028896
word sequences 0.00287971
simple word 0.002867527
word type 0.002855108
plaintext word 0.002854922
word types 0.002850533
unique word 0.002828854
word penalty 0.002820617
frequency word 0.002789666
word substitution 0.002784533
thousands word 0.002784533
language model 0.00272258
training data 0.002607899
model training 0.002599369
parallel data 0.002383355
data test 0.002370406
test data 0.0023704060000000002
decipherment model 0.002335487
data set 0.002301573
model probability 0.002262221
monolingual data 0.002153793
original data 0.002009665
model parameters 0.002002393
reordering model 0.001998781
specific data 0.001997866
train data 0.001946135
testing data 0.001937715
channel model 0.001929082
tune data 0.001921828
data tune 0.001921828
model probability 0.001911022
training corpus 0.001863772
language models 0.001862188
source language 0.001800627
target language 0.001795681
sampling algorithm 0.001782589
parallel sampling 0.001750675
new sampling 0.001709021
decipherment training 0.001663696
phrase table 0.001660337
parallel corpus 0.001639228
model 0.00163558
language pairs 0.001621375
training set 0.001621252
previous sampling 0.001576544
decipherment table 0.001572372
new table 0.001570056
oov words 0.001566131
different news 0.001538794
sampling process 0.001530179
different set 0.001523799
tokens table 0.001488979
new phrase 0.001485463
different domain 0.001480188
decipherment algorithm 0.001471066
bigram language 0.001453606
gibbs sampling 0.001432753
unseen words 0.0014228
baseline system 0.001417968
english words 0.001417882
new approach 0.001414032
new decipherment 0.001397498
unknown words 0.001389809
trigram language 0.001387825
test set 0.001383759
new features 0.001381708
english tokens 0.00137393
other features 0.001373302
natural language 0.001369819
language processing 0.001365633
ngram language 0.001354891
