word translation 0.00452765
translation hypothesis 0.003702365
decoding model 0.00367389
translation consensus 0.003579716
translation results 0.003522901
translation decoders 0.003516728
machine translation 0.003508558
translation hypotheses 0.003465367
model training 0.00344663
translation systems 0.003446346
partial translation 0.003369591
translation task 0.003335232
additional translation 0.003326752
translation lists 0.003310273
translation length 0.003296815
translation outputs 0.003295047
translation quality 0.003292259
translation accuracy 0.003279759
long translation 0.003265625
translation tasks 0.003265586
translation con 0.003248465
translation lattice 0.003244993
chine translation 0.003237176
translation proba 0.003233063
translation consen 0.00323218
combination model 0.003160417
smt model 0.003157487
language model 0.003148715
baseline model 0.003105798
reordering model 0.003026107
model score 0.002989692
translation 0.00294763
model scores 0.002843325
member model 0.002832307
linear model 0.002830616
general model 0.002742905
model parameters 0.002734888
ranking model 0.002700073
model formulation 0.002667367
model irrele 0.002653808
training data 0.00260092
different feature 0.00238214
model 0.00236921
test data 0.002350378
decoding models 0.00234313
word corpus 0.002261615
data set 0.002180426
word alignment 0.002108627
smt decoding 0.002092957
bilingual data 0.00207365
baseline decoding 0.002041268
system combination 0.001983147
word level 0.00196529
development data 0.001930426
parallel data 0.001909475
data figure 0.001888173
word lattice 0.001877383
word align 0.001868518
symmetric word 0.001866066
source sentence 0.001862909
data sets 0.001858176
source words 0.001840778
combination models 0.001829657
decoding approach 0.001822694
language models 0.001817955
data sparsity 0.001808396
models baseline 0.001775038
baseline models 0.001775038
decoding procedure 0.00177251
new feature 0.001765028
feature function 0.001758985
different decoders 0.001753678
decoding process 0.001683125
collaborative decoding 0.001675491
different member 0.001647677
iterative decoding 0.001628644
bilingual training 0.00162757
decoding scheme 0.001624148
feature weight 0.001619881
tive decoding 0.001617661
mbr decoding 0.001610819
decoding incorpo 0.001591496
borative decoding 0.001590937
complete decoding 0.001589962
standalone decoding 0.001589962
clude decoding 0.001589962
corresponding feature 0.001584807
feature functions 0.001571315
length feature 0.001546745
training procedure 0.00154525
feature values 0.001517091
different settings 0.001510139
ment features 0.001508053
feature index 0.001507751
member models 0.001501547
models member 0.001501547
system com 0.001497175
system combi 0.0014954550000000001
system combina 0.00148693
different con 0.001485415
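The list above is a plain two-column "term weight" table sorted by descending weight. A minimal sketch for parsing and sanity-checking such a list (the parsing logic is an assumption based only on the format visible here; `parse_term_weights` is a hypothetical helper name):

```python
# Minimal sketch: parse a "term weight" list of the kind shown above
# into (term, weight) pairs and confirm it is sorted in descending
# weight order. Assumes each non-empty line is a multi-word term
# followed by one floating-point weight.

def parse_term_weights(text):
    """The last whitespace-separated token on a line is the weight;
    everything before it is the (possibly multi-word) term."""
    pairs = []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        *term_parts, weight = line.split()
        pairs.append((" ".join(term_parts), float(weight)))
    return pairs

# Small excerpt of the list above, used as sample input.
sample = """\
word translation 0.00452765
translation hypothesis 0.003702365
decoding model 0.00367389
"""

pairs = parse_term_weights(sample)
assert pairs[0] == ("word translation", 0.00452765)
# Weights should be non-increasing down the list.
assert all(a[1] >= b[1] for a, b in zip(pairs, pairs[1:]))
```

Taking the last token as the weight (rather than splitting on a fixed column) keeps the parse correct for terms of any length, including the single-word entries such as "model" and "translation".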
