Term                       Score
phrase translation         0.006308460
pruning phrase             0.004747270
translation model          0.004623230
phrase pair                0.004577990
target phrase              0.004401838
phrase pairs               0.004336948
other phrase               0.004181538
phrase table               0.004169537
source phrase              0.004153039
such phrase                0.004110845
phrase translations        0.004082149
many phrase                0.004061969
phrase extraction          0.004008041
phrase transla             0.003889409
long phrase                0.003871226
redundant phrase           0.003833991
spurious phrase            0.003831561
composite phrase           0.003827682
phrase tables              0.003821369
atomic phrase              0.003820043
next phrase                0.003814221
phrase accord              0.003811184
phrase reorder             0.003811184
phrase prediction          0.003811184
different translation      0.003657450
phrase                     0.003558830
translation models         0.003503876
translation features       0.003452297
same translation           0.003377128
translation table          0.003360337
translation probabilities  0.003219980
multiple translation       0.003214743
lexical translation        0.003177154
machine translation        0.003169145
translation task           0.003157938
translation order          0.003152525
translation events         0.003139073
translation decoder        0.003137836
full translation           0.003111934
translation quality        0.003099009
translation hypothesis     0.003068668
pruning model              0.003062040
overall translation        0.003048954
translation event          0.003048182
iwslt translation          0.003037994
translation systems        0.003036657
translation accuracy       0.003030709
accurate translation       0.003028781
dialog translation         0.003011003
translation entry          0.003007649
translation hypoth         0.003006528
chine translation          0.003005461
translation hypotheses     0.003004943
elementary translation     0.003002455
monotonous translation     0.003002455
model probability          0.002757942
translation                0.002749630
language model             0.002710401
entropy model              0.002387840
pruning pruning            0.002376880
model weights              0.002368095
europarl model             0.002348843
model weight               0.002287058
ibm model                  0.002274602
reordering model           0.002262236
model size                 0.002246131
target phrases             0.002237878
hmm model                  0.002212736
ing model                  0.002210391
source word                0.002205239
multinomial model          0.002195569
uniform model              0.002189341
word alignments            0.002186069
phrases pairs              0.002172988
iwslt model                0.002161964
unpruned model             0.002156851
other phrases              0.002017578
model                      0.001873600
glish word                 0.001865438
language pair              0.001855961
table pruning              0.001799147
pruning threshold          0.001766617
pruning algorithm          0.001756531
sentence pair              0.001743964
entropy pruning            0.001702680
training sentences         0.001699982
target language            0.001679809
composite phrases          0.001663722
atomic phrases             0.001656083
multiple pruning           0.001653553
language pairs             0.001614919
language models            0.001591047
target sentence            0.001567812
pruning step               0.001547870
our pruning                0.001535984
training corpora           0.001516770
significance pruning       0.001514100
same probability           0.001511840
sentence pairs             0.001502922
undesirable pruning        0.001481898
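The listing above does not state how its scores were computed. As a purely illustrative sketch (not the method used here), the snippet below shows one common way such a ranked term list can be produced: counting adjacent word pairs in a tokenized text and scoring each by its relative frequency. The function name and the sample text are invented for the example.

from collections import Counter

def ranked_bigrams(tokens, top_n=100):
    """Count adjacent word pairs and score each by relative frequency."""
    counts = Counter(zip(tokens, tokens[1:]))
    total = sum(counts.values())
    return [(" ".join(pair), c / total) for pair, c in counts.most_common(top_n)]

# Hypothetical usage: `tokens` would be the tokenized text under analysis.
tokens = ("the phrase table stores phrase pairs and phrase table pruning "
          "removes redundant phrase pairs").split()
for term, score in ranked_bigrams(tokens, top_n=5):
    print(f"{term:<27}{score:.9f}")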
