term                        score
phrase translation          0.007066
phrase table                0.005227
phrase pairs                0.005135
different phrase            0.005079
other phrase                0.005009
phrase pair                 0.004993
phrase length               0.004917
large phrase                0.004842
target phrase               0.004839
source phrase               0.004829
phrase models               0.004822
same phrase                 0.004813
phrase extraction           0.004764
common phrase               0.004710
french phrase               0.004690
single phrase               0.004667
simple phrase               0.004647
good phrase                 0.004642
phrase tables               0.004637
original phrase             0.004615
relative phrase             0.004598
full phrase                 0.004590
phrase compositionality     0.004579
wmt phrase                  0.004577
phrase penalty              0.004565
maximum phrase              0.004559
whole phrase                0.004553
sound phrase                0.004551
phrase lengths              0.004540
conclusions phrase          0.004533
phrase extrac               0.004531
normal phrase               0.004528
entire phrase               0.004528
solute phrase               0.004526
required phrase             0.004526
translation model           0.004283
phrase                      0.004228
translation system          0.003656
translation models          0.003432
same translation            0.003423
translation results         0.003414
translation quality         0.003333
machine translation         0.003329
translation systems         0.003293
translation task            0.003262
original translation        0.003226
translation cost            0.003202
translation directions      0.003188
yield translation           0.003162
shared translation          0.003152
language phrases            0.003142
translation direc           0.003137
model pruning               0.003044
other phrases               0.002841
translation                 0.002838
target phrases              0.002672
source phrases              0.002662
many phrases                0.002616
table pruning               0.002599
pruning method              0.002543
language model              0.002528
example phrases             0.002516
long phrases                0.002453
different pruning           0.002450
system pruning              0.002417
alignment model             0.002410
threshold pruning           0.002406
pruning threshold           0.002406
redundant phrases           0.002388
short phrases               0.002386
other pruning               0.002380
language word               0.002378
lion phrases                0.002360
unreliable phrases          0.002360
singleton phrases           0.002360
word alignment              0.002260
entropy pruning             0.002206
model distribution          0.002194
significance pruning        0.002167
model size                  0.002136
several pruning             0.002096
pruning experiments         0.002068
phrases                     0.002060
pruning methods             0.002037
basic pruning               0.002032
language pairs              0.001990
training data               0.001984
novel pruning               0.001984
various pruning             0.001974
pruning criterion           0.001971
relative pruning            0.001970
reordering model            0.001964
aggressive pruning          0.001947
pruning criteria            0.001945
model quality               0.001940
absolute pruning            0.001937
pruning thresholds          0.001933
pruning meth                0.001931
model score                 0.001925
pruning techniques          0.001917
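The list above reads like a ranking of n-gram terms by normalized frequency, with trailing floating-point noise from summing raw floats. A minimal sketch of how such a ranked bigram list might be produced is below; the tokenization, the relative-frequency scoring, and the `bigram_scores` helper are all assumptions, not the original scoring method.

```python
from collections import Counter


def bigram_scores(tokens, round_to=6):
    """Rank bigrams by relative frequency.

    Hypothetical scoring: each bigram's count divided by the total
    number of bigrams, rounded to avoid float-noise in the output.
    The original list's actual scoring method is unknown.
    """
    bigrams = Counter(zip(tokens, tokens[1:]))
    total = sum(bigrams.values())
    return sorted(
        ((" ".join(bg), round(n / total, round_to)) for bg, n in bigrams.items()),
        key=lambda kv: -kv[1],
    )


# Toy token stream for illustration only:
tokens = "phrase table pruning removes phrase pairs from the phrase table".split()
for term, score in bigram_scores(tokens)[:3]:
    print(term, score)
```

Rounding at output time (rather than storing raw quotients) is what avoids artifacts like `0.0052273829999999995` in the printed list.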
