translation model 0.00404099
target word 0.00331001
target pos 0.00330013
model probability 0.003021484
pos tag 0.002999466
source word 0.002993562
pos source 0.002983682
pos system 0.00289303
pos information 0.002863489
first word 0.002808045
pos function 0.00280561
language model 0.002761054
word alignment 0.00272783
pos tags 0.002682107
additional model 0.00266978
original word 0.002668122
bilm pos 0.002646392
simple model 0.002622934
bilm model 0.002607142
reordering model 0.002591474
model reordering 0.002591474
word alwzyr 0.002577067
ter pos 0.002568858
pos label 0.002568327
pos labeling 0.002543822
empty word 0.002524375
pos labels 0.00251178
word orders 0.002509734
translation system 0.00250897
own pos 0.002494194
general model 0.002491605
model sparsity 0.002490713
accurate model 0.002484575
model variations 0.002463251
english translation 0.002309942
translation experiments 0.002263104
arabic translation 0.002243974
machine translation 0.00223729
model 0.0021929
translation units 0.002176134
partial translation 0.002171731
translation events 0.002165872
minimal translation 0.002165415
translation hypothesis 0.002153933
translation event 0.002149602
elementary translation 0.002137942
translation quality 0.002116348
translation perfor 0.002114281
chine translation 0.00211245
translation correspondence 0.002111422
target words 0.002062442
different target 0.002029046
translation 0.00184809
source words 0.001745994
dependency tree 0.001745489
target information 0.001699319
training data 0.001648445
target function 0.00164144
target language 0.001636134
training training 0.001599022
source dependency 0.001592648
other words 0.001512294
simple target 0.001498014
target sentences 0.001475319
dependency information 0.001472455
information dependency 0.001472455
target string 0.001439919
bilingual models 0.001428078
source syntactic 0.001426121
target sen 0.001415508
target functions 0.001408644
arabic words 0.001390346
language models 0.00138532
corresponding target 0.001384985
source sentence 0.001377847
target side 0.001354084
source function 0.001324992
parse tree 0.001314121
syntactic information 0.001305928
bleu results 0.001303636
source contextual 0.001294638
dependency parser 0.001289513
ing source 0.001264249
dependency trees 0.001251364
dependency parse 0.001250864
different combinations 0.001250091
different samples 0.001239169
parallel data 0.001236334
pbsmt features 0.001235373
bilm models 0.001231408
feature function 0.001230634
different variants 0.001227062
different represen 0.001226293
different aspects 0.001226293
different segmentations 0.001226293
bilingual language 0.001179066
dependency grammar 0.001176733
contextual information 0.001174445
probability parameters 0.001170867
lexical information 0.00116456
