term score
language model 0.0037361
translation model 0.00368409
sentence model 0.003457046
context model 0.003421524
model feature 0.00312064
generation model 0.003096091
ibm model 0.002989188
trigram model 0.002879355
model surface 0.00287515
guage model 0.002846017
smt model 0.002841076
model 0.00260031
language models 0.00214552
translation models 0.00209351
word association 0.002016307
word frequency 0.001994257
translation work 0.00159919
network language 0.001547372
translation features 0.001524056
machine translation 0.001497276
phrase translation 0.001484408
target language 0.00147929
language modeling 0.001474699
related language 0.001424788
different models 0.001421469
language mod 0.001393708
continuous translation 0.001384323
models content 0.001370196
translation mod 0.001341698
models perplexity 0.001337743
ing training 0.001336024
context vector 0.001333696
evaluation evaluation 0.001321782
same context 0.001309815
generator models 0.0012782
models fluency 0.001259871
recurrent context 0.001258146
chinese poems 0.001257182
context lines 0.001251854
context vectors 0.001246922
probability distribution 0.001223773
other generation 0.001207695
chinese poem 0.001194316
line poems 0.00116412
bleu evaluation 0.001139635
language 0.00113579
poem corpus 0.001126278
single context 0.001121782
sentence rep 0.001103045
poem line 0.001101254
preceding sentence 0.001096937
convolutional sentence 0.001096039
interactive context 0.001093245
translation 0.00108378
first layer 0.00107739
rent context 0.001074038
generation system 0.001070536
poem generation 0.001069549
poems lines 0.001067274
design data 0.001063296
mert training 0.001060777
training procedure 0.001056926
eling context 0.001055403
first line 0.001054908
data sets 0.001044769
human evaluation 0.00103059
line generation 0.001023267
previous layer 0.001017283
qtest data 0.001015415
data sparsity 0.001015415
input layer 0.001012761
models 0.00100973
other systems 0.001008781
output layer 0.001005367
poem lines 0.001004408
previous line 0.000994801
perplexity evaluation 0.000988904
previous work 0.000982725
ing vectors 0.00098054
evaluation methods 0.00096237
example poems 0.000962329
such vectors 0.000961077
poems section 0.000959977
lines line 0.000958126
other half 0.000955621
manual evaluation 0.000952901
other sys 0.000952747
other layers 0.000952617
chinese poetry 0.000949665
several vectors 0.00094965
similar results 0.000945096
style evaluation 0.000942555
recurrent generation 0.000932713
different character 0.000932643
evaluation study 0.000932345
generation task 0.000930928
candidate poems 0.000923901
character embeddings 0.000923223
chinese quatrains 0.000915152
meaningful poems 0.000909945
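A list in this shape can be loaded programmatically. The sketch below is a minimal, self-contained example (not part of the original data): it assumes each non-empty line is a term (which may itself contain spaces) followed by one numeric score as the last whitespace-separated field, and that scientific notation such as `9.94801E-4` may appear alongside plain decimals.

```python
def parse_term_scores(lines):
    """Parse lines of the form '<term words...> <score>' into (term, score) pairs.

    The term may contain spaces, so the score is taken to be the last
    whitespace-separated field on each line; blank lines are skipped.
    """
    pairs = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        *term_parts, score = line.split()
        pairs.append((" ".join(term_parts), float(score)))
    return pairs


# Sample lines taken from the list above; float() accepts both notations.
sample = [
    "language model 0.0037361",
    "machine translation 0.001497276",
    "previous line 9.94801E-4",
]
pairs = parse_term_scores(sample)
# The original list is ranked by descending score; re-sorting preserves that order.
pairs.sort(key=lambda p: p[1], reverse=True)
```

Sorting by the parsed score rather than trusting input order makes the snippet robust to concatenated or shuffled term lists.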
