word error 0.00295736
word accuracy 0.00278996
syllabification word 0.002764347
word syllabification 0.002764347
orthographic word 0.00273149
word forms 0.002618986
average word 0.002533061
syllable word 0.00251852
input word 0.002464019
word entries 0.00244001
word accuracies 0.002434061
word stealth 0.002420451
unfamiliar word 0.002420451
word accuracy 0.002420451
word editors 0.002420451
test words 0.002184488
tag set 0.002154394
morphological features 0.002073531
many words 0.002041166
feature set 0.002035934
tag sequence 0.002019206
training data 0.002013764
certain words 0.00195698
dutch words 0.00195181
training set 0.001933854
syllabification model 0.001922277
correct tag 0.001912206
component words 0.00183309
compound words 0.001827726
possible tag 0.001827609
syllabified words 0.00182256
unsyllabified words 0.00181795
ing tag 0.00181303
emission features 0.00178629
training accuracy 0.00173179
tag pairs 0.00173178
structural tag 0.00170671
markov model 0.001699617
tag sets 0.001698315
complete tag 0.001694008
single tag 0.001690625
onc tag 0.001674859
model output 0.001671597
large training 0.001669465
feature vector 0.001661166
tion features 0.001659473
unigram features 0.001649524
tag sequences 0.00164894
tag scheme 0.001640771
model parameters 0.001633518
emission feature 0.00160876
tag schemes 0.00159526
model effects 0.001587338
words 0.00157722
feature representation 0.001528289
test data 0.001499052
training method 0.001493745
feature representation 0.001492934
feature representation 0.001484962
training sets 0.001477775
syllabification data 0.001475981
svm training 0.001474222
arbitrary feature 0.001469207
next training 0.00144679
different dictionary 0.001446056
training example 0.001441672
large data 0.001439269
same data 0.001437942
labeled training 0.001437864
celex training 0.001433218
structural tags 0.00142252
different languages 0.001420059
test set 0.001419142
training times 0.001415323
tag sets 0.001414125
nettalk training 0.001412187
discriminative training 0.001412178
svm tags 0.001410572
features 0.00140159
training examples 0.001399971
syllabification system 0.001392717
training time 0.001390944
onc tags 0.001390669
training instances 0.001388688
other languages 0.001376282
data results 0.001373261
training points 0.001368333
syllabified training 0.00136732
same set 0.001358032
incorrect tags 0.001345154
model 0.00133808
positional tags 0.001314187
potential tags 0.001313052
different approaches 0.001264065
baseline system 0.001254572
letter domain 0.001254454
correct sequence 0.001246372
feature 0.00122406
error analysis 0.001214367
tagging problem 0.001209111
