semantic word 0.002847570
product feature 0.002447985
product features 0.002339835
learning word 0.002109707
feature mining 0.002072980
feature seeds 0.002072262
word representation 0.002045208
feature list 0.002030000
feature candidates 0.002016760
feature seed 0.002013708
global word 0.002000175
feature candidate 0.001986187
word embedding 0.001985064
movie feature 0.001979322
feature extraction 0.001972364
train word 0.001971942
word screen 0.001954682
stanford word 0.001936154
uct feature 0.001928432
feature candi 0.001921014
word segmenter 0.001915511
word embeddings 0.001915511
surrounding word 0.001915511
word flaws 0.001915511
word segmenta 0.001915511
infrequent features 0.001859952
frequent features 0.001835276
uct features 0.001820282
discrete features 0.001815244
lexical semantic 0.001743051
feature 0.001679690
neural model 0.001633047
contextual semantic 0.001615531
representation model 0.001573138
features 0.001571540
semantic representation 0.001545498
semantic distance 0.001531924
semantic clue 0.001477757
semantic transition 0.001476929
semantic similarity 0.001475141
semantic clues 0.001458661
surfer model 0.001447579
training set 0.001418141
semantic similarities 0.001415003
semantic relations 0.001415003
training words 0.001358674
training vector 0.001327891
target product 0.001273819
important product 0.001221660
model 0.001201570
ing product 0.001199086
bootstrapping algorithm 0.001185204
syntactic pattern 0.001179040
label set 0.001178603
propagation algorithm 0.001176371
semantic 0.001173930
training window 0.001168648
mining product 0.001161585
product fea 0.001160581
local information 0.001158666
training step 0.001140791
hits algorithm 0.001139902
data sparsity 0.001138632
training let 0.001136640
syntactic rules 0.001135011
tion algorithm 0.001128764
additional training 0.001126930
training criterion 0.001123941
backpropagation algorithm 0.001117195
adsorption algorithm 0.001117195
pos tag 0.001110590
parameter matrixes 0.001106806
parameter study 0.001098255
training objective 0.001095373
mining method 0.001095111
annotated product 0.001092812
output product 0.001090143
log probability 0.001079293
novel product 0.001074672
new product 0.001071209
level information 0.001068025
syntactic patterns 0.001067320
syntactic pat 0.001062737
infrequent product 0.001056707
popular product 0.001035695
reliable product 0.001034240
product name 0.001032905
frequent product 0.001032031
target label 0.001029056
syntactic structures 0.001017475
bootstrapping method 0.001012802
syntactic constituents 0.001012409
mines product 0.001011495
phrasal product 0.001009318
syntactic templates 0.001006935
seed set 0.000989089
mutual information 0.000986145
key words 0.000979121
convolutional method 0.000976787
example set 0.000968723
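The list above can be loaded and re-emitted programmatically. A minimal sketch, assuming the ranked list is stored as plain "term score" lines (one pair per line) where the raw scores may mix decimal and scientific notation; the function names here are illustrative, not from any specific library:

```python
# Sketch: parse "term score" lines and re-emit scores in uniform
# fixed-point notation, removing float round-off noise and E-notation.

def parse_term_scores(text):
    """Return a list of (term, score) pairs from 'term score' lines.

    The score is taken as the last whitespace-separated token, so
    multi-word terms are handled correctly.
    """
    pairs = []
    for line in text.strip().splitlines():
        term, score = line.rsplit(None, 1)
        pairs.append((term, float(score)))
    return pairs


def format_term_scores(pairs, precision=9):
    """Re-emit (term, score) pairs with scores at a fixed precision."""
    return "\n".join(f"{term} {score:.{precision}f}" for term, score in pairs)
```

For example, `parse_term_scores("seed set 9.89089E-4")` yields `[("seed set", 0.000989089)]`, and `format_term_scores` prints that score as `0.000989089` rather than in scientific notation.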
