From Random to Supervised: A Novel Dropout Mechanism Integrated with Global Information

Hengru Xu, Shen Li, Renfen Hu, Si Li, Sheng Gao


Abstract
Dropout is used to avoid overfitting by randomly dropping units from the neural network during training. Inspired by dropout, this paper presents GI-Dropout, a novel dropout method that integrates global information to improve neural networks for text classification. Unlike traditional dropout, in which units are dropped randomly with the same probability, we aim to use explicit instructions based on global information about the dataset to guide the training process. With GI-Dropout, the model is encouraged to pay more attention to inapparent features or patterns. Experiments demonstrate the effectiveness of dropout with global information on seven text classification tasks, including sentiment analysis and topic classification.
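The core idea, as the abstract describes it, is to replace the single uniform drop probability of standard dropout with per-unit probabilities derived from a global statistic of the dataset, so that salient features are suppressed more often and the model must learn from less apparent ones. The sketch below illustrates this under assumptions not stated here: the importance scores, the `p_min`/`p_max` range, and the linear mapping from score to drop probability are all hypothetical choices for illustration, not the paper's exact formulation.

```python
import numpy as np

def gi_dropout_mask(importances, p_min=0.1, p_max=0.5, rng=None):
    """Build an inverted-dropout mask whose per-unit drop probability
    grows with a (hypothetical) global importance score.

    Units scored as globally salient are dropped MORE often, pushing
    the model to attend to inapparent features; the least salient
    units keep a low base drop rate of p_min.
    """
    rng = np.random.default_rng() if rng is None else rng
    imp = np.asarray(importances, dtype=float)
    # Normalize importance scores to [0, 1].
    span = imp.max() - imp.min()
    norm = (imp - imp.min()) / span if span > 0 else np.zeros_like(imp)
    # Linearly map normalized importance to a drop probability.
    p_drop = p_min + (p_max - p_min) * norm
    # Sample the keep/drop decision per unit.
    keep = rng.random(imp.shape) >= p_drop
    # Inverted-dropout scaling keeps expected activations unchanged.
    return keep / (1.0 - p_drop)

# Example: three word embeddings with assumed global importance scores;
# the mask would multiply the corresponding embedding rows at train time.
mask = gi_dropout_mask([0.2, 0.9, 0.5], rng=np.random.default_rng(0))
```

At inference time no mask is applied, as in standard inverted dropout; only the training-time sampling differs from the uniform case.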
Anthology ID:
K18-1055
Volume:
Proceedings of the 22nd Conference on Computational Natural Language Learning
Month:
October
Year:
2018
Address:
Brussels, Belgium
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
573–582
URL:
https://aclanthology.org/K18-1055
DOI:
10.18653/v1/K18-1055
Cite (ACL):
Hengru Xu, Shen Li, Renfen Hu, Si Li, and Sheng Gao. 2018. From Random to Supervised: A Novel Dropout Mechanism Integrated with Global Information. In Proceedings of the 22nd Conference on Computational Natural Language Learning, pages 573–582, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
From Random to Supervised: A Novel Dropout Mechanism Integrated with Global Information (Xu et al., CoNLL 2018)
PDF:
https://preview.aclanthology.org/remove-xml-comments/K18-1055.pdf
Code
 xusong19960424/global_cnn
Data
MPQA Opinion Corpus, SST