Yangyang Lu
2016
Improved relation classification by deep recurrent neural networks with data augmentation
Yan Xu | Ran Jia | Lili Mou | Ge Li | Yunchuan Chen | Yangyang Lu | Zhi Jin
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Neural networks now play an important role in the task of relation classification. By designing different neural architectures, researchers have improved performance substantially compared with traditional methods. However, existing neural networks for relation classification usually have shallow architectures (e.g., one-layer convolutional neural networks or recurrent networks) and may fail to explore the potential representation space at different abstraction levels. In this paper, we propose deep recurrent neural networks (DRNNs) for relation classification to tackle this challenge. Further, we propose a data augmentation method that leverages the directionality of relations. We evaluate our DRNNs on SemEval-2010 Task 8 and achieve an F1-score of 86.1%, outperforming previously reported state-of-the-art results.
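The abstract only sketches the augmentation idea. Below is a minimal, hypothetical Python illustration (not the authors' code) of exploiting relation directionality to enlarge the training set, assuming SemEval-2010 Task 8 style directed labels such as Cause-Effect(e1,e2): each directed example also yields an example for the inverse direction when the token path between the entities is read in reverse. The function name and data layout are assumptions made for illustration.

```python
from typing import List, Tuple

def augment_by_directionality(
    examples: List[Tuple[List[str], str]]
) -> List[Tuple[List[str], str]]:
    """Add, for each directed example, a copy with the token path
    reversed and the relation direction flipped (hypothetical sketch)."""
    augmented = list(examples)
    for tokens, label in examples:
        if label.endswith("(e1,e2)"):
            flipped = label.replace("(e1,e2)", "(e2,e1)")
        elif label.endswith("(e2,e1)"):
            flipped = label.replace("(e2,e1)", "(e1,e2)")
        else:
            # Undirected classes (e.g., "Other") are left unaugmented.
            continue
        augmented.append((list(reversed(tokens)), flipped))
    return augmented

# Example with a SemEval-2010 Task 8 style label:
data = [(["burst", "caused", "by", "pressure"], "Cause-Effect(e2,e1)")]
print(augment_by_directionality(data))
```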
2015
A Comparative Study on Regularization Strategies for Embedding-based Neural Networks
Hao Peng | Lili Mou | Ge Li | Yunchuan Chen | Yangyang Lu | Zhi Jin
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing