Yanyao Shen
2018
Dense Information Flow for Neural Machine Translation
Yanyao Shen | Xu Tan | Di He | Tao Qin | Tie-Yan Liu
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Recently, neural machine translation has achieved remarkable progress by introducing well-designed deep neural networks into its encoder-decoder framework. From the optimization perspective, most of these deep architectures adopt residual connections to improve learning in both the encoder and the decoder, and apply advanced attention connections as well. Inspired by the success of the DenseNet model on computer vision problems, in this paper we propose a densely connected NMT architecture (DenseNMT) that trains more efficiently. DenseNMT not only uses dense connections when creating new features in both the encoder and the decoder, but also adopts a dense attention structure to improve attention quality. Our experiments on multiple datasets show that the DenseNMT structure is more competitive and efficient.
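The dense-connection idea the abstract describes can be made concrete with a small sketch. Below is a minimal PyTorch-style illustration of DenseNet-style feature creation for an encoder, where each layer consumes the concatenation of all earlier layers' outputs instead of a residual sum. The `DenseEncoder` class, its dimensions, and the use of plain linear layers are illustrative assumptions, not the paper's actual encoder blocks or its dense attention structure.

```python
import torch
import torch.nn as nn

class DenseEncoder(nn.Module):
    """Sketch of DenseNet-style feature creation for an NMT encoder:
    each layer sees the concatenation of all previous layers' outputs,
    so information flows densely rather than through residual sums."""

    def __init__(self, embed_dim, growth_dim, num_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        in_dim = embed_dim
        for _ in range(num_layers):
            # Each layer maps all features produced so far to a small
            # set of new features (the "growth" of the dense block).
            self.layers.append(
                nn.Sequential(nn.Linear(in_dim, growth_dim), nn.ReLU())
            )
            in_dim += growth_dim  # the next layer also sees this output

    def forward(self, x):
        # x: (batch, seq_len, embed_dim)
        features = [x]
        for layer in self.layers:
            new = layer(torch.cat(features, dim=-1))
            features.append(new)  # keep every layer's output alive
        return torch.cat(features, dim=-1)

enc = DenseEncoder(embed_dim=256, growth_dim=64, num_layers=4)
out = enc(torch.randn(2, 10, 256))  # -> shape (2, 10, 256 + 4 * 64)
```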
2017
Deep Active Learning for Named Entity Recognition
Yanyao Shen | Hyokun Yun | Zachary Lipton | Yakov Kronrod | Animashree Anandkumar
Proceedings of the 2nd Workshop on Representation Learning for NLP
Deep neural networks have advanced the state of the art in named entity recognition. However, under typical training procedures, advantages over classical methods emerge only with large datasets. As a result, deep learning is employed only when large public datasets or a large budget for manually labeling data is available. In this work, we show otherwise: by combining deep learning with active learning, we can outperform classical methods even with a significantly smaller amount of training data.
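A pool-based active-learning round of the kind the abstract alludes to can be sketched as follows. The least-confidence criterion, the `model_confidence` callable, and the id-based pool interface are assumptions for illustration, not the paper's exact acquisition function; in a real system the confidence would come from the NER tagger's predictions.

```python
import numpy as np

def uncertainty_sampling_round(model_confidence, unlabeled_ids, budget):
    """One round of pool-based active learning via least-confidence
    sampling: query the sentences the model is least sure about."""
    scores = np.array([model_confidence(i) for i in unlabeled_ids])
    pick = np.argsort(scores)[:budget]  # lowest confidence first
    return [unlabeled_ids[j] for j in pick]

# Toy usage: confidence is random here; in practice it is computed
# from the tagger's output, and the queried ids go to an annotator,
# after which the model is retrained on the enlarged labeled set.
rng = np.random.default_rng(0)
pool = list(range(100))
queried = uncertainty_sampling_round(lambda i: rng.random(), pool, budget=5)
print(queried)
```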