Abstract
Residual connections have been widely applied to build deep neural networks with enhanced feature propagation and improved accuracy. In the literature, multiple variants of the residual structure have been proposed. However, most of them are manually designed for particular tasks and datasets, and the combination of existing residual structures has not been well studied. In this work, we propose the Self-Adaptive Scaling (SAS) approach, which automatically learns the design of the residual structure from data. The proposed approach draws on the strengths of various residual structures, resulting in a general architecture that covers several existing ones. In this manner, we construct a learnable residual structure that can be easily integrated into a wide range of residual-based models. We evaluate our approach on tasks spanning different modalities, including machine translation (IWSLT-2015 EN-VI and WMT-2014 EN-DE, EN-FR), image classification (CIFAR-10 and CIFAR-100), and image captioning (MSCOCO). Empirical results show that the proposed approach consistently improves residual-based models and exhibits desirable generalization ability. In particular, by incorporating the proposed approach into the Transformer model, we establish a new state of the art on the IWSLT-2015 EN-VI low-resource machine translation dataset.
- Anthology ID:
- K19-1080
- Volume:
- Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
- Month:
- November
- Year:
- 2019
- Address:
- Hong Kong, China
- Editors:
- Mohit Bansal, Aline Villavicencio
- Venue:
- CoNLL
- SIG:
- SIGNLL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 862–870
- URL:
- https://preview.aclanthology.org/build-pipeline-with-new-library/K19-1080/
- DOI:
- 10.18653/v1/K19-1080
- Cite (ACL):
- Fenglin Liu, Meng Gao, Yuanxin Liu, and Kai Lei. 2019. Self-Adaptive Scaling for Learnable Residual Structure. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 862–870, Hong Kong, China. Association for Computational Linguistics.
- Cite (Informal):
- Self-Adaptive Scaling for Learnable Residual Structure (Liu et al., CoNLL 2019)
- PDF:
- https://preview.aclanthology.org/build-pipeline-with-new-library/K19-1080.pdf
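The learnable residual structure described in the abstract can be illustrated with a minimal sketch: a residual connection whose mixing weight is predicted from the input itself rather than fixed by hand, so the model can interpolate between existing residual variants. The gate parameterization below (a per-position sigmoid gate `g(x)` blending the identity and sublayer branches) and all names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def adaptive_residual(x, sublayer, w_g, b_g):
    """Residual connection with an input-conditioned (self-adaptive) scale.

    Computes y = g * sublayer(x) + (1 - g) * x, where the gate g is
    predicted from x. Driving g toward 1 recovers a plain feed-forward
    layer, g toward 0 recovers the identity, and intermediate values
    resemble fixed-scale residual variants; learning the gate parameters
    lets the model choose the mixture from data.
    """
    g = sigmoid(x @ w_g + b_g)          # per-position gate in (0, 1)
    return g * sublayer(x) + (1.0 - g) * x

# toy usage: a linear "sublayer" on a batch of 4 vectors of width 8
rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal((4, d))
w = rng.standard_normal((d, d)) * 0.1
w_g = rng.standard_normal((d, 1)) * 0.1   # gate parameters (trained in practice)
b_g = np.zeros(1)

y = adaptive_residual(x, lambda h: h @ w, w_g, b_g)
assert y.shape == x.shape
```

In a full model, `w_g` and `b_g` would be trained jointly with the rest of the network, one gate per residual connection, so each layer learns its own effective residual design.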