Andreas Nawroth


2020

An Evaluation of Progressive Neural Networks for Transfer Learning in Natural Language Processing
Abdul Moeed | Gerhard Hagerer | Sumit Dugar | Sarthak Gupta | Mainak Ghosh | Hannah Danner | Oliver Mitevski | Andreas Nawroth | Georg Groh
Proceedings of the Twelfth Language Resources and Evaluation Conference

A major challenge in modern neural networks is the utilization of previous knowledge for new tasks in an effective manner, otherwise known as transfer learning. Fine-tuning, the most widely used method for achieving this, suffers from catastrophic forgetting. The problem is often exacerbated in natural language processing (NLP). In this work, we assess progressive neural networks (PNNs) as an alternative to fine-tuning. The evaluation is based on common NLP tasks such as sequence labeling and text classification. By gauging PNNs across a range of architectures, datasets, and tasks, we observe improvements over the baselines throughout all experiments.
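To illustrate the architecture the abstract refers to, below is a minimal, hypothetical PyTorch sketch of a progressive neural network column with lateral connections from frozen, previously trained columns (following Rusu et al., 2016). The layer sizes, the MLP structure, and the classification heads are illustrative assumptions and do not reflect the paper's exact models or hyperparameters.

```python
import torch
import torch.nn as nn

class ProgressiveColumn(nn.Module):
    """One column of a progressive neural network (minimal sketch).

    A new column is added per task; earlier columns stay frozen, and the
    new column receives lateral connections from their hidden activations,
    which is what lets the model reuse prior knowledge without the
    catastrophic forgetting that plain fine-tuning can cause.
    """

    def __init__(self, in_dim, hidden_dim, out_dim, prev_columns=()):
        super().__init__()
        # Kept in a plain list so frozen parameters are not registered
        # (and thus not handed to the new column's optimizer).
        self.prev_columns = list(prev_columns)
        for col in self.prev_columns:
            for p in col.parameters():
                p.requires_grad = False

        self.layer1 = nn.Linear(in_dim, hidden_dim)
        self.layer2 = nn.Linear(hidden_dim, hidden_dim)
        # One lateral adapter per previous column's first hidden layer.
        self.laterals = nn.ModuleList(
            nn.Linear(hidden_dim, hidden_dim) for _ in self.prev_columns
        )
        self.head = nn.Linear(hidden_dim, out_dim)

    def forward(self, x):
        h1 = torch.relu(self.layer1(x))
        h2 = self.layer2(h1)
        # Add lateral contributions from the frozen source-task columns.
        for col, adapt in zip(self.prev_columns, self.laterals):
            with torch.no_grad():
                prev_h1 = torch.relu(col.layer1(x))
            h2 = h2 + adapt(prev_h1)
        return self.head(torch.relu(h2))


# Train a column on a source task, then add a new column for the target
# task; only the new column (including its lateral adapters) is updated.
source_col = ProgressiveColumn(in_dim=300, hidden_dim=128, out_dim=5)
target_col = ProgressiveColumn(in_dim=300, hidden_dim=128, out_dim=3,
                               prev_columns=[source_col])
logits = target_col(torch.randn(8, 300))  # batch of 8 pooled text embeddings
```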