Abstract
Effectively scaling large Transformer models has been a main driver of recent advances in natural language processing. Dynamic neural networks, an emerging research direction, can scale up model capacity with only sub-linear increases in computation and time by adjusting their computational path based on the input. They are a promising answer to the growing parameter counts of pretrained language models, enabling both pretraining with trillions of parameters and faster inference on mobile devices. In this survey, we summarize progress on three types of dynamic neural networks in NLP: skimming, mixture of experts, and early exit. We also highlight current challenges for dynamic neural networks and directions for future research.
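To make the input-dependent computation the abstract describes concrete, here is a minimal PyTorch sketch of one of the three mechanisms, early exit. It is illustrative only, not code from the surveyed systems; the class name `EarlyExitEncoder`, the confidence threshold, and the mean-pooled confidence heuristic are all assumptions made for this example.

```python
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    """A Transformer encoder where every layer carries a small internal
    classifier; inference stops at the first layer whose prediction
    confidence clears a threshold (a hypothetical, minimal early-exit setup)."""

    def __init__(self, d_model=128, n_layers=6, n_heads=4, n_classes=2,
                 threshold=0.9):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            for _ in range(n_layers)
        )
        # One exit classifier per layer.
        self.exits = nn.ModuleList(
            nn.Linear(d_model, n_classes) for _ in range(n_layers)
        )
        self.threshold = threshold

    @torch.no_grad()
    def forward(self, x):
        # x: (batch=1, seq_len, d_model); a single example, for simplicity.
        for depth, (layer, exit_head) in enumerate(
                zip(self.layers, self.exits), start=1):
            x = layer(x)
            logits = exit_head(x.mean(dim=1))     # pool tokens, then classify
            confidence = logits.softmax(-1).max().item()
            if confidence >= self.threshold:      # confident enough: stop early
                return logits, depth
        return logits, depth                      # fell through: used all layers

model = EarlyExitEncoder()
logits, used = model(torch.randn(1, 16, 128))
print(f"exited after {used} of 6 layers")
```

The other two families follow the same input-dependent pattern along different axes: skimming skips or drops tokens along the sequence dimension, while a mixture-of-experts layer routes each token to a small subset of expert sub-networks rather than deciding how deep to compute.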
- Anthology ID:
- 2023.findings-eacl.180
- Volume:
- Findings of the Association for Computational Linguistics: EACL 2023
- Month:
- May
- Year:
- 2023
- Address:
- Dubrovnik, Croatia
- Editors:
- Andreas Vlachos, Isabelle Augenstein
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2370–2381
- URL:
- https://aclanthology.org/2023.findings-eacl.180
- DOI:
- 10.18653/v1/2023.findings-eacl.180
- Cite (ACL):
- Canwen Xu and Julian McAuley. 2023. A Survey on Dynamic Neural Networks for Natural Language Processing. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2370–2381, Dubrovnik, Croatia. Association for Computational Linguistics.
- Cite (Informal):
- A Survey on Dynamic Neural Networks for Natural Language Processing (Xu & McAuley, Findings 2023)
- PDF:
- https://aclanthology.org/2023.findings-eacl.180.pdf