Learning Architectures from an Extended Search Space for Language Modeling

Yinqiao Li, Chi Hu, Yuhao Zhang, Nuo Xu, Yufan Jiang, Tong Xiao, Jingbo Zhu, Tongran Liu, Changliang Li


Abstract
Neural architecture search (NAS) has advanced significantly in recent years, but most NAS systems restrict the search to the architecture of a single recurrent or convolutional cell. In this paper, we extend the search space of NAS. In particular, we present a general approach to learning both intra-cell and inter-cell architectures, which we call ESS. To obtain better search results, we design a joint learning method that performs intra-cell and inter-cell NAS simultaneously. We implement our model in a differentiable architecture search system. For recurrent neural language modeling, it significantly outperforms a strong baseline on the PTB and WikiText datasets, achieving a new state of the art on PTB. Moreover, the learned architectures transfer well to other systems. For example, they improve state-of-the-art systems on the CoNLL and WNUT named entity recognition (NER) tasks and on the CoNLL chunking task, indicating a promising line of research on large-scale pre-learned architectures.
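The abstract describes a differentiable architecture search in which discrete choices over intra-cell and inter-cell connections are relaxed into continuous mixtures and learned jointly. The PyTorch sketch below illustrates that general idea under DARTS-style assumptions; the operation set, the module names (MixedOp, ExtendedCell), and the way inter-cell states are mixed are illustrative placeholders, not the authors' implementation.

```python
# Minimal sketch of a DARTS-style continuous relaxation, extended with an
# inter-cell edge in the spirit of the abstract. All names are illustrative
# assumptions; this is not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

# A small, illustrative set of candidate operations on an edge.
OPS = {
    "tanh":     lambda dim: nn.Sequential(nn.Linear(dim, dim), nn.Tanh()),
    "relu":     lambda dim: nn.Sequential(nn.Linear(dim, dim), nn.ReLU()),
    "identity": lambda dim: nn.Identity(),
}

class MixedOp(nn.Module):
    """Weighted mixture of candidate ops; weights come from softmax(alpha)."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList(op(dim) for op in OPS.values())

    def forward(self, x, alpha):
        weights = F.softmax(alpha, dim=-1)  # relax the discrete op choice
        return sum(w * op(x) for w, op in zip(weights, self.ops))

class ExtendedCell(nn.Module):
    """One step with a searchable intra-cell edge plus a searchable
    inter-cell edge that mixes in the state of an earlier cell."""
    def __init__(self, dim):
        super().__init__()
        self.intra = MixedOp(dim)  # edge inside the cell
        self.inter = MixedOp(dim)  # edge from an earlier cell's output
        # Architecture parameters for both search spaces, learned jointly.
        self.alpha_intra = nn.Parameter(torch.zeros(len(OPS)))
        self.alpha_inter = nn.Parameter(torch.zeros(len(OPS)))

    def forward(self, x, prev_cell_state):
        h = self.intra(x, self.alpha_intra)
        c = self.inter(prev_cell_state, self.alpha_inter)
        return h + c  # combine intra-cell and inter-cell signals

# Usage: in a full search, model weights and the alphas would be updated in
# alternation (bi-level optimization); here we only run a forward pass.
cell = ExtendedCell(dim=8)
x, prev = torch.randn(4, 8), torch.randn(4, 8)
out = cell(x, prev)
print(out.shape)  # torch.Size([4, 8])
```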
Anthology ID:
2020.acl-main.592
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6629–6639
URL:
https://aclanthology.org/2020.acl-main.592
DOI:
10.18653/v1/2020.acl-main.592
Cite (ACL):
Yinqiao Li, Chi Hu, Yuhao Zhang, Nuo Xu, Yufan Jiang, Tong Xiao, Jingbo Zhu, Tongran Liu, and Changliang Li. 2020. Learning Architectures from an Extended Search Space for Language Modeling. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6629–6639, Online. Association for Computational Linguistics.
Cite (Informal):
Learning Architectures from an Extended Search Space for Language Modeling (Li et al., ACL 2020)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2020.acl-main.592.pdf
Video:
http://slideslive.com/38928944