Scale down Transformer by Grouping Features for a Lightweight Character-level Language Model
Sungrae Park | Geewook Kim | Junyeop Lee | Junbum Cha | Ji-Hoon Kim | Hwalsuk Lee
Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020)
This paper introduces a method that efficiently reduces the computational cost and parameter size of the Transformer. The proposed model, referred to as Group-Transformer, splits the feature space into multiple groups, factorizes the calculation paths, and reduces the computation required for group interactions. Extensive experiments on two benchmark tasks, enwik8 and text8, demonstrate our model’s effectiveness and efficiency for small-scale Transformers. To the best of our knowledge, Group-Transformer is the first attempt to design a Transformer with the group strategy widely used in efficient CNN architectures.
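To illustrate the core parameter-saving idea behind the group strategy, the following minimal sketch (not the authors' implementation; `GroupedLinear` is a hypothetical name) shows how splitting a d-dimensional feature space into G groups and applying a separate small projection per group shrinks a dense layer's d×d weight matrix to G blocks of size (d/G)×(d/G), i.e. roughly a factor-of-G reduction:

```python
# Minimal sketch of grouped feature projection, in the spirit of grouped
# convolutions; this is an assumption-based illustration, not Group-Transformer.
import torch
import torch.nn as nn


class GroupedLinear(nn.Module):
    """Hypothetical grouped layer: each feature group gets its own small
    projection, cutting parameters from d*d to G*(d/G)^2 = d^2/G."""

    def __init__(self, dim: int, groups: int):
        super().__init__()
        assert dim % groups == 0, "feature dim must be divisible by group count"
        self.groups = groups
        group_dim = dim // groups
        # One (d/G x d/G) weight per group instead of a single (d x d) matrix.
        self.projections = nn.ModuleList(
            nn.Linear(group_dim, group_dim) for _ in range(groups)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) -> split features into groups, project each
        # group independently, then concatenate back along the feature axis.
        chunks = x.chunk(self.groups, dim=-1)
        return torch.cat([p(c) for p, c in zip(self.projections, chunks)], dim=-1)


dim, groups = 512, 4
dense = nn.Linear(dim, dim)
grouped = GroupedLinear(dim, groups)
count = lambda m: sum(p.numel() for p in m.parameters())
print(count(dense), count(grouped))  # 262656 vs. 66048 (~4x fewer weights)
```

Note that purely group-wise projections never mix information across groups; per the abstract, the paper additionally factorizes the calculation paths and adds low-cost group interactions, which this sketch omits.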