On the Computational Power of Transformers and Its Implications in Sequence Modeling

Satwik Bhattamishra, Arkil Patel, Navin Goyal


Abstract
Transformers are being used extensively across several sequence modeling tasks. Significant research effort has been devoted to experimentally probing the inner workings of Transformers. However, our conceptual and theoretical understanding of their power and inherent limitations is still nascent. In particular, the roles of various components in Transformers, such as positional encodings, attention heads, residual connections, and feedforward networks, are not clear. In this paper, we take a step towards answering these questions. We analyze the computational power of Transformers as captured by Turing-completeness. We first provide an alternate and simpler proof that vanilla Transformers are Turing-complete, and then we prove that Transformers with only positional masking and without any positional encoding are also Turing-complete. We further analyze the necessity of each component for the Turing-completeness of the network; interestingly, we find that a particular type of residual connection is necessary. We demonstrate the practical implications of our results via experiments on machine translation and synthetic tasks.
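To make the "positional masking without positional encoding" setting concrete, the following is a minimal NumPy sketch (not from the paper) of scaled dot-product attention where the only source of order information is a causal mask: each position attends solely to earlier positions, and no positional encodings are added to the inputs. Function names and shapes are illustrative assumptions.

```python
import numpy as np

def causal_mask(seq_len):
    # Positional masking: position i may attend only to positions j <= i.
    # Strictly upper-triangular entries are set to -inf so softmax zeros them.
    return np.triu(np.full((seq_len, seq_len), -np.inf), k=1)

def masked_attention(Q, K, V):
    # Scaled dot-product attention with a causal mask; note that no
    # positional encoding is ever added to Q, K, or V -- order information
    # enters only through the mask.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d) + causal_mask(Q.shape[0])
    # Numerically stable softmax over the (masked) attention scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

Because position 0 can attend only to itself, the first output row always equals the first value vector, regardless of the rest of the sequence.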
Anthology ID:
2020.conll-1.37
Volume:
Proceedings of the 24th Conference on Computational Natural Language Learning
Month:
November
Year:
2020
Address:
Online
Editors:
Raquel Fernández, Tal Linzen
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
455–475
URL:
https://aclanthology.org/2020.conll-1.37
DOI:
10.18653/v1/2020.conll-1.37
Cite (ACL):
Satwik Bhattamishra, Arkil Patel, and Navin Goyal. 2020. On the Computational Power of Transformers and Its Implications in Sequence Modeling. In Proceedings of the 24th Conference on Computational Natural Language Learning, pages 455–475, Online. Association for Computational Linguistics.
Cite (Informal):
On the Computational Power of Transformers and Its Implications in Sequence Modeling (Bhattamishra et al., CoNLL 2020)
PDF:
https://aclanthology.org/2020.conll-1.37.pdf
Optional supplementary material:
 2020.conll-1.37.OptionalSupplementaryMaterial.zip
Code:
 satwik77/Transformer-Computation-Analysis