Transformers as Transducers
Lena Strobl, Dana Angluin, David Chiang, Jonathan Rawski, Ashish Sabharwal
Abstract
We study the sequence-to-sequence mapping capacity of transformers by relating them to finite transducers, and find that they can express surprisingly large classes of (total functional) transductions. We do so using variants of RASP, a programming language designed to help people “think like transformers,” as an intermediate representation. We extend the existing Boolean variant B-RASP to sequence-to-sequence transductions and show that it computes exactly the first-order rational transductions (such as string rotation). Then, we introduce two new extensions. B-RASP[pos] enables calculations on positions (such as copying the first half of a string) and contains all first-order regular transductions. S-RASP adds prefix sum, which enables additional arithmetic operations (such as squaring a string) and contains all first-order polyregular transductions. Finally, we show that masked average-hard attention transformers can simulate S-RASP.
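The abstract names one example transduction per RASP variant. As a minimal sketch, the plain Python below illustrates their input/output behavior together with the prefix-sum primitive; these are ordinary functions, not RASP programs, and the paper's exact conventions (e.g., the direction of rotation, or whether "squaring" inserts markers) may differ.

```python
# Plain-Python sketches of the example transductions named in the
# abstract. They show input/output behavior only; they are not RASP
# programs, and the paper's precise definitions may differ.

def rotate(w: str) -> str:
    """String rotation (a first-order rational transduction).
    One common convention: cyclically shift by one symbol."""
    return w[1:] + w[:1]

def copy_first_half(w: str) -> str:
    """Copy the first half of the string, a transduction that needs
    position arithmetic (the motivation for B-RASP[pos])."""
    return w[: len(w) // 2]

def square(w: str) -> str:
    """'Squaring' read as w -> w^{|w|}, a polyregular transduction:
    the output length |w|^2 exceeds the linear output bound of
    rational transductions (the motivation for S-RASP)."""
    return w * len(w)

def prefix_sum(xs: list[int]) -> list[int]:
    """The prefix-sum primitive that S-RASP adds: position i of the
    output holds the sum of xs[0..i]."""
    out, total = [], 0
    for x in xs:
        total += x
        out.append(total)
    return out

if __name__ == "__main__":
    print(rotate("abcd"))             # "bcda"
    print(copy_first_half("abcdef"))  # "abc"
    print(square("ab"))               # "abab" (length 2**2)
    print(prefix_sum([1, 0, 1, 1]))   # [1, 1, 2, 3]
```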
- Anthology ID: 2025.tacl-1.9
- Volume: Transactions of the Association for Computational Linguistics, Volume 13
- Year: 2025
- Address: Cambridge, MA
- Venue: TACL
- Publisher: MIT Press
- Pages: 200–219
- URL: https://preview.aclanthology.org/fix-sig-urls/2025.tacl-1.9/
- DOI: 10.1162/tacl_a_00736
- Cite (ACL): Lena Strobl, Dana Angluin, David Chiang, Jonathan Rawski, and Ashish Sabharwal. 2025. Transformers as Transducers. Transactions of the Association for Computational Linguistics, 13:200–219.
- Cite (Informal): Transformers as Transducers (Strobl et al., TACL 2025)
- PDF: https://preview.aclanthology.org/fix-sig-urls/2025.tacl-1.9.pdf