Computational Expressivity of Neural Language Models

Alexandra Butoi, Ryan Cotterell, Anej Svete


Abstract
Language models (LMs) are currently at the forefront of NLP research due to their remarkable versatility across diverse tasks. However, a large gap exists between their observed capabilities and the explanations offered by established formal machinery. To motivate a better theoretical characterization of LMs’ abilities and limitations, this tutorial provides a comprehensive introduction to a framework for the formal analysis of modern LMs using tools from formal language theory (FLT). We show how tools from FLT can help us understand the inner workings and predict the capabilities of modern neural LM architectures. We cover recent results that use FLT to make precise and practically relevant statements about LMs based on recurrent neural networks and transformers by relating them to formal devices such as finite-state automata, Turing machines, and analog circuits. Altogether, the results covered in this tutorial allow us to make precise statements about both the observed and the predicted behaviors of LMs, and to offer theoretically motivated suggestions for aspects of the architectures that could be improved.
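To make the kind of correspondence the abstract describes concrete, here is a minimal sketch (not taken from the tutorial itself) of the classic idea of simulating a finite-state automaton with a recurrent network: a two-state DFA over {a, b} that accepts strings with an even number of a’s, run as an RNN whose one-hot hidden state tracks the DFA state via symbol-conditioned transition matrices. All names (`delta`, `rnn_accepts`, etc.) are illustrative, not from the tutorial.

```python
import numpy as np

# DFA: state q0 = "even number of a's" (accepting), q1 = "odd".
# Reading 'a' flips the state; reading 'b' keeps it.
delta = {("q0", "a"): "q1", ("q1", "a"): "q0",
         ("q0", "b"): "q0", ("q1", "b"): "q1"}
states = ["q0", "q1"]

# One transition matrix per input symbol: entry (q', q) is 1 iff
# delta(q, s) = q'. Applied to a one-hot state vector, each matrix
# moves the "1" to the successor state, so the RNN update
# h' = ReLU(W[s] @ h) exactly tracks the DFA.
W = {s: np.zeros((len(states), len(states))) for s in "ab"}
for (q, s), q_next in delta.items():
    W[s][states.index(q_next), states.index(q)] = 1.0

def rnn_accepts(string: str) -> bool:
    h = np.zeros(len(states))
    h[0] = 1.0                          # start in q0, one-hot encoded
    for sym in string:
        h = np.maximum(W[sym] @ h, 0.0)  # ReLU update; h stays one-hot
    return bool(h[0] == 1.0)             # accept iff we end in q0

assert rnn_accepts("abba")       # two a's: even, accepted
assert not rnn_accepts("aaba")   # three a's: odd, rejected
```

The tutorial’s results generalize this toy picture: they characterize which classes of formal devices various RNN and transformer architectures can and cannot simulate.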
Anthology ID:
2024.acl-tutorials.3
Original:
2024.acl-tutorials.3v1
Version 2:
2024.acl-tutorials.3v2
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 5: Tutorial Abstracts)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Luis Chiruzzo, Hung-yi Lee, Leonardo Ribeiro
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5–5
URL:
https://aclanthology.org/2024.acl-tutorials.3
DOI:
10.18653/v1/2024.acl-tutorials.3
Cite (ACL):
Alexandra Butoi, Ryan Cotterell, and Anej Svete. 2024. Computational Expressivity of Neural Language Models. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 5: Tutorial Abstracts), pages 5–5, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Computational Expressivity of Neural Language Models (Butoi et al., ACL 2024)
PDF:
https://preview.aclanthology.org/autopr/2024.acl-tutorials.3.pdf