AnimatedLLM: Explaining LLMs with Interactive Visualizations

Zdeněk Kasner, Ondrej Dusek

Abstract
Large language models (LLMs) are becoming central to natural language processing education, yet materials illustrating their inner workings are sparse. We present AnimatedLLM, an interactive web application that provides step-by-step visualizations of a Transformer language model. AnimatedLLM runs entirely in the browser, using pre-computed traces of open LLMs applied to manually curated inputs. The application is available at https://animatedllm.github.io, both as a classroom teaching aid and for self-study.
Anthology ID:
2026.teachingnlp-1.1
Volume:
Proceedings of the Seventh Workshop on Teaching Natural Language Processing (TeachNLP 2026)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Matthias Aßenmacher, Laura Biester, Claudia Borg, György Kovács, Margot Mieskes, Sofia Serrano
Venues:
TeachingNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
1–6
URL:
https://preview.aclanthology.org/ingest-eacl/2026.teachingnlp-1.1/
Cite (ACL):
Zdeněk Kasner and Ondrej Dusek. 2026. AnimatedLLM: Explaining LLMs with Interactive Visualizations. In Proceedings of the Seventh Workshop on Teaching Natural Language Processing (TeachNLP 2026), pages 1–6, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
AnimatedLLM: Explaining LLMs with Interactive Visualizations (Kasner & Dusek, TeachingNLP 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.teachingnlp-1.1.pdf