From Standard Transformers to Modern LLMs: Bringing Dialogue Models, RAG, and Agents to the Classroom

Maria Tikhonova, Viktoriia A. Chekalina, Artem Chervyakov, Alexey Zaytsev, Alexander Panchenko


Abstract
Modern LLM education is increasingly centered on system building: grounding generation with retrieval, enabling tool use, and deploying models under latency and cost constraints. We present an updated release of our open course on Transformer-based LLMs and multimodal models (Nikishina et al., 2024). The update introduces topics that have become important since the first edition: a session on Retrieval-Augmented Generation (RAG), a hands-on session on tool-using agents, an API-based track for applied work with LLMs, and practical local inference with vLLM. We also add a dedicated session on multimodal dialogue models with a focus on dialogue grounding, and enrich the course with a discussion of long-context Transformers, focusing on KV-cache efficiency along with related models and benchmarks. All materials are released online.
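As a taste of the local-inference track mentioned in the abstract, the following is a minimal sketch using vLLM's offline generation API (LLM and SamplingParams); the model name and prompt are illustrative assumptions, not taken from the course materials.

# Minimal local-inference sketch with vLLM's offline API.
# Model and prompt are placeholder assumptions for illustration.
from vllm import LLM, SamplingParams

# Load a small open instruction-tuned model locally (hypothetical choice).
llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct")

# Standard sampling settings; tune temperature and max_tokens as needed.
params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(
    ["Explain retrieval-augmented generation in one sentence."],
    params,
)
print(outputs[0].outputs[0].text)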
Anthology ID:
2026.teachingnlp-1.8
Volume:
Proceedings of the Seventh Workshop on Teaching Natural Language Processing (TeachNLP 2026)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Matthias Aßenmacher, Laura Biester, Claudia Borg, György Kovács, Margot Mieskes, Sofia Serrano
Venues:
TeachingNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
41–44
URL:
https://preview.aclanthology.org/ingest-eacl/2026.teachingnlp-1.8/
Cite (ACL):
Maria Tikhonova, Viktoriia A. Chekalina, Artem Chervyakov, Alexey Zaytsev, and Alexander Panchenko. 2026. From Standard Transformers to Modern LLMs: Bringing Dialogue Models, RAG, and Agents to the Classroom. In Proceedings of the Seventh Workshop on Teaching Natural Language Processing (TeachNLP 2026), pages 41–44, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
From Standard Transformers to Modern LLMs: Bringing Dialogue Models, RAG, and Agents to the Classroom (Tikhonova et al., TeachingNLP 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.teachingnlp-1.8.pdf