Talgat Bektleuov


2025

ATGen: A Framework for Active Text Generation
Akim Tsvigun | Daniil Vasilev | Ivan Tsvigun | Ivan Lysenko | Talgat Bektleuov | Aleksandr Medvedev | Uliana Vinogradova | Nikita Severin | Mikhail Mozikov | Andrey Savchenko | Ilya Makarov | Grigorev Rostislav | Ramil Kuleev | Fedor Zhdanov | Artem Shelmanov
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)

Active learning (AL) has demonstrated remarkable potential in reducing the annotation effort required for training machine learning models. However, despite the surging popularity of natural language generation (NLG) tasks in recent years, the application of AL to NLG has been limited. In this paper, we introduce Active Text Generation (ATGen), a comprehensive framework that bridges AL with text generation tasks, enabling the application of state-of-the-art AL strategies to NLG. Our framework simplifies AL-empowered annotation in NLG tasks using both human annotators and automatic annotation agents based on large language models (LLMs). The framework supports LLMs deployed as a service, such as ChatGPT and Claude, as well as LLMs operated on-premises. Furthermore, ATGen provides a unified platform for the smooth implementation and benchmarking of novel AL strategies tailored to NLG tasks. Finally, we present experimental results across multiple text generation tasks, comparing the performance of state-of-the-art AL strategies in various settings. We demonstrate that ATGen can reduce both the effort of human annotators and the cost of API calls to LLM-based automatic annotation agents.
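To make the pool-based annotation loop the abstract describes concrete, here is a minimal, hypothetical sketch. It is not ATGen's actual API: the uncertainty proxy, the example dictionaries, and the `annotate_fn` callback (standing in for a human annotator or an LLM-based agent) are all illustrative assumptions.

```python
def uncertainty(model_score):
    # Toy uncertainty proxy: distance from a fully confident score of 1.0.
    # A real AL strategy would use e.g. token-level entropy of the generator.
    return 1.0 - model_score

def select_batch(pool, score_fn, batch_size):
    """Pick the batch_size most uncertain unlabeled examples from the pool."""
    ranked = sorted(pool, key=lambda ex: score_fn(ex["score"]), reverse=True)
    return ranked[:batch_size]

def active_learning_loop(pool, annotate_fn, rounds, batch_size):
    """Run a simple pool-based AL loop; annotate_fn stands in for a human
    annotator or an LLM-based annotation agent (hypothetical interface)."""
    labeled = []
    for _ in range(rounds):
        batch = select_batch(pool, uncertainty, batch_size)
        for ex in batch:
            ex["label"] = annotate_fn(ex)  # query the (human or LLM) annotator
            labeled.append(ex)
            pool.remove(ex)
        # In a real system the model would be retrained here and the
        # remaining pool re-scored before the next round.
    return labeled
```

Because only the most uncertain examples reach the annotator each round, fewer human labels (or paid API calls to an LLM annotator) are needed than with random sampling, which is the cost reduction the abstract refers to.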