Teaching Old Tokenizers New Words: Efficient Tokenizer Adaptation for Pretrained Models
Taido Purason, Pavel Chizhov, Ivan P. Yamshchikov, Mark Fishel
Abstract
Tokenizer adaptation plays an important role in adapting pre-trained language models to new domains or languages. In this work, we address two complementary aspects of this process: vocabulary extension and pruning. The common approach to extension trains a new tokenizer on domain-specific text and appends the tokens that do not overlap with the existing vocabulary, which often results in many tokens that are unreachable or never used. We propose continued BPE training that extends a pre-trained tokenizer by continuing the BPE merge learning process on new data. Experiments across multiple languages and model families show that this approach improves tokenization efficiency and leads to better utilization of added vocabulary. We also introduce leaf-based vocabulary pruning, which removes redundant tokens while preserving model quality. Together, these methods provide practical tools for controlled vocabulary modification, which we release as an open-source toolkit.
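To make the continued-BPE idea concrete, below is a minimal pure-Python sketch of continuing BPE merge learning from an existing merge list on new data. It illustrates the general technique only, not the authors' released toolkit; the function names (`merge_pair`, `segment`, `continue_bpe`) and the whitespace-split corpus representation are our own assumptions.

```python
from collections import Counter

def merge_pair(symbols, pair):
    """Merge every adjacent occurrence of `pair` in a list of symbols."""
    a, b = pair
    out, i = [], 0
    while i < len(symbols):
        if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
            out.append(a + b)
            i += 2
        else:
            out.append(symbols[i])
            i += 1
    return out

def segment(word, merges):
    """Segment a word into subwords by applying merges in learned order."""
    symbols = list(word)
    for pair in merges:
        symbols = merge_pair(symbols, pair)
    return symbols

def continue_bpe(corpus, merges, num_new_merges):
    """Continue BPE merge learning on `corpus`, starting from `merges`."""
    merges = list(merges)
    word_freq = Counter(corpus.split())
    # Pre-segment the new data with the *existing* merges, so newly learned
    # merges build on the pretrained vocabulary instead of replacing it.
    segmented = {w: segment(w, merges) for w in word_freq}
    for _ in range(num_new_merges):
        pair_freq = Counter()
        for w, f in word_freq.items():
            for pair in zip(segmented[w], segmented[w][1:]):
                pair_freq[pair] += f
        if not pair_freq:
            break  # every word is already a single token
        best = max(pair_freq, key=pair_freq.get)  # most frequent pair wins
        merges.append(best)
        segmented = {w: merge_pair(s, best) for w, s in segmented.items()}
    return merges
```

A real byte-level tokenizer would additionally follow the pretrained model's pre-tokenization rules when counting pairs, but the core loop is the same: segment with the old merges first, then keep learning.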
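Leaf-based pruning can be sketched in the same spirit: in a BPE merge list, a token is a leaf if no other merge uses it as a component, so deleting it cannot make any remaining merge unreachable. The helpers below and the `token_counts`/`min_count` criterion are illustrative assumptions, not the paper's exact procedure.

```python
def leaf_tokens(merges):
    """Merge-produced tokens that no other merge uses as a component.
    These are leaves of the merge graph: removing one cannot make any
    remaining merge unreachable."""
    produced = {a + b for a, b in merges}
    used_as_component = {s for pair in merges for s in pair}
    return produced - used_as_component

def prune_leaves(merges, token_counts, min_count=1):
    """Drop merges whose output is a leaf token occurring fewer than
    `min_count` times in a reference corpus (frequencies assumed given)."""
    leaves = leaf_tokens(merges)
    return [
        (a, b) for a, b in merges
        if a + b not in leaves or token_counts.get(a + b, 0) >= min_count
    ]
```

Since removing a merge can expose new leaves, a pruning pass like this could be repeated until a target vocabulary size is reached.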
- Anthology ID: 2026.findings-eacl.341
- Volume: Findings of the Association for Computational Linguistics: EACL 2026
- Month: March
- Year: 2026
- Address: Rabat, Morocco
- Editors: Vera Demberg, Kentaro Inui, Lluís Màrquez
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 6492–6516
- URL: https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.341/
- Cite (ACL): Taido Purason, Pavel Chizhov, Ivan P. Yamshchikov, and Mark Fishel. 2026. Teaching Old Tokenizers New Words: Efficient Tokenizer Adaptation for Pretrained Models. In Findings of the Association for Computational Linguistics: EACL 2026, pages 6492–6516, Rabat, Morocco. Association for Computational Linguistics.
- Cite (Informal): Teaching Old Tokenizers New Words: Efficient Tokenizer Adaptation for Pretrained Models (Purason et al., Findings 2026)
- PDF: https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.341.pdf