xLM: A Python Package for Non-Autoregressive Language Models
Dhruvesh Patel, Durga Prasad Maram, Sai Sreenivas Chintha, Benjamin Rozonoyer, Andrew McCallum
Abstract
In recent years, there has been a resurgence of interest in non-autoregressive text generation in the context of general language modeling. Unlike the well-established autoregressive language modeling paradigm, which has a plethora of standard training and inference libraries, implementations of non-autoregressive language modeling have largely been bespoke, making it difficult to perform systematic comparisons of different methods. Moreover, each non-autoregressive language model typically requires its own data collation, loss, and prediction logic, making it challenging to reuse common components. In this work, we present the xLM Python package, which is designed to make implementing small non-autoregressive language models faster, with a secondary goal of providing a suite of small pre-trained models (through a companion package) that can be used by the research community.
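As an illustration of the component reuse the abstract motivates, the sketch below shows one way data collation and loss logic could be factored into interchangeable pieces for a masked (non-autoregressive) training objective. This is a minimal hypothetical sketch in PyTorch, not the actual xLM API; all names here (`Batch`, `Collator`, `MaskedCollator`, `CrossEntropyLoss`) are assumptions introduced purely for illustration.

```python
# Hypothetical sketch -- NOT the actual xLM API. It only illustrates
# factoring collation and loss into swappable components, as motivated
# in the abstract.
from dataclasses import dataclass
from typing import Protocol

import torch


@dataclass
class Batch:
    input_ids: torch.Tensor   # (batch, seq_len)
    target_ids: torch.Tensor  # (batch, seq_len)


class Collator(Protocol):
    """Turns raw token-id lists into a model-specific Batch."""
    def __call__(self, examples: list[list[int]]) -> Batch: ...


class Loss(Protocol):
    """Maps model logits and a Batch to a scalar training loss."""
    def __call__(self, logits: torch.Tensor, batch: Batch) -> torch.Tensor: ...


class MaskedCollator:
    """Example non-autoregressive collation: corrupt random positions."""
    def __init__(self, mask_id: int, pad_id: int, mask_prob: float = 0.15):
        self.mask_id, self.pad_id, self.mask_prob = mask_id, pad_id, mask_prob

    def __call__(self, examples: list[list[int]]) -> Batch:
        max_len = max(len(ex) for ex in examples)
        padded = [ex + [self.pad_id] * (max_len - len(ex)) for ex in examples]
        targets = torch.tensor(padded)
        mask = torch.rand(targets.shape) < self.mask_prob
        mask = mask & (targets != self.pad_id)  # never mask padding
        inputs = targets.masked_fill(mask, self.mask_id)
        return Batch(input_ids=inputs, target_ids=targets)


class CrossEntropyLoss:
    """Token-level cross entropy over all non-pad positions."""
    def __init__(self, pad_id: int):
        self.pad_id = pad_id

    def __call__(self, logits: torch.Tensor, batch: Batch) -> torch.Tensor:
        # logits: (batch, seq_len, vocab) flattened to (batch*seq_len, vocab)
        return torch.nn.functional.cross_entropy(
            logits.flatten(0, 1),
            batch.target_ids.flatten(),
            ignore_index=self.pad_id,
        )
```

Under a factoring like this, a new non-autoregressive variant would supply only its own collator and loss while reusing a common training loop.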
- Anthology ID:
- 2026.eacl-demo.31
- Volume:
- Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 3: System Demonstrations)
- Month:
- March
- Year:
- 2026
- Address:
- Rabat, Morocco
- Editors:
- Danilo Croce, Jochen Leidner, Nafise Sadat Moosavi
- Venue:
- EACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 445–456
- URL:
- https://preview.aclanthology.org/ingest-eacl/2026.eacl-demo.31/
- Cite (ACL):
- Dhruvesh Patel, Durga Prasad Maram, Sai Sreenivas Chintha, Benjamin Rozonoyer, and Andrew McCallum. 2026. xLM: A Python Package for Non-Autoregressive Language Models. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 3: System Demonstrations), pages 445–456, Rabat, Morocco. Association for Computational Linguistics.
- Cite (Informal):
- xLM: A Python Package for Non-Autoregressive Language Models (Patel et al., EACL 2026)
- PDF:
- https://preview.aclanthology.org/ingest-eacl/2026.eacl-demo.31.pdf