Enabling Autoregressive Models to Fill In Masked Tokens

Daniel Mingyi Israel, Aditya Grover, Guy Van den Broeck


Abstract
Historically, LLMs have been trained using either autoregressive (AR) or masked language modeling (MLM) objectives, with AR models gaining dominance in recent years. However, AR models are inherently incapable of masked infilling, the ability to predict masked tokens given both past and future context. In contrast, MLM models suffer from intrinsic computational inefficiencies during both training and inference that hinder their scalability. This work introduces MARIA (Masked and Autoregressive Infilling Architecture), a novel approach that leverages the strengths of both paradigms to achieve state-of-the-art masked infilling performance. MARIA combines a pre-trained MLM and a pre-trained AR model by training a linear decoder that takes their concatenated hidden states as input. This minimal modification enables the AR model to perform infilling while retaining its inherent advantage of faster inference with KV caching. Our results demonstrate that MARIA significantly outperforms existing methods, namely discrete diffusion models, on masked infilling tasks.
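The core mechanism described in the abstract admits a short illustration. The following is a minimal sketch, not the authors' released implementation: it assumes two HuggingFace-style backbones that share a tokenizer and sequence length and expose a `.last_hidden_state` output, and all names here (`MariaHead`, `ar_dim`, `mlm_dim`) are hypothetical.

```python
import torch
import torch.nn as nn

class MariaHead(nn.Module):
    """Sketch of the MARIA idea: a single trained linear decoder over the
    concatenated hidden states of a frozen pre-trained AR model and a
    frozen pre-trained MLM. Hypothetical code, not the authors' release."""

    def __init__(self, ar_model, mlm_model, ar_dim, mlm_dim, vocab_size):
        super().__init__()
        self.ar_model = ar_model    # pre-trained autoregressive LM (frozen)
        self.mlm_model = mlm_model  # pre-trained masked LM (frozen)
        # Freeze both backbones; only the linear decoder below is trained.
        for p in self.ar_model.parameters():
            p.requires_grad = False
        for p in self.mlm_model.parameters():
            p.requires_grad = False
        self.decoder = nn.Linear(ar_dim + mlm_dim, vocab_size)

    def forward(self, ar_input_ids, mlm_input_ids):
        # AR states encode the left context of the (partially filled)
        # sequence; MLM states, computed on the masked sequence, also
        # see the future context on the far side of the masks.
        h_ar = self.ar_model(ar_input_ids).last_hidden_state      # (B, T, ar_dim)
        h_mlm = self.mlm_model(mlm_input_ids).last_hidden_state   # (B, T, mlm_dim)
        h = torch.cat([h_ar, h_mlm], dim=-1)                      # (B, T, ar_dim + mlm_dim)
        return self.decoder(h)  # (B, T, vocab_size) logits for masked positions
```

Because only the decoder is trained and the AR backbone is untouched, its KV cache can still be reused across decoding steps, which is the source of the inference-speed advantage the abstract refers to.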
Anthology ID: 2026.findings-eacl.260
Volume: Findings of the Association for Computational Linguistics: EACL 2026
Month: March
Year: 2026
Address: Rabat, Morocco
Editors: Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 4954–4965
URL: https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.260/
Cite (ACL): Daniel Mingyi Israel, Aditya Grover, and Guy Van den Broeck. 2026. Enabling Autoregressive Models to Fill In Masked Tokens. In Findings of the Association for Computational Linguistics: EACL 2026, pages 4954–4965, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal): Enabling Autoregressive Models to Fill In Masked Tokens (Israel et al., Findings 2026)
PDF: https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.260.pdf
Checklist: 2026.findings-eacl.260.checklist.pdf