MapCoder-Lite: Distilling Multi-Agent Coding into a Single Small LLM

Woongkyu Lee, Junhee Cho, Jungwook Choi


Abstract
Large language models (LLMs) have advanced code generation from single-function tasks to competitive-programming problems, but existing multi-agent solutions either rely on costly large-scale (>30B) models or collapse when downsized to small open-source models. We present MapCoder-Lite, a framework for distilling the complex reasoning of large, multi-agent coding systems into a single 7B model. Our contribution is a novel, three-pillar methodology that synergistically generates, refines, and encodes multi-agent knowledge: (i) pass-based trajectory distillation from strong LLMs fixes format fragility in retrieval and reduces failures in debugging, (ii) supervisor-guided correction with global feedback strengthens planning and coding agents, and (iii) agent-wise LoRA fine-tuning delivers memory-efficient specialisation. Comprehensive evaluation on xCodeEval, APPS, and CodeContests shows that MapCoder-Lite more than doubles xCodeEval accuracy (13.2% → 28.3%) and eliminates all format failures, while reducing GPU memory and token-generation time compared to a 32B model. It also achieves over 10% gains on simpler coding benchmarks, demonstrating broad improvements beyond competitive programming. These results demonstrate that careful agent-wise fine-tuning unleashes high-quality multi-agent coding on a small language model. Our code is publicly available at https://github.com/aiha-lab/MapCoder-Lite.
Anthology ID:
2026.findings-eacl.346
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Marquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6569–6596
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.346/
Cite (ACL):
Woongkyu Lee, Junhee Cho, and Jungwook Choi. 2026. MapCoder-Lite: Distilling Multi-Agent Coding into a Single Small LLM. In Findings of the Association for Computational Linguistics: EACL 2026, pages 6569–6596, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
MapCoder-Lite: Distilling Multi-Agent Coding into a Single Small LLM (Lee et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.346.pdf
Checklist:
2026.findings-eacl.346.checklist.pdf