Code-Optimise: Self-Generated Preference Data for Correctness and Efficiency

Leonidas Gee, Milan Gritta, Gerasimos Lampouras, Ignacio Iacobacci


Abstract
Code Language Models have been trained to generate accurate solutions, typically with no regard for runtime. On the other hand, previous works that explored execution optimisation have observed corresponding drops in functional correctness. To that end, we introduce Code-Optimise, a framework that incorporates both correctness (passed, failed) and runtime (quick, slow) as learning signals via self-generated preference data. Our framework is both lightweight and robust as it dynamically selects solutions to reduce overfitting while avoiding a reliance on larger models for learning signals. Code-Optimise achieves significant improvements in pass@k while decreasing the competitive baseline runtimes by an additional 6% for in-domain data and up to 3% for out-of-domain data. As a by-product, the average length of the generated solutions is reduced by up to 48% on MBPP and 23% on HumanEval, resulting in faster and cheaper inference. The generated data and codebase are open-sourced at https://github.com/huawei-noah/HEBO/tree/Code_Optimise.
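The abstract's core idea can be illustrated with a minimal sketch. The following is a hypothetical reconstruction based only on the abstract: candidate solutions are labelled by correctness (passed, failed) and runtime (quick, slow), and preference pairs are formed so that passing solutions are preferred over failing ones, and quicker passing solutions over slower ones. All function names and the pairing scheme are illustrative assumptions, not the paper's actual implementation.

```python
import time

def run_solution(fn, test_cases):
    """Execute a candidate solution against unit tests.
    Returns (passed, runtime_in_seconds). Illustrative only."""
    start = time.perf_counter()
    try:
        ok = all(fn(*args) == expected for args, expected in test_cases)
    except Exception:
        ok = False
    return ok, time.perf_counter() - start

def build_preference_pairs(solutions, test_cases):
    """Label each self-generated solution, then build (preferred, rejected)
    pairs: any passing solution beats any failing one (correctness signal),
    and among passing solutions the quicker beats the slower (runtime signal)."""
    labelled = [(fn, *run_solution(fn, test_cases)) for fn in solutions]
    passed = sorted(((fn, t) for fn, ok, t in labelled if ok),
                    key=lambda x: x[1])          # quickest first
    failed = [fn for fn, ok, _ in labelled if not ok]
    pairs = [(p, f) for p, _ in passed for f in failed]
    pairs += [(passed[i][0], passed[j][0])
              for i in range(len(passed)) for j in range(i + 1, len(passed))]
    return pairs

# Toy example: two correct implementations (one asymptotically slower)
# and one buggy one, for the sum 1 + 2 + ... + n.
fast = lambda n: n * (n + 1) // 2
slow = lambda n: sum(range(n + 1))
buggy = lambda n: n * n
tests = [((10,), 55), ((100,), 5050)]
pairs = build_preference_pairs([fast, slow, buggy], tests)
```

Pairs like these could then feed a standard preference-optimisation objective (e.g. DPO-style training), so that the model learns to favour solutions that are both correct and quick.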
Anthology ID:
2025.findings-naacl.5
Volume:
Findings of the Association for Computational Linguistics: NAACL 2025
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
79–94
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.5/
Cite (ACL):
Leonidas Gee, Milan Gritta, Gerasimos Lampouras, and Ignacio Iacobacci. 2025. Code-Optimise: Self-Generated Preference Data for Correctness and Efficiency. In Findings of the Association for Computational Linguistics: NAACL 2025, pages 79–94, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Code-Optimise: Self-Generated Preference Data for Correctness and Efficiency (Gee et al., Findings 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.findings-naacl.5.pdf