MATCH: Task-Driven Code Evaluation through Contrastive Learning

Marah Ghoummaid, Vladimir Tchuiev, Ofek Glick, Michal Moshkovitz, Dotan Di Castro


Abstract
AI-based code generation is increasingly prevalent, with GitHub Copilot estimated to generate 46% of the code on GitHub. Accurately evaluating how well generated code aligns with developer intent remains a critical challenge. Traditional evaluation methods, such as unit tests, are often unscalable and costly. Syntactic similarity metrics (e.g., BLEU, ROUGE) fail to capture code functionality, and metrics like CodeBERTScore require reference code, which is not always available. Reference-free evaluation remains largely unexplored, with few alternatives such as ICE-Score. To address this gap, this paper introduces MATCH, a novel reference-free metric. MATCH uses contrastive learning to generate meaningful embeddings for code and natural-language task descriptions, enabling similarity scores that reflect how well generated code implements the task. We show that MATCH achieves stronger correlations with functional correctness and human preference than existing metrics across multiple programming languages.
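The abstract describes the core idea: a contrastive objective pulls embeddings of a task description and its matching code together while pushing non-matching code away, so that cosine similarity between embeddings can later serve as a reference-free score. The paper's actual encoder and training setup are not reproduced here; the sketch below illustrates only the general InfoNCE-style objective on toy vectors, with all names and dimensions being illustrative assumptions.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def info_nce_loss(text_emb, code_embs, positive_idx, temperature=0.07):
    """InfoNCE-style contrastive loss: maximize the softmax probability
    of the matching (text, code) pair among all candidate codes."""
    sims = np.array([cosine_sim(text_emb, c) for c in code_embs]) / temperature
    exp = np.exp(sims - sims.max())        # numerically stable softmax
    probs = exp / exp.sum()
    return -float(np.log(probs[positive_idx]))

# Toy example: 3 candidate "code embeddings"; index 0 is the true match.
rng = np.random.default_rng(0)
text = rng.normal(size=8)
codes = [text + 0.05 * rng.normal(size=8),  # near-duplicate: the positive
         rng.normal(size=8),
         rng.normal(size=8)]
loss_good = info_nce_loss(text, codes, positive_idx=0)
loss_bad = info_nce_loss(text, codes, positive_idx=1)
print(loss_good < loss_bad)  # expected: the aligned pair yields lower loss
```

At inference time, a metric trained this way would score a generated snippet simply as `cosine_sim(encode(task), encode(code))`, needing no reference solution.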
Anthology ID:
2025.findings-emnlp.611
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11399–11414
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.611/
DOI:
10.18653/v1/2025.findings-emnlp.611
Cite (ACL):
Marah Ghoummaid, Vladimir Tchuiev, Ofek Glick, Michal Moshkovitz, and Dotan Di Castro. 2025. MATCH: Task-Driven Code Evaluation through Contrastive Learning. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 11399–11414, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
MATCH: Task-Driven Code Evaluation through Contrastive Learning (Ghoummaid et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.611.pdf
Checklist:
 2025.findings-emnlp.611.checklist.pdf