HGAdapter: Hypergraph-based Adapters in Language Models for Code Summarization and Clone Detection

Guang Yang, Yujie Zhu


Abstract
Pre-trained language models (PLMs) are increasingly applied to code-related tasks. Although PLMs achieve good results, they do not take into account potential high-order data correlations within code. We propose three types of high-order correlations among code tokens: abstract syntax tree family correlation, lexical correlation, and line correlation. We design a tokens-and-hyperedges generator to capture these high-order correlations, improve the architecture of hypergraph neural networks, and combine it with adapter tuning to propose a novel hypergraph-based adapter (HGAdapter) for fine-tuning PLMs. HGAdapter encodes high-order data correlations and can be inserted into various PLMs to enhance performance. Experiments were conducted on several public datasets covering code summarization in six languages and code clone detection. Our method improves the performance of PLMs on these datasets to varying degrees, and the experimental results validate that introducing high-order data correlations contributes to improved effectiveness.
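The abstract's tokens-and-hyperedges generator is not specified on this page, so the sketch below is only a hedged illustration of the general idea: grouping token indices into hyperedges by two of the three proposed correlation types (line correlation and lexical correlation). The function name, the naive whitespace tokenizer, and the minimum-size filter are all assumptions, not the paper's actual construction.

```python
# Hypothetical sketch of a tokens-and-hyperedges generator (assumption, not the
# paper's method). Two of the three proposed correlation types are illustrated:
#   - line correlation: tokens on the same source line share a hyperedge
#   - lexical correlation: identical token strings share a hyperedge
from collections import defaultdict

def build_hyperedges(code: str):
    """Return (tokens, hyperedges), each hyperedge being a set of token indices."""
    tokens = []  # flat list of (line_no, token_string)
    for line_no, line in enumerate(code.splitlines()):
        for tok in line.split():  # naive whitespace tokenizer (assumption)
            tokens.append((line_no, tok))

    line_edges = defaultdict(set)  # line correlation
    lex_edges = defaultdict(set)   # lexical correlation
    for idx, (line_no, tok) in enumerate(tokens):
        line_edges[line_no].add(idx)
        lex_edges[tok].add(idx)

    # keep only hyperedges connecting at least two tokens
    hyperedges = [e for e in list(line_edges.values()) + list(lex_edges.values())
                  if len(e) >= 2]
    return [t for _, t in tokens], hyperedges

toks, edges = build_hyperedges("x = a + b\ny = x * x")
```

In this toy example the two source lines yield two line-correlation hyperedges, while the repeated tokens `x` and `=` yield two lexical hyperedges; an AST-family hyperedge generator would additionally need a parser, which is omitted here.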
Anthology ID:
2025.findings-emnlp.800
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14823–14833
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.800/
DOI:
10.18653/v1/2025.findings-emnlp.800
Cite (ACL):
Guang Yang and Yujie Zhu. 2025. HGAdapter: Hypergraph-based Adapters in Language Models for Code Summarization and Clone Detection. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 14823–14833, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
HGAdapter: Hypergraph-based Adapters in Language Models for Code Summarization and Clone Detection (Yang & Zhu, Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.800.pdf
Checklist:
2025.findings-emnlp.800.checklist.pdf