Abstract
Code pre-trained models (CodePTMs) have recently become the de facto paradigm for various tasks in code intelligence. To achieve strong performance, the widely used strategy is to fine-tune all the parameters of a CodePTM. However, as model sizes and the number of downstream tasks grow, this strategy becomes prohibitively expensive. Prior work in natural language processing has applied Parameter-Efficient Learning (PEL) methods to mitigate similar problems, but applying them directly to CodePTMs fails to capture the inherent structural characteristics of code. To address this problem, we propose Pass-Tuning for structure-aware parameter-efficient code representation learning. Specifically, a plug-and-play graph neural network module that learns from the Abstract Syntax Tree (AST) is employed as a tunable prefix. On the one hand, Pass-Tuning further exploits the structural information of source code; on the other hand, it can serve as a replacement for full fine-tuning. We evaluate our method on multiple tasks across eight programming languages, including code understanding and generation. The results demonstrate the effectiveness, robustness, and universality of our method.
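For illustration, here is a minimal PyTorch sketch of the idea the abstract describes: a small graph neural network encodes AST nodes, and its pooled output is prepended to the frozen CodePTM's token embeddings as tunable prefix vectors. The names (`ASTPrefixEncoder`, `SimpleGCNLayer`), the prefix length, and the mean-aggregation message passing are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One mean-aggregation message-passing layer over a dense adjacency matrix."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim); adj: (num_nodes, num_nodes) with self-loops included.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.linear(adj @ x / deg))


class ASTPrefixEncoder(nn.Module):
    """Encodes AST node types with a tiny GNN and pools them into tunable prefix vectors."""

    def __init__(self, num_node_types: int, dim: int, num_prefix: int = 16):
        super().__init__()
        self.node_emb = nn.Embedding(num_node_types, dim)
        self.gnn = nn.ModuleList([SimpleGCNLayer(dim) for _ in range(2)])
        self.pool = nn.Linear(dim, num_prefix * dim)
        self.num_prefix, self.dim = num_prefix, dim

    def forward(self, node_types: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.node_emb(node_types)              # (num_nodes, dim)
        for layer in self.gnn:
            h = layer(h, adj)
        pooled = h.mean(dim=0)                     # (dim,)
        return self.pool(pooled).view(self.num_prefix, self.dim)


# Usage: prepend the AST-derived prefix to the token embeddings of a frozen backbone.
if __name__ == "__main__":
    backbone_dim = 768
    prefix_encoder = ASTPrefixEncoder(num_node_types=200, dim=backbone_dim)

    # Toy AST with 4 nodes; node types are ids into a node-type vocabulary (assumed).
    node_types = torch.tensor([3, 17, 17, 42])
    adj = torch.eye(4)
    adj[0, 1] = adj[1, 0] = adj[1, 2] = adj[2, 1] = adj[0, 3] = adj[3, 0] = 1.0

    prefix = prefix_encoder(node_types, adj)       # (16, 768), the only trained parameters
    token_embeds = torch.randn(32, backbone_dim)   # stand-in for frozen CodePTM embeddings
    augmented = torch.cat([prefix, token_embeds], dim=0)
    print(augmented.shape)                         # torch.Size([48, 768])
```

In this sketch only the prefix encoder is updated during training, while the CodePTM backbone stays frozen, which is what makes the tuning parameter-efficient.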
- Anthology ID: 2023.findings-emnlp.42
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 577–591
- URL: https://aclanthology.org/2023.findings-emnlp.42
- DOI: 10.18653/v1/2023.findings-emnlp.42
- Cite (ACL): Nuo Chen, Qiushi Sun, Jianing Wang, Xiang Li, and Ming Gao. 2023. Pass-Tuning: Towards Structure-Aware Parameter-Efficient Tuning for Code Representation Learning. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 577–591, Singapore. Association for Computational Linguistics.
- Cite (Informal): Pass-Tuning: Towards Structure-Aware Parameter-Efficient Tuning for Code Representation Learning (Chen et al., Findings 2023)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-emnlp.42.pdf