Seamlessly Integrating Tree-Based Positional Embeddings into Transformer Models for Source Code Representation

Patryk Bartkowiak, Filip Graliński


Abstract
Transformer-based models have demonstrated significant success in various source code representation tasks. Nonetheless, traditional positional embeddings employed by these models inadequately capture the hierarchical structure intrinsic to source code, typically represented as Abstract Syntax Trees (ASTs). To address this, we propose a novel tree-based positional embedding approach that explicitly encodes hierarchical relationships derived from ASTs, including node depth and sibling indices. These hierarchical embeddings are integrated into the transformer architecture, specifically enhancing the CodeBERTa model. We thoroughly evaluate our proposed model through masked language modeling (MLM) pretraining and clone detection fine-tuning tasks. Experimental results indicate that our Tree-Enhanced CodeBERTa consistently surpasses the baseline model in terms of loss, accuracy, F1 score, precision, and recall, emphasizing the importance of incorporating explicit structural information into transformer-based representations of source code.
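The abstract describes adding embeddings of AST node depth and sibling index to the usual token embeddings. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the module names (`TreePositionalEmbedding`, `TreeEnhancedEncoder`), the table sizes, and the assumption that depth and sibling-index embeddings are simply summed with token embeddings are all illustrative.

```python
# Sketch of tree-based positional embeddings for source code tokens.
# Each token is tagged with the depth and sibling index of its AST node;
# learned embeddings for both are added to the token embeddings before
# the transformer layers (illustrative, not the paper's code).
import torch
import torch.nn as nn

def tree_positions(root):
    """Depth-first walk assigning (depth, sibling_index) to every node.
    Assumes `root` exposes a `.children` list (e.g. a tree-sitter node)."""
    positions = {}
    stack = [(root, 0, 0)]
    while stack:
        node, depth, sib = stack.pop()
        positions[node] = (depth, sib)
        for i, child in enumerate(node.children):
            stack.append((child, depth + 1, i))
    return positions

class TreePositionalEmbedding(nn.Module):
    def __init__(self, hidden_size, max_depth=64, max_siblings=128):
        super().__init__()
        # Separate learned tables for node depth and sibling index.
        self.depth_emb = nn.Embedding(max_depth, hidden_size)
        self.sibling_emb = nn.Embedding(max_siblings, hidden_size)
        self.max_depth = max_depth
        self.max_siblings = max_siblings

    def forward(self, depths, sibling_idxs):
        # depths, sibling_idxs: (batch, seq_len) integer tensors derived
        # from the AST; clamp so out-of-range nodes still get an embedding.
        d = depths.clamp(max=self.max_depth - 1)
        s = sibling_idxs.clamp(max=self.max_siblings - 1)
        return self.depth_emb(d) + self.sibling_emb(s)

class TreeEnhancedEncoder(nn.Module):
    """Token embeddings + tree positional embeddings -> transformer encoder."""
    def __init__(self, vocab_size, hidden_size=768, num_layers=6, num_heads=12):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, hidden_size)
        self.tree_pos = TreePositionalEmbedding(hidden_size)
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, input_ids, depths, sibling_idxs):
        h = self.tok_emb(input_ids) + self.tree_pos(depths, sibling_idxs)
        return self.encoder(h)
```

In such a setup the encoder output would feed the MLM head during pretraining and a classification head for clone detection; how the paper actually injects the tree signal into CodeBERTa may differ in detail.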
Anthology ID:
2025.xllm-1.10
Volume:
Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025)
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Hao Fei, Kewei Tu, Yuhui Zhang, Xiang Hu, Wenjuan Han, Zixia Jia, Zilong Zheng, Yixin Cao, Meishan Zhang, Wei Lu, N. Siddharth, Lilja Øvrelid, Nianwen Xue, Yue Zhang
Venues:
XLLM | WS
Publisher:
Association for Computational Linguistics
Pages:
91–98
URL:
https://preview.aclanthology.org/landing_page/2025.xllm-1.10/
Cite (ACL):
Patryk Bartkowiak and Filip Graliński. 2025. Seamlessly Integrating Tree-Based Positional Embeddings into Transformer Models for Source Code Representation. In Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025), pages 91–98, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Seamlessly Integrating Tree-Based Positional Embeddings into Transformer Models for Source Code Representation (Bartkowiak & Graliński, XLLM 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.xllm-1.10.pdf