SPE Attention: Making Attention Equivariant to Semantic-Preserving Permutation for Code Processing

Chengyu Jiao, Shuhao Chen, Yu Zhang


Abstract
Code serves as the fundamental language through which humans communicate with machines, and recent advances have produced a variety of Transformer-based models for code processing. A unique symmetry of code is semantic-preserving permutation: certain lines can be rearranged without altering the program's meaning. To capture this symmetry, we propose a novel attention mechanism, called SPE attention, that incorporates semantic-preserving permutation equivariance. Leveraging the symmetry relationships within code, we introduce a directed layered graph to represent the code structure and summarize this graph into a symmetry mask. SPE attention integrates these symmetry masks, making the model equivariant to semantic-preserving permutations. Experiments on various code-related tasks, including code summarization and error detection, demonstrate the effectiveness of the proposed SPE attention.
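The mechanism the abstract describes (summarize a directed layered graph over code into a symmetry mask, then fold that mask into attention) can be illustrated with a short sketch. The code below is a hypothetical illustration under stated assumptions, not the authors' implementation: the function name `spe_attention_sketch`, the additive-bias formulation of the mask, and the all-zero toy mask are all assumptions. The property it checks is the relevant equivariance: if the symmetry mask M satisfies P M Pᵀ = M for a semantic-preserving permutation P, then permuting the input lines by P permutes the attention output by P.

```python
import torch
import torch.nn.functional as F


def spe_attention_sketch(q, k, v, sym_mask):
    """Scaled dot-product attention with an additive symmetry mask.

    Hypothetical sketch: we assume the directed layered graph has already
    been summarized into a (seq, seq) tensor `sym_mask` that biases the
    attention scores. The paper's actual mask construction may differ.

    q, k, v: (batch, heads, seq, dim)
    sym_mask: (seq, seq) additive bias over score pairs.
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5   # (batch, heads, seq, seq)
    scores = scores + sym_mask                    # inject the symmetry structure
    weights = F.softmax(scores, dim=-1)
    return weights @ v


# Toy equivariance check: an all-zero mask trivially satisfies
# P M P^T = M for any permutation P, so permuting the input lines
# must permute the output lines the same way.
torch.manual_seed(0)
b, h, s, d = 1, 2, 4, 8
q, k, v = (torch.randn(b, h, s, d) for _ in range(3))
mask = torch.zeros(s, s)
perm = torch.tensor([1, 0, 2, 3])  # hypothetical semantic-preserving swap of lines 0 and 1

out = spe_attention_sketch(q, k, v, mask)
out_perm = spe_attention_sketch(q[:, :, perm], k[:, :, perm], v[:, :, perm], mask)
assert torch.allclose(out[:, :, perm], out_perm, atol=1e-6)
```

An additive bias is one natural way to integrate such a mask because it leaves the softmax structure of attention intact; whether the paper uses an additive bias, a multiplicative gate, or another integration scheme is not stated in this abstract.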
Anthology ID:
2025.emnlp-main.332
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6566–6579
URL:
https://preview.aclanthology.org/ingest-luhme/2025.emnlp-main.332/
DOI:
10.18653/v1/2025.emnlp-main.332
Cite (ACL):
Chengyu Jiao, Shuhao Chen, and Yu Zhang. 2025. SPE Attention: Making Attention Equivariant to Semantic-Preserving Permutation for Code Processing. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 6566–6579, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
SPE Attention: Making Attention Equivariant to Semantic-Preserving Permutation for Code Processing (Jiao et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-luhme/2025.emnlp-main.332.pdf
Checklist:
2025.emnlp-main.332.checklist.pdf