2022
Addressing Leakage in Self-Supervised Contextualized Code Retrieval
Johannes Villmow | Viola Campos | Adrian Ulges | Ulrich Schwanecke
Proceedings of the 29th International Conference on Computational Linguistics

We address contextualized code retrieval, the search for code snippets that help fill gaps in a partial input program. Our approach enables large-scale self-supervised contrastive training by splitting source code randomly into contexts and targets. To combat leakage between the two, we propose a novel approach based on mutual identifier masking, dedentation, and the selection of syntax-aligned targets. Our second contribution is a new dataset for the direct evaluation of contextualized code retrieval, built from manually aligned subpassages of code clones. Our experiments demonstrate that the proposed approach improves retrieval substantially and yields new state-of-the-art results for code clone and defect detection.
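The leakage-reduction ideas in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the split point, the regex-based identifier extraction, the keyword list, and the `MASK` token are all simplifying assumptions (the paper selects syntax-aligned targets rather than splitting at an arbitrary line).

```python
import re
import textwrap

# Assumption: a naive line-based split; the paper instead picks
# syntax-aligned targets (e.g., whole statements or blocks).
def split_context_target(source: str, split_line: int):
    """Split source code into a context (before) and a target (after)."""
    lines = source.splitlines()
    return "\n".join(lines[:split_line]), "\n".join(lines[split_line:])

# Assumption: regex-based tokenization and a tiny keyword list,
# just enough to demonstrate the idea.
IDENT = re.compile(r"\b[A-Za-z_][A-Za-z0-9_]*\b")
KEYWORDS = {"def", "return", "for", "in", "if", "else", "while", "import"}

def mutual_identifier_mask(context: str, target: str, mask: str = "MASK"):
    """Replace identifiers that occur in BOTH halves, so the model cannot
    match context and target through trivially shared names."""
    shared = (set(IDENT.findall(context)) & set(IDENT.findall(target))) - KEYWORDS
    def sub(code):
        return IDENT.sub(lambda m: mask if m.group(0) in shared else m.group(0), code)
    return sub(context), sub(target)

def dedent_target(target: str) -> str:
    """Strip common leading indentation so the target's nesting depth
    does not leak its position inside the context."""
    return textwrap.dedent(target)

src = "def mean(xs):\n    total = sum(xs)\n    return total / len(xs)"
ctx, tgt = split_context_target(src, 2)
mctx, mtgt = mutual_identifier_mask(ctx, tgt)
print(dedent_target(mtgt))  # the shared names `total` and `xs` are masked
```

The key point is that masking is *mutual*: identifiers are replaced only when they appear on both sides of the split, which removes the shortcut signal while leaving side-specific names intact.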