@inproceedings{liang-etal-2025-towards,
    title = "Towards Infinite-Long Prefix in Transformer",
    author = "Liang, Yingyu  and
      Shi, Zhenmei  and
      Song, Zhao  and
      Yang, Chiwun",
    editor = "Christodoulopoulos, Christos  and
      Chakraborty, Tanmoy  and
      Rose, Carolyn  and
      Peng, Violet",
    booktitle = "Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.563/",
    pages = "11138--11202",
    ISBN = "979-8-89176-332-6",
    abstract = "Prompting and context-based fine-tuning methods, which we call Prefix Learning, have been proposed to enhance the performance of language models on various downstream tasks. They are empirically efficient and effective, matching the performance of full parameter fine-tuning, but their theoretical understanding is limited. In this paper, we aim to address this limitation by studying their ability from the perspective of prefix length. In particular, we provide a convergence guarantee for training an ultra-long prefix in a stylized setting using the Neural Tangent Kernel (NTK) framework. Based on this strong theoretical guarantee, we design and implement an algorithm that only needs to introduce and fine-tune a few extra trainable parameters, instead of an infinite-long prefix in each layer of a transformer, and can approximate the prefix attention to a guaranteed polynomially small error. Preliminary experimental results on vision, natural language, and math data show that our method achieves superior or competitive performance compared to existing methods like full parameter fine-tuning, P-Tuning V2, and LoRA. This demonstrates that our method is promising for parameter-efficient fine-tuning."
}