Abstract
Lifelong prompt tuning has significantly advanced parameter-efficient lifelong learning with its efficiency and minimal storage demands on various tasks. Our empirical studies, however, highlight certain transferability constraints in the current methodologies: a universal algorithm that guarantees consistent positive transfer across all tasks is currently unattainable, especially when dealing with dissimilar tasks that may engender negative transfer. Identifying the misalignment between algorithm selection and task specificity as the primary cause of negative transfer, we present the Similarity Heuristic Lifelong Prompt Tuning (SHLPT) framework. This innovative strategy partitions tasks into two distinct subsets by harnessing a learnable similarity metric, thereby facilitating fruitful transfer from tasks regardless of their similarity or dissimilarity. Additionally, SHLPT incorporates a parameter pool to combat catastrophic forgetting effectively. Our experiments show that SHLPT outperforms state-of-the-art techniques in lifelong learning benchmarks and demonstrates robustness against negative transfer in diverse task sequences.
- Anthology ID: 2024.findings-acl.650
- Volume: Findings of the Association for Computational Linguistics ACL 2024
- Month: August
- Year: 2024
- Address: Bangkok, Thailand and virtual meeting
- Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 10944–10959
- URL: https://aclanthology.org/2024.findings-acl.650
- DOI: 10.18653/v1/2024.findings-acl.650
- Cite (ACL): Chenyuan Wu, Gangwei Jiang, and Defu Lian. 2024. Mitigate Negative Transfer with Similarity Heuristic Lifelong Prompt Tuning. In Findings of the Association for Computational Linguistics ACL 2024, pages 10944–10959, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
- Cite (Informal): Mitigate Negative Transfer with Similarity Heuristic Lifelong Prompt Tuning (Wu et al., Findings 2024)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/2024.findings-acl.650.pdf
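As a rough illustration of the partitioning idea described in the abstract, the following Python sketch splits a pool of previously learned task prompts into similar and dissimilar subsets using a plain cosine-similarity heuristic. The prompt shapes, the fixed threshold, and the mean-initialization step are hypothetical stand-ins; SHLPT learns its similarity metric and transfer strategy, so this is not the authors' implementation.

```python
import torch
import torch.nn.functional as F

prompt_len, embed_dim = 20, 768
threshold = 0.5  # hypothetical fixed cut-off; SHLPT uses a learnable similarity metric

# Parameter pool of prompts from previously learned tasks (random stand-ins here).
prompt_pool = {f"task_{i}": torch.randn(prompt_len, embed_dim) for i in range(5)}
current_prompt = torch.randn(prompt_len, embed_dim, requires_grad=True)

def similarity(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    """Cosine similarity between two flattened prompts (a fixed heuristic)."""
    return F.cosine_similarity(p.flatten(), q.flatten(), dim=0)

# Score every stored prompt against the current task's prompt and split the pool.
scores = {name: similarity(current_prompt, p).item() for name, p in prompt_pool.items()}
similar = [name for name, s in scores.items() if s >= threshold]
dissimilar = [name for name, s in scores.items() if s < threshold]

# Similar prompts can seed the new prompt (positive transfer); dissimilar ones
# would be handled by a separate objective to avoid negative transfer.
if similar:
    init = torch.stack([prompt_pool[name] for name in similar]).mean(dim=0)
    with torch.no_grad():
        current_prompt.copy_(init)

print("similar:", similar)
print("dissimilar:", dissimilar)
```

Splitting the pool this way lets the two subsets be trained with different transfer objectives, which is the abstract's stated route to avoiding negative transfer on dissimilar task sequences.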