Lightweight Cross-Lingual Federated Prompt Tuning for Low-Resource Languages

Ubaid Azam, Imran Razzak, Shoaib Jameel


Abstract
Multilingual NLP faces challenges of data heterogeneity, privacy, and limited computational resources, especially for low-resource languages. Centralised methods risk privacy breaches, while federated learning struggles with communication overhead and poor cross-lingual generalisation. We propose FLiP (Federated Lightweight Prompt-tuning), a privacy-preserving, resource-efficient, and generalisable framework integrating prompt-based learning with federated optimisation. FLiP eliminates communication overhead, reduces trainable parameters to 16%, and cuts GPU memory use by 90%. Experiments show superior generalisation and efficiency under both IID and non-IID settings, establishing FLiP as a scalable, privacy-aware solution for multilingual NLP, particularly in low-resource and indigenous language contexts.
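To make the abstract's core idea concrete: in federated prompt tuning, each client optimises only a small soft-prompt parameter vector while the backbone model stays frozen, and the server aggregates just those prompt parameters (FedAvg-style). The sketch below is a minimal, hypothetical illustration; all function names and the toy gradients are assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of federated prompt tuning: only the small soft-prompt
# parameters are trained locally and averaged across clients; the frozen
# backbone model is never communicated, which is what keeps the trainable
# parameter count and communication cost small.

def local_update(prompt, gradient, lr=0.1):
    """One toy gradient step on the prompt parameters only."""
    return [w - lr * g for w, g in zip(prompt, gradient)]

def average_prompts(client_prompts):
    """Element-wise average of each client's soft-prompt vector (FedAvg)."""
    n = len(client_prompts)
    dim = len(client_prompts[0])
    return [sum(p[i] for p in client_prompts) / n for i in range(dim)]

# Two clients start from the same global prompt and take one local step each.
global_prompt = [0.0, 0.0]
client_a = local_update(global_prompt, gradient=[1.0, -1.0])
client_b = local_update(global_prompt, gradient=[-1.0, 1.0])

# Server round: aggregate only the prompts, not the backbone weights.
new_global = average_prompts([client_a, client_b])
print(new_global)  # [0.0, 0.0] — the opposing toy updates cancel out
```

In a real setting the prompt would be a matrix of prompt-token embeddings and the local step a full optimiser pass, but the aggregation logic is the same element-wise average over prompt parameters only.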
Anthology ID:
2026.lrec-main.260
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
Publisher:
ELRA Language Resources Association
Pages:
3304–3316
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.260/
Cite (ACL):
Ubaid Azam, Imran Razzak, and Shoaib Jameel. 2026. Lightweight Cross-Lingual Federated Prompt Tuning for Low-Resource Languages. In Proceedings of the Fifteenth Language Resources and Evaluation Conference, pages 3304–3316, Palma de Mallorca, Spain.
Cite (Informal):
Lightweight Cross-Lingual Federated Prompt Tuning for Low-Resource Languages (Azam et al., LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.260.pdf