@inproceedings{silva-etal-2023-fedperc,
    title = "{F}ed{P}er{C}: Federated Learning for Language Generation with Personal and Context Preference Embeddings",
    author = "Silva, Andrew  and
      Tambwekar, Pradyumna  and
      Gombolay, Matthew",
    editor = "Vlachos, Andreas  and
      Augenstein, Isabelle",
    booktitle = "Findings of the Association for Computational Linguistics: EACL 2023",
    month = may,
    year = "2023",
    address = "Dubrovnik, Croatia",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2023.findings-eacl.64/",
    doi = "10.18653/v1/2023.findings-eacl.64",
    pages = "869--882",
    abstract = "Federated learning is a training paradigm that learns from multiple distributed users without aggregating data on a centralized server, promising the ability to deploy machine learning to a diverse population of users without first collecting large, labeled datasets. As federated learning involves averaging gradient updates across a decentralized population, there is a growing need for personalization of federated learning systems (i.e., conversational agents must personalize to individual users and the context of an interaction). In this work, we propose a new direction for personalization research within federated learning, leveraging both personal embeddings and shared context embeddings. We also present an approach to predict these ``preference'' embeddings, enabling personalization without backpropagation. Compared to state-of-the-art personalization baselines, our approach achieves a 50{\%} improvement in test-time perplexity using 0.001{\%} of the memory required by baseline approaches, while achieving greater sample- and compute-efficiency."
}