Matthias Schubert
2022
Federated Continual Learning for Text Classification via Selective Inter-client Transfer
Yatin Chaudhary | Pranav Rai | Matthias Schubert | Hinrich Schütze | Pankaj Gupta
Findings of the Association for Computational Linguistics: EMNLP 2022
In this work, we combine two paradigms, Federated Learning (FL) and Continual Learning (CL), for the text classification task in the cloud-edge continuum. The objective of Federated Continual Learning (FCL) is to improve deep learning models over their lifetime at each client through (relevant and efficient) knowledge transfer without sharing data. Here, we address the challenge of minimizing inter-client interference during knowledge sharing, which arises from heterogeneous tasks across clients in the FCL setup. To this end, we propose a novel framework, Federated Selective Inter-client Transfer (FedSeIT), which selectively combines model parameters of foreign clients. To further maximize knowledge transfer, we assess domain overlap and select informative tasks from the sequence of historical tasks at each foreign client while preserving privacy. Evaluating against baselines, we show improved performance: an average gain of 12.4% in text classification over a sequence of tasks, using five datasets from diverse domains. To the best of our knowledge, this is the first work to apply FCL to NLP.
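The selective transfer idea in the abstract can be illustrated with a minimal sketch: rank foreign clients' historical task representations by a domain-overlap proxy (here, cosine similarity) and blend the selected parameters with the local ones. All function names and the simple averaging scheme below are illustrative assumptions, not the paper's actual FedSeIT implementation.

```python
import numpy as np

def select_foreign_tasks(local_emb, foreign_embs, top_k=2):
    # Rank foreign clients' task embeddings by cosine similarity to the
    # local task embedding (a simple proxy for domain overlap), and keep
    # the top_k most relevant ones. Illustrative only.
    sims = []
    for cid, emb in foreign_embs.items():
        cos = emb @ local_emb / (np.linalg.norm(emb) * np.linalg.norm(local_emb))
        sims.append((cos, cid))
    sims.sort(reverse=True)
    return [cid for _, cid in sims[:top_k]]

def combine_parameters(local_params, selected_params, alpha=0.5):
    # Blend local parameters with the mean of the selected foreign
    # clients' parameters; FedSeIT's actual combination is more involved.
    foreign_mean = np.mean(selected_params, axis=0)
    return alpha * local_params + (1 - alpha) * foreign_mean
```

Only embeddings and parameters cross client boundaries in this sketch, never raw text, which mirrors the privacy constraint stated in the abstract.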
TempCaps: A Capsule Network-based Embedding Model for Temporal Knowledge Graph Completion
Guirong Fu | Zhao Meng | Zhen Han | Zifeng Ding | Yunpu Ma | Matthias Schubert | Volker Tresp | Roger Wattenhofer
Proceedings of the Sixth Workshop on Structured Prediction for NLP
Temporal knowledge graphs store the dynamics of entities and relations over a time period. However, in real-world scenarios, temporal knowledge graphs often suffer from incomplete dynamics, i.e., missing facts. Hence, modeling temporal knowledge graphs to complete the missing facts is important. In this paper, we tackle the temporal knowledge graph completion task by proposing TempCaps, a Capsule network-based embedding model for Temporal knowledge graph completion. TempCaps models temporal knowledge graphs by introducing a novel dynamic routing aggregator inspired by Capsule Networks. Specifically, TempCaps builds entity embeddings by dynamically routing retrieved temporal relation and neighbor information. Experimental results demonstrate that TempCaps reaches state-of-the-art performance for temporal knowledge graph completion. Additional analysis also shows that TempCaps is efficient.
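The dynamic routing aggregator mentioned in the abstract follows the routing-by-agreement pattern from Capsule Networks: candidate message vectors retrieved from temporal neighbors and relations are iteratively reweighted by how well they agree with the emerging entity embedding. The sketch below is a generic routing-by-agreement loop under assumed names, not TempCaps' actual aggregator.

```python
import numpy as np

def squash(v, eps=1e-8):
    # Capsule-style squashing: keeps direction, bounds the norm below 1.
    n2 = np.sum(v * v)
    return (n2 / (1.0 + n2)) * v / (np.sqrt(n2) + eps)

def dynamic_route(messages, n_iters=3):
    # messages: (N, d) candidate vectors retrieved from temporal
    # neighbors/relations. Routing coefficients start uniform and are
    # sharpened toward messages that agree with the aggregated output.
    b = np.zeros(len(messages))
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum()        # softmax routing weights
        s = (c[:, None] * messages).sum(axis=0)
        v = squash(s)
        b = b + messages @ v                   # agreement update
    return v
```

After a few iterations, messages pointing in the majority direction dominate the aggregated entity embedding, which is the intuition behind routing noisy neighbor information.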