FedGen: Generalizable Federated Learning for Sequential Data
Praveen Venkateswaran, Vatche Isahagian, et al.
CLOUD 2023
Federated Learning (FL) enables collaborative model training across decentralized clients without sharing raw data, yet faces significant challenges in real-world settings where client data distributions evolve over time. This paper tackles the problem of covariate and label shifts in streaming FL environments, where non-stationary data distributions degrade model performance and require adaptive middleware solutions. We introduce ShiftEx, a shift-aware mixture-of-experts framework that dynamically creates and trains specialized global models in response to detected distribution shifts, using Maximum Mean Discrepancy (MMD) to detect covariate shifts. The framework employs a latent memory mechanism for expert reuse and a facility-location-based optimization that jointly minimizes covariate mismatch, expert creation costs, and label imbalance. Through theoretical analysis and comprehensive experiments on benchmark datasets, we demonstrate 5.5-12.9 percentage-point accuracy improvements and 22-95% faster adaptation compared to state-of-the-art FL baselines across diverse shift scenarios. The proposed approach offers a scalable, privacy-preserving middleware solution for FL systems operating in non-stationary, real-world conditions while minimizing communication and computational overhead.
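The abstract mentions using Maximum Mean Discrepancy (MMD) to detect covariate shifts between data windows. As a rough illustration only (not the paper's implementation), the following sketch computes a biased squared-MMD estimate with an RBF kernel between a reference window and an incoming window; the kernel bandwidth `gamma` and window sizes are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, gamma=0.1):
    # Pairwise RBF kernel matrix between the rows of a and b.
    sq_dists = (np.sum(a**2, axis=1)[:, None]
                + np.sum(b**2, axis=1)[None, :]
                - 2.0 * a @ b.T)
    return np.exp(-gamma * sq_dists)

def mmd2(x, y, gamma=0.1):
    # Biased estimate of squared MMD between samples x and y:
    # mean k(x,x') + mean k(y,y') - 2 mean k(x,y).
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, size=(200, 5))       # reference window
same = rng.normal(0.0, 1.0, size=(200, 5))      # same distribution
shifted = rng.normal(2.0, 1.0, size=(200, 5))   # mean-shifted window

# A covariate shift produces a much larger MMD than sampling noise,
# so thresholding mmd2 against the reference flags the shift.
score_same = mmd2(ref, same)
score_shift = mmd2(ref, shifted)
```

In a shift-aware system like the one described, such a score would be compared against a threshold to decide whether to route clients to an existing expert or create a new one.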