FLIPS: Federated Learning using Intelligent Participant Selection
Reading group: Brian Jian Wei Ooi presented "FLIPS: Federated Learning using Intelligent Participant Selection" (Middleware'23) in room 1A312 on 12/1/2024 at 10:25.
Abstract
This paper presents the design and implementation of FLIPS, a middleware system to manage data and participant heterogeneity in federated learning (FL) training workloads. In particular, we examine the benefits of label distribution clustering on participant selection in federated learning. FLIPS clusters parties involved in an FL training job based on the label distribution of their data a priori, and during FL training, ensures that each cluster is equitably represented in the participants selected. FLIPS can support the most common FL algorithms, including FedAvg, FedProx, FedDyn, FedOpt and FedYogi. To manage platform heterogeneity and dynamic resource availability, FLIPS incorporates a straggler management mechanism to handle changing capacities in distributed, smart community applications. Privacy of label distributions, clustering and participant selection is ensured through a trusted execution environment (TEE). Our comprehensive empirical evaluation compares FLIPS with random participant selection, as well as three other “smart” selection mechanisms – Oort [51], TiFL [15] and gradient clustering [27] – using four real-world datasets, two different non-IID distributions and three common FL algorithms (FedYogi, FedProx and FedAvg). We demonstrate that FLIPS significantly improves convergence, achieving higher accuracy by 17-20 percentage points with 20-60% lower communication costs, and these benefits endure in the presence of straggler participants.
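To make the core idea concrete, here is a minimal sketch of label-distribution-based participant selection. It assumes k-means over per-party label histograms and an equal per-cluster quota; the function names, the clustering choice, and the synthetic Dirichlet data are illustrative assumptions, not the actual FLIPS implementation (which also covers straggler handling and TEE-protected privacy).

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_by_label_distribution(label_histograms, n_clusters):
    """Group parties by the label distribution of their local data.

    label_histograms: (n_parties, n_classes) array where row i is the
    normalized class-label histogram of party i's data.
    """
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    return km.fit_predict(label_histograms)

def select_participants(cluster_ids, budget, rng=None):
    """Pick up to `budget` parties for one round, sampling equitably per cluster."""
    rng = rng or np.random.default_rng()
    clusters = np.unique(cluster_ids)
    per_cluster = max(1, budget // len(clusters))
    selected = []
    for c in clusters:
        members = np.flatnonzero(cluster_ids == c)
        k = min(per_cluster, len(members))
        selected.extend(rng.choice(members, size=k, replace=False))
    return selected[:budget]

# Hypothetical usage: 100 parties, 10 classes, 20 participants per round.
histograms = np.random.dirichlet(alpha=np.ones(10) * 0.5, size=100)
cluster_ids = cluster_by_label_distribution(histograms, n_clusters=5)
round_participants = select_participants(cluster_ids, budget=20)
```

In this sketch the round's participants always span every label-distribution cluster, which is the intuition behind FLIPS's improved convergence under non-IID data; random selection offers no such guarantee.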