Paper Title
DELTA: Diverse Client Sampling for Fasting Federated Learning
Paper Authors
Paper Abstract
Partial client participation has been widely adopted in Federated Learning (FL) to efficiently reduce the communication burden. However, an inadequate client sampling scheme can lead to the selection of unrepresentative subsets, resulting in significant variance in model updates and slowed convergence. Existing sampling methods are either biased or can be further optimized for faster convergence. In this paper, we present DELTA, an unbiased sampling scheme designed to alleviate these issues. DELTA characterizes the effects of client diversity and local variance, and samples representative clients that carry valuable information for global model updates. In addition, DELTA is a provably optimal unbiased sampling scheme that minimizes the variance caused by partial client participation and outperforms other unbiased sampling schemes in terms of convergence. Furthermore, to address the dependence on full-client gradients, we provide a practical version of DELTA that relies only on the information of available clients, and we also analyze its convergence. Our results are validated through experiments on both synthetic and real-world datasets.
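To make the notion of unbiased client sampling concrete, the following is a minimal sketch (not the paper's actual DELTA algorithm) of importance-sampling-based client selection: each client is drawn with probability proportional to a per-client score (in DELTA, the score combines gradient diversity and local variance; here it is an arbitrary input), and sampled updates are reweighted by the inverse sampling probability so that the aggregate equals the full-participation average in expectation. The function names and the NumPy-based setup are illustrative assumptions.

```python
import numpy as np

def sample_clients(scores, num_sampled, rng):
    """Draw clients with probability proportional to their scores.

    Returns the chosen client indices and the per-client sampling
    probabilities, which are needed later to reweight for unbiasedness.
    """
    scores = np.asarray(scores, dtype=float)
    probs = scores / scores.sum()
    chosen = rng.choice(len(scores), size=num_sampled, replace=True, p=probs)
    return chosen, probs

def aggregate_unbiased(updates, chosen, probs):
    """Importance-weighted aggregation of the sampled client updates.

    Estimator: (1/k) * sum_j updates[i_j] / (n * probs[i_j]).
    Taking the expectation over the sampling distribution recovers the
    full-participation average (1/n) * sum_i updates[i], i.e. the
    estimator is unbiased regardless of the chosen probabilities.
    """
    n, k = len(probs), len(chosen)
    est = np.zeros_like(np.asarray(updates[0], dtype=float))
    for i in chosen:
        est += np.asarray(updates[i], dtype=float) / (n * k * probs[i])
    return est
```

The variance of this estimator (but not its mean) depends on how the scores are chosen, which is exactly the degree of freedom an optimal sampling scheme like DELTA exploits.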