Paper Title


FedSynth: Gradient Compression via Synthetic Data in Federated Learning

Authors

Shengyuan Hu, Jack Goetz, Kshitiz Malik, Hongyuan Zhan, Zhe Liu, Yue Liu

Abstract


Model compression is important in federated learning (FL) with large models to reduce communication cost. Prior works have focused on sparsification-based compression, which could drastically affect the global model accuracy. In this work, we propose a new scheme for upstream communication where, instead of transmitting the model update, each client learns and transmits a lightweight synthetic dataset such that, when used as the training data, it makes the model perform similarly well on the real training data. The server recovers the local model update via the synthetic data and applies standard aggregation. We then provide a new algorithm, FedSynth, to learn the synthetic data locally. Empirically, we find our method is comparable to or better than random masking baselines on all three common federated learning benchmark datasets.
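
A minimal sketch of the server-side step described in the abstract, assuming a plain NumPy linear model: the server replays a few gradient steps on each client's uploaded synthetic dataset to recover that client's update, then applies standard FedAvg-style averaging. The model, loss, learning rate, step count, and all function and variable names below are illustrative assumptions, not the paper's implementation; the client-side procedure that learns the synthetic data (the FedSynth algorithm itself) is not shown.

```python
import numpy as np

def recover_update(global_w, syn_X, syn_y, lr=0.1, local_steps=5):
    """Recover a client's model update by training on its synthetic data.

    Hypothetical recovery routine: a few full-batch gradient steps on a
    linear model with MSE loss, starting from the current global weights.
    """
    w = global_w.copy()
    for _ in range(local_steps):
        grad = 2.0 * syn_X.T @ (syn_X @ w - syn_y) / len(syn_y)  # MSE gradient
        w -= lr * grad
    return w - global_w  # the recovered local update (delta)

def aggregate(global_w, updates):
    """Standard FedAvg-style averaging of the recovered local updates."""
    return global_w + np.mean(updates, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 10
    global_w = np.zeros(d)
    # Each client uploads only a handful of synthetic examples instead of a
    # full model update; two tiny fake synthetic datasets stand in here.
    clients = [(rng.normal(size=(4, d)), rng.normal(size=4)) for _ in range(2)]
    updates = [recover_update(global_w, X, y) for X, y in clients]
    global_w = aggregate(global_w, updates)
    print("updated global weights:", global_w)
```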
