Paper Title

Federated Learning with Server Learning: Enhancing Performance for Non-IID Data

Authors

Van Sy Mai, Richard J. La, Tao Zhang

Abstract

Federated Learning (FL) has emerged as a means of distributed learning using local data stored at clients with a coordinating server. Recent studies showed that FL can suffer from poor performance and slower convergence when training data at clients are not independent and identically distributed. Here we consider a new complementary approach to mitigating this performance degradation by allowing the server to perform auxiliary learning from a small dataset. Our analysis and experiments show that this new approach can achieve significant improvements in both model accuracy and convergence time even when the server dataset is small and its distribution differs from that of the aggregated data from all clients.
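
Note: the abstract does not spell out the authors' exact server-learning procedure. The sketch below is only an illustrative assumption of the general idea it describes: a standard FedAvg round followed by a few auxiliary gradient steps on a small dataset held by the server, whose distribution differs from the clients' aggregate data. The logistic-regression model, the synthetic non-IID data, and all learning rates, step counts, and names are hypothetical choices, not taken from the paper.

```python
# Minimal sketch (not the authors' algorithm): FedAvg plus an auxiliary
# server-side update on a small server dataset. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, shift):
    """Synthetic data; each client uses a different feature shift (non-IID)."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    return X, y

def grad(w, X, y):
    """Gradient of the logistic loss at weights w."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def local_update(w, X, y, lr=0.1, steps=20):
    """Client-side gradient steps starting from the current global model."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * grad(w, X, y)
    return w

# Three non-IID clients plus a small auxiliary server dataset whose
# distribution differs from the clients' aggregated data.
clients = [make_data(200, shift=s) for s in (-2.0, 0.0, 2.0)]
X_server, y_server = make_data(30, shift=1.0)

w = np.zeros(2)
for _ in range(50):
    # Standard FedAvg step: average the locally updated models.
    local_models = [local_update(w, X, y) for X, y in clients]
    w = np.mean(local_models, axis=0)
    # Complementary server-learning step: a few gradient steps on the
    # small server dataset after aggregation.
    for _ in range(5):
        w -= 0.05 * grad(w, X_server, y_server)

print("final global model:", w)
```

In this sketch the server update is layered on top of, rather than replacing, the usual client aggregation, matching the "complementary approach" framing in the abstract.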
