Paper Title
Sparse Federated Learning with Hierarchical Personalized Models
Paper Authors
Paper Abstract
Federated learning (FL) enables privacy-preserving and reliable collaborative training without collecting users' private data. This privacy-security potential promotes a wide range of FL applications in the Internet of Things (IoT), wireless networks, mobile devices, autonomous vehicles, and cloud-based medical treatment. However, FL methods suffer from poor model performance on non-i.i.d. data and excessive communication traffic. To this end, we propose a personalized FL algorithm that uses a hierarchical proximal mapping based on the Moreau envelope, named sparse federated learning with hierarchical personalized models (sFedHP), which significantly improves global model performance when facing diverse client data. A continuously differentiable approximation of the L1-norm is also used as a sparse constraint to reduce the communication cost. Convergence analysis shows that sFedHP achieves a state-of-the-art convergence rate with linear speedup, and that the sparse constraint reduces the convergence rate only slightly while significantly reducing the communication cost. Experimentally, we demonstrate the benefits of sFedHP compared with FedAvg, HierFAVG (hierarchical FedAvg), and personalized FL methods based on local customization, including FedAMP, FedProx, Per-FedAvg, pFedMe, and pFedGP.
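For context, a minimal sketch of the two ingredients named in the abstract, written in the standard form used by Moreau-envelope-based personalized FL (e.g., pFedMe); the concrete hierarchical objective and the particular smooth L1 approximation used by sFedHP are assumptions here and may differ from the paper's definitions:

$$
F_i(w) \;=\; \min_{\theta_i} \Big\{ f_i(\theta_i) + \tfrac{\lambda}{2}\,\lVert \theta_i - w \rVert^2 \Big\},
\qquad
\lVert w \rVert_{1,\epsilon} \;=\; \sum_{j} \sqrt{w_j^{2} + \epsilon} \;\approx\; \lVert w \rVert_1 ,
$$

where $f_i$ is client $i$'s local loss, $\theta_i$ its personalized model, $w$ the shared (global) model, $\lambda > 0$ controls the strength of the proximal (Moreau-envelope) coupling, and $\epsilon > 0$ is a small smoothing constant that makes the L1 surrogate continuously differentiable, so it can serve as a differentiable sparsity penalty on the communicated model.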