Paper Title

Resource-Aware Heterogeneous Federated Learning using Neural Architecture Search

Authors

Sixing Yu, J. Pablo Muñoz, Ali Jannesari

Abstract

Federated Learning (FL) is extensively used to train AI/ML models in distributed and privacy-preserving settings. Participant edge devices in FL systems typically contain non-independent and identically distributed (Non-IID) private data and unevenly distributed computational resources. Preserving user data privacy while optimizing AI/ML models in a heterogeneous federated network requires us to address data and system/resource heterogeneity. To address these challenges, we propose Resource-aware Federated Learning (RaFL). RaFL allocates resource-aware specialized models to edge devices using Neural Architecture Search (NAS) and allows heterogeneous model architecture deployment by knowledge extraction and fusion. Combining NAS and FL enables on-demand customized model deployment for resource-diverse edge devices. Furthermore, we propose a multi-model architecture fusion scheme allowing the aggregation of the distributed learning results. Results demonstrate RaFL's superior resource efficiency compared to SoTA.
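The abstract describes allocating resource-aware specialized models to edge devices: each device gets the most capable architecture its compute budget allows. Below is a minimal sketch of that allocation step only; it is not the authors' code. In RaFL the candidate architectures would come from a NAS search space, whereas here the candidate pool, its names, and the FLOPs figures are hard-coded, illustrative assumptions.

```python
# Hypothetical candidate pool standing in for a NAS search space.
# (name, cost in MFLOPs) -- values are illustrative, not from the paper.
CANDIDATES = [
    ("tiny", 50),
    ("small", 200),
    ("base", 600),
]

def allocate(budget_mflops):
    """Return the most capable candidate architecture within a device's budget."""
    feasible = [c for c in CANDIDATES if c[1] <= budget_mflops]
    if not feasible:
        # Budget below every candidate: fall back to the smallest model.
        return CANDIDATES[0][0]
    return max(feasible, key=lambda c: c[1])[0]

# A device reporting a 250 MFLOPs budget receives the "small" model.
print(allocate(250))   # -> small
```

After local training, the heterogeneous architectures cannot be averaged parameter-wise; this is why the paper proposes aggregating them through knowledge extraction and a multi-model fusion scheme instead of standard FedAvg.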
