Paper Title

Service Delay Minimization for Federated Learning over Mobile Devices

Paper Authors

Rui Chen, Dian Shi, Xiaoqi Qin, Dongjie Liu, Miao Pan, Shuguang Cui

Paper Abstract

Federated learning (FL) over mobile devices has fostered numerous intriguing applications/services, many of which are delay-sensitive. In this paper, we propose a service-delay-efficient FL (SDEFL) scheme over mobile devices. Unlike traditional communication-efficient FL, which regards wireless communications as the bottleneck, we find that in many situations the local computing delay is comparable to the communication delay during the FL training process, given the development of high-speed wireless transmission techniques. Thus, the service delay in FL should be the computing delay plus the communication delay accumulated over training rounds. To minimize the service delay of FL, simply reducing the local computing/communication delay independently is not enough; the delay trade-off between local computing and wireless communications must be considered. Besides, we empirically study the impacts of local computing control and compression strategies (i.e., the number of local updates, weight quantization, and gradient quantization) on computing, communication, and service delays. Based on those trade-off observations and empirical studies, we develop an optimization scheme to minimize the service delay of FL over heterogeneous devices. We establish testbeds and conduct extensive emulations/experiments to verify our theoretical analysis. The results show that SDEFL notably reduces service delay with a small accuracy drop compared to peer designs.
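The abstract's delay model (service delay = local computing delay + communication delay, summed over training rounds, with local-update count and quantization as the knobs) can be sketched as follows. This is a minimal illustrative model, not the paper's actual formulation: all function names, parameter values, and the round counts below are assumptions chosen only to show the trade-off.

```python
def round_delay(num_local_updates, flops_per_update, device_flops,
                model_bits, quant_bits, bandwidth_bps):
    """One FL round's delay (seconds) for a single device.

    Illustrative model: computing delay scales with the number of local
    updates; communication delay scales with the quantized payload size.
    """
    computing = num_local_updates * flops_per_update / device_flops
    # Weight/gradient quantization shrinks the per-round payload
    # relative to full 32-bit precision.
    communication = model_bits * (quant_bits / 32) / bandwidth_bps
    return computing + communication


def service_delay(rounds, **round_kwargs):
    """Total service delay: per-round delays accumulated over all rounds."""
    return sum(round_delay(**round_kwargs) for _ in range(rounds))


if __name__ == "__main__":
    # Hypothetical device/model parameters (not from the paper).
    base = dict(flops_per_update=2e9, device_flops=1e12,
                model_bits=6.4e8, bandwidth_bps=100e6)
    # More local updates raise per-round computing delay but typically
    # cut the number of rounds needed; quantization cuts communication
    # delay -- the joint trade-off SDEFL optimizes over.
    few = service_delay(200, num_local_updates=1, quant_bits=32, **base)
    many = service_delay(120, num_local_updates=5, quant_bits=8, **base)
    print(few, many)
```

Under these made-up numbers, the configuration with more local updates and 8-bit quantization finishes far sooner despite its higher per-round computing delay, which is the kind of joint computing/communication trade-off the scheme targets.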
