Title
FedHAP: Fast Federated Learning for LEO Constellations Using Collaborative HAPs
Authors
Abstract
Low Earth Orbit (LEO) satellite constellations have seen a surge in deployment over the past few years by virtue of their ability to provide broadband Internet access as well as to collect vast amounts of Earth observational data that can be utilized to develop AI on a global scale. As traditional machine learning (ML) approaches that train a model by downloading satellite data to a ground station (GS) are not practical, Federated Learning (FL) offers a potential solution. However, existing FL approaches cannot be readily applied because of their excessively prolonged training time caused by the challenging satellite-GS communication environment. This paper proposes FedHAP, which introduces high-altitude platforms (HAPs) as distributed parameter servers (PSs) into FL for Satcom (or more concretely LEO constellations), to achieve fast and efficient model training. FedHAP consists of three components: 1) a hierarchical communication architecture, 2) a model dissemination algorithm, and 3) a model aggregation algorithm. Our extensive simulations demonstrate that FedHAP significantly accelerates FL model convergence as compared to state-of-the-art baselines, cutting the training time from several days down to a few hours, yet achieving higher accuracy.
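The abstract describes HAPs acting as distributed parameter servers that aggregate models from the satellites they can see, with the HAP-level models then merged globally. A minimal sketch of such two-level (hierarchical) weighted averaging is shown below; all function names, the per-satellite sample counts, and the HAP-to-satellite assignment are illustrative assumptions, not the paper's actual aggregation algorithm.

```python
# Illustrative two-level aggregation sketch, loosely following the idea of
# HAPs as distributed parameter servers. Names and structure are hypothetical,
# not taken from the FedHAP paper.

def fedavg(models, weights):
    """Weighted average of model parameter vectors (classic FedAvg)."""
    total = sum(weights)
    dim = len(models[0])
    return [sum(w * m[i] for m, w in zip(models, weights)) / total
            for i in range(dim)]

def hierarchical_round(satellite_models, hap_assignment, sample_counts):
    """One round: each HAP aggregates the models of the satellites visible
    to it, then the HAP-level partial models are merged into one global model."""
    hap_models, hap_weights = [], []
    for sat_ids in hap_assignment:                 # satellites visible to one HAP
        models = [satellite_models[i] for i in sat_ids]
        counts = [sample_counts[i] for i in sat_ids]
        hap_models.append(fedavg(models, counts))  # partial aggregation at the HAP
        hap_weights.append(sum(counts))            # HAP weight = total local samples
    return fedavg(hap_models, hap_weights)         # global merge across HAPs

# Example: three satellites split across two HAPs.
sats = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
global_model = hierarchical_round(sats, [[0, 1], [2]], [1, 1, 2])
```

Because the HAP weights are the sums of their satellites' sample counts, this two-level average reproduces the flat weighted average over all satellites, so the hierarchy changes where aggregation happens (cutting satellite-to-ground communication) rather than the resulting model.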