Paper Title

Modular Federated Learning

Paper Authors

Kuo-Yun Liang, Abhishek Srinivasan, Juan Carlos Andresen

Abstract

Federated learning is an approach to training machine learning models at the edge of the network, as close as possible to where the data is produced. It is motivated by the emerging problem that the large amounts of data produced by edge devices cannot be streamed and stored centrally, as well as by data privacy concerns. This learning paradigm needs algorithms that are robust to device heterogeneity and data heterogeneity. This paper proposes ModFL, a federated learning framework that splits models into a configuration module and an operation module, enabling federated learning of the individual modules. This modular approach makes it possible to extract knowledge from a group of heterogeneous devices as well as from the non-IID data produced by their users. The approach can be viewed as an extension of FedPer, the federated learning with personalisation layers framework, which addresses data heterogeneity. We show that ModFL outperforms FedPer on non-IID data partitions of CIFAR-10 and STL-10 using CNNs. Our results on time-series data with the HAPT, RWHAR, and WISDM datasets using RNNs remain inconclusive; we argue that the chosen datasets do not highlight the advantages of ModFL, but that in the worst case it performs as well as FedPer.
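
The abstract's core architectural idea, splitting each client model into a configuration module and an operation module and federating the two parts separately, can be sketched as follows. This is a minimal illustration assuming a PyTorch-style setup; the module layouts, the grouping of clients by device type for the configuration module, and all names (ConfigurationModule, OperationModule, fedavg) are assumptions made for illustration, not the authors' implementation.

import copy
from collections import defaultdict

import torch
import torch.nn as nn


class ConfigurationModule(nn.Module):
    """Hypothetical device-specific front end (e.g. adapts a device's input format)."""
    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, x):
        return self.features(x)


class OperationModule(nn.Module):
    """Hypothetical shared back end that performs the classification."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, x):
        return self.head(x)


def fedavg(state_dicts):
    """Plain FedAvg: parameter-wise mean over a list of client state dicts."""
    avg = copy.deepcopy(state_dicts[0])
    for key in avg:
        avg[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return avg


# One aggregation round: operation modules are averaged across all clients,
# while configuration modules are averaged only within groups of clients that
# share a device type (one assumed way to handle device heterogeneity).
clients = [
    {"device_type": "cam_a", "conf": ConfigurationModule(), "op": OperationModule()},
    {"device_type": "cam_a", "conf": ConfigurationModule(), "op": OperationModule()},
    {"device_type": "cam_b", "conf": ConfigurationModule(), "op": OperationModule()},
]

global_op = fedavg([c["op"].state_dict() for c in clients])

groups = defaultdict(list)
for c in clients:
    groups[c["device_type"]].append(c["conf"].state_dict())
group_conf = {dev: fedavg(sds) for dev, sds in groups.items()}

# Broadcast the aggregated modules back to every client.
for c in clients:
    c["op"].load_state_dict(global_op)
    c["conf"].load_state_dict(group_conf[c["device_type"]])

Under these assumptions the sketch reduces to FedPer-like behaviour when every client has its own group (the configuration module then stays effectively local, like FedPer's personalisation layers), which is consistent with the abstract's framing of ModFL as an extension of FedPer.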
