Paper Title
FedSiam-DA: Dual-aggregated Federated Learning via Siamese Network under Non-IID Data
Authors
Abstract
Federated learning is a distributed learning paradigm that allows each client to keep its raw data locally and upload only the parameters of the local model to the server. Although federated learning can address the data island problem, training with heterogeneous data in real applications remains challenging. In this paper, we propose FedSiam-DA, a novel dual-aggregated contrastive federated learning approach that personalizes both the local and global models under various settings of data heterogeneity. First, based on the idea of contrastive learning in the Siamese network, FedSiam-DA regards the local and global models as different branches of a Siamese network during local training and controls the update direction of the model by continually changing model similarity, thereby personalizing the local model. Second, FedSiam-DA introduces dynamic weights based on the model similarity of each local model and employs a dual-aggregation mechanism to further improve the generalization of the global model. Moreover, we conduct extensive experiments on benchmark datasets; the results demonstrate that FedSiam-DA outperforms several previous federated learning approaches on heterogeneous datasets.
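To make the second idea concrete, the similarity-based dynamic weighting described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes models are represented as dictionaries of NumPy parameter arrays, measures similarity as the cosine between flattened parameter vectors, and normalizes the weights with a softmax; the actual similarity measure and normalization in FedSiam-DA may differ.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two flat parameter vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def dual_aggregate(local_states, global_state):
    """Aggregate local models into a new global model, weighting each
    local model by its similarity to the current global model.

    local_states : list of dicts mapping parameter name -> np.ndarray
    global_state : dict with the same keys as each local state
    (Illustrative sketch; cosine similarity and softmax weights are
    assumptions, not necessarily the paper's exact scheme.)
    """
    def flatten(state):
        # Concatenate all parameter tensors into one flat vector.
        return np.concatenate([v.ravel() for v in state.values()])

    g_vec = flatten(global_state)
    sims = np.array([cosine_similarity(flatten(s), g_vec) for s in local_states])
    # Softmax-normalized dynamic weights: more similar models contribute more.
    weights = np.exp(sims) / np.exp(sims).sum()
    # Weighted average of each parameter across clients.
    return {k: sum(w * s[k] for w, s in zip(weights, local_states))
            for k in global_state}
```

A client whose parameters point in roughly the same direction as the global model receives a larger softmax weight, so divergent local updates (common under non-IID data) are down-weighted rather than averaged in uniformly.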