Paper Title
MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare
Paper Authors
Paper Abstract
Federated learning has attracted increasing attention as a way to build models without accessing raw user data, especially in healthcare. In real applications, however, different federations can seldom work together, for reasons such as data heterogeneity and distrust or absence of a central server. In this paper, we propose a novel framework called MetaFed to facilitate trustworthy FL between different federations. MetaFed obtains a personalized model for each federation without a central server via the proposed Cyclic Knowledge Distillation. Specifically, MetaFed treats each federation as a meta distribution and aggregates the knowledge of each federation in a cyclic manner. Training is split into two phases: common knowledge accumulation and personalization. Comprehensive experiments on three benchmarks demonstrate that MetaFed, without a server, achieves better accuracy than state-of-the-art methods (e.g., a 10%+ accuracy improvement over the baseline on PAMAP2) at lower communication cost.
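The two-phase procedure the abstract describes (cyclic knowledge accumulation followed by local personalization) can be illustrated with a toy sketch. This is a minimal assumption-laden model, not the paper's implementation: each federation is reduced to a single scalar parameter fit to its local data mean, and "distillation" is a regularizer pulling each model toward its predecessor in the ring; the function name, loss weights, and step counts are all illustrative.

```python
def cyclic_kd(local_means, rounds=10, lr=0.5, lam=0.3):
    """Toy sketch of MetaFed-style cyclic knowledge distillation.

    local_means: one scalar "dataset" per federation.
    lam weights the distillation pull toward the previous federation's
    model; dropping it in the final phase recovers local fine-tuning.
    """
    n = len(local_means)
    models = [0.0] * n

    # Phase 1: common knowledge accumulation.
    # Knowledge flows around the ring: each federation trains on its own
    # data while being distilled toward its predecessor's model.
    for _ in range(rounds):
        for i in range(n):
            teacher = models[(i - 1) % n]  # predecessor in the ring
            for _ in range(20):  # local gradient steps
                grad = (models[i] - local_means[i]) + lam * (models[i] - teacher)
                models[i] -= lr * grad

    # Phase 2: personalization.
    # Each federation fine-tunes on local data only (no distillation term),
    # yielding a personalized model per federation.
    for i in range(n):
        for _ in range(50):
            models[i] -= lr * (models[i] - local_means[i])
    return models
```

In this toy setting the personalization phase drives each model back to its local optimum, while phase 1 determines the shared starting point; in the actual method, the distillation operates on model outputs rather than on parameters directly.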