Paper Title


One Model to Unite Them All: Personalized Federated Learning of Multi-Contrast MRI Synthesis

Paper Authors

Onat Dalmaz, Usama Mirza, Gökberk Elmas, Muzaffer Özbey, Salman UH Dar, Emir Ceyani, Salman Avestimehr, Tolga Çukur

Abstract

Multi-institutional collaborations are key for learning generalizable MRI synthesis models that translate source- onto target-contrast images. To facilitate collaboration, federated learning (FL) adopts decentralized training and mitigates privacy concerns by avoiding sharing of imaging data. However, FL-trained synthesis models can be impaired by the inherent heterogeneity in the data distribution, with domain shifts evident when common or variable translation tasks are prescribed across sites. Here we introduce the first personalized FL method for MRI Synthesis (pFLSynth) to improve reliability against domain shifts. pFLSynth is based on an adversarial model that produces latents specific to individual sites and source-target contrasts, and leverages novel personalization blocks to adaptively tune the statistics and weighting of feature maps across the generator stages given latents. To further promote site specificity, partial model aggregation is employed over downstream layers of the generator while upstream layers are retained locally. As such, pFLSynth enables training of a unified synthesis model that can reliably generalize across multiple sites and translation tasks. Comprehensive experiments on multi-site datasets clearly demonstrate the enhanced performance of pFLSynth against prior federated methods in multi-contrast MRI synthesis.
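The two mechanisms described above, latent-conditioned tuning of feature-map statistics in the generator and partial aggregation that shares only downstream layers across sites, can be sketched as follows. This is an illustrative NumPy sketch under stated assumptions, not the authors' implementation: the AdaIN-style modulation form, the layer names (`enc.w`, `dec.w`), the learned projections `W_gamma`/`W_beta`, and uniform (unweighted) averaging are all assumptions for illustration.

```python
import numpy as np

def personalization_block(feat, latent, W_gamma, W_beta, eps=1e-5):
    """Modulate a feature map given a site/task-specific latent.

    Normalizes per-channel statistics of feat (shape C x H x W), then
    rescales and shifts each channel with gamma/beta projected from the
    latent vector (an AdaIN-style formulation, assumed here).
    """
    gamma = W_gamma @ latent                         # (C,) per-channel scale
    beta = W_beta @ latent                           # (C,) per-channel shift
    mu = feat.mean(axis=(1, 2), keepdims=True)       # per-channel mean
    sd = feat.std(axis=(1, 2), keepdims=True) + eps  # per-channel std
    return gamma[:, None, None] * (feat - mu) / sd + beta[:, None, None]

def partial_fedavg(site_models, shared_keys):
    """Aggregate only the shared (downstream) generator layers.

    Parameters under shared_keys are averaged across sites; all other
    (upstream) parameters are retained locally at each site.
    """
    avg = {k: np.mean([m[k] for m in site_models], axis=0) for k in shared_keys}
    return [{k: (avg[k] if k in shared_keys else v) for k, v in m.items()}
            for m in site_models]

# Two-site example with hypothetical layer names: the decoder weight is
# aggregated, the encoder weight stays site-specific.
site_a = {"enc.w": np.array([1.0]), "dec.w": np.array([2.0])}
site_b = {"enc.w": np.array([5.0]), "dec.w": np.array([4.0])}
new_a, new_b = partial_fedavg([site_a, site_b], {"dec.w"})
```

After one aggregation round, `new_a["dec.w"]` and `new_b["dec.w"]` both hold the cross-site average, while each site's `enc.w` is unchanged, which is what lets the model stay site-specific upstream while still benefiting from federation downstream.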
