Paper Title

FREA-Unet: Frequency-aware U-net for Modality Transfer

Authors

Hajar Emami, Qiong Liu, Ming Dong

Abstract

While positron emission tomography (PET) imaging has been widely used in the diagnosis of a number of diseases, it has a costly acquisition process that involves radiation exposure to patients. In contrast, magnetic resonance imaging (MRI) is a safer imaging modality that does not expose patients to radiation. Therefore, a need exists for efficient and automated PET image generation from MRI data. In this paper, we propose a new frequency-aware attention U-net for generating synthetic PET images. Specifically, we incorporate an attention mechanism into the different U-net layers responsible for estimating the low-/high-frequency scales of the image. Our frequency-aware attention U-net computes attention scores for the feature maps in the low-/high-frequency layers and uses them to help the model focus on the most important regions, leading to more realistic output images. Experimental results on 30 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset demonstrate that the proposed model achieves superior qualitative and quantitative performance in PET image synthesis over the current state of the art.
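
The attention-score reweighting described in the abstract — computing a spatial attention map over a layer's feature maps and using it to emphasize important regions — can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the 1×1-projection weights, function names, and sigmoid gating are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatial_attention(features, w, b=0.0):
    """Reweight feature maps (C, H, W) by a spatial attention map.

    A 1x1-convolution-style projection collapses the channel axis
    to per-pixel scores; a sigmoid squashes them to (0, 1); the
    features are then multiplied by the resulting attention map.
    (Hypothetical gating, for illustration only.)
    """
    # Channel-wise projection: (C, H, W) -> (H, W)
    scores = np.tensordot(w, features, axes=([0], [0])) + b
    attn = sigmoid(scores)                  # attention scores in (0, 1)
    return features * attn[None, :, :], attn

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 4, 4))          # 8 feature channels, 4x4 grid
w = rng.normal(size=(8,))                   # assumed 1x1-conv weights
reweighted, attn = spatial_attention(feats, w)
```

In the paper's architecture this kind of gating is applied per frequency scale (separate low- and high-frequency U-net layers); here a single layer is shown for brevity.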
