Paper Title


A Comparative Study of U-Net Topologies for Background Removal in Histopathology Images

Authors

Riasatian, Abtin, Rasoolijaberi, Maral, Babaei, Morteza, Tizhoosh, H. R.

Abstract

During the last decade, the digitization of pathology has gained considerable momentum. Digital pathology offers many advantages, including more efficient workflows, easier collaboration, and a powerful venue for telepathology. At the same time, applying Computer-Aided Diagnosis (CAD) to Whole Slide Images (WSIs) has received substantial attention as a direct result of this digitization. The first step in any image analysis is to extract the tissue. Hence, background removal is an essential prerequisite for efficient and accurate results in many algorithms. Although the distinction is obvious to human operators, identifying tissue regions in WSIs can be challenging for computers, mainly due to color variations and the presence of artifacts. Moreover, some cases, such as alveolar tissue types, fatty tissues, and poorly stained tissues, are difficult to detect. In this paper, we perform experiments on the U-Net architecture with different network backbones (different topologies) to remove the background as well as artifacts from WSIs in order to extract the tissue regions. We compare a wide range of backbone networks, including MobileNet, VGG16, EfficientNet-B3, ResNet50, ResNeXt101, and DenseNet121. We trained and evaluated the networks on a manually labeled subset of The Cancer Genome Atlas (TCGA) dataset. EfficientNet-B3 and MobileNet achieved the best results, with almost 99% sensitivity and specificity.
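The evaluation metrics reported above, sensitivity and specificity, are standard for binary segmentation masks (tissue vs. background). The following NumPy snippet is a minimal illustrative sketch of how they are computed per pixel; it is not the authors' implementation, and the function name and toy masks are assumptions for demonstration only.

```python
import numpy as np

def sensitivity_specificity(pred, truth):
    """Pixel-wise sensitivity and specificity for binary tissue masks.

    pred, truth: boolean arrays where True marks tissue pixels.
    Sensitivity = TP / (TP + FN)  -- fraction of tissue pixels found.
    Specificity = TN / (TN + FP)  -- fraction of background pixels kept clean.
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)      # tissue correctly detected
    tn = np.sum(~pred & ~truth)    # background correctly rejected
    fn = np.sum(~pred & truth)     # tissue missed
    fp = np.sum(pred & ~truth)     # background wrongly kept
    return tp / (tp + fn), tn / (tn + fp)

# Toy 2x2 masks: one tissue pixel is missed, background is perfect.
pred = np.array([[1, 0], [0, 0]], dtype=bool)
truth = np.array([[1, 1], [0, 0]], dtype=bool)
sens, spec = sensitivity_specificity(pred, truth)
# sens = 0.5 (1 of 2 tissue pixels found), spec = 1.0
```

A model scoring near 99% on both metrics, as reported for EfficientNet-B3 and MobileNet, thus both recovers almost all tissue pixels and rarely mislabels background as tissue.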
