Paper Title

Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation

Authors

Yanning Zhou, Hao Chen, Huangjing Lin, Pheng-Ann Heng

Abstract

Deep learning methods show promising results for overlapping cervical cell instance segmentation. However, training a model with good generalization ability demands a large amount of pixel-level annotation, which is expensive and time-consuming to acquire. In this paper, we propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy through knowledge distillation. We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining (MMT-PSM), which consists of a teacher and a student network during training. The two networks are encouraged to be consistent at both the feature and semantic levels under small perturbations. The teacher's self-ensembled predictions from $K$-time augmented samples are used to construct reliable pseudo-labels for optimizing the student. We design a novel strategy to estimate the perturbation sensitivity of each proposal and select informative samples from the large pool of candidates, enabling fast and effective semantic distillation. In addition, to eliminate the unavoidable noise from the background region, we propose to use the predicted segmentation mask as guidance to enforce feature distillation only in the foreground region. Experiments show that the proposed method significantly improves performance compared with the supervised baseline learned from labeled data only, and outperforms state-of-the-art semi-supervised methods.
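To make the mechanics of the abstract more concrete, below is a minimal, hypothetical PyTorch sketch of the three ingredients it names: the mean-teacher EMA weight update, the $K$-view self-ensembled pseudo-labels, and the mask-guided foreground feature distillation loss. All function names, shapes, and hyperparameters here (e.g. `ema_update`, `mask_guided_feature_loss`, momentum 0.999) are illustrative assumptions, not the authors' released MMT-PSM implementation.

```python
# Hypothetical sketch of the distillation components described in the abstract.
import torch
import torch.nn.functional as F


@torch.no_grad()
def ema_update(teacher, student, momentum=0.999):
    """Mean-Teacher update: teacher weights track an exponential moving
    average of the student weights instead of being trained by gradients."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(momentum).add_(s_param, alpha=1.0 - momentum)


def self_ensembled_pseudo_labels(teacher_logits_k):
    """Average the teacher's class predictions over K augmented views of the
    same unlabeled proposals to build soft pseudo-labels.
    teacher_logits_k: tensor of shape (K, N, C)."""
    probs = F.softmax(teacher_logits_k, dim=-1)   # (K, N, C)
    return probs.mean(dim=0)                      # (N, C)


def mask_guided_feature_loss(student_feat, teacher_feat, fg_mask):
    """Feature distillation restricted to the predicted foreground region,
    so background noise does not dominate the loss.
    student_feat, teacher_feat: (N, C, H, W); fg_mask: (N, 1, H, W) in [0, 1]."""
    diff = (student_feat - teacher_feat) ** 2 * fg_mask
    return diff.sum() / fg_mask.sum().clamp(min=1.0)


if __name__ == "__main__":
    # Toy shapes just to show how the pieces fit together.
    logits_k = torch.randn(4, 8, 2)           # K=4 views, 8 proposals, 2 classes
    pseudo = self_ensembled_pseudo_labels(logits_k)

    s_feat = torch.randn(8, 16, 14, 14)
    t_feat = torch.randn(8, 16, 14, 14)
    mask = (torch.rand(8, 1, 14, 14) > 0.5).float()
    loss = mask_guided_feature_loss(s_feat, t_feat, mask)
    print(pseudo.shape, loss.item())
```

The sketch only illustrates how the consistency targets and masked feature loss would be computed; the paper's perturbation-sensitive sample mining (selecting which proposals enter the semantic distillation loss) is omitted here.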
