Paper Title

Learning What Not to Segment: A New Perspective on Few-Shot Segmentation

Paper Authors

Chunbo Lang, Gong Cheng, Binfei Tu, Junwei Han

Paper Abstract


Recently, few-shot segmentation (FSS) has been extensively developed. Most previous works strive to achieve generalization through the meta-learning framework derived from classification tasks; however, the trained models are biased towards the seen classes instead of being ideally class-agnostic, thus hindering the recognition of new concepts. This paper proposes a fresh and straightforward insight to alleviate the problem. Specifically, we apply an additional branch (base learner) to the conventional FSS model (meta learner) to explicitly identify the targets of base classes, i.e., the regions that do not need to be segmented. Then, the coarse results output by these two learners in parallel are adaptively integrated to yield precise segmentation prediction. Considering the sensitivity of the meta learner, we further introduce an adjustment factor to estimate the scene differences between the input image pairs for facilitating the model ensemble forecasting. The substantial performance gains on PASCAL-5i and COCO-20i verify the effectiveness, and surprisingly, our versatile scheme sets a new state-of-the-art even with two plain learners. Moreover, in light of the unique nature of the proposed approach, we also extend it to a more realistic but challenging setting, i.e., generalized FSS, where the pixels of both base and novel classes are required to be determined. The source code is available at github.com/chunbolang/BAM.
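To make the two-branch idea in the abstract concrete, below is a minimal PyTorch sketch of how a meta learner (novel-class foreground) and a base learner (base-class regions, i.e., "what not to segment") could be fused with a scene-difference-based adjustment factor. All module and function names (TwoBranchFSS, scene_difference, adjust, etc.) and the specific fusion formula are illustrative assumptions for exposition, not the authors' released implementation; see github.com/chunbolang/BAM for the official code.

```python
# Minimal sketch (assumptions): a two-branch ensemble where the base learner's
# prediction suppresses base-class regions in the meta learner's foreground map,
# weighted by an adjustment factor estimated from the support/query scene difference.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoBranchFSS(nn.Module):
    """Hypothetical two-branch head: meta learner + base learner + adaptive ensemble."""

    def __init__(self, feat_dim: int = 256, num_base_classes: int = 15):
        super().__init__()
        # Meta learner head: binary prediction (background / novel-class foreground).
        self.meta_head = nn.Conv2d(feat_dim, 2, kernel_size=1)
        # Base learner head: base classes + background.
        self.base_head = nn.Conv2d(feat_dim, num_base_classes + 1, kernel_size=1)
        # Small MLP mapping the scene-difference estimate to an adjustment factor.
        self.adjust = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

    def scene_difference(self, supp_feat: torch.Tensor, query_feat: torch.Tensor) -> torch.Tensor:
        """Estimate how different the support and query scenes are (0 = identical)."""
        s = F.adaptive_avg_pool2d(supp_feat, 1).flatten(1)   # (B, C) global descriptor
        q = F.adaptive_avg_pool2d(query_feat, 1).flatten(1)  # (B, C)
        return 1.0 - F.cosine_similarity(s, q, dim=1, eps=1e-6).unsqueeze(1)  # (B, 1)

    def forward(self, supp_feat: torch.Tensor, query_feat: torch.Tensor) -> torch.Tensor:
        # Coarse predictions from the two learners run in parallel.
        meta_logits = self.meta_head(query_feat)             # (B, 2, H, W)
        base_logits = self.base_head(query_feat)             # (B, K+1, H, W)

        meta_fg = meta_logits.softmax(dim=1)[:, 1:2]         # novel-class foreground prob
        base_fg = 1.0 - base_logits.softmax(dim=1)[:, 0:1]   # prob of belonging to any base class

        # Adjustment factor from the estimated support/query scene difference.
        psi = torch.sigmoid(self.adjust(self.scene_difference(supp_feat, query_feat)))
        psi = psi.view(-1, 1, 1, 1)

        # Ensemble: down-weight pixels the base learner marks as base-class targets.
        fused_fg = meta_fg * (1.0 - psi * base_fg)
        return torch.cat([1.0 - fused_fg, fused_fg], dim=1)  # (B, 2, H, W)


if __name__ == "__main__":
    model = TwoBranchFSS()
    supp = torch.randn(2, 256, 60, 60)   # support features (e.g., from a frozen backbone)
    query = torch.randn(2, 256, 60, 60)  # query features
    print(model(supp, query).shape)      # torch.Size([2, 2, 60, 60])
```

The design choice this sketch illustrates is the one stated in the abstract: rather than forcing a single meta learner to be class-agnostic, a separate base learner identifies regions that should not be segmented, and an input-dependent factor controls how strongly that evidence corrects the meta learner's prediction.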
