Paper Title

Model and Data Agreement for Learning with Noisy Labels

Paper Authors

Yuhang Zhang, Weihong Deng, Xingchen Cui, Yunfeng Yin, Hongzhi Shi, Dongchao Wen

Paper Abstract

Learning with noisy labels is a vital topic for practical deep learning, as models should be robust to noisy open-world datasets in the wild. The state-of-the-art noisy label learning approach JoCoR fails when faced with a large ratio of noisy labels. Moreover, selecting small-loss samples can also cause error accumulation: once noisy samples are mistakenly selected as small-loss samples, they are more likely to be selected again. In this paper, we try to deal with error accumulation in noisy label learning from both the model and data perspectives. We introduce mean point ensemble to utilize a more robust loss function and more information from unselected samples, reducing error accumulation from the model perspective. Furthermore, as flipped images have the same semantic meaning as the original images, we select small-loss samples according to the loss values of the flipped images instead of the original ones, reducing error accumulation from the data perspective. Extensive experiments on CIFAR-10, CIFAR-100, and large-scale Clothing1M show that our method outperforms state-of-the-art noisy label learning methods under different levels of label noise. Our method can also be seamlessly combined with other noisy label learning methods to further improve their performance, and it generalizes well to other tasks. The code is available at https://github.com/zyh-uaiaaaa/MDA-noisy-label-learning.
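
The data-perspective idea described in the abstract (ranking samples by the loss of their flipped views rather than the original images) can be illustrated roughly as follows. This is a minimal PyTorch sketch under assumed conventions (NCHW image batches, a hypothetical keep_ratio parameter), not the authors' released implementation; see the repository above for the actual method.

```python
# Sketch: select small-loss samples using the losses of horizontally
# flipped images, so that memorized noisy originals are less likely
# to be re-selected. All names here are illustrative assumptions.
import torch
import torch.nn.functional as F

def select_small_loss_by_flip(model, images, labels, keep_ratio=0.7):
    """Return indices of samples whose flipped views have the smallest loss."""
    model.eval()
    with torch.no_grad():
        # Horizontal flip along the width dimension of an (N, C, H, W) batch.
        flipped = torch.flip(images, dims=[3])
        logits = model(flipped)
        # Per-sample cross-entropy losses on the flipped views.
        per_sample_loss = F.cross_entropy(logits, labels, reduction="none")
    num_keep = int(keep_ratio * images.size(0))
    # Samples with the smallest flipped-view loss are treated as likely clean.
    return torch.argsort(per_sample_loss)[:num_keep]
```

In a typical small-loss training loop, the returned indices would then be used to compute the training loss only on the selected (likely clean) subset of the batch.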
