Paper Title

P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection

Paper Authors

Zhiwei Zhang, Shifeng Chen, Lei Sun

Paper Abstract

One-class novelty detection aims to identify anomalous instances that do not conform to the expected normal instances. In this paper, Generative Adversarial Networks (GANs) based on an encoder-decoder-encoder pipeline are used for detection and achieve state-of-the-art performance. However, deep neural networks are too over-parameterized to be deployed on resource-limited devices. Therefore, Progressive Knowledge Distillation with GANs (P-KDGAN) is proposed to learn compact and fast novelty detection networks. P-KDGAN is a novel attempt to connect two standard GANs via a designed distillation loss that transfers knowledge from the teacher to the student. The progressive learning of knowledge distillation is a two-step approach that continuously improves the performance of the student GAN and achieves better results than single-step methods. In the first step, the student GAN learns basic knowledge entirely from the teacher, guided by the pretrained teacher GAN with fixed weights. In the second step, joint fine-tuning is adopted for the knowledgeable teacher and student GANs to further improve performance and stability. Experimental results on CIFAR-10, MNIST, and FMNIST show that our method improves the performance of the student GAN by 2.44%, 1.77%, and 1.73% when compressing the computation at ratios of 24.45:1, 311.11:1, and 700:1, respectively.
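The two-step procedure the abstract describes can be sketched in code. Below is a minimal PyTorch sketch, assuming toy encoder-decoder-encoder generators and a simple MSE-based distillation loss over their three outputs; the actual architectures, loss terms, and weightings in the paper differ. In the first step the teacher's outputs are detached so only the student is updated; in the second step both networks are optimized jointly.

```python
# Minimal sketch of two-step progressive distillation between two GAN
# generators. Architectures, loss form, and hyperparameters are illustrative
# assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EDEGenerator(nn.Module):
    """Toy encoder-decoder-encoder generator (GANomaly-style pipeline)."""
    def __init__(self, hidden, latent=16):
        super().__init__()
        self.enc1 = nn.Sequential(
            nn.Conv2d(1, hidden, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(hidden, latent, 4, 2, 1), nn.ReLU())
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(latent, hidden, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(hidden, 1, 4, 2, 1), nn.Tanh())
        self.enc2 = nn.Sequential(
            nn.Conv2d(1, hidden, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(hidden, latent, 4, 2, 1), nn.ReLU())

    def forward(self, x):
        z = self.enc1(x)           # latent code of the input
        x_hat = self.dec(z)        # reconstruction of the input
        z_hat = self.enc2(x_hat)   # latent code of the reconstruction
        return z, x_hat, z_hat     # at test time, ||z - z_hat|| scores novelty

def distillation_loss(student_out, teacher_out, stop_teacher_grad=True):
    """MSE between corresponding teacher/student outputs (an assumed form
    of the designed distillation loss connecting the two GANs)."""
    total = 0.0
    for s, t in zip(student_out, teacher_out):
        if stop_teacher_grad:
            t = t.detach()         # step 1: no gradient flows to the teacher
        total = total + F.mse_loss(s, t)
    return total

teacher = EDEGenerator(hidden=64)  # stands in for the large pretrained teacher
student = EDEGenerator(hidden=8)   # compact student to be deployed
x = torch.randn(16, 1, 32, 32)     # stand-in batch of normal training images

# Step 1: the teacher's weights stay fixed; the student learns basic
# knowledge from the teacher's outputs alone.
opt = torch.optim.Adam(student.parameters(), lr=2e-4)
loss = distillation_loss(student(x), teacher(x), stop_teacher_grad=True)
opt.zero_grad(); loss.backward(); opt.step()

# Step 2: joint fine-tuning of both networks (each GAN's adversarial and
# reconstruction losses would be added to this objective in full training).
joint_opt = torch.optim.Adam(
    list(teacher.parameters()) + list(student.parameters()), lr=2e-5)
loss = distillation_loss(student(x), teacher(x), stop_teacher_grad=False)
joint_opt.zero_grad(); loss.backward(); joint_opt.step()
```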
