Paper Title

MarginDistillation: distillation for margin-based softmax

Authors

David Svitov, Sergey Alyamkin

Abstract

The usage of convolutional neural networks (CNNs) in conjunction with a margin-based softmax approach demonstrates state-of-the-art performance on the face recognition problem. Recently, lightweight neural network models trained with the margin-based softmax have been introduced for the face identification task on edge devices. In this paper, we propose a novel distillation method for lightweight neural network architectures that outperforms other known methods for the face recognition task on the LFW, AgeDB-30, and Megaface datasets. The idea of the proposed method is to reuse the class centers from the teacher network for the student network. The student network is then trained to reproduce the same angles between the class centers and the face embeddings as those predicted by the teacher network.
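The objective described in the abstract can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: the function and argument names are assumptions, and it further assumes the student and teacher produce embeddings of the same dimension so the teacher's (frozen) class centers can be shared.

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Project vectors onto the unit hypersphere, as margin-based
    softmax methods do before computing angles."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def margin_distillation_loss(student_emb, teacher_emb, teacher_centers):
    """Illustrative distillation loss: penalize differences between the
    cosines of the angles that the student's and the teacher's embeddings
    form with the teacher's class centers.

    student_emb     : (batch, dim) embeddings from the lightweight student
    teacher_emb     : (batch, dim) embeddings from the teacher (frozen)
    teacher_centers : (classes, dim) class-center vectors taken from the
                      teacher's margin-based softmax layer (frozen)
    """
    s = l2_normalize(student_emb)
    t = l2_normalize(teacher_emb)
    c = l2_normalize(teacher_centers)
    cos_student = s @ c.T  # (batch, classes) student-to-center cosines
    cos_teacher = t @ c.T  # (batch, classes) teacher-to-center cosines
    return np.mean((cos_student - cos_teacher) ** 2)
```

When the student reproduces the teacher's embeddings exactly, every angle to every class center matches and the loss is zero; any angular deviation relative to the shared centers is penalized.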
