Paper Title

Attaining Class-level Forgetting in Pretrained Model using Few Samples

Paper Authors

Pravendra Singh, Pratik Mazumder, Mohammed Asad Karim

Paper Abstract

In order to address real-world problems, deep learning models are jointly trained on many classes. However, in the future, some classes may become restricted due to privacy/ethical concerns, and the knowledge of the restricted classes has to be removed from the models that have been trained on them. The available data may also be limited due to privacy/ethical concerns, and re-training the model will not be possible. We propose a novel approach to address this problem without affecting the model's prediction power for the remaining classes. Our approach identifies the model parameters that are highly relevant to the restricted classes and removes the knowledge regarding the restricted classes from them using the limited available training data. Our approach is significantly faster than, and performs similarly to, the model re-trained on the complete data of the remaining classes.
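The abstract does not spell out the procedure, so the sketch below is only one plausible instantiation of the idea in PyTorch: it scores each parameter's relevance to the restricted class by gradient magnitude on a few restricted-class samples, zeroes the most relevant weights, then briefly fine-tunes on the few available retained-class samples. The function name `forget_class`, the gradient-magnitude relevance score, and the `damage_fraction`/`repair_steps` parameters are all illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch of class-level forgetting with few samples.
# Relevance criterion (gradient magnitude) and repair step (short
# fine-tune) are illustrative stand-ins for the paper's procedure.
import torch
import torch.nn as nn
import torch.nn.functional as F

def forget_class(model, restricted_x, restricted_y, retained_x, retained_y,
                 damage_fraction=0.05, repair_steps=50, lr=1e-3):
    # 1) Relevance: gradient magnitude of the loss on restricted-class samples.
    model.zero_grad()
    F.cross_entropy(model(restricted_x), restricted_y).backward()
    scores = torch.cat([p.grad.detach().abs().flatten()
                        for p in model.parameters() if p.grad is not None])

    # 2) Damage: zero the weights most relevant to the restricted class.
    k = max(1, int(damage_fraction * scores.numel()))
    threshold = scores.topk(k).values.min()
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is not None:
                p.masked_fill_(p.grad.abs() >= threshold, 0.0)

    # 3) Repair: restore retained-class accuracy with the limited data,
    # avoiding a full re-training run.
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(repair_steps):
        opt.zero_grad()
        F.cross_entropy(model(retained_x), retained_y).backward()
        opt.step()
    return model

if __name__ == "__main__":
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))
    # A handful of restricted-class samples (label 0 here) and a few
    # retained-class samples, standing in for the limited available data.
    rx, ry = torch.randn(8, 20), torch.zeros(8, dtype=torch.long)
    kx, ky = torch.randn(32, 20), torch.randint(1, 5, (32,))
    forget_class(model, rx, ry, kx, ky)
```

Zeroing the top-relevance weights inevitably disturbs the remaining classes as well, which is why the short repair pass on the limited retained-class data matters: the claimed advantage is recovering near re-trained accuracy without access to the full dataset.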
