Paper Title
Incremental Learning of Structured Memory via Closed-Loop Transcription
Paper Authors
Paper Abstract
This work proposes a minimal computational model for learning structured memories of multiple object classes in an incremental setting. Our approach is based on establishing a closed-loop transcription between the classes and a corresponding set of subspaces, known as a linear discriminative representation, in a low-dimensional feature space. Our method is simpler than existing approaches to incremental learning, and more efficient in terms of model size, storage, and computation: it requires only a single, fixed-capacity autoencoding network with a feature space that serves both discriminative and generative purposes. Network parameters are optimized simultaneously, without architectural manipulations, by solving a constrained minimax game between the encoding and decoding maps over a single rate-reduction-based objective. Experimental results show that our method can effectively alleviate catastrophic forgetting, achieving significantly better performance than prior generative-replay methods on MNIST, CIFAR-10, and ImageNet-50, despite requiring fewer resources. Source code can be found at https://github.com/tsb0601/i-CTRL.
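For readers unfamiliar with the rate-reduction objective the abstract refers to, the sketch below illustrates the standard coding-rate and rate-reduction quantities from the MCR² literature that closed-loop transcription builds on. This is an illustrative PyTorch sketch, not the authors' implementation (which is in the linked repository); the function names, the default ε, and the toy data are our own choices.

```python
import torch

def coding_rate(Z: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    """R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z Z^T): roughly, the number of
    bits needed to encode the columns of Z up to precision eps.
    Z has shape (d, n): d-dimensional features, n samples as columns."""
    d, n = Z.shape
    identity = torch.eye(d, dtype=Z.dtype)
    return 0.5 * torch.logdet(identity + (d / (n * eps ** 2)) * Z @ Z.T)

def rate_reduction(Z: torch.Tensor, labels: torch.Tensor,
                   num_classes: int, eps: float = 0.5) -> torch.Tensor:
    """Delta R = R(Z) - sum_c (n_c / n) * R(Z_c): expand the coding rate of
    the feature set as a whole while compressing each class toward its own
    low-dimensional subspace."""
    _, n = Z.shape
    expand = coding_rate(Z, eps)
    compress = Z.new_zeros(())
    for c in range(num_classes):
        Zc = Z[:, labels == c]   # features belonging to class c
        nc = Zc.shape[1]
        if nc > 0:
            compress = compress + (nc / n) * coding_rate(Zc, eps)
    return expand - compress

# Toy usage: random 16-dim features for 3 classes, 30 samples each.
Z = torch.randn(16, 90)
labels = torch.arange(3).repeat_interleave(30)
print(rate_reduction(Z, labels, num_classes=3).item())
```

In the closed-loop transcription game described in the abstract, a utility of this rate-reduction form is evaluated on both the encoded features Z = f(X) and their replayed counterparts Ẑ = f(g(Z)), with the encoder maximizing and the decoder minimizing; see the repository above for the authors' full objective and training loop.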