Paper Title

Bilevel Continual Learning

Authors

Ammar Shaker, Francesco Alesiani, Shujian Yu, Wenzhe Yin

Abstract

Continual learning (CL) studies the problem of learning a sequence of tasks, one at a time, such that learning each new task does not lead to a deterioration in performance on previously seen tasks, while exploiting previously learned features. This paper presents Bilevel Continual Learning (BiCL), a general framework for continual learning that fuses bilevel optimization and recent advances in meta-learning for deep neural networks. BiCL is able to train both deep discriminative and generative models under the conservative setting of online continual learning. Experimental results show that BiCL provides competitive accuracy on the current task while reducing the effect of catastrophic forgetting.

This is concurrent work with [1]. We submitted it to AAAI 2020 and IJCAI 2020, and we now place it on arXiv for the record. Unlike [1], we also consider continual generative models. The authors are also aware of a recent proposal on bilevel-optimization-based coreset construction for continual learning [2].

[1] Q. Pham, D. Sahoo, C. Liu, and S. C. Hoi. Bilevel continual learning. arXiv preprint arXiv:2007.15553, 2020.
[2] Z. Borsos, M. Mutny, and A. Krause. Coresets via bilevel optimization for continual learning and streaming. arXiv preprint arXiv:2006.03875, 2020.
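As background for the abstract's use of "bilevel optimization", the sketch below shows a generic bilevel formulation as it is commonly posed in continual-learning settings. It is an illustrative outline under assumed notation (losses L_t, model parameters theta, meta-parameters phi), not the paper's exact objective: the inner level fits the model on the current task, while the outer level chooses meta-parameters (for example, a small memory of past examples or regularization weights) so that the resulting model still performs well on all tasks seen so far.

\[
\min_{\phi} \; \sum_{t' \le t} \mathcal{L}_{t'}\!\big(\theta^{*}(\phi)\big)
\quad \text{s.t.} \quad
\theta^{*}(\phi) \in \arg\min_{\theta} \; \mathcal{L}_{t}(\theta; \phi),
\]

where \(\mathcal{L}_{t'}\) is the loss on task \(t'\), the outer objective evaluates the inner solution \(\theta^{*}(\phi)\) on previously seen tasks, and the inner problem is solved on the current task \(t\) given \(\phi\). Gradient-based meta-learning methods typically approximate the outer gradient by differentiating through a small number of inner optimization steps.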
