Paper Title

How Relevant is Selective Memory Population in Lifelong Language Learning?

Authors

Vladimir Araujo, Helena Balabin, Julio Hurtado, Alvaro Soto, Marie-Francine Moens

Abstract

Lifelong language learning seeks to have models continuously learn multiple tasks in a sequential order without suffering from catastrophic forgetting. State-of-the-art approaches rely on sparse experience replay as the primary approach to prevent forgetting. Experience replay usually adopts sampling methods for the memory population; however, the effect of the chosen sampling strategy on model performance has not yet been studied. In this paper, we investigate how relevant the selective memory population is in the lifelong learning process of text classification and question-answering tasks. We found that methods that randomly store a uniform number of samples from the entire data stream lead to high performances, especially for low memory size, which is consistent with computer vision studies.
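
To make the memory-population idea concrete, below is a minimal sketch (not the authors' code) of a replay buffer filled by reservoir sampling, a standard way to randomly store a uniform subset of samples from the entire data stream under a fixed memory budget; the class and function names are illustrative assumptions.

```python
import random


class ReservoirMemory:
    """Fixed-size replay buffer populated by reservoir sampling.

    After seeing n examples, each one is stored with probability
    capacity / n, so the buffer holds a uniform random subset of
    the whole data stream regardless of task order.
    """

    def __init__(self, capacity: int, seed: int = 0):
        self.capacity = capacity
        self.buffer = []
        self.n_seen = 0
        self.rng = random.Random(seed)

    def add(self, example) -> None:
        """Offer one streamed example to the memory."""
        self.n_seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a stored example with probability capacity / n_seen.
            j = self.rng.randint(0, self.n_seen - 1)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k: int):
        """Draw k stored examples, e.g. for a sparse replay step."""
        k = min(k, len(self.buffer))
        return self.rng.sample(self.buffer, k)


if __name__ == "__main__":
    memory = ReservoirMemory(capacity=100)
    # Simulated stream of (task_id, example_index) pairs from 4 sequential tasks.
    stream = ((task_id, i) for task_id in range(4) for i in range(5000))
    for example in stream:
        memory.add(example)
    print(len(memory.buffer), memory.sample(5))
```

Because replacement probability decays with the number of examples seen, the buffer stays approximately balanced across tasks without needing task boundaries, which is the kind of uniform random population strategy the abstract reports as performing well at low memory sizes.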
