Paper Title

Incremental Prompting: Episodic Memory Prompt for Lifelong Event Detection

Authors

Minqian Liu, Shiyu Chang, Lifu Huang

Abstract

Lifelong event detection aims to incrementally update a model with new event types and data while retaining the capability to detect previously learned old types. One critical challenge is that the model catastrophically forgets old types when continually trained on new data. In this paper, we introduce Episodic Memory Prompts (EMP) to explicitly preserve the learned task-specific knowledge. Our method adopts continuous prompts for each task, which are optimized to instruct the model's predictions and to learn event-specific representations. The EMPs learned in previous tasks are carried along with the model in subsequent tasks and serve as a memory module that retains old knowledge and transfers it to new tasks. Experimental results demonstrate the effectiveness of our method. Furthermore, we conduct a comprehensive analysis of the new and old event types in lifelong learning.
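
To make the per-task prompt mechanism described in the abstract more concrete, here is a minimal sketch (not the authors' released implementation) of how episodic memory prompts could be accumulated and prepended to the input: each new task allocates fresh soft-prompt vectors, and prompts from all earlier tasks are kept and fed to the encoder together with the new ones. The class name `EpisodicMemoryPrompts`, the `prompts_per_task` size, and the initialization scale are illustrative assumptions.

```python
# Hypothetical sketch of the Episodic Memory Prompt idea (not the paper's code):
# one learnable prompt block per task, all blocks carried forward across tasks.
import torch
import torch.nn as nn


class EpisodicMemoryPrompts(nn.Module):
    def __init__(self, hidden_size: int, prompts_per_task: int = 4):
        super().__init__()
        self.hidden_size = hidden_size
        self.prompts_per_task = prompts_per_task
        # Accumulates one prompt block per task over the lifelong task sequence.
        self.task_prompts = nn.ParameterList()

    def add_task(self) -> None:
        # Allocate fresh prompt vectors when a new task (new event types) arrives.
        new_prompt = nn.Parameter(
            torch.randn(self.prompts_per_task, self.hidden_size) * 0.02
        )
        self.task_prompts.append(new_prompt)

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # token_embeddings: (batch, seq_len, hidden_size); call add_task() at least once first.
        batch_size = token_embeddings.size(0)
        # Concatenate prompts from all tasks seen so far; old blocks act as the memory.
        prompts = torch.cat(list(self.task_prompts), dim=0)        # (n_tasks * k, hidden)
        prompts = prompts.unsqueeze(0).expand(batch_size, -1, -1)  # broadcast over batch
        # Prepend prompts so the encoder attends to them alongside the input tokens.
        return torch.cat([prompts, token_embeddings], dim=1)
```

In use, one would call `add_task()` at the start of each new task and feed the extended embedding sequence into the event-detection encoder; whether old prompt blocks remain trainable and how their outputs are consumed by the classifier are design choices this sketch deliberately leaves open.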
