Paper Title

ClarET: Pre-training a Correlation-Aware Context-To-Event Transformer for Event-Centric Generation and Classification

Paper Authors

Yucheng Zhou, Tao Shen, Xiubo Geng, Guodong Long, Daxin Jiang

Paper Abstract

Generating new events given context with correlated ones plays a crucial role in many event-centric reasoning tasks. Existing works either limit their scope to specific scenarios or overlook event-level correlations. In this paper, we propose to pre-train a general Correlation-aware context-to-Event Transformer (ClarET) for event-centric reasoning. To achieve this, we propose three novel event-centric objectives, i.e., whole event recovering, contrastive event-correlation encoding and prompt-based event locating, which highlight event-level correlations with effective training. The proposed ClarET is applicable to a wide range of event-centric reasoning scenarios, considering its versatility of (i) event-correlation types (e.g., causal, temporal, contrast), (ii) application formulations (i.e., generation and classification), and (iii) reasoning types (e.g., abductive, counterfactual and ending reasoning). Empirical fine-tuning results, as well as zero- and few-shot learning, on 9 benchmarks (5 generation and 4 classification tasks covering 4 reasoning types with diverse event correlations), verify its effectiveness and generalization ability.
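
As a concrete illustration of the first objective mentioned in the abstract, below is a minimal sketch of what a "whole event recovering" style pre-training step could look like, assuming a BART-style encoder-decoder via the `transformers` library. The example sentences, the single `<mask>` placeholder for the event span, and the training details are illustrative assumptions based only on the abstract, not the authors' implementation.

```python
# Minimal sketch of a "whole event recovering" style objective:
# replace an entire event span in the context with a mask and train
# a seq2seq model to generate the missing event. The event text,
# the choice of BART, and the single-step training loop below are
# illustrative assumptions, not the paper's actual code.
import torch
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

context = "He forgot his umbrella. <mask> He arrived at work soaking wet."
target_event = "It started to rain heavily on his way."  # the masked-out event

# Encoder sees the context with the whole event replaced by <mask>;
# the decoder is trained to recover the event text itself.
inputs = tokenizer(context, return_tensors="pt")
labels = tokenizer(target_event, return_tensors="pt").input_ids

outputs = model(input_ids=inputs.input_ids,
                attention_mask=inputs.attention_mask,
                labels=labels)
loss = outputs.loss   # cross-entropy over the recovered event tokens
loss.backward()       # one illustrative training step
```

In this framing, masking a whole event (rather than random sub-word spans) forces the model to rely on correlations with the surrounding events, which is the event-level signal the abstract emphasizes.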
