Paper Title
Enhancing Cross-lingual Prompting with Dual Prompt Augmentation
Paper Authors
Paper Abstract
Prompting shows promising results in few-shot scenarios. However, its strength for multilingual/cross-lingual problems has not been fully exploited. Zhao and Schütze (2021) made initial explorations in this direction by showing that cross-lingual prompting outperforms cross-lingual finetuning. In this paper, we conduct an empirical exploration of the effect of each component in cross-lingual prompting and derive language-agnostic Universal Prompting, which helps alleviate the discrepancies between source-language training and target-language inference. Based on this, we propose DPA, a dual prompt augmentation framework, to relieve the data scarcity issue in few-shot cross-lingual prompting. Notably, on XNLI, our method achieves 46.54% with only 16 English training examples per class, significantly better than the 34.99% achieved by finetuning. Our code is available at https://github.com/DAMO-NLP-SG/DPA.
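For illustration, the sketch below shows cloze-style cross-lingual prompting for XNLI with a multilingual masked language model, in the spirit of the language-agnostic prompting the abstract describes. It is not the authors' DPA implementation: the checkpoint (xlm-roberta-base), the template layout, and the verbalizer words ("yes"/"maybe"/"no") are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of cloze-style prompting for NLI with a multilingual masked LM.
# Assumptions: xlm-roberta-base as the backbone, a bare "premise <mask> hypothesis"
# template, and an English verbalizer; the paper's actual templates may differ.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_NAME = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

# Hypothetical verbalizer: map each NLI label to a single answer token.
VERBALIZER = {"entailment": "yes", "neutral": "maybe", "contradiction": "no"}
label_token_ids = {
    label: tokenizer.convert_tokens_to_ids(tokenizer.tokenize(" " + word)[0])
    for label, word in VERBALIZER.items()
}

def score_labels(premise: str, hypothesis: str) -> dict:
    # Language-agnostic template: the premise and hypothesis surround a mask token,
    # with no source-language pattern words, so the same prompt can be reused at
    # target-language inference time.
    text = f"{premise} {tokenizer.mask_token} {hypothesis}"
    inputs = tokenizer(text, return_tensors="pt")
    mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos[0]]
    # Compare the masked-LM scores of the verbalizer tokens to pick a label.
    return {label: logits[tid].item() for label, tid in label_token_ids.items()}

print(score_labels("A man is playing a guitar.", "A person is making music."))
```

In a few-shot setup, the same cloze objective would be finetuned on the handful of English examples before being applied zero-shot to the other XNLI languages; the snippet above only shows the inference-time scoring step.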