Paper Title
Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP
Paper Authors
Paper Abstract
Transfer learning, particularly approaches that combine multi-task learning with pre-trained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.