Title

Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation

Authors

Wenjie Hao, Hongfei Xu, Lingling Mu, Hongying Zan

Abstract

In this paper, we study the use of the deep Transformer translation model for the CCMT 2022 Chinese-Thai low-resource machine translation task. We first explore the experiment settings (including the number of BPE merge operations, the dropout probability, the embedding size, etc.) for the low-resource scenario with a 6-layer Transformer. Considering that increasing the number of layers also increases the regularization on the new model parameters (dropout modules are introduced along with the added layers), we adopt the highest-performing setting but increase the depth of the Transformer to 24 layers to obtain improved translation quality. Our work achieves SOTA performance on Chinese-to-Thai translation in the constrained evaluation.
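To make the "number of BPE merge operations" hyperparameter concrete, here is a minimal toy sketch of BPE merge learning. This is illustrative only and not the pipeline used in the paper; real low-resource MT setups typically learn merges with subword-nmt or SentencePiece, and the corpus, symbol conventions, and merge count below are assumptions for the example.

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Toy BPE: learn up to `num_merges` merge operations from a
    word -> frequency dict. Fewer merges => smaller subword units,
    which is why the merge count matters in low-resource settings."""
    # Represent each word as a tuple of characters plus an end-of-word marker.
    vocab = {tuple(w) + ('</w>',): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge: replace every occurrence of the best pair.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

# Example: with a tiny corpus, frequent character pairs merge first.
merges = learn_bpe({"low": 5, "lower": 2}, num_merges=3)
print(merges)
```

In a real system the merge count (e.g. a few thousand to 32k operations) is tuned on held-out data, exactly the kind of setting the paper sweeps for the low-resource Chinese-Thai pair.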
