Paper Title

Chinese Grammatical Correction Using BERT-based Pre-trained Model

Authors

Wang, Hongfei, Kurosawa, Michiki, Katsumata, Satoru, Komachi, Mamoru

Abstract

In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their utilization. In this study, we verify the effectiveness of two methods that incorporate a BERT-based pre-trained model developed by Cui et al. (2020) into an encoder-decoder model on Chinese grammatical error correction tasks. We also analyze the error types and conclude that sentence-level errors are yet to be addressed.
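The abstract mentions two ways of incorporating a BERT-based pre-trained model into an encoder-decoder model but does not spell them out here. The toy PyTorch sketch below illustrates the two strategies commonly used for this in the literature: initializing the encoder with pre-trained weights ("BERT-init") and fusing frozen BERT representations into the encoder output via attention ("BERT-fused"). All module names, sizes, and the stand-in `ToyBert` class are illustrative assumptions, not the authors' implementation or the actual Cui et al. (2020) model.

```python
import torch
import torch.nn as nn

# Toy sketch, not the paper's implementation. Dimensions are tiny on purpose.
D = 16       # hidden size (illustrative)
VOCAB = 100  # vocabulary size (illustrative)

class ToyBert(nn.Module):
    """Stand-in for a pre-trained BERT-style encoder (e.g. Cui et al. 2020)."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, D)
        self.layer = nn.TransformerEncoderLayer(D, 2, 32, batch_first=True)

    def forward(self, ids):
        return self.layer(self.emb(ids))

class Seq2Seq(nn.Module):
    def __init__(self, bert: ToyBert, bert_init: bool = True):
        super().__init__()
        self.encoder = ToyBert()
        if bert_init:
            # Method 1 (BERT-init): copy pre-trained weights into the encoder,
            # then fine-tune the whole model on the GEC data.
            self.encoder.load_state_dict(bert.state_dict())
        # Method 2 (BERT-fused): keep the pre-trained model frozen and attend
        # over its representations, mixing them into the encoder states.
        self.bert = bert
        self.fuse = nn.MultiheadAttention(D, 2, batch_first=True)
        self.tgt_emb = nn.Embedding(VOCAB, D)
        self.decoder = nn.TransformerDecoderLayer(D, 2, 32, batch_first=True)
        self.proj = nn.Linear(D, VOCAB)

    def forward(self, src, tgt):
        h = self.encoder(src)                    # trainable encoder states
        with torch.no_grad():
            b = self.bert(src)                   # frozen BERT features
        fused, _ = self.fuse(h, b, b)            # attend over BERT features
        h = h + fused                            # residual fusion
        dec = self.decoder(self.tgt_emb(tgt), h)
        return self.proj(dec)                    # logits over the vocabulary

bert = ToyBert()
model = Seq2Seq(bert, bert_init=True)
src = torch.randint(0, VOCAB, (2, 7))   # source (erroneous) sentence ids
tgt = torch.randint(0, VOCAB, (2, 5))   # target (corrected) sentence ids
logits = model(src, tgt)                # shape: (batch, tgt_len, vocab)
```

In practice the fused variant combines both signals: the trainable encoder adapts to the correction task while the frozen pre-trained representations supply general language knowledge.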
