Paper Title

Support-BERT: Predicting Quality of Question-Answer Pairs in MSDN using Deep Bidirectional Transformer

Paper Authors

Bhaskar Sen, Nikhil Gopal, Xinwei Xue

Paper Abstract

Quality of questions and answers from community support websites (e.g. Microsoft Developer Network, Stackoverflow, Github, etc.) is difficult to define, and a prediction model for question and answer quality is even more challenging to implement. Previous work has addressed question quality models and answer quality models separately, using meta-features such as the number of up-votes, the trustworthiness of the person posting the question or answer, the title of the post, and context-naive natural language processing features. However, the literature lacks an integrated question-answer quality model for community question answering websites. In this brief paper, we tackle the quality Q&A modeling problem for community support websites using a recently developed deep learning model based on bidirectional transformers. We investigate the applicability of transfer learning to Q&A quality modeling using Bidirectional Encoder Representations from Transformers (BERT), originally trained on separate tasks using Wikipedia. We find that further pre-training of the BERT model, along with fine-tuning on the Q&As extracted from the Microsoft Developer Network (MSDN), can boost the performance of automated quality prediction to more than 80%. Furthermore, implementations are carried out to deploy the fine-tuned model in a real-time scenario using AzureML in an Azure knowledge base system.
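For concreteness, below is a minimal sketch (not the authors' implementation) of the core technique the abstract describes: encoding a question-answer pair jointly with BERT and fine-tuning a binary quality classifier on top. It uses the Hugging Face `transformers` library; the example Q&A text, the label convention (1 = acceptable quality), and the learning rate are illustrative assumptions.

```python
# Sketch: fine-tuning BERT for Q&A pair quality classification.
# The sample text, labels, and hyperparameters are assumptions for illustration.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # 2 classes: low vs. acceptable quality
)

question = "How do I enable CORS in an ASP.NET Web API project?"
answer = "Install the Microsoft.AspNet.WebApi.Cors package and call config.EnableCors()."

# BERT encodes the pair as [CLS] question [SEP] answer [SEP];
# the [CLS] representation feeds the classification head.
inputs = tokenizer(question, answer, truncation=True, return_tensors="pt")
labels = torch.tensor([1])  # hypothetical label: this pair is acceptable quality

model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**inputs, labels=labels)  # cross-entropy loss over the 2 classes
outputs.loss.backward()
optimizer.step()

# At inference time, the softmax over the logits gives a quality score.
model.eval()
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
```

In practice, the abstract's recipe adds a domain-adaptation step before this: continuing BERT's masked-language-model pre-training on in-domain MSDN text, then fine-tuning the pair classifier as above.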
