Paper Title
SML: Enhance the Network Smoothness with Skip Meta Logit for CTR Prediction
Paper Authors
Paper Abstract
In light of the smoothness property brought by skip connections in ResNet, this paper proposes the Skip Logit, which introduces a skip connection mechanism that fits arbitrary DNN dimensions and embraces properties similar to those of ResNet. Meta Tanh Normalization (MTN) is designed to learn variance information and stabilize the training process. With these delicate designs, our Skip Meta Logit (SML) brings incremental boosts to the performance of extensive SOTA CTR prediction models on two real-world datasets. In the meantime, we prove that the optimization landscape of arbitrarily deep skip logit networks has no spurious local optima. Finally, SML can be easily added to building blocks, and it has delivered offline accuracy and online business metrics gains on app-ads learning-to-rank systems at TikTok.
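The abstract describes two components: a skip connection added at the logit level of a CTR model, and a tanh-based normalization (MTN) on that skip path. A minimal NumPy sketch of this idea is given below. The function names, the single-hidden-layer tower, and the exact form of `meta_tanh_norm` (a learnable scale and shift around `tanh`) are all assumptions for illustration; the paper's concrete formulation may differ.

```python
import numpy as np

def meta_tanh_norm(x, alpha=1.0, beta=0.0):
    """Assumed MTN form: tanh squashing with a learnable scale (alpha)
    and shift (beta), bounding the skip logit's magnitude to help
    stabilize training. Hypothetical; not the paper's exact definition."""
    return alpha * np.tanh(x) + beta

def skip_logit_forward(x, w_hidden, b_hidden, w_out, b_out, w_skip, b_skip):
    """Sketch of a Skip Logit forward pass: the final prediction logit is
    the deep tower's logit plus a normalized "skip" logit computed
    directly from the raw input features, so a short gradient path to the
    input exists regardless of the tower's depth."""
    h = np.maximum(0.0, x @ w_hidden + b_hidden)       # ReLU hidden layer
    deep_logit = h @ w_out + b_out                     # deep tower logit
    skip_logit = meta_tanh_norm(x @ w_skip + b_skip)   # MTN-normalized skip logit
    return deep_logit + skip_logit                     # summed at the logit level
```

Because the skip path bypasses the nonlinear tower entirely, its contribution to the final logit does not shrink as the tower deepens, which is one way to read the abstract's smoothness and no-spurious-local-optima claims.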