Paper Title
Know Where To Drop Your Weights: Towards Faster Uncertainty Estimation
Paper Authors
Paper Abstract
Estimating the epistemic uncertainty of models and detecting Out-Of-Distribution (OOD) samples in low-latency applications is challenging due to the computationally demanding nature of uncertainty estimation techniques. Estimating model uncertainty with approximation techniques such as Monte Carlo Dropout (MCD) and Monte Carlo DropConnect (MCDC) requires a large number of forward passes through the network, rendering them unsuitable for low-latency applications. We propose Select-DC, which uses only a subset of the layers in a neural network to model epistemic uncertainty with MCDC. Through our experiments, we show a significant reduction in the GFLOPS required to model uncertainty compared to Monte Carlo DropConnect, with a marginal trade-off in performance. We perform a suite of experiments on the CIFAR 10, CIFAR 100, and SVHN datasets with ResNet and VGG models. We further show how applying DropConnect to various layers in the network, with different drop probabilities, affects the network's performance and the entropy of the predictive distribution.
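To make the core idea concrete, below is a minimal NumPy sketch of Monte Carlo DropConnect restricted to a single upper layer, in the spirit of Select-DC as described in the abstract. The layer sizes, random weights, drop probability, and the choice of which layer is stochastic are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropconnect_linear(x, W, b, p, rng):
    # DropConnect: zero each weight independently with probability p,
    # rescaling the surviving weights by 1/(1-p) to keep expectations.
    mask = rng.random(W.shape) >= p
    return x @ (W * mask) / (1.0 - p) + b

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy two-layer network; the weights are random stand-ins, not trained.
W1, b1 = rng.standard_normal((8, 16)), np.zeros(16)
W2, b2 = rng.standard_normal((16, 4)), np.zeros(4)
x = rng.standard_normal((1, 8))

# Select-DC idea: run the deterministic lower layers ONCE, then repeat
# only the stochastic upper layer for the Monte Carlo passes, so most
# of the per-pass FLOPs are paid a single time instead of T times.
h = np.maximum(x @ W1 + b1, 0.0)   # deterministic features, computed once
T = 50                             # number of Monte Carlo passes
probs = np.stack([softmax(dropconnect_linear(h, W2, b2, 0.5, rng))
                  for _ in range(T)])

mean_p = probs.mean(axis=0)                          # predictive distribution
entropy = -(mean_p * np.log(mean_p + 1e-12)).sum()   # predictive entropy
```

The predictive entropy of the averaged distribution is the quantity the abstract refers to when discussing how drop probabilities affect the entropy of the predictive distribution; higher entropy on OOD inputs is what makes this usable for detection.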