Paper Title

Bayesian Stochastic Gradient Descent for Stochastic Optimization with Streaming Input Data

Paper Authors

Tianyi Liu, Yifan Lin, Enlu Zhou

Paper Abstract

We consider stochastic optimization under distributional uncertainty, where the unknown distributional parameter is estimated from streaming data that arrive sequentially over time. Moreover, the data may depend on the decision at the time when they are generated. For both decision-independent and decision-dependent uncertainties, we propose an approach to jointly estimate the distributional parameter via the Bayesian posterior distribution and update the decision by applying stochastic gradient descent to the Bayesian average of the objective function. Our approach converges asymptotically over time and achieves the convergence rate of classical SGD in the decision-independent case. We demonstrate the empirical performance of our approach on both synthetic test problems and a classical newsvendor problem.
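To make the setup concrete, here is a minimal illustrative sketch, not the authors' exact algorithm or code, of the decision-independent case described in the abstract. It assumes a conjugate normal model for the unknown distributional parameter and a hypothetical quadratic objective F(x, θ) = E_{ξ~N(θ,1)}[(x − ξ)²]; all variable names and the objective itself are assumptions for illustration. At each time step, one streaming data point updates the Bayesian posterior, and the decision is updated by an SGD step on the posterior-averaged gradient.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's implementation):
# Bayesian SGD for a decision-independent problem with a conjugate normal model.
# Hypothetical objective F(x, theta) = E_{xi ~ N(theta, sigma2)}[(x - xi)^2],
# whose gradient w.r.t. x is 2 * (x - theta). The true theta is unknown and is
# estimated from streaming data xi_1, xi_2, ... via the Bayesian posterior.

rng = np.random.default_rng(0)

true_theta = 1.5          # unknown distributional parameter (for simulation only)
x = 0.0                   # initial decision
mu, tau2 = 0.0, 10.0      # normal prior on theta: N(mu, tau2)
sigma2 = 1.0              # known observation noise variance

for t in range(1, 2001):
    # One new data point arrives at time t (streaming input data).
    xi = rng.normal(true_theta, np.sqrt(sigma2))

    # Conjugate posterior update: N(mu, tau2) prior with N(theta, sigma2) likelihood.
    tau2_new = 1.0 / (1.0 / tau2 + 1.0 / sigma2)
    mu = tau2_new * (mu / tau2 + xi / sigma2)
    tau2 = tau2_new

    # Bayesian-averaged gradient: E_{theta ~ posterior}[grad_x F(x, theta)].
    # For this quadratic example it is available in closed form as 2 * (x - E[theta]);
    # in general it could be approximated by sampling theta from the posterior.
    grad = 2.0 * (x - mu)

    # SGD step with a diminishing step size.
    x -= (1.0 / t) * grad

print(f"posterior mean of theta: {mu:.3f}, decision x: {x:.3f}")
# Both should be close to true_theta = 1.5, since the optimal decision here is x* = theta.
```

In this toy example the posterior mean and the decision converge together, which mirrors the abstract's claim that the joint estimation-and-optimization scheme converges asymptotically as data accumulate; the decision-dependent case would additionally require the sampling distribution of each data point to depend on the current decision.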
