Paper Title

Distributed Optimization with Inexact Oracle

Authors

Kui Zhu, Yichen Zhang, Yutao Tang

Abstract

In this paper, we study the distributed optimization problem using approximate first-order information. We suppose that each agent can repeatedly call an inexact first-order oracle of its individual objective function and exchange information with its time-varying neighbors. We revisit the distributed subgradient method in this circumstance and show its suboptimality under square-summable but not summable step sizes. We also present several conditions on the inexactness of the local oracles that ensure exact convergence of the iterative sequences towards the global optimal solution. A numerical example is given to verify the effectiveness of our algorithm.

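To illustrate the setting described in the abstract, below is a minimal sketch (not the authors' implementation) of the distributed subgradient iteration with an inexact first-order oracle over a time-varying network. All concrete choices are assumptions made for illustration: four agents with absolute-value objectives f_i(x) = |x - a_i|, two alternating doubly stochastic mixing matrices standing in for time-varying neighbors, a bounded oracle error, and the step size alpha_k = 1/(k+1), which is square summable but not summable.

```python
# Minimal sketch of a distributed subgradient method with an inexact
# first-order oracle (illustrative assumptions, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# Local data: each agent i holds f_i(x) = |x - a[i]|; the sum is minimized
# by any median of a, i.e. any point in [0, 1] for this choice of a.
a = np.array([-2.0, 0.0, 1.0, 5.0])
n = len(a)

def inexact_subgradient(i, x, eps=0.05):
    """Inexact oracle: a subgradient of f_i at x plus a bounded error."""
    g = np.sign(x - a[i])                 # exact subgradient (0 also valid at the kink)
    return g + rng.uniform(-eps, eps)     # bounded inexactness (assumed error model)

# Two doubly stochastic mixing matrices; alternating between them mimics a
# time-varying neighbor graph whose union over two steps is connected.
W1 = np.array([[0.5, 0.5, 0.0, 0.0],
               [0.5, 0.5, 0.0, 0.0],
               [0.0, 0.0, 0.5, 0.5],
               [0.0, 0.0, 0.5, 0.5]])
W2 = np.array([[0.5, 0.0, 0.0, 0.5],
               [0.0, 0.5, 0.5, 0.0],
               [0.0, 0.5, 0.5, 0.0],
               [0.5, 0.0, 0.0, 0.5]])

x = rng.normal(size=n)                    # each agent's local estimate
for k in range(5000):
    W = W1 if k % 2 == 0 else W2          # time-varying neighbors
    alpha = 1.0 / (k + 1)                 # square summable but not summable
    mixed = W @ x                         # consensus step with current neighbors
    grads = np.array([inexact_subgradient(i, mixed[i]) for i in range(n)])
    x = mixed - alpha * grads             # local (inexact) subgradient step

print("agent estimates:", np.round(x, 3))
```

With the bounded error left in, the agents' estimates should settle in a neighborhood of the minimizing set (the medians of a), consistent with the suboptimality discussed in the abstract; setting eps = 0 recovers the exact distributed subgradient method under the same step sizes.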