Paper Title

A Correlatively Sparse Lagrange Multiplier Expression Relaxation for Polynomial Optimization

Authors

Zheng Qu, Xindong Tang

Abstract

In this paper, we consider polynomial optimization with correlative sparsity. We construct correlatively sparse Lagrange multiplier expressions (CS-LMEs) and propose CS-LME reformulations for polynomial optimization problems using the Karush-Kuhn-Tucker optimality conditions. Correlatively sparse sum-of-squares (CS-SOS) relaxations are applied to solve the CS-LME reformulation. We show that the CS-LME reformulation inherits the original correlative sparsity pattern, and the CS-SOS relaxation provides sharper lower bounds when applied to the CS-LME reformulation, compared with when it is applied to the original problem. Moreover, the convergence of our approach is guaranteed under mild conditions. In numerical experiments, our new approach usually finds the global optimal value (up to a negligible error) with a low relaxation order for cases where directly solving the problem fails to get an accurate approximation. Also, by properly exploiting the correlative sparsity, our CS-LME approach requires less computational time than the original LME approach to reach the same accuracy level.
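As a toy illustration of the Lagrange multiplier expression (LME) idea that the abstract builds on (this is the classical single-constraint sphere case, not the paper's correlatively sparse construction), consider minimizing a polynomial f over the unit sphere g(x) = x'x - 1 = 0. The KKT condition grad f = λ grad g with grad g = 2x gives, on the sphere, λ(x) = x · grad f(x) / 2, which is itself a polynomial in x. The LME reformulation then appends the polynomial equations grad f(x) - λ(x) grad g(x) = 0 as extra constraints. A minimal sympy sketch (the function f and the minimizer below are illustrative choices, not from the paper):

```python
import sympy as sp

# Toy LME for minimizing f on the unit circle g(x) = x1^2 + x2^2 - 1 = 0.
# KKT: grad f = lam * grad g, with grad g = 2x; on the sphere this yields
#   lam(x) = x . grad f(x) / 2,  a polynomial "multiplier expression".
x1, x2 = sp.symbols('x1 x2', real=True)
f = x1**4 + x2**4          # illustrative objective (not from the paper)
g = x1**2 + x2**2 - 1

grad_f = [sp.diff(f, v) for v in (x1, x2)]
lam = (x1 * grad_f[0] + x2 * grad_f[1]) / 2   # the LME: polynomial in x

# LME reformulation: add the KKT equations grad f - lam * grad g = 0
# as extra polynomial equality constraints alongside g = 0.
kkt = [sp.expand(grad_f[i] - lam * 2 * v) for i, v in enumerate((x1, x2))]

# At the known minimizer (1/sqrt(2), 1/sqrt(2)) the KKT residuals vanish
# and f attains its global minimum 1/2 on the circle.
pt = {x1: 1 / sp.sqrt(2), x2: 1 / sp.sqrt(2)}
residuals = [sp.simplify(e.subs(pt)) for e in kkt]
print(residuals)      # [0, 0]
print(f.subs(pt))     # 1/2
```

The paper's contribution is to build such multiplier expressions so that they respect the correlative sparsity pattern of the original problem, which keeps the subsequent CS-SOS relaxation cheap while sharpening its lower bounds.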
