Paper Title


On the Finite Optimal Convergence of Logic-Based Benders' Decomposition in Solving 0-1 Min-max Regret Optimization Problems with Interval Costs

Authors

Assunção, Lucas, Santos, Andréa Cynthia, Noronha, Thiago F., Andrade, Rafael

Abstract


This paper addresses a class of problems under interval data uncertainty composed of min-max regret versions of classical 0-1 optimization problems with interval costs. We refer to them as interval 0-1 min-max regret problems. The state-of-the-art exact algorithms for this class of problems work by solving a corresponding mixed integer linear programming formulation in a Benders' decomposition fashion. Each of the possibly exponentially many Benders' cuts is separated on the fly through the resolution of an instance of the classical 0-1 optimization problem counterpart. Since these separation subproblems may be NP-hard, not all of them can be modeled by means of linear programming, unless P = NP. In these cases, the convergence of the aforementioned algorithms is not guaranteed in a straightforward manner. In fact, to the best of our knowledge, their finite convergence has not been explicitly proved for any interval 0-1 min-max regret problem. In this work, we formally describe these algorithms through the definition of a logic-based Benders' decomposition framework and prove their convergence to an optimal solution in a finite number of iterations. As this framework is applicable to any interval 0-1 min-max regret problem, its finite optimal convergence also holds in the cases where the separation subproblems are NP-hard.
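To illustrate the decomposition scheme the abstract describes, here is a minimal, hedged sketch in Python on a toy instance of an interval min-max regret *selection* problem (choose exactly `p` of `n` items; each item's cost lies in an interval). The instance data, function names, and the brute-force master are all illustrative assumptions, not the paper's implementation: the paper solves a MILP master, whereas this sketch enumerates candidate solutions, which is only viable for tiny instances. The separation step, however, matches the abstract's description: each cut is obtained by solving an instance of the classical 0-1 counterpart under the candidate solution's worst-case scenario.

```python
from itertools import combinations

# Toy instance (hypothetical data): choose exactly p of n items,
# item i's cost lies in the interval [lo[i], hi[i]].
lo = [1, 2, 4, 3]
hi = [5, 3, 6, 8]
n, p = len(lo), 2

def classical_opt(costs):
    """Classical 0-1 counterpart: pick the p cheapest items."""
    order = sorted(range(n), key=lambda i: costs[i])
    chosen = set(order[:p])
    return chosen, sum(costs[i] for i in chosen)

def worst_scenario(x):
    """Worst-case scenario for solution x: items in x at their upper
    cost, all other items at their lower cost."""
    return [hi[i] if i in x else lo[i] for i in range(n)]

def separate(x):
    """Separation subproblem: the max regret of x is attained by solving
    the classical counterpart under x's worst-case scenario."""
    s = worst_scenario(x)
    y, opt = classical_opt(s)
    return sum(s[i] for i in x) - opt, y

# Logic-based Benders loop: the relaxed master only accounts for the
# cuts (reference solutions y) generated so far, so its value is a
# lower bound on the true max regret; separation closes the gap.
cuts = [set(range(p))]  # arbitrary initial reference solution
while True:
    best_x, best_val = None, float("inf")
    for comb in combinations(range(n), p):  # brute-force master
        x = set(comb)
        s = worst_scenario(x)
        val = max(sum(s[i] for i in x) - sum(s[i] for i in y)
                  for y in cuts)
        if val < best_val:
            best_x, best_val = x, val
    true_regret, y = separate(best_x)
    if true_regret <= best_val + 1e-9:
        break              # master bound is tight: optimal
    cuts.append(y)         # add the violated Benders cut

print(best_x, true_regret)  # robust solution and its max regret
```

Each iteration either proves optimality or adds a cut built from a new reference solution `y`; since there are finitely many 0-1 solutions, the loop terminates in finitely many iterations, which is the convergence property the paper establishes for the general framework.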
