Paper Title
New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
Paper Authors
Abstract
In this paper, two new subspace minimization conjugate gradient methods based on the $p$-regularization model are proposed, where a special scaled norm in the $p$-regularization model is analyzed. Different choices of the special scaled norm lead to different solutions of the $p$-regularized subproblem. Based on the analysis of the solutions in a two-dimensional subspace, we derive new directions satisfying the sufficient descent condition. With a modified nonmonotone line search, we establish the global convergence of the proposed methods under mild assumptions. The $R$-linear convergence of the proposed methods is also analyzed. Numerical results show that, for the CUTEr library, the proposed methods are superior to four conjugate gradient methods proposed by Hager and Zhang (SIAM J Optim 16(1):170-192, 2005), Dai and Kou (SIAM J Optim 23(1):296-320, 2013), Liu and Liu (J Optim Theory Appl 180(3):879-906, 2019), and Li et al. (Comput Appl Math 38(1), 2019), respectively.
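To make the $p$-regularized subproblem mentioned in the abstract concrete, the following is a minimal sketch (not the authors' method) of solving $\min_d\; g^\top d + \tfrac{1}{2} d^\top B d + \tfrac{\sigma}{p}\|d\|^p$ in a low-dimensional subspace with the ordinary Euclidean norm, rather than the paper's special scaled norm. It uses the standard characterization that a minimizer satisfies $(B + \lambda I)d = -g$ with $\lambda = \sigma\|d\|^{p-2}$, and bisects on $r = \|d\|$; the function name and the assumption that $B$ is positive definite are illustrative choices, not from the paper.

```python
import numpy as np

def p_regularized_step(g, B, sigma=1.0, p=3.0, iters=200):
    """Sketch: minimize g^T d + 0.5 d^T B d + (sigma/p)*||d||^p
    over d, using the Euclidean norm (the paper analyzes a special
    scaled norm instead).  A minimizer satisfies
        (B + lam*I) d = -g,   lam = sigma * ||d||^(p-2),
    so we bisect on r = ||d|| to find the fixed point of
        r = || (B + sigma*r^(p-2) * I)^{-1} g ||.
    Assumes B is symmetric positive definite for simplicity."""
    I = np.eye(len(g))

    def norm_d(r):
        lam = sigma * r ** (p - 2.0)
        return np.linalg.norm(np.linalg.solve(B + lam * I, -g))

    # Bracket the fixed point: norm_d is decreasing in r, so grow
    # hi until norm_d(hi) <= hi, then bisect on [lo, hi].
    lo, hi = 0.0, 1.0
    while norm_d(hi) > hi:
        hi *= 2.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if norm_d(mid) > mid:
            lo = mid
        else:
            hi = mid
    r = 0.5 * (lo + hi)
    lam = sigma * r ** (p - 2.0)
    return np.linalg.solve(B + lam * I, -g)
```

In a subspace minimization conjugate gradient method, a subproblem of this shape is posed over a two-dimensional subspace (spanned, e.g., by the current gradient and the previous direction), so $g$ and $B$ here would be the $2$-dimensional projected gradient and Hessian approximation.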