Title
Hyper-differential sensitivity analysis with respect to model discrepancy: Optimal solution updating
Authors
Abstract
A common goal throughout science and engineering is to solve optimization problems constrained by computational models. However, in many cases a high-fidelity numerical simulation of the system cannot be optimized due to code complexity and computational costs, which prohibit the use of intrusive and many-query algorithms. Rather, lower-fidelity models are constructed to enable intrusive algorithms for large-scale optimization. As a result of the discrepancy between high- and low-fidelity models, optimal solutions determined using low-fidelity models are frequently far from true optimality. In this article, we introduce a novel approach that uses post-optimality sensitivities with respect to model discrepancy to update the optimization solution. Limited high-fidelity data are used to calibrate the model discrepancy in a Bayesian framework, which in turn is propagated through post-optimality sensitivities of the low-fidelity optimization problem. Our formulation exploits structure in the post-optimality sensitivity operator to achieve computational scalability. Numerical results demonstrate how an optimal solution computed using a low-fidelity model may be significantly improved with limited evaluations of a high-fidelity model.
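The workflow the abstract describes can be illustrated on a toy problem. The sketch below is a hypothetical 1-D illustration, not the paper's formulation: the models `f_lo`, `f_hi`, the target `t`, and the linear discrepancy parameterization `delta(u; theta) = theta * u` are all assumptions chosen for simplicity, and the Bayesian calibration is replaced by a single-point least-squares fit. It shows the four ingredients: optimize with the low-fidelity model, calibrate a discrepancy from limited high-fidelity data, compute a post-optimality sensitivity via the implicit function theorem, and apply a first-order update to the optimal solution.

```python
import math

t = 2.0                                  # target value in J(u) = (f(u) - t)^2
f_lo = lambda u: u * u                   # cheap low-fidelity model
f_hi = lambda u: u * u + 0.3 * u         # expensive high-fidelity model

# Step 1: optimize using only the low-fidelity model: argmin (u^2 - t)^2 = sqrt(t).
u0 = math.sqrt(t)

# Step 2: posit a discrepancy delta(u; theta) = theta * u and calibrate theta
# from a single high-fidelity evaluation at u0 (a stand-in for the paper's
# Bayesian calibration from limited high-fidelity data).
theta = (f_hi(u0) - f_lo(u0)) / u0

# Step 3: post-optimality sensitivity du*/dtheta at theta = 0. The first-order
# optimality condition is g(u, theta) = dJ/du = 2*(u^2 + theta*u - t)*(2*u + theta) = 0,
# so by the implicit function theorem du*/dtheta = -(dg/dtheta) / (dg/du) at (u0, 0).
g_u = 2.0 * (2.0 * u0) ** 2              # residual term vanishes at the optimum
g_theta = 2.0 * u0 * (2.0 * u0)
du_dtheta = -g_theta / g_u

# Step 4: first-order update of the low-fidelity optimal solution.
u1 = u0 + du_dtheta * theta

err0 = (f_hi(u0) - t) ** 2               # high-fidelity error of the low-fidelity optimum
err1 = (f_hi(u1) - t) ** 2               # high-fidelity error of the updated solution
print(f"low-fidelity optimum u0 = {u0:.4f}, high-fidelity error = {err0:.4f}")
print(f"updated solution     u1 = {u1:.4f}, high-fidelity error = {err1:.6f}")
```

In this example the update drops the high-fidelity objective error by roughly two orders of magnitude using just one high-fidelity model evaluation, which is the qualitative behavior the abstract claims; the paper's contribution is making the analogous sensitivity computation scale to large PDE-constrained problems.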