Estimates of Generalized Hessians for Optimal Value Functions in Mathematical Programming

Bibliographic Details
Published in: Set-Valued and Variational Analysis, 2022-09, Vol. 30 (3), p. 847-871
Author: Zemkoho, Alain B.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: We consider the optimal value function of a parametric optimization problem. A large number of publications have been dedicated to the study of the continuity and differentiability properties of this function. However, the differentiability analysis in the current literature has mostly been limited to first-order results, focusing on estimates of its directional derivatives and subdifferentials, given that the function is typically nonsmooth. With the progress made in the last two to three decades in major subfields of optimization such as robust, minmax, semi-infinite, and bilevel optimization, and their connection to the optimal value function, there is a need for a second-order analysis of the generalized differentiability properties of this function. Such an analysis could enable the development of robust solution algorithms, such as the Newton method. The main goal of this paper is to provide estimates of the generalized Hessian of the optimal value function. Our results are based on two handy tools from parametric optimization, namely the optimal solution and Lagrange multiplier mappings, for which fully detailed estimates of their generalized derivatives are either well known or can easily be obtained.
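To orient the reader, a minimal sketch of the standard parametric setting follows; the symbols f, g, \varphi, S, and \Lambda below are generic notation for a smooth-data program and are not taken from the paper itself. The optimal value function, the optimal solution mapping, and the Lagrange multiplier mapping are commonly written as
\[
\varphi(x) := \min_{y}\ \{\, f(x,y) \;:\; g(x,y) \le 0 \,\}, \qquad
S(x) := \{\, y \;:\; g(x,y) \le 0,\ f(x,y) = \varphi(x) \,\},
\]
\[
\Lambda(x,y) := \{\, u \ge 0 \;:\; \nabla_y f(x,y) + \nabla_y g(x,y)^{\top} u = 0,\ u^{\top} g(x,y) = 0 \,\}.
\]
Under suitable assumptions (for instance, a constraint qualification and local boundedness of S), a typical first-order upper estimate takes the form
\[
\partial \varphi(x) \;\subseteq\; \bigcup_{y \in S(x)} \ \bigcup_{u \in \Lambda(x,y)} \big\{\, \nabla_x f(x,y) + \nabla_x g(x,y)^{\top} u \,\big\},
\]
and the paper's contribution is to push this type of estimate to second order, i.e., to the generalized Hessian of \varphi, again expressed through S and \Lambda.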
ISSN: 1877-0533, 1877-0541
DOI: 10.1007/s11228-021-00623-y