An augmented Lagrangian method with constraint generation for shape-constrained convex regression problems
Published in: Mathematical Programming Computation, 2022-06, Vol. 14 (2), pp. 223-270
Format: Article
Language: English
Online access: Full text
Abstract: The shape-constrained convex regression problem deals with fitting a convex function to observed data under additional constraints such as component-wise monotonicity and uniform Lipschitz continuity. This paper provides a unified framework for computing the least squares estimator of a multivariate shape-constrained convex regression function in $\mathbb{R}^d$. We prove that the least squares estimator is computable via solving an essentially constrained convex quadratic programming (QP) problem with $(d+1)n$ variables, $n(n-1)$ linear inequality constraints and $n$ possibly non-polyhedral inequality constraints, where $n$ is the number of data points. To efficiently solve the generally very large-scale convex QP, we design a proximal augmented Lagrangian method (proxALM) whose subproblems are solved by the semismooth Newton method. To further accelerate the computation when $n$ is huge, we design a practical implementation of the constraint generation method such that each reduced problem is solved efficiently by our proposed proxALM. Comprehensive numerical experiments, including those in the pricing of basket options and the estimation of production functions in economics, demonstrate that the proposed proxALM outperforms state-of-the-art algorithms, and that the proposed acceleration technique further shortens the computation time by a large margin.
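To make the QP described in the abstract concrete, the following is a minimal sketch of the standard least squares formulation for convex regression, built and solved with CVXPY purely for illustration. It is not the paper's proxALM or constraint generation solver; the function name `convex_regression_lse`, the parameter `lipschitz_bound`, and the choice of a generic QP/SOCP solver are assumptions introduced here.

```python
# Minimal sketch (assumed formulation, not the authors' proxALM solver):
# least squares convex regression as a QP with (d+1)n variables
# (fitted values theta and subgradients xi) and n(n-1) linear
# inequality constraints, plus n optional Lipschitz-ball constraints.
import cvxpy as cp
import numpy as np

def convex_regression_lse(X, y, lipschitz_bound=None):
    """Fit a convex function to data (X, y) by least squares.

    X : (n, d) array of covariates; y : (n,) array of responses.
    If lipschitz_bound is given, n second-order-cone constraints
    ||xi_i||_2 <= L are added (the possibly non-polyhedral constraints
    mentioned in the abstract).
    """
    n, d = X.shape
    theta = cp.Variable(n)      # theta_i = fitted value f(x_i)
    xi = cp.Variable((n, d))    # xi_i = subgradient of f at x_i

    constraints = []
    # Convexity: theta_j >= theta_i + xi_i^T (x_j - x_i) for all i != j,
    # i.e. n(n-1) linear inequality constraints.
    for i in range(n):
        for j in range(n):
            if i != j:
                constraints.append(theta[j] >= theta[i] + xi[i] @ (X[j] - X[i]))
    # Optional uniform Lipschitz continuity: n norm constraints.
    if lipschitz_bound is not None:
        constraints += [cp.norm(xi[i], 2) <= lipschitz_bound for i in range(n)]

    objective = cp.Minimize(cp.sum_squares(y - theta))
    cp.Problem(objective, constraints).solve()
    return theta.value, xi.value
```

Because the number of convexity constraints grows as $n(n-1)$, this direct formulation is only practical for small $n$; that scaling bottleneck is exactly what the paper's proxALM and constraint generation scheme are designed to overcome.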
ISSN: 1867-2949, 1867-2957
DOI: 10.1007/s12532-021-00210-0