Efficient Non-linear Optimization via Multi-scale Gradient Filtering

Bibliographic Details
Published in: Computer Graphics Forum, 2013-09, Vol. 32 (6), p. 89-100
Main Authors: Martin, Tobias; Joshi, Pushkar; Bergou, Miklós; Carr, Nathan
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: We present a method for accelerating the convergence of continuous non-linear shape optimization algorithms. We start with a general method for constructing gradient vector fields on a manifold, and we analyse this method from a signal processing viewpoint. This analysis reveals that we can construct various filters using the Laplace–Beltrami operator of the shape that can effectively separate the components of the gradient at different scales. We use this idea to adaptively change the scale of features being optimized to arrive at a solution that is optimal across multiple scales. This is in contrast to traditional descent-based methods, for which the rate of convergence often stalls early once the high frequency components have been optimized. We demonstrate how our method can be easily integrated into existing non-linear optimization frameworks such as gradient descent, Broyden–Fletcher–Goldfarb–Shanno (BFGS) and the non-linear conjugate gradient method. We show significant performance improvement for shape optimization in variational shape modelling and parameterization, and we also demonstrate the use of our method for efficient physical simulation.
ISSN: 0167-7055
1467-8659
DOI: 10.1111/cgf.12019
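
To make the filtering idea in the abstract concrete, the sketch below shows one plausible (hypothetical, not taken from the paper) way to plug a Laplacian low-pass filter into a plain gradient-descent loop over mesh vertex positions: the per-vertex gradient is smoothed by solving (I + t L) g_filtered = g, and the filter scale t is swept from coarse to fine so large-scale features are optimized before fine detail. The uniform graph Laplacian, the implicit filter, and all function names and parameters are illustrative assumptions; the paper constructs its filters from the Laplace–Beltrami operator of the shape and also integrates the idea into BFGS and the non-linear conjugate gradient method.

# Hypothetical sketch of multi-scale gradient filtering for shape optimization.
# The implicit filter (I + t*L)^{-1} is a Laplacian low-pass filter used as a
# stand-in for the Laplace-Beltrami-based filters described in the abstract.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla


def graph_laplacian(n_vertices, triangles):
    """Combinatorial (uniform) graph Laplacian of a triangle mesh.

    A simple stand-in for the Laplace-Beltrami operator of the shape;
    enough to illustrate the scale-separation idea.
    """
    rows, cols = [], []
    for tri in triangles:
        for a, b in ((0, 1), (1, 2), (2, 0)):
            rows += [tri[a], tri[b]]
            cols += [tri[b], tri[a]]
    data = np.ones(len(rows))
    adj = sp.coo_matrix((data, (rows, cols)), shape=(n_vertices, n_vertices)).tocsr()
    adj.data[:] = 1.0                       # collapse duplicate edges to weight 1
    degree = sp.diags(np.asarray(adj.sum(axis=1)).ravel())
    return (degree - adj).tocsc()


def filter_gradient(L, grad, scale):
    """Low-pass filter the per-vertex gradient: solve (I + scale*L) g = grad.

    A large `scale` keeps only coarse (low-frequency) components of the
    gradient; scale -> 0 recovers the unfiltered gradient.
    """
    n = L.shape[0]
    A = sp.identity(n, format="csc") + scale * L
    solve = spla.factorized(A)              # prefactor once, reuse for x/y/z columns
    return np.column_stack([solve(grad[:, d]) for d in range(grad.shape[1])])


def multiscale_gradient_descent(vertices, triangles, energy_grad,
                                scales=(100.0, 10.0, 1.0, 0.0),
                                steps_per_scale=50, step_size=1e-2):
    """Gradient descent that sweeps the filter scale from coarse to fine,
    so coarse-scale features are optimized before fine detail."""
    x = vertices.copy()
    L = graph_laplacian(len(x), triangles)
    for scale in scales:
        for _ in range(steps_per_scale):
            g = energy_grad(x)              # user-supplied gradient of the shape energy
            if scale > 0.0:
                g = filter_gradient(L, g, scale)
            x -= step_size * g
    return x

In this sketch the factorization of (I + t L) is reused for the x, y and z gradient columns at a given scale, so per-iteration filtering reduces to a few sparse back-substitutions; the schedule of scales and step sizes is an arbitrary placeholder, not the adaptive scheme described in the paper.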