A Robust and Efficient Hybrid Algorithm for Global Optimization
| Field | Value |
|---|---|
| Main authors | , , , |
| Format | Conference paper |
| Language | English |
| Subjects | |
| Online access | Order full text |
Abstract: Multidisciplinary Design Optimization (MDO) makes it possible to reach more effective solutions in the design of complex systems. The main difficulty in building the integrated framework required by its iterative optimization procedure is automating design codes that were written to be operated by experts. Such automation calls for a robust optimization algorithm that can reach the global optimum without demanding much expertise from the user, either about the design problem or about the optimizer's parameters. Gradient search methods reach the global optimum efficiently only when the user supplies a good initial guess, while a Genetic Algorithm (GA) depends on expertise in choosing its parameters. This paper proposes a new hybrid approach, Genetic Algorithm Guided Gradient Search (GAGGS), which overcomes these limitations by simultaneously exploiting the gradient method's ability to converge quickly to a local optimum and the GA's ability to explore the entire design space. To demonstrate its robustness and efficiency, the algorithm is applied to Keane's bumpy function with two and ten design variables.

An illustrative sketch of such a GA-plus-gradient hybrid is given below the record.
DOI: 10.1109/IADCC.2009.4809059
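The sketch below illustrates only the general idea summarized in the abstract: a GA explores the design space while a gradient-based local search refines a promising candidate each generation, demonstrated on Keane's bumpy function. It is not the GAGGS algorithm from the paper; the population size, mutation scale, penalty weight, and the choice to refine only the best child per generation are assumptions made purely for illustration.

```python
# Minimal sketch of a GA-plus-gradient hybrid on Keane's bumpy function.
# NOT the GAGGS algorithm from the paper; parameter choices are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def bump(x):
    """Keane's bumpy function, negated for minimization, with a simple
    quadratic penalty for the usual constraints prod(x) > 0.75 and
    sum(x) < 7.5 * n over the box 0 <= x_i <= 10."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    num = np.sum(np.cos(x) ** 4) - 2.0 * np.prod(np.cos(x) ** 2)
    den = np.sqrt(np.sum(np.arange(1, n + 1) * x ** 2)) + 1e-12
    value = -abs(num / den)                        # minimize the negative
    penalty = max(0.0, 0.75 - np.prod(x)) ** 2     # prod(x) > 0.75
    penalty += max(0.0, np.sum(x) - 7.5 * n) ** 2  # sum(x) < 7.5 n
    return value + 10.0 * penalty                  # penalty weight is arbitrary

def ga_guided_gradient_search(n_vars=2, pop_size=40, generations=50):
    """GA exploration plus per-generation gradient refinement of the best child."""
    bounds = [(0.0, 10.0)] * n_vars
    pop = rng.uniform(0.0, 10.0, size=(pop_size, n_vars))
    for _ in range(generations):
        fitness = np.array([bump(ind) for ind in pop])
        # Tournament selection: keep the better of two random individuals.
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(fitness[idx[:, 0]] < fitness[idx[:, 1]],
                           idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # Blend crossover between consecutive parents, plus Gaussian mutation.
        alpha = rng.uniform(size=(pop_size, n_vars))
        children = alpha * parents + (1 - alpha) * np.roll(parents, 1, axis=0)
        children += rng.normal(scale=0.3, size=children.shape)
        children = np.clip(children, 0.0, 10.0)
        # Gradient refinement of the current best child (the "guided" step).
        best = children[np.argmin([bump(c) for c in children])]
        local = minimize(bump, best, method="L-BFGS-B", bounds=bounds)
        children[0] = local.x                      # keep the refined point (elitism)
        pop = children
    fitness = np.array([bump(ind) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()

if __name__ == "__main__":
    x_best, f_best = ga_guided_gradient_search(n_vars=2)
    print("best point:", x_best, "objective:", f_best)
```

For the ten-variable case mentioned in the abstract, the same sketch can be run with `n_vars=10`; a practical implementation would use a proper constraint-handling scheme rather than this fixed quadratic penalty.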