Using Recurrent Neural Networks to Model Spatial Grammars for Design Creation

Bibliographic Details
Published in: Journal of Mechanical Design (1990), 2020-10, Vol. 142 (10), Article 104501
Authors: Yukish, Michael A.; Stump, Gary M.; Miller, Simon W.
Format: Article
Language: English
Online access: Full text
Description
Abstract: The authors present preliminary results on successfully training a recurrent neural network to learn a spatial grammar embodied in a data set, and then generate new designs that comply with the grammar but are not from the data set, demonstrating generalized learning. For the test case, the data were created by first exercising a generative context-free spatial grammar representing physical layouts, which included infeasible designs due to geometric interferences, and then removing the designs that violated geometric constraints; the result is a data set drawn from a design grammar of higher complexity, a context-sensitive grammar. A character recurrent neural network (char-RNN) was then trained on the remaining positive results. Analysis shows that the char-RNN learned the spatial grammar with high reliability: for the given problem with tuned hyperparameters, it achieved up to a 98% success rate, compared to a 62% success rate when randomly sampling the generative grammar. For a more complex problem where random sampling yields only an 18% success rate, a trained char-RNN generated feasible solutions with an 89% success rate. Furthermore, the char-RNN generated designs differing from the training set at a rate of over 99%, demonstrating generalized learning.
ISSN: 1050-0472; 1528-9001
DOI: 10.1115/1.4046806
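
The abstract above outlines the approach at a high level, but the record includes no code. As a rough, hypothetical illustration of a character-level RNN of the kind described, the sketch below trains a small LSTM in PyTorch on placeholder design strings and then samples a new string from it. The corpus, vocabulary, end-of-design marker, network sizes, and training schedule are illustrative assumptions, not the authors' implementation; sampling temperature and the downstream geometric feasibility check are omitted for brevity.

# Minimal char-RNN sketch (hypothetical; not the authors' code). Each string below
# stands in for one feasible, grammar-generated design encoding; the real encoding,
# vocabulary, and hyperparameters used in the paper are not reproduced here.
import torch
import torch.nn as nn

designs = ["LRLRUD", "LLRRUU", "LRUDLR"]       # placeholder training corpus
chars = sorted(set("".join(designs)) | {"$"})  # "$" marks end-of-design
stoi = {c: i for i, c in enumerate(chars)}

class CharRNN(nn.Module):
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.rnn(self.embed(x), state)
        return self.head(h), state

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Train by next-character prediction: the network learns which symbol may follow
# a given prefix, which is how the grammar implicit in the feasible designs is absorbed.
for epoch in range(200):
    for s in designs:
        ids = torch.tensor([[stoi[c] for c in s + "$"]])
        logits, _ = model(ids[:, :-1])
        loss = loss_fn(logits.squeeze(0), ids[0, 1:])
        opt.zero_grad()
        loss.backward()
        opt.step()

# Sample a new design: seed with one character and draw characters until the end
# marker; the resulting string would then be checked against geometric constraints.
with torch.no_grad():
    idx = torch.tensor([[stoi["L"]]])
    state, out = None, []
    for _ in range(20):
        logits, state = model(idx, state)
        probs = torch.softmax(logits[0, -1], dim=-1)
        idx = torch.multinomial(probs, 1).view(1, 1)
        c = chars[idx.item()]
        if c == "$":
            break
        out.append(c)
    print("".join(out))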