Generative NeuroEvolution for Deep Learning
Format: Article
Language: English
Abstract: An important goal for the machine learning (ML) community is to create approaches that can learn solutions with human-level capability. One domain where humans have held a significant advantage is visual processing. A prominent approach to addressing this gap has been machine learning methods inspired by natural systems, such as artificial neural networks (ANNs), evolutionary computation (EC), and generative and developmental systems (GDS). Research into deep learning has demonstrated that such architectures can achieve performance competitive with humans on some visual tasks; however, these systems have primarily been trained through supervised and unsupervised learning algorithms. Alternatively, research suggests that evolution may have a significant role in the development of visual systems. This paper therefore investigates the role neuro-evolution (NE) can take in deep learning. In particular, Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) is an NE approach that can effectively learn large neural structures by training an indirect encoding that compresses the ANN weight pattern as a function of geometry. The results show that HyperNEAT struggles with performing image classification by itself, but can be effective in training a feature extractor that other ML approaches can learn from. Thus neuro-evolution combined with other ML methods provides an intriguing area of research that can replicate the processes in nature.
DOI: 10.48550/arxiv.1312.5355
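The indirect encoding described in the abstract can be illustrated with a minimal sketch: a small compositional pattern-producing network (CPPN) is queried with the geometric coordinates of each pair of neurons on a substrate and returns the connection weight between them, so a compact function generates a much larger weight pattern. The sketch below is illustrative only; the `cppn` function stands in for a network that HyperNEAT would actually evolve with NEAT, and the substrate coordinates, weight threshold, and layer sizes are arbitrary assumptions, not values from the paper.

```python
import numpy as np

def cppn(x1, y1, x2, y2):
    """Stand-in for an evolved CPPN: maps the coordinates of a source
    neuron (x1, y1) and a target neuron (x2, y2) to a connection weight.
    Geometric regularities in this function become regularities in the
    generated weight pattern."""
    d = np.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)
    return np.sin(3.0 * d) * np.exp(-d)

def build_weights(src_coords, dst_coords, threshold=0.2):
    """Query the CPPN for every source/target pair to fill a weight matrix;
    connections whose magnitude falls below the threshold are pruned."""
    w = np.zeros((len(dst_coords), len(src_coords)))
    for i, (x2, y2) in enumerate(dst_coords):
        for j, (x1, y1) in enumerate(src_coords):
            value = cppn(x1, y1, x2, y2)
            if abs(value) > threshold:
                w[i, j] = value
    return w

# Example: an 8x8 input "retina" fully queried against a 4x4 hidden layer.
inputs = [(x / 7.0, y / 7.0) for x in range(8) for y in range(8)]  # 64 input neurons
hidden = [(x / 3.0, y / 3.0) for x in range(4) for y in range(4)]  # 16 hidden neurons
W = build_weights(inputs, hidden)
print(W.shape)  # (16, 64): far more weights than the CPPN has parameters
```

In the setting the abstract describes, weights generated this way would define a feature extractor whose outputs are then handed to a conventional ML classifier, rather than serving as the classifier on their own.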