MASSIVELY SCALABLE PARALLEL NEURAL NETWORKS: A BIG DATA EXPERIMENT


Full description

Bibliographic details
Published in: International journal of information, business and management, 2016-05, Vol. 8 (2), p. 46
Authors: McMurtrey, Shannon D.; Sexton, Randall
Format: Article
Language: English
Online access: Full text
Description
Abstract: As the cost of computer hardware falls, the performance of processors and secondary storage continues to increase rapidly. These conditions have led to the storage of huge amounts of data, sometimes referred to as Big Data. The Neural Network (NN) has long been shown to be a capable tool for prediction; however, training in serial on such huge data sets is cumbersome and time-consuming. In this research, a parallelized NN that uses a modified genetic algorithm, the Neural Network Simultaneous Optimization Algorithm (NNSOA), is shown to significantly speed up the training process. Past research showed that the NNSOA was well suited to parallelizing NN training on computer-generated data (McMurtrey, 2013). This research extends the experiment to real-world data to validate those results.
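The abstract describes NNSOA as a modified genetic algorithm that optimizes all of a network's weights simultaneously, with fitness evaluation being the natural point of parallelization. The paper's own implementation is not reproduced here, so the following is only a generic sketch of that idea on a toy XOR task; the network size, GA settings, and all function names are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of GA-based "simultaneous" NN weight optimization.
# Each candidate's fitness is independent of the others, so the fitness
# loop is where a real system would fan work out across workers or data
# shards; here everything runs serially for clarity.
import math
import random

random.seed(0)

# Toy data set: learn y = x1 XOR x2.
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

N_HIDDEN = 3
# 2 inputs * N_HIDDEN weights + N_HIDDEN hidden biases
# + N_HIDDEN output weights + 1 output bias
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1

def forward(weights, x):
    """2-input, N_HIDDEN-unit, 1-output network with sigmoid activations."""
    w = iter(weights)
    hidden = []
    for _ in range(N_HIDDEN):
        s = next(w) * x[0] + next(w) * x[1] + next(w)  # two weights + bias
        hidden.append(1.0 / (1.0 + math.exp(-s)))
    out = sum(next(w) * h for h in hidden) + next(w)   # output layer + bias
    return 1.0 / (1.0 + math.exp(-out))

def fitness(weights):
    # Negative sum of squared errors; per-example terms are independent,
    # which is what makes the evaluation step trivially parallel.
    return -sum((forward(weights, x) - y) ** 2 for x, y in DATA)

def evolve(pop_size=40, generations=200, sigma=0.5):
    pop = [[random.uniform(-2, 2) for _ in range(N_WEIGHTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # elitist selection
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(N_WEIGHTS)      # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(N_WEIGHTS)        # single point mutation
            child[i] += random.gauss(0, sigma)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(round(-fitness(best), 4))  # final SSE on the toy data
```

Because every call to `fitness` touches only its own weight vector, a Big Data version of this loop could score each candidate (or each data partition) on a separate worker, which is the speedup the abstract reports.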
ISSN: 2076-9202, 2218-046X