Cerebral LSTM: A Better Alternative for Single- and Multi-Stacked LSTM Cell-Based RNNs


Bibliographic Details
Published in: SN Computer Science, March 2020, Vol. 1 (2), p. 85, Article 85
Author: Kumar, Ravin
Format: Article
Language: English
Online access: Full text
Description
Abstract: Deep learning has rapidly transformed the natural language processing domain with its recurrent neural networks. The LSTM is one such popular repeating cell unit used for building these recurrent neural network-based deep learning architectures. In this paper, we proposed a significantly improved version of the LSTM, named Cerebral LSTM, which has a much better ability to understand time-series data. Extensive experiments were conducted to obtain an unbiased performance comparison of the proposed version. The results showed that a recurrent neural network constructed using a single Cerebral LSTM cell outperformed both a recurrent neural network with a single LSTM cell and a recurrent neural network with two stacked LSTM cells.
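For readers unfamiliar with the terminology in the comparison, the sketch below illustrates the two baseline configurations the abstract refers to: a recurrent network built from a single LSTM cell (one recurrent layer) and one built from two stacked LSTM cells (two recurrent layers). This is a minimal illustration only, not code from the paper; the framework (PyTorch) and all layer sizes are assumptions, and the Cerebral LSTM cell itself is not sketched because its equations are not given in this record.

```python
# Minimal sketch (not from the paper) of the two baseline RNN configurations
# mentioned in the abstract. Framework and dimensions are illustrative choices.
import torch
import torch.nn as nn

input_size, hidden_size = 32, 64  # assumed feature/hidden sizes, not from the paper

# Baseline 1: recurrent network with a single LSTM cell (one recurrent layer).
single_lstm_rnn = nn.LSTM(input_size, hidden_size, num_layers=1, batch_first=True)

# Baseline 2: recurrent network with two stacked LSTM cells (two recurrent layers),
# where the second layer consumes the hidden-state sequence of the first.
stacked_lstm_rnn = nn.LSTM(input_size, hidden_size, num_layers=2, batch_first=True)

# Run both on the same toy batch of time-series data.
x = torch.randn(8, 20, input_size)  # (batch, time steps, features)
out_single, _ = single_lstm_rnn(x)
out_stacked, _ = stacked_lstm_rnn(x)
print(out_single.shape, out_stacked.shape)  # both: torch.Size([8, 20, 64])
```

The paper's claim is that an RNN using one Cerebral LSTM cell outperformed both of these baselines in the reported experiments.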
ISSN: 2662-995X; 2661-8907
DOI: 10.1007/s42979-020-0101-1