HcLSH: A Novel Non-Linear Monotonic Activation Function for Deep Learning Methods

Activation functions are essential components of any neural network model; they play a crucial role in determining the network's expressive power through the non-linearity they introduce. The Rectified Linear Unit (ReLU) has been the most popular and default choice for most deep neural network models becau...
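To illustrate the activation concept the abstract refers to, here is a minimal sketch of ReLU (not the paper's HcLSH function, whose definition is not given in this record):

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: passes positive inputs through unchanged,
    # maps all negative inputs to zero.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # → [0. 0. 3.]
```

The zero gradient for negative inputs is one of the ReLU drawbacks that alternative activations, such as the one proposed in this article, aim to address.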

Bibliographic Details
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main authors: Abdel-Nabi, Heba; Al-Naymat, Ghazi; Ali, Mostafa; Awajan, Arafat
Format: Article
Language: English
Subjects:
Online access: Full text