Zorro: A Flexible and Differentiable Parametric Family of Activation Functions That Extends ReLU and GELU
Main authors:
Format: Article
Language: eng
Subjects:
Online access: Order full text
Summary: Activation functions are an integral part of nearly all neural networks, from recent architectures such as Transformers and Extended LSTM (xLSTM) to traditional ones like Convolutional Neural Networks. They enable more effective training and capture nonlinear patterns in the data. More than 400 activation functions, with fixed or trainable parameters, have been proposed over the last 30 years, but only a few are widely used. ReLU remains one of the most common, with GELU and Swish variants appearing increasingly often. However, ReLU has a non-differentiable point and can suffer from exploding gradients, while GELU and Swish variants yield varying results across parameter settings and require additional parameters to adapt to datasets and architectures. This article introduces Zorro, a novel family of activation functions: five continuously differentiable, flexible main functions that fuse ReLU and Sigmoid. Zorro functions are smooth and adaptable and act as information gates, matching ReLU on the 0-1 range; they offer an alternative to ReLU that needs no normalization and avoids dying neurons and exploding gradients. Zorro also approximates functions such as Swish, GELU, and DGELU, providing parameters that can be adjusted to different datasets and architectures. We tested it on fully connected, convolutional, and transformer architectures to demonstrate its effectiveness.
DOI: 10.48550/arxiv.2409.19239
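
This record does not include Zorro's defining formulas, but the summary characterizes the family as a smooth information gate that matches ReLU on the 0-1 range and stays bounded outside it. Below is a minimal Python sketch of one function with those properties; the specific form and the sharpness parameter `beta` are illustrative assumptions for this record, not the paper's actual definitions.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def zorro_like(x, beta=10.0):
    """Illustrative smooth gate, NOT the paper's exact Zorro formula.

    Approximately 0 for x < 0, the identity (hence ReLU) on [0, 1], and
    1 for x > 1, with sigmoid blends keeping it differentiable everywhere.
    `beta` (an assumed parameter) controls how sharply the gate saturates.
    """
    # Identity branch, faded in above 0 and faded out above 1 ...
    core = x * sigmoid(beta * x) * sigmoid(beta * (1.0 - x))
    # ... plus an upper plateau that rises toward 1 past x = 1.
    upper = sigmoid(beta * (x - 1.0))
    return core + upper

x = np.linspace(-2.0, 3.0, 11)
print(np.round(zorro_like(x), 3))  # ~0 below 0, ~x on [0, 1], ~1 above 1
```

Because the sigmoids blend the three regimes smoothly, such a gate has no kink at zero and a bounded output, which is consistent with the summary's claims of avoiding ReLU's non-differentiable point and exploding gradients.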