Fractional Tensor Recurrent Unit (fTRU): A Stable Forecasting Model With Long Memory

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2023-12, Vol. PP, p. 1-10
Authors: Qiu, Hejia; Li, Chao; Weng, Ying; Sun, Zhun; Zhao, Qibin
Format: Article
Language: English
Description
Abstract: The tensor recurrent model is a family of nonlinear dynamical systems whose recurrence relation consists of a p-fold (called degree-p) tensor product. Although such models appear frequently in advanced recurrent neural networks (RNNs), to date there have been few studies of their long memory properties and stability in sequence tasks. In this article, we propose a fractional tensor recurrent model, in which the tensor degree p is extended from the discrete domain to the continuous domain, so that it becomes effectively learnable from various datasets. Theoretically, we prove that a large degree p is essential to achieve the long memory effect in a tensor recurrent model, yet it can lead to unstable dynamical behavior. Our new model, named the fractional tensor recurrent unit (fTRU), is therefore expected to seek the saddle point between the long memory property and model stability during training. We show experimentally that the proposed model achieves competitive performance, with long memory and stable behavior, in several forecasting tasks compared with various advanced RNNs.
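
Note: this record gives only the high-level description above; the exact fTRU recurrence is not included here. As a rough, hypothetical sketch of the core idea (a recurrent update whose degree p is a continuous, trainable parameter, here realized as a signed element-wise power), one might write something like the following in PyTorch. The class name, the p_init parameter, and the specific update rule are illustrative assumptions, not the authors' formulation.

    import torch
    import torch.nn as nn

    class FractionalDegreeCell(nn.Module):
        # Hypothetical illustration only: a recurrent cell whose hidden-state
        # update applies a continuous, learnable "degree" p via a signed
        # element-wise power. This is NOT the paper's exact fTRU recurrence
        # (not given in this record); it only sketches the idea of making
        # the tensor degree a trainable, fractional quantity.
        def __init__(self, input_size: int, hidden_size: int, p_init: float = 1.5):
            super().__init__()
            self.W_x = nn.Linear(input_size, hidden_size)
            self.W_h = nn.Linear(hidden_size, hidden_size, bias=False)
            # Parameterize p through its logarithm so it stays positive
            # while being freely optimized by gradient descent.
            self.log_p = nn.Parameter(torch.tensor(float(p_init)).log())

        def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
            p = self.log_p.exp()
            z = self.W_h(h_prev)
            # Signed fractional power: |z|^p * sign(z). For integer p this
            # mimics a degree-p polynomial interaction; fractional p
            # interpolates between degrees. Clamping avoids the gradient
            # blow-up of |z|^p near z = 0 when p < 1.
            z_pow = z.abs().clamp_min(1e-6).pow(p) * z.sign()
            return torch.tanh(self.W_x(x_t) + z_pow)

    # Minimal usage: one forward step on random data.
    cell = FractionalDegreeCell(input_size=4, hidden_size=8)
    h = torch.zeros(1, 8)
    x = torch.randn(1, 4)
    h = cell(x, h)

Under this reading, the log-parameterization keeps p positive while letting the optimizer move it continuously, consistent with the abstract's claim that training can trade off long memory (large p) against stability.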
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2023.3338696