CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning From Sporadic Temporal Data


Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-01, Vol. 35 (1), p. 792-802
Main Authors: Ghazi, Mostafa Mehdipour; Sorensen, Lauge; Ourselin, Sebastien; Nielsen, Mads
Format: Article
Language: English
Description

Summary: Learning temporal patterns from multivariate longitudinal data is challenging, especially when the data are sporadic, as often seen in healthcare applications, where data can suffer from irregularity and asynchronicity: the time between consecutive data points can vary across features and samples, hindering the application of existing deep learning models that are constructed for complete, evenly spaced data with fixed sequence lengths. In this article, a novel deep learning-based model is developed for modeling multiple temporal features in sporadic data using an integrated architecture based on a recurrent neural network (RNN) unit and a continuous-time autoregressive (CAR) model. The proposed model, called CARRNN, uses a generalized discrete-time autoregressive (AR) model, trainable end-to-end using neural networks modulated by time lags, to describe the changes caused by irregularity and asynchronicity. It is applied to time-series regression and classification tasks for Alzheimer's disease progression modeling, intensive care unit (ICU) mortality rate prediction, human activity recognition, and event-based digit recognition, where the proposed model based on a gated recurrent unit (GRU) achieves significantly better predictive performance in all cases than state-of-the-art methods using RNNs, GRUs, and long short-term memory (LSTM) networks.
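The abstract describes a recurrent cell whose state propagation is modulated by the time lag between consecutive observations. The paper's exact equations are not reproduced in this record, so the following is only a minimal illustrative sketch of the general idea: the carried hidden state is first adjusted by a linear, lag-scaled autoregressive term before a standard GRU update. All class, parameter, and variable names here are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SketchLagGRU:
    """Illustrative GRU-style cell for irregularly sampled inputs.
    The hidden-state carryover is modulated by the time lag dt via a
    linear AR-like term (hypothetical formulation, not the paper's)."""

    def __init__(self, n_in, n_hid):
        s = 0.1
        self.Wz = rng.normal(0, s, (n_hid, n_in + n_hid))  # update gate
        self.Wr = rng.normal(0, s, (n_hid, n_in + n_hid))  # reset gate
        self.Wh = rng.normal(0, s, (n_hid, n_in + n_hid))  # candidate state
        # lag modulation: h_prev -> h_prev + dt * (A @ h_prev + b)
        self.A = rng.normal(0, s, (n_hid, n_hid))
        self.b = np.zeros(n_hid)

    def step(self, x, h_prev, dt):
        # adjust the carried state for the elapsed (irregular) time lag
        h_adj = h_prev + dt * (self.A @ h_prev + self.b)
        xh = np.concatenate([x, h_adj])
        z = sigmoid(self.Wz @ xh)
        r = sigmoid(self.Wr @ xh)
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h_adj]))
        return (1 - z) * h_adj + z * h_tilde

cell = SketchLagGRU(n_in=3, n_hid=4)
h = np.zeros(4)
# observations arriving with varying time lags between them
for x, dt in [(rng.normal(size=3), 0.5), (rng.normal(size=3), 2.0)]:
    h = cell.step(x, h, dt)
print(h.shape)
```

In a training setting, `A` and `b` would be learned jointly with the gate weights, letting the model account for the irregular spacing end-to-end rather than assuming a fixed sampling interval.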
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3177366