Sleep Classification With Artificial Synthetic Imaging Data Using Convolutional Neural Networks

Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, 2023-01, Vol. 27 (1), pp. 421-432
Authors: Shi, Lan; Wank, Marianthie; Chen, Yan; Wang, Yibo; Liu, Yachuan; Hector, Emily C.; Song, Peter X.K.
Format: Article
Language: English
Abstract: Objective: We propose a new analytic framework, "Artificial Synthetic Imaging Data (ASID) Workflow," for sleep classification from a wearable device comprising: 1) the creation of ASID from data collected by a non-invasive wearable device that permits real-time multi-modal physiological monitoring of heart rate (HR), 3-axis accelerometer, electrodermal activity, and skin temperature, denoted as "Temporal E4 Data" (TED), and 2) the use of an image classification supervised learning algorithm, a convolutional neural network (CNN), to classify periods of sleep. Methods: We investigate the ASID Workflow under 6 settings (3 data resolutions × 2 HR scenarios). Competing machine/deep learning classification algorithms, including logistic regression, support vector machine, random forest, k-nearest neighbors, and Long Short-Term Memory, are applied to TED as comparisons, termed the "Competing Workflow." Results: The ASID Workflow achieves excellent performance, with a mean weighted accuracy across settings of 94.7%, and is superior to the Competing Workflow with high- and low-resolution data regardless of the inclusion of the HR modality. This superiority is greatest for low-resolution data without HR. Additionally, the CNN has a relatively low subject-wise test computational cost compared with the competing algorithms. Conclusion: We demonstrate the utility of creating ASID from multi-modal physiological data and applying a preexisting image classification algorithm to achieve better classification accuracy. We shed light on the influence of data resolution and the HR modality on the Workflow's performance. Significance: Applying a CNN to ASID allows us to capture both temporal and spatial dependency among physiological variables and modalities through the topological structure of 2D images, which the competing algorithms fail to utilize.
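
As a rough illustration of the idea described in the abstract (stacking multi-modal wearable signals into a 2D image and handing it to an off-the-shelf image classifier), a minimal sketch in PyTorch follows. The window length, min-max normalization, modality ordering, and network layout are illustrative assumptions, not the authors' exact ASID construction or CNN architecture.

```python
# Hypothetical sketch: turn windows of multi-modal wearable signals into 2D
# "images" and classify sleep vs. wake with a small CNN. All sizes and the
# network layout below are illustrative assumptions.
import numpy as np
import torch
import torch.nn as nn

def window_to_image(window: np.ndarray) -> np.ndarray:
    """Min-max normalize each modality (row) of a (modalities x time) window
    so the result can be treated as a single-channel 2D image."""
    lo = window.min(axis=1, keepdims=True)
    hi = window.max(axis=1, keepdims=True)
    return (window - lo) / (hi - lo + 1e-8)

class SleepCNN(nn.Module):
    """Small 2D CNN over (modalities x time) images; outputs sleep/wake logits."""
    def __init__(self, n_modalities: int = 6, window_len: int = 60):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((1, 2)),                      # pool along time only
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((n_modalities, 4)),   # fixed-size feature map
        )
        self.classifier = nn.Linear(32 * n_modalities * 4, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x)
        return self.classifier(z.flatten(start_dim=1))

# Toy usage: 6 modalities (HR, 3-axis acceleration, EDA, skin temperature),
# a 60-sample window, and a batch of 8 randomly generated windows.
windows = np.random.rand(8, 6, 60).astype(np.float32)
images = np.stack([window_to_image(w) for w in windows])  # (8, 6, 60)
batch = torch.from_numpy(images).unsqueeze(1)             # (8, 1, 6, 60)
logits = SleepCNN()(batch)                                 # (8, 2)
print(logits.shape)
```

Arranging modalities as rows and time as columns lets the 2D convolutions see both the temporal dependency within each signal and the cross-modality dependency between signals, which is the property the abstract credits for the CNN's advantage over the one-dimensional competitors.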
ISSN: 2168-2194, 2168-2208
DOI: 10.1109/JBHI.2022.3210485