ADHD identification and its interpretation of functional connectivity using deep self-attention factorization
Published in: Knowledge-Based Systems, 2022-08, Vol. 250, p. 109082, Article 109082
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Attention deficit hyperactivity disorder (ADHD) is a common behavioural disorder in children. Its pathogenesis is not yet completely understood, and diagnosing ADHD still requires domain experts to manually interpret a sufficient amount of monitoring data. Meanwhile, the analysis of ADHD generally depends on subjective methods, such as questionnaire surveys and behavioural observations, so early diagnosis is even more challenging than later-stage diagnosis. To address this, this study proposes a new model, called Deep Channel Self-Attention Factorization (Deep CSAF), to learn the "important" factor matrices that underlie naturally non-linear functional Magnetic Resonance Imaging (fMRI) data. Unlike traditional tensor decomposition methods, Deep CSAF extracts the non-linear factor matrices of interest via N correlated self-attention convolutional neural networks whose design is inspired by the autoencoder. Experimental results on the publicly available ADHD-200 Consortium dataset show that Deep CSAF can learn structural and non-linear factor matrices without requiring extensive a priori knowledge of the problem domain, while retaining maximal information to support the interpretation of functional connectivity. Moreover, the approach outperforms its state-of-the-art counterparts, achieving superior classification performance.
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2022.109082
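The abstract contrasts Deep CSAF with traditional (linear) factorization of connectivity data. As a point of reference, the sketch below shows that linear baseline only: recovering low-rank factor matrices from a toy symmetric "connectivity" matrix by alternating least squares. The matrix size, rank, and data are invented for illustration; the actual Deep CSAF replaces this linear step with self-attention convolutional networks, which this abstract does not specify in enough detail to reproduce.

```python
import numpy as np

# Toy stand-in for a region-by-region functional-connectivity matrix with an
# exact low-rank structure (sizes and data are hypothetical, not from the paper).
rng = np.random.default_rng(0)
n_regions, rank = 20, 4
true_factor = rng.standard_normal((n_regions, rank))
X = true_factor @ true_factor.T  # symmetric, rank-4 by construction

# Traditional linear factorization X ~= W @ H via alternating least squares:
# fix one factor matrix, solve for the other in closed form, and alternate.
W = rng.standard_normal((n_regions, rank))
H = rng.standard_normal((rank, n_regions))
for _ in range(50):
    # Fix H, solve min_W ||X - W H||_F (least squares on the transposed system).
    W = np.linalg.lstsq(H.T, X.T, rcond=None)[0].T
    # Fix W, solve min_H ||X - W H||_F.
    H = np.linalg.lstsq(W, X, rcond=None)[0]

rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.2e}")  # near machine precision
```

Because the toy matrix is exactly low rank, the alternating updates recover it almost perfectly; on real fMRI correlation matrices the residual would be larger, which is the gap the paper's non-linear factorization is meant to close.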