Pauses for Detection of Alzheimer’s Disease
Published in: | Frontiers in computer science (Lausanne) 2021-01, Vol.2 |
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Pauses, disfluencies, and language problems in Alzheimer’s disease can be naturally modeled by fine-tuning Transformer-based pre-trained language models such as BERT and ERNIE. Using this method with pause-encoded transcripts, we achieved 89.6% accuracy on the test set of the ADReSS (Alzheimer’s Dementia Recognition through Spontaneous Speech) Challenge. The best accuracy was obtained with ERNIE plus an encoding of pauses. Robustness is a challenge for large models and small training sets; ensembling over many runs of BERT/ERNIE fine-tuning reduced variance and improved accuracy. We also found that "um" was used much less frequently in Alzheimer’s speech than "uh", and we discuss this finding from linguistic and cognitive perspectives. |
ISSN: | 2624-9898 |
DOI: | 10.3389/fcomp.2020.624488 |