Human electrocortical dynamics while stepping over obstacles

Detailed Description

Bibliographic Details
Published in: Scientific Reports, 2019-03, Vol. 9 (1), p. 4693, Article 4693
Main Authors: Nordin, Andrew D., Hairston, W. David, Ferris, Daniel P.
Format: Article
Language: English
Online Access: Full text
Description
Abstract: To better understand human brain dynamics during visually guided locomotion, we developed a method of removing motion artifacts from mobile electroencephalography (EEG) and studied human subjects walking and running over obstacles on a treadmill. We constructed a novel dual-layer EEG electrode system to isolate electrocortical signals, and then validated the system using an electrical head phantom and robotic motion platform. We collected data from young healthy subjects walking and running on a treadmill while they encountered unexpected obstacles to step over. Supplementary motor area and premotor cortex had spectral power increases within ~200 ms after object appearance in delta, theta, and alpha frequency bands (3–13 Hz). That activity was followed by a similar posterior parietal cortex spectral power increase that decreased in lag time with increasing locomotion speed. The sequence of activation suggests that supplementary motor area and premotor cortex interrupted the gait cycle, while posterior parietal cortex tracked obstacle location for planning foot placement nearly two steps ahead of reaching the obstacle. Together, these results highlight advantages of adopting dual-layer mobile EEG, which should greatly facilitate the study of human brain dynamics in physically active real-world settings and tasks.
ISSN: 2045-2322
DOI: 10.1038/s41598-019-41131-2