Collection of multimodal data in real-world driving
Published in: The Journal of the Acoustical Society of America, 2006-11, Vol. 120 (5_Supplement), p. 3044
Main authors: , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Our research group has recently developed a new data collection vehicle equipped with various sensors for the synchronous recording of multimodal data including speech, video, driving behavior, and physiological signals. Driver speech is recorded with 12 microphones distributed throughout the vehicle. Face images and a view of the road ahead are captured with three CCD cameras. Driving behavior signals including gas and brake pedal pressures, steering angles, vehicle velocities, and following distances are recorded. Physiological sensors are mounted to measure the drivers' heart rate, skin conductance, and emotion-based sweating on the palm of the hand and sole of the foot. The multimodal data are collected while driving on city roads and expressways during four different tasks: reading random four-character alphanumeric strings, reading words on billboards and signs seen while driving, interacting with a spoken dialogue system to retrieve and play music, and talking on a cell phone with a human navigator using a hands-free device. Data collection is currently underway. The multimodal database will be published in the future for various research purposes such as noise-robust speech recognition in car environments, detection of driver stress while driving, and the prediction of driving behaviors for improving intelligent transportation systems.
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/1.4787230