RGB Images from Wearable Cameras: Applications for Autonomous Lower-Limb Biomechatronic Device Control
Saved in:

Format: | Dataset |
Language: | English |
Online access: | Order full text |
Summary: | This RGB image dataset was experimentally collected using a wearable camera system and postprocessed for image classification of different walking environments, supporting the control of autonomous lower-limb biomechatronic devices (e.g., robotic lower-limb prostheses and exoskeletons). In total, 34,254 sampled images were collected and individually labelled. The authors request that prospective users cite the following publication: Laschowski B, McNally W, Wong A, and McPhee J. (2019). Preliminary Design of an Environment Recognition System for Controlling Robotic Lower-Limb Prostheses and Exoskeletons. IEEE International Conference on Rehabilitation Robotics. Toronto, Canada. |
DOI: | 10.21227/apc2-s596 |