Mobility and low contrast trip hazard avoidance using augmented depth
| Published in: | Journal of Neural Engineering, 2015-02, Vol. 12 (1), p. 016003 |
|---|---|
| Main authors: | , , , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
Abstract:

Objective. We evaluated a novel visual representation for current and near-term prosthetic vision. Augmented depth emphasizes ground obstacles and floor-wall boundaries in a depth-based visual representation. This is achieved by artificially increasing contrast between obstacles and the ground surface via a novel ground plane extraction algorithm specifically designed to preserve low-contrast ground-surface boundaries. Approach. The effectiveness of augmented depth was examined in human mobility trials, compared against standard intensity-based (Intensity), depth-based (Depth) and random (Random) visual representations. Eight participants with normal vision used simulated prosthetic vision with 20 phosphenes and eight perceivable brightness levels to traverse a course with randomly placed small, low-contrast obstacles on the ground. Main results. The number of collisions was significantly reduced with augmented depth compared with the intensity, depth and random representations (48%, 44% and 72% fewer collisions, respectively). Significance. These results indicate that augmented depth may enable safe mobility in the presence of low-contrast obstacles with current and near-term implants. This is the first demonstration that augmenting the scene to ensure key objects are visible may provide better outcomes for prosthetic vision.
ISSN: 1741-2560, 1741-2552
DOI: 10.1088/1741-2560/12/1/016003
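
The record does not include the paper's implementation, so the sketch below is only a rough illustration of the idea described in the abstract: fit a dominant ground plane to a depth image with a generic RANSAC procedure, force pixels that rise above that plane to maximum brightness (the augmented contrast between obstacles and ground), and downsample to a 20-phosphene, 8-level display as used in the trials. The pinhole camera parameters, RANSAC thresholds, obstacle height threshold and 4x5 phosphene layout are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch only: a generic RANSAC plane fit stands in for the
# paper's (unpublished here) ground plane extraction algorithm. The 20
# phosphenes and 8 brightness levels follow the trial setup in the abstract;
# all other parameters are assumed.
import numpy as np


def depth_to_points(depth, fx=525.0, fy=525.0, cx=None, cy=None):
    """Back-project a depth image (metres) to 3D points, assuming a pinhole camera."""
    h, w = depth.shape
    cx = w / 2.0 if cx is None else cx
    cy = h / 2.0 if cy is None else cy
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)


def ransac_ground_plane(points, iters=200, thresh=0.03, seed=None):
    """Fit a plane (n, d) with n.p + d = 0 to the dominant (ground) surface."""
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = 0, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate sample, skip
        n = n / norm
        d = -n.dot(sample[0])
        inliers = np.sum(np.abs(points @ n + d) < thresh)
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, (n, d)
    return best_plane


def augmented_depth(depth, obstacle_thresh=0.05, grid=(4, 5), levels=8):
    """Render a coarse phosphene image that boosts contrast of non-ground pixels."""
    pts = depth_to_points(depth)
    n, d = ransac_ground_plane(pts)
    height = np.abs(pts @ n + d).reshape(depth.shape)  # distance from the fitted plane
    # Base brightness falls off with depth; pixels above the plane are forced
    # to maximum brightness so small, low-contrast obstacles stand out.
    brightness = np.clip(1.0 - depth / depth.max(), 0.0, 1.0)
    brightness[height > obstacle_thresh] = 1.0
    # Downsample to the phosphene grid and quantise to the available levels.
    gh, gw = grid
    h, w = depth.shape
    crop = brightness[: h - h % gh, : w - w % gw]
    phosphenes = crop.reshape(gh, crop.shape[0] // gh, gw, crop.shape[1] // gw).mean(axis=(1, 3))
    return np.round(phosphenes * (levels - 1)) / (levels - 1)
```

Running `augmented_depth` on a depth frame yields a 4x5 array of quantised brightness values; in a simulation these would drive the phosphene display, whereas the Intensity and Depth conditions in the paper map camera intensity or raw depth to brightness without the ground-plane contrast boost.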