Pedestrian lane detection in unstructured scenes for assistive navigation

Bibliographic Details
Published in: Computer Vision and Image Understanding, 2016-08, Vol. 149, pp. 186-196
Main authors: Phung, Son Lam; Le, Manh Cuong; Bouzerdoum, Abdesselam
Format: Article
Language: English
Online access: Full text
Abstract:

• We propose an algorithm for pedestrian lane detection in unstructured scenes.
• The vanishing point is found via the color tensor and the local orientations of edge pixels.
• A sample lane region is then located via color and geometric features of the lane.
• A lane model adaptive to the input image is constructed for lane segmentation.
• We create a dataset for vanishing point estimation and pedestrian lane detection.

Automatic detection of the pedestrian lane in a scene is an important task in assistive and autonomous navigation. This paper presents a vision-based algorithm for pedestrian lane detection in unstructured scenes, where lanes vary significantly in color, texture, and shape and are not indicated by any painted markers. In the proposed method, a lane appearance model is constructed adaptively from a sample image region, which is identified automatically from the image vanishing point. The paper also introduces a fast and robust vanishing point estimation method based on the color tensor and the dominant orientations of color edge pixels. The proposed pedestrian lane detection method is evaluated on a new benchmark dataset that contains images from various indoor and outdoor scenes with different types of unmarked lanes. Experimental results demonstrate its efficiency and robustness in comparison with several existing methods.
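The abstract names two concrete technical steps, which can be illustrated with short sketches. First, vanishing point estimation from the color tensor and dominant edge orientations. The Python/OpenCV sketch below is a minimal illustration of that general technique, assuming a Di Zenzo color structure tensor and a simple ray-voting scheme; the function names, the `edge_frac` threshold, and the voting details are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import cv2

def estimate_vanishing_point(img, sigma=1.5, edge_frac=0.02):
    """Vote for the vanishing point along dominant color-edge orientations."""
    f = img.astype(np.float32)
    gx = cv2.Sobel(f, cv2.CV_32F, 1, 0, ksize=3)   # per-channel x-gradients
    gy = cv2.Sobel(f, cv2.CV_32F, 0, 1, ksize=3)   # per-channel y-gradients
    # Di Zenzo color structure tensor: sum gradient products over channels,
    # then smooth each entry.
    Exx = cv2.GaussianBlur((gx * gx).sum(axis=2), (0, 0), sigma)
    Eyy = cv2.GaussianBlur((gy * gy).sum(axis=2), (0, 0), sigma)
    Exy = cv2.GaussianBlur((gx * gy).sum(axis=2), (0, 0), sigma)
    strength = Exx + Eyy                            # color edge energy
    grad = 0.5 * np.arctan2(2 * Exy, Exx - Eyy)     # dominant gradient angle
    edge = grad + np.pi / 2                         # edge direction is perpendicular
    h, w = strength.shape
    n = int(edge_frac * h * w)
    # Keep the strongest color-edge pixels as voters.
    idx = np.argpartition(strength, -n, axis=None)[-n:]
    ys, xs = np.unravel_index(idx, strength.shape)
    acc = np.zeros((h, w), np.float32)
    for x0, y0, t in zip(xs, ys, edge[ys, xs]):
        dx, dy = np.cos(t), np.sin(t)
        if dy > 0:          # flip so the ray points up, toward the horizon
            dx, dy = -dx, -dy
        for s in range(1, max(h, w)):
            x, y = int(round(x0 + s * dx)), int(round(y0 + s * dy))
            if not (0 <= x < w and 0 <= y < h):
                break
            acc[y, x] += 1  # each edge pixel votes along its orientation ray
    vy, vx = np.unravel_index(int(acc.argmax()), acc.shape)
    return int(vx), int(vy)
```

Second, the adaptive lane appearance model: a sample region is located from the estimated vanishing point and used to build a color model for lane segmentation. The sketch below assumes a Gaussian model in CIELAB tested with a Mahalanobis distance; the sampling geometry (`half_width`), color space, and threshold `tau` are assumptions that stand in for the paper's combination of color and geometric features.

```python
import numpy as np
import cv2

def segment_lane(img, vp, half_width=40, tau=3.0):
    """Classify pixels against a Gaussian color model fit below the VP."""
    h, w = img.shape[:2]
    vx, vy = vp
    # Illustrative sample region: a strip beneath the vanishing point,
    # in the lower half of the image where the lane is widest.
    y0 = max(0, (vy + h) // 2)
    x0, x1 = max(0, vx - half_width), min(w, vx + half_width)
    lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB).astype(np.float32)
    sample = lab[y0:h, x0:x1].reshape(-1, 3)
    mu = sample.mean(axis=0)
    cov = np.cov(sample, rowvar=False) + 1e-3 * np.eye(3)  # regularized
    cov_inv = np.linalg.inv(cov)
    diff = lab.reshape(-1, 3) - mu
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # squared Mahalanobis
    return (d2 < tau ** 2).reshape(h, w)                # boolean lane mask
```

Chaining the two sketches, a lane mask for a BGR image loaded with cv2.imread could be obtained as `mask = segment_lane(img, estimate_vanishing_point(img))`.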
ISSN: 1077-3142, 1090-235X
DOI: 10.1016/j.cviu.2016.01.011