A projective chirp based stair representation and detection from monocular images and its application for the visually impaired


Full description

Bibliographic details
Published in: Pattern Recognition Letters 2020-09, Vol. 137, p. 17-26
Main authors: Vu, Hai; Hoang, Van-Nam; Le, Thi-Lan; Tran, Thanh-Hai; Nguyen, Thi Thuy
Format: Article
Language: English
Online access: Full text
Description
Abstract:
• Introducing a projective chirp model for staircase structure representation.
• Proposing an iterative algorithm for staircase detection from monocular images.
• Validating the performance of the proposed method on benchmark and artificial datasets.
• Deploying a staircase alarm system supporting blind people in indoor environments.

The most prominent characteristic of a stair is its rigid form with a periodic pattern of steps. In this work, we exploit this periodic characteristic through geometric rules. Since a stair consists of equidistant nosing lines, under the camera's perspective projection these lines map onto the image following a projective chirplet transform. We propose to detect a stair by finding the group of lines that best satisfies a projective chirp model. The main advantage of the proposed technique is that missing nosing lines, and thus the whole stair, can be recovered. We validate the proposed method on both artificial and real datasets. The experimental results show high detection rates across the different datasets. Finally, a real application alarming the visually impaired about stairs in indoor environments was deployed and achieved 88.37% accuracy. The implementations and datasets are made publicly available.
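The geometric claim in the abstract (equidistant nosing lines projecting onto the image as a chirp-like pattern) can be illustrated with a minimal sketch of a pinhole camera model. All parameters below (focal length, step rise/run, camera placement) are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Minimal sketch: equidistant stair nosing lines, seen through a pinhole
# camera, project to image rows whose spacing shrinks monotonically,
# i.e. a chirp-like progression. Parameters are assumed, not from the paper.

f = 800.0           # focal length in pixels (assumed)
run = 0.30          # horizontal depth of each step, metres (assumed)
rise = 0.15         # height of each step, metres (assumed)
z0, y0 = 2.0, -1.2  # depth and height of the first nosing line
                    # relative to the camera centre (assumed)

# 3D positions of the first 8 nosing lines: equidistant in space
k = np.arange(8)
z = z0 + k * run
y = y0 + k * rise

# Pinhole projection: image row of each nosing line
v = f * y / z

# Spacing between consecutive projected lines decreases with k:
# equidistant 3D lines do NOT stay equidistant in the image.
spacing = np.diff(v)
print(np.round(spacing, 2))
```

The paper's detection idea inverts this observation: a group of image lines whose spacings follow such a projective chirp model is strong evidence of a staircase, which also allows interpolating nosing lines missed by the line detector.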
ISSN: 0167-8655
eISSN: 1872-7344
DOI: 10.1016/j.patrec.2019.03.007