High-Speed Embedded-Object Analysis Using a Dual-Line Timed-Address-Event Temporal-Contrast Vision Sensor
Published in: IEEE Transactions on Industrial Electronics (1982), 2011-03, Vol. 58 (3), p. 770-783
Main authors: , , ,
Format: Article
Language: English
Abstract: This paper presents a neuromorphic dual-line vision sensor and signal-processing concepts for object recognition and classification. The system performs ultrahigh-speed machine vision with a compact and low-cost embedded-processing architecture. The main innovations of this paper include efficient edge extraction of moving objects by the vision sensor at the pixel level and a novel concept for real-time embedded vision processing based on address-event data. The proposed system exploits the very high temporal resolution and the sparse visual-information representation of the event-based vision sensor. The 2 × 256 pixel dual-line temporal-contrast vision sensor asynchronously responds to relative illumination-intensity changes and consequently extracts contours of moving objects. This paper shows data-volume independence from object velocity and evaluates the data quality for object velocities of up to 40 m/s (equivalent to up to 6.25 m/s on the sensor's focal plane). Subsequently, an embedded-processing concept is presented for real-time extraction of object contours and for object recognition. Finally, the influence of object velocity on high-performance embedded computer vision is discussed.
ISSN: 0278-0046, 1557-9948
DOI: 10.1109/TIE.2010.2095390
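To make the abstract's notion of a timed address-event stream from a temporal-contrast sensor more concrete, the following is a minimal sketch in C. The event fields, the contrast threshold of 20%, and the reference-update rule are illustrative assumptions; they are not taken from the paper or the sensor's actual interface.

```c
/*
 * Sketch of a timed address-event (AE) stream as might be produced by a
 * temporal-contrast dual-line sensor. All names and parameters here are
 * illustrative assumptions, not the sensor's actual data format.
 */
#include <stdint.h>
#include <stdio.h>
#include <math.h>

typedef struct {
    uint32_t timestamp_us; /* event time stamp (microseconds)             */
    uint16_t pixel;        /* pixel address along the 256-pixel line      */
    uint8_t  line;         /* which of the two sensor lines fired (0/1)   */
    int8_t   polarity;     /* +1: intensity increase, -1: decrease        */
} timed_address_event;

/* Emit an event when the relative (logarithmic) intensity change since the
 * last event of this pixel exceeds a contrast threshold. */
static int temporal_contrast(double i_now, double i_ref, double threshold,
                             int *polarity)
{
    double d = log(i_now) - log(i_ref);      /* relative change */
    if (fabs(d) < threshold)
        return 0;                            /* below threshold: no event */
    *polarity = (d > 0.0) ? +1 : -1;
    return 1;
}

int main(void)
{
    double i_ref = 100.0;                    /* intensity at the last event */
    double samples[] = { 101.0, 125.0, 80.0, 79.0 };
    uint32_t t = 0;

    for (unsigned k = 0; k < sizeof samples / sizeof samples[0]; ++k, t += 10) {
        int pol;
        if (temporal_contrast(samples[k], i_ref, 0.2, &pol)) {
            timed_address_event ev = { t, 128, 0, (int8_t)pol };
            printf("t=%uus line=%u pixel=%u pol=%+d\n",
                   ev.timestamp_us, ev.line, ev.pixel, ev.polarity);
            i_ref = samples[k];              /* reset reference after event */
        }
    }
    return 0;
}
```

The sketch illustrates why the data volume is largely independent of object velocity: only pixels whose relative intensity changes beyond the threshold emit events, so a fast-moving edge produces roughly the same number of events as a slow one, just spaced more closely in time.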