Obstacle detection in a greenhouse environment using the Kinect sensor
Saved in:
Published in: Computers and electronics in agriculture, 2015-04, Vol. 113, p. 104-115
Main authors: , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Summary:
• Obstacle detection in a greenhouse using the Kinect 3D sensor.
• Depth is processed into slope information.
• Obstacle decision is based on color, texture and slope.
• The system is real-time and provides good results.

In many agricultural robotic applications, the robotic vehicle must detect obstacles in its way in order to navigate correctly. This is also true for robotic spray vehicles that autonomously explore greenhouses. In this study, we present an approach to obstacle detection in a greenhouse environment using the Kinect 3D sensor, which provides synchronized color and depth information. First, the depth data are processed by applying slope computation. This creates an obstacle map that labels pixels exceeding a predefined slope as suspected obstacles and the remaining pixels as surface. The system then uses both color and texture features to classify the suspected obstacle pixels. The obstacle detection decision is made using information on the pixel's slope, its intensity and its neighboring pixels. The obstacle detection of the proposed sensor and algorithm is demonstrated on data recorded by the Kinect in a greenhouse. We show that the system produces satisfactory results (all obstacles were detected with only a few false positive detections) and is fast enough to run on a limited computer.
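The abstract only outlines the slope step, so the following Python sketch illustrates the general idea of turning a Kinect-style depth frame into an obstacle map by thresholding the local slope. It is not the authors' implementation: the pixel spacing, the 45° threshold and the gradient-based slope estimate are illustrative assumptions, not values or methods taken from the paper.

```python
import numpy as np

def obstacle_map_from_depth(depth_m, pixel_spacing_m=0.005, slope_thresh_deg=45.0):
    """Label pixels whose local slope exceeds a threshold as suspected obstacles.

    depth_m          -- 2D array of depth values in meters (e.g. one Kinect frame)
    pixel_spacing_m  -- assumed metric spacing between neighboring pixels
    slope_thresh_deg -- assumed slope threshold separating surface from obstacle
    """
    # Depth gradients along image rows and columns.
    dz_dy, dz_dx = np.gradient(depth_m, pixel_spacing_m)
    # Slope angle per pixel: arctangent of the gradient magnitude.
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Obstacle map: True = suspected obstacle, False = surface.
    return slope_deg > slope_thresh_deg

# Example: a flat surface 2 m from the sensor with a box-shaped object on it.
depth = np.full((480, 640), 2.0)
depth[200:280, 300:380] -= 0.5
print(obstacle_map_from_depth(depth).sum(), "suspected obstacle pixels")
```

In the paper, the pixels flagged by this kind of slope test are only "suspected" obstacles; the final decision additionally uses color and texture features of the pixel and its neighborhood, which this sketch does not cover.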
ISSN: 0168-1699, 1872-7107
DOI: 10.1016/j.compag.2015.02.001