Under canopy light detection and ranging-based autonomous navigation

Bibliographic details
Published in: Journal of Field Robotics, 2018-12, Vol. 36 (3)
Main authors: Higuti, Vitor A. H.; Velasquez, Andres E. B.; Magalhaes, Daniel Varela; Becker, Marcelo; Chowdhary, Girish
Format: Article
Language: English

Description
Abstract: This study describes a light detection and ranging (LiDAR)-based autonomous navigation system for an ultralightweight ground robot in agricultural fields. The system is designed for reliable navigation under cluttered canopies using only a 2D Hokuyo UTM-30LX LiDAR sensor as the single source for perception. Its purpose is to ensure that the robot can navigate through rows of crops without damaging the plants in narrow row-based and high-leaf-cover semistructured crop plantations, such as corn (Zea mays) and sorghum (Sorghum bicolor). The key contribution of our work is a LiDAR-based navigation algorithm capable of rejecting outlying measurements in the point cloud due to plants in adjacent rows, low-hanging leaf cover, or weeds. The algorithm addresses this challenge using a set of heuristics designed to filter out outlying measurements in a computationally efficient manner, and linear least squares is applied to estimate the within-row distance from the filtered data. Moreover, a crucial step is estimate validation, which is achieved through a heuristic that grades and validates the fitted row lines based on current and previous information. The proposed LiDAR-based perception subsystem has been extensively tested in production/breeding corn and sorghum fields. Across this variety of highly cluttered real field environments, the robot logged more than 6 km of autonomous runs in straight rows. These results demonstrate highly promising advances in LiDAR-based navigation in realistic field environments for small under-canopy robots.
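The abstract describes the pipeline only at a high level; the sketch below illustrates the core idea it names (heuristic gating of the point cloud, a linear least-squares row fit, and a plausibility check against the previous estimate). It is not the authors' implementation: the coordinate frame, the function name estimate_row_distance, and every threshold value are assumptions chosen for illustration.

```python
# Illustrative sketch only, not the paper's code. Frame assumption:
# x points forward, y points lateral-left; angle 0 is straight ahead.
import numpy as np

def estimate_row_distance(angles, ranges, expected_lateral, prev_dist=None,
                          gate=0.25, max_range=2.0, max_slope=0.35,
                          min_points=10, max_jump=0.15):
    """Fit one crop-row line to a 2D LiDAR scan and return the lateral
    distance to it, or None if the fit fails the validation heuristics.

    angles, ranges   : raw scan in radians and meters
    expected_lateral : prior lateral offset of the row (assumed value)
    prev_dist        : last validated distance estimate, if any
    """
    x = ranges * np.cos(angles)   # forward coordinate [m]
    y = ranges * np.sin(angles)   # lateral coordinate [m]

    # Heuristic outlier rejection: keep only returns near the expected
    # row, discarding adjacent rows, low-hanging leaves, and far clutter.
    keep = (ranges < max_range) & (np.abs(y - expected_lateral) < gate)
    if np.count_nonzero(keep) < min_points:
        return None  # too few inliers to trust a fit

    # Linear least squares: fit the row line y = m*x + b to the inliers.
    m, b = np.polyfit(x[keep], y[keep], 1)

    # Perpendicular distance from the robot (origin) to the fitted line.
    dist = abs(b) / np.hypot(1.0, m)

    # Validation heuristics: the row should be near-parallel to the
    # heading, and the estimate should agree with the previous one.
    if abs(m) > max_slope:
        return None
    if prev_dist is not None and abs(dist - prev_dist) > max_jump:
        return None
    return dist
```

In a running system the gate would be re-centered on the last validated estimate at each new scan, which mirrors the paper's use of current and previous information to grade and validate the fitted row lines.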
ISSN: 1556-4959
eISSN: 1556-4967