An Autonomous Crop Treatment Robot: Part I. A Kalman Filter Model for Localization and Crop/Weed Classification

Bibliographic Details
Published in: The International Journal of Robotics Research, 2002-01, Vol. 21 (1), pp. 61-74
Authors: Southall, B., Hague, T., Marchant, J.A., Buxton, B.F.
Format: Article
Language: English
Description
Abstract: This work is concerned with a machine vision system for an autonomous vehicle designed to treat horticultural crops. The vehicle navigates by following rows of crop (individual cauliflower plants) that are planted in a reasonably regular array typical of commercial practice. We adopt an extended Kalman filter approach in which the observation model consists of a grid matched to the crop planting pattern in the perspective view through the vehicle camera. Plant features are extracted by thresholding near-infrared images of the scene evolving before the camera, and a clustering method collects the features into groups representing single plants. An important aspect of the approach is that it provides both localization information and crop/weed discrimination within a single framework, since features not matching the planting pattern can be assumed to be weeds. Off-line tests with two image sequences compare the tracking against assessments by three different humans. These show that the extended Kalman filter is a viable method for tracking and that the model parameters derived from the filter are consistent with human assessment. We conclude that the performance will be good enough for accurate in-field navigation.
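
The predict/update cycle of the extended Kalman filter described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual model: the state here is assumed to be just the vehicle's lateral offset and heading relative to a crop row, the observation is assumed to be the lateral position of a plant feature matched to the planting grid, and all noise values are illustrative.

```python
import numpy as np

def ekf_step(x, P, u, z, dt=0.1, q=1e-3, r=1e-2):
    """One extended-Kalman-filter predict/update cycle (illustrative).

    x : state estimate [lateral offset (m), heading (rad)]  -- assumed state
    P : 2x2 state covariance
    u : forward speed (m/s)
    z : observed lateral offset of a plant matched to the grid (m)
    """
    # --- Predict: offset drifts by speed * sin(heading) over dt ---
    y, psi = x
    x_pred = np.array([y + u * np.sin(psi) * dt, psi])
    # Jacobian of the nonlinear motion model, linearized at the estimate
    F = np.array([[1.0, u * np.cos(psi) * dt],
                  [0.0, 1.0]])
    Q = q * np.eye(2)                      # process noise (assumed)
    P_pred = F @ P @ F.T + Q

    # --- Update: the lateral offset is observed directly ---
    H = np.array([[1.0, 0.0]])             # observation Jacobian
    R = np.array([[r]])                    # measurement noise (assumed)
    innov = z - H @ x_pred                 # innovation (residual)
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + (K @ innov).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Usage: repeated updates with a matched plant feature at 0.5 m pull
# the offset estimate toward the measurement while shrinking covariance.
x, P = np.array([0.0, 0.0]), np.eye(2)
for _ in range(5):
    x, P = ekf_step(x, P, u=0.0, z=0.5)
```

In the paper's full framework the observation model is the perspective projection of the whole planting grid, so a single update fuses many plant features at once; features whose innovations are inconsistent with the grid are the candidates for weed classification.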
ISSN: 0278-3649, 1741-3176
DOI: 10.1177/027836402320556485