Low-Cost Hybrid Multisensor Fusion Method and Implementation for Production Intelligent Vehicles

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2022, Vol. 71, pp. 1-16
Main Authors: Cai, Kunyang; Qu, Ting; Chen, Hong; Gao, Bingzhao; Bian, Ning
Format: Article
Language: English
Description
Abstract: Faced with the cost limitations and the 360° fusion and tracking requirements of production intelligent vehicles, this article proposes a low-cost hybrid multisensor fusion method that provides flexibility and scalability for multilevel information input and different sensor configurations. Combining radar, lane detection, and ego-vehicle information, we design a centralized fusion algorithm for 360° object tracking that offers a low-computation multisensor association algorithm through a local-sensor-to-global-track association scheme and proves an optimal centralized estimator under multicoordinate-system Doppler information. To make the sensor fusion system compatible with multilevel information input and to enhance redundancy for improved safety, a distributed fusion algorithm based on the covariance intersection (CI) method is also designed as part of the fusion framework for the secondary fusion of high-level information. We validate the feasibility of the fusion method on an industrial-grade controller, consuming only 10% of its computing power, and build a ground-truth system for quantitative evaluation, which shows that the method performs well on 360° object tracking, guarantees stable object IDs, and greatly improves accuracy, especially in sensor boundary regions. Furthermore, we test and evaluate the method in a variety of complex open-road scenarios, including expressway, curved-road, and lane-fading scenarios.
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2022.3200434
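
The abstract names covariance intersection (CI) as the basis of the distributed, secondary fusion of high-level track information. The sketch below is a minimal, generic illustration of CI fusion of two track estimates with unknown cross-correlation; it is not the paper's implementation, and the function name, the scalar-weight grid search, and the trace-minimization criterion are illustrative assumptions.

import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=100):
    # Fuse two estimates (x1, P1) and (x2, P2) whose cross-correlation
    # is unknown, using the standard CI rule:
    #   P_f^{-1} = w * P1^{-1} + (1 - w) * P2^{-1}
    #   x_f      = P_f * (w * P1^{-1} x1 + (1 - w) * P2^{-1} x2)
    # The weight w in [0, 1] is chosen here by a simple grid search that
    # minimizes the trace of the fused covariance (a determinant criterion
    # is an equally common choice).
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid + 1):
        P_f = np.linalg.inv(w * I1 + (1.0 - w) * I2)
        cost = np.trace(P_f)
        if best is None or cost < best[0]:
            x_f = P_f @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
            best = (cost, x_f, P_f)
    return best[1], best[2]

if __name__ == "__main__":
    # Two hypothetical high-level track estimates of the same object,
    # state = [x position, y position], covariances from different sensors.
    x_a, P_a = np.array([10.2, 3.1]), np.diag([0.8, 2.0])
    x_b, P_b = np.array([10.6, 2.8]), np.diag([2.5, 0.5])
    x_ci, P_ci = covariance_intersection(x_a, P_a, x_b, P_b)
    print("fused state:", x_ci)
    print("fused covariance:\n", P_ci)

The appeal of CI for this kind of secondary fusion is that the fused covariance is guaranteed to be consistent (not overconfident) even though the correlation between the two already-filtered, high-level estimates is unknown, which is exactly the situation when redundant smart sensors each deliver their own tracks.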