A Novel Method of Multitarget Augmented Reality Assembly Result Inspection for Large Complex Scenes
Augmented reality (AR) has been widely employed in assembly guidance and maintenance as an excellent visualization tool. Building on this, AR combined with visual inspection has become a research topic aimed at rapid, intuitive quality inspection that reduces operators' workload. This article proposes a novel multitarget AR-based assembly result inspection method in which detected information is matched with prior knowledge via high-precision registration. First, a multimarker-based global registration method is designed to significantly improve the average registration accuracy in large scenes through the mutual calibration of a few markers. Second, based on the correlation between the inspection target and its AR twin, an image containing multiple mechanical components is segmented according to the prior locations, and the local images are matched with the prior knowledge for evaluation. Finally, the inspection method is deployed in an AR-based assembly inspection system, and its validity is verified on a rocket cabin imitation platform. Experiments show that the proposed method can accurately segment the multitarget image into several single-target images according to the prior locations and can verify the assembly results of the targets, running at 15.1 fps.

| | |
|---|---|
| Published in: | IEEE Transactions on Industrial Informatics, May 2024, Vol. 20 (5), pp. 7282-7291 |
| Main authors: | , , , , |
| Format: | Article |
| Language: | English |
| ISSN: | 1551-3203, 1941-0050 |
| DOI: | 10.1109/TII.2024.3357205 |
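The segmentation step the abstract describes, cropping per-target sub-images from a multi-target frame using prior component locations and a registered camera pose, could be sketched as follows. This is a minimal illustration using a generic pinhole camera model with made-up intrinsics and poses, not the authors' implementation; the function names `project_points` and `crop_rois` are hypothetical.

```python
import numpy as np

def project_points(K, R, t, pts3d):
    """Project 3D world points into pixel coordinates with a pinhole model."""
    cam = (R @ pts3d.T + t.reshape(3, 1)).T       # world -> camera frame
    uv = (K @ (cam / cam[:, 2:3]).T).T            # perspective divide, then intrinsics
    return uv[:, :2]

def crop_rois(image, centers_uv, half=32):
    """Crop a square region of interest around each projected target center."""
    h, w = image.shape[:2]
    rois = []
    for u, v in centers_uv:
        u, v = int(round(u)), int(round(v))
        x0, y0 = max(u - half, 0), max(v - half, 0)
        x1, y1 = min(u + half, w), min(v + half, h)
        rois.append(image[y0:y1, x0:x1])
    return rois

# Toy example: identity pose (camera at world origin) and simple intrinsics.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
# Prior locations of two component centers, 2 m in front of the camera.
pts = np.array([[0.0, 0.0, 2.0],
                [0.4, 0.0, 2.0]])
uv = project_points(K, R, t, pts)                  # -> [[320, 240], [420, 240]]
img = np.zeros((480, 640, 3), dtype=np.uint8)      # stand-in for a camera frame
rois = crop_rois(img, uv)                          # one 64x64 crop per target
```

Each cropped ROI would then be compared against the prior knowledge for its component (the "AR twin" correlation described in the abstract) to judge the assembly result.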