Visual servoing framework using Gaussian process for an aerial parallel manipulator
Saved in:
Published in: Proceedings of the Institution of Mechanical Engineers. Part G, Journal of Aerospace Engineering, 2019-07, Vol. 233 (9), p. 3408-3425
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: This paper proposes a Gaussian process based visual servoing framework for an aerial parallel manipulator. The aerial parallel manipulator uses an on-board eye-in-hand vision sensor mounted on the end-effector of a three-degree-of-freedom parallel manipulator. Compared with a serial manipulator, this arrangement offers three major advantages: it is small, light in weight, and nearly linear with respect to the host vehicle; its critical drawback is that the workspace is too small to perform the mission on its own while hovering. To overcome the limited workspace and perform the mission more actively, the proposed visual servoing framework generates relative body velocity commands for the host vehicle from a feature path interpolated and extrapolated between the initial and desired image features, which is then fed to the underactuated aerial parallel manipulator. The framework produces control inputs that are both numerically stable and feasible. Furthermore, it overcomes weaknesses of traditional image-based visual servoing, such as singularities, uncertainties, and local minima in the image Jacobian computation, when the disparity between the target and the unmanned aerial vehicle is large. Flight tests show that the proposed framework reliably performs autonomous pick-and-replacement and can be applied in large-displacement environments. (An illustrative sketch of the feature-path idea follows the record below.)
ISSN: 0954-4100, 2041-3025
DOI: 10.1177/0954410018798145
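
The abstract above describes two technical ingredients: a Gaussian process that interpolates and extrapolates an image-feature path between the initial and desired features, and a velocity command derived from that path for image-based visual servoing. The snippet below is a minimal, illustrative sketch of that general idea only, not the paper's method: the actual GP formulation, kernel choice, feature parameterization, camera intrinsics, depths, and control gains are not given in this record, so every concrete value here (scikit-learn's GaussianProcessRegressor with an RBF kernel, the example point features, fx/fy, the assumed 2 m depth, and the gain) is an assumption. The servoing step uses the standard point-feature interaction matrix of classical IBVS, not necessarily the authors' control law.

```python
# Illustrative sketch only: interpolate/extrapolate an image-feature path with a
# Gaussian process, then convert the error toward the next path waypoint into a
# camera velocity command via the classical IBVS interaction matrix.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Initial and desired image features: 4 points, (u, v) pixels flattened to an 8-vector.
s_train = np.array([[0.0], [1.0]])                   # path parameter s in [0, 1]
f_init = np.array([100, 100, 200, 100, 200, 200, 100, 200], dtype=float)
f_goal = np.array([300, 250, 420, 250, 420, 370, 300, 370], dtype=float)
F_train = np.vstack([f_init, f_goal])

# GP over the path parameter; each query returns a full feature vector.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True)
gp.fit(s_train, F_train)

# Interpolated waypoints, plus a mild extrapolation beyond the desired feature.
s_query = np.linspace(0.0, 1.1, 12).reshape(-1, 1)
feature_path = gp.predict(s_query)                   # shape (12, 8)

def to_normalized(features, fx=500.0, fy=500.0):
    """Convert pixel features to normalized image coordinates (assumed intrinsics)."""
    pts = features.reshape(-1, 2).astype(float)
    pts[:, 0] /= fx
    pts[:, 1] /= fy
    return pts.reshape(-1)

def interaction_matrix(norm_features, depths):
    """Stack the classical 2x6 point-feature interaction matrices."""
    L = []
    for (x, y), Z in zip(norm_features.reshape(-1, 2), depths):
        L.append([[-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
                  [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x]])
    return np.vstack(L)                              # shape (2N, 6)

# One IBVS step toward the next waypoint on the GP path (depths are assumed).
f_cur_n = to_normalized(f_init)
f_next_n = to_normalized(feature_path[1])
depths = np.full(4, 2.0)                             # assumed target depth [m]
L = interaction_matrix(f_cur_n, depths)
lam = 0.5                                            # control gain (assumption)
v_cam = -lam * np.linalg.pinv(L) @ (f_cur_n - f_next_n)
print("camera velocity command [vx vy vz wx wy wz]:", v_cam)
```

In the setting described by the abstract, such a command would be mapped to relative body velocities of the underactuated host vehicle and updated along the whole GP path; the single-step, camera-frame command above is only meant to show how one path sample plugs into a classical IBVS law.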