Hand Gesture Controlled Drones: An Open Source Library
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Drones are conventionally controlled using joysticks, remote controllers, mobile applications, and embedded computers. Significant issues with these approaches are that drone control is limited by the range of electromagnetic radiation and is susceptible to interference noise. In this study we propose the use of hand gestures as a method to control drones. We investigate the use of computer vision methods to develop an intuitive, agent-less way for a drone and its operator to communicate. Computer vision-based methods rely on the ability of a drone's camera to capture surrounding images and on pattern recognition to translate those images into meaningful and/or actionable information. The proposed framework involves a few key steps leading up to the final action: extracting images from the front camera's video stream, performing robust and reliable image recognition on the extracted images, and finally converting the classified gestures into actionable drone movements such as takeoff, landing, and hovering. A set of five gestures is studied in this work. A Haar feature-based AdaBoost classifier is employed for gesture recognition. For the safety of the operator, the drone's actions also take into account an operator-drone distance calculated using computer vision. A series of experiments is conducted to measure gesture recognition accuracy under the major scene variabilities: illumination, background, and distance. Classification accuracies show that gestures made in well-lit conditions, against a clear background, and within 3 ft of the camera are recognized correctly over 90% of the time. Limitations of the current framework and feasible solutions for better gesture recognition are also discussed. The software library we developed and the hand gesture data sets are open-sourced at the project website.
DOI: 10.48550/arxiv.1803.10344
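
The recognition-and-control loop described in the abstract (frame capture from the front camera, Haar feature-based AdaBoost classification, mapping of the classified gesture to a flight command) can be pictured with a minimal OpenCV sketch. The cascade file names, the gesture set shown, and the `send_command()` stub are illustrative placeholders, not the API of the paper's open-source library.

```python
# Minimal sketch of the pipeline in the abstract: grab frames from the drone's
# front-camera stream, run a Haar-cascade (AdaBoost) detector per gesture on
# each frame, and map a recognized gesture to a drone command.
import cv2

# One pre-trained Haar cascade per gesture (hypothetical file names; such
# cascades would have to be trained on the hand-gesture data set).
CASCADES = {
    "takeoff": cv2.CascadeClassifier("cascade_takeoff.xml"),
    "land":    cv2.CascadeClassifier("cascade_land.xml"),
    "hover":   cv2.CascadeClassifier("cascade_hover.xml"),
}

def classify_frame(frame):
    """Return the first gesture whose cascade fires on this frame, else None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for gesture, cascade in CASCADES.items():
        hits = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(hits) > 0:
            return gesture
    return None

def send_command(gesture):
    """Placeholder: forward the recognized gesture to the drone's control API."""
    print("drone command:", gesture)

def run(stream_url=0):
    cap = cv2.VideoCapture(stream_url)   # front-camera video stream
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gesture = classify_frame(frame)
        if gesture is not None:
            send_command(gesture)
    cap.release()

if __name__ == "__main__":
    run()
```

In practice the `send_command()` stub would be replaced by a call into the drone vendor's SDK, and detections would typically be debounced over several consecutive frames before a command is issued.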
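The abstract also mentions a vision-based distance calculation used for operator safety. One standard way to obtain such an estimate from a single camera is the pinhole-camera (similar-triangles) relation distance ≈ f·W/w, where W is the real hand width, w its detected width in pixels, and f the focal length in pixels. The sketch below uses that relation with assumed constants; the paper's exact method and values are not given in this record.

```python
# Vision-based distance estimate via similar triangles (assumed approach).
# All constants below are illustrative assumptions, not values from the paper.

HAND_WIDTH_M = 0.09        # assumed real width of an adult hand, in metres
FOCAL_LENGTH_PX = 700.0    # assumed focal length in pixels (from calibration)
MIN_SAFE_DISTANCE_M = 0.5  # illustrative safety threshold

def estimate_distance(bbox_width_px: float) -> float:
    """Distance to the hand, from the pixel width of its detected bounding box."""
    return FOCAL_LENGTH_PX * HAND_WIDTH_M / bbox_width_px

def too_close(bbox_width_px: float) -> bool:
    """True if the operator is closer than the assumed safety threshold."""
    return estimate_distance(bbox_width_px) < MIN_SAFE_DISTANCE_M

if __name__ == "__main__":
    # A 70 px wide detection with these constants corresponds to ~0.9 m.
    print(round(estimate_distance(70.0), 2), too_close(70.0))
```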