SmartGrip: grip sensing system for commodity mobile devices through sound signals

Bibliographic Details
Published in: Personal and Ubiquitous Computing, 2020-10, Vol. 24 (5), pp. 643-654
Authors: Kim, Namhyun; Lee, Junseong; Whang, Joyce Jiyoung; Lee, Jinkyu
Format: Article
Language: English
Online access: Full text
Description
Abstract: Although many studies have attempted to detect the hand postures with which a mobile device is held, so that these postures can be used as a user interface, they either require additional hardware or can differentiate only a limited number of grips, and only when there is a touch event on the mobile device's screen. In this paper, we propose a novel grip sensing system, called SmartGrip, which allows a mobile device to detect different hand postures without any additional hardware or screen touch events. SmartGrip emits carefully designed sound signals and differentiates the propagated signals distorted by different user grips. To achieve this, we analyze how a sound signal propagates from the speaker to the microphone of a mobile device and then address three key challenges: sound structure design, volume control, and feature extraction and classification. We implement and evaluate SmartGrip on three Android mobile devices. With six representative grips, SmartGrip achieves 93.1% average accuracy for ten users in an office environment. We also demonstrate that SmartGrip operates with 83.5% to 98.3% accuracy in six different (noisy) locations. To further demonstrate the feasibility of SmartGrip as a user interface, we develop an Android application that exploits SmartGrip, validating its practical usage.
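As a rough illustration of the emit-and-sense step the abstract describes, the sketch below plays a probe tone through an Android device's speaker while recording from its microphone; a user's grip attenuates and distorts what the microphone captures. This is a minimal sketch under assumed parameters, not the authors' implementation: the 18 kHz probe frequency, duration, and buffer sizes are hypothetical choices, and SmartGrip's actual sound structure design, volume control, and classifier are described in the paper itself.

```kotlin
// Illustrative sketch only: emit a fixed tone and capture the propagated
// signal. The probe frequency, sample rate, and duration are hypothetical
// values, not taken from the paper. Requires the RECORD_AUDIO permission.
import android.media.AudioFormat
import android.media.AudioManager
import android.media.AudioRecord
import android.media.AudioTrack
import android.media.MediaRecorder
import kotlin.math.PI
import kotlin.math.sin

const val SAMPLE_RATE = 44100      // Hz; widely supported on Android devices
const val PROBE_FREQ = 18_000.0    // Hz; assumed near-inaudible probe tone
const val DURATION_MS = 200        // assumed probe length

// Synthesize a 16-bit PCM sine tone to use as the probe signal.
fun buildProbe(): ShortArray {
    val n = SAMPLE_RATE * DURATION_MS / 1000
    return ShortArray(n) { i ->
        (sin(2.0 * PI * PROBE_FREQ * i / SAMPLE_RATE) * Short.MAX_VALUE * 0.8)
            .toInt().toShort()
    }
}

// Play the probe through the speaker while recording from the microphone.
// The hand gripping the device distorts the speaker-to-microphone path.
fun playAndCapture(probe: ShortArray): ShortArray {
    val minBuf = AudioRecord.getMinBufferSize(
        SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT)
    val recorder = AudioRecord(
        MediaRecorder.AudioSource.MIC, SAMPLE_RATE,
        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, minBuf * 4)
    val player = AudioTrack(
        AudioManager.STREAM_MUSIC, SAMPLE_RATE,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        probe.size * 2, AudioTrack.MODE_STATIC)
    player.write(probe, 0, probe.size)   // MODE_STATIC: load before play()

    recorder.startRecording()
    player.play()

    val captured = ShortArray(probe.size)
    var offset = 0
    while (offset < captured.size) {
        val read = recorder.read(captured, offset, captured.size - offset)
        if (read <= 0) break             // stop on error rather than spin
        offset += read
    }

    player.stop(); player.release()
    recorder.stop(); recorder.release()
    return captured
}
```

The captured buffer would then feed the feature extraction and classification stage the abstract mentions, for instance spectral features around the probe frequency passed to a trained classifier; the paper's own pipeline should be consulted for the actual design.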
ISSN: 1617-4909 (print); 1617-4917 (electronic)
DOI: 10.1007/s00779-019-01337-7