Integrating Mobile Multimodal Interactions based on Programming By Demonstration


Bibliographic Details
Published in: International Journal of Human-Computer Interaction, 2021-03, Vol. 37 (5), p. 418-433
Authors: Bellal, Zouhir, Elouali, Nadia, Benslimane, Sidi Mohamed, Acarturk, Cengiz
Format: Article
Language: English
Online access: Full text
Abstract: Mobile Multimodal Interaction aims to exploit complementary aspects of human communication capacities and new mobile sensors. Yet most current mobile applications are limited to a single basic interaction modality, the touchscreen, which restricts interaction in certain situations. In this paper, we present the On-the-Fly Interaction Editor (OFIE), an application that allows mobile end-users to define and integrate sensor-based unimodal and multimodal input interactions into their already installed applications according to their contexts. OFIE is based on Event-Condition-Action rules and the Programming By Demonstration approach, which allows end-users to demonstrate their expected action simply by performing it on the application's interface. We evaluated OFIE through a controlled user study involving 15 participants distributed across 4 groups based on their programming experience. Each participant was invited to integrate six input interactions (three of them multimodal). The initial results show that end-users are able to successfully integrate sensor-based input interactions using OFIE.
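The Event-Condition-Action model the abstract refers to can be sketched as follows. This is a minimal illustrative sketch, not OFIE's actual implementation: the rule fields, the `fire` function, and the event/context names ("shake", "screen_on") are all assumptions chosen for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical Event-Condition-Action (ECA) rule: an event name,
# a condition predicate over the current context, and an action
# standing in for the behavior the end-user demonstrated.
@dataclass
class EcaRule:
    event: str                          # e.g. a sensor event such as "shake"
    condition: Callable[[Dict], bool]   # predicate over the current context
    action: Callable[[], str]           # replay of the demonstrated action

def fire(rules: List[EcaRule], event: str, context: Dict) -> List[str]:
    """Run the action of every rule whose event matches and whose condition holds."""
    return [r.action() for r in rules
            if r.event == event and r.condition(context)]

# Example: map a "shake" gesture to a scroll action, but only while the screen is on.
rules = [EcaRule("shake",
                 lambda ctx: ctx.get("screen_on", False),
                 lambda: "scroll_down")]
```

In this sketch, `fire(rules, "shake", {"screen_on": True})` triggers the demonstrated action, while the same event with the screen off fires nothing, illustrating how context conditions gate sensor-based inputs.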
ISSN: 1044-7318
1532-7590
DOI: 10.1080/10447318.2020.1823688