New interface for musical instruments using lip reading
Published in: IET Image Processing 2015-09, Vol. 9 (9), p. 770-776
Main authors: , ,
Format: Article
Language: English
Subjects:
Abstract: As smart audio-visual multimedia devices are developed for various applications, there has been a growing interest in effective human–computer interaction (HCI) interfaces for specific environments. There have also been great efforts to implement HCI interfaces into musical instruments, which would bring intuitiveness, comfort and expressiveness to the instruments. However, most traditional HCI interfaces are not applicable because both hands are likely to be occupied while playing a musical instrument. In this environment, a lip reading method can be used. A lip reading method is an HCI method that analyses lip motion to recognise spoken words. In this study, a lip reading method is proposed together with its application. As a specific example of the interface for musical instruments, a guitar effector application is presented. The proposed lip reading method uses a constrained local model instead of the conventional active appearance model for effective facial feature tracking. The proposed method also uses a dynamic time warping-based classifier for word recognition, which is effective for simple real-time implementation of lip reading. The proposed lip reading method shows 85.0% word recognition accuracy on the OuluVS database and is effectively applied to the proposed guitar effector application.
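The abstract names a dynamic time warping (DTW) based classifier for word recognition. As an illustrative sketch only (not the authors' implementation; the function names, Euclidean frame distance, and nearest-template scheme are assumptions), DTW alignment between lip-feature sequences can be written as:

```python
def dtw_distance(seq_a, seq_b):
    """Classic DTW distance between two sequences of feature vectors."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j] = minimal accumulated distance aligning seq_a[:i] with seq_b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two frames (feature vectors)
            d = sum((a - b) ** 2 for a, b in zip(seq_a[i - 1], seq_b[j - 1])) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(query, templates):
    """Nearest-template classification: return the word label whose stored
    template sequence has the smallest DTW distance to the query sequence."""
    return min(templates, key=lambda word: dtw_distance(query, templates[word]))
```

Because DTW compares a query directly against stored templates with no training phase, it suits the simple real-time setting the abstract describes: a query lip-motion sequence is labelled with the word of its closest template, even when the two sequences differ in speed or length.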
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/iet-ipr.2014.1014