ArtiLock: Smartphone User Identification Based on Physiological and Behavioral Features of Monosyllable Articulation


Full Description

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2023-02, Vol. 23 (3), p. 1667
Main Authors: Wong, Aslan B, Huang, Ziqi, Chen, Xia, Wu, Kaishun
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Although voice authentication is generally secure, voiceprint-based authentication methods suffer from sensitivity to environmental noise, long passphrases, and large enrollment sample sets. We therefore present a novel approach to smartphone user authentication that analyzes articulation, integrating the physiology and behavior of the vocal tract, tongue position, and lip movement to expose the uniqueness of individuals while making utterances. The key idea is to use the smartphone's speaker and microphone to simultaneously transmit and receive speech and ultrasonic signals, construct identity-related features, and determine whether a single utterance comes from a legitimate user or an attacker. Physiological authentication prevents other users from copying or reproducing passwords. Compared with other behavioral authentication methods, the system recognizes the user's identity more accurately and adapts to environmental variation. Because single utterances are used, the proposed system requires fewer enrollment samples, yielding a user-friendly system that resists mimicry attacks with an average accuracy of 99% and an equal error rate of 0.5% across three different environments.
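To make the reported metric concrete: the equal error rate (EER) cited in the abstract is the operating point of an authentication system where the false accept rate (attackers admitted) equals the false reject rate (legitimate users denied). The following is a minimal illustrative sketch, not the authors' code; the score distributions are synthetic and purely hypothetical.

```python
import numpy as np

def equal_error_rate(genuine_scores, impostor_scores):
    """Sweep thresholds over all observed scores and return the rate at the
    point where false accept rate (FAR) and false reject rate (FRR) are closest."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = float("inf"), 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)  # attackers accepted at threshold t
        frr = np.mean(genuine_scores < t)    # legitimate users rejected at t
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Synthetic example: a well-separated classifier yields a very low EER.
rng = np.random.default_rng(0)
genuine = rng.normal(0.9, 0.05, 1000)   # scores for the enrolled user
impostor = rng.normal(0.1, 0.05, 1000)  # scores for attackers
print(equal_error_rate(genuine, impostor))
```

A lower EER means better separation between legitimate users and attackers; the 0.5% figure in the abstract indicates that the genuine and impostor feature distributions barely overlap.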
ISSN: 1424-8220
DOI: 10.3390/s23031667