Smart Communication System Using Sign Language Interpretation
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Full text
Summary: Although sign language has become more widely used in recent years, establishing effective communication between mute/deaf people and non-signers without a translator remains a barrier. Multiple methods have been proposed in the literature to overcome these challenges with the help of Sign Language Recognition (SLR), using approaches based on arm sensors, data gloves, and computer vision. However, the sensor-based methods require users to wear additional devices such as armbands and data gloves. The sensor-free, vision-based methods are computationally intensive and sometimes less accurate than the wearable sensor-based methods. In this paper, we propose a lightweight, vision-based, web-based sign language interpretation system. It provides two-way communication for all classes of people (deaf-and-mute, hard of hearing, visually impaired, and non-signers) and can be scaled commercially. The proposed method uses MediaPipe to extract hand features from the input image/video and then uses a lightweight random forest classifier to classify the signs based on the extracted features, with an accuracy of 94.69%. The proposed model is trained on the American Sign Language alphabet. We developed a web-based user interface for ease of deployment. It is equipped with text-to-speech, speech-to-text, and auto-correct features to support communication between deaf-and-mute, hard of hearing, visually impaired, and non-signing users.
ISSN: 2305-7254, 2343-0737
DOI: 10.23919/FRUCT54823.2022.9770914
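The abstract above describes a two-stage pipeline: MediaPipe hand-landmark extraction followed by a lightweight random forest classifier over the American Sign Language alphabet. The sketch below is a minimal, hypothetical illustration of such a pipeline, assuming Python with the mediapipe, opencv-python, numpy, and scikit-learn packages; the function names, feature layout, and hyperparameters (e.g. n_estimators=100) are illustrative assumptions and not the authors' implementation.

```python
# Illustrative sketch only: hand-landmark features via MediaPipe, classified
# with a random forest (scikit-learn assumed; not the paper's actual code).
import cv2
import mediapipe as mp
import numpy as np
from sklearn.ensemble import RandomForestClassifier

mp_hands = mp.solutions.hands

def extract_features(image_bgr):
    """Return a flat vector of 21 (x, y, z) hand landmarks, or None if no hand is found."""
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        results = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    landmarks = results.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in landmarks]).flatten()  # shape (63,)

def train(images_bgr, labels):
    """Fit a random forest on labelled sign images (hypothetical data layout)."""
    pairs = [(extract_features(img), lab) for img, lab in zip(images_bgr, labels)]
    X, y = zip(*[(f, lab) for f, lab in pairs if f is not None])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(np.vstack(X), list(y))
    return clf

def predict_letter(clf, image_bgr):
    """Classify a single frame into an alphabet label, or return None if no hand is detected."""
    feats = extract_features(image_bgr)
    return clf.predict(feats.reshape(1, -1))[0] if feats is not None else None
```

A deployed system along the lines the abstract describes would wrap such a classifier in a web interface with speech-to-text, text-to-speech, and auto-correct components, which are outside the scope of this sketch.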