Sound bite hearing system embedded with OCR techniques for blind and deaf people
Saved in:
Main Authors: | , , , , , |
---|---|
Format: | Conference Proceedings |
Language: | eng |
Subjects: | |
Online Access: | Full Text |
Summary: | Speech and text are essential for human communication and for individual needs. A sighted person accesses information through printed text, while a person with poor vision can access the same information through voice. This project is built around an Arduino Uno board, which controls a camera and Bluetooth headset speakers as peripherals. The content is written on plain white paper, in clear English and a sufficiently large font, and is held in front of the camera. The camera, mounted at the center of a pair of spectacles, captures the full view of the paper. Once these criteria are met, the system processes the image and announces the recognized text through the Bluetooth headset speakers. A vibrator is then connected so that when the user bites it, the sound can be heard through it. Next, a voice board is used to record voices; when the user bites the vibrator, the recorded voice is heard through it. The project uses the OCR technique to capture images and convert them to spoken output, assisting a visually impaired individual in reading printed text without the support of any human reader. It also establishes a short-clip hearing framework for both blind and deaf people: a voice board records the voice, and the user hears it through the vibrator by biting it. |
---|---|
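The abstract describes a pipeline of camera capture, OCR, and announcement in short spoken clips. The sketch below illustrates only the text-processing stage of such a pipeline in Python; the OCR engine, Arduino camera interface, Bluetooth audio output, and voice board are not specified in the record, so the function names and the chunking parameter `max_words` here are hypothetical placeholders, not the paper's implementation.

```python
# Hypothetical sketch of the middle of the described pipeline:
# raw OCR output -> cleaned text -> short "sound bites" for announcement.
# The actual OCR call and Bluetooth/vibrator output are hardware-specific
# and are deliberately left out.

def clean_ocr_text(raw: str) -> str:
    """Collapse repeated whitespace and drop empty lines from raw OCR output."""
    lines = [" ".join(line.split()) for line in raw.splitlines()]
    return " ".join(line for line in lines if line)

def split_into_sound_bites(text: str, max_words: int = 4) -> list[str]:
    """Chunk recognized text into short phrases suitable for playback
    or recording on a voice board (max_words is an assumed tuning knob)."""
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

if __name__ == "__main__":
    raw = "The  quick brown\n fox jumps\nover the lazy dog today"
    text = clean_ocr_text(raw)
    for bite in split_into_sound_bites(text):
        print(bite)  # each phrase would be handed to the audio output stage
```

Keeping the announcement in short chunks mirrors the "sound bite" framing of the title: each phrase is small enough to be replayed individually when the user bites the vibrator.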
ISSN: | 0094-243X 1551-7616 |
DOI: | 10.1063/5.0180531 |