Abstract:
People who cannot hear or speak find it difficult to communicate with the wider community. Many of them use a sign language such as American Sign Language (ASL), but most hearing people do not understand it, so there is a need for a bridge that lets them express themselves through sign language. This project addresses that need by developing a glove-based system that translates ASL hand symbols into text and speech in real time. By creating a connection between ASL and spoken language, the project aims to improve the quality of life of individuals with hearing impairments, promoting inclusion and understanding in society.
To achieve this, the project uses an Arduino Nano R3 microcontroller, an HC-05 Bluetooth module, flex sensors, an Adafruit ADXL335 analog accelerometer, and 10 kΩ/1 kΩ resistors. Flex sensors placed along the fingers and thumb of the glove measure how far each finger is bent, while the accelerometer measures the hand's tilt and orientation, which helps interpret gestures accurately. The Arduino Nano R3 (or Arduino Uno) processes the sensor data and recognizes gestures by comparing the readings against predefined finger-bend values and hand-orientation angles. The HC-05 Bluetooth module transmits the result to an Android app, which displays the interpreted ASL symbols on the smartphone screen, voices them using speech synthesis, and offers additional functionality to improve the user experience. In total, 37 gestures are implemented: 26 alphabet letters and 11 word/phrase strings. The response time from performing a gesture to its display in the Android application is approximately 2 seconds.