Abstract:
The Sign Sense Translator bridges the communication gap between the deaf and hearing communities by translating American Sign Language (ASL) gestures into text in real time. Using MediaPipe's hand tracking together with a Support Vector Machine (SVM) classifier, the system captures, recognizes, and translates ASL gestures from video input. The mobile application, built with the Flutter framework, provides a user-friendly interface and runs on both Android and iOS. The architecture is scalable, allowing future integration of additional sign languages. Although environmental factors such as lighting pose challenges, the system remains robust in its core functionality. Planned improvements, including dataset expansion and enhanced error handling, are expected to further refine performance. The system represents a significant step toward inclusive communication, enhancing accessibility for individuals who rely on ASL.
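The recognition pipeline described above (hand landmarks in, gesture label out) can be sketched as follows. This is a minimal illustration, not the authors' implementation: MediaPipe Hands emits 21 landmarks per hand, each an (x, y, z) coordinate, which flatten to a 63-value feature vector fed to a classifier. To keep the sketch self-contained, a simple nearest-centroid classifier stands in for the SVM stage, and the landmark frames are synthetic placeholders for real MediaPipe output.

```python
import math

# MediaPipe Hands emits 21 landmarks per detected hand, each with
# (x, y, z) coordinates, so one frame flattens to 63 features.
NUM_LANDMARKS = 21

def flatten_landmarks(landmarks):
    """Flatten [(x, y, z), ...] into a single feature vector."""
    assert len(landmarks) == NUM_LANDMARKS
    return [coord for point in landmarks for coord in point]

class NearestCentroidClassifier:
    """Stand-in for the SVM stage: stores one centroid per ASL label
    and predicts the label whose centroid is closest to the input."""
    def __init__(self):
        self.centroids = {}

    def fit(self, features, labels):
        sums, counts = {}, {}
        for vec, label in zip(features, labels):
            acc = sums.setdefault(label, [0.0] * len(vec))
            for i, v in enumerate(vec):
                acc[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {
            label: [v / counts[label] for v in acc]
            for label, acc in sums.items()
        }

    def predict(self, vec):
        # Pick the label whose stored centroid is nearest in
        # Euclidean distance to the incoming feature vector.
        return min(
            self.centroids,
            key=lambda label: math.dist(vec, self.centroids[label]),
        )

# Synthetic "landmark" frames standing in for MediaPipe output:
# a spread open hand (label "B") vs. a closed fist (label "A").
open_hand = [(0.1 * i, 0.5, 0.0) for i in range(NUM_LANDMARKS)]
fist = [(0.05, 0.05, 0.0) for _ in range(NUM_LANDMARKS)]

clf = NearestCentroidClassifier()
clf.fit([flatten_landmarks(open_hand), flatten_landmarks(fist)], ["B", "A"])

print(clf.predict(flatten_landmarks(fist)))  # → A
```

In a production system the centroid classifier would be replaced by a trained SVM (e.g. scikit-learn's `SVC`) and the synthetic frames by live landmark streams from the device camera, but the data flow — landmarks, flattened features, label — stays the same.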