| dc.description.abstract |
Deaf and mute people communicate mainly through sign languages such as ASL (American Sign Language), but a substantial language barrier arises when they interact with hearing people. This barrier limits their access to education, employment, and social activities, often leaving them excluded. In addition, blind people struggle to access information presented as text, which makes communication even more challenging. Our project addresses these issues by creating a communication tool that helps deaf and mute individuals converse with everyone, including those who can hear.
Our proposed methodology applies machine learning and computer vision to transform communication for deaf and mute people: hand features are extracted with MediaPipe and classified by a K-Nearest Neighbour (KNN) machine learning model. We capture sign language from live video and instantly convert it into written English. This not only breaks down language barriers but also helps blind people, since the English text is converted into spoken words. Furthermore, our software lets people who do not know sign language communicate with deaf and mute individuals by converting their speech into English text, and that text into sign language. Both conversions rely on Flutter packages, as the application is developed in Flutter. In this way, we make communication easier for everyone, regardless of how they prefer to communicate, and take a step closer to a world where everyone can connect and understand each other. |
en_US |
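The abstract describes classifying MediaPipe hand landmarks with a K-Nearest Neighbour model. As a minimal, self-contained sketch of that classification step only: in the real pipeline, MediaPipe Hands yields 21 (x, y) landmarks per frame, flattened into a 42-value feature vector; here both the landmark values and the sign labels are synthetic placeholders, and the KNN is implemented by hand rather than with the project's actual model.

```python
import math
from collections import Counter

# Each training sample: (feature_vector, label).
# A feature vector stands in for 21 hand landmarks flattened
# to 42 (x, y) values, as MediaPipe Hands would produce.

def euclidean(a, b):
    """Euclidean distance between two landmark vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(train, query, k=3):
    """Label a query vector by majority vote of its k nearest neighbours."""
    nearest = sorted(train, key=lambda s: euclidean(s[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Synthetic data: "A" signs cluster near 0.2, "B" signs near 0.8.
train = (
    [([0.2 + 0.01 * i] * 42, "A") for i in range(3)]
    + [([0.8 + 0.01 * i] * 42, "B") for i in range(3)]
)

print(knn_classify(train, [0.21] * 42))  # → A
print(knn_classify(train, [0.79] * 42))  # → B
```

In the described system, each classified label would then be appended to the running English transcript, which the text-to-speech side reads aloud for blind users.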