| dc.description.abstract |
Smartphones and mobile applications have come a long way in making people’s lives easier. The mobile application development industry, which started with simple messaging and gaming apps, now offers software for everything from foreign exchange trading to health and fitness tracking. One area with considerable room for improvement, however, is applications built for differently-abled people. The specific problem this team is addressing is that of sign language translation apps. Although some sign language apps are commercially available, the current options suffer from three major issues. First, most apps rely on an American Sign Language (ASL) alphabet dataset to translate any given word. This means that to input the sign for the word ’car’, the user must make a separate sign for each letter in the word, because those are the only signs the app can recognize and translate. Second, the deep learning models these apps use for translation have been trained on datasets of still images rather than videos, so the app will usually fail to recognize and translate any sign that involves movement; this affects a large portion of signs, as the majority involve some movement. Third, all reasonably capable sign language translation apps either require expensive hardware or are protected by a paywall, making them inaccessible to most users. The team has addressed all of these issues by building an ASL translation app that provides word-level translations, can translate signs that involve movement, and is free for everyone to download and use. |
en_US |