Sign Language Recognition Using a Custom Data Glove

Bachelor of Engineering Thesis Project.

Project Guide - Dr. Shrikanta Pal, Electronics & Communication Dept., BIT Mesra

Team Members - Anirvan Dutta, Ravi Ranjan, Nancy Mishra

Our goal was to design a Human Computer Interface (HCI) device that translates sign language, specifically Indian Sign Language (ISL), into text and speech, giving deaf and mute individuals the ability to communicate effortlessly with anyone. Sign language uses gestures, mainly specific hand shapes and movements, rather than sound to convey words and sentences. The idea is a glove fitted with sensors that capture hand gestures and transmit the readings to a processing unit, which performs the translation. The novelty of our approach is that, beyond identifying static gestures, it also defines a framework for dynamic gesture recognition.

We proposed a real-time, multimodal Indian Sign Language recognition system built on a feature-level fusion scheme: a broad set of features from flex, force, and inertial sensors is combined to classify both static and dynamic gestures.
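As a rough illustration of the feature-level fusion idea, the Python sketch below concatenates static posture readings with short-window motion statistics into a single feature vector. The sensor counts match the glove described below, but the window length and the choice of mean/standard-deviation motion features are assumptions for illustration, not the exact thesis pipeline.

```python
import numpy as np

def fuse_features(flex, force, imu_window):
    """Fuse all three sensor modalities into one feature vector.

    flex       : 9 flex-sensor readings (finger bend)
    force      : 2 force-sensor readings (contact pressure)
    imu_window : (T, 6) array of accelerometer + gyroscope samples over a
                 short window, so dynamic gestures contribute motion cues.
    """
    imu_window = np.asarray(imu_window)
    # Static posture features: raw bend and pressure values.
    static_part = np.concatenate([flex, force])
    # Motion features: per-axis mean and standard deviation over the window.
    motion_part = np.concatenate([imu_window.mean(axis=0),
                                  imu_window.std(axis=0)])
    # Feature-level fusion: one concatenated vector fed to the classifier.
    return np.concatenate([static_part, motion_part])
```

With 9 flex, 2 force, and 6 IMU channels, this yields a 23-dimensional vector per gesture, so static and dynamic gestures can be handled by the same classifier.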

A human computer interface glove was developed to translate Indian Sign Language into text and speech. The glove uses nine flex sensors, two force sensors, and an inertial measurement unit (IMU) to capture hand gestures accurately. All components are mounted on the back of the glove, giving the user a full range of motion and leaving the hand free to perform other tasks while wearing it.
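On the processing side, the glove's readings must be received and unpacked before classification. Below is a minimal host-side sketch using pyserial; the port name, baud rate, and comma-separated packet format are assumptions for illustration, since the firmware protocol is not detailed here.

```python
import serial  # pyserial

# Assumed port, baud rate, and channel counts (9 flex, 2 force, 6-axis IMU).
PORT, BAUD = "/dev/ttyUSB0", 115200
N_FLEX, N_FORCE, N_IMU = 9, 2, 6

def read_sample(link):
    """Parse one line of comma-separated sensor readings from the glove."""
    line = link.readline().decode("ascii", errors="ignore").strip()
    values = [float(v) for v in line.split(",") if v]
    if len(values) != N_FLEX + N_FORCE + N_IMU:
        return None  # drop malformed packets
    flex = values[:N_FLEX]
    force = values[N_FLEX:N_FLEX + N_FORCE]
    imu = values[N_FLEX + N_FORCE:]
    return flex, force, imu

with serial.Serial(PORT, BAUD, timeout=1) as link:
    sample = read_sample(link)
```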
Machine learning algorithms were explored to classify the hand gestures. Non-parametric supervised learning algorithms such as kNN (k-Nearest Neighbours), SVM (Support Vector Machine), decision trees, and random forests were tested on the training data to find the most accurate classifier. A decision tree classifier was ultimately selected based on the results of k-fold cross-validation on the training data, as sketched below. The current system translates twenty-four letters and ten dynamic gestures from sign language into text and speech.
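A minimal sketch of this model-selection step using scikit-learn is shown below. The placeholder data (generated with make_classification), the hyperparameters, and the 23-feature vector size are assumptions for illustration; in the project, X and y would be the fused sensor feature vectors and their gesture labels.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Placeholder data standing in for fused glove feature vectors and labels.
X, y = make_classification(n_samples=600, n_features=23, n_informative=15,
                           n_classes=24, random_state=0)

candidates = {
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in candidates.items():
    # 10-fold cross-validation, as used to pick the final classifier.
    scores = cross_val_score(model, X, y, cv=10)
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

The classifier with the best mean cross-validation score is then retrained on the full training set and deployed on the processing unit.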

We were able to classify 24 static gestures and 5 dynamic gestures, with 10-fold cross-validation accuracies of 98% and 96% respectively on the training data. The static gestures cover the letters of the Indian Sign Language alphabet, while the dynamic gestures include common phrases such as "Hello", "Good Morning", "Thank You", "Sorry", and "Good Bye". In repeated live testing, the glove correctly classified 46 out of 50 gestures, a significant improvement over existing published results. We are still working on improving the design and obtaining publishable results.

