Gesture Talk – Sign Language Detection using LSTM Algorithm

Authors

  • S. Veena (Professor), Sri Sabarish N (UG Student), Adesh G S (UG Student), Rahul K (UG Student)

Keywords

Gesture, Neural Network, Sign Language, Vision-Based

Abstract

This paper presents a Sign Language Recognition System (SLRS) leveraging Long Short-Term Memory (LSTM) algorithms to enhance the accuracy and efficiency of gesture recognition. The proposed system utilizes LSTM, a type of recurrent neural network (RNN), to effectively model the sequential dependencies inherent in sign language gestures. Through an extensive dataset of sign language samples, the LSTM-based SLRS demonstrates superior performance in capturing temporal patterns and nuances of diverse signs. The architecture incorporates pre-processing techniques for feature extraction and employs LSTM layers to learn and remember sequential dependencies, enabling accurate interpretation of sign language gestures. The experimental results indicate the system's robustness in recognizing a wide range of gestures, making it a promising solution for real-time sign language communication applications, thereby facilitating improved inclusivity for individuals with hearing impairments.
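As a hedged illustration of the sequential modelling the abstract describes — a sketch of the standard LSTM formulation, not the authors' actual implementation — one LSTM time step over a gesture's per-frame feature vectors can be written in NumPy (the 63-dimensional hand-landmark features, 30-frame sequence length, and hidden size are placeholder assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: x is the current frame's feature vector;
    (h_prev, c_prev) carry the sequence memory across frames."""
    z = W @ x + U @ h_prev + b      # stacked gate pre-activations
    H = h_prev.shape[0]
    f = sigmoid(z[0:H])             # forget gate
    i = sigmoid(z[H:2 * H])         # input gate
    o = sigmoid(z[2 * H:3 * H])     # output gate
    g = np.tanh(z[3 * H:4 * H])     # candidate cell state
    c = f * c_prev + i * g          # updated cell state (long-term memory)
    h = o * np.tanh(c)              # hidden state (short-term output)
    return h, c

# Toy run: 30 frames of 63-dim features (hypothetical sizes).
rng = np.random.default_rng(0)
D, H, T = 63, 32, 30
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(T):
    h, c = lstm_step(rng.standard_normal(D), h, c, W, U, b)
# The final hidden state h summarizes the whole gesture sequence and
# would feed a softmax classifier over the sign vocabulary.
```

In a full system such as the one the abstract outlines, layers like this are stacked and trained end to end; the gating (forget/input/output) is what lets the network remember the temporal dependencies that distinguish one sign from another.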


Published

2023-12-12

How to Cite

S. Veena, Sri Sabarish N, Adesh G S, & Rahul K. (2023). Gesture Talk – Sign Language Detection using LSTM Algorithm. SJIS-P, 35(3), 650–656. Retrieved from http://sjis.scandinavian-iris.org/index.php/sjis/article/view/752

Issue

Section

Articles