EFFICIENT GESTURE CLASSIFICATION WITH PRE-TRAINED NEURAL NETWORKS

Authors

  • Maile Vaneshri Tulsidas
  • Dr. Y. Narasimha Reddy

Keywords:

sign language interpretation, virtual reality interactions, smart home automation, transfer learning, gesture recognition

Abstract

Gesture recognition plays a crucial role in human-computer interaction, enabling intuitive and contactless communication between users and machines. Traditional approaches to gesture classification often require extensive training data and computational resources, making real-time implementation challenging. This study explores the effectiveness of pre-trained neural networks for efficient gesture classification, leveraging transfer learning techniques to improve accuracy while reducing training time. By utilizing models such as Convolutional Neural Networks (CNNs), Vision Transformers (ViTs), and Recurrent Neural Networks (RNNs), the system extracts high-level spatial and temporal features from gesture datasets. The proposed approach enhances recognition speed, optimizes resource utilization, and ensures robustness across different lighting conditions and environments. Experimental results demonstrate that pre-trained models significantly outperform conventional methods in accuracy and generalization, making them ideal for real-world applications such as sign language interpretation, virtual reality interactions, and smart home automation. This research highlights the potential of transfer learning in gesture recognition, paving the way for more accessible and efficient human-machine interfaces.
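The core idea the abstract describes — reusing a pre-trained network's feature extractor and training only a small classifier on top — can be sketched in miniature. The snippet below is an illustrative toy, not the authors' implementation: a fixed feature extractor (standing in for frozen pre-trained CNN layers, with hand-picked rather than learned weights) feeds a trainable logistic-regression head, and only the head's weights are updated. The gesture data and class names are invented for demonstration.

```python
# Minimal sketch of transfer learning for gesture classification.
# A frozen "pre-trained" backbone (stand-in for CNN layers) feeds a small
# trainable head; only the head is updated, which is why transfer learning
# reduces training time and data requirements.
import math


def frozen_backbone(x):
    """Fixed feature extractor: weights are frozen, never updated."""
    # Two fixed projections followed by a ReLU non-linearity.
    f1 = max(0.0, 0.9 * x[0] - 0.4 * x[1])
    f2 = max(0.0, -0.3 * x[0] + 0.8 * x[1])
    return [f1, f2, 1.0]  # append a bias feature


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def train_head(data, labels, lr=0.5, epochs=200):
    """Train only the classification head (logistic regression) on
    backbone features -- the essential transfer-learning step."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, y in zip(data, labels):
            feats = frozen_backbone(x)
            p = sigmoid(sum(wi * fi for wi, fi in zip(w, feats)))
            # Gradient step on the head only; the backbone stays frozen.
            for i in range(3):
                w[i] -= lr * (p - y) * feats[i]
    return w


def predict(w, x):
    feats = frozen_backbone(x)
    return int(sigmoid(sum(wi * fi for wi, fi in zip(w, feats))) > 0.5)


# Toy "gesture" samples, e.g. [horizontal extent, vertical extent].
data = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
labels = [0, 0, 1, 1]  # two hypothetical gesture classes
w = train_head(data, labels)
preds = [predict(w, x) for x in data]
```

In a real system the frozen backbone would be a pre-trained model (e.g. a CNN or ViT trained on a large image corpus) and the head a small dense layer, but the division of labor — frozen features, trainable classifier — is the same.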

Published

13-03-2025

How to Cite

EFFICIENT GESTURE CLASSIFICATION WITH PRE-TRAINED NEURAL NETWORKS. (2025). International Journal of Information Technology and Computer Engineering, 13(1), 422-427. https://ijitce.org/index.php/ijitce/article/view/910