Low-cost Design of Vision-based Natural User Interface via Dynamic Hand Gestures

Authors

  • Richa Golash, Samrat Ashok Technological Institute, Civil Lines, Vidisha, Madhya Pradesh, India 464001
  • Yogendra Kumar Jain, Samrat Ashok Technological Institute, Civil Lines, Vidisha, Madhya Pradesh, India 464001

Keywords:

Natural user interface, machine learning, deep neural network, Faster R-CNN, dynamic hand gesture recognition, visual object tracking.

Abstract

Advances in computer vision and pattern recognition have opened many new dimensions for object recognition and visual tracking in videos. However, vision-based interaction between a human and a machine using moving hand gestures is still at a primitive stage, because seamless localization of the hand and conversion of its irregular trajectory into a command in RGB images are challenging tasks. The hand is a non-rigid object with diverse postures; some postures occupy a large image area and others a very small one. The surface of the hand is uneven, so its edges are not well defined during movement. Therefore, algorithms such as background subtraction, segmentation using contour information, and skin-color detection are not practically applicable under unconstrained background conditions. Hence, researchers prefer advanced cameras that can detect the skeleton structure and provide information beyond the RGB values of each pixel to facilitate the hand-detection stage in dynamic hand gesture recognition (DHGR). The use of advanced cameras in DHGR, however, increases the overall cost of the interface and requires technical knowledge from the user, so applications of DHGR remain limited to areas where complexity, cost, and expertise are not critical factors. The goal of this paper is to propose a low-cost, simple design for vision-based human-machine interaction via dynamic hand gestures that works directly on RGB images, can be easily integrated with any everyday machine, and is invariant to the user's age and skills. The architecture of the proposed method uses a Faster Region-based Convolutional Neural Network (Faster R-CNN) for hand spotting and recognition; tracking is accomplished using the scale-invariant feature transform (SIFT); and a modified backpropagation artificial neural network is finally added to efficiently translate hand movement into a machine command with optimal computation.
The proposed technique is simple yet robust in hand detection and capable of tracking and interpreting a non-rigid object directly in color videos captured in a real-time environment.
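The abstract describes a three-stage pipeline: detect the hand (Faster R-CNN), track it across frames (SIFT), and translate the resulting trajectory into a machine command. The final stage can be sketched in miniature as below. This is an illustrative assumption, not the authors' implementation (which uses a modified backpropagation ANN for the mapping); the function name, thresholds, and command set are all hypothetical.

```python
# Sketch of the trajectory-to-command stage: a sequence of tracked hand
# centroids (x, y), one per video frame, is reduced to a coarse command.
import math

def trajectory_to_command(centroids, min_displacement=20.0):
    """Map a tracked hand trajectory to a coarse directional command.

    centroids: list of (x, y) hand positions per frame, e.g. centres of
    Faster R-CNN detections refined by SIFT feature tracking.
    """
    if len(centroids) < 2:
        return "NONE"
    x0, y0 = centroids[0]
    x1, y1 = centroids[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_displacement:  # ignore small jitter
        return "NONE"
    # Quantize the net motion direction to one of four commands.
    if abs(dx) >= abs(dy):
        return "RIGHT" if dx > 0 else "LEFT"
    return "DOWN" if dy > 0 else "UP"  # image y-axis grows downward

# Example: a noisy but predominantly rightward swipe.
swipe = [(10, 50), (18, 52), (35, 49), (60, 55), (80, 51)]
print(trajectory_to_command(swipe))  # → RIGHT
```

An irregular trajectory (as the abstract notes, hand motion is rarely smooth) is handled here only by a displacement threshold; the paper's learned ANN mapping is what makes the translation robust to such irregularity.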

Published

2021-01-01

How to Cite

Richa Golash, & Yogendra Kumar Jain. (2021). Low-cost Design of Vision-based Natural User Interface via Dynamic Hand Gestures. International Journal of Computer Information Systems and Industrial Management Applications, 13, 12. Retrieved from https://cspub-ijcisim.org/index.php/ijcisim/article/view/404

Issue

Section

Original Articles