Stacked LSTM-Based Dynamic Hand Gesture Recognition with Six-Axis Motion Sensors

Publication Name

Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics

Abstract

Sensor-based hand gesture recognition can benefit a wide range of ubiquitous computing applications. However, the inherent complexity of human physical activity makes it difficult to recognize gestures accurately with wearable sensors, especially in real time. To this end, this paper presents a real-time hand gesture recognition system. In particular, a sliding-window technique combined with a y-axis threshold is used to detect intended gestures in a continuous data stream, and the segmented data are then classified by a stacked Long Short-Term Memory (LSTM) model. After noise removal, six-axis sensor data from wrist-worn devices are fed into the model directly, without any feature engineering. We evaluate the model on twelve common hand gestures. The experimental results demonstrate the feasibility of the proposed system, which achieves an average accuracy of 99.8%. Our approach enables accurate, user-independent hand gesture recognition and could be integrated into a smartwatch or other wearable device for intuitive human-computer interaction.
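
To make the described architecture concrete, below is a minimal sketch of a stacked-LSTM classifier for windows of six-axis sensor data, written with TensorFlow/Keras. The window length (128 samples), the number and size of the LSTM layers, and the training settings are illustrative assumptions rather than values reported in the paper; only the six input axes (accelerometer plus gyroscope) and the twelve gesture classes come from the abstract.

```python
# Minimal sketch of a stacked-LSTM gesture classifier for six-axis IMU windows.
# Window length, layer sizes, and training settings are assumptions, not values
# taken from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

WINDOW_LEN = 128      # samples per sliding window (assumed)
NUM_AXES = 6          # 3-axis accelerometer + 3-axis gyroscope
NUM_GESTURES = 12     # twelve hand gestures evaluated in the paper

def build_model():
    """Two stacked LSTM layers followed by a softmax classifier."""
    return tf.keras.Sequential([
        layers.Input(shape=(WINDOW_LEN, NUM_AXES)),
        layers.LSTM(64, return_sequences=True),   # first LSTM layer keeps the full sequence
        layers.LSTM(64),                          # second LSTM layer summarizes it
        layers.Dense(NUM_GESTURES, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random data stands in for denoised, threshold-segmented sensor windows.
x_train = np.random.randn(32, WINDOW_LEN, NUM_AXES).astype("float32")
y_train = np.random.randint(0, NUM_GESTURES, size=32)
model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=0)
```

In the pipeline described in the abstract, the random arrays above would be replaced by noise-filtered windows segmented from the continuous wrist-worn sensor stream via the sliding window and y-axis threshold.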

Open Access Status

This publication is not available as open access

First Page

2568

Last Page

2575

Funding Number

cx2017092

Funding Sponsor

National Natural Science Foundation of China


Link to publisher version (DOI)

http://dx.doi.org/10.1109/SMC52423.2021.9658659