Pattern recognition for prosthetic hand user's intentions using EMG data and machine learning techniques
RIS ID
139921
Abstract
In this paper, we propose a simplified pipeline system for hand gesture pattern recognition. The system is based on surface electromyography of the upper forearm, obtained from a commercial sensor, the Myo armband, developed by Thalmic Labs. The pipeline involves data acquisition, pre-processing, feature extraction, classification, post-processing and interfacing. Implementations and improvements of each stage, including a new post-processing method, are discussed. The pipeline is evaluated using electromyographic data from 10 subjects performing the 5 default gesture classes packaged with the proprietary Myo system. Comparing our results with those reported in the literature and those of the Myo system, we have determined that our pipeline performs effectively (94.8% accuracy with 5 seconds of recording per gesture), particularly in comparison with the default Myo system (83% accuracy).
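To illustrate the general shape of such a windowed sEMG classification pipeline (acquisition, feature extraction, classification, post-processing), the sketch below shows a minimal, generic implementation. The window length, feature set, classifier choice and majority-vote post-processing are assumptions for illustration only, not the configuration reported in the paper.

```python
# Minimal sketch of a generic sEMG gesture-recognition pipeline.
# Window length, features and classifier are illustrative assumptions,
# not the parameters used by the authors.
from collections import Counter

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

WINDOW = 200  # samples per analysis window (assumed)
STEP = 50     # window increment (assumed)


def extract_features(window):
    """Common time-domain EMG features, computed per channel."""
    mav = np.mean(np.abs(window), axis=0)                       # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)        # waveform length
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)  # zero crossings
    return np.concatenate([mav, wl, zc])


def windows(emg):
    """Slide a fixed-length window over a (samples, channels) recording."""
    return [emg[i:i + WINDOW] for i in range(0, len(emg) - WINDOW + 1, STEP)]


def featurize(recordings):
    """Turn a list of recordings into one feature row per window."""
    return np.array([extract_features(w) for emg in recordings for w in windows(emg)])


def train(recordings, labels):
    """Fit a classifier on windowed features; one gesture label per recording."""
    window_labels = np.repeat(labels, [len(windows(e)) for e in recordings])
    clf = LinearDiscriminantAnalysis()
    clf.fit(featurize(recordings), window_labels)
    return clf


def predict_gesture(clf, emg):
    """Classify each window, then post-process with a simple majority vote."""
    preds = clf.predict(featurize([emg]))
    return Counter(preds).most_common(1)[0][0]
```

A real system would stream 8-channel Myo data into this flow in real time and replace the majority vote with the paper's post-processing stage; the sketch only conveys how the pipeline stages connect.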
Grant Number
ARC/CE140100012
Publication Details
Young, S., Stephens-Fripp, B., Gillett, A., Zhou, H. & Alici, G. (2019). Pattern recognition for prosthetic hand user's intentions using EMG data and machine learning techniques. IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM (pp. 544-550). United States: IEEE.