Dynamic hand gesture recognition framework
Sign languages originated long before speech-based languages evolved. They contain subtleties that rival those of any spoken language, conveying rich information far faster than speech. Like spoken languages, sign languages vary from region to region; unlike their spoken counterparts, however, sign languages from diverse regions of the world share many common traits that originate in human evolution. Researchers intrigued by these traits have long wondered whether sign-language-style communication could be used to instruct computers in place of the mundane keyboard and mouse. This line of work, popularly associated with Human-Computer Interaction (HCI), has used a subset of common sign language hand gestures to interact with machines through computer vision.

Since sign languages comprise thousands of subtle gestures, a more sophisticated approach is needed to eventually recognize such a vast number of gestures. Hand gestures include both static postures and dynamic gestures, and together they carry a rich vocabulary describing thousands of words. In this article, we present our latest research: a mechanism to accurately interpret dynamic hand gestures using the concept of 'gesture primitives', in which each dynamic gesture is described as a sequence of primitives over time. This sequence drives a classification strategy based on Hidden Markov Models, which predicts the gesture reliably using statistical knowledge of such gesture sequences. Although our work is in its infancy, we believe this strategy can be extended so that machines can interpret the thousands of dynamic gestures used in sign language.
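To make the classification idea concrete, the following is a minimal sketch, not the authors' implementation: each gesture class is modelled by its own small Hidden Markov Model over a discrete alphabet of gesture primitives, and an observed primitive sequence is assigned to the gesture whose model gives it the highest likelihood (computed with the standard forward algorithm). The two toy gestures, "wave" and "hold", their state counts, and all probability values are invented purely for illustration.

```python
import numpy as np

def forward_likelihood(obs, start_p, trans_p, emit_p):
    """Forward algorithm: P(primitive sequence | HMM).

    obs     -- list of primitive indices observed over time
    start_p -- initial state distribution, shape (S,)
    trans_p -- state transition matrix, shape (S, S)
    emit_p  -- emission matrix P(primitive | state), shape (S, P)
    """
    alpha = start_p * emit_p[:, obs[0]]          # initialise with first observation
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]  # propagate and absorb next primitive
    return alpha.sum()

def classify(obs, models):
    """Pick the gesture whose HMM assigns the highest likelihood."""
    return max(models, key=lambda g: forward_likelihood(obs, *models[g]))

# Hypothetical gesture models over 3 primitives (0, 1, 2) with 2 hidden states each.
# "wave" alternates between primitives 0 and 1; "hold" emits primitive 2 throughout.
models = {
    "wave": (np.array([1.0, 0.0]),
             np.array([[0.1, 0.9], [0.9, 0.1]]),
             np.array([[0.90, 0.05, 0.05], [0.05, 0.90, 0.05]])),
    "hold": (np.array([1.0, 0.0]),
             np.array([[0.9, 0.1], [0.1, 0.9]]),
             np.array([[0.05, 0.05, 0.90], [0.05, 0.05, 0.90]])),
}

print(classify([0, 1, 0, 1, 0], models))  # alternating primitives -> "wave"
print(classify([2, 2, 2, 2], models))     # constant primitive     -> "hold"
```

In practice the transition and emission probabilities would be estimated from labelled training sequences (e.g. via Baum-Welch), and log-space arithmetic would be used to avoid underflow on long sequences; this sketch only illustrates the per-gesture-model likelihood comparison.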