
Hand posture analysis for visual-based human-machine interface

Conference contribution
posted on 2024-11-16, 12:10 authored by Abdoleh Chalechale, Farzad Safaei, Golshah Naghdy, Prashan Premaratne
This paper presents a new scheme for hand posture selection and recognition based on statistical classification. It has applications in telemedicine, virtual reality, computer games, and sign language studies. The focus is placed on (1) how to select an appropriate set of postures with a satisfactory level of discrimination power, and (2) a comparison of geometric and moment-invariant properties for recognizing hand postures. We introduce cluster-property and cluster-feature matrices to ease posture selection and to evaluate different posture characteristics. Simple and fast decision functions are derived for classification, which expedite the on-line decision-making process. Experimental results confirm the efficacy of the proposed scheme, where a compact set of geometric features yields a recognition rate of 98.8%.
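
To make the approach concrete, below is a minimal sketch of statistical posture classification from geometric features with a simple, fast decision rule. The feature set (bounding-box extent, aspect ratio, normalised centroid offsets) and the nearest-class-mean classifier are illustrative assumptions, not the exact features or decision functions derived in the paper.

import numpy as np

# Hypothetical helper: a few geometric descriptors of a binary hand silhouette.
# The specific feature set is assumed for illustration only.
def geometric_features(mask):
    ys, xs = np.nonzero(mask)
    area = float(xs.size)
    h = ys.max() - ys.min() + 1          # bounding-box height
    w = xs.max() - xs.min() + 1          # bounding-box width
    extent = area / (h * w)              # filled fraction of the bounding box
    aspect = w / h                       # bounding-box aspect ratio
    off_y = (ys.mean() - ys.min()) / h   # normalised centroid position
    off_x = (xs.mean() - xs.min()) / w
    return np.array([extent, aspect, off_y, off_x])

# Minimum-distance-to-class-mean rule: one simple, fast decision function per posture class.
class NearestMeanClassifier:
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Distance from every sample to every class mean; pick the closest class.
        d = np.linalg.norm(X[:, None, :] - self.means_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Toy usage: two synthetic "postures" (a tall strip and a wide strip).
if __name__ == "__main__":
    tall = np.zeros((40, 40), dtype=bool); tall[5:35, 18:22] = True
    wide = np.zeros((40, 40), dtype=bool); wide[18:22, 5:35] = True
    X = np.stack([geometric_features(m) for m in (tall, wide, tall, wide)])
    y = np.array([0, 1, 0, 1])
    clf = NearestMeanClassifier().fit(X, y)
    print(clf.predict(geometric_features(tall)[None, :]))   # -> [0]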

Citation

Chalechale, A., Safaei, F., Naghdy, G. & Premaratne, P. (2005). Hand posture analysis for visual-based human-machine interface. In B. Lovell & A. Maeder (Eds.), WDIC 2005 APRS Workshop on Digital Image Computing (pp. 91-96). Queensland: The Australian Pattern Recognition Society.

Parent title

APRS Workshop on Digital Image Computing

Pagination

91-96

Language

English

RIS ID

11994
