Parametric hidden Markov models for gesture recognition

  • Andrew D. Wilson
  • Aaron F. Bobick

Research output: Contribution to journal › Article › peer-review

454 Scopus citations

Abstract

A new method for the representation, recognition, and interpretation of parameterized gesture is presented. By parameterized gesture we mean gestures that exhibit a systematic spatial variation; one example is a point gesture where the relevant parameter is the two-dimensional direction. Our approach is to extend the standard hidden Markov model method of gesture recognition by including a global parametric variation in the output probabilities of the HMM states. Using a linear model of dependence, we formulate an expectation-maximization (EM) method for training the parametric HMM (PHMM). During testing, a similar EM algorithm simultaneously maximizes the output likelihood of the PHMM for the given sequence and estimates the quantifying parameters. Using visually derived and directly measured three-dimensional hand position measurements as input, we present results that demonstrate the recognition superiority of the PHMM over standard HMM techniques, as well as greater robustness in parameter estimation with respect to noise in the input features. Finally, we extend the PHMM to handle arbitrary smooth (nonlinear) dependencies. The nonlinear formulation requires the use of a generalized expectation-maximization (GEM) algorithm for both training and the simultaneous recognition of the gesture and estimation of the value of the parameter. We present results on a pointing gesture, where the nonlinear approach permits the natural spherical coordinate parameterization of pointing direction.
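The core of the linear PHMM described in the abstract is that each state's output mean becomes an affine function of the global parameter. The sketch below illustrates this idea only in outline: the names (`W`, `mu_bar`, `estimate_theta`) and the simplifications (identity covariances, known hard state assignments in place of EM responsibilities) are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical sketch of the linear output model in a parametric HMM (PHMM).
# Each state j emits x_t ~ N(W[j] @ theta + mu_bar[j], Sigma[j]).
# For brevity we use identity covariances and hard (known) state assignments
# instead of the soft EM responsibilities used in full training/testing.

rng = np.random.default_rng(0)
n_states, d_obs, d_theta = 3, 2, 1

W = rng.normal(size=(n_states, d_obs, d_theta))   # per-state linear maps
mu_bar = rng.normal(size=(n_states, d_obs))       # per-state mean offsets
true_theta = np.array([0.7])                      # the global gesture parameter

# Simulate an observation sequence with known per-frame states
T = 50
states = rng.integers(0, n_states, size=T)
X = np.array([W[j] @ true_theta + mu_bar[j] + 0.01 * rng.normal(size=d_obs)
              for j in states])

def estimate_theta(X, states, W, mu_bar):
    """Least-squares estimate of theta (the testing-time 'M-step'):
    minimize sum_t ||x_t - W[s_t] @ theta - mu_bar[s_t]||^2."""
    A = np.zeros((W.shape[2], W.shape[2]))
    b = np.zeros(W.shape[2])
    for x, j in zip(X, states):
        A += W[j].T @ W[j]
        b += W[j].T @ (x - mu_bar[j])
    return np.linalg.solve(A, b)

theta_hat = estimate_theta(X, states, W, mu_bar)
print(theta_hat)  # should lie close to true_theta
```

In the full algorithm this parameter update alternates with forward-backward inference over the hidden states, so both the gesture label and the parameter value are recovered simultaneously.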

Original language: English
Pages (from-to): 884-900
Number of pages: 17
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 21
Issue number: 9
State: Published - Sep 1999

