Computer Vision Techniques for Articulated Hand Tracking and Facial Expression Recognition
Date Issued
2008
Author(s)
Chang, Wen-Yan
Abstract
Three topics central to understanding human intention are addressed in this dissertation: articulated hand tracking, face/facial-component tracking, and facial expression recognition. To capture complex hand motion in image sequences, we propose a model-based approach, called appearance-guided particle filtering, for tracking with a high number of degrees of freedom. In addition to the initial state, our method assumes that some known attractors in the state space are available to guide the tracking, and it integrates both attractor and motion-transition information in a probability-propagation framework. Experimental results show that our method outperforms approaches that use only sequential motion-transition information or only appearance information. An object usually consists of several components. To handle tracking problems in which an object's components exhibit strong spatial coherence, we develop a part-based tracking method. Unlike existing methods that use the spatial-coherence relationship only for particle-weight estimation, ours also applies the spatial relationship to state prediction, which considerably improves tracking performance. For facial expression recognition, we propose a hybrid-representation approach in a manifold-learning framework that combines the advantages of holistic and local representations. We demonstrate the effectiveness of our method on the Cohn-Kanade database, where it achieves a high recognition rate.
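The attractor idea in the abstract can be illustrated with a toy particle filter. The sketch below is not the dissertation's implementation; it is a minimal 1-D stand-in in which, with some probability, a particle is re-drawn near a known attractor state instead of following the motion-transition model, and the two sources of information are fused through the usual weight/resample cycle. All names, parameters, and the Gaussian observation model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def appearance_guided_pf(observations, attractors, n_particles=500,
                         motion_std=0.5, obs_std=1.0, attractor_prob=0.2):
    """Toy 1-D particle filter with attractor-guided proposals.

    With probability `attractor_prob`, a particle jumps near a known
    attractor; otherwise it follows a random-walk motion model.
    (Illustrative only -- not the dissertation's actual algorithm.)
    """
    particles = rng.normal(observations[0], obs_std, n_particles)
    estimates = []
    for z in observations:
        # Propose: mix the motion-transition model with attractor guidance.
        particles = particles + rng.normal(0.0, motion_std, n_particles)
        use_attr = rng.random(n_particles) < attractor_prob
        if len(attractors) > 0 and use_attr.any():
            idx = rng.integers(len(attractors), size=use_attr.sum())
            particles[use_attr] = (np.asarray(attractors)[idx]
                                   + rng.normal(0.0, motion_std, use_attr.sum()))
        # Weight each particle by a Gaussian observation likelihood.
        weights = np.exp(-0.5 * ((particles - z) / obs_std) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.dot(weights, particles)))
        # Resample according to the weights.
        keep = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[keep]
    return estimates
```

In the full method the attractors would come from appearance matching against hand poses, and the state would be the high-dimensional articulated-hand configuration rather than a scalar.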
Subjects
Articulated hand tracking
particle filtering
tracking by parts
component collaboration
facial expression recognition
supervised manifold learning
Type
thesis
File(s)
Name
ntu-97-D91922011-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5):fc545da16213419ac4b58c358a888524
