Advisor: Hung, Yi-Ping (洪一平)
Author: Chang, Wen-Yan (張文彥)
Institution: National Taiwan University, Graduate Institute of Computer Science and Information Engineering
Year: 2008
Record dates: 2010-06-02; 2018-07-05
Identifier: U0001-2407200820380200
Handle: http://ntur.lib.ntu.edu.tw//handle/246246/184807

Title: Computer Vision Techniques for Articulated Hand Tracking and Facial Expression Recognition (電腦視覺技術於手勢追蹤與表情辨識之研究)

Abstract (translated from the Chinese):
How to let a computer understand a user's intentions and thoughts by observing the user's behavior and emotions is a fundamental and important problem. In this dissertation, we study articulated hand tracking, facial-component tracking, and facial expression recognition. For hand tracking, every finger joint has several degrees of freedom, and these numerous joint parameters make the task a complex, high-dimensional tracking problem. To solve it effectively, we propose appearance-guided particle filtering, built on a Bayesian probability-propagation model. By bringing the appearance of the hand posture into the dynamic system, the motion state can be estimated correctly through the guidance of appearance information and the propagation of dynamic information. In general, an object is composed of several local components, and these components usually stand in some geometric relationship to one another. We therefore also develop a method that exploits the spatial coherence of an object's local features to improve visual tracking, and we apply it to tracking the face and its components. By exploiting the correlation and collaboration among local features, we can effectively reduce the influence of illumination changes and partial occlusion. Beyond face tracking, we have successfully applied this method to other visual-tracking problems. For facial expression recognition, unlike the common holistic or local facial representations, we adopt a hybrid representation of facial features, so that global facial variations and subtle local differences can be observed at the same time. Applying supervised manifold learning, we propose a fusion algorithm that effectively integrates the manifolds of the individual components and highlights the influence of each component on different expressions. Extensive experiments show that the method distinguishes the various expression classes effectively.

Abstract (English):
Three important topics for human intention understanding are discussed in this dissertation: articulated hand tracking, face and facial-component tracking, and facial expression recognition. To capture complex hand motion in image sequences, we propose a model-based approach, called appearance-guided particle filtering (AGPF), for high degree-of-freedom tracking. In addition to the initial state, our method assumes that there are some known attractors in the state space to guide the tracking, and it integrates both attractor and motion-transition information in a probability-propagation framework. Experimental results show that our method performs better than methods that use only sequential motion-transition information or only appearance information. An object usually consists of several components. To deal with tracking problems that exhibit strong spatial coherence among an object's components, we develop a part-based tracking method. Unlike existing methods, which use the spatial-coherence relationship only for particle-weight estimation, our method also applies the spatial relationship to state prediction, which considerably improves tracking performance. For facial expression recognition, we propose a hybrid-representation approach in a manifold-learning framework that takes advantage of both holistic and local representations. We show the effectiveness of our method by applying it to the Cohn-Kanade database, where a high recognition rate is achieved.
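To make the AGPF idea concrete, here is a minimal Python sketch of one filtering step in which particles are propagated either by a random-walk motion model or by sampling near a known attractor, and are then re-weighted by an appearance likelihood. The state space, the likelihood() function, and parameters such as attractor_prob are hypothetical stand-ins; the dissertation's probability-propagation model and mixture-based formulation are considerably richer.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(state, observation):
    # Hypothetical appearance likelihood: a Gaussian score around the
    # observation stands in for an image-based appearance measure.
    return np.exp(-0.5 * np.sum((state - observation) ** 2))

def agpf_step(particles, weights, observation, attractors,
              motion_std=0.05, attractor_std=0.05, attractor_prob=0.3):
    """One step of attractor-guided particle filtering (illustrative).

    With probability `attractor_prob`, a particle is re-drawn near a
    known attractor in the state space; otherwise it is propagated by
    the usual random-walk motion-transition model.
    """
    n, d = particles.shape
    # Resample according to the current weights.
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx]

    use_attractor = rng.random(n) < attractor_prob
    for i in range(n):
        if use_attractor[i]:
            a = attractors[rng.integers(len(attractors))]
            particles[i] = a + rng.normal(0, attractor_std, d)
        else:
            particles[i] = particles[i] + rng.normal(0, motion_std, d)

    # Re-weight by the appearance likelihood and normalize.
    weights = np.array([likelihood(p, observation) for p in particles])
    weights /= weights.sum()
    return particles, weights

# Toy usage in a 2-D state space with two attractors.
particles = rng.normal(0, 1, (200, 2))
weights = np.full(200, 1 / 200)
attractors = np.array([[1.0, 1.0], [-1.0, 0.5]])
particles, weights = agpf_step(particles, weights,
                               observation=np.array([1.0, 1.0]),
                               attractors=attractors)
print(np.average(particles, axis=0, weights=weights))
```

The point of the attractor branch is that, in a high-dimensional articulated-hand state space, motion transitions alone explore too slowly; sampling near appearance-derived attractors keeps particles close to plausible postures.

Similarly, the part-based idea of using spatial coherence for state prediction, not only for particle weighting, can be illustrated roughly as follows. The component names, the fixed pairwise offsets in OFFSETS, and the predict_from_neighbor() helper are illustrative simplifications; the dissertation formulates the component relationships with Bayesian probability propagation (the TBP-BN of Chapter 3).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nominal offsets between facial components (e.g., the
# eyes relative to the nose), standing in for learned spatial priors.
OFFSETS = {
    ("nose", "left_eye"): np.array([-0.3, -0.4]),
    ("nose", "right_eye"): np.array([0.3, -0.4]),
}

def predict_from_neighbor(neighbor_state, offset, noise_std=0.02):
    """Predict a component's state from a neighboring component.

    Spatial coherence is used here for *prediction*: the neighbor's
    estimate plus the nominal offset proposes where this component
    should be, instead of only re-weighting its particles afterward.
    """
    return neighbor_state + offset + rng.normal(0, noise_std, 2)

def propose_particles(nose_estimate, component, n=100):
    offset = OFFSETS[("nose", component)]
    return np.stack([predict_from_neighbor(nose_estimate, offset)
                     for _ in range(n)])

# Toy usage: the nose estimate drives predictions for both eyes, so an
# occluded eye still receives sensible particle proposals.
nose = np.array([0.5, 0.5])
left_eye_particles = propose_particles(nose, "left_eye")
print(left_eye_particles.mean(axis=0))  # approximately [0.2, 0.1]
```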
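Finally, a rough sketch of fusing per-component embedded manifolds for expression recognition: each facial component receives its own supervised embedding, and the embeddings are combined with per-component weights before classification. The use of scikit-learn's LinearDiscriminantAnalysis as the supervised embedding and the equal-weight fusion below are assumptions for illustration, not the dissertation's fusion algorithm, which learns how much each component contributes to each expression.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def fuse_component_manifolds(component_features, labels, weights=None):
    """Embed each component with supervised LDA and fuse the embeddings.

    component_features: dict mapping component name -> (n_samples, d) array
    weights: optional per-component influence weights (equal here; the
             dissertation's fusion algorithm instead emphasizes the
             components that matter most for each expression).
    """
    names = sorted(component_features)
    if weights is None:
        weights = {name: 1.0 for name in names}
    embeddings, models = [], {}
    for name in names:
        lda = LinearDiscriminantAnalysis()
        z = lda.fit_transform(component_features[name], labels)
        embeddings.append(weights[name] * z)
        models[name] = lda
    return np.hstack(embeddings), models

# Toy usage with random "eyes" and "mouth" features for 3 expressions.
rng = np.random.default_rng(2)
labels = np.repeat([0, 1, 2], 30)
features = {
    "eyes": rng.normal(labels[:, None], 1.0, (90, 10)),
    "mouth": rng.normal(labels[:, None], 1.0, (90, 12)),
}
fused, _ = fuse_component_manifolds(features, labels)
clf = KNeighborsClassifier(n_neighbors=5).fit(fused, labels)
print(clf.score(fused, labels))
```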
Table of Contents:
Abstract ... vii
List of Figures ... xiii
List of Tables ... xv
1 Introduction ... 1
  1.1 Motivation ... 1
  1.2 Overview ... 2
    1.2.1 Capturing Hand Articulation ... 2
    1.2.2 Tracking Face via Component Collaboration ... 3
    1.2.3 Analyzing Facial Expression ... 3
  1.3 Organization ... 4
2 Capturing Hand Articulation ... 7
  2.1 Background ... 8
  2.2 Appearance-Guided Particle Filtering ... 10
    2.2.1 Review of Particle Filtering ... 11
    2.2.2 Probability Propagation of AGPF ... 12
    2.2.3 Sequential Monte Carlo Framework of AGPF ... 14
  2.3 Mixture-based AGPF ... 17
  2.4 Likelihood Model and State Estimation ... 20
  2.5 Experimental Results ... 22
  2.6 Discussions ... 31
3 Tracking Face via Component Collaboration ... 33
  3.1 Background ... 34
  3.2 TBP Particle Filtering ... 36
    3.2.1 Bayesian Probability Propagation ... 38
    3.2.2 Inference of TBP-BN ... 40
    3.2.3 Refinement of the Particle Weights by Spatial Relationship ... 43
  3.3 Dynamic Distribution ... 46
    3.3.1 General Spatial Constraints ... 47
    3.3.2 Variations of the Dynamic Model ... 49
  3.4 Likelihood and Particle Weight Estimation ... 51
    3.4.1 Component Representation and Likelihood Measurement ... 51
    3.4.2 Particle Re-weighting ... 53
  3.5 Experimental Results ... 55
    3.5.1 Implementation ... 55
    3.5.2 Results ... 56
  3.6 Summary ... 64
4 Analyzing Facial Expression ... 65
  4.1 Background ... 67
    4.1.1 Facial Expression Analysis ... 67
    4.1.2 Supervised Manifold Learning ... 68
  4.2 Expression Analysis Using Fusion Manifolds ... 69
    4.2.1 Facial Components ... 70
    4.2.2 Fusion Algorithm for Embedded Manifolds ... 71
  4.3 Experimental Results ... 76
    4.3.1 Dataset and Preprocessing ... 76
    4.3.2 Algorithms for Comparison ... 76
    4.3.3 Comparisons and Discussions ... 78
  4.4 Summary ... 84
5 Conclusion ... 87
Bibliography ... 89
Publications ... 97

Keywords: articulated hand tracking (手勢追蹤); particle filtering (粒子濾波法); tracking by parts (局部特徵追蹤); component collaboration (局部元件合作); facial expression recognition (表情辨識); supervised manifold learning (監督式流形學習)

Format: application/pdf, 15,116,658 bytes
Language: en-US
Type: thesis
Full text: http://ntur.lib.ntu.edu.tw/bitstream/246246/184807/1/ntu-97-D91922011-1.pdf