Department: Mechanical Engineering (機械工程學系), College of Engineering (工學院)
Development of the Joint Attention with a New Face Tracking Method for Multiple People

Date Issued
2007
Author(s)
Jian, Hung-Jing
URI
http://ntur.lib.ntu.edu.tw//handle/246246/61350
Abstract
This thesis develops a system for tracking multiple objects and for establishing joint attention between people and a robot. We propose a new method, the Modified Multi-CAMSHIFT (MMC), based on color and shape probability distributions, to solve the multiple-object tracking problem. The color cue is computed by MMC, which extends the CAMSHIFT algorithm, and the shape cue is computed with a Scharr kernel mask. From these cues we build a color histogram and an orientation histogram, respectively, and apply Adaptive Feature Selection to choose the cue that tracks best. To decide whether a region is a face, we add an Eyes-pair Fast Extracting step. The proposed MMC runs within an adaptive multi-resolution (AMR) framework to reduce computation. The experimental results show that, with all of the mechanisms above, MMC tracks multiple objects satisfactorily. After detecting human faces, we estimate the direction each face points and study the form of human-robot interaction called Joint Attention. The robot establishes joint attention with a human by using both static and dynamic information. As the static information, we extract the edge image of the human face while he or she gazes at an object. As the dynamic information, the robot uses the optical flow detected while observing a human shift his or her gaze from the robot to another object. The two kinds of information are complementary: the static information gives the exact gaze direction but is difficult to interpret, whereas the dynamic information gives only a rough direction but an easily understood relationship between the gaze shift and the motor output needed to follow it. A Support Vector Machine (SVM) serves as the learning model.
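The color-cue tracking in MMC builds on CAMSHIFT, whose core is a mean-shift iteration that repeatedly moves a search window to the centroid of a per-pixel target-probability map (a color back-projection). A minimal NumPy sketch of that iteration follows; the window format and the `track` helper are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

def mean_shift_step(prob, window):
    """One mean-shift iteration: move the search window to the centroid
    of the probability mass it currently covers (the core of CAMSHIFT).
    prob: 2-D array of per-pixel target probabilities (e.g. a hue
    back-projection); window: (row, col, height, width) -- an
    illustrative format, not the thesis's own."""
    r, c, h, w = window
    patch = prob[r:r + h, c:c + w]
    m00 = patch.sum()
    if m00 == 0:                      # no mass under the window: stay put
        return window
    rows, cols = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    cr = (rows * patch).sum() / m00   # centroid within the patch
    cc = (cols * patch).sum() / m00
    # re-center the window on the centroid, clipped to the image bounds
    nr = int(round(r + cr - h / 2))
    nc = int(round(c + cc - w / 2))
    nr = min(max(nr, 0), prob.shape[0] - h)
    nc = min(max(nc, 0), prob.shape[1] - w)
    return (nr, nc, h, w)

def track(prob, window, max_iter=20):
    """Iterate mean-shift steps until the window stops moving."""
    for _ in range(max_iter):
        new = mean_shift_step(prob, window)
        if new == window:
            return new
        window = new
    return window
```

Full CAMSHIFT additionally adapts the window size and orientation from the patch's second-order moments each iteration; MMC extends this further to multiple simultaneous targets.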
Using both the static and the dynamic information acquired from observing a human's gaze shift lets the robot acquire the joint-attention ability efficiently and interact naturally with the human through the SVM: the dynamic information accelerates the learning of joint attention, while the static information improves task performance. The experimental results show that the proposed Modified Multi-CAMSHIFT was successfully applied to tracking multiple faces and to developing Joint Attention.
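The shape cue, an orientation histogram built from Scharr gradients, can be sketched as follows. The kernel values are the standard 3x3 Scharr mask; the valid-mode correlation and the magnitude-weighted binning are assumptions for illustration, not the thesis's exact procedure:

```python
import numpy as np

# standard 3x3 Scharr kernels for the x- and y-derivatives
SCHARR_X = np.array([[-3, 0, 3],
                     [-10, 0, 10],
                     [-3, 0, 3]], dtype=float)
SCHARR_Y = SCHARR_X.T

def convolve2d(img, k):
    """Valid-mode 2-D correlation with a 3x3 kernel (no padding)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out

def orientation_histogram(img, bins=8):
    """Gradient-orientation histogram weighted by gradient magnitude:
    a simple shape cue of the kind built from the Scharr kernel mask."""
    gx = convolve2d(img, SCHARR_X)
    gy = convolve2d(img, SCHARR_Y)
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)          # orientation in [-pi, pi]
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi),
                           weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist
```

For a vertical step edge, for example, all the gradient energy lands in the bin containing orientation 0, which is what lets the orientation histogram discriminate shapes the color histogram cannot.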
Subjects
Face
Tracking
CAMSHIFT
Edge Detection
Optical Flow
Joint Attention
SVM
Type
thesis
