https://scholars.lib.ntu.edu.tw/handle/123456789/365095
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, B.-J. | en_US |
dc.contributor.author | Huang, C.-M. | en_US |
dc.contributor.author | Fu, L.-C. | en_US |
dc.creator | Chen, B.-J.;Huang, C.-M.;Fu, L.-C. | - |
dc.date.accessioned | 2018-09-10T08:42:28Z | - |
dc.date.available | 2018-09-10T08:42:28Z | - |
dc.date.issued | 2011 | - |
dc.identifier.uri | http://www.scopus.com/inward/record.url?eid=2-s2.0-81255163759&partnerID=MN8TOARS | - |
dc.identifier.uri | http://scholars.lib.ntu.edu.tw/handle/123456789/365095 | - |
dc.description.abstract | This paper aims to construct a human-machine interaction system based on visual tracking techniques. A monocular camera is set up in front of the interacting user, and interaction is performed through a held object that expresses the user's intention. Since the object motion during interaction is arbitrary, target modeling with multiple visual cues must be considered simultaneously to track the target reliably in such a challenging scenario. The target appearance is modeled as a linear combination of several target templates and trivial templates in various color spaces with sparse coefficient vectors. To achieve real-time performance for human-machine interaction, an adaptive particle filtering algorithm is proposed to balance tracking robustness against processing speed. The dominant templates in the discriminative color channels are selected to verify the tracking hypotheses. The sparse coefficient vectors of each hypothesis corresponding to the selected templates are then efficiently estimated by particle swarm optimization. The selected templates and the estimated sparse coefficient vectors change dynamically over time. The overall performance has been validated in experiments. © 2011 SICE. | - |
dc.language | en | en |
dc.relation.ispartof | Proceedings of the SICE Annual Conference | - |
dc.source | AH-Scopus to ORCID | - |
dc.subject | feature selection; particle filter; sparse representation; visual tracking | - |
dc.subject.other | Bandpass filters; Color; Feature extraction; Man machine systems; Monte Carlo methods; Particle swarm optimization (PSO); Signal filtering and prediction; Tracking (position); Vector spaces; Adaptive particle filters; Human machine interaction; Linear combinations; Particle filter; Particle filtering algorithms; Real time performance; Sparse representation; Visual Tracking; Adaptive filters | - |
dc.title | Real-time visual tracking with adaptive particle filter for human-machine interaction | - |
dc.type | conference paper | en |
dc.relation.conference | 50th Annual Conference on Society of Instrument and Control Engineers, SICE 2011 | - |
dc.identifier.doi | 10.1109/SMC.2019.8914281 | - |
dc.identifier.scopus | 2-s2.0-81255163759 | - |
dc.relation.pages | 1344-1349 | - |
item.openairetype | conference paper | - |
item.fulltext | no fulltext | - |
item.openairecristype | http://purl.org/coar/resource_type/c_5794 | - |
item.grantfulltext | none | - |
item.cerifentitytype | Publications | - |
crisitem.author.dept | Electrical Engineering | - |
crisitem.author.dept | Computer Science and Information Engineering | - |
crisitem.author.dept | Center for Artificial Intelligence and Advanced Robotics | - |
crisitem.author.orcid | 0000-0002-6947-7646 | - |
crisitem.author.parentorg | College of Electrical Engineering and Computer Science | - |
crisitem.author.parentorg | College of Electrical Engineering and Computer Science | - |
crisitem.author.parentorg | Others: University-Level Research Centers | - |
Appears in Collections: | Computer Science and Information Engineering |
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.
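The abstract describes estimating the sparse coefficient vector of each tracking hypothesis, i.e. the weights of a linear combination of templates that best reconstructs the candidate region, via particle swarm optimization. The following is a minimal pure-Python sketch of that coefficient-estimation step under toy assumptions: templates and candidates are short grayscale vectors, and all function names, swarm parameters, and dimensions are illustrative, not taken from the paper's implementation.

```python
import random

def reconstruction_error(coeffs, templates, candidate):
    """Squared residual between the candidate and the weighted template sum."""
    recon = [sum(c * t[i] for c, t in zip(coeffs, templates))
             for i in range(len(candidate))]
    return sum((r - y) ** 2 for r, y in zip(recon, candidate))

def pso_estimate(templates, candidate, n_particles=30, iters=60, seed=0):
    """Estimate template coefficients by minimizing reconstruction error
    with a standard global-best particle swarm (toy sketch, not the
    authors' sparse solver)."""
    rng = random.Random(seed)
    dim = len(templates)
    # Random initial positions in [0, 1]^dim, zero initial velocities.
    pos = [[rng.uniform(0.0, 1.0) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_err = [reconstruction_error(p, templates, candidate) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_err[i])
    gbest, gbest_err = pbest[g][:], pbest_err[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Velocity update: inertia + pull toward personal/global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            err = reconstruction_error(pos[i], templates, candidate)
            if err < pbest_err[i]:
                pbest[i], pbest_err[i] = pos[i][:], err
                if err < gbest_err:
                    gbest, gbest_err = pos[i][:], err
    return gbest, gbest_err
```

For example, a candidate synthesized as `0.6 * t1 + 0.4 * t2` should be recovered with coefficients close to `[0.6, 0.4]` and a near-zero residual; in the paper this error would feed the particle filter's hypothesis-verification step, with trivial templates absorbing occlusion noise.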