Title: FingerPuppet: Finger-Walking Performance-based Puppetry for Human Avatar
Authors: Ching-Wen Hung; Chung-Han Liang; Bing-Yu Chen
Type: Conference paper
Conference: CHI '24: CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, May 11-16, 2024
Date Issued: 2024-05-11
Date Available: 2024-10-25
ISBN: 9798400703317
DOI: 10.1145/3613905.3650840
Scopus: https://www.scopus.com/record/display.uri?eid=2-s2.0-85194161095&origin=resultslist
Handle: https://scholars.lib.ntu.edu.tw/handle/123456789/722423
Language: en
SDGs: SDG11

Abstract: Performance-based digital puppetry has gained widespread popularity in fields including gaming, storytelling, and animation editing. Human hands are well suited to manipulating digital avatars, given their dexterity and ability to perform varied movements. In this work, we adopt the finger-walking technique, a natural and intuitive method of performance, as an interface for controlling human avatars. We first conducted a preliminary study to explore the range of finger-walking movements preferred by casual users and identified several general types of finger-walking performances. Based on the study results, we selected five common example animations from a database suitable for finger-walking performance. To manipulate human avatars, we developed a method that maps finger-walking motions to leg motions using rotation mapping and matches similar leg motions in the example animations to generate expressive full-body motions. We also implemented a prototype interactive storytelling application to demonstrate the effectiveness of our system in developing responsive and reliable human avatar motions.
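The abstract's pipeline, rotation mapping from tracked finger joints to leg joints followed by matching against example animations to retrieve a full-body pose, could be sketched roughly as below. The joint names, the linear scaling, and the nearest-neighbor matching over a toy pose database are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): map finger-walking angles to leg
# rotations, then retrieve the closest full-body pose from example animations.
import numpy as np

# Hypothetical joint order for the two "walking" fingers (index and middle),
# each contributing a base (MCP) and a mid (PIP) flexion angle in radians.
FINGER_ANGLES = ["index_mcp", "index_pip", "middle_mcp", "middle_pip"]
LEG_ANGLES = ["left_hip", "left_knee", "right_hip", "right_knee"]

def rotation_map(finger_angles: np.ndarray, scale: float = 1.5) -> np.ndarray:
    """Map finger flexion angles to leg joint rotations.

    A simple linear rotation mapping: each finger joint drives one leg joint,
    scaled up because legs swing through larger angles than fingers do.
    """
    return scale * finger_angles

def match_full_body_pose(leg_pose: np.ndarray,
                         example_leg_poses: np.ndarray,
                         example_full_poses: np.ndarray) -> np.ndarray:
    """Return the full-body pose whose leg configuration is closest.

    example_leg_poses:  (N, 4)  leg angles for each frame of the example animations
    example_full_poses: (N, D)  full-body pose stored for the same frames
    """
    distances = np.linalg.norm(example_leg_poses - leg_pose, axis=1)
    return example_full_poses[np.argmin(distances)]

# Toy usage with random stand-in data for the example animation database.
rng = np.random.default_rng(0)
example_leg_poses = rng.uniform(-1.0, 1.0, size=(500, len(LEG_ANGLES)))
example_full_poses = rng.uniform(-1.0, 1.0, size=(500, 30))  # 30-DoF full body

finger_angles = np.array([0.4, 0.2, -0.3, 0.1])   # one tracked finger-walking frame
leg_pose = rotation_map(finger_angles)
full_body_pose = match_full_body_pose(leg_pose, example_leg_poses, example_full_poses)
print(full_body_pose.shape)  # (30,)
```

Per-frame retrieval like this keeps the avatar responsive to the performer's fingers; the actual system would additionally need temporal smoothing across retrieved frames to avoid pose popping.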