Title: TacNote: Tactile and Audio Note-Taking for Non-Visual Access
Authors: Wan-Chen Lee; Ching Wen Hung; Chao Hsien Ting; Peggy Chi; Bing-Yu Chen
Type: conference paper
Venue: 36th Annual ACM Symposium on User Interface Software and Technology (UIST 2023), San Francisco
Date issued: 2023-10-29
Date added to repository: 2023-12-27
ISBN: 9798400701320
DOI: 10.1145/3586183.3606784
Handle: https://scholars.lib.ntu.edu.tw/handle/123456789/638150
Scopus ID: 2-s2.0-85178509835 (https://api.elsevier.com/content/abstract/scopus_id/85178509835)
Language: en
Keywords: 3D printing pen | Accessibility | Assistive technology | Fabrication | Mobile application | Tactile graphics
SDGs: SDG4

Abstract: Blind and visually impaired (BVI) people primarily rely on non-visual senses to interact with a physical environment. Doing so imposes a high cognitive load when perceiving and memorizing the presence of a large set of objects, such as at home or in a learning setting. In this work, we explored opportunities to enable object-centric note-taking by using a 3D printing pen for interactive, personalized tactile annotations. We first identified the benefits and challenges of self-created tactile graphics in a formative diary study. Then, we developed TacNote, a system that enables BVI users to annotate, explore, and memorize critical information associated with everyday objects. Using TacNote, users create tactile graphics with a 3D printing pen and attach them to the target objects. They capture and organize the physical labels by using TacNote's camera-based mobile app. In addition, they can specify locations, ordering, and hierarchy via finger-pointing interaction and receive audio feedback. Our user study with ten BVI participants showed that TacNote effectively alleviated the memory burden, offering a promising solution for enhancing users' access to information.