https://scholars.lib.ntu.edu.tw/handle/123456789/607366
Title: | Camera Ego-Positioning Using Sensor Fusion and Complementary Method | Authors: | Kao P.-Y.; Tseng K.-W.; Shen T.-Y.; Song Y.-B.; Chen K.-W.; Hu S.-W.; Shih S.-W.; Hung Y.-P. |
Keywords: | Camera ego-positioning;Sensor fusion;Cameras;Motion tracking;Pattern recognition;Complementary methods;Inertial measurement unit;Loosely coupled;Positioning accuracy;Robust positioning;Visual positioning;Visual simultaneous localization and mapping;Sensor data fusion | Date Issued: | 2021 | Volume: | 12663 LNCS | Pages: | 291-304 | Source Publication: | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Abstract: | Visual simultaneous localization and mapping (SLAM) is a common solution for camera ego-positioning. However, SLAM sometimes loses tracking, for instance due to fast camera motion or featureless or repetitive environments. To address the limitations of visual SLAM, we fuse the visual positioning results with inertial measurement unit (IMU) data using a filter-based, loosely coupled sensor fusion method, and further combine feature-based SLAM with direct SLAM via the proposed complementary fusion to retain the advantages of both methods; i.e., we not only keep the accurate positioning of feature-based SLAM but also compensate for its difficulty with featureless scenes through direct SLAM. Experimental results show that the proposed complementary method improves the positioning accuracy of conventional vision-only SLAM and leads to more robust positioning results. © 2021, Springer Nature Switzerland AG. |
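The paper's filter-based, loosely coupled fusion is not detailed in this record. As a rough illustration only, the sketch below blends an IMU-propagated position (high-rate but drifting) with sparse visual SLAM fixes (low-rate but drift-free) via a simple complementary weighting. The function name, the 1-D setup, the weight `alpha`, and the toy trajectory are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fuse_step(p_prev, v_imu, dt, p_visual, alpha=0.8):
    """One loosely coupled fusion step (illustrative sketch).

    Propagate position with the high-rate, biased IMU velocity, then,
    when a visual SLAM fix is available, pull the estimate toward it
    with complementary weight (1 - alpha). If visual tracking is lost
    (p_visual is None), fall back to IMU dead-reckoning alone.
    """
    p_pred = p_prev + v_imu * dt          # IMU dead-reckoning step
    if p_visual is None:                  # visual tracking lost
        return p_pred
    return alpha * p_pred + (1.0 - alpha) * p_visual

# Toy 1-D trajectory: true velocity 1.0 m/s, IMU velocity biased by +0.1 m/s,
# visual fix (equal to the true position) arriving every 10th step.
p_est, dt = 0.0, 0.1
for k in range(100):
    t = (k + 1) * dt
    visual = t if k % 10 == 0 else None
    p_est = fuse_step(p_est, 1.0 + 0.1, dt, visual)

# Pure dead-reckoning would end 1.0 m off the true position (10.0 m);
# the periodic visual corrections keep the fused error well below that.
print(round(abs(p_est - 10.0), 3))
```

The design point this toy mirrors: the IMU path supplies continuity when visual tracking drops out, while the visual path bounds the IMU's accumulated drift.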
URI: | https://www.scopus.com/inward/record.uri?eid=2-s2.0-85104367052&doi=10.1007%2f978-3-030-68796-0_21&partnerID=40&md5=e8c470e279cfb982997573bcb93cf926 https://scholars.lib.ntu.edu.tw/handle/123456789/607366 |
ISSN: | 0302-9743 | DOI: | 10.1007/978-3-030-68796-0_21 |
Appears in Collections: | Department of Electrical Engineering
Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.