Title: Visual-Inertial Ego-Positioning for Flying Cameras (基于視覺和慣性測量之飛行攝影機自我定位)
Type: thesis
Author: Qiao Liang (梁橋)
Advisor: Yi-Ping Hung (洪一平)
College: College of Electrical Engineering and Computer Science
Graduate Institute: Graduate Institute of Networking and Multimedia
Date issued: 2016
Record dates: 2017-03-06; 2018-07-05
Handle: http://ntur.lib.ntu.edu.tw//handle/246246/275989
DOI: 10.6342/NTU201601311
Full text: http://ntur.lib.ntu.edu.tw/bitstream/246246/275989/1/ntu-105-R03944046-1.pdf (application/pdf, 6,280,630 bytes)
Public release date: 2018/8/2
License: the author agrees to grant a royalty-free license
Keywords: Flying Cameras; Ego-Positioning; Monocular Vision; Visual Positioning; Visual-Inertial Sensor Fusion

Abstract (translated from Chinese):
As flying cameras become increasingly popular, ego-positioning, one of the key technologies for ensuring their functionality and safety, is growing in importance. Monocular cameras and inertial measurement units (IMUs), owing to their low cost and light weight, are well suited to the ego-positioning of flying cameras. This thesis studies both visual positioning and visual-inertial sensor fusion, and proposes an ego-positioning approach for flying cameras that combines a monocular camera with an IMU. We evaluate three state-of-the-art monocular visual positioning methods, originally designed for vehicle localization, under different conditions, analyze the problems that may arise when applying them to flying-camera positioning, and discuss the applicable scenarios, advantages, and disadvantages of each. Considering the inherent limitations of visual positioning, we introduce a loosely-coupled sensor fusion method that combines visual and inertial measurements, and validate its effectiveness experimentally.

Abstract (English):
In this paper, a low-cost monocular camera and an inertial measurement unit (IMU) are combined for ego-positioning on flying cameras. We first survey state-of-the-art monocular visual positioning approaches, such as Simultaneous Localization and Mapping (SLAM) and Model-Based Localization (MBL). Three of the most representative methods, ORB-SLAM, LSD-SLAM, and MBL, originally designed for vehicles, are evaluated in different scenarios. Based on the experimental results, we analyze the pros and cons of each method. Considering the limitations of vision-only approaches, we fuse visual positioning with an inertial sensor in a loosely-coupled framework. The experimental results demonstrate the benefits of visual-inertial sensor fusion.
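
The record does not specify the thesis's actual fusion formulation. As an illustration only, a loosely-coupled visual-inertial scheme of the kind the abstract describes can be sketched as a small Kalman filter in which IMU acceleration drives the prediction step and each visual positioning result is treated as an independent position measurement in the update step. All state dimensions, noise values, and rates below are assumptions, not taken from the thesis.

```python
import numpy as np

# Hypothetical sketch of loosely-coupled visual-inertial fusion (1-D,
# constant-velocity model). Not the thesis's actual method: the IMU
# propagates the state; the visual fix corrects it.

def predict(x, P, accel, dt, q=0.1):
    """Propagate the [position, velocity] state with an IMU acceleration."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input vector
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)         # inflate covariance (process noise)
    return x, P

def update(x, P, z_pos, r=0.5):
    """Correct the state with a visual position measurement z_pos."""
    H = np.array([[1.0, 0.0]])              # the camera observes position only
    S = H @ P @ H.T + r                     # innovation covariance
    K = P @ H.T / S                         # Kalman gain (2x1)
    x = x + (K * (z_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

if __name__ == "__main__":
    x, P = np.zeros(2), np.eye(2)           # start at rest, uncertain
    for t in range(10):
        x, P = predict(x, P, accel=1.0, dt=0.1)   # IMU at 10 Hz (assumed)
        if t % 5 == 4:                            # visual fix at 2 Hz (assumed)
            x, P = update(x, P, z_pos=0.05 * (t + 1))
    print(f"fused position estimate: {x[0]:.3f}")
```

The "loose" coupling lies in the interface: the visual pipeline and the filter exchange only a finished position estimate, so either side can be swapped out independently, in contrast to tightly-coupled schemes that feed raw image features into the estimator.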