Visual-Inertial Ego-Positioning for Flying Cameras
Date Issued
2016
Author(s)
Liang, Qiao
Abstract
In this paper, a low-cost monocular camera and an inertial measurement unit (IMU) are combined for ego-positioning of flying cameras. We first survey state-of-the-art monocular visual positioning approaches, such as Simultaneous Localization and Mapping (SLAM) and Model-Based Localization (MBL). Three of the most representative methods, ORB-SLAM, LSD-SLAM, and MBL, which were originally designed for vehicles, are evaluated in different scenarios. Based on the experimental results, we analyze the pros and cons of each method. Considering the limitations of vision-only approaches, we fuse the visual positioning with an inertial sensor in a loosely-coupled framework. The experimental results demonstrate the benefits of visual-inertial sensor fusion.
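The sketch below is a minimal illustration of the loosely-coupled fusion idea mentioned in the abstract, not the thesis implementation: a linear Kalman filter whose state is position and velocity is propagated at IMU rate using gravity-compensated acceleration and corrected whenever the visual front end (e.g. a monocular SLAM system) reports a position fix. The class name, noise parameters, and interface are hypothetical and chosen only for illustration.

```python
import numpy as np

class LooseVIFusion:
    """Illustrative loosely-coupled visual-inertial fusion (assumed design, not from the thesis)."""

    def __init__(self, accel_noise=0.5, vis_noise=0.05):
        self.x = np.zeros(6)                # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)                  # state covariance
        self.q = accel_noise ** 2           # assumed IMU acceleration noise variance
        self.R = np.eye(3) * vis_noise ** 2 # assumed visual position noise covariance

    def predict(self, accel, dt):
        """Propagate the state with one gravity-compensated IMU acceleration sample."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        B = np.vstack([np.eye(3) * 0.5 * dt ** 2, np.eye(3) * dt])
        self.x = F @ self.x + B @ accel
        self.P = F @ self.P @ F.T + B @ B.T * self.q

    def update(self, vis_pos):
        """Correct the state with a position measurement from the visual front end."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        y = vis_pos - H @ self.x                      # innovation
        S = H @ self.P @ H.T + self.R                 # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

# Usage: call predict() at IMU rate, update() whenever a visual pose arrives.
fusion = LooseVIFusion()
fusion.predict(np.array([0.0, 0.0, 0.0]), dt=0.005)
fusion.update(np.array([0.10, 0.02, 1.50]))
print(fusion.x[:3])  # fused position estimate
```

In a loosely-coupled scheme such as this, the visual module and the IMU are treated as independent black-box sources, which keeps the filter simple but discards the raw feature measurements that a tightly-coupled formulation would exploit.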
Subjects
Flying Cameras
Ego-Positioning
Monocular Vision
Visual Positioning
Visual-Inertial Sensor Fusion
Type
thesis
File(s)
Name
ntu-105-R03944046-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5): 8d06e0d99036949ca7c29ab73a4d41cb