3D Localization and Mapping Using One 2D LIDAR
運用單具二維LIDAR在立體環境中定位與建地圖

Author: Dopfer, Andreas (竇菲)
Advisor: Wang, Chieh-Chih (王傑智)
Institution: National Taiwan University, Graduate Institute of Computer Science and Information Engineering (臺灣大學:資訊工程學研究所)
Year: 2009
Record dates: 2010-05-17; 2018-07-05
Identifier: U0001-1008200918081900
Handle: http://ntur.lib.ntu.edu.tw//handle/246246/183406
Full text: http://ntur.lib.ntu.edu.tw/bitstream/246246/183406/1/ntu-98-R96922144-1.pdf (application/pdf, 18713904 bytes)
Language: en-US
Type: thesis
Keywords: 3D mapping (建立三維空間); LIDAR

Abstract (translated from Chinese):
In this thesis we propose a method that allows a robot to localize and to build three-dimensional maps using only a single 2D LIDAR. We achieve this by combining prior knowledge about the environment with the robot's own motion. The main result is a method that captures 3D spatial information while providing localization performance comparable to that of a horizontally mounted 2D LIDAR.

Abstract:
Much work on localization and mapping using LIDAR has been done in mobile robotics. While earlier work was done only in the two-dimensional domain, a recent shift towards three-dimensional localization and mapping using laser rangefinders can be seen. Three-dimensional representations allow more accurate modeling of the real world, enabling more sophisticated path planning and better obstacle avoidance. Localization performance can also be improved, and three-dimensional data allows better object recognition than 2D data.

Techniques for capturing 3D data involve either multiple 2D LIDARs, one 2D LIDAR that is nodded or rotated by an external actuator together with highly accurate orientation sensing and synchronization, or an integrated, expensive 3D scanning system. In this thesis we propose a technique to capture 3D data using only one 2D LIDAR. To do so, the robot's motion is utilized together with reasonable assumptions: the ground the robot is moving on is flat and visible in the scan, the sensor's height is known, and the environment has vertical structures.

First, an initial calibration procedure using a camera together with the LIDAR is performed to recover the extrinsic parameters between the robot and the sensor. The localization problem is then divided into two steps. The LIDAR's sensing plane is tilted away from the robot's direction of motion towards the floor (or another known flat structure in the environment). Detecting the floor allows the angular orientation of the sensor to be estimated in two dimensions (pitch and roll). Using these estimates, the range data can be transformed so that known methods can be adopted to estimate the missing parameters of the full LIDAR pose. Accurately estimating the three-dimensional displacement between consecutive scans then allows an accurate three-dimensional map of the environment to be built.
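The abstract's central geometric step, recovering the sensor's pitch and roll from the ground line seen in a tilted scan and then projecting the scan into a gravity-aligned frame for 2D scan matching, can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's actual implementation: it assumes the sensor's world orientation factors as Ry(pitch)·Rx(roll), that ground_points are scan points already classified as lying on the floor, and that the sensor height is known, as the abstract states. The function names are illustrative.

```python
import numpy as np

def pitch_roll_from_ground_line(ground_points, sensor_height):
    """Estimate sensor pitch and roll from scan points on the (flat) floor.

    A point (x, y, 0) in the tilted sensing plane has world height
        z = -sin(pitch)*x + sin(roll)*cos(pitch)*y + sensor_height,
    so floor points (z = 0) satisfy the line equation a*x + b*y = h with
        a = sin(pitch),  b = -sin(roll)*cos(pitch),  h = sensor_height.
    """
    A = np.asarray(ground_points, dtype=float)      # (N, 2) rows of (x, y)
    h = np.full(len(A), float(sensor_height))       # right-hand side of the line equation
    (a, b), *_ = np.linalg.lstsq(A, h, rcond=None)  # least-squares line fit
    pitch = np.arcsin(np.clip(a, -1.0, 1.0))
    roll = np.arcsin(np.clip(-b / np.cos(pitch), -1.0, 1.0))
    return pitch, roll

def project_scan(scan_xy, pitch, roll):
    """Rotate a tilted 2D scan into a gravity-aligned frame and project it
    onto the horizontal plane, so standard 2D scan matching (e.g. ICP) can
    recover the remaining pose parameters (x, y, yaw)."""
    pts = np.column_stack([scan_xy, np.zeros(len(scan_xy))])   # lift to 3D
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    world = pts @ (Ry @ Rx).T   # sensor frame -> gravity-aligned frame
    return world[:, :2]         # keep the horizontal coordinates
```

With the scan projected to the horizontal plane, a conventional 2D scan matcher such as ICP (Chapter 3.3 in the table of contents below) can estimate the displacement between consecutive scans.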
Table of Contents

ABSTRACT
LIST OF FIGURES
CHAPTER 1. Introduction
  1.1. Localization
  1.2. Mapping
  1.3. Thesis Objective
  1.4. Overview
CHAPTER 2. Related Work
  2.1. Mapping
  2.2. Mapping in 3D
CHAPTER 3. Foundations
  3.1. LIDAR
  3.2. Raw Scan Processing
    3.2.1. Linear Least Square Line Fitting
    3.2.2. Line Splitting
  3.3. Iterative Closest Point (ICP) Algorithm
    3.3.1. Improvements over the Naive Implementation
  3.4. Clustering
  3.5. LIDAR Calibration
    3.5.1. Extrinsic Camera-LIDAR Calibration
    3.5.2. Calibrating the LIDAR's Position
CHAPTER 4. 3D Localization and Mapping Using One 2D LIDAR
  4.1. Assumptions
  4.2. Estimating the Orientation
    4.2.1. Identifying the Ground Line
    4.2.2. Selection Issues
    4.2.3. Calculating Pitch and Roll
  4.3. Estimating the Robot Motion
    4.3.1. Scan Projection
    4.3.2. Scan Matching Challenges
    4.3.3. Constant Velocity Motion Model
    4.3.4. Sampling-based Approach
  4.4. Uncertainty Modeling
    4.4.1. Uncertainty Comparison
CHAPTER 5. Experimental Results
  5.1. Hardware
    5.1.1. Extrinsic Camera-LIDAR Calibration
    5.1.2. LIDAR Calibration
  5.2. Software Implementation
  5.3. 3D Map of the CSIE Building: Fourth Floor
  5.4. 3D Map of the CSIE Building: First Floor
    5.4.1. Performance
    5.4.2. Issues
  5.5. Gridmaps
CHAPTER 6. Conclusion
  6.1. Conclusion
  6.2. Future Work
BIBLIOGRAPHY