§ Browsing ETD Metadata
  
System No. U0002-2501201615464100
Title (in Chinese) 運用快速偵測法於視覺里程計
Title (in English) Fast detection algorithms for visual odometry
Other Title
Institution Tamkang University (淡江大學)
Department (in Chinese) 機械與機電工程學系碩士班
Department (in English) Department of Mechanical and Electro-Mechanical Engineering
Other Division
Other Division Name
Other Department/Institution
Academic Year 104
Semester 1
Publication Year 105
Author's name (in Chinese) 姚琮獻
Author's name (in English) Tsung-Hsien Yao
Student ID 603370015
Degree Master's (碩士)
Language Traditional Chinese
Other Language
Date of Oral Defense 2016-01-13
Pagination 53 pages
Committee Member advisor - Chung-Hsun Sun
co-chair - Y-J Chen
co-chair - Chan-Yun Yang
Keyword (in Chinese) 視覺里程計
直立式加速強健特徵
參數化三點透視
隨機化隨機取樣一致
循序跳越式盒子濾波器
Keyword (in English) visual odometry (VO)
upright speeded-up robust features (U-SURF)
parameterized perspective-three-point (parameterized P3P)
randomized random sample consensus (R-RANSAC)
ordinal skip box filter
Other Keywords
Subject
Abstract (in Chinese)
本論文針對視覺里程計(Visual Odometry, VO)運用幾個演算法對即時性進行改善。首先利用雙眼視覺系統擷取影像,影像使用直立式加速強健特徵(Upright Speeded-Up Robust Features, U-SURF)偵測地標點。加入循序跳越式盒子濾波器(Ordinal Skip Box Filter)以減少在U-SURF中對影像卷積的次數與計算時間。之後利用參數化三點透視(Parameterized Perspective-Three-Point)演算法反推視覺系統可能位置。再者利用隨機化隨機取樣一致(Randomized Random Sample Consensus, R-RANSAC)演算法以剔除其他錯誤位置解。最後以地面基準實驗證明改善效果。
Abstract (in English)
In this paper, several algorithms are applied to improve the real-time performance of a visual odometry system. First, a binocular stereo camera captures images, and the upright speeded-up robust features (U-SURF) algorithm detects landmark features in them. An ordinal skip box filter is added to U-SURF to reduce the number of image convolutions and thus the computation time. The possible camera positions are then recovered by the parameterized perspective-three-point (parameterized P3P) algorithm. Next, the randomized random sample consensus (R-RANSAC) algorithm eliminates the erroneous position solutions. Finally, the improvement is demonstrated in ground-truth experiments.
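The box filters in U-SURF owe their speed to the integral image (summed-area table): once it is computed, any rectangular sum costs at most four array lookups regardless of the box size, which is what makes filtering at many scales cheap. A minimal Python sketch of that idea (illustrative only; the thesis's own implementation is not shown in this record):

```python
import numpy as np

def integral_image(img):
    """Summed-area table: ii[y, x] = sum of img[:y+1, :x+1]."""
    return np.cumsum(np.cumsum(img, axis=0), axis=1)

def box_sum(ii, y0, x0, y1, x1):
    """Sum of img[y0:y1, x0:x1] in O(1) via four lookups.
    (y1, x1) are exclusive bounds."""
    total = ii[y1 - 1, x1 - 1]
    if y0 > 0:
        total -= ii[y0 - 1, x1 - 1]
    if x0 > 0:
        total -= ii[y1 - 1, x0 - 1]
    if y0 > 0 and x0 > 0:
        total += ii[y0 - 1, x0 - 1]
    return total

img = np.arange(16, dtype=float).reshape(4, 4)
ii = integral_image(img)
assert box_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()  # 30.0
```

Because the cost is independent of the box size, skipping octaves or filter scales (as the ordinal skip box filter does) removes whole convolution passes rather than merely shrinking them.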
Other Abstract
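The R-RANSAC step in the abstract refines standard RANSAC with a randomized pre-test [10]; the consensus loop both variants share can be sketched in Python on a toy 1-D problem. The `fit` and `error` functions here are illustrative placeholders, not the parameterized-P3P model used in the thesis:

```python
import random

def ransac(data, fit, error, n_sample, threshold, iterations, rng=None):
    """Generic RANSAC: repeatedly fit a model to a random minimal sample
    and keep the model with the largest inlier (consensus) set."""
    rng = rng or random.Random(0)
    best_model, best_inliers = None, []
    for _ in range(iterations):
        sample = rng.sample(data, n_sample)
        model = fit(sample)
        inliers = [d for d in data if error(model, d) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers

# Toy use: estimate a constant from noisy data with two gross outliers.
points = [1.0, 1.1, 0.9, 1.05, 0.95, 10.0, -8.0]
fit = lambda s: sum(s) / len(s)      # model = mean of the minimal sample
error = lambda m, d: abs(m - d)
model, inliers = ransac(points, fit, error, n_sample=2,
                        threshold=0.5, iterations=50)
assert len(inliers) == 5             # the two outliers are rejected
```

In the thesis's setting, the minimal sample is three 2-D/3-D landmark correspondences, `fit` is parameterized P3P, and `error` is the landmark reprojection error against a projection threshold; the randomized pre-test additionally scores each candidate on a small data subset first to discard bad hypotheses cheaply.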
Table of Contents (with Page Numbers)
Chinese Abstract	I
English Abstract	II
Table of Contents	III
List of Figures	V
List of Tables	VII
Chapter 1 Introduction	1
1.1	Research Purpose and Motivation	1
1.2	Literature Review	2
1.3	Research Scope	3
Chapter 2 System Flow and Architecture	4
2.1	Visual Odometry	4
2.2	Main Improvement Procedure	5
Chapter 3 Visual Feature Construction and Landmark Management	7
3.1	The U-SURF Algorithm	8
3.2	Ordinal Skip Box Filter	12
3.3	Experimental Verification	14
3.4	Binocular Stereo Vision	16
3.5	Averaged Coordinates	21
3.6	Landmark Management	22
Chapter 4 Camera Localization and Its Improvement	25
4.1	Parameterized P3P	25
4.2	R-RANSAC	32
4.3	Parameterized P3P within R-RANSAC	35
4.4	Image-Plane Constraint and Landmark Projection Threshold	36
4.5	Experimental Verification	37
4.6	Solving the Camera Position with R-RANSAC	38
Chapter 5 Experiments and Hardware Architecture	42
5.1	Hardware Equipment	42
5.2	Comparative Experiments on Visual Odometry Speed-Up	43
5.3	Ground-Truth Experiments with Pixel and World Coordinates	44
5.4	Ground-Truth Experiments of Visual Odometry in an Unknown Environment	46
Chapter 6 Conclusions and Future Work	51
6.1	Conclusions	51
6.2	Future Work	51
References	52

List of Figures
Figure 2.1 Schematic of visual odometry	4
Figure 2.2 Flowchart of visual odometry	5
Figure 2.3 Flowchart of the main improvements	6
Figure 3.1 Visual feature construction and landmark management flow	7
Figure 3.2 Schematic of the integral image	9
Figure 3.3 Region sum via the integral image	10
Figure 3.4 Box filter [3]	10
Figure 3.5 Varying the filter size [3]	11
Figure 3.6 Relation between octave level and box filter size [3]	11
Figure 3.7 Haar wavelet filters [3]	12
Figure 3.8 16-dimensional descriptor vector	12
Figure 3.9 Ordinal-skip pseudocode	14
Figure 3.10 Ordinal-skip feature points	15
Figure 3.11 Determinant greater than zero (left), less than zero (right)	16
Figure 3.12 Binocular camera sight-line vectors	21
Figure 3.13 Coordinate systems	21
Figure 3.14 Landmark clustering	23
Figure 4.1 Camera and landmark vectors in parameterized P3P [9]	26
Figure 4.2 Camera and landmark coordinates [9]	27
Figure 4.3 Half-plane π relations [9]	28
Figure 4.4 Rotating the π plane about nx by angle θ	29
Figure 4.5 Experiment schematic	41
Figure 5.1 Experimental hardware architecture	42
Figure 5.2 Binocular vision sensor	43
Figure 5.3 Experimental scene	44
Figure 5.4 MFC user interface	45
Figure 5.5 Experimental path under the world-coordinate scheme	45
Figure 5.6 Experimental path under the pixel-coordinate scheme	46
Figure 5.7 Experimental environment	47
Figure 5.8 Movement path in MFC	47
Figure 5.9 Position at each step	48
Figure 5.10 Current error at each step	49
Figure 5.11 Accumulated error percentage	49

List of Tables
Table 3.1 Comparison of SURF, U-SURF, and the ordinal skip box filter	16
Table 3.2 Left camera intrinsic parameters	17
Table 3.3 Right camera intrinsic parameters	18
Table 3.4 Averaged sight-line vectors	22
Table 4.1 Comparison of conventional P3P and parameterized P3P	32
Table 4.2 Contamination-rate probability table	33
Table 4.3 Contamination-rate probability expressions	34
Table 4.4 Image-plane constraint and preliminary projected points	37
Table 4.5 Comparison of R-RANSAC and RANSAC	38
Table 4.6 Data comparison of R-RANSAC and RANSAC	38
Table 4.7 R-RANSAC maximum consensus-set performance	39
Table 4.8 Error table of the R-RANSAC scheme	41
Table 5.1 PC specifications	42
Table 5.2 Forward-looking monocular camera specifications	43
Table 5.3 Detection speed-up experiment data	44
Table 5.4 Camera-solving speed-up experiment data	44
Table 5.5 Detection speed-up experiment data	44
Table 5.6 Pixel-coordinate vs. world-coordinate error table	46
Table 5.7 Ground-truth error table	48
Table 5.8 Detailed per-step data	50
References
[1]	C.T. Chi, Y.T. Wang, S.T. Cheng, and C.A. Shen, “Robot Simultaneous Localization and Mapping Using a Calibrated RGB-D Sensor,” Sensors and Materials, vol. 26, no. 5, pp. 353-364, 2014.
[2]	J.Y. Bouguet, Camera Calibration Toolbox for Matlab. Available: http://www.vision.caltech.edu/bouguetj/calib_doc/
[3]	H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, “SURF: Speeded-Up Robust Features,” Computer Vision and Image Understanding, vol. 110, no. 3, pp. 346-359, 2008.
[4]	Y.T. Wang and G.Y. Lin, “Improvement of Speeded-Up Robust Features for Robot Visual Simultaneous Localization and Mapping,” Robotica, vol. 32, no. 4, pp. 533-549, 2014.
[5]	H. Park, H. Mitsumine, and M. Fujii, “Fast Detection of Robust Features by Reducing the Number of Box Filtering in SURF,” IEICE Transactions on Information and Systems, vol. 94, no. 3, pp. 725-728, 2011.
[6]	Y.T. Wang, C.H. Sun, and M.J. Chiou, “Detection of Moving Objects in Image Plane for Robot Navigation Using Monocular Vision,” EURASIP Journal on Advances in Signal Processing, vol. 2012, pp. 1-22, 2012.
[7]	S. Hutchinson, G.D. Hager, and P.I. Corke, “A Tutorial on Visual Servo Control,” IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651-670, 1996.
[8]	M.A. Fischler and R.C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Communications of the ACM, vol. 24, pp. 381-395, 1981.
[9]	L. Kneip, D. Scaramuzza, and R. Siegwart, “A Novel Parametrization of the Perspective-Three-Point Problem for a Direct Computation of Absolute Camera Position and Orientation,” IEEE Conference on Computer Vision and Pattern Recognition, pp. 2969-2976, 2011.
[10]	J. Matas and O. Chum, “Randomized RANSAC with Td,d Test,” Image and Vision Computing, vol. 22, no. 10, pp. 837-842, 2004.
Terms of Use
Within Campus
I request to embargo my thesis/dissertation for 5 year(s) from the date I submit my Authorization Approval Form.
Release delayed by 5 years.
Outside the Campus
I grant authorization for the public to view/print my electronic full text for a royalty fee, and I donate the fee to my school library as a development fund.
Release delayed by 5 years.
