淡江大學覺生紀念圖書館 (TKU Library)


(Full-text download is available only via Tamkang University IP addresses)
System ID: U0002-0806200912304700
Thesis title (Chinese): 演化式物體影像追蹤與傾斜定位
Thesis title (English): Evolution-based object tracking and localization based on tilt photographing
University: Tamkang University
Department (Chinese): 電機工程學系碩士班
Department (English): Department of Electrical Engineering
Academic year: 97 (ROC calendar)
Semester: 2
Year of publication: 98 (ROC calendar, i.e. 2009)
Student name (Chinese): 鄭明育
Student name (English): Ming-Yu Cheng
Student ID: 696450690
Degree: Master's
Language: Chinese
Oral defense date: 2009-07-01
Number of pages: 68
Committee: Advisor: 許陳鑑; Members: 王偉彥, 盧明智, 周永山
Keywords (Chinese): 目標追蹤, 追蹤演算法, 影像式量測, CCD攝影機, 定位
Keywords (English): object tracking, evolutionary algorithm, image-based measurement, CCD camera, localization
Subject classification: Applied Sciences: Electrical and Electronics
Abstract (Chinese, translated): The purpose of robot localization is to obtain the coordinates of obstacles or targets so that paths can be planned for obstacle avoidance and tracking. To avoid the processing time that multi-sensor data fusion requires, and to further the anthropomorphic development of robots, this thesis adopts a CCD camera as the primary sensor and presents new results on both tracking algorithms and tilt-based localization. On the tracking side, because particle filter (PF) results can fail to track under adverse environmental conditions, the thesis proposes a hybrid tracking algorithm that uses particle swarm optimization (PSO) as the main search mechanism combined with the Nelder-Mead simplex method (NM). The hybrid algorithm overcomes NM's tendency to become trapped in local optima in images, and increases both the amount of data processed and the tracking success rate. On the localization side, the thesis describes in detail how, given the known mounting height of the CCD camera and the computed projection angle of each pixel, the spatial position of a target with known image coordinates can be determined, together with the underlying measurement principle. Combined with landmarks, the tilt angle itself can be self-measured, enabling the robot to localize under a variety of tilt angles.
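The PSO search that the hybrid tracker builds on can be sketched as a minimal loop in which each particle is a candidate window position scored by similarity to the target template. This is a generic textbook PSO, not the thesis's implementation: the score function, particle count, iteration budget, and the coefficients w, c1, c2 are illustrative assumptions.

```python
import random

def pso_track(score, bounds, n_particles=30, iters=20, w=0.7, c1=1.5, c2=1.5):
    """One frame of PSO-based tracking.

    score(x, y) rates how well the candidate window at (x, y) matches the
    target (e.g. a colour-histogram similarity); bounds = (xmax, ymax).
    Returns the best position found and its score.
    """
    xmax, ymax = bounds
    pos = [[random.uniform(0, xmax), random.uniform(0, ymax)]
           for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    pbest_val = [score(*p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                # inertia + pull toward personal best + pull toward swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], 0.0),
                                [xmax, ymax][d])  # clamp to the image
            v = score(*pos[i])
            if v > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v > gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val
```

In an actual tracker, `score` would typically compare a colour histogram of the window at (x, y) against the target's reference histogram, and the swarm would be re-seeded around the previous frame's estimate rather than uniformly over the image.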
Abstract (English): This thesis investigates two important issues in robot navigation: object tracking and localization. For object tracking, we propose a hybrid evolutionary algorithm that incorporates the NM simplex method and particle swarm optimization to improve tracking performance, in terms of processing speed and success rate, over particle filter (PF) tracking in complex environments. For robot localization, the thesis presents an image-based localization method built on tilt photographing with a single CCD camera. Images captured by the camera are pre-processed to locate the target object in the picture as a pixel-count deviation from the image center. Using a formula established from the relationship between the camera's tilt angle and distance, the coordinates of the target object can then be calculated.
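The tilt-localization principle summarized in the abstracts (known camera height plus a per-pixel ray angle) can be sketched with basic trigonometry. This is a simplified pinhole model under an equal-angle-per-pixel assumption; the field-of-view values, image size, and function name are hypothetical illustrations, not the thesis's calibration.

```python
import math

def ground_point(px, py, cam_height, tilt_deg,
                 fov_h_deg=60.0, fov_v_deg=45.0, img_w=640, img_h=480):
    """Map an image pixel to ground-plane coordinates for a tilted camera.

    tilt_deg is the depression of the optical axis below horizontal; each
    pixel is assumed to subtend an equal angle. Returns (x, y) on the floor
    with the camera's foot at the origin: y forward, x to the right.
    """
    # angular offset of the pixel from the image centre
    phi = (py - img_h / 2) * math.radians(fov_v_deg) / img_h   # + = below centre
    psi = (px - img_w / 2) * math.radians(fov_h_deg) / img_w   # + = right of centre
    beta = math.radians(tilt_deg) + phi    # total ray depression below horizontal
    y = cam_height / math.tan(beta)        # forward distance along the floor
    r = cam_height / math.sin(beta)        # slant range to the ground point
    x = r * math.tan(psi)                  # lateral offset of the ground point
    return x, y
```

For example, `ground_point(320, 240, 1.0, 45.0)` maps the image centre, for a camera mounted 1 m high and tilted 45 degrees down, to the floor point 1 m directly ahead.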
Table of Contents
Abstract (Chinese) ... I
Abstract (English) ... II
Table of Contents ... III
List of Figures ... VI
List of Tables ... IX
Chapter 1 Introduction ... 1
1.1 Background and Motivation ... 1
1.2 Objectives and Methods ... 3
1.3 Thesis Organization ... 4
Chapter 2 Tracking with PF and PSO ... 5
2.1 PF-Based Image Tracking ... 5
2.2 PSO-Based Image Tracking ... 7
2.3 Comparison of PF and PSO in Dynamic Image Tracking ... 12
Chapter 3 Hybrid Tracking Algorithm ... 20
3.1 NM Tracking on Static Images ... 20
3.2 Hybrid Tracking Algorithm Combining PSO and NM ... 23
3.3 Improved Hybrid Tracking Algorithm ... 28
3.3.1 Performance Verification of the Improved Hybrid Algorithm ... 31
3.4 Experimental Environment and Image Features ... 34
3.4.1 Experimental Environment for the Tracking Algorithms ... 34
3.4.2 Image Features ... 37
3.5 Experimental Results and Discussion of the Tracking Algorithms ... 38
Chapter 4 Localization with a Single CCD Camera under Tilt Photographing ... 42
4.1 Localization on a Parallel Plane ... 42
4.2 Localization under Tilt Photographing ... 44
4.2.1 Fundamentals of Tilt Localization ... 46
4.2.2 x-axis and y-axis Localization ... 48
4.3 Laser-Assisted Tilt-Angle Measurement ... 51
4.4 Robot Localization Procedure ... 53
4.5 Localization Measurement Results and Discussion ... 55
4.5.1 Coordinate Localization Results ... 57
4.5.2 Distance Measurement Results ... 58
Chapter 5 Conclusions and Future Work ... 61
5.1 Conclusions ... 61
5.2 Future Work ... 62
References ... 64
List of Figures
Fig. 2.1 Flowchart of particle filter tracking ... 5
Fig. 2.2 Flowchart of PSO image tracking ... 8
Fig. 2.3 Software interface for static tracking ... 10
Fig. 2.4 Representation of PSO tracking results ... 11
Fig. 2.5 PSO R tracking ... 12
Fig. 2.6 Environment for PF tracking under insufficient lighting ... 13
Fig. 2.7 PF tracking results under insufficient lighting ... 13
Fig. 2.8 Environment for PF tracking in the simulated environment ... 14
Fig. 2.9 PF tracking results in the simulated environment ... 14
Fig. 2.10 Environment for PF tracking at different moving speeds ... 15
Fig. 2.11 PF tracking results with a motion period of 10.65 s ... 15
Fig. 2.12 PF tracking results with a motion period of 5.35 s ... 16
Fig. 2.13 PF tracking results with a motion period of 3.52 s ... 16
Fig. 2.14 PSO tracking results in the simulated real-world environment ... 17
Fig. 2.15 PSO tracking results with a motion period of 5.89 s ... 17
Fig. 2.16 PSO tracking results with a motion period of 3.16 s ... 18
Fig. 3.1 Flowchart of the NM tracking algorithm ... 21
Fig. 3.2 NM static tracking results (capture failed) ... 22
Fig. 3.3 NM static tracking results (capture succeeded) ... 22
Fig. 3.4 Flowchart of tracking combining PSO and NM ... 23
Fig. 3.5 Particle generation range for NM ... 24
Fig. 3.6 Frames successfully tracked by the hybrid algorithm on a white background ... 27
Fig. 3.7 Success rate of the hybrid algorithm on a white background ... 27
Fig. 3.8 NM selecting the best 3 of 50 particles ... 29
Fig. 3.9 Flowchart of PSO+NM(50->3) tracking ... 29
Fig. 3.10 Flowchart of PSO+NM(50->3)+PSO R tracking ... 30
Fig. 3.11 Frames successfully tracked by the improved hybrid algorithm on a white background ... 33
Fig. 3.12 Success rate of the improved hybrid algorithm on a white background ... 33
Fig. 3.13 Experimental environment for dynamic tracking ... 34
Fig. 3.14 Software interface for object motion ... 35
Fig. 3.15 Object motion trajectory ... 35
Fig. 3.16 Software interface for dynamic tracking ... 36
Fig. 3.17 Frames successfully tracked in the overall comparison of tracking algorithms ... 41
Fig. 3.18 Success rates in the overall comparison of tracking algorithms ... 41
Fig. 4.1 Optical geometry model of the imaging system ... 42
Fig. 4.2 Localization on a parallel plane ... 44
Fig. 4.3 CCD sensor distances under tilt photographing ... 45
Fig. 4.4 Geometry for the derivation ... 46
Fig. 4.5 Spatial projection of pixels along the y-axis ... 48
Fig. 4.6 Pixel projection space under tilt photographing ... 50
Fig. 4.7 Localization under tilt photographing ... 51
Fig. 4.8 Tilt-angle measurement ... 52
Fig. 4.9 Flowchart of robot localization ... 53
Fig. 4.10 Robot localization ... 54
Fig. 4.11 Flowchart of omnidirectional robot localization ... 55
Fig. 4.12 Measuring the tilt angle from the distance value p ... 56
Fig. 4.13 Coordinate localization ... 57
Fig. 4.14 Distance measurement between two points ... 59
Fig. 5.1 Improved feature-analysis range ... 63
List of Tables
Table 2.1 PSO static tracking performance ... 11
Table 2.2 Comparison of PF and PSO tracking results ... 19
Table 3.1 Hybrid algorithm tracking on a white background, period = 10.28 s ... 25
Table 3.2 Hybrid algorithm tracking on a white background, period = 5.89 s ... 25
Table 3.3 Hybrid algorithm tracking on a white background, period = 3.16 s ... 26
Table 3.4 Hybrid algorithm tracking on a white background, period = 2.27 s ... 26
Table 3.5 Improved hybrid algorithm tracking on a white background, period = 9.63 s ... 31
Table 3.6 Improved hybrid algorithm tracking on a white background, period = 5.25 s ... 32
Table 3.7 Improved hybrid algorithm tracking on a white background, period = 3.5 s ... 32
Table 3.8 Overall comparison of tracking algorithms, period = 9.77 s ... 38
Table 3.9 Overall comparison of tracking algorithms, period = 5.33 s ... 38
Table 3.10 Overall comparison of tracking algorithms, period = 3.56 s ... 39
Table 3.11 Overall comparison of tracking algorithms, period = 2.68 s ... 39
Table 3.12 Overall comparison of tracking algorithms, period = 1.79 s ... 40
Table 4.1 Coordinate localization results ... 58
Table 4.2 Length measurements ... 59
Table 4.3 Width measurements ... 60
Table 4.4 Distance measurements between two points ... 60

References:
[1] C.C. Tsai and C.J. Wu, “Localization of an Autonomous Mobile Robot Based on Ultrasonic Sensory Information,” Journal of Intelligent and Robotic Systems, Vol. 30, No. 3, pp. 267-277, 2001.
[2] H.H. Lin, C.C. Tsai, J.C. Hsu and C.F. Chang, “Ultrasonic self-localization and pose tracking of an autonomous mobile robot via fuzzy adaptive extended information filtering,” IEEE International Conference on Robotics and Automation, Taipei, Taiwan, 2003, Vol. 1, pp. 1283-1290.
[3] P. Krammer and H. Schweinzer, “Localization of object edges in arbitrary spatial positions based on ultrasonic data,” IEEE Sensors Journal, Vol. 6, No. 1, pp. 203-210, 2006.
[4] J. Canou, C. Novales, G. Poisson and P. Marche, “Quick primitives extraction using inertia matrix on measures issue from an ultrasonic network,” IEEE International Conference on Robotics and Automation, Seoul, Korea, 2001, Vol. 4, pp. 3999-4004.
[5] E. Menegatti, A. Pretto, A. Scarpa and E. Pagello, “Omnidirectional vision scan matching for robot localization in dynamic environments,” IEEE Transactions on Robotics, Vol. 22, No. 3, pp. 523-535, 2006.
[6] T. Kanade, H. Kano and S. Kimura, “Development of a Video-Rate Stereo Machine,” Conference on Intelligent Robots and Systems, Pittsburgh, USA, August 1995, Vol. 3, pp. 95-100.
[7] Y. Tanaka, A. Gofuku, I. Nagai and A. Mohamed, “Development of a Compact Video-rate Range Finder and Its Application,” Conference on Advanced Mechatronics, Okayama, Japan, August 1998, pp. 97-102.
[8] M.C. Lu, W.Y. Wang and H.H. Lian, “Image-Based Height Measuring System for Liquid or Particles in Tanks,” IEEE International Conference on Networking, Sensing and Control, 2004, Vol. 1, pp. 24-29.
[9] M.C. Lu, W.Y. Wang and C.Y. Chu, “Optical-Based Distance Measuring System (ODMS),” The Eighth International Conference on Automation Technology, Taichung, 2005, pp. 282-285.
[10] M.C. Lu, W.Y. Wang and C.Y. Chu, “Image-Based Distance and Area Measuring System,” IEEE Sensors Journal, Vol. 6, No. 2, pp. 495-503, 2006.
[11] M. Betke and L. Gurvits, “Mobile robot localization using landmarks,” IEEE Transactions on Robotics and Automation, Vol. 13, No. 2, pp. 251-263, 1997.
[12] M. Piasecki, “Global localization for mobile robots by multiple hypothesis tracking,” Robotics and Autonomous Systems, Vol. 16, No. 1, pp. 93-104, 1995.
[13] C. Cauchois, E. Brassart, B. Marhic and C. Drocourt, “An absolute localization method using a synthetic panoramic image base,” Proceedings of the Third IEEE Workshop on Omnidirectional Vision, Amiens, France, 2002, pp. 128-135.
[14] D. Cobzas, H. Zhang and M. Jagersand, “Image-based localization with depth-enhanced image map,” IEEE International Conference on Robotics and Automation, Taipei, Taiwan, 2003, Vol. 2, pp. 1570-1575.
[15] G.C. Karras, D.J. Panagou and K.J. Kyriakopoulos, “Target-referenced Localization of an Underwater Vehicle using a Laser-based Vision System,” OCEANS 2006, Boston, pp. 1-6.
[16] L. Li-Chun, L. Tsong-Li, P. Hsien-Huang Wu and W. Chia-Ju, “Self-Localization of Mobile Robots Based on Visual Information,” IEEE Conference on Industrial Electronics and Applications, Singapore, 2006, pp. 1-6.
[17] J.C. Aparicio Fernandes and J.A.B. Campos Neves, “Angle Invariance for Distance Measurements Using a Single Camera,” IEEE International Symposium on Industrial Electronics, 2006, Vol. 1, pp. 676-680.
[18] H.G. Nguyen and J.Y. Laisne, “Obstacle detection using bi-spectrum CCD camera and image processing,” Intelligent Vehicles '92 Symposium, 1992, pp. 42-50.
[19] N. Yonemoto, K. Yamamoto, K. Yamada, H. Yasui, N. Tanaka, C. Migliaccio, J.Y. Dauvignac and C. Pichot, “Performance of obstacle detection and collision warning system for civil helicopters,” Enhanced and Synthetic Vision 2006, Vol. 6226, pp. 622608.1-622608.8, 2006.
[20] D. Comaniciu, V. Ramesh and P. Meer, “Real-time tracking of non-rigid objects using mean shift,” Conference on Computer Vision and Pattern Recognition, 2000, Vol. 2, pp. 142-149.
[21] K. Nummiaro, E. Koller-Meier and L. Van Gool, “An Adaptive Color-Based Particle Filter,” Image and Vision Computing, Vol. 21, pp. 99-110, 2003.
[22] M.G.S. Bruno, “Mixed-state particle filters for multiaspect target tracking in image sequences,” Conference on Acoustics, Speech and Signal Processing, 2003, Vol. 5, pp. V-165-8.
[23] C. Yang, R. Duraiswami and L. Davis, “Fast multiple object tracking via a hierarchical particle filter,” Conference on Computer Vision, 2005, Vol. 1, pp. 212-219.
[24] L.F. Kong and P.L. Wu, “Accurate particle filter tracking with SIFT keypoints,” Journal of Harbin Institute of Technology (New Series), Vol. 15, pp. 209-213, 2008.
[25] X. Zhang, W. Hu, S. Maybank, X. Li and M. Zhu, “Sequential Particle Swarm Optimization for Visual Tracking,” Conference on Computer Vision and Pattern Recognition, Anchorage, AK, 2008, pp. 1-8.
[26] P. Saisan, S. Medasani and Y. Owechko, “Multi-View Classifier Swarms for Pedestrian Detection and Tracking,” Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 2005, p. 18.
[27] Y. Zheng and Y. Meng, “The PSO-Based Adaptive Window for People Tracking,” Computational Intelligence in Security and Defense Applications, 2007, pp. 23-29.
[28] H.G. Sun, Y.X. Pan and Y.F. Zhang, “APSO Based Gabor Wavelet Feature Extraction Method,” Conference on Machine Learning and Cybernetics, 2004, Vol. 6, pp. 3888-3893.
[29] T. Kobayashi, K. Nakagawa, J. Imae and G. Zhai, “Real Time Object Tracking on Video Image Sequence Using Particle Swarm Optimization,” International Conference on Control, Automation and Systems, 2007, pp. 1773-1778.
Thesis use permissions
  • The author consents to royalty-free reproduction of the print copy by library readers for academic purposes, publicly available from 2012-07-16.
  • The author consents to browsing/printing of the electronic full text, publicly available from 2012-07-16.

