Tamkang University Chueh Sheng Memorial Library (TKU Library)


System ID U0002-2707201123023200
Title (Chinese) 以影像為基礎之三維空間定位及其在SOPC之實現
Title (English) Image-Based 3-Dimensional Localization and Its Realization on SOPC
University Tamkang University
Department (Chinese) 電機工程學系碩士班
Department (English) Department of Electrical Engineering
Academic year 99 (ROC)
Semester 2
Publication year 100 (ROC; 2011)
Student name (Chinese) 賴古梵
Student name (English) Gu-Fan Lai
Student ID 698470084
Degree Master's
Language Chinese
Oral defense date 2011-07-05
Pages 81
Committee Advisor: 易志孝
Member: 王偉彥
Member: 許陳鑑
Member: 周永山
Member: 李世安
Keywords (Chinese) 影像式量測, 三維空間定位, CCD攝影機, 高度量測, 軟硬體協同設計, FPGA, SOPC
Keywords (English) image-based measurement, 3-D localization, CCD images, height measurement, HW/SW co-design, FPGA, SOPC
Subject classification Applied Science: Electrical and Electronics
Abstract (Chinese) This thesis proposes an image-based measurement method for obtaining the localization information of a target object on an oblique plane, including the object's distance, the inclination angle, and the object's vertical height. Using a single digital camera, the pixel counts of the target at different photographing distances are used to derive relations for the distance, inclination angle, and vertical height of an object on the oblique plane, completing 3-D localization of the target. The method is then realized on an SOPC through a hardware/software co-design approach on the Altera DE2-70 platform. Finally, actual localization experiments verify the accuracy and practicality of the proposed measurement method.
Abstract (English) This thesis presents an image-based system for measuring target objects on an oblique plane, based on the pixel variation of CCD images with reference to two arbitrarily designated points in the image frames. From an established relationship between the camera's displacement along the photographing direction and the difference in pixel counts between the reference points in the images, the proposed method computes the object's vertical height and the photographing distance between the camera and an object on the target plane. Because of the advantages it demonstrates in measuring objects on the oblique plane, the approach can acquire 3-dimensional localization information of the objects. To make it practical for various applications, we also present a hardware/software (HW/SW) co-design approach using the SOPC technique to realize the proposed image-based 3-dimensional localization method.
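The core idea behind pixel-variation ranging, as summarized in the abstract, can be illustrated with a minimal similar-triangles sketch. This is not the thesis's exact formulation (which uses two designated reference points and extends to oblique planes); it only shows, under a simple pinhole-camera assumption, how a known camera displacement along the photographing direction and the resulting change in pixel count yield a distance estimate. The function name and the numbers below are hypothetical.

```python
def distance_from_pixel_variation(n1: float, n2: float, delta: float) -> float:
    """Estimate the distance to an object from its pixel spans at two
    camera positions separated by a known displacement delta along the
    optical axis (n1 at the nearer position, n2 after retreating)."""
    if n1 <= n2:
        raise ValueError("the object must span more pixels at the nearer position (n1 > n2)")
    # Pinhole model: pixel span is inversely proportional to distance,
    # so n1 * d = n2 * (d + delta)  =>  d = delta * n2 / (n1 - n2).
    return delta * n2 / (n1 - n2)

# Hypothetical numbers: the object spans 200 px, then 160 px after the
# camera retreats 20 cm along the photographing direction.
print(distance_from_pixel_variation(200, 160, 20.0))  # prints 80.0 (cm)
```

No camera calibration constant is needed in this sketch because it cancels out of the ratio, which is what makes displacement-based pixel-variation methods attractive for a single fixed-lens camera.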
Table of Contents
Table of Contents III
List of Figures VI
List of Tables IX
Chapter 1 Introduction 1
1.1 Background and Motivation 1
1.2 Objectives and Methods 4
1.3 Thesis Organization 5
Chapter 2 Distance Measurement by the Pixel-Variation Method 7
Chapter 3 Oblique-Plane Measurement by the Pixel-Variation Method 13
3.1 Measuring the Inclination Angle 13
3.2 Measuring the Photographing Distance to the Oblique Plane 17
3.3 Measuring the Distance between Two Arbitrary Reference Points on the Oblique Plane 18
3.4 Measuring the Height of an Object on the Oblique Plane 18
3.5 Error Analysis 20
Chapter 4 3-D Localization Based on Oblique-Plane Measurement by the Pixel-Variation Method 22
4.1 Definition of 2-D Planar Localization 22
4.2 Horizontal-Width Measurement for 2-D Planar Localization 24
4.3 Vertical-Width Measurement for 2-D Planar Localization 27
4.4 3-D Localization 30
Chapter 5 HW/SW Co-design Platform on SOPC 31
5.1 DE2-70 Multimedia Development Platform 31
5.2 D5M Capture Module 35
5.2.1 Pin Description of the D5M Capture Module 36
5.2.2 D5M Image Capture Format 37
5.2.3 Signal Timing of the D5M Capture Module 40
5.3 LTM Display Module 42
5.3.1 Pin Description of the LTM Display Module 44
5.3.2 Timing of the LTM Display Module 45
Chapter 6 Realization of the 3-D Localization Method on SOPC 47
6.1 Design Architecture of the 3-D Localization Method 47
6.1.1 Image Capture Flow 50
6.1.2 Location-Point Capture Flow 51
6.1.3 3-D Computation Flow 54
6.2 Image Processing Modules 55
6.2.1 RGB Filter Module 56
6.2.2 Location Point Capture Module 57
6.3 Software Computation Flow of the 3-D Localization Method 59
Chapter 7 Experimental Results 63
7.1 3-D Localization Experiments 63
7.1.1 Measurement Conditions 63
7.1.2 Results 64
7.2 SOPC Measurement Experiments 66
7.2.1 Measurement Conditions 66
7.2.2 Results 67
7.3 Error-Analysis Experiments 70
Chapter 8 Conclusions and Future Work 72
8.1 Conclusions 72
8.2 Future Work 73
References 75

List of Figures
Fig. 2.1 Schematic of the displacement in pixel-variation distance measurement 7
Fig. 2.2 Schematic of the displacement in pixel-variation distance measurement 8
Fig. 2.3 Relationship between distance and pixel-count variation at different photographing distances 9
Fig. 3.1 Schematic of the displacement in oblique-plane measurement by the pixel-variation method 14
Fig. 3.2 System diagram of oblique-plane measurement by the pixel-variation method 15
Fig. 3.3 Height measurement of an object on the oblique plane 19
Fig. 4.1 Perspective view of localizing an object on the oblique plane by the pixel-variation method 23
Fig. 4.2 Stereoscopic view of object localization on the oblique plane 25
Fig. 4.3 Horizontal-width measurement for 2-D planar localization 26
Fig. 4.4 Vertical-width measurement for 2-D planar localization 28
Fig. 5.1 DE2-70 multimedia development platform 33
Fig. 5.2 DE2-70 multimedia development platform with the D5M and LTM modules 34
Fig. 5.3 D5M capture module 36
Fig. 5.4 Pins of the D5M capture module 37
Fig. 5.5 D5M image frame 38
Fig. 5.6 CMOS image signals 39
Fig. 5.7 Bayer-pattern pixel map 40
Fig. 5.8 Valid-data format of the D5M capture module 41
Fig. 5.9 Timing diagram of the D5M capture module 42
Fig. 5.10 LTM display module 43
Fig. 5.11 Pins of the LTM display module 44
Fig. 5.12 LCD horizontal timing diagram 46
Fig. 5.13 LCD vertical timing diagram 46
Fig. 6.1 Design architecture of the 3-D localization method 48
Fig. 6.2 Image capture architecture 50
Fig. 6.3 Location-point capture architecture 51
Fig. 6.4 Target (goal post) after the RGB Filter 52
Fig. 6.5 Target (ball) after the RGB Filter 52
Fig. 6.6 Location points of the target (goal post) after the location-point capture module 53
Fig. 6.7 Location points of the target (ball) after the location-point capture module 53
Fig. 6.8 3-D computation architecture 55
Fig. 6.9 Symbol of the RGB Filter module 56
Fig. 6.10 Symbol of the Location Point Capture module 57
Fig. 6.11 Main flowchart of the 3-D localization method 60
Fig. 6.12 Detailed flowchart of the 3-D localization method 61
Fig. 7.1 Photographs of the target at different positions and distances 64
Fig. 7.2 Penalty-kick field layout 66
Fig. 7.3 Processed images of Ball and Goal 68
Fig. 7.4 Actual localization result 68
Fig. 7.5 Maximum possible angle-measurement error at a measuring distance of 100 cm 70
Fig. 7.6 Maximum possible distance-measurement error at a fixed angle 71

List of Tables
Table 5.1 D5M specifications 35
Table 5.2 LTM specifications 43
Table 6.1 I/O signal description of the RGB Filter module 56
Table 6.2 I/O signal description of the Location Point Capture module 58
Table 7.1 Measurement data of the target 65
Table 7.2 Measurement data of Ball and Goal 69
References
[1] E. Menegatti, A. Pretto, A. Scarpa, and E. Pagello, “Omnidirectional vision scan matching for robot localization in dynamic environments,” IEEE Transactions on Robotics, vol. 22, no. 3, pp. 523-535, 2006.
[2] J.C. Fernandes and J.A.B. Neves, “Using Conical and Spherical Mirrors with Conventional Cameras for 360° Panorama Views in a Single Image,” 2006 IEEE International Conference on Mechatronics, Budapest, 3-5 July 2006, pp. 157 - 160.
[3] Bing-ru Liu, Yun Xie, Yi-min Yang, and Zhen-Zhen Qiu, “A self-localization method with monocular vision for autonomous soccer robot,” IEEE International Conference on Industrial Technology (ICIT 2005), Hong Kong, 14-17 Dec. 2005, pp. 888 - 892.
[4] Margrit Betke and Leonid Gurvits, “Mobile robot localization using landmarks,” IEEE Transactions on Robotics and Automation, vol. 13, no. 2, pp.251-263, 1997.
[5] J.C. Fernandes and J.A.B. Neves, “Angle Invariance for Distance Measurements Using a Single Camera,” 2006 IEEE International Symposium on Industrial Electronics, Montreal, Canada, 9-13 July 2006, pp. 676 - 680.
[6] S. Baker and S. K. Nayar, “A Theory of Single-Viewpoint Catadioptric Image Formation,” International Journal of Computer Vision, vol. 35, no. 2, pp. 1-22, November 1999.
[7] Dana Cobzas, Hong Zhang and Martin Jagersand, “Image-based localization with depth-enhanced image map,” IEEE International Conference on Robotics and Automation, vol. 2, pp. 1570-1575, Sept. 2003.
[8] Cyril Cauchois, Eric Brassart, Bruno Marhic and Cyril Drocourt, “An absolute localization method using a synthetic panoramic image base,” IEEE Proceedings of the Third Workshop on Omni-directional Vision, June 2, pp. 128-135, 2002.
[9] L. C. Lai, T. L. Lee, H. H. P. Wu, and C. J. Wu, “Self-Localization of Mobile Robots Based on Visual Information,” IEEE Conference on Industrial Electronics and Applications, Singapore, May 2006, pp. 1-6.
[10] C. C. Peng, A compact digital image sensing distance and angle measuring device, M.S. thesis, Optical Science Center, National Central Univ., Taoyuan County, Taiwan, 2001.
[11] T. Egami, S. Oe, K. Terada, and T. Kashiwagi, “Three dimensional measurement using color image and movable CCD system,” The 27th Annual Conference of the IEEE Industrial Electronic Society, Denver, USA, 29 Nov. – 02, Dec. 2001, pp.1932-1936.
[12] M. C. Lu, “Image-based height measuring system for liquid or particles in tanks,” ROC patent of invention, no. 201536, 2004.
[13] D. Katsoulas and A. Werber, “Edge detection in range images of piled box-like objects,” Proceedings of the 7th International Conference on Pattern Recognition, Cambridge UK, vol. 2, August 23-26, 2004, pp. 80-84.
[14] Ming-Chih Lu, Wei-Yen Wang, and Chun-Yen Chu, “Image-Based Distance and Area Measuring System,” IEEE Sensors Journal, vol. 6, no. 2, pp. 495-503, Apr. 2006.
[15] T. Kanade, H. Kano, and S. Kimuram, “Development of a video-rate stereo machine,” Proceedings of IEEE International Conference on Intelligent Robots and Systems 95, Pittsburgh, PA, Aug. 5-9, 1995, pp. 95-100.
[16] Y. Tanaka, A. Gofuku, I. Nagai, and A. Mohamed, “Development of a compact video-rate range finder and its application,” Proceedings of 3rd International Conference on Advanced Mechatronics, Okayama, Japan, Aug. 1998, pp. 97-102.
[17] H. Yan, “Image analysis for digital media applications,” IEEE Computer Graphics and Applications, vol. 21, no. 1, pp. 18-26, Jan. 2001.
[18] B.G. Mertzios and IS. Tsirikolias, “Applications of coordinate logic filters in image analysis and pattern recognition,” Proceedings of the 2nd International Symposium on Image and Signal Processing and Analysis, Pula, June 19-21, 2001, pp. 125-130.
[19] George C. Karras, Dimitra J. Panagou and Kostas J. Kyriakopoulos, “Target-referenced Localization of an Underwater Vehicle using a Laser-based Vision System,” OCEANS, Boston, 2006, pp. 1-6.
[20] Naruto Yonemoto, Kazuo Yamamoto, Kimio Yamada, Hidemi Yasui, Naohiro Tanaka, Claire Migliaccio, Jean-Yves Dauvignac, and Christian Pichot, “Performance of obstacle detection and collision warning system for civil helicopters,” Proceedings of the SPIE on Enhanced and Synthetic Vision 2006, vol. 6226, pp. 622608, 2006.
[21] Hiroyuki Ukida and Sumio Takamatsu, “3D Shape Measurements Using Stereo Image Scanner with Three Color Light Sources,” Instrumentation and Measurement Technology Conference, vol. 1, pp. 639-644, May 18-20, 2004.
[22] H.G. Nguyen and J.Y. Laisne, “Obstacle detection using bi-spectrum CCD camera and image processing,” Proceedings of the Intelligent Vehicles Symposium, pp. 42- 50, June 29- July 1 1992.
[23] 曾笠哲, Optical Flow-based Obstacle Avoidance for Fixed-Wing UAV in Uncertain Environment, M.S. thesis, National Cheng Kung University, 2007.
[24] Hongyu Di, Qi Shang, and Sun'an Wang, “A virtual binocular vision range finding method of remote object based on single rotating angle indexing camera,” The 9th International Conference on Electronic Measurement & Instruments (ICEMI '09), Beijing, Aug. 16-19, 2009, pp. 2-846 - 2-849.
[25] J. C. Aparicio Femandes and J. A. B. Campos Neves, “Angle Invariance for Distance Measurements Using a Single Camera,” IEEE International Symposium on Industrial Electronics, vol.1, pp. 676-680, 2006.
[26] Ming-Chih Lu, Cheng-Chuan Chen, Chun-Yen Chu and Chin-Tun Chuang, “The apparatus and method of the distance measurement,” ROC patent of invention, no. 279526, 2007.
[27] Fang-Jung Shiou and Ruey-Tsung Lee, “Opto-electronic detector for distance and slanting direction measurement of a surface,” ROC patent of invention, no. 246585, 2004.
[28] Stephen Schultz, Frank Giuffrida, Charles Mondello, and Robert Gray, “Oblique geolocation and measurement system,” US patent of invention, no. 044692, 2004.
[29] Umesh R. Dhond and J. K. Aggarwal, “Structure from stereo-a review,” IEEE Transactions on Systems, Man and Cybernetics, vol. 19, no. 6, pp. 1489-1510, Nov. 1989.
[30] Fua Pascal, “A parallel stereo algorithm that produces dense depth maps and preserves image features,” Machine Vision and Applications, vol. 6, no. 1, pp. 35-49, Dec. 1993.
[31] M. A. Sid-Ahmed and M. T. Boraie, “Dual camera calibration for 3-D machine vision metrology,” IEEE Transactions on Instrumentation and Measurement, vol. 39, no. 3, pp. 512-516, June 1990.
[32] C. Liguori, A. Pietrosanto, and A. Paolillo, “An on-line stereo vision system for dimensional measurements on rubber extrusions,” Measurement: Journal of the International Measurement Confederation, vol. 35, no. 3, pp. 221-231, Apr. 2004.
[33] Ti-Ho Wang, Ming-Chih Lu, Chen-Chien Hsu, Yin Yu Lu, and Ching-Pei Tsi, “Three Dimensional Measurement Based on Image Shift and Its Applications in Object Inspection,” WSEAS Transactions on Systems, vol. 6, no. 5, pp. 926-933, May 2007.
[34] Ti-Ho Wang, Ming-Chih Lu, Chen-Chien Hsu, Yin-Yu Lu, Cheng-Pei Tsai, “Three dimensional distance measurement based on single digital camera,” Proceedings of the 2007 WSEAS Int. Conference on Circuits, Systems, Signal and Telecommunications (CISST’07), Gold Coast, Queensland, Australia, Jan. 17-19, 2007, pp. 153-157.
[35] Cheng-Chuan Chen, Chen-Chien Hsu, Ti-Ho Wang, Chun-Wei Huang, “Three-dimensional measurement of a remote object with a single CCD camera,” The 7th WSEAS International Conference on Signal Processing, Computational Geometry& Artificial Vision, Vouliagmeni Beach, Athens, Greece, Aug. 24-26, 2007, pp. 141-146.
[36] C. Liguori, A. Pietrosanto, and A. Paolillo, “Method for correcting geometric distortion in video cameras,” IEEE Proceedings of the National Aerospace and Electronics Conference, New York, USA, Apr. 1985, pp. 1382-1388.
[37] J. Weng, P. Cohen, and M. Herniou, “Camera calibration with distortion models and accuracy evaluation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 10, pp. 965-980, Oct. 1992.

[38] Chen-Chien Hsu, Ming-Chih Lu, Ke-Wei Chin, “Distance measurement based on pixel variation,” ICARA 4th Autonomous Robots and Agents international conference, Wellington, New Zealand, Feb. 2009, pp. 324-329.
[39] Altera Corporation, SOPC Builder User Guide, 2003.
[40] Altera Corporation, Designing With Nios & SOPC Builder, 2003.
[41] Shih-An Li, Chen-Chien Hsu, Ching-Chang Wong, and Chia-Jun Yu, “Hardware/Software Co-design for Particle Swarm Optimization Algorithm,” Submitted to Information Sciences, Aug. 15, 2009. (SCI)
[42] 李世安, 翁仲緯, 賴鈺婷, and 翁慶昌, “Design of an image hardware accelerator,” National Symposium on System Science and Engineering, Tamkang University, June 2009.
Thesis Use Permissions
  • The author agrees to grant in-library readers free-of-charge reproduction of the print copy for academic purposes, public from 2013-08-02.
  • The author agrees to authorize browsing/printing of the electronic full text, public from 2013-08-02.

