Behavior-based Control of an Autonomous Soccer Robot System
Department of Mechanical and Electro-Mechanical Engineering
Autonomous mobile robot
This thesis develops an autonomous soccer robot system featuring omnidirectional motion, omnidirectional vision, and remote operation over a network. First, the kinematic equations of the omnidirectional mobile robot are derived from its forward and inverse kinematics as the basis for motion controller design. Next, behavior-based control is planned with vision feedback so that the robot completes specific behaviors autonomously. Three behaviors are designed: object tracking, positioning behind the ball to kick it, and obstacle avoidance. The obstacle-avoidance behavior is based on the Tangent Bug algorithm, with parts of the method modified to achieve avoidance during motion.
For omnidirectional image processing, the work covers panoramic image transformation, object recognition, and free space detection. The panoramic transformation is performed by coordinate transformation, on the premise that image distortion does not affect object recognition. Object recognition adapts the eye-mouth localization method from face recognition to obtain stable object-position outputs. Free space detection uses the omnidirectional camera in a sonar-like probing manner, achieving an effect comparable to a range finder, and supplies its results to the obstacle-avoidance behavior.
An autonomous soccer robot system is developed in this thesis; the system provides an omnidirectional drive, omnidirectional vision, and network remote control. First, the forward and inverse differential kinematics of the omnidirectional-drive robot are derived, and a kinematic controller for the robot is designed on that basis. Secondly, combined with vision feedback, behavior-based control is devised so that the soccer robot achieves specific behaviors autonomously. Finally, three behaviors are planned: object tracking, positioning to kick the ball, and obstacle avoidance. The obstacle-avoidance behavior is developed from a modified Tangent Bug algorithm.
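The kinematic derivation itself is deferred to Chapter 2; as a minimal sketch of the inverse-kinematics step, the mapping from a commanded body velocity to individual wheel speeds can look as follows. The three-wheel geometry, 120° wheel spacing, and all names below are illustrative assumptions, not the thesis' actual parameters.

```python
import math

def wheel_speeds(vx, vy, omega, radius=0.2, wheel_angles=(90, 210, 330)):
    """Inverse-kinematics sketch for a three-wheel omnidirectional base.

    Maps a desired body velocity (vx, vy in m/s, omega in rad/s) to the
    linear speed of each omni wheel.  Wheel i is mounted at angle
    wheel_angles[i] (degrees) on a circle of the given radius (m), with
    its rolling direction tangent to that circle.
    """
    speeds = []
    for a in wheel_angles:
        t = math.radians(a)
        # Project the body velocity onto the wheel's rolling direction
        # and add the contribution of the body rotation.
        speeds.append(-math.sin(t) * vx + math.cos(t) * vy + radius * omega)
    return speeds
```

For example, a pure rotation command drives all three wheels at the same speed, while a pure translation splits unevenly across the wheels according to their mounting angles.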
The image processing of the omnidirectional vision comprises three stages: panoramic transformation, object recognition, and free space detection. The panoramic image is used for visual servo control on the premise that image distortion does not affect object recognition. Object recognition provides stable object-position outputs by applying a projection-based method for segmenting human faces. Free space detection achieves performance similar to a laser rangefinder by deploying the omnidirectional vision system as a sonar-like detector. The image processing results are fed back to the behavior-based control.
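The two geometric operations named here, panoramic unwarping by coordinate transformation and sonar-like free space probing, can be sketched generically as follows. The mirror center, radii, and function names are illustrative assumptions, not the thesis' actual implementation.

```python
import math

def panorama_to_source(u, v, cx, cy, r_min, r_max, width, height):
    """Map a pixel (u, v) of the unwarped panoramic strip back to the
    raw omnidirectional (donut-shaped) image pixel it samples.

    The mirror image is assumed centred at (cx, cy), with the useful
    ring spanning radii r_min..r_max; the panorama is width x height
    pixels and covers one full 360-degree sweep.
    """
    theta = 2.0 * math.pi * u / width         # column -> bearing angle
    r = r_max - (r_max - r_min) * v / height  # row -> radius (top row = outer ring)
    return cx + r * math.cos(theta), cy + r * math.sin(theta)

def free_space_scan(is_floor, cx, cy, r_min, r_max, n_rays=72):
    """Sonar-like free space detection: cast rays outward from the
    mirror centre and record, per bearing, the first radius at which
    the floor colour is no longer seen, imitating a range finder.

    is_floor(x, y) is a caller-supplied predicate that classifies an
    image pixel as free floor.
    """
    ranges = []
    for i in range(n_rays):
        theta = 2.0 * math.pi * i / n_rays
        hit = r_max
        for r in range(int(r_min), int(r_max)):
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            if not is_floor(x, y):
                hit = r  # first obstructed radius along this bearing
                break
        ranges.append((theta, hit))
    return ranges
```

A full unwarper would loop `panorama_to_source` over every target pixel and sample the source image, while the (bearing, range) pairs from the scan are the kind of data the obstacle-avoidance behavior consumes.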
Chapter 1 Introduction
1.1 Motivation
1.2 Omnidirectional Mobile Platform
1.3 Literature Review
1.3.1 Literature on Robot System Architectures
1.3.2 Literature on Omnidirectional-Drive Robots
1.4 Scope of Research
1.5 Thesis Organization
Chapter 2 Behavior-based Control of the Autonomous Robot
2.1 Omnidirectional Motion
2.2 Kinematic Equations of the Robot
2.3 Object-Tracking Behavior
2.4 Positioning-to-Kick Behavior
2.5 Obstacle-Avoidance Behavior
2.6 Behavior-Based Navigation of a Single Robot
Chapter 3 Design of the Vision Feedback System
3.1 Omnidirectional Vision
3.2 Panoramic Image Transformation
3.3 Color Spaces and Image Segmentation
3.4 Object Recognition in Images
3.5 Free Space Detection
3.6 Image Processing Pipeline
Chapter 4 Mobile Robot Platform System and Mechanism Design
4.1 Mobile Robot System Architecture and Specifications
4.2 Drive System of the Mobile Platform
4.3 PIC Motor Controller
4.4 DC Motor Drive Circuit
4.5 Testing of the DC Motors and Drive Circuits
4.6 Static Analysis of the Mobile Platform
4.8 Operating Program Interface
Chapter 5 Experiments and Result Analysis
5.1 Robot Motion Analysis and Testing
5.2 Tracking Tests
5.3 Positioning-to-Kick Field Tests
5.4 Obstacle-Avoidance Tests
5.5 Behavior-Based Navigation Tests
Chapter 6 Conclusions and Future Research Directions
6.1 Research Results
6.2 Future Research Directions
Appendix A Recent Developments in Service Robots
Appendix B Development of Omnidirectional Drive Mechanisms
Appendix C Development of Omnidirectional Vision
Appendix D Color Coordinate Systems
Appendix E Windows-based Network Communication
List of Figures
Figure 1.1 Robot system
Figure 1.2 Software architecture
Figure 1.3 Trackies robot system
Figure 1.4 Clockwork Orange (1999)
Figure 1.5 WinKIT (2006)
Figure 1.6 Example of cell decomposition
Figure 1.7 Bug algorithm motion diagram
Figure 1.8 Environment model states
Figure 1.9 Polar histogram
Figure 1.10 (a) Polar histogram (b) Masked polar histogram
Figure 1.11 Tangent Bug algorithm flow diagram
Figure 1.12 Obstacle simplification and circumnavigation
Figure 1.13 Environment state analysis
Figure 1.14 Overall robot architecture
Figure 2.1 Car-like motion vs. omnidirectional motion
Figure 2.2 Coordinate diagram of the basic omnidirectional drive configuration
Figure 2.3 State of wheel 1
Figure 2.4 Motion controller
Figure 2.5 Robot motion modes
Figure 2.6 Relative positions on the soccer field
Figure 2.7 State diagram
Figure 2.8 State goals
Figure 2.9 Path for positioning behind the ball
Figure 2.10 Free space obtained from the sensor
Figure 2.11 Obstacle edge simplification
Figure 2.12 Boundary processing of free space
Figure 2.13 Robot's movable region
Figure 2.14 Obstacle-avoidance movement direction
Figure 2.15 Example obstacle-avoidance path
Figure 2.16 Behavior-based navigation structure
Figure 2.17 Model-based navigation structure
Figure 2.18 Robot navigation flowchart
Figure 2.19 Field state classification
Figure 3.1 Omnidirectional camera
Figure 3.2 Actual image captured from the camera
Figure 3.3 Hyperbolic mirror
Figure 3.4 QuickCam Pro 4000
Figure 3.5 Relationship between pixel distance and actual distance (cm)
Figure 3.6 Panoramic image transformation
Figure 3.7 Panoramic image transformation (1)
Figure 3.8 Panoramic image transformation (2)
Figure 3.9 Correspondence of the panoramic image transformation
Figure 3.10 Color distribution and thresholds
Figure 3.11 Color distribution and threshold correction
Figure 3.12 HV coordinate expansion
Figure 3.13 Captured image
Figure 3.14 RGB threshold segmentation
Figure 3.15 HSV–RGB threshold segmentation
Figure 3.16 YUV–RGB threshold segmentation
Figure 3.17 Distribution of yellow in RGB space
Figure 3.18 Distribution of yellow in HSV space
Figure 3.19 Color model of yellow in RGB space
Figure 3.20 Facial brightness projection
Figure 3.21 Eye-mouth localization
Figure 3.22 Polar coordinates and radial projection
Figure 3.23 Free space detection method
Figure 3.24 Free space detection diagram
Figure 3.25 Detection results
Figure 3.26 Image processing flowchart
Figure 3.27 Omnidirectional camera image on the robot
Figure 3.28 (a) Panoramic transformation result
Figure 3.28 (b) Recognition of object distribution in the image
Figure 3.28 (c) Accumulated curves from the radial projection method
Figure 3.29 (a) Final object positions
Figure 3.29 (b) Landmark recognition results
Figure 3.30 Free space determination
Figure 4.5 PIC18F4XXX series pinout
Figure 4.6 PIC microcontroller circuit board
Figure 4.7 H-bridge schematic
Figure 4.12 Motor characteristic curves at a 102 μs period
Figure 4.13 Motor characteristic curves at a 256 μs period
Figure 4.14 Motor characteristic curves at a 512 μs period
List of Tables
Table 3.1 Segmentation performance of each color space
Table 4.1 Omnidirectional wheel specifications
Table 4.2 DC motor dimensions
Table 5.1 Three-wheel speed allocation for omnidirectional motion
References
Frolov, A., http://www.scarse.org/adjust/color.html.
Allison, L., 1997. Department of Computer Science, Monash University, Australia, http://www.css
Aranda, J., A. Grua, and J. Climent, 1998. “Control Architecture for a Three- wheeled Roller Robot”, 5th International Workshop on Advanced Motion Control, 1998, 518-523.
Asama, H., 1995. “Development of an Omni-Directional Mobile Robot with 3 DOF Decoupling Drive Mechanism”, IEEE International Conference on Robotics and Automation, 1995, vol.2, 1925-1930.
Barth, M., C. Barrows, 1996. “A Fast Panoramic Imaging System and Intelligent Imaging Technique for Mobile Robots”, IEEE/RSJ International Conference on Intelligent Robots and Systems, 1996, vol.2, 626-633.
Baskan, S., M.M. Bulut, and V. Atalay, 2002. “Projection based method for segmentation of human face and its evaluation”, Pattern Recognition Letters, vol.23 no.14, 1623-1629.
Blumrich, J.F., 1974. “Omnidirectional Vehicle”, United States Patent 3,789,974.
Borenstein, J., Y. Koren, 1991. “The Vector Field Histogram - Fast Obstacle Avoidance For Mobile Robots”, IEEE Transactions on Robotics and Automation, vol.7, Issue.3, 278-288.
Bradbury, H.M., 1980. “Omni-Directional Transport Device”, United States Patent 4,223,753.
Dellaert, F., D. Fox, W. Burgard, and S. Thrun, 1999. “Monte Carlo Localization for Mobile Robots”, IEEE International Conference on Robotics and Automation, 1999, vol.2, 1322- 1328.
Fox, D., W. Burgard, and F. Dellaert, 1999. “Markov localization for mobile robots in dynamic environments”, Journal of Artificial intelligence Research, vol.11, 391-427.
Grabowiecki, J., 1919. “Vehicle-Wheel”, United States Patent 1,305,535.
Hashimoto, M., N. Suizu, I. Fujiwara, and F. Oba, 1999. “Path tracking control of a non-holonomic modular omnidirectional vehicle Systems”, IEEE International Conference on Systems, Man, and Cybernetics, 1999, Vol.6, 637-642.
Hong, J., 1991. “Image-based homing”, Control Systems Magazine, IEEE, Vol.12, Issue.1, 38-45
Ilon, B.E., 1975. “Wheels for a Course Stable Selfpropelling Vehicle Movable in any Desired Direction on the Ground or Some Other Base”, United States Patent 3,876,255.
Kamon, I., E. Rimon, and E. Rivlin, 1996. “A New Range-Sensor Based Globally Convergent Navigation Algorithm for Mobile Robots”, IEEE International Conference on Robotics and Automation, 1996, vol.1, 429-435.
Kim, T.G., 2005. Network Robots Will Become Family Member This Year, Koreatimes, 29 JUN.
L. Grace Juan, S. Shinson Guan, 2001. Color Science (色彩學), http://www.personal.stu.edu.tw/lgjuan/
Latombe, J.C., 1991. Robot Motion Planning, Kluwer Academic Publishers.
Laubach, S.L., J. Burdick, and L. Matthies, 1998. “An Autonomous Path Planner Implemented on the Rocky 7 Prototype”, IEEE International Conference on Robotics and Automation, 1998, vol.1, 292-297.
Linaker, F., M. Ishikawa, 2005. “Real-time appearance-based Monte Carlo Localization”, Robotics and Autonomous Systems.
Lumelsky, V.J., T. Skewis, 1990. “Incorporating range sensing in the robot navigation function”, IEEE Transactions on Systems, Man and Cybernetics, Vol.20, Issue.5, 1058-1069.
Menegatti, E., 2004. “A New Omnidirectional Vision Sensor for Monte-Carlo Localization”, Proc. of the Int. RoboCup Symposium.
Moriyama, K., 2005(a). NEC Announces New Personal Robot “PaPeRo2005” and “Childcare Robot PaPeRo”, http://pc.watch.impress.co.jp/docs/2005/0316/nec.htm.
Moriyama, K., 2005(b). Aichi Expo Press Preview: Dinosaur Robots, Cleaning Robots, and More on Display, http://pc.watch.impress.co.jp/docs/2005/0323/expo05.htm.
Muir, P.F., C.P. Neuman, 1987. “Kinematic modeling of wheeled mobile robots”, Journal of Robotic Systems 4, 281-340.
Nanda, H., R. Cutler, “Practical Calibrations for a real-time digital omnidirectional camera”, Technical Sketches, Computer Vision and Pattern Recognition, Hawaii, US, Dec 2001.
Olaf, D., 2002. “Improved Mecanum Wheel Design for Omni-directional Robots”, Australasian Conference on Robotics and Automation, Auckland, November 2002, pp.117–121.
Otuka, M., 2005. Aichi Expo: Nine Types of NEDO Robots at Once (+65 More), http://pcweb.mycom.co.jp/articles/2005/03/19/expo1/001.html.
Rees, D.W., 1970. “Panoramic television viewing system”, United States Patent 3,505,465, Apr. 1970.
Sarachik, K.B., 1989. “Characterizing an indoor environment with a mobile robot and uncalibrated stereo”, IEEE International Conference on Robotics and Automation, 1989, 984 -989.
Sato, D., Y. Yamamoto, and M. Ishii, 2003. “Development of a remote controlled robot with all directional movability”, 22nd SICE Kyushu Branch Annual Conference.
Sciavicco, L., B. Siciliano, 1996, Modeling and control of robot manipulators, McGraw-Hill.
Sekimori, D., 2001. “High-Speed Obstacle Avoidance and Self-Localization for Mobile Robots Based on Omni-Directional Imaging of Floor Region”, The RoboCup 2001 International Symposium.
Serizawa, T., 2005. Cute but Amazing: Takara’s Home Robot “TERA”, http://www.itmedia.co.jp/lifestyle/articles/0501/20/news030.html.
Siegwart, R, and I.R. Nourbakhsh, 2004. Introduction to Autonomous Mobile Robots, The MIT Press.
Song, Y., D. Tan, and Y. Tian, 2002. “Experiment Identification of the Dynamics Parameters of an Omnidirectional Wheel Mobile Robot”, Proceedings of the 4th World Congress on Intelligent Control and Automation.
Strang, G., 1980, Linear Algebra and Its Applications, Academic Press.
RoboCup Team Description Paper: Osaka University Trackies, 2003.
RoboCup Team Description Paper: WinKIT, 2006.
Thrun, S., W. Burgard, and D. Fox, 2005. Probabilistic Robotics, The MIT Press.
Thrun, S., W. Burgard, D. Fox, and F. Dellaert, 2000. “Robust Monte Carlo localization for mobile robots”, Artificial Intelligence 128 (1–2) 99–141.
Torii, A., 2004. “Panoramic image transform of omnidirectional images using discrete geometry techniques”, 2nd International Symposium on 3D Data Processing, Visualization and Transmission, 608-615.
Ulrich, I., and J. Borenstein, 1998. “VFH+: Reliable Obstacle Avoidance for Fast Mobile Robots”, IEEE International Conference on Robotics and Automation, 1998, 1572-1577.
Wada, M., 1999. “Design and control of a Variable Footprint Mechanism for Holonomic Omnidirectional Vehicles and its Application to Wheelchairs”, IEEE Transactions on Robotics and Automation, Vol.15, Issue.6, 978-989.
Walpole, R.E., and R.H. Myers, 1989. Probability and statistics for engineers and scientists, 4th ed., Macmillan.
Watanabe, K., 1998. “Control of Omnidirectional Mobile Robot”, 1998 2nd Int. Conf. on Knowledge-Based Intelligent Electronic Systems, Adelaide, Australia.
Watanabe, K., Y. Shiraishi, S. G. Tzafestas, J. Tang, and T. Fukuda, 1998a, “Feedback Control of an Omnidirectional Autonomous Platform for Mobile Service Robots”, Journal of Intelligent and Robotic Systems 22, 315-330
West, M., 1992. “Design of a Holonomic Omnidirectional Vehicle”, IEEE International Conference on Robotics and Automation, 1992, vol.1, 97-103.
Yagi, Y., and S. Kawato, 1990. “Panoramic scene analysis with conic projection”, IEEE International Workshop on Intelligent Robots and Systems, 1990.
Yamazawa, K., 1993. “Omnidirectional imaging with hyperboloidal projection”, IEEE/RSJ International Conference on Intelligent Robots and Systems, 1993, vol.2, 1029-1034.