§ Thesis Bibliographic Record
System ID	U0002-1609201914292300
DOI	10.6846/TKU.2019.00465
Title (Chinese)	基於ROS在靜態環境之自主機器人的模糊導航系統
Title (English)	Fuzzy Navigation System for ROS Based Autonomous Robot in Static Environment
Title (third language)
University	淡江大學 (Tamkang University)
Department (Chinese)	電機工程學系機器人工程碩士班
Department (English)	Master's Program in Robotics Engineering, Department of Electrical and Computer Engineering
Foreign degree: university
Foreign degree: college
Foreign degree: graduate institute
Academic year	107
Semester	2
Year of publication	108 (ROC calendar, 2019)
Author (Chinese)	辛柏凱
Author (English)	Pushkar Kumar Singh
Student ID	606465010
Degree	Master's
Language	English
Second language
Date of oral defense	2019-07-18
Number of pages	70
Oral defense committee	Advisor - 李祖添
Advisor - 翁慶昌 (iclabee@gmail.com)
Committee member - 龔宗鈞
Committee member - 翁慶昌 (iclabee@gmail.com)
Committee member - 劉智誠
Keywords (Chinese)	模糊邏輯
Gazebo
人形機器人
導航
機器人作業系統
軌跡規劃
Keywords (English)	Fuzzy Logic
Gazebo
Humanoid Robot
Navigation
Robot Operating System (ROS)
Trajectory Planning
Keywords (third language)
Subject classification
Abstract (Chinese)
This thesis designs and implements a trajectory tracking and planning controller based on fuzzy logic for autonomous robots, both mobile and humanoid, navigating in a static environment. The kinematic model is established from the kinematics equations of the robot's differential-drive rolling wheels. The robot acquires live images with a camera, and image binarization is performed by object segmentation using the color code table of the color model during image processing. Finally, the robot uses a fuzzy logic controller to follow a planned path and avoid unknown obstacles by controlling the driving velocity and steering angle. Both line following and obstacle avoidance are considered. The controller is a two-input, two-output system, and the platform is a tracked vehicle better suited to indoor use. The fuzzy control system is designed and implemented in the Robot Operating System (ROS) under Ubuntu 16.04 and tested in Gazebo simulation. Finally, several experiments and their results demonstrate the effectiveness of the fuzzy logic control system for autonomous robots.
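The abstract's kinematic model comes from the differential-drive rolling-wheel equations developed in Chapter III of the thesis. The short Python sketch below illustrates those standard relations only in outline; the wheel radius r, axle track L, sample speeds, and all function names are assumptions made for this illustration and are not values or code from the thesis.

import math

# Illustrative differential-drive kinematics (assumed parameters, not thesis values).
def body_velocity(w_left, w_right, r=0.05, L=0.30):
    """Map left/right wheel angular speeds (rad/s) to forward speed and yaw rate,
    assuming ideal rolling wheels with no slip."""
    v = r * (w_right + w_left) / 2.0    # forward speed (m/s)
    omega = r * (w_right - w_left) / L  # yaw rate (rad/s); a faster right wheel turns the robot left
    return v, omega

def integrate_pose(x, y, theta, v, omega, dt):
    """Euler-integrate the unicycle model for one time step."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

if __name__ == "__main__":
    x, y, th = 0.0, 0.0, 0.0
    for _ in range(100):                   # simulate 1 s in 10 ms steps
        v, w = body_velocity(8.0, 10.0)    # right wheel slightly faster
        x, y, th = integrate_pose(x, y, th, v, w, 0.01)
    print(f"pose after 1 s: x={x:.3f} m, y={y:.3f} m, theta={th:.3f} rad")

Figures 3.2 and 3.3 in the thesis present the actual model that this sketch only approximates.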
Abstract (English)
This thesis describes the design and implementation of a trajectory tracking and planning controller that uses fuzzy logic for autonomous robots, both mobile and humanoid, navigating in a static environment. The kinematic model of the autonomous robot is derived from the kinematics equations of the differential-drive rolling wheel. Since the robot uses a webcam to capture live images, image binarization is performed with an object segmentation method that uses the color code table of the color model during image processing. Finally, the robot uses a fuzzy logic controller to follow a planned path and avoid unknown obstacles by controlling the velocity and steering angle of the drive unit. Both line-following and obstacle-avoidance approaches are considered. The controller is a two-input, two-output system, and the platform is a tracked vehicle that is better suited to indoor use. The fuzzy control system has been designed and implemented in the Robot Operating System (ROS) under Ubuntu 16.04 and tested in Gazebo simulation. Finally, several experiments and results are presented to demonstrate the effectiveness of the fuzzy logic control system on autonomous robots.
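The controller described above has two inputs and two outputs and steers the robot by adjusting the drive speeds. The Python sketch below is a minimal illustration of that idea under assumed details: it takes the lateral line error and its rate of change as inputs and returns normalized left/right wheel speeds, and its triangular membership functions and nine-rule table are invented for this example rather than copied from the rule bases of Chapter IV (Tables 4.1-4.4) of the thesis.

# Minimal two-input / two-output fuzzy line-following sketch.
# All terms, rules, and setpoints below are illustrative assumptions, not the thesis rule base.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Linguistic terms over a normalized universe of discourse [-1, 1].
TERMS = {
    "N": lambda x: tri(x, -2.0, -1.0, 0.0),  # negative: line left of image center
    "Z": lambda x: tri(x, -1.0,  0.0, 1.0),  # zero: line roughly centered
    "P": lambda x: tri(x,  0.0,  1.0, 2.0),  # positive: line right of image center
}

# Rule table: (error term, delta-error term) -> (left wheel, right wheel) setpoints in [0, 1].
RULES = {
    ("N", "N"): (0.2, 0.9), ("N", "Z"): (0.3, 0.8), ("N", "P"): (0.5, 0.7),
    ("Z", "N"): (0.4, 0.7), ("Z", "Z"): (0.7, 0.7), ("Z", "P"): (0.7, 0.4),
    ("P", "N"): (0.7, 0.5), ("P", "Z"): (0.8, 0.3), ("P", "P"): (0.9, 0.2),
}

def fuzzy_wheel_speeds(error, d_error):
    """Fire every rule with min-AND and defuzzify by weighted average (Sugeno-style)."""
    num_left = num_right = den = 0.0
    for (e_term, de_term), (out_left, out_right) in RULES.items():
        strength = min(TERMS[e_term](error), TERMS[de_term](d_error))
        num_left += strength * out_left
        num_right += strength * out_right
        den += strength
    if den == 0.0:
        return 0.0, 0.0
    return num_left / den, num_right / den

if __name__ == "__main__":
    # Line drifting to the right of center: the left wheel speeds up, steering the robot right.
    print(fuzzy_wheel_speeds(0.6, 0.1))

For a positive error of 0.6 the weighted average returns roughly (0.77, 0.43), a faster left wheel, so the robot turns back toward the line; the thesis's own rule bases for line following (Table 4.3) and obstacle avoidance (Table 4.4) define the actual behavior.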
Abstract (third language)
Table of contents
Table of Contents	I
List of Figures	III
List of Tables	V
Chapter I Introduction	1
1.1 Research background	1
1.2 Research motivation	4
1.3 Research purposes	8
1.4 Thesis structure	9
Chapter II Introduction of Humanoid Robot Platform	10
2.1	Foreword	10
2.2	Humanoid robot organization	11
2.3	Humanoid robot core control board introduction	17
2.3.1 Industrial personal computer (IPC)	18
2.3.2 FPGA Development board	19
Chapter III Kinematic Model of Humanoid Robot and Mobile Robot	21
3.1	Kinematic Model	21
3.1.1	Foreword	21
3.1.2	Denavit-Hartenberg System (DH System)	21
3.2	Kinematic Model of the mobile robot	23
3.2.1	Kinematics equation of the differential drive rolling wheel	26
3.2.2	Kinematics equation of the humanoid robot	29
3.2.2.1	Positioning System	30
3.2.2.2	Data delivery Module	31
3.2.2.3	Image Binarization	32
3.2.2.4	Humanoid robot software architecture	35
Chapter IV Fuzzy Logic Controller (FLC)	37
4.1	Fuzzy Logic Controller (FLC) and its types	37
4.2	Fuzzy Inference System	41
4.2.1	Fuzzy Logic Controller Design	43
4.2.2	FLC Design for Line following (Without obstacle)	44
4.2.2.1 Rule base for FLC Design for Line following robot	45
4.2.3 FLC Design for Line following (With obstacle)	47
Chapter V Experiments and Results	50
5.1	Observation mode	51
5.2	Mobile robot model setup	56
5.3	Marathon setup	58
5.4	Humanoid robot setup	61
5.5	Obstacle path setup	63
Chapter VI Conclusions and Future Prospects	67
References		69

Figure 1.1  Hierarchy of robot behavior in marathon	6
Figure 1.2  Hierarchy of robot behavior in obstacle avoidance	7
Figure 2.1  Humanoid Robot Diagram	11
Figure 2.2  Humanoid Robot DOF Plan	13
Figure 2.3  Humanoid Robot mechanism design and dimension	13
Figure 2.4  Mechanism design and DOF configuration	16
Figure 2.5  Joint space of the feet	17
Figure 2.6  IPC industrial computer entity map	18
Figure 2.7  FPGA development board entity diagram	20
Figure 3.1  Ideal Rolling Wheel	24
Figure 3.2  Kinematics model of the differential drive rolling wheel	25
Figure 3.3  Kinematics model of the mobile robot	25
Figure 3.4  Waist and footsteps	30
Figure 3.5  Robot state speculation	31
Figure 3.6  Image modeling: (a) original image, (b) binarization	33
Figure 3.7  Object segmentation diagram	34
Figure 3.8  Humanoid robot software architecture	36
Figure 4.1  Flow chart for FLC	39
Figure 4.2  Fuzzy Logic Controller	40
Figure 4.3  Fuzzy Control system	43
Figure 4.4  FLC for line following robot	44
Figure 4.5  Control surfaces for (a) Left motor (b) Right motor	46
Figure 4.6  FLC for obstacle avoidance robot	47
Figure 5.1  Image Coordinate	51
Figure 5.2  Axis distance calculation diagram	52
Figure 5.3  Axis distance calculation diagram	53
Figure 5.4  Image Processing: (a) 100 cm (b) 90 cm (c) 80 cm (d) 70 cm (e) 60 cm (f) 50 cm	54
Figure 5.5  Error graph	55
Figure 5.6  Mobile robot movement	56
Figure 5.7  (a) Left and right movement vs time (b) Forward movement vs time (c) Speed vs time graph	57
Figure 5.8  Marathon environment	59
Figure 5.9  Series of the movement of mobile robot in marathon	60
Figure 5.10 (a) SDF model of robot (b) Interface of the robot	61
Figure 5.11 Robot Movement	62
Figure 5.12 Setup of the obstacle path	63
Figure 5.13 (a)-(f) Testing of humanoid robot	64

Table 2.1  IPC Industrial Computer Specifications	19
Table 2.2  System Specifications of FPGA Development Board	20
Table 3.1  Color code table	33
Table 4.1  Fuzzy set for input	45
Table 4.2  Fuzzy set for output	45
Table 4.3  Fuzzy rule base for line following	46
Table 4.4  Fuzzy rule base for obstacle avoidance	48
Table 4.5  Parameter Difference	49
Table 5.1  Measured distance error table	55
Full-text access rights
On campus
The print thesis will be made publicly available 5 years after the authorization form is submitted.
The author agrees to make the electronic full text publicly available on campus.
The on-campus electronic full text will be made publicly available 5 years after the authorization form is submitted.
Off campus
Authorization granted
The off-campus electronic full text will be made publicly available 5 years after the authorization form is submitted.
