§ Browsing ETD Metadata
  
System No. U0002-2107201115291700
Title (in Chinese) 以2.5D類神經網路為基礎對未知的樓梯做高度、深度、階數與姿態的估算及其應用於人形機器人爬樓梯之研究
Title (in English) Neural-Network-Based 2.5D Estimation for the Stair Possessing Unknown Height, Depth, Level and Pose, and Its Application to the Stair Climbing of a Humanoid Robot
Other Title
Institution 淡江大學 (Tamkang University)
Department (in Chinese) 電機工程學系碩士班
Department (in English) Department of Electrical and Computer Engineering
Other Division
Other Division Name
Other Department/Institution
Academic Year 99
Semester 2
Publication Year 100
Author's name (in Chinese) 陳彥達
Author's name (in English) Yen-Ta Chen
Student ID 698470191
Degree 碩士 (Master's)
Language Traditional Chinese
Other Language
Date of Oral Defense 2011-07-20
Pagination 53 pages
Committee Member advisor - Chih-Lyang Hwang
co-chair - Ching-Long Shih
co-chair - Wen-Shyong Yu
co-chair - Chi-Yi Tsai
Keyword (in Chinese) 人形機器人
霍夫轉換
多層感知器類神經網路
視覺導引
爬樓梯
Keyword (in English) Humanoid robot
Hough transform
Modeling using multilayer neural network
Visual navigation
Stair climbing
Other Keywords
Subject
Abstract (in Chinese)
雖然樓梯具有高度與深度未知的特性,但這是指其作為一般人使用的情況下。為得知其高度與深度等內容,在此採用了對其影像做處理,此內容流程包括了將彩色圖像轉灰階圖像、 Canny邊緣檢測、中值濾波器去除高頻雜訊、 利用Hough變換來獲取直線、選取感興趣的區域與擷取特徵點等。將獲取的這些特徵點輸入至不同的多層感知器類神經網路(MLNNs),依此結果來做估測一個樓梯的高度、深度與階數。這些多層感知器類神經網路皆為兩個輸入(即二維影像平面座標)和兩個輸出(即二維的大地座標)。因為不需要使用到3D的建模,故稱之為2.5D的類神經建模。
首先在實驗場地搜尋樓梯,將搜尋到的樓梯進行相關估測,根據估測的結果將人形機器人導引至樓梯前方附近,再根據類神經所估測之樓梯的高度及加上內插計算所得之深度,並估測其階數,依此執行人形機器人爬樓梯的動作以完成所設定的任務。最後,以相關的實驗來驗證所提之方法之有效性及可行性。
Abstract (in English)
Although the height and depth of the stair are unknown in advance, they are fixed for a general stair used by humans. The proposed image processing includes conversion to grayscale, Canny edge detection, median filtering to remove high-frequency noise, the Hough transform to extract straight lines, the selection of regions of interest, and the extraction of feature points. These feature points are fed into different learned multilayer neural networks (MLNNs) to estimate the height, depth, and number of levels of a stair.
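The pipeline described above (grayscale conversion, median filtering, edge detection, and Hough line extraction) can be sketched in a simplified, self-contained form. This is only an illustration under assumed function names and thresholds; it replaces Canny edge detection with a plain vertical-gradient edge map and uses a degenerate Hough vote restricted to horizontal lines, since stair edges appear nearly horizontal in the image:

```python
# Illustrative sketch of the image pipeline; names and thresholds are
# assumptions, not the thesis's actual settings.
import numpy as np

def to_grayscale(rgb):
    # Standard luminance weights for RGB -> grayscale conversion.
    return rgb @ np.array([0.299, 0.587, 0.114])

def median_filter3(img):
    # 3x3 median filter to suppress high-frequency noise.
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    stack = [padded[dy:dy + h, dx:dx + w]
             for dy in range(3) for dx in range(3)]
    return np.median(np.stack(stack), axis=0)

def edge_map(img, thresh=40.0):
    # Vertical intensity gradient highlights horizontal stair edges
    # (a stand-in for the Canny detector used in the thesis).
    gy = np.abs(np.diff(img, axis=0))
    return gy > thresh

def horizontal_hough_rows(edges, min_votes=20):
    # Degenerate Hough transform for theta = 0: each edge pixel votes
    # for its row; rows with enough votes are stair-edge candidates
    # whose endpoints can serve as feature points.
    votes = edges.sum(axis=1)
    return np.flatnonzero(votes >= min_votes)
```

A real system would use a full Canny detector and a general (rho, theta) Hough accumulator, e.g. as provided by OpenCV; the voting principle is the same.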
Each MLNN has two inputs (i.e., the 2D image-plane coordinates) and two outputs (i.e., the corresponding 2D world coordinates). Because no 3D modeling is required, the approach is called 2.5D neural network modeling. First, the humanoid robot (HR) scans the field to find the stair, which is randomly placed in front of it. Based on the estimated posture of the stair with respect to the HR, the robot is navigated to the vicinity of a planned posture. Then the stair climbing, using the nominal height and depth with interpolation between their lower and upper values, is executed to finish the assigned task. Finally, a sequence of experiments is arranged to confirm the effectiveness and efficiency of the proposed methodology.
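The two-input/two-output network structure and the interpolation of climbing motions between nominal lower and upper height/depth values can be sketched as follows. The weight shapes, the single hidden layer, and the `motions` table are illustrative assumptions, not the thesis's trained networks or motion data:

```python
# Hypothetical sketch of a 2-in/2-out MLNN forward pass and of blending
# climbing motions by bilinear interpolation over (height, depth).
import numpy as np

def mlnn_forward(xy_image, W1, b1, W2, b2):
    """Map a 2D image-plane point to a 2D world coordinate (2.5D model)."""
    hidden = np.tanh(W1 @ xy_image + b1)   # one hidden layer (assumed)
    return W2 @ hidden + b2                # linear output layer

def bilinear_motion(h, d, h_lo, h_hi, d_lo, d_hi, motions):
    """Blend four pre-designed motions, defined at the four corner
    (height, depth) pairs, into one motion for an arbitrary (h, d)
    inside the range. `motions` maps (i, j) -> joint-angle vector,
    with i, j in {0, 1} for the lower/upper corners."""
    th = (h - h_lo) / (h_hi - h_lo)
    td = (d - d_lo) / (d_hi - d_lo)
    return ((1 - th) * (1 - td) * motions[0, 0]
            + (1 - th) * td * motions[0, 1]
            + th * (1 - td) * motions[1, 0]
            + th * td * motions[1, 1])
```

At a corner such as (h_lo, d_lo) the blend reduces exactly to the pre-designed corner motion, so the interpolation is consistent with the four nominal cases.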
Other Abstract
Table of Contents (with Page Numbers)
Chinese Abstract	I
English Abstract	II
Table of Contents	III
List of Figures	V
List of Tables	VIII
Chapter 1: Introduction	1
Chapter 2: System Description and Task Statement	3
2.1 System Description	3
2.1.1 System Architecture of the Humanoid Robot	4
2.1.2 Vision System	9
2.1.3 Experimental Field and Facilities	11
2.2 Research Tasks	12
Chapter 3: Visual Recognition and Image Processing	13
3.1 Content of Visual Recognition and Image Processing	13
3.2 Edge Detection and the Hough Transform	14
3.3 Region-of-Interest Selection and Feature-Point Extraction	16
Chapter 4: Neural-Network-Based Image Localization of the Stair	18
4.1 Overview of Neural Networks	18
4.2 2.5D Neural Network Modeling	19
4.3 Error Analysis of the Neural Networks	24
Chapter 5: Navigation Strategy for Humanoid Robot Stair Climbing	30
5.1 Estimation of Stair Height and Depth	30
5.2 Computation of the Distance and Orientation of the Target Stair	33
5.3 Gait Planning and Navigation Strategy of the Humanoid Robot	35
5.4 Interpolation of the Stair-Climbing Motions	39
Chapter 6: Experimental Results and Discussion	44
6.1 Experimental Preparation	44
6.2 Experimental Results	46
Chapter 7: Conclusions and Future Work	50
7.1 Conclusions	50
7.2 Future Work	51
References	52

List of Figures
Fig. 2.1	Photograph of the completed humanoid robot	3
Fig. 2.2	System architecture of the humanoid robot	4
Fig. 2.3	Degrees of freedom of the limbs and body of the humanoid robot	4
Fig. 2.4	AX-12 servo motor	5
Fig. 2.5	RX-28 servo motor	6
Fig. 2.6	RX-64 servo motor	6
Fig. 2.7	RB-100 embedded system	7
Fig. 2.8	Human-machine interface program	8
Fig. 2.8	PICO-820 embedded single-board computer	9
Fig. 2.9	Microsoft LifeCam VX5500 webcam	10
Fig. 2.10	Experimental field and facilities	11
Fig. 2.11	Flowchart of the stair-climbing task	12
Fig. 3.1	Original captured image of the stair	13
Fig. 3.2	Image processing flowchart	14
Fig. 3.3	Color image of the stair and its grayscale conversion	14
Fig. 3.4	Image after Canny edge detection	15
Fig. 3.5	Image after line extraction by the Hough transform	16
Fig. 3.6	Lines intersecting the segments representing the stair	17
Fig. 3.7	Image of the extracted feature points	17
Fig. 4.1	Training scheme of the multilayer neural network	20
Fig. 4.2	Architecture of the multilayer neural network	21
Fig. 4.3	Training result of the multilayer neural network for the vertical plane at 4 cm height	21
Fig. 4.4	Training scheme of the multilayer neural network	23
Fig. 4.5	Comparison of the output and original positions for the vertical plane	24
Fig. 4.6	Output error analysis for the vertical plane	25
Fig. 4.7	Comparison of the output and original positions for the horizontal plane at 4 cm height	26
Fig. 4.8	Output error analysis for the horizontal plane at 4 cm height	26
Fig. 4.9	Comparison of the output and original positions for the horizontal plane at 3 cm height	27
Fig. 4.10	Output error analysis for the horizontal plane at 3 cm height	27
Fig. 4.11	Comparison of the output and original positions for the horizontal plane at 2 cm height	28
Fig. 4.12	Output error analysis for the horizontal plane at 2 cm height	28
Fig. 4.13	Comparison of the output and original stair positions	29
Fig. 4.14	Output error analysis of the stair position	29
Fig. 5.1	Feature points used as neural network inputs	30
Fig. 5.2	Estimation results for stair height, depth, and number of levels	32
Fig. 5.3	Computation of the distance and direction between the target stair and the humanoid robot	33
Fig. 5.4	Computed position and angle of the stair	35
Fig. 5.5	Gait analysis of the robot's stair-climbing motion	36
Fig. 5.6	Decomposition of the robot's actual stair-climbing motion	37
Fig. 5.7	Bilinear interpolation of height and depth for the stair-climbing motion	39
Fig. 5.8	Demonstration of different single-step motions	40
Fig. 5.9	Demonstration of continuous motions	40
Fig. 5.10	Transmission of continuous motions	41
Fig. 5.11	Human-machine interface for executing the stair-climbing task	42
Fig. 5.12	Motion table for the stair-climbing task	43
Fig. 6.1	Experimental field setup for the first task	45
Fig. 6.2	Experimental field setup for the second task	45
Fig. 6.3	Task 1: results of climbing two sets of stairs	47
Fig. 6.4	Task 1: trajectory of climbing two sets of stairs	47
Fig. 6.5	Task 2: results with stairs 110 cm apart at a 30-degree angle	49
Fig. 6.6	Task 2: trajectory with stairs 110 cm apart at a 30-degree angle	49

List of Tables
Table 2.1	AX-12 servo motor specifications	5
Table 2.2	RX-28 servo motor specifications	6
Table 2.3	RX-64 servo motor specifications	6
Table 2.4	RB-100 embedded system specifications	8
Table 2.5	PICO-820 embedded single-board computer specifications	10
Table 2.6	Microsoft LifeCam VX5500 webcam specifications	10
References
[1] Y. Guan, E. S. Neo, K. Yokoi, and K. Tanie, “Stepping over obstacles with humanoid robots,” IEEE Trans. Robotics, vol. 22, no. 5, pp. 958-973, Oct. 2006.
[2] E. S. Neo, K. Yokoi, S. Kajita, and K. Tanie, “Whole-body motion generation integrating operator’s intention and robot’s autonomy in controlling humanoid robots,” IEEE Trans. Robotics, vol. 23, no. 4, pp. 763-775, Aug. 2007.
[3] D. Xu, Y. F. Li, M. Tan, and Y. Shen, “A new active visual system for humanoid robots,” IEEE Trans. Syst. Man & Cybern., Part B, vol. 38, no. 2, pp. 320-330, Apr. 2008.
[4] G. Arechavaleta, J. P. Laumond, H. Hicheur, and A. Berthoz, “An optimality principle governing human walking,” IEEE Trans. Robotics, vol. 24, no. 1, pp. 5-14, Feb. 2008.
[5] L. Montesano, M. Lopes, A. Bernardino, and J. Santos-Victor, “Learning object affordances: from sensory-motor coordination to imitation,” IEEE Trans. Robotics, vol. 24, no. 1, pp. 15-26, Feb. 2008.
[6] E. Yoshida, C. Esteves, I. Belousov, J. P. Laumond, T. Sakaguchi, and K. Yokoi, “Planning 3-D collision-free dynamic robotic motion through iterative reshaping,” IEEE Trans. Robotics, vol. 24, no. 3, pp. 1186-1197, Oct. 2008.
[7] C. Chevallereau, J. W. Grizzle, and C. L. Shih, “Asymptotically stable walking of a five-link underactuated 3-D bipedal robot,” IEEE Trans. Robotics, vol. 25, no. 1, pp. 37-50, Feb. 2009.
[8] M. Sonka, V. Hlavac, and R. Boyle, Image Processing, Analysis, and Machine Vision, 3rd ed., Cengage Learning, 2008.
[9] C. L. Hwang, N. W. Lu, T. C. Hsu, and C. H. Huang, “Penalty kick of a humanoid robot by a neural-network-based active embedded vision system,” SICE Annual Conference, Taipei, Taiwan, pp. 2291-2299, Aug. 18-21, 2010.
[10] S. Behnke, M. Schreiber, J. Stuckler, R. Renner, and H. Strasdat, “See, walk, and kick: Humanoid robots start to play soccer,” 6th IEEE-RAS Int. Conf. on Humanoid Robots, pp. 497-503, Dec. 4-6, 2006.
[11] Z. Chen and M. Hemami, “Sliding mode control of kicking a soccer ball in the sagittal plane,” IEEE Trans. Syst. Man & Cybern., Part A, vol. 37, no. 6, Nov. 2007.
[12] C. M. Chang, M. F. Lu, C. Y. Hu, S. W. Lai, S. H. Liu, Y. T. Su, and T. H. S. Li, “Design and implementation of penalty kick function for small-sized humanoid robot by using FPGA,” IEEE Int. Conf. on Advanced Robotics and its Social Impacts, Taipei, Taiwan, Aug. 23-25, 2008.
[13] T. H. S. Li, Y. T. Su, C. H. Kuo, C. Y. Chen, C. L. Hsu, and M. F. Lu, “Stair-climbing control of humanoid robot using force and accelerometer sensors,” SICE Annual Conference, Kagawa University, Japan, pp. 2115-2200, Sept. 17-20, 2007.
[14] C. Fu and K. Chen, “Gait synthesis and sensory control of stair climbing for a humanoid robot,” IEEE Trans. Ind. Electronics, vol. 55, no. 5, pp. 2111-2120, May 2008.
[15] W. Samakming and J. Srinonchat, “Development image processing technique for climbing stair of small humanoid robot,” Int. Conf. on Computer Science and Information Technology, pp. 616-619, 2008.
[16] P. Michel, J. Chestnutt, S. Kagami, K. Nishiwaki, J. Kuffner, and T. Kanade, “Humanoid navigation planning using future perceptive capability,” 8th IEEE-RAS Int. Conf. on Humanoid Robots, Daejeon, Korea, pp. 507-514, Dec. 1-3, 2008.
[17] S. Bi, H. Min, Q. Liu, and X. Zheng, “Multi-objective optimization for a humanoid robot climbing stairs based on genetic algorithms,” IEEE Int. Conf. on Information and Automation, Zhuhai/Macau, China, pp. 66-71, June 22-25, 2009.
[18] G. Chen, M. Xie, Z. Xia, L. Sun, J. Ji, Z. Du, and W. Lei, “Fast and accurate humanoid robot navigation guided by stereovision,” IEEE Int. Conf. on Mechatronics and Automation, Changchun, China, pp. 1910-1915, Aug. 9-12, 2008.
[19] J. Lin, J. Chang, S. M. Lyu, S. W. Wang, and Y. W. Lin, “Locomotion control of a biped robot for stair-climbing by fuzzy stabilization tuning approach,” IEEE Int. Conf. on Control Applications, Yokohama, Japan, pp. 1590-1595, Sept. 8-10, 2010.
[20] A. Hornung, K. M. Wurm, and M. Bennewitz, “Humanoid robot localization in complex indoor environments,” IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Taipei, Taiwan, pp. 1690-1695, Oct. 18-22, 2010.
[21] T. Sato, S. Sakaino, E. Ohashi, and K. Ohnishi, “Walking trajectory planning on stairs using virtual slope for biped robots,” IEEE Trans. Ind. Electronics, vol. 58, no. 4, pp. 1385-1396, Apr. 2011.
[22] C. L. Hwang, C. H. Huang, Y. J. Chou, and N. W. Lu, “Human-machine interfaces for the locomotion control of a humanoid robot,” IEEE Trans. Syst. Man & Cybern., Part A, May 2011 (in revision).
Terms of Use
Within Campus
On-campus access to my hard copy thesis/dissertation is open immediately
Agree to authorize disclosure on campus
Release immediately
Outside the Campus
I authorize the public to view and print my electronic full text for a royalty fee, which I donate to my school library as a development fund.
Release immediately