Chueh Sheng Memorial Library, Tamkang University (TKU Library)
System ID: U0002-0308202018413500
Thesis Title (Chinese): 利用二維姿態判斷人類的行為意向
Thesis Title (English): Using 2D Pose Estimation to Determine Human Behavior Intention
University: Tamkang University
Department (Chinese): 電機工程學系機器人工程碩士班
Department (English): Master's Program in Robotics Engineering, Department of Electrical and Computer Engineering
Academic Year: 108
Semester: 2
Publication Year: 109 (2020)
Student Name (Chinese): 謝宗霖
Student Name (English): Tsung-Lin Hsieh
Student ID: 607470068
Degree: Master
Language: English
Oral Defense Date: 2020-07-08
Number of Pages: 67
Committee: Advisor: Dr. 劉寅春; Member: Dr. 劉智誠; Member: Dr. 邱謙松
Keywords (Chinese): 自駕車, OpenPose, 行為意向, Kmeans聚類
Keywords (English): Autonomous Vehicle, OpenPose, Behavior intention, K-means clustering
Subject Classification:
Abstract (Chinese, translated): The research direction of this thesis is the environmental perception of autonomous vehicles. Environmental perception is like the eyes of a driverless car: while driving, the vehicle must acquire information about its surroundings quickly, in real time, and accurately. Compared with other countries, Taiwan has a highly complex traffic environment (for example, pedestrians, motorcycles, cars, and bicycles share the road in a highly mixed traffic pattern), and news of tragedies caused by bicycles suddenly emerging from among pedestrians or motorcycles weaving between lanes is common. This thesis therefore takes Taiwan's road conditions as its starting point, using images to capture the postures of pedestrians, motorcyclists, and bicyclists; by recognizing the human skeleton from each posture and following posture changes, the system can know each person's behavior intention in advance.
In this thesis, we propose using the angles between joints to predict in advance the behavior intentions of pedestrians, motorcyclists, and bicyclists. K-means clustering lets us efficiently label posture intentions in large datasets. On this basis, in addition to confirming the angles that effectively classify left and right turns for pedestrians, motorcyclists, and bicyclists, we identify a new behavior intention lying between the straight and turning postures, which we define as "preparing to turn"; we therefore judge the method to be effective.
Abstract (English): The research direction of this thesis is the environmental perception of self-driving vehicles. Environmental perception is like the eyes of a driverless car: while driving, the vehicle must acquire information about its surroundings quickly, in real time, and accurately. Compared with other countries, Taiwan has a highly complex traffic environment (for example, pedestrians, motorcycles, cars, and bicycles are highly mixed), and tragic accidents in which bicycles dart out from among pedestrians or motorcycles weave between lanes are not uncommon. Taking the road conditions in Taiwan as its starting point, this thesis uses images to capture the postures of pedestrians, motorcyclists, and bicyclists; by recognizing the human body skeleton from each posture and following posture changes, the system can know everyone's behavior intention in advance.
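The skeleton-based representation above reduces each person to a set of 2D keypoints, and the thesis then works with angles between joints (Section 2.4.1). As a minimal sketch only, assuming keypoints arrive as (x, y) pixel coordinates in the style of OpenPose output, an angle at one joint can be computed from three keypoints; the coordinates below are hypothetical:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at keypoint b, formed by segments b->a and b->c.

    Each keypoint is an (x, y) pixel coordinate, e.g. shoulder, elbow,
    and wrist to measure the elbow angle. Assumes none of the points
    coincide with b.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical keypoints: shoulder, elbow, wrist of a bent arm.
print(joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0)))  # → 90.0
```

How the thesis maps specific keypoint triples to its angle features is given in Table 2.2; the function above is only the generic geometric step.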
In this thesis, we propose using the angles between joints to predict in advance the behavior intentions of pedestrians, motorcyclists, and bicyclists. K-means clustering can label posture intentions very efficiently on large datasets. On this basis, in addition to confirming the angles that effectively classify left and right turns for pedestrians, motorcyclists, and bicyclists, we identify a new behavior intention between the straight and turning postures, which we define as "preparing to turn"; this shows that the method is effective.
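The thesis groups joint-angle data with K-means and picks the number of clusters via the elbow method and silhouette coefficient (Chapter 3). Its exact feature set and implementation are not reproduced in this record, so the following is only a sketch: a deterministic 1-D K-means over hypothetical angle values, plus the within-cluster sum of squares that the elbow method plots against k.

```python
def kmeans_1d(values, k, iters=100):
    """Plain 1-D K-means; returns the k cluster centers. Assumes 2 <= k <= len(values)."""
    pts = sorted(values)
    # Deterministic initialization: spread centers across the sorted range.
    centers = [float(pts[i * (len(pts) - 1) // (k - 1)]) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in pts:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        if new == centers:  # assignments stable -> converged
            break
        centers = new
    return centers

def inertia(values, centers):
    """Within-cluster sum of squares -- the quantity plotted for the elbow method."""
    return sum(min((v - c) ** 2 for c in centers) for v in values)

# Hypothetical joint angles (degrees): straight, preparing to turn, turning.
angles = [28, 30, 32, 88, 90, 92, 148, 150, 152]
print(kmeans_1d(angles, 3))  # → [30.0, 90.0, 150.0]
```

In practice one would evaluate inertia (elbow method, Figure 3.2) and the silhouette coefficient (Figure 3.3) across candidate k; scikit-learn's KMeans and silhouette_score provide production-grade equivalents of this sketch.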
Table of Contents
Acknowledgement I
Abstract in Chinese II
Abstract in English III
Contents IV
List of Figures VII
List of Tables X
Chapter 1 Introduction 1
1.1 Background 1
1.1.1 Autonomous Vehicle 2
1.2 Research Motivation 5
1.3 Structure Diagram 5
Chapter 2 Behavior intention processing 7
2.1 Problem Statement 7
2.2 Experimental Condition 9
2.3 OpenPose 10
2.4 Body Keypoints 11
2.4.1 Angle Between Keypoints 12
2.5 The Data Pre-processing Procedure 13
2.6 The Data Pre-processing Procedure Results 16
2.6.1 Motorcyclists Right Perspective Processing Results 18
2.6.2 Motorcyclists Left Perspective Processing Results 23
2.6.3 Bicyclists Right Perspective Processing Results 28
2.6.4 Bicyclists Left Perspective Processing Results 33
2.6.5 Pedestrians Processing Results 38
Chapter 3 Kmeans Clustering 43
3.1 Determine Behavior Intentions 43
3.2 The Optimal Number In The Elbow Method 46
3.3 The Optimal Number In The Silhouette Coefficient 47
3.4 Motorcyclists Turning Right Data Determination 48
3.5 Motorcyclists Turning Left Data Determination 49
3.5.1 Motorcyclists Behavior Intention Determination Result 50
3.6 Bicyclists Turning Right Data Determination 51
3.7 Bicyclists Turning Left Data Determination 52
3.7.1 Bicyclists Behavior Intention Determination Result 53
3.7.2 Pedestrians Behavior Intention Determination Result 53
3.8 Pedestrians Walking Data Determination 54
Chapter 4 Experiment Result 55
4.1 Experiment 1 56
4.2 Experiment 2 59
Chapter 5 Conclusion and Future work 65
References 66

List of Figures
Figure 1.1 Autonomous Vehicle 3
Figure 1.2 Paper structure diagram 6
Figure 2.1 Situation 1 - complex road conditions 8
Figure 2.2 Situation 2 - multiple pedestrians behavior intention determination 8
Figure 2.3 Experimental Condition 10
Figure 2.4 OpenPose Body joint 11
Figure 2.5 The back of a person represents the body keypoints 12
Figure 2.6 The data preprocessing procedure 15
Figure 2.7 Motorcyclist state contrast in the right perspective (head region) 18
Figure 2.8 Motorcyclist state contrast in the right perspective (left hand region) 19
Figure 2.9 Motorcyclist state contrast in the right perspective (right hand region) 20
Figure 2.10 Motorcyclist state contrast in the right perspective (left leg region) 21
Figure 2.11 Motorcyclist state contrast in the right perspective (right leg region) 22
Figure 2.12 Motorcyclist state contrast in the left perspective (head region) 23
Figure 2.13 Motorcyclist state contrast in the left perspective (left hand region) 24
Figure 2.14 Motorcyclist state contrast in the left perspective (right hand region) 25
Figure 2.15 Motorcyclist state contrast in the left perspective (left leg region) 26
Figure 2.16 Motorcyclist state contrast in the left perspective (right leg region) 27
Figure 2.17 Bicyclist state contrast in the right perspective (head region) 28
Figure 2.18 Bicyclist state contrast in the right perspective (left hand region) 29
Figure 2.19 Bicyclist state contrast in the right perspective (right hand region) 30
Figure 2.20 Bicyclist state contrast in the right perspective (left leg region) 31
Figure 2.21 Bicyclist state contrast in the right perspective (right leg region) 32
Figure 2.22 Bicyclist state contrast in the left perspective (head region) 33
Figure 2.23 Bicyclist state contrast in the left perspective (left hand region) 34
Figure 2.24 Bicyclist state contrast in the left perspective (right hand region) 35
Figure 2.25 Bicyclist state contrast in the left perspective (left leg region) 36
Figure 2.26 Bicyclist state contrast in the left perspective (right leg region) 37
Figure 2.27 Pedestrian state contrast (head region) 38
Figure 2.28 Pedestrian state contrast (left hand region) 39
Figure 2.29 Pedestrian state contrast (right hand region) 40
Figure 2.30 Pedestrian state contrast (left leg region) 41
Figure 2.31 Pedestrian state contrast (right leg region) 42
Figure 3.1 The Kmeans clustering Procedure 45
Figure 3.2 Kmeans Clustering - The optimal number (The Elbow Method) 46
Figure 3.3 Kmeans Clustering - The optimal number (The Silhouette Coefficient) 47
Figure 3.4 Kmeans Clustering - Motorcyclist turning right data determination 48
Figure 3.5 Kmeans Clustering - Motorcyclist turning left data determination 49
Figure 3.6 Motorcyclists Data Determination 50
Figure 3.7 Kmeans Clustering - Bicyclist turning right data determination 51
Figure 3.8 Kmeans Clustering - Bicyclist turning left data determination 52
Figure 3.10 Kmeans Clustering - Pedestrian walking data determination 54
Figure 4.1 complex road conditions (0 second - frame 0) 56
Figure 4.2 complex road conditions (0.27 second - frame 8) 57
Figure 4.3 complex road conditions (0.53 second - frame 16) 57
Figure 4.4 complex road conditions (0.8 second - frame 24) 58
Figure 4.5 complex road conditions (1.23 second - frame 37) 58
Figure 4.6 multiple pedestrians behavior intention determination (0 second - frame 0) 59
Figure 4.7 multiple pedestrians behavior intention determination (1 second - frame 30) 60
Figure 4.8 multiple pedestrians behavior intention determination (2 second - frame 60) 60
Figure 4.9 multiple pedestrians behavior intention determination (3 second - frame 90) 61
Figure 4.10 multiple pedestrians behavior intention determination (4 second - frame 120) 61
Figure 4.11 multiple pedestrians behavior intention determination (5 second - frame 150) 62
Figure 4.12 multiple pedestrians behavior intention determination (6 second - frame 180) 62
Figure 4.13 multiple pedestrians behavior intention determination (7 second - frame 210) 63
Figure 4.14 multiple pedestrians behavior intention determination (8 second - frame 240) 63
Figure 4.15 multiple pedestrians behavior intention determination (9 second - frame 270) 64
Figure 4.16 multiple pedestrians behavior intention determination (10 second - frame 300) 64

List of Tables
Table 2.1 Table of experimental condition 9
Table 2.2 The body angle between keypoints 13
Table 2.3 Description of motorcyclist's data 16
Table 2.4 Description of bicyclist's data 17
Table 2.5 Description of Pedestrian's data 17
Table 4.1 Table of the complex road conditions behavior intention 56
Table 4.2 Table of multiple pedestrians behavior intention determination 59
References
[1] "Wikipedia self driving car." [Online]. Available: https://en.wikipedia.org/wiki/Self-driving_car
[2] L. Nardi and C. Stachniss, "User preferred behaviors for robot navigation exploiting previous experiences," Robotics and Autonomous Systems, vol. 97, pp. 204-216, 2017. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0921889017302750
[3] A. T. Schulz and R. Stiefelhagen, "Pedestrian intention recognition using latent-dynamic conditional random fields," in 2015 IEEE Intelligent Vehicles Symposium (IV), 2015, pp. 622-627.
[4] J. Han, S. Lee, and J. Kim, "Behavior hierarchy-based affordance map for recognition of human intention and its application to human-robot interaction," IEEE Transactions on Human-Machine Systems, vol. 46, no. 5, pp. 708-722, 2016.
[5] "SAE levels." [Online]. Available: https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles
[6] "Autonomous vehicle/self-driving technology exploration." [Online]. Available: https://ictjournal.itri.org.tw/Content/Messagess/contents.aspx?MmmID=654304432061644411&MSID=745621454255354636
[7] "Number of road traffic accidents." [Online]. Available: https://www.npa.gov.tw/NPAGip/wSite/ct?xItem=80162&ctNode=12902&mp=1
[8] "Death rate." [Online]. Available: https://group.dailyview.tw/article/detail/705
[9] "Government data." [Online]. Available: https://www.motc.gov.tw/uploaddowndoc?file=survey/201710311544141.pdf&filedisplay=105%E5%B9%B4%E6%A9%9F%E8%BB%8A%E4%BD%BF%E7%94%A8%E7%8B%80%E6%B3%81%E8%AA%BF%E6%9F%A5%E5%A0%B1%E5%91%8A%28%E5%85%A8%29.pdf&flag=doc
[10] Po-Yu Chi, "Motor rider intention detecting by fuzzy system in autonomous vehicle scenario," pp. 1-8, 2018.
[11] S. Qiao, Y. Wang, and J. Li, "Real-time human gesture grading based on OpenPose," in 2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), Oct 2017, pp. 1-6.
[12] Z. Cao, G. Hidalgo, T. Simon, S. Wei, and Y. Sheikh, "OpenPose: Realtime multi-person 2D pose estimation using part affinity fields," CoRR, 2018. [Online]. Available: http://arxiv.org/abs/1812.08008
[13] G. Hidalgo, Y. Raaj, H. Idrees, D. Xiang, H. Joo, T. Simon, and Y. Sheikh, "Single-network whole-body pose estimation," 2019.
[14] S. Vinanzi, C. Goerick, and A. Cangelosi, "Mindreading for robots: Predicting intentions via dynamical clustering of human postures," in 2019 Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob), 2019, pp. 272-277.
[15] Z. Fang and A. M. López, "Intention recognition of pedestrians and cyclists by 2D pose estimation," IEEE Transactions on Intelligent Transportation Systems, pp. 1-11, 2019.
[16] "Image smoothing." [Online]. Available: https://zh.wikipedia.org/wiki/%E5%BD%B1%E5%83%8F%E5%B9%B3%E6%BB%91%E5%8C%96
[17] "Human body parameters and machine tool control device design." [Online]. Available: https://www.3d3d.cn/article/rjgx/2007-07-27/1619.html
[18] "The elbow method." [Online]. Available: https://en.wikipedia.org/wiki/Elbow_method_(clustering)
[19] "The silhouette coefficient." [Online]. Available: https://en.wikipedia.org/wiki/Silhouette_(clustering)
Thesis Access Rights
  • Agrees to grant a royalty-free license for in-library readers to reproduce the printed copy for academic purposes, available from 2020-08-28.
  • Agrees to license the electronic full-text browse/print service, available from 2020-08-28.

