§ Browse Thesis Bibliographic Record
  
System ID U0002-2107201416200800
DOI 10.6846/TKU.2014.00834
Title (Chinese) 基於虛擬實境技術的無人航空載具人機介面設計與實作
Title (English) Implementation and Design of an Unmanned Aerial Vehicle Human-Machine Interface using Virtual Reality Environment
Title (Third Language)
Institution 淡江大學 (Tamkang University)
Department (Chinese) 電機工程學系碩士班
Department (English) Department of Electrical and Computer Engineering
Foreign Degree School
Foreign Degree College
Foreign Degree Institute
Academic Year 102
Semester 2
Year of Publication 103
Author (Chinese) 唐銘陽
Author (English) Ming-Yang Tang
Student ID 601460222
Degree Master's
Language English
Second Language
Oral Defense Date 2014-05-29
Number of Pages 40
Examination Committee Advisor - 劉寅春
Member - 江東昇
Member - 邱謙松
Keywords (Chinese) 人機介面
無人航空載具
虛擬實境
航空攝影
Keywords (English) human machine interface
unmanned aerial vehicle
virtual reality
aerial photography
Keywords (Third Language)
Subject Classification
Abstract (Chinese)
The motivation for this study is that UAVs have become the primary choice for low-cost aerial photography. However, UAV human-machine interfaces are usually designed for the pilot rather than the mission specialist, which makes mission progress difficult to monitor. The usual image-collection practice is therefore to run the camera in burst mode and photograph every area the vehicle passes over. If the camera shoots at 1 Hz, a 15-minute flight mission collects roughly 900 photos, and selecting the suitable images from that many becomes quite difficult. The efficiency of aerial photography missions thus still has room for improvement.
  This study proposes a UAV human-machine interface built with virtual reality technology, using visualized information to strengthen the user's awareness of mission progress. To mark the camera's field of view in the interface, we propose a real-time field-of-view marking method computed from geometric properties. A visualization pipeline then renders the UAV in the interface as a simple 3D model so that the user can easily judge its position and attitude. Finally, an approximate coverage-percentage calculation strengthens the user's awareness of the mission requirements.
  We designed an experimental aerial photography mission to evaluate the usability of the proposed interface. Across two trials under identical conditions, both the average task-completion time and the number of images consumed improved. This demonstrates that the proposed method can effectively raise the efficiency of UAV aerial photography missions. The method can be applied broadly to different flight-controller systems or implemented as a stand-alone controller.
Abstract (English)
The motivation for this research is that the Unmanned Aerial Vehicle (UAV) has become the main platform for low-cost aerial photography missions. The user interface of a UAV is usually designed for the pilot; an interface for the mission specialist has not been established, which makes it hard for the mission specialist to monitor the progress of a mission. Typically, images are collected with the camera's burst mode, shooting every position the vehicle passes over. A single 15-minute flight can collect about 900 images if the camera captures at 1 Hz. The drawbacks of this method are over-coverage and the difficulty of choosing the proper images from so many candidates. Hence, there is still plenty of room to improve the efficiency of the task.
This study proposes a UAV Human-Machine Interface (HMI) built on virtual reality technology that improves the user's awareness through visualized information. We present a real-time Field of View (FOV) marking method that computes the camera footprint from geometric properties. We then use a visualization pipeline to render a virtual UAV in the HMI so that the user can easily grasp the UAV's position and attitude. After that, we implement an approximate coverage-percentage calculation to support the user's awareness of the mission requirements.
Finally, we designed an experimental aerial photography mission to evaluate the usability of the proposed interface. After two experiments under the same scenario, we found that both the average completion time and the number of captured images improved when using this interface. The results indicate that the proposed method can improve task efficiency. The method can also be applied to different open-source flight-control systems or implemented as an independent controller.
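
The abstract names two computations without giving their details: marking the camera's real-time field of view from geometry, and approximating the coverage percentage. The Python sketch below illustrates one plausible reading under stated assumptions only; the function names (fov_footprint, coverage_percentage), the default FOV angles, the equirectangular metre-to-degree conversion, and the grid-sampling approximation are all ours for illustration, not the thesis's actual algorithms (the thesis presents its own as "FOV Calculation" and "Get Coverage Percentage").

import math

def fov_footprint(lat, lon, alt_m, yaw_deg, hfov_deg=60.0, vfov_deg=45.0):
    """Ground footprint of a downward-facing camera as four (lat, lon) corners.

    By similar triangles, the footprint half-extents on the ground are
    alt * tan(hfov / 2) and alt * tan(vfov / 2).
    """
    hw = alt_m * math.tan(math.radians(hfov_deg) / 2.0)  # half-width, metres
    hh = alt_m * math.tan(math.radians(vfov_deg) / 2.0)  # half-height, metres
    yaw = math.radians(yaw_deg)
    corners = []
    for ex, ny in ((hw, hh), (-hw, hh), (-hw, -hh), (hw, -hh)):
        # Rotate the (east, north) corner offset by the yaw angle
        # (sign convention chosen arbitrarily for this sketch).
        e = ex * math.cos(yaw) - ny * math.sin(yaw)
        n = ex * math.sin(yaw) + ny * math.cos(yaw)
        # Small-area equirectangular approximation: metres -> degrees.
        corners.append((lat + n / 111_320.0,
                        lon + e / (111_320.0 * math.cos(math.radians(lat)))))
    return corners

def _inside(pt, poly):
    """Ray-casting point-in-polygon test; pt and poly hold (lat, lon) pairs."""
    (y, x), hit = pt, False
    for (y1, x1), (y2, x2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            hit = not hit
    return hit

def coverage_percentage(footprints, area_sw, area_ne, grid=100):
    """Approximate the covered fraction of a rectangular mission area.

    Samples the area on a grid x grid lattice and counts cells whose
    centre falls inside at least one captured footprint.
    """
    dlat = (area_ne[0] - area_sw[0]) / grid
    dlon = (area_ne[1] - area_sw[1]) / grid
    covered = sum(
        1
        for i in range(grid)
        for j in range(grid)
        if any(_inside((area_sw[0] + (i + 0.5) * dlat,
                        area_sw[1] + (j + 0.5) * dlon), fp)
               for fp in footprints)
    )
    return 100.0 * covered / (grid * grid)

# Example: one photo taken at 100 m altitude over a ~200 m x 200 m area.
shots = [fov_footprint(25.0, 121.5, alt_m=100.0, yaw_deg=30.0)]
print(coverage_percentage(shots, (24.9991, 121.4990), (25.0009, 121.5010)))

Grid sampling trades accuracy for speed: with a 100 x 100 lattice the estimate is cheap to recompute after every captured image and its error is bounded by roughly one cell, which suits a real-time HMI overlay.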
Abstract (Third Language)
Table of Contents
Contents

Contents
List of Figures
List of Tables
1 INTRODUCTION
1.1 Research Background
1.2 Literature Review
1.3 Problem Statement and Motivations
1.4 Organization of Thesis
2 SYSTEM ARCHITECTURE OF UAV IN AERIAL PHOTOGRAPHY MISSION
2.1 Architecture of UAV
2.1.1 Hardware and electro-mechanics components
2.1.2 Architecture of flight controller
2.2 Architecture of Ground Control Station
3 IMPLEMENTATION METHODOLOGY
3.1 Real Time Field of View Marking Methodology
3.2 Graphics Rendering Engine
3.3 Arrangement of Proposed Interface
3.3.1 Layer 1: The bottom satellite map
3.3.2 Layer 2: Marking the real time field of view
3.3.3 Layer 3: Present method of UAV
3.4 Coverage Percentage Calculation Algorithm
4 IMPLEMENTATION RESULTS
4.1 Implementation Platform
4.2 Experiment Environment
4.3 Experiment Results
5 CONCLUSIONS
5.1 Future Work
References

List of Figures

1.1 (a) Manned aircraft crash-landed on a building. (b) Unmanned aircraft crash-landed on grass.
1.2 Case study: an aerial photography mission in a debris-flow area.
1.3 Model of response-time impacts.
1.4 Screenshots of (a) Antennas and Cameras Pointing Interface, (b) Mission Planner, (c) QGroundControl and (d) Multiwii WinGUI.
1.5 A representation of human-machine interaction for (a) a novice user and (b) an expert user.
2.1 The UAV system for the aerial photography mission.
2.2 Hardware architecture of the UAV.
2.3 System architecture of the flight controller.
2.4 System architecture of the ground control station.
2.5 A Multiwii packet to (a) the ground station and (b) the flight controller.
3.1 Real-time FOV marking methodology.
3.2 General rendering technique: transformation by (a) rotation then translation and (b) translation then rotation.
3.3 The layout of the proposed interface.
3.4 Different map providers can provide different map quality for the same place.
3.5 The flow of fetching a map image.
3.6 The flow of loading the map's image and information.
3.7 The frustum of the camera mounted on the UAV.
3.8 Algorithm: FOV Calculation.
3.9 Algorithm: Distance in V-World.
3.10 Marking the captured FOV in V-World.
3.11 Algorithm: Image Add.
3.12 The flow of visualizing a 3D model in the OpenGL environment.
3.13 Editing the 3D model of the UAV in SolidWorks.
3.14 Separating the movable parts of the UAV into different files.
3.15 (a) STL file-format 3D model data structure. (b) OpenGL triangle-rendering syntax.
3.16 Converting a GPS position to V-World coordinates.
3.17 Algorithm: GPS to V-World.
3.18 Coverage-percentage calculation method.
3.19 Algorithm: Get Coverage Percentage.
4.1 The mission scope for evaluating the usability of the proposed interface.
4.2 Implementation results of the real-time FOV at different altitudes.
4.3 Implementation results of the real-time FOV at different positions.
4.4 Implementation results of coverage calculation under different rotations.
4.5 Implementation results of coverage calculation at different altitudes.
4.6 Implementation result of the realistic virtual UAV.

List of Tables

4.1 Usability of the proposed interface
Full-Text Release Permissions
On campus
Printed thesis released on campus immediately
The author consents to on-campus release of the electronic full text
On-campus electronic thesis released immediately
Off campus
Release authorized
Off-campus electronic thesis released immediately
