System ID | U0002-2406200815181900 |
---|---|
DOI | 10.6846/TKU.2008.00836 |
Title (Chinese) | 一個以視訊為基礎的多重互動學習平台 |
Title (English) | A Video-based Learning Platform with Multimodal Interactions |
Institution | 淡江大學 (Tamkang University) |
Department (Chinese) | 資訊工程學系博士班 |
Department (English) | Department of Computer Science and Information Engineering |
Academic year | 96 (2007-2008) |
Semester | 2 |
Publication year | 97 (2008) |
Author (Chinese) | 張漢賓 |
Author (English) | Han-Bin Chang |
Student ID | 893190073 |
Degree | Doctoral |
Language | English |
Defense date | 2008-06-12 |
Pages | 84 |
Committee | Advisor: 許輝煌; Members: 施國琛, 趙榮耀, 廖弘源, 楊錦潭 |
Keywords (Chinese) | 互動式視訊、手勢、數位電視、位置資訊、學習平台 |
Keywords (English) | Interactive Video; Gesture; Digital TV; Location Information; Learning Platform |
Abstract (Chinese) |
互動式視訊可以為使用者帶來不同的瀏覽經驗。藉由相關技術,視訊可以被強化以吸引學生的注意。本論文提出了一個互動式視訊的課程平台,課程提供者可以藉由課程編輯程式來製作互動式的視訊課程。視訊會被切割成較小的片段,並且加入互動物件及多媒體資訊以強化課程的多樣性。使用者可以利用播放程式來存取互動式視訊,並透過互動來獲得課程中的多媒體資訊。 多種裝置被應用於擷取使用者的互動:鍵盤、滑鼠、遙控器及網路攝影機都是本論文所提出系統採用的輸入裝置。鍵盤與滑鼠可用於一般的電腦環境,使用者可以直接操作課程編輯系統及播放程式而無須重新適應。另一個整合遙控器的子程式則用來模擬使用者觀看電視的行為;網路攝影機用來擷取使用者的手勢,以提供更直覺的互動方式。相關的數位電視技術亦被採用,以發展以數位電視為基礎的互動平台。 藉由整合全球定位系統元件到本論文提出之系統,學生可以在不同地點獲得相關的資訊:特定場所會呈現不同的多媒體資訊,事先設定好的活動、事件以及介紹等,都會以多媒體的方式呈現給學生。 |
Abstract (English) |
Interactive video offers users a different viewing experience. With related technologies, video materials can be augmented to attract students' attention. This dissertation proposes an interactive video course platform integrated with location information. Content providers produce interactive video courses with an authoring tool: each video file is cut into smaller segments called scenes, the basic units of a video course, and interactive objects and multimedia resources can be inserted into the scenes to enhance variety. Users play interactive video with the proposed playback platform; interactions, messages, and other multimedia objects are displayed in the scenes as users browse a course. Multiple devices capture users' interactions: keyboard, mouse, remote control, and web camera all serve as input devices. Keyboard and mouse are used in an ordinary computer environment, so users can operate the authoring tool and playback system like any other program on a personal computer. A sub-application simulates the behavior of users watching TV programs with a remote control, and a web camera captures users' gestures to provide more intuitive interaction. Digital TV specifications such as DVB and MHP are also applied to implement a platform for evaluating users' learning performance on TV. Finally, by integrating a Global Positioning System (GPS) component into the proposed system, students can receive information tied to different locations: pre-arranged activities, events, and introductions are presented to students in multimedia form at the specified places. |
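The location-triggered delivery described in the abstract, matching a user's GPS fix against locations pre-assigned to video scenes and presenting the associated multimedia event, can be sketched as follows. This is a minimal illustration, not the dissertation's implementation; all names, coordinates, and the 100 m trigger radius are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes (haversine formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Hypothetical scene-to-location assignments, as an authoring tool
# might store them in the course manifest.
scene_events = [
    {"scene": "campus-intro", "lat": 25.1747, "lon": 121.4495, "media": "intro.mp4"},
    {"scene": "library-tour", "lat": 25.1760, "lon": 121.4520, "media": "library.mp4"},
]

def events_near(lat, lon, radius_m=100):
    """Return the multimedia events whose preset location is within radius_m
    of the user's current GPS position."""
    return [e for e in scene_events
            if distance_m(lat, lon, e["lat"], e["lon"]) <= radius_m]
```

At playback time the player would poll the GPS receiver, call `events_near`, and display the returned media; a "GPS-free mode" (mentioned in Figure 29 of the table of contents) would simply skip this check.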
Table of Contents |
List of Figures III
List of Tables V
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Interactive Video and TV 2
1.3 Browsing Learning Materials on Digital TV 4
1.4 Location Awareness and Smart Space Issue 6
1.5 Organization of this Dissertation 8
Chapter 2 Related Work 9
2.1 Video-Based Interactive Systems 9
2.2 Interactive Learning Platform 10
2.3 Digital TV Applications 12
2.3.1 Back-end Services of a Digital TV Framework 13
2.4 Location-based Applications 15
Chapter 3 System Architecture 18
3.1 System Overview 18
3.2 Structure of an Interactive Video Course 21
3.3 Interaction and Feedback Design 24
3.3.1 Provided Interactions in this Platform 25
3.3.2 Provided Feedbacks in this Platform 30
3.3.3 Interactive Objects 32
3.4 Interaction and Feedback Processing 34
3.4.1 Remote Control Key Mapping 34
3.4.2 Gesture Processing 36
3.4.3 Processing Location Information 39
3.4.4 Hyperlinks of Video and Web Pages 45
3.5 Applied Digital TV Technologies 46
Chapter 4 Implementation 51
4.1 Overview 51
4.2 Interactive Video Courses Authoring Tool 52
4.3 Interactive Video Courses Player 55
4.4 Interaction with Gesture and Remote Control 58
4.4.1 Interaction with Remote Control 58
4.4.2 Interaction Using Gesture 60
4.4 Interaction with Users' Locations 61
4.5 Interaction with Learning Management Server 63
4.6 Application on MHP Platform 66
Chapter 5 Experiments and Results 69
5.1 Evaluating Environment 69
5.2 Experimental Results and Analysis 73
Chapter 6 Conclusion and Future Works 79
Bibliography 81

List of Figures
Figure 1 Multi-modal hyper-interactive video viewing 4
Figure 2 Flowchart of delivering multimedia information to users 7
Figure 3 Flowchart of authoring interactive courses 19
Figure 4 Input sources to authoring tool and interactive video player 20
Figure 5 Communicating authoring tool and interactive video player with XML manifest file 23
Figure 6 Illustration of interactive objects in a video scene 26
Figure 7 Flowchart of making interaction by remote control 27
Figure 8 Flowchart of selecting objects and returning to video scenes 28
Figure 9 Corresponding events in different locations 30
Figure 10 Common remote control 35
Figure 11 Separated keyboard 36
Figure 12 Result of noise reduction (a) before and (b) after the noise reduction 38
Figure 13 Gestures: folding and unfolding 39
Figure 14 Flowchart of adding location information to video scenes 40
Figure 15 Querying locations using Google Map 41
Figure 16 Events triggered in different locations 44
Figure 17 Illustration of middleware in digital TV architecture 48
Figure 18 Architecture of MHP middleware 49
Figure 19 Interactive video authoring tool 53
Figure 20 Assigning a location to a video scene 54
Figure 21 Buttons in the playback platform 56
Figure 22 Browsing web pages in the playback platform 57
Figure 23 Interactive objects in the playback platform 57
Figure 24 Workflow of the remote control for the interactive video player 59
Figure 25 IR receiver and remote control 59
Figure 26 Using a remote control to browse interactive video courses 60
Figure 27 Mouse moving and clicking by gestures 61
Figure 28 Interactive video courses player with location information 63
Figure 29 Interactive video courses player in GPS-free mode 63
Figure 30 Login interface of learning management server 65
Figure 31 Administration of learning management server 66
Figure 32 Login interface of digital TV learning platform 67
Figure 33 Accessing learning materials on digital TV 68

List of Tables
Table 1 Tags used in scenes 31
Table 2 Scores of questions 74
Table 3 Scores of questions with percentage 75
Table 4 Scores of questions with accumulated percentage 76
Table 5 Overall questions for interaction with remote control and gesture 77
Table 6 Overall questions for interaction with keyboard and mouse 77
Table 7 Overall questions for interaction with interactive objects 77
Table 8 Overall questions for interaction with digital TV 78
Table 9 Overall questions for interaction with location information 78 |
References |
[1] Chang, H.-B., Hsu, H.-H., Liao, Y.-C., Shih, T. K., and Tang, C.-T., "An Object-Based HyperVideo Authoring System," in Proceedings of the International Conference on Multimedia and Expo, June 28-30, 2004.
[2] Hada, Y., Ogata, H., and Yano, Y., "XML-based Video Annotation System for Language Learning Environment," in Proceedings of the Second International Conference on Web Information Systems Engineering, Vol. 1, Dec. 3-6, 2001.
[3] Sawhney, N., Balcom, D., and Smith, I., "Authoring and navigating video in space and time," IEEE Multimedia, Vol. 4, Issue 4, Oct.-Dec. 1997.
[4] Correia, P. L. and Pereira, F., "Objective evaluation of video segmentation quality," IEEE Transactions on Image Processing, Vol. 12, Issue 2, Feb. 2003.
[5] Oka, K., Sato, Y., and Koike, H., "Real-time tracking of multiple fingertips and gesture recognition for augmented desk interface systems," in Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition, May 20-21, 2002.
[6] Yang, L. and Yunde, J., "A robust hand tracking and gesture recognition method for wearable visual interfaces and its applications," in Proceedings of the Third International Conference on Image and Graphics, Dec. 18-20, 2004.
[7] Dias, J. M. S., Nande, P., Barata, N., and Correia, A., "OGRE - open gestures recognition engine," in Proceedings of the 17th Brazilian Symposium on Computer Graphics and Image Processing, Oct. 17-20, 2004.
[8] Champion, E., "Meaningful interaction in virtual learning environments," in Proceedings of the Second Australasian Conference on Interactive Entertainment, Nov. 2005.
[9] Lumbreras, M. and Sanchez, J., "Hyperstories: a model to specify and design interactive educational stories," in Proceedings of the XVII International Conference of the Chilean, Nov. 10-15, 1997, pp. 135-146.
[10] Martinez-Ortiz, I., Moreno-Ger, P., Sierra, J. L., and Fernandez-Manjon, B., "Production and Maintenance of Content-Intensive Videogames: A Document-Oriented Approach," in Proceedings of the Third International Conference on Information Technology: New Generations, April 2006, pp. 118-123.
[11] Muda, Z. and Mohamed, R. E. K., "Adaptive User Interface Design in Multimedia Courseware," in Proceedings of Information and Communication Technologies, April 2006, pp. 196-199.
[12] Natvig, L. and Line, S., "Age of computers: game-based teaching of computer fundamentals," in Proceedings of the Ninth SIGCSE Conference on Innovation and Technology in Computer Science Education, June 2004.
[13] Shim, S. S. Y. and Lee, Y. J., "Interactive TV: VoD meets the Internet," IEEE Computer, Vol. 35, Issue 7, July 2002.
[14] Bing, J., Dubreuil, J., Espanol, J., Julia, L., Lee, M., Loyer, M., and Serghine, M., "MiTV: rethinking interactive TV," in Proceedings of the Seventh International Conference on Virtual Systems and Multimedia, Oct. 25-27, 2001.
[15] Zhang, L. J., Chung, J. Y., Liu, L. K., Lipscomb, J. S., and Zhou, Q., "An integrated live interactive content insertion system for digital TV commerce," in Proceedings of the Fourth International Symposium on Multimedia Software Engineering, Dec. 11-13, 2002.
[16] Cesar, P., Vierinen, J., and Vuorimaa, P., "Open graphical framework for interactive TV," in Proceedings of the Fifth International Symposium on Multimedia Software Engineering, Dec. 10-12, 2003.
[17] ISO/IEC 13818-1:1996, Information technology - Generic coding of moving pictures and associated audio information: Systems.
[18] ISO/IEC 13818-6:1998, Information technology - Generic coding of moving pictures and associated audio information: Extensions for DSM-CC.
[19] ETSI TS 102 812 V1.2.1, Digital Video Broadcasting (DVB); Multimedia Home Platform (MHP).
[20] Digital Video Broadcasting. Retrieved online: http://www.dvb.org
[21] De Souza e Silva, A., "Alien Revolt (2005-2007): A Case Study of the First Location-Based Mobile Game in Brazil," IEEE Technology and Society Magazine, Vol. 27, No. 1, pp. 18-28, Spring 2008.
[22] Das, S. K. and Rose, C., "Coping with uncertainty in mobile wireless networks," in Proceedings of the 15th IEEE International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC 2004), Vol. 1, pp. 103-108, Sept. 5-8, 2004.
[23] Pradhan, S., Brignone, C., Cui, J.-H., McReynolds, A., and Smith, M. T., "Websigns: hyperlinking physical locations to the Web," IEEE Computer, Vol. 34, Issue 8, pp. 42-48, Aug. 2001.
[24] Naaman, M., "Eyes on the World," IEEE Computer, Vol. 39, Issue 10, pp. 108-111, Oct. 2006.
[25] Osmosys Home. Retrieved online: http://www.osmosys.tv/
[26] Eclipse.org Home. Retrieved online: http://www.eclipse.org/ |
Full-text access rights |