Thesis Bibliographic Record
  
System ID U0002-0701201522243800
DOI 10.6846/TKU.2015.00172
Title (Chinese) 整合資通訊輔助技術之盲用人機介面設計
Title (English) The Integration of Information and Communication Assistive Technology in User Interface Design for the Visually Impaired
Title (third language)
Institution Tamkang University (淡江大學)
Department (Chinese) 機械與機電工程學系博士班
Department (English) Department of Mechanical and Electro-Mechanical Engineering
Foreign degree school
Foreign degree college
Foreign degree institute
Academic year 103 (ROC calendar)
Semester 1
Year of publication 104 (ROC calendar; 2015)
Author (Chinese) 楊仲捷
Author (English) Chung-Chieh Yang
Student ID 899370018
Degree Doctoral
Language Traditional Chinese
Second language
Oral defense date 2014-12-26
Number of pages 84
Committee Advisor - 葉豐輝 (funghuei@mail.tku.edu.tw)
Member - 盧永華
Member - 柯德祥
Member - 李經綸
Member - 蔡慧駿
Member - 葉豐輝
Keywords (Chinese) 點字
視障者
資通訊輔助技術
人機介面
Keywords (English) Braille
Visually impaired
Information and communication assistive technology
User interface
Keywords (third language)
Subject classification
Abstract (Chinese)
The purpose of this study is to design human-machine interfaces for the blind and to integrate information and communication assistive technology in order to solve the problems that visually impaired users encounter when using mobile and fixed terminal devices. For the mobile terminal, this study proposes virtual-keyboard and gesture operation together with a virtual Braille keyboard, a Chinese Braille multi-touch input method, and a Chinese Braille gesture input method to help visually impaired users operate smartphones. The proposed design methods were used to integrate speech synthesis, speech recognition, image recognition, and crowdsourcing technologies into a smartphone app for the blind that assists visually impaired users with vision, hearing, reading, and daily living. The app was tested by 137 visually impaired participants; more than 88% of them could operate it correctly and smoothly and found it easy to learn.
For the fixed terminal, this study proposes an ICT-assisted telephone interview system for the blind, which enables visually impaired workers to carry out telephone interview work just as sighted workers do and creates more diverse employment opportunities. The system was evaluated with seven visually impaired participants using an ABAB design. They completed 779 successful telephone interviews per month in the baseline phase and 3,070 per month in the intervention phase, showing that the proposed design effectively improves the working efficiency of the visually impaired. The results of this study were also shared with the Philippines, Malaysia, and mainland China through the APEC Digital Opportunity Center platform.
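At the heart of the Chinese Braille multi-touch input method summarized above is a simple mapping: each simultaneous touch is assigned one of the six Braille dot numbers, and the resulting dot combination is looked up in a Braille code table (the thesis defines the finger-to-dot assignment and uses the Mandarin Braille tables). The Python sketch below illustrates only this lookup idea; the finger-to-dot assignment and the tiny code table (standard English Braille letters rather than Mandarin Braille) are placeholder assumptions, not the thesis's actual design.

```python
# Minimal sketch of dot-chord decoding for a Braille multi-touch input method.
# The finger-to-dot assignment and the code table below are illustrative
# placeholders, NOT the mapping or Mandarin Braille tables used in the thesis.

FINGER_TO_DOT = {
    "left_index": 1, "left_middle": 2, "left_ring": 3,    # hypothetical
    "right_index": 4, "right_middle": 5, "right_ring": 6,  # hypothetical
}

# Tiny demonstration table using standard English Braille letter codes;
# the thesis uses the Mandarin (Zhuyin) Braille symbol tables instead.
DOTS_TO_SYMBOL = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode_chord(touched_fingers):
    """Convert one multi-touch chord (a set of finger labels) into a symbol."""
    dots = frozenset(FINGER_TO_DOT[f] for f in touched_fingers)
    return DOTS_TO_SYMBOL.get(dots, "?")  # "?" marks a cell not in the table

if __name__ == "__main__":
    # Left index + left middle fingers -> dots {1, 2} -> "b".
    print(decode_chord({"left_index", "left_middle"}))
```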
Abstract (English)
The objective of this study was to design a user interface for the visually impaired and to integrate information and communication assistive technology to solve the problems they encounter when using mobile terminal and fixed terminal devices. For the mobile terminal device, this study proposed virtual keyboard and gesture operating methods, a Braille virtual keyboard, and Chinese Braille multi-touch and gesture input methods to help visually impaired people use smartphones. The proposed design was used to integrate speech synthesis, speech recognition, image recognition, and crowdsourcing technologies to implement a smartphone app for the visually impaired that can assist them with vision, hearing, reading, and daily life. The app was tested by 137 visually impaired people. The results showed that more than 88% of the participants could operate the app correctly and smoothly and learn to use it easily.
For the fixed terminal device, this study proposed an information and communication technology assisted telephone interview system for the blind to help visually impaired people do telephone interview jobs as sighted people do and to create more diverse employment opportunities. An ABAB design was used to assess the system with seven visually impaired people. They accomplished 779 effective telephone interviews per month in the baseline phase and 3,070 effective telephone interviews per month in the intervention phase. These results showed that the working performance of the visually impaired can be improved effectively with the proposed design. The results were also shared through the APEC Digital Opportunity Center platform to help visually impaired people in the Philippines, Malaysia, and China.
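The ABAB evaluation reported in both abstracts reduces to a comparison of per-phase monthly averages. The short Python sketch below shows that arithmetic; the monthly counts are hypothetical placeholders chosen only so that their means reproduce the figures reported above (779 and 3,070 successful interviews per month), giving an improvement ratio of roughly 3.9x.

```python
# Sketch of the phase-mean comparison behind an ABAB (baseline/intervention)
# evaluation. Monthly counts are hypothetical; only the resulting averages
# (779 and 3,070 interviews per month) come from the abstract.

def phase_mean(monthly_counts):
    """Average number of successful interviews per month in one phase."""
    return sum(monthly_counts) / len(monthly_counts)

baseline_months = [760, 790, 787]         # hypothetical A-phase months
intervention_months = [3010, 3095, 3105]  # hypothetical B-phase months

baseline_mean = phase_mean(baseline_months)          # 779.0
intervention_mean = phase_mean(intervention_months)  # 3070.0

print(f"Baseline mean:     {baseline_mean:.0f} interviews/month")
print(f"Intervention mean: {intervention_mean:.0f} interviews/month")
print(f"Improvement ratio: {intervention_mean / baseline_mean:.1f}x")
```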
Abstract (third language)
Table of contents
Contents

Chinese Abstract
English Abstract
Contents
List of Figures
List of Tables
Chapter 1  Introduction
1.1  Research Motivation and Objectives
1.2  Literature Review
1.2.1  Definition of the Visually Impaired
1.2.2  Braille Systems for the Blind
1.2.3  ICT Mobile Terminals and Daily-Living Assistance for the Blind
1.2.4  ICT Fixed Terminals and Work Assistance for the Blind
1.3  Thesis Organization
Chapter 2  Human-Machine Interface Design of the ICT Mobile Terminal for the Blind
2.1  Analysis of Operation by and Needs of the Visually Impaired
2.2  Virtual Keyboard and Gesture Operation Design
2.3  Braille Input and Output Assistive Design
2.3.1  Virtual Braille Keyboard Input Method
2.3.2  Chinese Braille Multi-Touch Input Method
2.3.3  Chinese Braille Gesture Input Method
2.3.4  Output Assistive Design
2.4  Speech Technology Assistive Design
2.5  Image Technology Assistive Design
2.6  Network and Positioning Technology Assistive Design
Chapter 3  Human-Machine Interface Design of the ICT Fixed Terminal for the Blind
3.1  Analysis of Operating Barriers and Needs of the Visually Impaired
3.2  Architecture of the ICT-Assisted Telephone Interview System for the Blind
3.3  Accessible Call-Agent Fixed-Terminal Interface Design
3.4  Data Retrieval Assistive Design for the Blind
3.5  Image Processing Assistive Design
Chapter 4  Experimental Results and Discussion
4.1  ICT Mobile Terminal for the Blind
4.1.1  Experimental Method
4.1.2  Participants
4.1.3  Results and Discussion
4.2  ICT Fixed Terminal for the Blind
4.2.1  Experimental Method
4.2.2  Participants
4.2.3  Results and Discussion
Chapter 5  Conclusions and Future Work
5.1  Conclusions
5.2  Future Work
References
 
List of Figures

Figure 1  Braille dot numbering order
Figure 2  Mandarin Braille initials
Figure 3  Mandarin Braille finals
Figure 4  Mandarin Braille combined finals
Figure 5  Mandarin Braille tone marks
Figure 6  (a) Virtual keyboard layout, (b) gesture operation design
Figure 7  (a) Virtual Braille keyboard layout, (b) conventional Zhuyin keyboard layout
Figure 8  (a) Numeric input keyboard, (b) Braille input keyboard design
Figure 9  Chinese Braille multi-touch input in actual use
Figure 10  Correspondence between fingers and Braille dot numbers
Figure 11  Flowchart of the Chinese Braille multi-touch input method
Figure 12  Chinese Braille gesture input method and the corresponding Braille symbols
Figure 13  Architecture of the speech-recognition command system
Figure 14  Processing flow of mixed Chinese/English text-to-speech synthesis
Figure 15  Architecture of the speech synthesis module
Figure 16  Five-denomination coin recognition function
Figure 17  Screen design of the banknote denomination recognition function
Figure 18  Architecture of the object recognition system
Figure 19  Screen design of the object recognition function
Figure 20  Portable video magnifier function
Figure 21  Design of functions related to the 網路博覽家 browser
Figure 22  GPS positioning assistance, nearby-store information query, and electronic compass assistance
Figure 23  Menu design for real-time bus information query
Figure 24  Screen design of the GPS and emergency call functions
Figure 25  Architecture of the ICT-assisted telephone interview system for the blind
Figure 26  Module design of the accessible ICT fixed terminal
Figure 27  Image recognition assistive design of the ICT fixed terminal
Figure 28  (a) Android app implementation, (b) iOS app implementation
Figure 29  Visually impaired users operating the system
Figure 30  Flowchart of the test procedure for the smartphone app for the blind
Figure 31  Distribution of visual impairment levels among first-round participants
Figure 32  Prior smartphone experience of first-round participants
Figure 33  Age distribution of first-round participants
Figure 34  Distribution of visual impairment levels among second-round participants
Figure 35  Age distribution of second-round participants
Figure 36  Operating accuracy of first-round participants
Figure 37  Operating smoothness of first-round participants
Figure 38  Learning threshold of first-round participants
Figure 39  Operating accuracy of second-round participants
Figure 40  Operating smoothness of second-round participants
Figure 41  Learning threshold of second-round participants
Figure 42  Test results of participants in the baseline and intervention phases


 
List of Tables

Table 1  Categories, assessment dimensions, severity grading, and criteria of disability assessment
Table 2  Background information of the participants
References
[1] 衛生福利部 (Ministry of Health and Welfare), “Categories, assessment dimensions, severity grading, and criteria of disability assessment,” Executive Yuan Gazette, Health and Labor Section, Vol. 19, No. 146, 2013.
[2] 國立中央圖書館臺灣分館推廣輔導組 (Extension and Guidance Division, National Central Library Taiwan Branch), Self-Study Handbook of Mandarin Braille, National Central Library Taiwan Branch, 2000.
[3] T.M. Miller, “Method and Apparatus for Multitouch Text Input,” US patent, January 20, 2011.
[4] B. Frey, K. Rosier, C. Southern, M. Romero, “From texting app to braille literacy,” in Proc. CHI 2012, pp. 2495-2500, 2012.
[5] S.K. Kane, J.P. Bigham, J.O. Wobbrock, “Slide rule: Making mobile touch screens accessible to blind people using multi-touch interaction techniques,” in Proc. ASSETS '08, pp. 73-80, 2008.
[6] 葉豐輝, “A review of the current difficulties faced by blind users of Chinese computers and proposed solutions,” Seminar on Chinese Computer Applications for the Blind, Hong Kong, 1993.
[7] 葉豐輝, 洪錫銘, “Status and prospects of Chinese computer hardware and software development for the blind in Taiwan and of building an information network for the visually impaired,” Asia-Pacific Conference on Information Technology for the Visually Impaired, Hong Kong, 1996.
[8] 葉豐輝, “Suggestions on building and promoting a network for people with disabilities,” Taiwan Area Internet Conference, Taiwan, 1997.
[9] 葉豐輝, “Current status and prospects of computer assistive technology for the visually impaired and its application systems,” Seminar on Technology Support for the Education and Employment of People with Disabilities, Taiwan, 1997.
[10] 葉豐輝, 洪錫銘, 蔡慧駿, “Suggestions for applying NII network technology to serve people with disabilities,” Net'98 Taiwan Conference, Taiwan, 1998.
[11] 吳林駿, 葉豐輝, 蔡慧駿, 李經綸, “Heat transfer analysis and thermal design of the Braille display of a Chinese computer for the blind,” 15th National Conference of the Chinese Society of Mechanical Engineers, Taiwan, 1998.
[12] 陳展昭, 蔡慧駿, 葉豐輝, “Computer-aided design and mechanical analysis of Braille display cells for computers for the blind,” 16th National Conference of the Chinese Society of Mechanical Engineers, 1999.
[13] 葉豐輝, 蔡慧駿, 陳逸正, “Application of computer-aided design and analysis to the injection molding of Braille display cell holders,” 17th National Conference of the Chinese Society of Mechanical Engineers, 2000.
[14] F.H. Yeh, H.S. Tsay, S.H. Liang, “Applied CAD and ANFIS to the Chinese Braille Display Optimization,” Displays, Vol. 24, pp. 213-222, 2003.
[15] F.H. Yeh, H.S. Tsay, S.H. Liang, “Application of an Adaptive-Network-Based Fuzzy Inference System for the Optimal Design of a Chinese Braille Display,” Biomedical Engineering-Applications, Basis & Communications,  Vol. 17, No. 1, pp. 50-60, 2005.
[16] Y. Jang, Y.T. Wang, M.H. Lin, K.J. Shih, “Predictors of employment outcomes for people with visual impairment in Taiwan: the contribution of disability employment services,” Journal of Visual Impairment & Blindness, Vol. 107, No. 6, pp. 469-480, 2013.
[17] T. McCarthy, J. Pal, E. Cutrell, “The voice has it: screen reader adoption and switching behavior among vision impaired persons in India,” Assistive Technology, Vol. 25, No. 4, pp. 222-229, 2013.
[18] G.E. Lancioni, M. O’Reilly, N. Singh, D. Oliva, “Enabling two women with blindness and additional disabilities to make phone calls independently via a computer-aided telephone system,” Developmental Neurorehabilitation, Vol. 14, No. 5, pp. 283-289, 2011.
[19] G.E. Lancioni, M.F. O'Reilly, N.N. Singh, J. Sigafoos, D. Oliva, G. Alberti, R. Lang, “Two adults with multiple disabilities use a computer-aided telephone system to make phone calls independently,” Research in Developmental Disabilities, Vol. 32, No. 6, pp. 2330-2335, 2011.
[20] V. Perilli, G.E. Lancioni, D. Laporta, A. Paparella, A.O. Caffo, N.N. Singh, M.F. O'Reilly, J. Sigafoos, D. Oliva, “A computer-aided telephone system to enable five persons with alzheimer's disease to make phone calls independently,” Research in Developmental Disabilities, Vol. 34, No. 6,  pp. 1991-1997, 2013.
[21] G.E. Lancioni, N.N. Singh, M.F. O'Reilly, J. Sigafoos, D. Oliva, F. Campodonico, “Further evaluation of a telephone technology for enabling persons with multiple disabilities and lack of speech to make phone contacts with socially relevant partners,” Research in Developmental Disabilities, Vol. 34, No. 11, pp. 4178-4183, 2013.
[22] V. Perilli, G.E. Lancioni, N.N. Singh, M.F. O'Reilly, J. Sigafoos, G. Cassano, N. Cordiano, K. Pinto, M.G. Minervini, D. Oliva, “Persons with alzheimer's disease make phone calls independently using a computer-aided telephone system,” Research in Developmental Disabilities, Vol. 33, No. 4, pp. 1014-1020, 2012.
[23] 張國瑞, “Sharing experiences of using mobile phones designed for the blind,” Resource Center for Blind Students, Tamkang University, 2009.
[24] 曾羽華, “An innovative design study on applying Braille to mobile phone text input,” Master's thesis, Graduate Institute of Industrial Design, Tatung University, 2010.
[25] 吳佳育, “Optimal interface parameter design of hierarchical menu systems in 3C products for the visually impaired,” Master's thesis, Graduate Institute of Industrial Design, Tatung University, 2006.
[26] 林柏志, 李允文, 黃淑苓, “A study of input interfaces for people with disabilities: the case of mobile phones,” in Special Education: Present and Future, Special Education Series, No. 9501, p. 31, 2006.
[27] 黃聖翔, “An analysis of the hardware and software interfaces of hierarchical menus on mobile phones for the visually impaired,” Master's thesis, Graduate Institute of Industrial Design, Tatung University, 2007.
[28] 陳昱丞, 李安勝, “A study of the human-machine interface of mobile phones for blind users,” Master's thesis, Department of Product Design and Department of Public Relations Affairs Design, TransWorld Institute of Technology, 2006.
[29] 張國瑞, “A study of user interfaces for the visually impaired,” Master's thesis, Department of Computer Science and Information Engineering, Tamkang University, 2002.
[30] S. Azenkot, E. Fortuna, “Improving public transit usability for blind and deaf-blind people by connecting a Braille display to a smartphone,” in Proc. ASSETS '10, pp. 317-318, 2010.
[31] J. Behmer, S. Knox, “LocalEyes: accessible GPS and points of interest,” in Proc. ASSETS '10, pp. 323-324, 2010.
[32] L. Hakobyan, J. Lumsden, D. O'Sullivan, H. Bartlett, “Mobile assistive technologies for the visually impaired,” Surv. Ophthalmol. Vol. 58, pp. 513-528, 2013.
[33] E. Krajnc, M. Knoll, J. Feiner, M. Traar, “A touch sensitive user interface approach on smartphones for visually impaired and blind persons,” in Proc. USAB'11, pp. 585-594, 2011.
[34] V. Kulyukin, W. Crandall, D. Coster, “Efficiency or quality of experience: a laboratory study of three eyes-free touchscreen menu browsing user interfaces for mobile phones,” TOREHJ, Vol. 4, pp. 13-22, 2011.
[35] F. Maurel, G. Dias, J.M. Routoure, M. Vautier, P. Beust, M. Molina, C. Sann, “Haptic perception of document structure for visually impaired people on handled devices,” Procedia Comput. Sci., Vol. 14, pp. 319-329, 2012.
[36] J. Rantala, R. Raisamo, J. Lylykangas, V. Surakka, J. Raisamo, K. Salminen, T. Pakkanen, A. Hippula, “Methods for presenting Braille characters on a mobile device with a touchscreen and tactile feedback,” IEEE Trans. Haptics, Vol. 2, pp. 28-39, 2009.
[37] L. Rello, G. Kanvinde, R. Baeza-Yates, “A mobile application for displaying more accessible eBooks for people with dyslexia,” Procedia Comput. Sci., Vol. 14, pp. 226-233, 2012.
[38] C. Southern, J. Clawson, B. Frey, G.D. Abowd, M. Romero, “An evaluation of Brailletouch: mobile touchscreen text entry for the visually impaired,” in Proc. MobileHCI, pp. 317-326, 2012.
[39] D. AbdulRasool, S. Sabra, “Mobile-embedded smart guide for the blind,” in Proc. International Conference on Digital Information and Communication Technology and Its Applications, pp. 571-578, 2011.
[40] J. Su, A. Rosenzweig, A. Goel, “Timbremap: enabling the visually-impaired to use maps on touch-enabled devices,” in Proc. 12th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp.17-26, 2010.
[41] R. Velázquez, F. Maingreaud, E. Pissaloux, “Intelligent glasses: a new man-machine interface concept integrating computer vision and human tactile perception,” in Proc. EuroHaptics, pp. 45-60, 2003.
[42] M. Billi, L. Burzagli, T. Catarci, “A Unified Methodology for the Evaluation of Accessibility and Usability of Mobile Applications,” Universal Access in the Information Society, Vol. 9, pp. 337-356, 2010.
[43] X. Chen, M. Tremaine, R. Lutz, “AudioBrowser: a mobile browsable information access for the visually impaired,” Universal Access in the Information Society, Vol. 5, pp. 4-22, 2006.
[44] A.M. Binns, C. Bunce, C. Dickinson, “How effective is low vision service provision? A systematic review,” Surv Ophthalmol, Vol. 57, pp. 34-65, 2011.
[45] N.A. Bradley, M.D. Dunlop, “An experimental investigation into wayfinding directions for visually impaired people,” Personal and Ubiquitous Computing, Vol. 9, pp. 395-403, 2005.
[46] S. Brewster, “Overcoming the lack of screen space on mobile computers,” Personal and Ubiquitous Computing, Vol. 6, pp. 188-205, 2002.
[47] S. Brewster, F. Chohan, L. Brown, “Tactile feedback for mobile interactions,” in Proc. SIGCHI Conference on Human Factors in Computing Systems, pp.159-162, 2007.
[48] S. Brewster, J. Lumsden, M. Bell, “Multimodal eyes-free interaction techniques for wearable devices,” in Proc. SIGCHI Conference on Human Factors in Computing Systems, pp.437-480, 2003.
[49] L. Brown, S. Brewster, H. Purchase, “Multidimensional tactons for non-visual information presentation in mobile devices,” in Proc. 8th Conference on Human-Computer Interaction with Mobile Devices and Services, pp.231-238, 2006. 
[50] C. Dicke, K. Wolf, Y. Tal, “Foogue: eyes-free interaction for smartphones,” in Proc. 12th International Conference on Human Computer Interaction with Mobile Devices and Services, pp.455-458, 2010.
[51] F.M. Hasanuzzaman, X. Yang, Y. Tian, “Robust and Effective Component-based Banknote Recognition by SURF Features,” in Proc. Wireless and Optical Communications Conference, pp. 1-6, 2011.
[52] C. Harris, M. Stephens, “A combined corner and edge detector,” in Proc. Fourth Alvey Vision Conference, pp. 147-151, 1988.
[53] D. Nister, H. Stewenius, “Scalable Recognition with a Vocabulary Tree,” in IEEE Conference on Computer Vision and Pattern Recognition, pp. 2161-2168, 2006.
[54] M. Sevki, A. Turkyilmaz, M. S. Aksoy, “Banknote recognition using inductive learning,” in International Conference on Fuzzy systems and Soft Computational Intelligence in Management and Industrial Engineering, pp. 1-7, 2002.
[55] F. Takeda, S. Omatu, S. Onami, “Recognition System of US Dollars Using a Neural Network with Random Masks,” in Proc. International Joint Conference on Neural Networks, pp. 2033-2036, 1993.
[56] J.K. Lee, I.H. Kim, “New recognition algorithm for various kinds of Euro banknotes,” in Proc. 29th Annual Conference of the IEEE Industrial Electronics, pp. 2266-2270, 2003.
[57] F. Takeda, T. Nishikage, “Multiple kinds of paper currency recognition using neural network and application for Euro currency,” in Proc. IEEE-INNS-ENNS International Joint Conference on Neural Networks, pp. 143-147, 2000.
[58] A. Frosini, M. Gori, P. Priami, “A Neural Network-Based Model for Paper Currency Recognition and Verification,” IEEE Transactions on Neural Networks, Vol. 7, pp. 1482-1490, 1996.
[59] F. Takeda, L. Sakoobunthu, H. Sato, “Thai Banknote Recognition Using Neural Network and continues learning by DSP unit,” Knowledge-Based Intelligent Information and Engineering Systems, pp. 1169-1177, 2003.
[60] Q. Ji, Q. Dongping, Z. Mengjie, “A digit recognition system for paper currency identification based on virtual instruments,” in Proc. International Conference on Information Acquisition and Automation, pp. 228-233, 2006.
[61] A. Ahmadi, S. Omatu, M. Yoshioka, “Implementing a reliable neuro classifier for paper currency using PCA algorithm,” in Proc. 41st SICE Annual Conference of the Society of Instrument and Control Engineers, pp. 2466-2468, 2002.
[62] D. Lowe, “Distinctive image features from scale-invariant keypoints,” International Journal of Computer Vision, Vol. 60, No. 2, pp. 91-110, 2004.
[63] H. Bay, T. Tuytelaars, L. Van Gool, “SURF: Speeded Up Robust Features,” in European Conference on Computer Vision, pp. 404-417, 2006.
[64] K. Cheverst, K. Clarke, G. Dewsbury, T. Hemmings, S. Kember, T. Rodden, M. Rouncefield, “Designing assistive technologies for medication regimes in care settings,” Universal Access in the Information Society, Vol. 2, No. 3, pp. 235-242, 2003.
[65] C. Power, H. Jurgensen, “Accessible presentation of information for people with visual disabilities,” Universal Access in the Information Society, Vol. 9, No. 2, pp. 97-119, 2010.
[66] H. G. İlk, S. Guler, “Adaptive time scale modification of speech for graceful degrading voice quality in congested networks for VoIP applications,” Signal Processing, Vol. 86, No. 1, pp. 127-139, 2006.
[67] G. Salton, C. Buckley, “Term-weighting approaches in automatic text retrieval,” Information Processing and Management, Vol. 24, pp. 513-523, 1988.
[68] S. B. Richards, R. L. Taylor, R. Ramasamy, R. Y. Richards, Single Subject Research: Applications in Educational and Clinical Settings, New York: Wadsworth, 1999.
[69] K. Papadopoulos, “The impact of individual characteristics in self-esteem and locus of control of young adults with visual impairments,” Research in Developmental Disabilities, Vol. 35, No. 3, pp. 671-675, 2014.
[70] K. Papadopoulos, A. J. Montgomery, E. Chronopoulou, “The impact of visual impairments in self-esteem and locus of control,” Research in Developmental Disabilities, Vol. 34, No. 12, pp. 4565-4570, 2013.
Thesis full-text usage authorization
On campus
Printed copy of the thesis released immediately on campus
Electronic full text authorized for on-campus release
Electronic full text released immediately on campus
Off campus
Authorization granted
Electronic full text released immediately off campus
