| Record ID | U0002-2006202411054700 |
|---|---|
| DOI | 10.6846/tku202400275 |
| Title (Chinese) | 一個利用臉部情緒取得對脫口秀表演是否滿意之方法 |
| Title (English) | A method of determining if people are satisfied with talk show performance based on facial expressions |
| Title (third language) | |
| Institution | Tamkang University (淡江大學) |
| Department (Chinese) | 資訊管理學系碩士班 |
| Department (English) | Department of Information Management |
| Foreign-degree university | |
| Foreign-degree college | |
| Foreign-degree institute | |
| Academic year | 112 (ROC calendar) |
| Semester | 2 |
| Year of publication | 113 (ROC calendar; 2024) |
| Author (Chinese) | 鄭榆 |
| Author (English) | YU CHENG |
| Student ID | 611630384 |
| Degree | Master's |
| Language | Traditional Chinese |
| Second language | |
| Oral defense date | 2024-06-01 |
| Pages | 38 |
| Advisor | 梁恩輝 (094110@mail.tku.edu.tw) |
| Committee members | 李彥賢, 周清江, 梁恩輝 |
| Keywords (Chinese) | 滿意預測; 臉部情緒辨識; 長短期記憶網絡; 卷積神經網路 |
| Keywords (English) | Satisfaction prediction; Facial emotion recognition; Long Short-Term Memory networks; Convolutional Neural Networks |
| Keywords (third language) | |
| Subject classification | |
| Chinese abstract (translated) |
The talk show, a performing art that originated in the UK and the US, entered Chinese-speaking societies in the 1990s and, together with stand-up comedy, enjoys wide popularity. The form is characterized by impromptu interaction and emotional resonance, requiring real-time feedback between performer and audience to jointly shape the show's rhythm and atmosphere. Traditional questionnaires have many limitations in capturing an audience's genuine emotional responses in real time; their results are prone to subjectivity, so the collected data may not reflect reality. Facial emotion recognition, being non-invasive, real-time, and comprehensive, is widely used to capture emotional states objectively.

Building on facial emotion recognition, this study constructs a model for predicting satisfaction with talk show performances. Given the strength of Long Short-Term Memory (LSTM) networks in processing time-series data, this study adopts LSTM as the model's core architecture, aiming to capture the audience's emotional changes while watching a talk show and to examine how those changes affect overall satisfaction.

In the data processing stage, this study applied several data cleaning strategies to optimize the dataset and improve model training. Experimental results show that the LSTM model based on emotion sequences can effectively predict whether the audience is satisfied, and that different data processing methods significantly affect model accuracy, providing an empirical basis for future research. |
| English abstract |
The talk show, an art form originating from the UK and the US, was introduced into Chinese-speaking communities in the 1990s and, alongside stand-up comedy, has been widely loved by audiences. This form of performance is characterized by impromptu interaction and emotional resonance, requiring real-time feedback between the performer and the audience to jointly shape the program's rhythm and atmosphere. Traditional surveys have many limitations in capturing the audience's real emotional responses in real time, as the results are prone to subjectivity, leading to potentially inaccurate data. Facial emotion recognition technology, with its non-invasive, real-time, and comprehensive traits, has been widely used to objectively capture emotional states.

This study builds a prediction model for assessing satisfaction with talk show performances based on facial expression recognition. Considering the strong capability of Long Short-Term Memory (LSTM) networks in processing time-series data, this study chooses LSTM as the core architecture of the model, aiming to capture the emotional changes of the audience during the talk show performance and to explore the impact of these changes on whether the audience is satisfied overall.

In the data processing stage, this study implements various data cleaning strategies to optimize the dataset and enhance the model training effect. The experimental results show that the LSTM model based on emotion sequences can effectively predict whether the audience is satisfied, and that different data processing methods have a significant impact on the model's accuracy, providing an empirical basis for further research in the future. |
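The pipeline the abstract describes — keep the strongest facial emotion per video frame, then shorten the time sequence to a fixed length before feeding it to the LSTM — can be sketched as follows. This is a minimal illustrative sketch, not the thesis code: the function name, emotion labels, and sequence length are assumptions.

```python
# Illustrative sketch (assumed names): turn per-frame emotion-intensity
# readings into a fixed-length integer sequence suitable as LSTM input.

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
EMOTION_ID = {e: i for i, e in enumerate(EMOTIONS)}

def to_sequence(frames, seq_len=30, pad_id=EMOTION_ID["neutral"]):
    """frames: list of dicts mapping emotion name -> intensity (0..1).
    Keeps the highest-intensity emotion per frame, then truncates or
    pads the resulting sequence to a fixed length."""
    seq = [EMOTION_ID[max(f, key=f.get)] for f in frames]
    seq = seq[:seq_len]                     # shorten long recordings
    seq += [pad_id] * (seq_len - len(seq))  # pad short ones
    return seq

frames = [
    {"happy": 0.8, "neutral": 0.2},
    {"neutral": 0.6, "sad": 0.4},
]
print(to_sequence(frames, seq_len=4))  # -> [3, 6, 6, 6]
```

Sequences produced this way can then be fed, via an embedding layer, into an LSTM with a sigmoid output for the satisfied/unsatisfied label; the thesis compares several such sequence lengths and cleaning strategies.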
| Third-language abstract | |
| Table of contents |
Chapter 1  Introduction 1
  1.1 Research background and motivation 1
  1.2 Research objectives 3
  1.3 Research procedure 4
Chapter 2  Literature review 5
  2.1 Human emotion 5
  2.2 Facial emotion recognition 7
  2.3 Machine learning 7
  2.4 Neural networks 8
  2.5 Deep learning 9
  2.6 Convolutional neural networks 9
  2.7 Long Short-Term Memory 12
Chapter 3  Research method 13
  3.1 Research framework 13
  3.2 Tools and technologies 14
  3.3 Data collection 14
    3.3.1 Talk show video selection 14
  3.4 Data input 15
    3.4.1 Facial emotion recognition 15
  3.5 Data preprocessing 16
  3.6 Data optimization 18
  3.7 Model construction 20
Chapter 4  Experiments and model construction 21
  4.1 Experimental design 21
    4.1.1 Experimental procedure 21
    4.1.2 Questionnaire design 22
  4.2 Data collection and analysis 24
    4.2.1 Data analysis 26
  4.3 Evaluation of model training results 29
Chapter 5  Conclusions and future research directions 31
  5.1 Conclusions and research contributions 31
  5.2 Research limitations 32
  5.3 Future directions 33
References 34
List of tables
  Table 3.1 Development tools 14
  Table 3.2 Talk show videos 14
  Table 3.3 Data format of the highest-intensity emotion 17
  Table 3.4 Emotion relationship table 19
  Table 4.1 Questionnaire design 22
  Table 4.2 Emotion occurrence counts per talk show video 25
  Table 4.3 Experimental results 26
  Table 4.4 Training results of the first method across sequence lengths 29
  Table 4.5 Training results of the second method across sequence lengths 30
  Table 4.6 Comparison of training results across data cleaning methods 30
List of figures
  Figure 1.1 Research procedure 4
  Figure 2.1 2D emotion space model (source: Khare et al., 2024) 6
  Figure 2.2 CNN architecture (source: Alzubaidi et al., 2021) 10
  Figure 2.3 Convolutional layer (source: Alzubaidi et al., 2021) 10
  Figure 2.4 Pooling layer (source: Alzubaidi et al., 2021) 11
  Figure 2.5 Fully connected layer (source: Alzubaidi et al., 2021) 11
  Figure 3.1 Research framework 13
  Figure 3.2 Initial emotion recognition data 16
  Figure 3.3 Data format of the seven emotions and their intensities 18
  Figure 3.4 Example of shortening a time sequence 18
  Figure 4.1 Experiment session records 22
  Figure 4.2 Visualized emotion recognition results 24 |
| References |
I. Chinese-language works
1. 劉曉, 譚春華, & 章毓晉. (2006). 人臉表情識別研究的新進展. 中國圖象圖形學報, 11(10), 1359–1368.
2. 張曉婷. (2007). 「脫口秀」在中國的發展和走向. 青年記者, 14, 107–108.
3. 方瑋翔. (2020). 基於臉部情緒辨識之喜好度分析. 朝陽科技大學資訊管理系學位論文, 1–39.
4. Liu, S. (2022). 在「冒犯」中自我療癒: 以脫口秀為表徵的當代青年治癒敘事. 台灣社會研究季刊, 121, 135–162.
II. English-language works
1. Alzubaidi, L., Zhang, J., Humaidi, A. J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., Santamaría, J., Fadhel, M. A., Al-Amidie, M., & Farhan, L. (2021). Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data, 8, 1–74.
2. Bazzani, A., Ravaioli, S., Trieste, L., Faraguna, U., & Turchetti, G. (2020). Is EEG suitable for marketing research? A systematic review. Frontiers in Neuroscience, 14, 594566. https://doi.org/10.3389/fnins.2020.594566
3. Burel, G., & Alani, H. (2018). Crisis event extraction service (CREES): Automatic detection and classification of crisis-related content on social media.
4. Chang, C.-T., & Chen, P.-C. (2017). Cause-related marketing ads in the eye tracker: It depends on how you present, who sees the ad, and what you promote. International Journal of Advertising, 36(2), 336–355. https://doi.org/10.1080/02650487.2015.1100698
5. Chang, W.-J., Schmelzer, M., Kopp, F., Hsu, C.-H., Su, J.-P., Chen, L.-B., & Chen, M.-C. (2019). A deep learning facial expression recognition based scoring system for restaurants. 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), 251–254.
6. Chen, L.-C., Lee, C.-M., & Chen, M.-Y. (2020). Exploration of social media for sentiment analysis using deep learning. Soft Computing, 24, 8187–8197.
7. Dapogny, A., Bailly, K., & Dubuisson, S. (2015). Pairwise conditional random forests for facial expression recognition. Proceedings of the IEEE International Conference on Computer Vision, 3783–3791.
8. Ekman, P., & Friesen, W. V. (1978). Facial action coding system. Environmental Psychology & Nonverbal Behavior.
9. García-Madariaga, J., Moya, I., Recuero, N., & Blasco, M.-F. (2020). Revealing unconscious consumer reactions to advertisements that include visual metaphors: A neurophysiological experiment. Frontiers in Psychology, 11, 760.
10. González-Rodríguez, M. R., Díaz-Fernández, M. C., & Pacheco Gómez, C. (2020). Facial-expression recognition: An emergent approach to the measurement of tourist satisfaction through emotions. Telematics and Informatics, 51, 101404. https://doi.org/10.1016/j.tele.2020.101404
11. Hamelin, N., Moujahid, O. E., & Thaichon, P. (2017). Emotion and advertising effectiveness: A novel facial expression analysis approach. Journal of Retailing and Consumer Services, 36, 103–111. https://doi.org/10.1016/j.jretconser.2017.01.001
12. Hernández-Blanco, A., Herrera-Flores, B., Tomás, D., & Navarro-Colorado, B. (2019). A systematic review of deep learning approaches to educational data mining. Complexity, 2019.
13. Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735–1780.
14. Kamble, K., & Sengupta, J. (2023). A comprehensive survey on emotion recognition based on electroencephalograph (EEG) signals. Multimedia Tools and Applications, 1–36.
15. Khare, S. K., Blanes-Vidal, V., Nadimi, E. S., & Acharya, U. R. (2024). Emotion recognition and artificial intelligence: A systematic review (2014–2023) and research recommendations. Information Fusion, 102, 102019. https://doi.org/10.1016/j.inffus.2023.102019
16. Ko, C.-R., & Chang, H.-T. (2021). LSTM-based sentiment analysis for stock price forecast. PeerJ Computer Science, 7, e408.
17. Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2017). ImageNet classification with deep convolutional neural networks. Communications of the ACM, 60(6), 84–90.
18. McDuff, D., Kaliouby, R. E., Cohn, J. F., & Picard, R. W. (2015). Predicting ad liking and purchase intent: Large-scale analysis of facial responses to ads. IEEE Transactions on Affective Computing, 6(3), 223–235. https://doi.org/10.1109/TAFFC.2014.2384198
19. Mehrabian, A. (1996). Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Current Psychology, 14, 261–292.
20. Monteiro, P., Guerreiro, J., & Loureiro, S. M. C. (2020). Understanding the role of visual attention on wines' purchase intention: An eye-tracking study. International Journal of Wine Business Research, 32(2), 161–179.
21. Mou, W., Gunes, H., & Patras, I. (2016a). Alone versus in-a-group: A comparative analysis of facial affect recognition. Proceedings of the 24th ACM International Conference on Multimedia, 521–525. https://doi.org/10.1145/2964284.2967276
22. Mou, W., Gunes, H., & Patras, I. (2016b). Automatic recognition of emotions and membership in group videos. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 27–35.
23. Paulhus, D. L., & Vazire, S. (2007). The self-report method. Handbook of Research Methods in Personality Psychology, 1, 224–239.
24. Poels, K., & Dewitte, S. (2006). How to capture the heart? Reviewing 20 years of emotion measurement in advertising. Journal of Advertising Research, 46(1), 18–37.
25. Pramerdorfer, C., & Kampel, M. (2016). Facial expression recognition using convolutional neural networks: State of the art. arXiv preprint arXiv:1612.02903.
26. Rached, T. S., & Perkusich, A. (2013). Emotion recognition based on brain-computer interface systems. Brain-Computer Interface Systems: Recent Progress and Future Prospects, 253–270.
27. Ryan, J., Lin, M.-J., & Miikkulainen, R. (1998). Intrusion detection with neural networks. Advances in Neural Information Processing Systems, 10, MIT Press.
28. Salmam, F. Z., Madani, A., & Kissi, M. (2016). Facial expression recognition using decision trees. 2016 13th International Conference on Computer Graphics, Imaging and Visualization (CGiV), 125–130.
29. Sara, Y., Dumne, J., Musku, A. R., Devarapaga, D., & Gajula, R. (2022). A deep learning facial expression recognition based scoring system for restaurants. 2022 International Conference on Applied Artificial Intelligence and Computing (ICAAIC), 630–634.
30. Souza, M. T. D., Oliveira, J. H. C. D., & Giraldi, J. D. M. E. (2020). Organic and sponsored ads: Study on online purchase intent and visual behaviour. International Journal of Internet Marketing and Advertising, 14(3), 318–335.
31. Timberg, B. M. (2002). Television talk: A history of the TV talk show. University of Texas Press.
32. Tzafilkou, K., Economides, A. A., & Panavou, F.-R. (2023). You look like you'll buy it! Purchase intent prediction based on facially detected emotions in social media campaigns for food products. Computers, 12(4), Article 4. https://doi.org/10.3390/computers12040088
33. Wilhelm, F. H., & Grossman, P. (2010). Emotions beyond the laboratory: Theoretical fundaments, study design, and analytic strategies for advanced ambulatory assessment. Biological Psychology, 84(3), 552–569.
34. Wilson, G. F., & Russell, C. A. (2003). Real-time assessment of mental workload using psychophysiological measures and artificial neural networks. Human Factors, 45(4), 635–644. https://doi.org/10.1518/hfes.45.4.635.27088
35. Yoo, S., & Noyes, S. (2016). Recognition of facial expressions of negative emotions in romantic relationships. Journal of Nonverbal Behavior, 40(1), 1–12. https://doi.org/10.1007/s10919-015-0219-3
36. Zhou, K., Fu, C., & Yang, S. (2016). Big data driven smart energy management: From big data to big insights. Renewable and Sustainable Energy Reviews, 56, 215–225. |
| Full-text access rights | |