Title page for etd-0723114-124228 (thesis/dissertation record details)
Title
即時通訊服務中短句情緒辨識之研究:基於情緒座標與軌跡
Recognizing Emotions in Short Messages for Instant Messenger Services Based on Coordinates and Trajectories
Department
Year, semester
Language
Degree
Number of pages
81
Author
Advisor
Convenor
Advisory Committee
Date of Exam
2014-07-17
Date of Submission
2014-08-25
Keywords
Emotion dictionary, Emotion similarity, Text mining, Instant messenger service, Emotion trajectory, Emotion coordinate, Emotion vector, Emotion recognition
Statistics
This thesis/dissertation has been browsed 5858 times and downloaded 339 times.
Abstract (Chinese)
在即時通訊軟體和聊天室中,文字和情緒往往有正向的相關性,隨著情緒的起伏,會影響到使用者的用字遣詞,在商業談判與交友等應用中,能否準確辨識對方的情緒經常是成敗的關鍵,但文字不如面部表情一樣能輕易地辨識情緒,一句平淡無奇的短句,在某些談話背景中可能隱藏著波濤洶湧的情緒起伏,單純從字面上的意義無法精準地推論當下使用者的情緒。
因此在本篇論文中,試著使用情緒座標與向量來記錄情緒軌跡,將會比『開心』、『生氣』、『難過』等標籤更精準地表達情緒變化,除此之外,每當計算一則短文的情緒向量時,更會考量到前文的情緒,來決定該則短文的情緒向量,能夠有效避免僅從單一則短文的字面意思來辨識時容易產生的誤判,最後計算得到的情緒軌跡可以用於分析情緒成份,並比較即時通訊服務雙方的情緒相似度。
Abstract
In instant messenger services and chat rooms, texts and users' emotions tend to show a positive correlation: a user's emotional state strongly affects his or her wording. In applications such as business negotiation and dating services, the ability to accurately identify the other party's emotions is often the key to success. However, recognizing emotions from text is harder than from facial expressions. A seemingly plain short sentence may hide strong mood swings in certain conversational contexts, so the literal meaning of the words alone is not sufficient to infer the user's current emotion.
In this study, I use emotion coordinates and vectors to record emotion trajectories. Recognizing and recording emotional change this way is more precise than using basic labels such as "Happy", "Angry", or "Sad". Moreover, when calculating the emotion vector of a short message, the emotional context of the preceding messages is taken into account, which helps avoid the misjudgments that arise from relying on the literal meaning of a single message alone. The resulting emotion trajectories can then be used to analyze emotional components and to compare the emotion similarity between the two parties in an instant messenger service.
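The abstract's core idea (a per-message emotion vector in a two-dimensional valence-arousal space, blended with the preceding context, accumulated into a trajectory, and compared for similarity) can be sketched roughly as follows. This is a minimal illustration, not the thesis's actual algorithm: the decay factor, the starting point, and the per-message vectors are assumed values, and plain cosine similarity stands in for whatever similarity measure the thesis defines.

```python
import math

# Illustrative sketch only: the decay factor and the per-message
# (valence, arousal) vectors below are assumptions, not values or
# parameters taken from the thesis.

DECAY = 0.5  # assumed weight given to the previous emotional context


def next_point(prev_point, message_vector, decay=DECAY):
    """Blend the previous trajectory point with the current message's
    emotion vector, so prior context softens single-message misreadings."""
    return tuple(decay * p + (1.0 - decay) * m
                 for p, m in zip(prev_point, message_vector))


def trajectory(message_vectors, start=(0.0, 0.0)):
    """Fold a sequence of per-message (valence, arousal) vectors into a
    context-aware emotion trajectory."""
    points, current = [], start
    for vec in message_vectors:
        current = next_point(current, vec)
        points.append(current)
    return points


def cosine_similarity(a, b):
    """Cosine of the angle between two emotion vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0


def trajectory_similarity(traj_a, traj_b):
    """Average pointwise cosine similarity between two equal-length
    trajectories: one rough notion of 'emotion similarity'."""
    sims = [cosine_similarity(p, q) for p, q in zip(traj_a, traj_b)]
    return sum(sims) / len(sims)
```

For example, a neutrally worded message that follows several high-arousal messages still inherits part of that arousal through the decay term, which is the context effect the abstract describes.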
Table of Contents
Thesis Approval Form
Abstract (Chinese)
Abstract
Table of Contents
List of Figures
List of Tables
1. Introduction
1.1. Research Motivation
1.2. Research Purpose and Necessity
1.3. Problem Description
2. Literature Review
2.1. Theoretical Background
2.1.1. V-A Emotion Space
2.1.2. Pointwise Mutual Information
2.1.3. Chinese Word Segmentation
2.1.4. Sphinx Indexing System
2.2. Definition and Quantification of Emotions
2.3. Evolution of Emotion Recognition Techniques
2.4. Similar Emotion Analysis Services
3. Research Method
3.1. Research Questions and Considerations
3.1.1. Classification and Representation of Emotions
3.1.2. Corpus Acquisition and Emotion Labels
3.1.3. Emotion Labeling Methods
3.1.4. Building the Emotion Dictionary
3.1.5. Context Effects and Emotion Decay
3.2. System Architecture
3.3. Emotion Dictionary
3.3.1. Short-Message Corpus Collection
3.3.2. Emoticon Collection and Positioning
3.3.3. Keyword-Emoticon Correlation Computation
3.3.4. Index Construction
3.3.5. Dictionary Construction
3.4. Emotion Trajectory
3.4.1. Dictionary Lookup
3.4.2. Emotion Clustering and Vectors
3.4.3. Context Effects and Decay
3.4.4. Trajectory Computation
3.5. Emotion Analysis
3.5.1. Emotion Component Analysis
3.5.2. Emotion Similarity Computation
3.6. Recommendations Based on Emotion Analysis Results
4. Experiments and Results
4.1. Experiment 1 – Measuring the Accuracy of the Emotion Recognition System
4.1.1. Participants
4.1.2. Experimental Scenarios
4.1.3. Experimental Method and Program
4.1.4. Data Analysis and Organization
4.1.5. Comparison of Experimental and Control Groups
4.1.6. Precision and Accuracy Evaluation
4.2. Experiment 2 – Emotion Similarity Computation
4.2.1. Participants
4.2.2. Experimental Method and Program
4.2.3. Accuracy Evaluation
4.3. Discussion of Experimental Results
4.3.1. Analysis Results and Accuracy of the Experiments
4.3.2. Sources of Experimental Error
5. Conclusion
5.1. Summary
5.2. Research Limitations
5.3. Comparison of Research Results
5.4. Research Contributions
5.5. Future Work
References
Appendix 1 – The Eight Scenario Dialogues of Experiment 1
References
Barrett, L. F. (1998). Discrete emotions or dimensions? The role of valence focus and arousal focus. Cognition & Emotion, 12(4), pp.579-599.
Barrett, L. F., & Russell, J. A. (1999). The structure of current affect controversies and emerging consensus. Current Directions in Psychological Science, 8(1), pp.10-14.
Bollegala, D., Matsuo, Y., & Ishizuka, M. (2007). Measuring semantic similarity between words using web search engines. In Proceedings of the 16th International Conference on World Wide Web, Banff, Alberta, Canada, pp.757-766.
Church, K. W., & Hanks, P. (1990). Word association norms, mutual information, and lexicography. Computational Linguistics, 16(1), pp.22-29.
Cohn, J. F. (2006). Foundations of human computing: facial expression and emotion. In Proceedings of the 8th International Conference on Multimodal Interfaces, pp.233-238.
Cohn, J. F., & Katz, G. S. (1998). Bimodal expression of emotion by face and voice. In Proceedings of the 6th ACM International Conference on Multimedia: Face/Gesture Recognition and Their Applications, pp.41-44.
Cowie, R., Douglas-Cowie, E., Tsapatsoulis, N., Votsis, G., Kollias, S., Fellenz, W., & Taylor, J. G. (2001). Emotion recognition in human-computer interaction. IEEE Signal Processing Magazine, 18(1), pp.32-80.
De Silva, L. C., & Ng, P. C. (2000). Bimodal emotion recognition. In Proceedings of the 4th IEEE International Conference on Automatic Face and Gesture Recognition, pp.332-335.
Fasel, B., & Luettin, J. (2003). Automatic facial expression analysis: a survey. Pattern Recognition, 36(1), pp.259-275.
Forsyth, E. N., & Martell, C. H. (2007). Lexical and discourse analysis of online chat dialog. In Proceedings of the International Conference on Semantic Computing, pp.19-26.
Gill, A. J., Gergle, D., French, R. M., & Oberlander, J. (2008). Emotion rating from short blog texts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp.1121-1124.
Hu, Y., & Ogihara, M. (2012). Identifying accuracy of social tags by using clustering representations of song lyrics. In Proceedings of the 11th International Conference on Machine Learning and Applications, pp.582-585.
Juslin, P. N., & Sloboda, J. A. (2001). Music and Emotion: Theory and Research: Oxford University Press.
Kanade, T., Cohn, J. F., & Tian, Y. (2000). Comprehensive database for facial expression analysis. In Proceedings of the 4th IEEE International Conference on Automatic Face and Gesture Recognition, pp.46-53.
Ko, K.-E., Yang, H.-C., & Sim, K.-B. (2009). Emotion recognition using EEG signals with relative power values and Bayesian network. International Journal of Control, Automation and Systems, 7(5), pp.865-870.
Ku, L.-W., Liang, Y.-T., & Chen, H.-H. (2006). Opinion extraction, summarization and tracking in news and blog corpora. In Proceedings of the AAAI Spring Symposium: Computational Approaches to Analyzing Weblogs, pp.100-107.
Li, T., & Ogihara, M. (2003). Detecting emotion in music. In Proceedings of the International Society for Music Information Retrieval, pp.239-240.
Lu, L., Liu, D., & Zhang, H.-J. (2006). Automatic mood detection and tracking of music audio signals. IEEE Transactions on Audio, Speech, and Language Processing, 14(1), pp.5-18.
Ma, C., Prendinger, H., & Ishizuka, M. (2005a). A chat system based on emotion estimation from text and embodied conversational messengers. In Proceedings of the 4th international conference on Entertainment Computing, pp.535-538.
Ma, C., Prendinger, H., & Ishizuka, M. (2005b). Emotion estimation and reasoning based on affective textual interaction. In Affective Computing and Intelligent Interaction (pp.622-628): Springer.
Oatley, K., Keltner, D., & Jenkins, J. M. (2006). Understanding Emotions: Blackwell Publishing.
Pak, A., & Paroubek, P. (2010). Twitter as a corpus for sentiment analysis and opinion mining. In Proceedings of the 10th International Conference on Language Resources and Evaluation, pp.1320-1326.
Paltoglou, G., & Thelwall, M. (2012). Twitter, MySpace, Digg: Unsupervised sentiment analysis in social media. ACM Transactions on Intelligent Systems and Technology, 3(4), p.66.
Picard, R. W. (2000). Affective Computing: MIT press.
Picard, R. W., Vyzas, E., & Healey, J. (2001). Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10), pp.1175-1191.
Ptaszynski, M., Rzepka, R., Araki, K., & Momouchi, Y. (2012). A robust ontology of emotion objects. In Proceedings of the 18th Annual Meeting of The Association for Natural Language Processing, pp.719-722.
Scherer, K. R. (2003). Vocal communication of emotion: A review of research paradigms. Speech Communication, 40(1), pp.227-256.
Strapparava, C., & Valitutti, A. (2004). WordNet affect: An affective extension of WordNet. In Proceedings of the International Conference on Language Resources and Evaluation, pp.1083-1086.
Thayer, R. E. (1989). The Biopsychology of Mood and Arousal: Oxford University Press.
Tsai, C.-H. (1996). MMSEG: A word identification system for Mandarin Chinese text based on two variants of the maximum matching algorithm. Retrieved from http://technology.chtsai.org/mmseg
Wang, H., Prendinger, H., & Igarashi, T. (2004). Communicating emotions in online chat using physiological sensors and animated text. In Proceedings of the International Conference on Human Factors in Computing Systems, pp.1171-1174.
Wu, H.-H., Tsai, A. C.-R., Tsai, R. T.-H., & Hsu, J. Y.-J. (2013). Building a graded Chinese sentiment dictionary based on commonsense knowledge for sentiment analysis of song lyrics. Journal of Information Science and Engineering, 29(4), pp.647-662.
Yan, J., Bracewell, D. B., Ren, F., & Kuroiwa, S. (2008). The creation of a Chinese emotion ontology based on HowNet. Engineering Letters, 16(1), pp.166-171.
Yang, C., Lin, K. H., & Chen, H.-H. (2007). Emotion classification using web blog corpora. In Proceedings of the IEEE/WIC/ACM International Conference on Web Intelligence, pp.275-278.
Yang, Y.-H., Lin, Y.-C., Su, Y.-F., & Chen, H. H. (2008). A regression approach to music emotion recognition. IEEE Transactions on Audio, Speech, and Language Processing, 16(2), pp.448-457.
Zhe, X., & Boucouvalas, A. (2002). Text-to-emotion engine for real time internet communication. In Proceedings of International Symposium on Communication Systems, Networks and DSPs, pp.164-168.
Fulltext
This electronic fulltext is licensed only for individual, non-profit retrieval, reading, and printing for the purpose of academic research. Please comply with the relevant provisions of the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast it without authorization, to avoid violating the law.
Thesis access permission: user-defined release date
Available:
Campus: available
Off-campus: available


Printed copies
Public-access information for printed theses is relatively complete from academic year 102 (2013-2014) onward. To inquire about access to printed theses from academic year 101 or earlier, please contact the printed thesis service counter of the Office of Library and Information Services. We apologize for any inconvenience.
Available: available
