Article Type: Full Research Paper

Authors

1 Ph.D. Student, Biomedical Engineering Department, Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran

2 Professor, Medical Physics Department, Faculty of Medical Sciences, Tarbiat Modares University, Tehran

3 Professor, Electrical Engineering Department, Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran

DOI: 10.22041/ijbme.2014.14703

Abstract

In this study, decision-level fusion of the outputs obtained from multiple physiological signals is proposed for designing an affective-state identification system using the MIT dataset. Four types of physiological signals, namely blood volume pressure (BVP), respiration rate (RSP), skin conductance (SC), and facial muscle activity (fEMG), were used as affective modalities. To collect this dataset, the researchers used a personalized mental-imagery method to elicit the target affective states from a single subject while simultaneously recording the corresponding physiological signals. In this study, the best features of each signal were selected from the computed time- and frequency-domain features. To this end, the sequential floating forward selection (SFFS) and RELIEF feature selection methods were evaluated. The new feature set, formed by combining the selected features, was then partitioned into three subsets. Each subset was fed into a classification unit to identify the target affective states. The results of the subsystems were fused by majority voting. Three classification methods, SVM, LDA, and KNN, were evaluated for designing the affective-state identification system. The results indicate the system's notable performance in identifying the target states with acceptable accuracy and response speed. With the RELIEF feature selection method and an SVM classifier, an overall recognition accuracy of 93.8% was obtained, which is better than the results reported on this database so far.


Article Title [English]

Identification of Imagery-Based Affective States Using Decision-Level Fusion of Multimodal Physiological Signals

Authors [English]

  • Mahdi Khezri 1
  • Seyed Mohammad Firoozabadi 2
  • Seyed Ahmad Reza Sharafat 3

1 Ph.D. Student, Biomedical Engineering Department, Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran

2 Professor, Medical Physics Department, Faculty of Medical Sciences, Tarbiat Modares University, Tehran, Iran

3 Professor, Electrical Engineering Department, Faculty of Electrical and Computer Engineering, Tarbiat Modares University, Tehran, Iran

Abstract [English]

In this study, we propose decision-level fusion of multimodal physiological signals to design an affect identification system using the MIT database. Four types of physiological signals, including blood volume pressure (BVP), respiration rate (RSP), skin conductance (SC), and facial muscle activity (fEMG), were utilized as affective modalities. To collect the above-mentioned database, researchers used personalized imagery to elicit the desired affective states from a single subject and simultaneously recorded the corresponding physiological signals. In this study, the best subset of features for each signal was determined from previously calculated time- and frequency-domain features. To this end, the sequential floating forward selection (SFFS) and RELIEF feature selection algorithms were evaluated. A new feature set, formed by concatenating the selected features, was partitioned into three subsets. Each subset was then fed into a classifier to identify the desired affective states. The majority voting method was applied to fuse the results obtained by the subsystems. Three types of classification methods, namely SVM, LDA, and KNN, were evaluated for designing the affect identification system. The results showed remarkable performance by the system in identifying the desired states with acceptable accuracy and speed of response. Using the RELIEF feature selection method along with an SVM classifier, an overall recognition accuracy of 93.8% was obtained, which is better than the results reported so far using the above-mentioned database.
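To make the decision-level fusion step concrete, the following is a minimal Python sketch, not the authors' code: an already-selected feature set is partitioned into three subsets, one SVM is trained per subset, and the three subsystem predictions are fused by majority voting. The synthetic data, the even three-way split, and the SVM settings are illustrative assumptions; for the feature-selection stage itself, third-party packages such as skrebate (ReliefF) and mlxtend (sequential forward selection with a floating option) offer ready-made implementations.

```python
# Minimal sketch of decision-level fusion by majority voting.
# Synthetic placeholder data stands in for the selected physiological features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 30))    # 240 trials x 30 selected features (placeholder)
y = rng.integers(0, 8, size=240)  # 8 affective classes, as in the MIT dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Partition the feature set into three subsets (here: simple thirds).
subsets = np.array_split(np.arange(X.shape[1]), 3)

# Train one SVM subsystem per feature subset.
clfs = [SVC(kernel="rbf").fit(X_tr[:, idx], y_tr) for idx in subsets]

# Each subsystem votes on every test trial; the majority label wins.
votes = np.stack([clf.predict(X_te[:, idx]) for clf, idx in zip(clfs, subsets)])

def majority_vote(column):
    labels, counts = np.unique(column, return_counts=True)
    return labels[np.argmax(counts)]  # ties resolve to the smallest label

fused = np.apply_along_axis(majority_vote, 0, votes)
print(f"fused accuracy: {(fused == y_te).mean():.3f}")  # near chance on random data
```

On real features the same mechanics apply unchanged; only the placeholder data and the way the three subsets are formed would follow the paper's feature-selection results.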

Keywords [English]

  • affective states
  • personalized imagery
  • physiological signals
  • feature selection
  • decision level fusion