TY - GEN
T1 - Facial Expression-Based Emotion Classification Using Electrocardiogram and Respiration Signals
AU - Wickramasuriya, Dilranjan S.
AU - Tessmer, Mikayla K.
AU - Faghih, Rose T.
N1 - Funding Information:
“Portions of the research in this paper uses the MAHNOB Database collected by Professor Pantic and the iBUG group at Imperial College London, and in part collected in collaboration with Prof. Pun and his team of University of Geneva, in the scope of MAHNOB project financially supported by the European Research Council under the European Community’s 7th Framework Programme (FP7/2007-2013) / ERC Starting Grant agreement No. 203143 [3].”
Funding Information:
D. S. Wickramasuriya and R. T. Faghih are with the Department of Electrical and Computer Engineering at the University of Houston, Houston, TX 77004 USA. M. K. Tessmer is with the Chemical and Biochemical Engineering Department at Missouri University of Science and Technology, Rolla, MO 65409 USA (e-mail: {dswickramasuriya, rtfaghih}@uh.edu, mktkx7@mst.edu). This work was partly supported by the following NSF grants: 1) 1755780 – CRII: CPS: Wearable-Machine Interface Architectures; 2) 1757949 – REU Site: Neurotechnologies to Help the Body Move, Heal, and Feel Again. Correspondence should be addressed to senior author Rose T. Faghih.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
AB - Automated emotion recognition from physiological signals is an ongoing research area. Many studies rely on self-reported emotion scores from subjects to generate classification labels, which can introduce labeling inconsistencies due to inter-subject variability. Facial expressions provide a more consistent means of generating labels. We generate labels by selecting the time points in video recordings at which subjects either displayed a visibly averse/negative reaction or laughed. We next use a supervised learning approach to classify these emotional responses based on electrocardiogram (EKG) and respiration signal features in an experiment in which different movie/video clips were used to elicit feelings of joy, disgust, amusement, etc. As features, we extract wavelet coefficient patches from EKG RR-interval time series and respiration waveform parameters. We use principal component analysis for dimensionality reduction and support vector machines for classification. We achieve an overall classification accuracy of 78.3%.
KW - continuous wavelet transform
KW - emotion recognition
KW - respiration
KW - RR-intervals
UR - http://www.scopus.com/inward/record.url?scp=85078009800&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85078009800&partnerID=8YFLogxK
U2 - 10.1109/HI-POCT45284.2019.8962891
DO - 10.1109/HI-POCT45284.2019.8962891
M3 - Conference contribution
AN - SCOPUS:85078009800
T3 - 2019 IEEE Healthcare Innovations and Point of Care Technologies, HI-POCT 2019
SP - 9
EP - 12
BT - 2019 IEEE Healthcare Innovations and Point of Care Technologies, HI-POCT 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE Healthcare Innovations and Point of Care Technologies, HI-POCT 2019
Y2 - 20 November 2019 through 22 November 2019
ER -