TY - GEN
T1 - Emotional Valence Tracking and Classification via State-Space Analysis of Facial Electromyography
AU - Yadav, Taruna
AU - Uddin Atique, Md Moin
AU - Fekri Azgomi, Hamid
AU - Francis, Joseph T.
AU - Faghih, Rose T.
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
N2 - Tracking the emotional valence state of an individual can serve as an important marker of personal health and well-being. Through automatic detection of emotional valence, timely intervention can be provided in the event of prolonged periods of negative valence, such as anxiety, particularly for people prone to cardiovascular disease. Our goal here is to use facial electromyogram (EMG) signals to estimate one's hidden self-labelled emotional valence (EV) state during the presentation of emotion-eliciting music videos via a state-space approach. We present a novel technique to extract binary and continuous features from EMG signals. We then present a state-space model of valence in which the observation process includes both the continuous and binary extracted features. We use these features simultaneously to estimate the model parameters and the unobserved EV state via an expectation-maximization algorithm. Using experimental data, we illustrate that the estimated EV state of the subject matches the music video stimuli across different trials. Using three different classifiers (support vector machine, linear discriminant analysis, and k-nearest neighbors), we achieved a maximum classification accuracy of 89% for valence prediction based on the estimated emotional valence state. The results illustrate our system's ability to track valence for personal well-being monitoring.
AB - Tracking the emotional valence state of an individual can serve as an important marker of personal health and well-being. Through automatic detection of emotional valence, timely intervention can be provided in the event of prolonged periods of negative valence, such as anxiety, particularly for people prone to cardiovascular disease. Our goal here is to use facial electromyogram (EMG) signals to estimate one's hidden self-labelled emotional valence (EV) state during the presentation of emotion-eliciting music videos via a state-space approach. We present a novel technique to extract binary and continuous features from EMG signals. We then present a state-space model of valence in which the observation process includes both the continuous and binary extracted features. We use these features simultaneously to estimate the model parameters and the unobserved EV state via an expectation-maximization algorithm. Using experimental data, we illustrate that the estimated EV state of the subject matches the music video stimuli across different trials. Using three different classifiers (support vector machine, linear discriminant analysis, and k-nearest neighbors), we achieved a maximum classification accuracy of 89% for valence prediction based on the estimated emotional valence state. The results illustrate our system's ability to track valence for personal well-being monitoring.
UR - http://www.scopus.com/inward/record.url?scp=85077994940&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077994940&partnerID=8YFLogxK
U2 - 10.1109/IEEECONF44664.2019.9048868
DO - 10.1109/IEEECONF44664.2019.9048868
M3 - Conference contribution
AN - SCOPUS:85077994940
T3 - Conference Record - Asilomar Conference on Signals, Systems and Computers
SP - 2116
EP - 2120
BT - Conference Record - 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019
A2 - Matthews, Michael B.
PB - IEEE Computer Society
T2 - 53rd Asilomar Conference on Signals, Systems and Computers, ACSSC 2019
Y2 - 3 November 2019 through 6 November 2019
ER -