TY - GEN
T1 - Can you feel it? Evaluation of affective expression in music generated by MetaCompose
AU - Scirea, Marco
AU - Eklund, Peter
AU - Togelius, Julian
AU - Risi, Sebastian
N1 - Publisher Copyright:
© 2017 ACM.
PY - 2017/7/1
Y1 - 2017/7/1
N2 - This paper describes an evaluation conducted on the MetaCompose music generator, which is based on evolutionary computation and uses a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. The main objective of MetaCompose is to create music in real-time that can express different mood-states. The experiment presented here aims to evaluate: (i) whether the mood participants perceive in a music score matches the intended mood the system is trying to express, and (ii) whether participants can identify transitions in the mood expression that occur mid-piece. Music clips including transitions and with static affective states were produced by MetaCompose, and a quantitative user study was performed. Participants were tasked with annotating the perceived mood and, moreover, were asked to annotate changes in valence in real-time. The data collected confirms the hypothesis that people can recognize changes in music mood and that MetaCompose can express perceptibly different levels of arousal. With regard to valence, we observe that, while it is mainly perceived as expected, changes in arousal seem to also influence perceived valence, suggesting that one or more of the music features MetaCompose associates with arousal have some effect on valence as well.
AB - This paper describes an evaluation conducted on the MetaCompose music generator, which is based on evolutionary computation and uses a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. The main objective of MetaCompose is to create music in real-time that can express different mood-states. The experiment presented here aims to evaluate: (i) whether the mood participants perceive in a music score matches the intended mood the system is trying to express, and (ii) whether participants can identify transitions in the mood expression that occur mid-piece. Music clips including transitions and with static affective states were produced by MetaCompose, and a quantitative user study was performed. Participants were tasked with annotating the perceived mood and, moreover, were asked to annotate changes in valence in real-time. The data collected confirms the hypothesis that people can recognize changes in music mood and that MetaCompose can express perceptibly different levels of arousal. With regard to valence, we observe that, while it is mainly perceived as expected, changes in arousal seem to also influence perceived valence, suggesting that one or more of the music features MetaCompose associates with arousal have some effect on valence as well.
KW - Affective Computing
KW - Music generation
KW - Quantitative evaluation
UR - http://www.scopus.com/inward/record.url?scp=85026380651&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85026380651&partnerID=8YFLogxK
U2 - 10.1145/3071178.3071314
DO - 10.1145/3071178.3071314
M3 - Conference contribution
AN - SCOPUS:85026380651
T3 - GECCO 2017 - Proceedings of the 2017 Genetic and Evolutionary Computation Conference
SP - 211
EP - 218
BT - GECCO 2017 - Proceedings of the 2017 Genetic and Evolutionary Computation Conference
PB - Association for Computing Machinery, Inc
T2 - 2017 Genetic and Evolutionary Computation Conference, GECCO 2017
Y2 - 15 July 2017 through 19 July 2017
ER -