TY - JOUR
T1 - The Variably Intense Vocalizations of Affect and Emotion (VIVAE) Corpus Prompts New Perspective on Nonspeech Perception
AU - Holz, Natalie
AU - Larrouy-Maestri, Pauline
AU - Poeppel, David
N1 - Publisher Copyright:
© 2022 American Psychological Association
PY - 2022
Y1 - 2022
N2 - The human voice is a potent source of information to signal emotion. Nonspeech vocalizations (e.g., laughter, crying, moans, or screams), in particular, can elicit compelling affective experiences. Consensus exists that the emotional intensity of such expressions matters; however, how intensity affects such signals and their perception remains controversial and poorly understood. One reason is the lack of appropriate data sets. We have developed a comprehensive stimulus set of nonverbal vocalizations, the first corpus to represent emotion intensity from one extreme to the other, in order to resolve the empirically underdetermined basis of emotion intensity. The full set, comprising 1,085 stimuli, features 11 speakers expressing three positive (achievement/triumph, sexual pleasure, surprise) and three negative (anger, fear, physical pain) affective states, each varying from low to peak emotion intensity. The smaller core set of 480 files represents a fully crossed subsample (6 emotions × 4 intensities × 10 speakers × 2 items) selected on the basis of judged authenticity. Perceptual validation and acoustic characterization of the stimuli are provided; the expressed emotional intensity, like the expressed emotion, is reflected in listener evaluations and in the signal properties of nonverbal vocalizations. These carefully curated new materials can help disambiguate foundational questions on the communication of affect and emotion in the psychological and neural sciences and strengthen our theoretical understanding of this domain of emotional experience.
KW - Database
KW - Emotion intensity
KW - Emotion perception
KW - Nonverbal vocalizations
KW - Voice
UR - http://www.scopus.com/inward/record.url?scp=85124443893&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124443893&partnerID=8YFLogxK
U2 - 10.1037/emo0001048
DO - 10.1037/emo0001048
M3 - Article
C2 - 35129996
AN - SCOPUS:85124443893
SN - 1528-3542
VL - 22
SP - 213
EP - 225
JO - Emotion
JF - Emotion
IS - 1
ER -