Abstract
Although many well-characterized sets of affective visual stimuli are available to researchers, few auditory sets exist. Those auditory sets that are available have been characterized primarily according to one of two major theories of affect: dimensional or categorical. Current research trends attempt to use both theories to understand emotional processing more fully, making stimuli that have been thoroughly characterized according to both approaches exceptionally useful. To provide researchers with such a stimulus set, we collected descriptive data on the International Affective Digitized Sounds (IADS), identifying which discrete categorical emotions each sound elicits. The IADS is a database of 111 sounds characterized along the affective dimensions of valence, arousal, and dominance. Our data complement these characterizations of the IADS, allowing researchers to control for or manipulate stimulus properties in accordance with both theories of affect and providing an avenue for further integration of the two perspectives. Related materials may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive.
Additional information
This research was supported in part by the Indiana METACyt Initiative of Indiana University, funded in part through a major grant from the Lilly Endowment, Inc.
Cite this article
Stevenson, R.A., James, T.W. Affective auditory stimuli: Characterization of the International Affective Digitized Sounds (IADS) by discrete emotional categories. Behav Res 40, 315–321 (2008). https://doi.org/10.3758/BRM.40.1.315