Abstract
This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness into the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model, developed on the Max/MSP platform, that (1) captures real-time movement of the human body in a 3D coordinate system from the orientation output of any OSC-compatible inertial sensor system, (2) extracts low-level movement features that quantify contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented and intuitive mapping strategy was of central importance, and was achieved by conducting an empirical experiment grounded in theoretical concepts from the embodied music cognition paradigm. Based on this empirical evidence, the paper proposes a mapping trajectory that facilitates the interaction between musicians and their instruments, the artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.
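The contraction/expansion feature described in step (2) can be illustrated with a minimal sketch. This is a hypothetical formulation, not the paper's Max/MSP implementation: it assumes 3D positions of tracked body points are already available, and measures how expanded a posture is as the mean distance of those points from their centroid.

```python
import math

def contraction_index(joints):
    """Mean distance of tracked body points from their centroid.
    Higher values indicate a more expanded posture (hypothetical
    feature; the paper computes its features inside Max/MSP)."""
    n = len(joints)
    cx = sum(p[0] for p in joints) / n
    cy = sum(p[1] for p in joints) / n
    cz = sum(p[2] for p in joints) / n
    return sum(math.dist(p, (cx, cy, cz)) for p in joints) / n

# Illustrative point sets (metres): limbs held away from vs. close to the torso
expanded = [(0.5, 0, 1.5), (-0.5, 0, 1.5), (0.8, 0, 0.8), (-0.8, 0, 0.8)]
contracted = [(0.1, 0, 1.2), (-0.1, 0, 1.2), (0.2, 0, 1.0), (-0.2, 0, 1.0)]
print(contraction_index(expanded) > contraction_index(contracted))  # True
```

Such a scalar index could then be scaled and routed to a synthesis parameter, e.g. the number or amplitude of the added harmonic voices.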
Maes, PJ., Leman, M., Lesaffre, M. et al. From expressive gesture to sound. J Multimodal User Interfaces 3, 67–78 (2010). https://doi.org/10.1007/s12193-009-0027-3